New from our crew, and the latest @latimes effort to chronicle California's historic vice president.

By our very own @thomas06037, @maloym and @stiles
The data we've assembled is powering our reporting, like this analysis by @Noahbierman and @stiles.

latimes.com/politics/story…
@Noahbierman @stiles It's all made possible by a custom computer program, developed by our team.

It parses, logs and classifies the schedules emailed out each day by the White House.
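For the curious, here's a rough sketch of what a parse-and-classify routine like that can look like in Python. The schedule format, keyword buckets and file name are illustrative assumptions, not our production code.

```python
import csv
import re
from datetime import datetime

# Illustrative sketch only: the schedule format, categories and file name
# below are assumptions, not the Times' actual parser.
EVENT_PATTERN = re.compile(r"^(?P<time>\d{1,2}:\d{2} [AP]M)\s+(?P<description>.+)$")

CATEGORIES = {
    "briefing": ["briefing", "brief"],
    "travel": ["departs", "arrives", "en route"],
    "ceremony": ["swearing-in", "ceremony", "reception"],
}


def classify(description):
    """Assign a rough category based on keywords in the event description."""
    lowered = description.lower()
    for category, keywords in CATEGORIES.items():
        if any(keyword in lowered for keyword in keywords):
            return category
    return "other"


def parse_schedule(text, date):
    """Pull structured events out of a plain-text daily schedule."""
    events = []
    for line in text.splitlines():
        match = EVENT_PATTERN.match(line.strip())
        if match:
            events.append({
                "date": date,
                "time": match.group("time"),
                "description": match.group("description"),
                "category": classify(match.group("description")),
            })
    return events


def log_events(events, path="schedule.csv"):
    """Append the parsed events to a running CSV log."""
    with open(path, "a", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["date", "time", "description", "category"])
        if fh.tell() == 0:
            writer.writeheader()
        writer.writerows(events)


if __name__ == "__main__":
    sample = "9:30 AM The Vice President receives the President's Daily Briefing"
    log_events(parse_schedule(sample, datetime.now().date().isoformat()))
```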
Today we are making that data available to researchers anywhere via @github. It will update automatically.

github.com/datadesk/kamal…
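Because the files sit in an ordinary public repository, researchers can read them straight over HTTPS. A quick sketch with pandas; the repository path and file name below are placeholders, the real ones are in the link above.

```python
import pandas as pd

# Placeholder path -- check the repository linked above for the real file names.
URL = (
    "https://raw.githubusercontent.com/"
    "datadesk/EXAMPLE-REPO/main/events.csv"
)

events = pd.read_csv(URL)
print(events.head())
```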
@github The new schedules database joins our poll tracker as the @datagraphics contribution to @Kimbriell's "Kamala Beat" coverage.

latimes.com/projects/kamal…

More from @palewire

26 Oct
This promotion is the capstone to a complete, fundamental change in how the @latimes makes the charts and maps that accompany stories.

The effort involved dozens of people all across the newsroom. Let me tell you a little more about it. 🧵
@latimes When we started, there was a clear consensus that change was necessary.

1️⃣ Every chart was produced separately for web and print, often by different people.
2️⃣ Web charts weren't responsive.
3️⃣ Copy edits were done at night on paper.
4️⃣ We staffed until the print deadline year round.
@latimes Those practices all had their time and place, and the output was indeed high quality.

But the urgent need to grow our digital audience called for investing in more ambitious efforts like visual stories and news applications.

Streamlining this process was an obvious way in.
25 Oct
I am pleased to announce that @thomas06037 has been promoted to serve as the @datagraphics assistant editor for graphics. 📈

latimes.com/about/pressrel…
@thomas06037 @datagraphics In this newly created role, Thomas will:

📊 Oversee our overhauled system for creating charts and maps to enrich day-to-day news coverage

⛓️ Direct improvements in how graphics integrate with our growing array of digital publishing platforms
@thomas06037 @datagraphics Thomas joined the @latimes in 2000. In the time since, he has authored hundreds of graphics, created data applications and coordinated work for some of our most ambitious projects.

latimes.com/people/thomas-…
25 Oct
Día de Muertos is almost here.

To celebrate, the @latimes has created a communal digital altar.

Upload a photo and share an ofrenda for a loved one.

By our very own @VanessaMartinez

latimes.com/projects/contr…
@latimes @VanessaMartinez The page will grow to include as many reader tributes as we receive.

It's been seeded by 10 offerings posted by @latimes staff, including my memorial to my oma.

latimes.com/projects/contr…
I didn't appreciate Day of the Dead until I moved to Los Angeles.

The range of artistry that can appear is incredible, and the recent boom in using altars as a platform for activism, showcased by @SHG1970, proves the form's flexibility.
10 Sep
Our open-source archive of California COVID data has expanded again.

A nerdy new milestone: 100% of our web scrapers are running free and open via @github's Actions framework.

That's 80+ routines, written by a half-dozen team members, running in unison.

github.com/datadesk/calif…
@github GitHub's Actions system, which came into its own during the pandemic, has changed the web scraping game.

This approach allows us to store, script and schedule our data-gathering systems. For free.
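To give a flavor, a scraper in this mold can be a short, self-contained script that a workflow runs on a timer. The URL and output layout below are made up for illustration; they aren't one of our actual routines.

```python
from datetime import datetime, timezone
from pathlib import Path

import requests

# Placeholder target -- each real routine points at a specific public data source.
URL = "https://example.com/covid/latest.csv"


def scrape():
    """Download the source file and save a timestamped copy into the repo."""
    response = requests.get(URL, timeout=30)
    response.raise_for_status()

    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%d-%H%M")
    out_dir = Path("data") / "latest"
    out_dir.mkdir(parents=True, exist_ok=True)
    (out_dir / f"{stamp}.csv").write_bytes(response.content)


if __name__ == "__main__":
    scrape()
```

In an Actions setup, a workflow with a cron schedule trigger checks out the repository, runs a script like this and commits whatever changed, all on GitHub's hosted runners.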
@github Here's how it works:

1️⃣ Scrapers repo acquires data.

2️⃣ Private repo pulls it in, processes it and publishes the tracker. Also via Actions, and (almost) entirely automated.

3️⃣ Processed files are pushed to open-source data archive for reuse.

github.com/datadesk/calif…
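The processing in step 2️⃣ is mostly routine data munging. A loose sketch of its shape, with made-up column names and paths:

```python
from pathlib import Path

import pandas as pd

# Column names and paths here are illustrative; the real tracker's schema differs.
RAW_DIR = Path("data/latest")
OUT_DIR = Path("processed")


def build_archive_file():
    """Combine the raw scrapes into one tidy, deduplicated CSV for the archive."""
    frames = [pd.read_csv(path) for path in sorted(RAW_DIR.glob("*.csv"))]
    combined = pd.concat(frames, ignore_index=True)

    tidy = (
        combined.drop_duplicates(subset=["county", "date"], keep="last")
        .sort_values(["county", "date"])
    )

    OUT_DIR.mkdir(exist_ok=True)
    tidy.to_csv(OUT_DIR / "cases.csv", index=False)


if __name__ == "__main__":
    build_archive_file()
```

Each step lives in its own repository, so the whole chain can be scheduled and versioned the same way.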
8 Oct 19
Alright, nerds

I filed a FOIA appeal and won the infamous NROL-39 surveillance satellite logo as a PDF.

github.com/palewire/nrol-…

This ain't a scanned powerpoint. This is a resizable vector. You know what you must do. Unleash the swag.
One can never be sure, but I think the FOIA officer wanted to make this one happen.
What do you dorks think? Should @caseymmiller be assigned to make the next item in her nerdleisure line?

29 Apr 18
Two weeks ago, new @latimes owner @drpatsoonshiong said he would move the newsroom to El Segundo.

The next day I led an off-the-cuff @Twitter tour of the historic #DTLA HQ being left behind. More than 650,000 people followed along.

Here it is, in one post palewi.re/posts/2018/04/…
Reposting the material to my personal blog allows me to better archive it. And to clean up the many typos I littered across the tour from my smartphone keyboard.

I continue to be blown away by the response. I did not expect anyone to care when I started.
Since then I've continued to document as much of the @latimes building as I can before we go.

This thread catalogs the eccentric fixtures, knobs, lights and buttons around the building.