@irisslee@latimes They are now all scheduled to run as @github Actions, making the latest data available to all in a standard format via live, free URLs.
Here's an example of how easy this new technique makes it.
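As a sketch of the pattern, a scheduled Actions workflow can run a scraper and commit the result back to the repo. This example is illustrative only: the file paths, script name and schedule are hypothetical, not taken from any of our actual repositories.

```yaml
# .github/workflows/scrape.yml — an illustrative sketch, not a real workflow
name: Scrape

on:
  workflow_dispatch:  # allow manual runs
  schedule:
    - cron: "0 * * * *"  # roughly hourly; Actions cron is best-effort, not exact

jobs:
  scrape:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Install dependencies
        run: pip install -r requirements.txt
      - name: Run the scraper
        run: python scrape.py  # hypothetical script that writes latest.csv
      - name: Commit the latest data
        run: |
          git config user.name "github-actions"
          git config user.email "actions@github.com"
          git add latest.csv
          git diff --cached --quiet || git commit -m "Latest data"
          git push
```

Once committed, the file is available to anyone at a stable raw GitHub URL, which is what makes the data free to reuse.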
@irisslee@latimes@github The packages here are integrated into a larger, private repository where we merge, refine and republish the data for the public on our live wildfires map.
@irisslee@latimes@github While GitHub Actions don't run frequently (and reliably) enough to power some rapidly updating commercial products, we believe they are a powerful new model for easy, low cost scraping in journalism.
@irisslee@latimes@github Another example is our coronavirus tracker, which is powered by dozens of data scrapers. 100% of them run free and open on GitHub.
@latimes When we started, there was a clear consensus that change was necessary.
1️⃣ Every chart was produced separately for web and print, often by different people.
2️⃣ Web charts weren't responsive.
3️⃣ Copy edits were done at night on paper.
4️⃣ We staffed until the print deadline year round.
@latimes Those practices all had their time and place, and the output was indeed high quality.
But the urgent need to grow our digital audience called out for investing in more ambitious efforts like visual stories and news applications.
📊 Oversee our overhauled system for creating charts and maps to enrich day-to-day news coverage
⛓️ Direct improvements in how graphics integrate with our growing array of digital publishing platforms
@thomas06037@datagraphics Thomas joined the @latimes in 2000. In the time since, he has authored hundreds of graphics, created data applications and coordinated work for some of our most ambitious projects.
I didn't appreciate Day of the Dead until I moved to Los Angeles.
The range of artistry on display is incredible, and the recent boom in using altars as a platform for activism, showcased by @SHG1970, proves the form's flexibility.