Some background on these pretty SVGs you see on JoeIsDone. For Florida, the only map I used was a public county-level SVG, but after the election, my colleague and I decided to go crazy and code an automated shapefile-to-SVG pipeline. Like so.
The primary reason you don’t see many precinct-level maps is that the software that makes maps is... well, for map makers. If the map looks good, it’s fine. Never mind that the embedded data within is a complete mess and can’t be interacted with programmatically.
So most of these shapefiles might look good in ArcGIS but have incomplete, corrupt, or missing data internally that makes coding against them impossible.

The SVG exports from ESRI look like crap, plus the metadata required to look up precinct IDs, etc. isn’t embedded within (AFAIK).
The field for dynamic geospatial data analysis might as well be in the dark ages.

What about those fancy zoom maps you see with Leaflet etc.? Well, they’re just raster tiles with visual layers you can drop pins onto. Not really meant for dynamic analysis, JoeIsDone style.
Yeah, so, we did the impossible and created a state-of-the-art shapefile-to-SVG pipeline which automates much of the geometry cleanup (some manual effort is still required). No, you can’t see the code, but if you want to throw obscene amounts of money at this tech, let’s talk.
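To give a flavor of what a pipeline like this has to do (this is a minimal illustrative sketch, not the actual code — `ringToSvgPath` and the naive bounding-box projection are my assumptions; a real pipeline would use a proper map projection and handle multipolygons and holes):

```javascript
// Turn one polygon ring ([lon, lat] pairs, as a shapefile reader might
// hand them over) into an SVG <path> element string, with the precinct ID
// embedded as the element id so it can be queried programmatically later.
function ringToSvgPath(ring, id, width = 1000, height = 1000) {
  const lons = ring.map(p => p[0]);
  const lats = ring.map(p => p[1]);
  const minLon = Math.min(...lons), maxLon = Math.max(...lons);
  const minLat = Math.min(...lats), maxLat = Math.max(...lats);
  const sx = width / (maxLon - minLon || 1);
  const sy = height / (maxLat - minLat || 1);
  const d = ring.map(([lon, lat], i) => {
    const x = ((lon - minLon) * sx).toFixed(1);
    const y = ((maxLat - lat) * sy).toFixed(1); // flip y: SVG grows downward
    return `${i === 0 ? 'M' : 'L'}${x} ${y}`;
  }).join(' ') + ' Z';
  return `<path id="${id}" d="${d}"/>`;
}
```

The key design point versus an ESRI export: the precinct identifier travels with the geometry as an `id` attribute, so JavaScript can look shapes up later instead of treating the map as a dumb picture.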
Anyway. Bringing it back to tonight’s topic. One of the challenges we have is that this data can be very, very detailed. A single county can be 20MB, with each precinct having thousands of points. If you want a unified product containing every US state’s precincts... lol.
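One textbook way to tame those point counts is line simplification — Ramer–Douglas–Peucker, which drops points until the boundary stays within a tolerance of the original. (Hedged: this is the standard algorithm, not necessarily what our cleanup step actually uses; `simplify` and `perpDist` are illustrative names.)

```javascript
// Perpendicular distance from point p to the infinite line through a-b.
function perpDist(p, a, b) {
  const dx = b[0] - a[0], dy = b[1] - a[1];
  const len = Math.hypot(dx, dy);
  if (len === 0) return Math.hypot(p[0] - a[0], p[1] - a[1]);
  return Math.abs(dy * p[0] - dx * p[1] + b[0] * a[1] - b[1] * a[0]) / len;
}

// Ramer-Douglas-Peucker: keep the endpoints; recursively keep the point
// farthest from the chord if it deviates more than epsilon, else drop
// everything between. Cuts a several-thousand-point precinct boundary
// down to something an SVG can carry without visibly changing the shape.
function simplify(points, epsilon) {
  if (points.length < 3) return points;
  let maxDist = 0, index = 0;
  const last = points.length - 1;
  for (let i = 1; i < last; i++) {
    const d = perpDist(points[i], points[0], points[last]);
    if (d > maxDist) { maxDist = d; index = i; }
  }
  if (maxDist <= epsilon) return [points[0], points[last]];
  const left = simplify(points.slice(0, index + 1), epsilon);
  const right = simplify(points.slice(index), epsilon);
  return left.slice(0, -1).concat(right); // drop the duplicated pivot point
}
```

Epsilon is the whole game: too small and the 20MB problem remains, too large and neighboring precincts stop tiling cleanly along shared borders.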
You could do county-by-county breakdowns to mitigate the file sizes, and that’s exactly what we did for a while, but the problem is you miss the surrounding context.
For example, if you remember Lott’s viral hypothesis, he proposed you could find statistical evidence of fraud by looking *only* at precincts bordering Fulton County (both internal and external). So this is a screencap of the custom SVG we generated.

I know, we’re awesome.
Anyway, this leads me down the mad-scientist path of dynamically altering the SVG files themselves. As in, you use JavaScript to change the points and resolution client-side to generate new analyses. Like meta-querying SVGs.
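A concrete taste of what "changing the resolution client-side" can mean (a sketch under my own assumptions — `quantizePathData` is a hypothetical helper, not anything from the actual pipeline): since SVG path data is just text, you can coarsen every coordinate in place.

```javascript
// Rewrite a path's d attribute so every number is rounded to `decimals`
// places -- a crude client-side resolution knob. In a browser you'd read
// the string with path.getAttribute('d') and write it back via setAttribute.
function quantizePathData(d, decimals = 0) {
  return d.replace(/-?\d+(\.\d+)?/g, n => Number(n).toFixed(decimals));
}
```

Fewer decimal places means shorter strings and faster repaints, which is exactly the lever you want when a precinct carries thousands of points.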

I concluded I have to learn computational geometry...


More from @DataRepublican

13 Jan
It got lost last night, but if you’re interested in the journey toward my ultimate goal of an interactive political geospatial data analytics library, try clicking the buttons of .
We made a lot of good progress in automating the shapefile -> SVG conversion steps which no one has done nearly as well as we have (if I may humblebrag).

Next up is making these same SVGs scalable. The biggest precinct in Columbia County alone has nearly 7000 points.
Why are SVGs so important? Think of them as a bit like OpenGL (graphics) in an XML specification. You can modify and alter them as you wish dynamically client-side, as opposed to the raster + picture layer + pins crap that everyone else (e.g., OpenStreetMap + Google Maps) uses.
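Because an SVG is just XML, "recolor precinct X" is a tree edit, not a re-render. A minimal sketch of that idea (my own illustrative helper — in a browser you’d simply do `document.getElementById(id).setAttribute('fill', color)`; this version does the same edit on the raw markup string so it runs anywhere):

```javascript
// Set the fill of the <path> whose id matches, leaving all other
// elements untouched. Replaces any existing fill attribute.
function setFill(svg, id, color) {
  const re = new RegExp(`(<path[^>]*\\bid="${id}")([^>]*)(/?>)`);
  return svg.replace(re, (m, head, rest, close) => {
    const cleaned = rest.replace(/\s*fill="[^"]*"/, '');
    return `${head} fill="${color}"${cleaned}${close}`;
  });
}
```

Try doing that with a raster tile: you can’t, because the precinct boundary isn’t an addressable object there — it’s baked pixels.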
13 Jan

I’d say the evidence for Christ is more like a bunch of snowflakes creating a snowball.

Let’s start with: You have Paul’s epistles, which were so widespread and whose authenticity few scholars doubt. 1 Corinthians was written circa 53 AD.… - 1 Corinthians 15 talks about Jesus’s resurrection and death, and I’d like to call out this part in particular:
“that Christ died for our sins according to the Scriptures, that he was buried, that he was raised on the third day according to the Scriptures, and that he appeared to Cephas, and then to the Twelve. ...
12 Jan
Any competent programmer can quickly write a script to reset a password and scrape user information. It’s pretty trivial code.

The whole stated intent of the Parler hack was to doxx the rioters.
If the reports of Twilio exposing a vulnerability that could reset passwords are true...

Are we supposed to believe that the hackers simply exercised self-control and didn’t mass-scrape others in an attempt to identify those in the riots? No way.
It is my belief that if the Parler vulnerability exists as it’s been described in other articles... then left-wing groups really are in possession of all user data. They just didn’t publish it so they wouldn’t get in trouble.
12 Jan
“A debate has broken out over whether the once-sacrosanct constitutional protection of the First Amendment has become a threat to democracy.”…

Remember, the @nytimes made a policy change so that effectively only the opinions they endorse get published.
The number of people who suggested that the First Amendment be overturned over the 30+ lives lost in the BLM riots, countless businesses destroyed, and billions of dollars of damage: 0
Selected quotes from the article:

“[Free speech] can — irony runs deep — undermine free expression itself.”

Nice gaslighting.
12 Jan
I have zero tolerance for gaslighting.

The initial article I read about the Parler hack suggested user data was exposed, so in a panic, I tweeted it. Then I read another article that said it was a scrape rather than a hack, and I retracted the tweet.
But, it does seem that there IS a vulnerability exposing user information as a direct consequence of Twilio going down. If so -

This is handwaved as “don’t worry, the hackers are ethical so they didn’t steal any user information.” Or something. Yeah. Sure.
Based on my understanding of how the hack worked, it would have been fairly easy to write a script that just resets passwords and scrapes user information en masse. Did this happen? Who knows?

Parler’s attempts to downplay this as a scrape of public data are also shameful.
12 Jan
Reading up on the Parler dump, this is what I understand happened, please correct any inaccuracies:

1. Twilio, in charge of email authentication, deplatformed Parler before AWS deplatformed them.
2. This meant anyone could access any account simply via the password reset link. We don’t know whose user accounts (and presumably sensitive information) were leaked in this way.
3. An admin account was taken over this way, and APIs were exploited to create more admin accounts programmatically.

4. Admin accounts have the ability to access original media files (with the metadata) as well as deleted posts for users.
