LaurieWired
Dec 26, 2024 · 5 tweets · 2 min read
“My wife complains that open office will never print on Tuesdays”

A bizarre sentence, and one that kicked off one of the most interesting bug hunts in Ubuntu’s history.

It all starts with some goofy pattern matching.
It’s not a bug with the printer, or OpenOffice, or the printer driver.

It’s a mistake in the way the “file” utility parses file signatures.

When printing from OpenOffice, a PostScript file is created that embeds the creation date near the start of the file.
CUPS, the Common Unix Printing System, then uses the file utility as part of its pipeline to determine the type.

But if "Tue" appears at byte 4 (Tuesday's creation date), it's mistakenly identified as an Erlang JAM file, causing the print job to fail.
Essentially, the logic in the file utility for recognizing Erlang JAM files was too broad.

Looking for a single static string position is an extremely fragile detection method.

Simply changing the creation date to “XTue” completely solves the problem!
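To make the failure mode concrete, here is a minimal Python sketch of the kind of fixed-offset string test the file utility's magic database performs. The function name, the header layout, and the sample bytes are illustrative assumptions, not the real magic(5) rule or the real CUPS-spooled job.

```python
# Minimal sketch of a fixed-offset signature check (illustrative only; the
# real rule lives in file's magic(5) database and the real print job passes
# through the full CUPS filter chain).

def looks_like_erlang_jam(data: bytes) -> bool:
    # Fragile heuristic: "does the string 'Tue' start at byte offset 4?"
    return data[4:7] == b"Tue"

# A print job whose creation date happens to land "Tue" at offset 4...
tuesday_job = b"%! (Tue Mar  4 10:15:00 2008) rest of the PostScript job..."
print(looks_like_erlang_jam(tuesday_job))   # True  -> misdetected, job rejected

# ...while prefixing the date ("XTue") shifts the bytes and prints fine.
workaround = b"%! (XTue Mar  4 10:15:00 2008) rest of the PostScript job..."
print(looks_like_erlang_jam(workaround))    # False -> detection falls through
```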
After some back and forth, the logic in the file utility was improved, and the bug was fixed.

It’s a good lesson for us programmers not to blindly dismiss “implausible”-sounding issues.

Next time someone makes an outlandish claim, take a closer look. You might just be onto a bizarre edge case!

Here’s the original bug report:
bugs.launchpad.net/ubuntu/+source…

More from @lauriewired

Jul 21
Windows is one massive (private) Git repo.

When I was at MS, the Windows source had ~3k PRs a day!

Regular Git didn’t scale to those levels at the time.

Internally there was a progression from Git -> GVFS -> Scalar -> merge back to Git. Here’s how it worked:
Cloning Defender’s antivirus repositories alone would take a full day!

You may be wondering how we dealt with merge conflicts.

Teams were heavily siloed, with most of the effort put up front on the build process.

Syntax and such were checked with local builds before pushing.
Once a PR was made, a ton of automated checks would start.

For something small like a virus sig, it would make sure you didn’t break the Defender engine ;)

Of course, PRs were hand-reviewed, but honestly, the build process was so robust it caught just about anything.
Jul 14
Microsoft took ~30 years to become compliant with the C++ standard!

Seriously. From 1993 to 2020, MSVC’s preprocessor wasn’t feature-complete.

Code that compiles perfectly on Linux often broke.

Hold your judgement; there’s some interesting historical nuance:
A tech race in the 80s led to unfixable debt.

An official C++ standard wouldn’t exist until 1998.

MS engineers made “best guesses”, but they were competing with others (Borland, Watcom) for the C++ compiler market.

"We'll clean it up after capturing market share". Image
Popularity became their downfall.

Windows’ business model was dependent on legacy compatibility.

The mantra of “never break old code” effectively tied the compiler team’s hands.

Those early “ship it quick, good enough” preprocessor decisions? Kind of a problem...
Jul 10
There’s a cursed C++ competition where programmers try to create the largest possible error message.

Finalists created ~1.5GB of error messages from just 256 bytes of source.

Preprocessor exploits were so easy, they had to create a separate division! Here are my favorites:
One contestant experimented with C-Reduce, a way to search for C++ programs that create unusual compiler behavior.

Maximizing error output as the fitness function: no templates, no preprocessor hacks.

Just nested parentheses causing exponential error output!
For the “precision” subcategory, the aim is to create exactly pi megabytes (pi*1024*1024 bytes) of error.

Source had to be <256 bytes.

One anonymous entry hit it spot-on with just ~230 bytes of code.
Jul 9
What happens when you freeze a process *perfectly*? RAM, VRAM, network, everything.

Imagine:
- Live migrations of LLM training jobs
- Time-travel debugging
- Surgical repairs of a crash moments before a segfault

It’s called CRIU, and it’s already here.
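For a sense of what freezing a process looks like in practice, here is a rough sketch of driving CRIU's command-line tool from Python. The PID, the image directory, and the idea of restoring elsewhere are placeholders of mine; a real live migration involves far more plumbing (and root privileges).

```python
# Rough sketch of checkpoint/restore with the criu CLI (run as root).
# The target PID and image directory below are hypothetical placeholders.

import os
import subprocess

PID = 12345               # hypothetical process to freeze
IMG_DIR = "/tmp/ckpt"     # where CRIU writes its image files

os.makedirs(IMG_DIR, exist_ok=True)

# Freeze the process and dump its full state (memory, fds, etc.) to disk,
# leaving the original running.
subprocess.run(
    ["criu", "dump", "-t", str(PID), "-D", IMG_DIR,
     "--shell-job", "--leave-running"],
    check=True,
)

# Later (or on another machine after copying IMG_DIR over), bring it back
# exactly where it left off.
subprocess.run(
    ["criu", "restore", "-D", IMG_DIR, "--shell-job"],
    check=True,
)
```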
It starts with a mad Russian.

At least, that’s what a lead Linux kernel dev called it:

“a project by various mad Russians to perform c/r mainly from userspace, with various oddball helper code...I’m less confident than the developers that it will all eventually work”
Despite the criticism, Pavel Emelyanov, head of the OpenVZ kernel team, pushed on.

By 2012, Linus Torvalds merged the first wave of patches into Linux.

Previous attempts “failed miserably”, mostly due to insane complexity.

The key lay in parasitic code injection.
Jul 7
Humans live at 10 bits per second.

The brain takes in ~11 million bits per second of sensory data, yet the inner conscious workspace is massively compressed.

Most people speak at ~40 b/s. How can we speak faster than we can think?

It's all about error correction:
Speech may exceed the cognitive speed limit, but most of the bits are redundant.

Language is designed to withstand noise and mishearing. The 40 b/s “raw rate” is largely predictable from context.

A Caltech study shows the effective payload collapses to <13 b/s once that redundancy is stripped.
What about typing?

It also maxes out right at 10 bits per second, and that’s at 120 wpm!
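As a back-of-the-envelope check (my own arithmetic, layering the standard 5-characters-per-word convention and Shannon's rough ~1 bit per character estimate for English text on top of the 120 wpm figure):

```python
# Rough sanity check of the "typing tops out near 10 b/s" figure.
# The 5 chars/word convention and the ~1 bit/char entropy estimate for
# English are assumptions, not numbers from the thread.

wpm = 120                    # fast typist
chars_per_word = 5           # standard WPM convention
bits_per_char = 1.0          # Shannon-style estimate for English text

chars_per_sec = wpm * chars_per_word / 60        # 10 characters per second
info_rate = chars_per_sec * bits_per_char        # ~10 bits per second
print(f"{chars_per_sec:.0f} chars/s ≈ {info_rate:.0f} b/s of information")
```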

Even StarCraft e-sports pros measured out to, you guessed it, ~10 b/s of information output during a match.
Jul 5
Whole-home lithium power used to be a rich man’s game.

Now it’s “high-end graphics card” territory.

This is a $2,500 lithium-polymer battery that would power an entire US residential house for >24 hours.

China is *crushing* it on kilowatt-hours per dollar.
Let’s put it into perspective. That battery is 2x the kWh of a Tesla Powerwall 3.

Each Powerwall will set you back $15k apiece.

Residential battery setups usually cost $1000 per kWh.

This is $80 per kWh.
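Running the thread's own numbers (the 13.5 kWh Powerwall 3 rating is my assumption; everything else is quoted above):

```python
# Quick cost-per-kWh comparison using the figures quoted in the thread.
# The 13.5 kWh Powerwall 3 capacity is an assumed spec, not stated above.

pack_price_usd = 2500
pack_usd_per_kwh = 80
implied_pack_kwh = pack_price_usd / pack_usd_per_kwh          # ~31 kWh of storage

powerwall_price_usd = 15_000
powerwall_kwh = 13.5                                           # assumed rating
powerwall_usd_per_kwh = powerwall_price_usd / powerwall_kwh    # ~$1,100/kWh

print(f"imported pack: ~{implied_pack_kwh:.0f} kWh at ${pack_usd_per_kwh}/kWh")
print(f"Powerwall 3:   {powerwall_kwh} kWh at ~${powerwall_usd_per_kwh:,.0f}/kWh")
```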
China’s selling these near the predicted theoretical limits.

Domestic brands are a cool 10x more.

Do you realize what possibilities this opens up?

Instant micro-grids. 3 days of offline power for $10k. Crazy-durable power resiliency.
