LaurieWired
Dec 9, 2024
Shutting down your PC before 1995 was kind of brutal.

You saved your work, let the buffers flush, waited for the HDD light to switch off, and

*yoink*

You flicked the mechanical switch, directly interrupting the flow of power.

The interesting part is when this all changed.
Two major developments had to occur.

First, a standardized physical connection linking the power supply to the motherboard. (Hardware constraint)

Second, a universal driver mechanism for requesting changes to the power state. (Software constraint)
These, respectively, became known as the ATX and APM Standards.

Although it would have been possible much earlier, industry fragmentation in the PC market between Microsoft, IBM, Intel, and others stalled progress.

By 1995, the industry had started to consolidate.
Eventually, OS-level control of the system's power state became widespread. And for good reason!

Caches, more complex filesystems, and multitasking all increased the risk of data corruption during an "unclean" shutdown.
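As a taste of where this eventually landed: a minimal sketch of an OS-mediated power-off on modern Linux (a descendant of the same idea; APM itself was a BIOS interface, not this syscall). It needs root, and RB_POWER_OFF comes from <sys/reboot.h>:

#include <stdio.h>
#include <unistd.h>
#include <sys/reboot.h>

int main(void) {
    sync();                          /* flush dirty buffers to disk first */
    if (reboot(RB_POWER_OFF) < 0) {  /* ask the kernel to cut power */
        perror("reboot");
        return 1;
    }
    return 0;                        /* unreachable if power-off succeeds */
}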

The APM standard was later replaced by ACPI, but it's an interesting tidbit of computer history nonetheless.

If you'd like to read more about the APM vs ACPI debate, check out this writeup by MJG59.

Why ACPI?:
mjg59.dreamwidth.org/68350.html

More from @lauriewired

Oct 17
You’re (probably) measuring application performance wrong.

Humans have a strong bias for throughput.

"I can handle X requests per second."

Real capacity engineers use response-time curves.
It all comes down to queueing theory.

Unfortunately, computers don’t degrade gracefully under load.

70% CPU is smooth sailing. 95% is a nightmare.

Programmers (incorrectly) focus on the absolute value, when really they should be looking at the derivative.
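To make the shape of that curve concrete, here's a toy M/M/1 queueing model (my illustration, not the thread's data): response time R = S / (1 - ρ), where S is the idle service time and ρ is utilization.

#include <stdio.h>

int main(void) {
    const double service_ms = 10.0;  /* per-request time on an idle box */
    const double rho[] = { 0.50, 0.70, 0.80, 0.90, 0.95, 0.99 };
    for (int i = 0; i < 6; i++) {
        /* M/M/1 mean response time: R = S / (1 - utilization) */
        printf("utilization %2.0f%% -> response %6.1f ms\n",
               rho[i] * 100.0, service_ms / (1.0 - rho[i]));
    }
    return 0;
}

At 70% utilization you pay about 3x the idle latency; at 95% it's 20x. Throughput barely moved, but the user experience fell off a cliff.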
Highways are the perfect real-life example of this.

Traffic engineers study flow density, not overall vehicle counts.

A road handling 10,000 cars per hour (throughput) means nothing if the average speed drops to 5 mph (response time).

Computers are the same.
Oct 14
GPU computing before CUDA was *weird*.

Memory primitives were graphics-shaped, not computer-science-shaped.

Want to do math on an array? Store it as an RGBA texture.

Run a fragment shader for the processing. *Paint* the result into a big rectangle.
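A CPU-side caricature of that model (my sketch, not real GPU code): the array lives as RGBA texels, and the "fragment shader" is a function invoked once per texel. On real hardware you'd upload the data with glTexImage2D, draw a full-screen quad, and read the answer back with glReadPixels.

#include <stdio.h>

typedef struct { float r, g, b, a; } Texel;  /* 4 floats per "pixel" */

/* The "fragment shader": squares each channel of one texel. */
static Texel shade(Texel in) {
    Texel out = { in.r * in.r, in.g * in.g, in.b * in.b, in.a * in.a };
    return out;
}

int main(void) {
    /* An 8-element float array, packed as a 2x1 RGBA "texture". */
    Texel tex[2] = { {1, 2, 3, 4}, {5, 6, 7, 8} };

    /* "Drawing the quad": the shader runs independently on every texel. */
    for (int i = 0; i < 2; i++)
        tex[i] = shade(tex[i]);

    for (int i = 0; i < 2; i++)
        printf("%g %g %g %g\n", tex[i].r, tex[i].g, tex[i].b, tex[i].a);
    return 0;
}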
As you hit the more theoretical sides of Computer Science, you start to realize almost *anything* can produce useful compute.

You just have to get creative with how it’s stored.

The math might be stored in a weird box, but the representation is still valid.
BrookGPU (Stanford) is widely considered the birthplace of pre-CUDA GPGPU.

By virtualizing CPU-style primitives, it hid a lot of the graphical “weirdness”.

By extending C with stream, kernel, and reduction constructs, Brook made the GPU act more like a co-processor.
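Roughly the shape of that model in plain C (not actual Brook syntax, which extended C with stream declarations like float a<100>): a kernel maps over streams, a reduction folds a stream to a scalar, and the compiler lowers both onto the texture/shader plumbing above.

#include <stdio.h>

#define N 4

/* "kernel": runs once per stream element */
static float saxpy(float a, float x, float y) { return a * x + y; }

/* "reduction": folds a stream down to one value */
static float sum(const float *s, int n) {
    float acc = 0.0f;
    for (int i = 0; i < n; i++) acc += s[i];
    return acc;
}

int main(void) {
    float x[N] = {1, 2, 3, 4}, y[N] = {10, 20, 30, 40}, out[N];
    for (int i = 0; i < N; i++)          /* map the kernel over streams */
        out[i] = saxpy(2.0f, x[i], y[i]);
    printf("sum = %g\n", sum(out, N));   /* 2*(1+2+3+4) + 100 = 120 */
    return 0;
}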
Oct 13
Colleges do a terrible job of teaching C++.

It’s not “C with Classes”. Injected into curricula as a demonstration of early CS concepts, it leaves many with a sour taste.

Students then immediately fall in love with the first language that *doesn’t* feel that way.
Admittedly, professors are in a tough spot.

To teach the concept, you fundamentally have to constrain the scope of the language. Many schools choose C++ out of practicality.

Controversially, I think toy languages that *aren't* industry standards are better suited for this.
Imagine learning the fundamentals of carpentry, but for teaching reasons, an otherwise reputable brand is artificially constrained to hand tools.

Of course, the moment a student jumps into the real world and experiences their first power tool, it blows their mind!
Oct 3
DDR5 is unstable garbage.

Max out your memory channels? Flaky.
Temperature a bit too hot? Silent throttle with no logs.
Too “dense” a stick? Good luck training.

Last gen was rock-solid by comparison. Here's what happened.
More than ever, manufacturers have been pushing memory to the absolute limits.

JEDEC, the standards committee, is pretty conservative.

Yet the moment DDR5 launched, everyone threw JEDEC out the window.

Intel + AMD's memory controllers were *not* ready to handle it.
DDR5-4800 was the baseline.

Day one kits were pushing 6000+. Today, even 8000+.

On-die error correction is masking chips that would have been binned as trash in the DDR4 era.

The gap between JEDEC spec and retail has never been wider.
Oct 2
Virtual Machines render fonts. It’s kind of insane.

TrueType has its own instruction set, memory stack, and function calls.

You can debug it like assembly. It's also exploitable.
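To show how literal the "debug it like assembly" part is, here's a toy interpreter for two real TrueType opcodes (my sketch; a real hinting engine adds a CVT, a storage area, function definitions, and F26Dot6 fixed-point math):

#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* PUSHB[0] 7, PUSHB[0] 5, ADD -> leaves 12 on the stack */
    const uint8_t program[] = { 0xB0, 7, 0xB0, 5, 0x60 };
    int32_t stack[32];
    int sp = 0;

    for (size_t pc = 0; pc < sizeof program; pc++) {
        uint8_t op = program[pc];
        if (op == 0xB0) {              /* PUSHB[0]: push one byte */
            stack[sp++] = program[++pc];
        } else if (op == 0x60) {       /* ADD: pop two, push the sum */
            int32_t n1 = stack[--sp], n2 = stack[--sp];
            stack[sp++] = n2 + n1;
        }
    }
    printf("top of stack: %d\n", stack[sp - 1]);  /* prints 12 */
    return 0;
}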
Anytime you can run code (albeit very limited code), someone will take advantage of it.

TrueType (TT) is unfortunately famous for many Windows kernel zero-days.

TT has bounded memory, and is therefore not Turing-complete…but you can still do crazy things with it.
Fontemon is a fun one: a Pokémon-style game packaged as a TTF.

llama.ttf is even more insane. A 60MB font that runs a 15M-parameter llama model to generate stories.

It seems normal at first, but type excessive exclamation points and it starts generating text!
Oct 1
This processor doesn’t (officially) exist.

Pre-production Engineering Samples sometimes make it into the grey market.

Rarer still are Employee Loaner Chips. Ghosts abandoned before ever becoming products.
A few days ago, someone found an Intel Pentium Extreme 980.

No laser etched model number; just some scribbled sharpie.

In 2004, Intel (very publicly) canceled the 4GHz Pentium 4…yet here it is.

It's a hint at some internal politics.
The Pentium group was all-in on single core performance.

In the early 2000s, Intel advertised wild charts expecting to hit 10GHz.

Meanwhile, the Core 2 Duo team was the backup plan.

An underdog team in Haifa, focused on laptops.
