So sometime between 1987 and whenever DVI/HDMI/DisplayPort took over, you went to plug in your VGA cable. This was an awesome way to get video, because it's analog.
You'd think analog would be the bad thing and digital would be better, but there are very good reasons to want your video to be analog: it means you only ever need 6 pins for INFINITE COLORS
See back in the CGA days you'd have 4 color pins:
Red, Green, Blue, and Intensity. Each of those is digital, so it can be on or off. That's 2⁴ = 16 combinations, so you max out at 16 possible colors.
EGA switched the pins around and gave it 6 color pins:
R, G, B, and RI, GI, BI (an intensity bit per channel). Still binary, so that's 2⁶ = 64 possible colors.
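The pin-count math here is just powers of two: each binary pin doubles the number of combinations. A quick sketch:

```python
# With N binary (on/off) color pins, each pin doubles the
# number of distinguishable combinations, so it's just 2**N.
def digital_colors(pins: int) -> int:
    return 2 ** pins

print(digital_colors(4))  # CGA: R, G, B, Intensity -> 16
print(digital_colors(6))  # EGA: R, G, B, RI, GI, BI -> 64
```

Which is exactly why the connector kept needing more pins: every doubling of the palette cost another wire.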
But are we going to have to keep changing the connector every time we make the graphics card slightly better? Isn't there a better way?
It turns out there is! It's VGA! And VGA really only uses 3 pins for color.
Red, Green, Blue. For signal-crosstalk reasons each one has its own separate ground, but those 3 pins give you nearly-infinite colors, because they're analog.
So for VGA instead of it being binary and only being able to be ON or OFF, it's analog. It's a voltage between 0 and 0.7, and there's one for each color channel.
So if you have a video card that can only do two colors, black and white, it sends 0.7/0.7/0.7 volts for white, and 0/0/0 for black.
If you wanted to add grey, you'd do 0.35/0.35/0.35 volt.
And you can keep subdividing this as far as you want, really.
So if you have a video card that can only do 8 colors? Great, it can use VGA.
256 colors? Great, VGA is fine with it.
16 million colors? VGA DON'T CARE.
Even more? YEP
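That subdividing can be sketched as a simple formula: a card with B bits per channel maps each digital level onto the 0 to 0.7 V range. (Assuming a linear mapping here, which real RAMDACs approximate.)

```python
# VGA color channels are analog: 0 V = black, 0.7 V = full intensity.
# A card with B bits per channel divides that range into 2**B levels.
# Linear mapping assumed for illustration.
def channel_voltage(level: int, bits: int) -> float:
    return 0.7 * level / (2 ** bits - 1)

print(channel_voltage(1, 1))      # 1-bit card: full white = 0.7 V
print(channel_voltage(127, 8))    # 8-bit card: mid-grey, ~0.349 V
```

The monitor never knows or cares how many bits the card has; it just sees voltages.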
So the same basic monitors could go from day zero of VGA (when it maxed out at 256 colors) to the tail end when DVI/HDMI/DisplayPort were taking over. Color wise, it's fine.
(Resolution/refresh rates would be lacking, but colors? no worries)
But here's the interesting part I wanted to talk about:
So, you've got a fancy windows 95 computer and it's got this "Plug & Play Monitor" option.
It means the computer can figure out what the monitor supports, and you don't need a driver.
But how does it work?
VGA is analog! and it's going from the computer to the monitor, not the other way.
So how does the computer know what the monitor supports?
And it turns out the answer is, like so much in computing, clever, complicated, slightly mad, and has a lot of history.
So the original answer was that VGA actually had 4 pins set up to go the other way (monitor to PC), digitally.
That's plenty to send data over! so what, they just put a little CPU in the monitor and used some serial protocol?

Nope. See, it's 1987. Even a tiny 8-bit microcontroller is going to be relatively expensive. So instead it's a bitmask.
Some pins are tied to ground, some are left to float. The computer can look up that combination in a small table and find out the basic specs of the monitor.
And here's that table.
All it really tells you is "is this monitor connected? is it color? and does it support 1024x768?"
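The mechanism is simple enough to sketch: the monitor passively "answers" by tying some ID pins to ground (read as 0) and leaving others floating (pulled up, read as 1), and the host looks the pattern up in a table. The specific bit patterns below are illustrative, not the actual IBM table:

```python
# Monitor ID via passive pin strapping: grounded pin reads 0,
# floating pin reads 1. The combinations here are hypothetical
# placeholders, not the real IBM VGA ID assignments.
ID_TABLE = {
    (1, 1, 1): "no monitor connected",
    (1, 0, 1): "monochrome monitor",
    (0, 1, 1): "color monitor",
    (0, 1, 0): "color monitor, 1024x768 capable",
}

def identify(id0: int, id1: int, id2: int) -> str:
    return ID_TABLE.get((id0, id1, id2), "unknown monitor type")

print(identify(0, 1, 1))  # -> "color monitor"
```

No firmware in the monitor at all: the "data" is literally which wires are soldered to ground.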
And obviously that's gonna be very limited, even if it worked fine in 1987. Eventually someone's gonna ask "Does this monitor support 75 Hz refresh?" or "Does it support 1280x1024?"
So it had to be extended.
IBM developed a complicated scheme for the PS/2's XGA system, where the computer would use those 4 pins while adjusting the H and V synchronization signals as if they were binary outputs.
And then in 1994, DDC was adopted.
The first version worked by having a ROM of some sort that constantly dumps 128 bytes of data over ID pin 1, synchronized with the refresh (so the computer can tell when it starts and ends)
This goes along with EDID, the Extended Display Identification Data format. It explains how to interpret those 128 bytes.
Fun fact: It includes the year and week your monitor was made!
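Those 128 bytes have a fixed layout in the EDID 1.x spec: an 8-byte header, a packed 3-letter manufacturer ID, the week and year of manufacture (year stored as an offset from 1990), and a final byte chosen so all 128 bytes sum to 0 mod 256. A sketch of pulling those fields out, using a synthetic EDID block built just for the example:

```python
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def parse_edid(edid: bytes):
    """Extract a few fields from a 128-byte EDID 1.x block."""
    assert len(edid) == 128 and edid[:8] == EDID_HEADER
    assert sum(edid) % 256 == 0, "bad checksum"   # byte 127 balances the sum
    # Manufacturer ID: three letters packed as 5-bit values ('A' = 1)
    mfg = (edid[8] << 8) | edid[9]
    name = "".join(chr(((mfg >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
    week = edid[16]               # week of manufacture
    year = 1990 + edid[17]        # year stored as offset from 1990
    return name, week, year

# Build a minimal synthetic EDID just to exercise the parser
edid = bytearray(128)
edid[:8] = EDID_HEADER
edid[8], edid[9] = 0x4C, 0x2D    # "SAM" packed: S=19, A=1, M=13
edid[16], edid[17] = 32, 7       # week 32 of 1997
edid[127] = (256 - sum(edid) % 256) % 256
print(parse_edid(bytes(edid)))   # -> ('SAM', 32, 1997)
```

So your OS really can tell you your monitor's birthday, straight off the wire.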
But this is obviously also limited: There's only so much you can stuff into 128 bytes and it's still unidirectional and what if you have a monitor that can support a whole ton of different resolutions and such? We need more!
So DDC2B redesigned it to work on an I²C bus.
Now the ID1 pin is a bidirectional data line, the ID3 pin is a clock, and pin 9 provides 5 V of power (so the data can be read even when the monitor is off).
Adding 5 V on pin 9 caused a bit of a compatibility problem.
See, originally that pin didn't exist. The VGA cable end looked like this:
But switching to version where that pin existed usually didn't cause a problem. Graphics cards tended to use a connector that had a hole there anyway, and just didn't connect it to anything.
But sometimes you'd get unlucky and find a VGA card that did bother filling in the hole, and made your cable unconnectable.
And in 1998 DDC/CI was introduced, which provides a way for computers to send commands (other than "read EDID") to the monitor, and request info from sensors built into the monitor.
This added neat features like letting the computer control the brightness/contrast, backlight (for LCDs, obviously), and detect orientation changes.
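Those brightness/contrast controls ride over I²C as small framed messages. Per my reading of the DDC/CI spec (this is a sketch, not a reference implementation): the host writes to the display at I²C address 0x37 (0x6E with the write bit), the payload carries an opcode plus a VCP feature code from MCCS (0x10 is brightness), and the last byte is an XOR checksum over everything including the destination address:

```python
# Sketch of a DDC/CI "Set VCP Feature" message; treat the framing
# details as an approximation of the spec, not gospel.
DDC_CI_ADDR = 0x6E  # I2C address 0x37, shifted left for the write bit

def set_vcp_message(vcp_code: int, value: int) -> bytes:
    payload = bytes([
        0x51,                 # source address (the host)
        0x84,                 # 0x80 | length of the 4 data bytes below
        0x03,                 # opcode: Set VCP Feature
        vcp_code,             # e.g. 0x10 = brightness in MCCS
        (value >> 8) & 0xFF,  # value, high byte
        value & 0xFF,         # value, low byte
    ])
    chk = DDC_CI_ADDR
    for b in payload:
        chk ^= b              # checksum: XOR of all bytes incl. dest addr
    return payload + bytes([chk])

msg = set_vcp_message(0x10, 50)   # "set brightness to 50"
print(msg.hex())
```

Tools like ddcutil on Linux speak exactly this kind of traffic to your monitor today.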
And guess what: DDC didn't die with VGA.
It turns out DVI dedicates 3 pins to DDC, so the same standard can be used over the new connector type.
And oh look, so does HDMI.
But here's the part I find really interesting.
So the most commonly used type of DDC on VGA connectors was DDC2B+/DDC2Bi, which is a scaled down version of DDC2Ab.

And to explain DDC2Ab, we have to explain USB.
So in the old days you tended to have a computer where the back would have a bunch of tiny ports all with little labels on them that explicitly told you what they did.
This one is keyboard, this one is mouse, this one is modem, this one is printer, this one is etc etc.
And this was annoying because it meant you had to make sure they were all connected correctly and what if you bought a scanner? where did you plug it in?
Your computer didn't have a "scanner" plug!
So serial ports got used for a lot of things. Serial is a great way to transfer data but it has no metadata. There's no way for the computer to know if you plugged in a modem, mouse, scanner, printer, or joystick.
Which means you have to carefully configure them and install drivers and if you get it wrong, stuff breaks.

So the idea of having a connector that could do a lot AND provide some kind of "here's what I am" metadata has always been a dream.
And today we have USB, which is neat and does that very well.
You plug a USB device into a USB port and the computer can figure out what it is and power it and hopefully it's all good. It's way simpler.
But USB was far from the only protocol designed to do this. It's just the one that won.
Another one was the Apple Desktop Bus, introduced with the Apple IIgs in 1986.
It provided 5 V of power and allowed up to 16 devices to be daisy-chained together, and the OS could ask each device what it was.
Another one was (is, really) IEEE-1394 aka Firewire.
It was also designed in 1986 and allowed you to connect up to 63 devices, and similarly provided metadata info so the computer could tell what you plugged in.
And I'm sure I don't need to explain USB. You're probably using it right now, for something.

But there's at least one more, one that had a lot of momentum at first and never really caught on.
And that's ACCESS.bus.
It started as a Philips project, but they created the ACCESS.bus Industry Group (ABIG) in 1993.
Microsoft, NEC, and Digital got involved.
It was a protocol supporting 10 kbit/s and 100 kbit/s transfers, with up to 125 devices, and included 5 V power.
It was designed for all sorts of devices, with many different device classes, much like USB.
But it never really caught on. Most companies switched over to USB when it began being worked on in 1994, leaving ACCESS.bus to really only get used in two places.
DEC keyboards...
and in the DDC2Ab standard.
So DDC2Ab exploited the fact that both ACCESS.bus and DDC2 are I²C buses to route ACCESS.bus traffic over DDC2's 4 pins inside the VGA connector.

So the idea was that your monitor would effectively act as an ACCESS.bus hub.
So you'd just plug in your monitor to your PC, plug your keyboard and mouse into the monitor, and you're off and ready to go.
And NEC actually built some monitors like this!
So that's the interesting thing about the DDC2 interface still supported over HDMI/DVI today: it was, in part, originally designed to carry ACCESS.bus connections, a failed universal peripheral bus.
But since DDC2B+ was designed as a cheaper cut-down version of DDC2Ab for monitors that didn't have ACCESS.bus support, DDC ended up continuing to be used.

And it's still in use today, though in a very different form.
I forgot to mention one place it's still in use today:
DisplayPort has an optional Dual-Mode (DP++) which allows passive adapters to convert to DVI/HDMI. One of the things it supports? DDC.
Because while DisplayPort doesn't natively use DDC for configuration info, DVI and HDMI do, so a dual-mode port has to be able to send & receive DDC traffic to work properly.
Also sorry Atari geeks, for not mentioning SIO.
It was a 1978 protocol that allowed daisy chaining up to 256 devices, complete with device-metadata features to make it plug-n-play like USB.
And here's a picture I took back in 2016 of an ACCESS.bus device from NEC that I found at Weird Stuff Warehouse (RIP)
And @TubeTimeUS wants me to mention 9-pin VGA.
It was an early version of VGA only used on a few things (like the Acorn Archimedes)
It's basically the same as 15-pin VGA, but with no extra pins for ID.
@TubeTimeUS The big downside of this connector (besides the lack of ID) is that it's the same connector as MDA, CGA, and EGA, all of which have different pinouts. They're not gonna work together and you better pray nothing fries if you plug the wrong ones together.
@TubeTimeUS also if you made it this far in my very long thread of rambling, you can contribute to me continuing this sort of thing by supporting my patreon: