Misha
Blockchain, AI, & the technologies remaking the future | Host at How to Win in the Age of AI

Oct 2, 2022, 32 tweets

Nearly 7 billion people use digital technology.

Fewer than 1% understand it.

Here's a thread that will make you SMARTER (starting today):

THE FIRST ALGORITHM:

1843 - Ada Lovelace, a British mathematician, realizes that computers can be much more than big calculators.

She publishes the first complex computer algorithm.

lynx-open-ed.org/OERs/Lovelace-…

Her algorithm is laid out in Note G of her translation of Luigi Menabrea's "Sketch of the Analytical Engine Invented by Charles Babbage," published with her extensive notes.

It describes how the Analytical Engine could calculate the Bernoulli numbers by using a recursive algorithm.
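Here is a modern Python sketch of the same computation, using the standard Bernoulli recurrence (illustrative only; Lovelace's program was written as a table of Engine operations):

```python
# Bernoulli numbers via the recurrence B_m = -1/(m+1) * sum_{j<m} C(m+1, j) * B_j
from fractions import Fraction
from functools import lru_cache
from math import comb

@lru_cache(maxsize=None)
def bernoulli(m: int) -> Fraction:
    if m == 0:
        return Fraction(1)
    return Fraction(-1, m + 1) * sum(
        comb(m + 1, j) * bernoulli(j) for j in range(m)
    )

print([bernoulli(n) for n in range(8)])
# [1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0]
```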

THE FIRST DATA PROCESSOR:

1888 - Herman Hollerith invents a tabulating machine to help process data for the 1890 U.S. Census.

It is an electromechanical machine that can summarize information stored on punched cards.

Hollerith then starts a company that later becomes IBM.
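In modern terms, the tabulator's job was tallying: count field values across stacks of punched cards. A toy Python sketch with made-up census records:

```python
from collections import Counter

# Each "card" is one census record; the data here is invented for illustration.
cards = [
    {"state": "NY", "occupation": "farmer"},
    {"state": "NY", "occupation": "clerk"},
    {"state": "OH", "occupation": "farmer"},
]

by_state = Counter(card["state"] for card in cards)
by_occupation = Counter(card["occupation"] for card in cards)
print(by_state)        # Counter({'NY': 2, 'OH': 1})
print(by_occupation)   # Counter({'farmer': 2, 'clerk': 1})
```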

1936 - Alan Turing conceives of the Logical Computing Machine.

The Turing machine is a mathematical model of the modern computers we all use today.
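A Turing machine is just a tape, a read/write head, and a table of rules. Here is a minimal Python simulator (the machine below is my own toy example) that adds 1 to a binary number:

```python
def run(tape, table, state="scan"):
    tape = list(tape)
    pos = len(tape) - 1                    # head starts at the rightmost cell
    while state != "halt":
        if pos < 0:                        # grow the tape on demand
            tape.insert(0, "0")
            pos = 0
        write, move, state = table[(state, tape[pos])]
        tape[pos] = write                  # write a symbol...
        pos += 1 if move == "R" else -1    # ...then move the head
    return "".join(tape)

# Transition table: (state, symbol) -> (symbol to write, move, next state)
increment = {
    ("scan", "1"): ("0", "L", "scan"),     # 1 plus carry is 0, keep carrying left
    ("scan", "0"): ("1", "L", "halt"),     # 0 plus carry is 1, done
}

print(run("011", increment))               # 3 + 1 -> "100"
```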

THE FIRST COMPUTER:

1945 - ENIAC [Electronic Numerical Integrator and Computer] is the first programmable, electronic, general-purpose digital computer.

It's designed to calculate artillery firing tables for the United States Army.

One ENIAC can do the work of 2,400 human "computers."

THE FIRST PROGRAMMER:

1945 - Grace Hopper writes a 500-page Manual of Operations for the Automatic Sequence Controlled Calculator, the Harvard Mark I.

She outlines the fundamental operating principles of computing machines.

And popularizes the word “bug” to describe a computer malfunction.

THE FIRST TRANSISTOR:

1947 - Bell Laboratories invents the first transistor, a semiconductor device used to amplify or switch electrical signals and power.

It's one of the basic building blocks of modern electronics and allows for more advanced digital computers.
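One way to see why: treat the transistor as a controllable on/off switch; wire switches together and you get logic gates, and from gates, whole computers. A toy boolean sketch (not a circuit model):

```python
def nand(a: bool, b: bool) -> bool:
    # Two transistor "switches" in series: output goes low only when both are on.
    return not (a and b)

# NAND is universal: every other logic gate can be built from it.
def NOT(a):    return nand(a, a)
def AND(a, b): return NOT(nand(a, b))
def OR(a, b):  return nand(NOT(a), NOT(b))

print(AND(True, False), OR(True, False))   # False True
```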

THE MICROCHIP:

1959 - The microchip is invented. Smaller than your fingernail, it contains computer circuitry called an integrated circuit.

The integrated circuit is one of the most important innovations of mankind.

Almost all modern products use chip technology.

PROGRAMMING:

1964 - BASIC (Beginners' All-purpose Symbolic Instruction Code) is created by John G. Kemeny and Thomas E. Kurtz.

It is a general-purpose, high-level programming language designed to let students in non-scientific fields use computers.
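To get a feel for what an interpreter for a language like this does, here is a deliberately tiny, hypothetical BASIC-flavored interpreter in Python (a sketch only; real BASIC had INPUT, IF, FOR, GOSUB, and much more):

```python
def run_basic(source):
    # Parse "10 PRINT N" style lines into {line number: statement}.
    program = {}
    for line in source.strip().splitlines():
        num, stmt = line.split(None, 1)
        program[int(num)] = stmt
    order = sorted(program)
    env = {}                               # variable bindings
    i = 0                                  # index into the sorted line numbers
    while i < len(order):
        stmt = program[order[i]]
        if stmt.startswith("LET"):
            name, expr = stmt[3:].split("=", 1)
            env[name.strip()] = eval(expr, {}, env)
        elif stmt.startswith("PRINT"):
            print(eval(stmt[5:], {}, env))
        elif stmt.startswith("GOTO"):
            i = order.index(int(stmt[4:]))
            continue
        elif stmt == "END":
            break
        i += 1

run_basic("""
10 LET N = 7
20 PRINT N * N
30 END
""")                                       # prints 49
```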

THE MICROPROCESSOR:

1968 - Intel is founded by Gordon Moore and Robert Noyce.

In 1971, they create the world's first commercial microprocessor chip - the Intel 4004.

The chip and its successors revolutionize electronics and make Intel a household name around the world.

THE PERSONAL COMPUTER:

1975 - Ed Roberts creates the Altair 8800.

It uses Intel's 8080 microprocessor and is revolutionary for the time.

The first units ship with a maximum of 8 kilobytes of memory, a 2 MHz CPU clock speed, and initially no disk storage.

1975 - Bill Gates and Paul Allen found Microsoft.

They release a BASIC interpreter for the Altair 8800, giving it its first programming language.

Now anyone with an Altair can write their own programs, a real breakthrough at the time.

Microsoft BASIC becomes the foundation software product of the Microsoft company.

It evolves into a line of BASIC interpreters and compilers adapted for many different microcomputers.

1976 - Steve Jobs and Steve Wozniak found Apple Computer Company and release the Apple I.

It is an 8-bit desktop computer designed by Wozniak and is the first Apple product.

The Apple I goes on sale in July 1976 at a price of US$666.66.

1980 - Microsoft licenses QDOS (Quick and Dirty Operating System), written by Tim Paterson at Seattle Computer Products, and later buys it outright for $50K.

QDOS is tweaked to become MS-DOS, which is licensed to IBM and becomes the dominant operating system of the 1980s.

1981 - Xerox releases the Xerox Star, which uses a GUI, or graphical user interface.

It consists of graphical elements such as windows, menus, radio buttons, and check boxes.

Apple, IBM and Microsoft later borrow many Xerox ideas to use in their own products.
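Those elements are still the vocabulary of every GUI toolkit today. A minimal sketch with Python's built-in Tkinter (illustrative; it needs a desktop session to run):

```python
import tkinter as tk

root = tk.Tk()                      # the window
root.title("GUI building blocks")

menubar = tk.Menu(root)             # the menu bar
filemenu = tk.Menu(menubar, tearoff=0)
filemenu.add_command(label="Quit", command=root.destroy)
menubar.add_cascade(label="File", menu=filemenu)
root.config(menu=menubar)

choice = tk.StringVar(value="a")    # radio buttons: pick exactly one
tk.Radiobutton(root, text="Option A", variable=choice, value="a").pack(anchor="w")
tk.Radiobutton(root, text="Option B", variable=choice, value="b").pack(anchor="w")

checked = tk.BooleanVar()           # check box: toggle on or off
tk.Checkbutton(root, text="Enable feature", variable=checked).pack(anchor="w")

root.mainloop()
```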


1985 - Microsoft ships the first version of Windows.

By 2000, Windows is the operating system on 95% of desktop computers.

THE OPEN SOURCE MOVEMENT:

1985 - Richard Stallman founds The Free Software Foundation to support the free software movement.

He believes software should be distributed under copyleft ("share alike") terms.

His TED Talk: Introduction to Free Software 👇

1991 - Linux is released by Linus Torvalds.

Linux is one of the most prominent examples of free and open-source software collaboration.

The source code may be used, modified and distributed commercially or non-commercially by anyone.

THE INTERNET:

1969 - 1990: The Internet develops, but access is limited to government and academic users.

1991 - 1993: Three congressional bills open it up to the general public.

Read the 1993 National Information Infrastructure Act here👇
govtrack.us/congress/bills…

1993 - Mosaic, the first widely downloaded Internet browser, launches.

1998 - The Google search engine goes live, revolutionizing how people find information online.
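The core idea is PageRank: a page matters if pages that matter link to it. A toy power-iteration sketch, with a made-up three-page web:

```python
links = {                  # page -> pages it links to (invented example graph)
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
d = 0.85                   # damping factor, as in the original paper
rank = {p: 1 / len(links) for p in links}

for _ in range(50):        # iterate until the ranks settle
    rank = {
        p: (1 - d) / len(links)
           + d * sum(rank[q] / len(links[q]) for q in links if p in links[q])
        for p in links
    }

print(rank)                # pages with more inbound weight rank higher
```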

2001 - Wikipedia launches, paving the way for collective web content generation and democratizing information.

SOCIAL MEDIA:

1997 - Six Degrees
2002 - MySpace
2004 - Facebook
2005 - YouTube
2006 - Twitter
2010 - Instagram
2017 - TikTok
2022 - 4.7 billion users worldwide

THE SMARTPHONE:

2007 - The iPhone puts a supercomputer in the hands of the masses.

By 2022, 6.6 billion humans are walking around with supercomputers in their pockets.

THE GPU [graphics processing unit]:

1999 - Nvidia launches the GeForce 256, which it markets as "the world's first GPU."

A GPU is a processor that is specially designed to handle intensive graphics rendering tasks or process massive amounts of data.

GPUs are ideal for:

• video editing
• gaming
• machine learning
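Their trick is data parallelism: one operation applied across a huge array of independent elements at once. A CPU-side sketch of that idiom with NumPy (GPU libraries such as CuPy expose the same array style):

```python
import numpy as np

pixels = np.random.rand(1920 * 1080, 3)   # one HD frame's worth of RGB values

# One-at-a-time style, what a single scalar core would do:
#   for i in range(len(pixels)): pixels[i] = pixels[i] * 0.5

darkened = pixels * 0.5                   # whole-array op: the same multiply
                                          # applied to ~2 million pixels at once
print(darkened.shape)                     # (2073600, 3)
```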

By 2020, GPUs are ubiquitous and central to:

• reinventing 3D graphics in gaming
• AI’s continuing breakthroughs
• building the metaverse

THE FUTURE:
The Digital Revolution is complete.

We are now entering the Fourth Industrial Revolution:

An era of rapid change in technology, industries, and societal patterns, driven by increasing interconnectivity and smart automation.

This will be a period marked by breakthroughs in emerging technologies such as:

• artificial intelligence
• virtual reality
• gene editing
• quantum computing
• the internet of things
• blockchain technology, web3, and others

Here's a thread with more:

Some call this approaching period the Imagination Age:

An era in which creativity and imagination become the primary creators of economic value.

And the rise of immersive virtual reality will raise the value of the "imagination work" done by designers, artists, and creatives.

That's it, folks. I hope this was useful.

If you enjoyed it, please share by retweeting the first tweet.

I write about the ideas, trends and people shaping web3 and our future. You can follow me @mishadavinci.

Sources: Wikipedia, Walter Isaacson's "The Innovators."
