8 Google engineers wrote the paper that every AI company now uses as their bible. OpenAI built GPT on it, Anthropic built Claude on it, and Meta built LLaMA on it.
Every LLM worth billions uses this paper's transformer architecture as the foundation...
Before 2017, teaching computers human language was torture.
AI models read text like a human peering through a keyhole - one word at a time.
They were slow, forgot context, and choked on long passages.
Then 8 researchers decided to flip the script...
They published a paper titled "Attention Is All You Need."
The idea was simple: Instead of reading word by word, why not look at everything at once? Like how you can glance at a page and immediately see which words relate to each other.
They called it a Transformer.
An example: "The bank by the river bank was full of cash."
Old AI would get confused. Two banks?
Transformers see everything at once. "Bank" near "river" = riverbank. "Bank" near "cash" = financial institution.
One formula makes this work, and it's arguably worth more than most countries' GDP.
Attention(Q, K, V) = softmax(QK^T/√d_k)V
That's it. This equation alone created trillions in AI market value.
Every word calculates relevance with every other word. "Apple" + "stock" = company. "Apple" + "pie" = fruit.
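Here's that formula as a minimal NumPy sketch - toy random vectors standing in for word embeddings, with the learned projection matrices from the real Transformer omitted for clarity:

```python
import numpy as np

def softmax(x, axis=-1):
    # subtract the max for numerical stability before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # relevance of every word to every other word
    weights = softmax(scores)        # each row sums to 1
    return weights @ V               # blend the value vectors by relevance

# toy example: 3 "words", each a 4-dim embedding
x = np.random.default_rng(0).normal(size=(3, 4))
out = attention(x, x, x)  # self-attention: Q = K = V = x
print(out.shape)          # (3, 4): one context-aware vector per word
```

Each row of `weights` is how much one word "pays attention" to every other word - that's the whole trick.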
But they didn't stop at one attention mechanism.
Eight attention mechanisms ran in parallel.
One tracked grammar
Another found subject-verb connections
A third linked pronouns
The other five caught different meaning patterns. All simultaneously.
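A rough sketch of how those parallel heads work: split the embedding into slices, run attention on each slice independently, and concatenate the results. (The real architecture also learns per-head projection matrices W_Q, W_K, W_V and an output projection W_O, which this toy version skips.)

```python
import numpy as np

def multi_head_attention(x, num_heads=8):
    """Run num_heads attention mechanisms in parallel, each on its own
    slice of the embedding, then stitch the results back together."""
    seq_len, d_model = x.shape
    d_k = d_model // num_heads
    heads = []
    for h in range(num_heads):
        # each head sees only its own d_k-dim slice of every word
        q = k = v = x[:, h * d_k:(h + 1) * d_k]
        scores = q @ k.T / np.sqrt(d_k)
        e = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights = e / e.sum(axis=-1, keepdims=True)
        heads.append(weights @ v)
    return np.concatenate(heads, axis=-1)  # back to (seq_len, d_model)

x = np.random.default_rng(1).normal(size=(5, 64))  # 5 words, 64-dim embeddings
print(multi_head_attention(x).shape)               # (5, 64)
```

Because each head gets its own slice, each one is free to specialize in a different kind of relationship between words.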
When tested, it broke every record.
Best translation model: 26.3 BLEU score, weeks to train
Their Transformer: 28.4 BLEU, just 3.5 days
A 2-point BLEU jump is a state-of-the-art leap in machine translation - and training was roughly 10x cheaper.
But OpenAI saw something in those pages that even Google missed.
OpenAI made one surgical change that created ChatGPT.
The original Transformer had an encoder (understands text) and a decoder (generates text). OpenAI threw away the encoder entirely. Just kept the decoder.
Why would removing half the system make it better?
Encoders need paired data - English sentence, German translation.
Whereas decoders only need raw text, maybe the entire internet.
Just predict the next word. No translation pairs needed.
OpenAI turned Google's translation machine into a universal intelligence engine.
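The decoder-only trick hinges on one detail: a causal mask. Each word may only attend to itself and earlier words, so the model can train on raw text by predicting what comes next. A minimal sketch (toy vectors, no learned weights):

```python
import numpy as np

def causal_self_attention(x):
    """Decoder-style self-attention: position i can only attend to
    positions <= i, so the model never peeks at the word it must predict."""
    seq_len, d_k = x.shape
    scores = x @ x.T / np.sqrt(d_k)
    # mask out the upper triangle: future words get -inf relevance
    mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
    scores = np.where(mask, -np.inf, scores)
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = e / e.sum(axis=-1, keepdims=True)  # exp(-inf) = 0: future is hidden
    return weights @ x

x = np.random.default_rng(2).normal(size=(4, 8))  # 4 words, 8-dim embeddings
out = causal_self_attention(x)
print(out.shape)  # (4, 8)
```

Note the first word can only attend to itself, so its output vector comes back unchanged - which is exactly what the mask guarantees.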
Anthropic took transformers and made them "safe." First, they had Claude critique its own outputs.
"Am I being harmful? Biased? Lying?"
The AI argues with itself about ethics before answering you.
They called it Constitutional AI. But that wasn't enough.
They also used RLHF - reinforcement learning from human feedback, with humans rating Claude's responses.
Do this at scale and the transformer learns what humans actually want.
Same transformer architecture underneath. But Meta went even further.
Meta spent millions training LLaMA with months of supercomputers running 24/7.
Then they released the actual AI brain - the model weights themselves - in small (7B), medium (13B), and large (70B) versions.
You could run AI on your laptop locally. But why give away $100M models?
Zuck's play: Let 100,000 developers improve LLaMA. They debug it, optimize it and build tools. Meta gets all innovations back.
While Google/OpenAI charge fees, Meta built an army of unpaid developers. Genius move? I don't know