A single, fairly unknown Dutch company makes perhaps the most expensive and complex non-military device in the world (~$200M per machine), built on 40 years of physics, and it holds a monopoly that underpins all of today's AI advancement.
Here's the story of ASML, the company powering Moore's Law.
1/9
ASML's extreme ultraviolet (EUV) machines are engineering marvels.
They hit molten tin droplets 50,000 times per second with a 25kW laser, turning them into plasma hotter than the sun's surface to create 13.5nm EUV light, which is so energetic that air itself absorbs it.
2/9
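A rough back-of-envelope in Python on those source figures; the physical constants are standard, everything else comes from the numbers quoted above:

# Energy scale of the EUV source, using the thread's 25 kW / 50,000 droplets/s / 13.5 nm figures
h, c, eV = 6.626e-34, 3.0e8, 1.602e-19      # Planck constant, speed of light, joules per eV

energy_per_droplet = 25e3 / 50e3            # ≈ 0.5 J of laser energy per tin droplet on average
photon_energy = h * c / 13.5e-9 / eV        # ≈ 92 eV per 13.5 nm photon, roughly 40x a visible-light photon
print(f"{energy_per_droplet:.2f} J per droplet, {photon_energy:.0f} eV per photon")

That ~92 eV photon energy is why the light can only travel in vacuum: ordinary air absorbs it.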
Each $200M machine contains mirrors that are the smoothest objects humans have ever created.
They're coated with alternating layers of molybdenum and silicon, each just a few atoms thick. If you scaled a mirror up to the size of Germany, its largest imperfection would be about 1mm high.
3/9
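A quick check of what that analogy implies, in Python; the mirror size and Germany's extent are my assumptions, only the 1mm figure comes from the thread:

# What "a 1 mm bump at the scale of Germany" implies for the real mirror
mirror_diameter = 0.3      # m, assumed optic size (ASML/Zeiss don't publish exact dimensions)
germany_extent  = 850e3    # m, rough north-south extent of Germany (assumption)
scaled_bump     = 1e-3     # m, the thread's 1 mm figure

real_bump = scaled_bump / (germany_extent / mirror_diameter)
print(f"implied largest imperfection ≈ {real_bump*1e9:.2f} nm")   # ≈ 0.35 nm, roughly one atom tall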
This light bounces between the mirrors and onto 300mm silicon wafers moving at ~1m/s, positioned with precision better than the width of a SINGLE SILICON ATOM (0.2nm).
That's like hitting a target in SF from NYC with the accuracy of a human hair.
4/9
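To get a feel for the control problem, here is the same arithmetic in Python, using only the speed and precision quoted above:

# How long the wafer stage spends crossing one atom-width at full scan speed
stage_speed = 1.0        # m/s, from the tweet
precision   = 0.2e-9     # m, the quoted 0.2 nm precision

dt = precision / stage_speed
dist_ms = stage_speed * 1e-3                      # metres travelled in one millisecond
print(f"one atom-width is crossed every {dt*1e9:.1f} ns")                                  # 0.2 ns
print(f"in 1 ms the wafer moves {dist_ms*1e3:.0f} mm, i.e. {dist_ms/precision:,.0f} atom-widths")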
TSMC's 4nm process for NVIDIA H100 needs ~15 EUV layers (+80 DUV layers).
Each layer must align within nanometers. One machine processes ~100 wafers/hr, turning out roughly $150K worth of chips per hour.
No other technique can match that combination of quality, throughput, and cost.
5/9
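Taking those figures at face value, the per-wafer arithmetic looks like this in Python (the interpretation of "$150K of chips per hour" as output value is mine):

# Rough throughput arithmetic from the figures above
euv_layers, duv_layers = 15, 80
wafers_per_hour        = 100
chip_value_per_hour    = 150_000    # $, as quoted

print(f"≈ {euv_layers + duv_layers} lithography exposures per wafer")
print(f"≈ ${chip_value_per_hour / wafers_per_hour:,.0f} of chip value per wafer pass through this one machine")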
40 years of co-development, 40,000 patents, 700+ suppliers. ASML owns 24.9% of Zeiss's semiconductor division.
Replicating all of this would take decades and well over $100B.
6/9
The complexity is astounding.
Each machine ships in 40 containers and takes 4 months to install. The supply chain spans 700+ companies, with 100K+ parts per machine and 40K patents protecting it.
A single missing component can disrupt the global semiconductor supply.
7/9
Only three companies can run cutting-edge EUV:
— TSMC (which makes Nvidia's GPUs)
— Samsung
— Intel.
ASML machines are the only way to make chips dense enough for modern AI. Each H100 has 80B transistors. The next gen will need >100B.
Impossible without EUV.
8/9
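A density check in Python; the transistor count is from the tweet, the H100 die area is my assumption (the widely reported ~814 mm²):

# Transistor density implied by 80B transistors on an ~814 mm^2 die
transistors  = 80e9
die_area_mm2 = 814          # assumption, not from the thread
density = transistors / die_area_mm2
print(f"≈ {density/1e6:.0f}M transistors per mm^2, i.e. ~{density/1e6:.0f} per square micron")   # ≈ 98M/mm^2

Packing on the order of a hundred transistors into every square micron is the density that, per the thread, only EUV patterning delivers at volume.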
Rich Sutton's "The Bitter Lesson" is that general methods that leverage computation (and thus Moore's Law) are the most effective for advancing AI research.
In the iceberg of AI technology, while LLMs are at the top, ASML is at the murky depths.
It has kept Moore's Law alive.
9/9
Microsoft just leaked their official compensation bands for engineers.
We often forget that you can be a stable, high-performing engineer with great work-life balance, be a BigTech lifer, and comfortably retire with a net worth of ~$15M!
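For a sense of the arithmetic behind that figure, a toy compounding sketch in Python; every input here is my assumption, none of it comes from the leaked bands:

# Toy compounding model: save a fixed amount each year at an assumed market return
annual_savings = 250_000      # $ per year, assumed savings from a senior BigTech package
annual_return  = 0.07         # assumed average annual return
years          = 25           # assumed length of a "lifer" career

net_worth = 0.0
for _ in range(years):
    net_worth = net_worth * (1 + annual_return) + annual_savings
print(f"≈ ${net_worth/1e6:.1f}M")   # ≈ $15.8M under these assumptions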
The best open-source AI model just dropped a detailed report on how it was trained, a rare resource for students given that no frontier lab publishes this anymore!
Kimi K2's estimated total training cost is ~$20-30M, roughly in line with its pricing of $0.60 per million input tokens and $2.50 per million output tokens.
10 highlights:
1. Generating tokens by rewriting high-quality tokens with LLMs during pre-training
2. Mining 3,000+ MCPs and using LLM-generated personas to improve agentic tool calling
3. 10,000 parallel Kubernetes sandboxes to solve GitHub issues
4. New scaling laws for sparsity in MoE models
5. RL with verifiable rewards (RLVR) for math, coding, and safety, with a self-critique model and a long-reasoning penalty that encourages direct, decisive answers (toy sketch after this list)
6. Context-length training recipe of 4k sequences, then 32k, then 128k with YaRN
7. High temperature during initial RL training to promote exploration
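A toy sketch in Python of the idea behind point 5 (a verifiable reward minus a length penalty). The function, the boxed-answer check, and the penalty weight are illustrative assumptions, not Kimi K2's actual recipe:

import re

def rlvr_reward(response: str, gold_answer: str, length_penalty: float = 1e-4) -> float:
    # Reward 1.0 if the final boxed answer matches the verifier's gold answer,
    # minus a small per-token penalty to discourage needlessly long reasoning.
    match = re.search(r"\\boxed\{(.+?)\}", response)
    correct = bool(match) and match.group(1).strip() == gold_answer.strip()
    return float(correct) - length_penalty * len(response.split())

# A correct but verbose answer scores slightly below a correct, concise one
print(rlvr_reward("Step 1... Step 2... \\boxed{42}", "42"))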
I wanted to use Gemini 2.5 Pro too, but on AI Studio it did not search the web. I’ve kicked off a Deep Research run and will report back under this thread.
Prompt:
“Check on the odds on Polymarket and tell me the most mispriced assets I should bet on from first principles reasoning. You have $1000.
Please do deep research and present precise odds on each bet. Use advanced math for trading. Draw research from authoritative sources like research and unbiased pundits. Size my bets properly and use everything you know about portfolio theory. Calculate your implied odds from first principles and make sure you get an exact number.
Your goal is to maximize expected value. Make minimum 5 bets. Write it in a table.”