Oguz O. | 𝕏 Capitalist 💸
Selective contrarian investor | Writer of the Capitalist-Letters newsletter, read in 180+ countries & co-founder @biggr_ai

May 29, 12 tweets

$AMD can make 10x from here.

Annual data center chip spending will reach $1 trillion by the end of this decade, including training and inference.

Lisa Su says inference will be a bigger market than training and $AMD will dominate it.

Here is why $AMD is a 10x opportunity: 🧵

1/ Most people still don't see what's coming...

AI infrastructure spending will be bigger than most people think.

Let's set the stage.

Jensen Huang thinks annual data center chip spending will reach $1 trillion by the end of this decade.

This is inevitable.

Here is why:

2/ Let's compare it to the internet...

From 1991 to 1993, the internet grew 1000x.

$INTC was the backbone of the internet revolution.

Its revenues grew from $4 billion in 1991 to $30 billion in 2001 and to $80 billion in 2021.

AI is a bigger revolution than the internet.

3/ AI is growing insanely fast...

AI's initial adoption rate far outpaces the internet's at the same stage of development.

On average, it took an internet company 65 months to reach $30 million in annualized revenue; AI companies are doing it in just 20 months.

4/ We are very early in AI.

As AI is growing faster than the internet did, we can assume the application layer will grow more than 1000x in the next decade.

If the infrastructure layer grows similarly to the internet's, we can expect AI spending to grow 7x in the next decade.

5/ Hyperscalers are already spending over $250 billion this year.

$200 billion of this is expected to be spent on buying chips.

On that trajectory, they'll spend $1.4 trillion on chips in 2035.
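As a rough sanity check on that trajectory (using the thread's own figures: roughly $200 billion of chip spend this year growing to $1.4 trillion by 2035), the implied compound annual growth rate works out to about 21% per year:

```python
# Implied CAGR if annual chip spending grows from $200B (2025) to $1.4T (2035).
# Both figures are the thread's assumptions, not verified forecasts.
start, end, years = 200e9, 1.4e12, 10

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 21.5% per year
```

That is aggressive but in line with the 7x-per-decade growth assumed earlier in the thread (7x over 10 years is the same ~21% annual rate).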

6/ As the application layer matures, the returns on additional training performance will flatline, and spending will shift from training performance to inference efficiency.

Eventually, inference will be a bigger market than training:

7/ $AMD is already performing better than $NVDA in most inference tasks.

The MI325X now outperforms $NVDA's H200 in inference.

Though Nvidia's GB200 is now the state of the art, AMD's upcoming MI355X is expected to match it in inference.

8/ As the AI workload shifts from training to inference, AMD's market share will grow.

$NVDA dominates the market completely with over 87% market share now.

In a conservative scenario, $AMD can grow its market share to over 15% over the next decade as the spending shifts.

9/ $AMD has done this before.

As CPU spending shifted from PCs and individual servers to data centers, it aggressively took market share from $INTC.

It's now positioning itself to do the same.

This time to $NVDA.

10/ Let's run the numbers...

Assuming $1.4 trillion spending in 2035, and 15% market share, $AMD will generate $210 billion revenue.

At a 35% net margin, that's roughly $73.5 billion in net income.

At 25 times earnings, we are looking at a $1.8 trillion company in 10 years.

It's currently valued at $180 billion.

10x potential from here.
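The valuation above can be spelled out step by step. All inputs are the thread's own assumptions ($1.4 trillion market in 2035, 15% AMD share, 35% net margin, 25x earnings multiple, $180 billion current market cap):

```python
# Back-of-the-envelope valuation using the thread's assumptions only.
market_2035 = 1.4e12   # assumed total data center chip spend in 2035
amd_share = 0.15       # assumed AMD market share
net_margin = 0.35      # assumed net margin
pe_multiple = 25       # assumed price-to-earnings multiple
current_mcap = 180e9   # AMD market cap cited in the thread

revenue = market_2035 * amd_share        # $210B revenue
net_income = revenue * net_margin        # $73.5B net income
implied_mcap = net_income * pe_multiple  # ~$1.84T implied market cap
upside = implied_mcap / current_mcap     # ~10x vs. today

print(f"Revenue: ${revenue / 1e9:.0f}B")
print(f"Net income: ${net_income / 1e9:.1f}B")
print(f"Implied market cap: ${implied_mcap / 1e12:.2f}T ({upside:.1f}x upside)")
```

Every step of the 10x hinges on those four assumptions compounding together; a miss on any one of them (share, margin, market size, or multiple) shrinks the outcome multiplicatively.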

Do you want to become a better investor?

I share all my insights in my weekly newsletter.

Join 16,000+ readers.

You will also get a free e-book when you join 👇
capitalist-letters.com
