Hemant Mohapatra
Jul 5 · 18 tweets · 5 min read
So now that Nvidia has far outstripped the market cap of AMD and Intel, I thought this would be a fun story to tell. I spent 6+yrs @ AMD engg in the mid to late 2000s helping design the CPU/APU/GPUs we see today. Back then it was unimaginable for AMD to beat Intel in market cap (we did in 2020!), let alone for Nvidia to beat both! In fact, AMD almost bought Nvidia, but Jensen wasn't ready to sell unless he replaced Hector Ruiz of AMD as CEO of the joint company. The world would have looked very different had that happened. Here's the inside scoop of how & why AMD saw the GPU oppty, lost it, and then won it back, all against the backdrop of Nvidia's far more insane trajectory, & the lessons I still carry from those heady days:
After my MS, I had offers from Intel & AMD. I chose AMD at 20% lower pay. Growing up in India, AMD was always the hacker's choice - they allowed overclocking, were cheaper, noisier, grungier, and somehow just felt like the underdog David to back against the Intel Goliath!
Through the 90s, AMD was nipping @ Intel's heels, but ~2003 we were 1st to mkt w/ a 64-bit x86 chip &, for the FIRST time, had a far superior core architecture. Oh boy, those were exciting times! Outside of SV, I haven't seen a place where hardcore engg was so revered. Maybe NASA.
I joined AMD right when their stock price was ~$40, and worked on the 1st dual-core architecture (single die, two cores), the Athlon 64 X2. Our first mistake -- and AMD insiders won't like me saying this -- was made here.
We were always engineering-led and there was a lot of hubris around “building a pure dual-core” -- single die split into two separate CPU cores w/ independent instruction & data pipes, but shared cache, page tables etc. Even the fabs didn’t yet have the processes ready.
While we kept plodding on the "pure dual-core", Intel, still smarting from the x64 defeat, just slapped two 1x cores together, did some smart interconnects, & marketed it as "dual core". The joke at AMD was that Intel's marketing budget was > our R&D (true fact). Customers ate it up.
We did launch a “true” dual core, but nobody cared. By then Intel’s “fake” dual core already had AR/PR love. We then started working on a “true” quad core, but AGAIN, Intel just slapped 2 dual cores together & called it a quad-core. How did we miss that playbook?!
AMD always launched w/ better CPUs but was always late to market. Customers didn't grok what was fake vs real dual/quad core. If you ran `cat /proc/cpuinfo` and saw cpu{0-3}, you were happy. I was a proud engineer till then, but that's when I saw the value of launching 1st & fast. MARKET PERCEPTION IS A MOAT.
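That was the whole customer-side test, and it's worth seeing why it couldn't distinguish the architectures: a minimal sketch (exact `/proc/cpuinfo` fields vary by kernel and arch) of the check a 2000s buyer would have run:

```shell
#!/bin/sh
# Every logical CPU shows up as its own "processor" entry in
# /proc/cpuinfo -- whether the cores share one die ("true" dual-core)
# or are two dies glued together, the count looks identical.
grep -c '^processor' /proc/cpuinfo

# nproc reports the CPUs available to this process; on an
# unrestricted system it matches the count above.
nproc
```

Nothing in that output exposes shared caches or interconnect topology (you'd need to dig into `/sys/devices/system/cpu/` for that), which is exactly why "slap two dies together" won in the market.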
Somewhere between this dual→ quad journey, AMD acquired ATI, the canadian GPU company. Back in 2006, acquiring a GPU company did not make a lot of sense to me as an engineer. The market was in servers & client CPUs and GPUs were still niche. We didn’t want a GPU company so much that the internal joke was AMD+ATI=DAMIT.
But clearly, someone at AMD saw the future. We just saw it partially. We should have acquired Nvidia - and we tried. Nvidia - for those who remember - was mostly a "niche" GPU company for hardcore gamers. They went hard on proprietary CUDA, while AMD was a big believer in the open standards (OpenGL, and later OpenCL). Developers preferred the open route given the lock-in with CUDA. Jensen clearly thought very long term and was building his "Apple" strategy of both HW and SW lock-in. He refused to sell unless he was made the joint company's CEO, to align with this strategy. AMD blinked, and our future trajectories splintered forever.
ATI was a tough nut - we didn't really "get" them; they saw the world very differently. We wanted to merge GPU+CPU into an APU, but it took years of trust- & process-building to get them to collaborate. Maybe it would've gone faster if we'd had Slack, but we only had MSFT Sharepoint 😅
While all this APU work was going on, AMD totally missed the iPhone wave. When we built chips for the PC, the world was moving to laptops; when we moved to laptops, the world moved to tablets; & when we moved there, the world moved to cell phones.
We also missed the GPU wave trying to introduce a fundamentally better but also fundamentally new architecture: APUs (CPU+GPU on the same die - we love “same die everything” I guess). CATEGORY CREATION IS HELL but “if you’re going through hell, keep going”. We hesitated and...
...the 2008 crisis happened & we were totally caught with our pants down. After that, AMD basically lost the market to pretty much everyone: Intel, ARM, Nvidia. I learned the hard way how SUPERIOR PRODUCTS LOSE TO SUPERIOR DISTRIBUTION LOCK-INS & GTM.
A few yrs later, in '11, I left AMD to join Google via a short 1yr MBA. AMD stock was hovering b/w $2-4 - lower than BOOK VALUE, i.e., what we'd fetch sold for scraps. I should've bought some stock, but my salary coming out was ~30% lower than my salary going in - I was basically broke w/ MBA loans.
Huge respect for Nvidia though. They were just one of the little boys back then - we lost some GPU sales to them but never thought of them in the same league as ARM/Intel. They stuck to their guns, and the market came to them eventually when AI took off. BELIEF IN YOUR VISION and an unrelenting and dogged pursuit of your goals is a HIGHLY UNDERRATED SKILL. Most give up, Jensen just kept going harder.
Also mad props to Lisa Su, who now leads AMD. A lot of my engineer friends are still there building the world's best tech, and I can't imagine what they had to go through all through the 2010s: multiple heavy layoffs, salary cuts, leadership changes (2 CEO changes b/w '08-'11 alone), low morale. Lisa is just absolutely world class - I seriously wish Nvidia and AMD could merge now; a technology cross-licensing deal that takes advantage of each other's fab capabilities would help a lot in bringing the cost of GPU cycles down much further!
AMD was a “TRIBE”. For so many it was their 1st job out of college in the late 80s 90s and 2000s & they are still there after 30-40yrs! My first manager @ AMD retired from there after 30+ yrs. My second manager is still there. My super-manager has spent 40yrs there by now. I guess this is how AMD survived multiple near-deaths & this is also why when it comes to AMD I still say “we” as if I’m still the fanboy I was as a college student!

