Going to attempt this absurdly technical recipe from the Contra cookbook tonight and live tweet the process. Let’s begin!
First, make bay leaf oil. Dried bay leaf is notorious for having little flavor. But fresh bay is very good. The oil takes on a piney/eucalyptus note. Gorgeous color too.
May 15, 2022 • 4 tweets • 1 min read
This year in the markets - learnings and outlook twitter.com/i/spaces/1PlKQ…
Book recommendations from the talk:
I’m going to have a fun TIME tonight.
love this UI for the avax bridge
May 31, 2021 • 6 tweets • 3 min read
ETH, with a little help from @LidoFinance and @CurveFinance, can generate 12% yield. But where does this yield come from?
Let's break down the ponzu recipe. 🧫👇
The Ethereum blockchain gives rewards to computers that validate transactions. If you hold ETH, you can validate transactions. The easiest way to do this is to use a service like Lido. The yield is currently ~6%. lido.fi
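For the curious, here is a minimal sketch of how a base staking rate plus pool incentives could compound into the headline figure. The Curve-side rate and the daily-restaking assumption are mine, not numbers from the thread.

```python
# Illustrative sketch, not the thread's actual math: combining a base staking APR with
# additional pool incentives into a blended APY. All rates below are assumptions.

def blended_apy(staking_apr: float, pool_rewards_apr: float, periods_per_year: int = 365) -> float:
    """Compound the combined APR into an APY, assuming rewards are restaked daily."""
    combined_apr = staking_apr + pool_rewards_apr
    return (1 + combined_apr / periods_per_year) ** periods_per_year - 1

# ~6% staking via Lido (per the thread) plus an assumed ~6% in Curve pool incentives
print(f"Blended APY: {blended_apy(0.06, 0.06):.2%}")  # roughly 12.7%
```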
Aug 25, 2020 • 8 tweets • 3 min read
The public cloud makes it easy for anyone to start a software company, but at a cost: your margins now belong to AWS. Thread:👇
There are three ways of paying for software infrastructure:
1. Have your customer pay for it (cheap)
2. Build your own data center (somewhat costly)
3. Rent from a public cloud (very costly)
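As a rough illustration of how option 3 eats into margins, here is a back-of-the-envelope comparison. Every dollar figure below is an assumption for illustration, not data from the thread.

```python
# Back-of-the-envelope sketch of how the three infrastructure options affect gross
# margin. Revenue and cost figures are illustrative assumptions, not from the thread.

def gross_margin(revenue: float, infra_cost: float, other_cogs: float) -> float:
    return (revenue - infra_cost - other_cogs) / revenue

revenue, other_cogs = 100.0, 10.0  # hypothetical $100 of revenue, $10 of non-infra COGS

scenarios = {
    "customer pays (cheap)": 0.0,              # e.g. software the customer hosts themselves
    "own data center (somewhat costly)": 15.0,
    "public cloud (very costly)": 35.0,
}

for name, infra_cost in scenarios.items():
    print(f"{name:35s} gross margin: {gross_margin(revenue, infra_cost, other_cogs):.0%}")
```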
May 13, 2020 • 8 tweets • 3 min read
1/ How James Cameron’s Terminator 2 predicted modern AI chips and sparked the debate on AI safety. An appreciation thread.👇
2/ This is the chip that powers the T-800. Based on its appearance and commentary from chief architect Miles Dyson, the movie makes three predictions about future processors: 1) neural net acceleration 2) multi-core design 3) 3D fabrication.
Let’s look at these claims.
Apr 15, 2020 • 11 tweets • 4 min read
1/ Apple's upcoming ARM MacBooks aren't just going to save the company some money and run a bit faster. They mark the beginning of the end of the x86 era and Intel's four-decade empire.
Thread⬇️
2/ In the computer industry, victories are won through standards and scale. Intel invented the x86 standard. And by winning the largest market of the 90s, the PC, it moved upmarket and eclipsed all server CPU vendors within a decade.
Jan 16, 2020 • 6 tweets • 3 min read
ARK's Big Ideas 2020 deck is here: a year of research packed into 80 slides covering AI, robotics, autonomous vehicles, genomics, bitcoin, and more.
Here are 5 slides that really hit home. Thread:
Deep learning continues to improve at an astounding rate. In 2017 it cost $1,000 to train ResNet50 on the cloud; now it costs $10.
Thanks to AI accelerators and 'unlimited' compute in the cloud, AI algorithms are consuming 10x more FLOPs every year.
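A quick check of the arithmetic behind those two claims, assuming the $1,000 to $10 drop happened over roughly three years (2017 to 2020):

```python
# Quick arithmetic behind the two claims above. Assumes the $1,000 -> $10 drop happened
# over roughly three years (2017 to 2020); the 10x/year FLOPs growth is taken from the slide.

cost_2017, cost_now, years = 1_000, 10, 3
annual_cost_decline = (cost_2017 / cost_now) ** (1 / years)
print(f"Training cost falls ~{annual_cost_decline:.1f}x per year")  # ~4.6x

flops_growth_per_year = 10
print(f"Compute demand grows {flops_growth_per_year}x per year, so demand outpaces "
      f"cost declines by ~{flops_growth_per_year / annual_cost_decline:.1f}x annually")
```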
Aug 20, 2019 • 8 tweets • 4 min read
9/ Neural nets can consume GBs of memory, but GPUs only have MBs of on-chip memory. So GPUs store neural nets in external memory soldered next to them on the PCB.
The problem: external memory is 10-100x slower and more power hungry than on-chip memory, and it's very expensive.
10/ Large models like Google’s Neural Machine Translation don’t even fit in one GPU’s external memory. Often they have to be split up across dozens of GPUs/servers, which increases latency by another 10-100x.
Ideally the whole model fits on a single chip—that's Cerebras' WSE.
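A minimal sketch of that mismatch. The parameter count, training overhead and memory sizes below are rough assumptions for illustration, not measured figures.

```python
# Minimal sketch of the memory mismatch described above. Parameter counts, overheads
# and memory sizes are rough assumptions, not measured figures.

def training_footprint_gb(num_params: float, bytes_per_param: int = 4, overhead: float = 4.0) -> float:
    """Rough training footprint: weights plus gradients and optimizer state
    (assumed ~4x the weights), ignoring activations."""
    return num_params * bytes_per_param * overhead / 1e9

params = 300e6       # a translation-scale model with a few hundred million parameters (assumed)
gpu_sram_mb = 20     # on-chip SRAM of a typical GPU: tens of MB (assumed)
gpu_hbm_gb = 16      # external HBM on a V100-class GPU

footprint = training_footprint_gb(params)
print(f"Training footprint: ~{footprint:.1f} GB vs ~{gpu_sram_mb} MB on-chip SRAM, "
      f"i.e. ~{footprint * 1000 / gpu_sram_mb:.0f}x too large; "
      f"even the {gpu_hbm_gb} GB of external HBM fills up fast")
```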
Aug 20, 2019 • 9 tweets • 3 min read
1/ Cerebras just built a chip with 50x the transistor count, 1,000x the memory and 10,000x the bandwidth of Nvidia’s flagship GPU. One such 'chip' could replace an entire rack of Nvidia GPUs.
What the heck is going on?
2/ It’s no coincidence that the fastest AI chip today, Nvidia’s V100, is also the largest at 815 mm^2. More area = more cores and memory.
But why not make the chip even bigger? Two reasons…
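For reference, a quick sanity check of the 50x/1,000x/10,000x multiples in the first tweet, using approximate launch-era specs for the WSE and V100 from public reporting. These figures are my assumptions, not numbers given in the thread.

```python
# Sanity check of the 50x / 1,000x / 10,000x multiples, using approximate launch-era
# specs for the Cerebras WSE and Nvidia V100 (assumed from public reporting).

wse = {"transistors": 1.2e12, "on_chip_mem_bytes": 18e9, "mem_bw_bytes_s": 9.6e15}
v100 = {"transistors": 21.1e9, "on_chip_mem_bytes": 20e6, "mem_bw_bytes_s": 900e9}

for key, label in [("transistors", "transistor count"),
                   ("on_chip_mem_bytes", "on-chip memory"),
                   ("mem_bw_bytes_s", "memory bandwidth")]:
    print(f"{label:16s}: ~{wse[key] / v100[key]:,.0f}x")
# Roughly 57x, 900x and 10,667x, consistent with the 50x / 1,000x / 10,000x claims.
```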
Apr 12, 2019 • 5 tweets • 2 min read
1/ This week Google made a very unusual product announcement: Anthos, a computing platform that supports on-prem, GCP and competing cloud vendors. In other words, Google will help enterprises run their workloads on AWS!
2/ The pure cloud strategy is clearly not working for Google; you only enable your competitors when you have no other option. Google is trying to win over customers by being the most open, by being "multi-cloud", by being a meta-layer to orchestrate all workloads.
Apr 10, 2019 • 5 tweets • 1 min read
I tried to find every example of large tech companies crushing smaller best-in-class pure plays. Here is the complete list:
• Microsoft -> Netscape
• Amazon AWS -> Rackspace
• FB/Instagram -> Snapchat
Three examples.
If I had to do the reverse, the list would go on forever.
Refactoring that Peter Lynch aphorism: "more money has been lost from fear of competition than from competition itself."
Mar 11, 2019 • 19 tweets • 4 min read
1/ The old Sun Microsystems adage, "The network is the computer," is perhaps the best lens through which to view Nvidia’s decision to acquire Israeli interconnect maker Mellanox for $7 billion. bloomberg.com/news/articles/…
2/ The slowdown in Moore’s Law has forced the industry to migrate to parallel computing. The key to making lots of chips work efficiently together is fast interconnects.
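To make that concrete, here is a rough sketch of the gradient all-reduce that data-parallel training performs every step. The model size, GPU count and link speeds are illustrative assumptions, not figures from the thread.

```python
# Rough sketch of why interconnects matter for parallel training: every step, data-parallel
# workers must all-reduce their gradients. All figures below are illustrative assumptions.

def ring_allreduce_seconds(grad_bytes: float, num_gpus: int, link_bytes_per_s: float) -> float:
    """Ideal ring all-reduce time: each GPU sends/receives ~2*(N-1)/N of the gradient bytes."""
    return 2 * (num_gpus - 1) / num_gpus * grad_bytes / link_bytes_per_s

grad_bytes = 1e9   # ~1 GB of gradients (a ~250M-parameter model in FP32)
num_gpus = 64

for name, bw in [("10 GbE", 1.25e9), ("100 Gb/s InfiniBand (Mellanox)", 12.5e9)]:
    t = ring_allreduce_seconds(grad_bytes, num_gpus, bw)
    print(f"{name:32s}: ~{t * 1000:.0f} ms per step just for gradient sync")
```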
Oct 16, 2018 • 5 tweets • 1 min read
1/ Most tech companies are horizontal platform companies, but a few exceptions stand out—Apple, Tesla and Netflix are all vertically integrated.
2/ Apple integrates hardware, SoC, OS, applications, services, and retail. Key advantage: the ability to move in lockstep. New hardware capabilities are translated into software and use cases immediately; no need to wait for 'adoption'. E.g. the 64-bit SoC/OS/software transition.
Aug 7, 2018 • 8 tweets • 3 min read
1/ No narrative is more tiresome and false than the idea that because companies are going public later, investors now have fewer choices and lower returns than a decade ago. It's the worst combination of hindsight bias plus laziness.
2/ First of all, it wasn't at all obvious when the FANGs IPOed that they would become the juggernauts they are today. The IPOs of Facebook, Amazon, Netflix and Google were just like any tech IPO today: filled with uncertainty and risk. Here's a brief survey.
Jul 23, 2018 • 6 tweets • 2 min read
1/ As we head into earnings season, here is my latest think piece on the state of internet advertising and how it can grow to $600B/year. ark-invest.com/research/inter…
2/ Industry research firms have underestimated internet advertising for years. Every year they forecast deceleration, yet growth sustains or *accelerates*. Zenith, eMarketer and Magna are once again forecasting growth slowing to single digits.
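As a simple way to reason about the $600B figure, here is a market-sizing sketch. The starting market size and growth rates are illustrative assumptions, not numbers from the piece.

```python
# Market-sizing sketch for the "$600B/year" figure: years needed to get there from an
# assumed starting market size at an assumed growth rate. Both inputs are illustrative.

import math

def years_to_reach(target: float, current: float, annual_growth: float) -> float:
    return math.log(target / current) / math.log(1 + annual_growth)

current_market_b = 250   # assumed current internet ad market, $B/year
target_b = 600
for growth in (0.10, 0.15, 0.20):
    print(f"At {growth:.0%} growth: ~{years_to_reach(target_b, current_market_b, growth):.1f} years")
```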
Jun 21, 2018 • 5 tweets • 2 min read
1/ Sorry to bore you with more tweets on internet advertising, but since it's the primary source of funding for AI, and hence the future of humanity, I have to cover this stuff. 😜
2/ Basically, all the forecasts for internet advertising from the fancy research firms (Zenith, PWC, eMarketer) are underwater. At the beginning of the year they forecast 10-15% growth. It's going to be WAY higher.
May 17, 2018 • 8 tweets • 2 min read
1/ Neural network training complexity has grown 300,000x since 2012. Yet Moore’s Law has only provided 12x more performance. So the question is, where did the extra performance come from? blog.openai.com/ai-and-compute/ @OpenAI
2/ There are three factors that drive system performance: transistor scaling, chip architecture, and chip count. Let’s see how these have changed since 2012.
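The back-of-the-envelope decomposition looks like this; the ~25x architecture gain is an assumption for illustration, not a number from the thread.

```python
# The arithmetic behind the question: if total training compute grew ~300,000x while
# transistor scaling contributed only ~12x, the remainder had to come from better chip
# architectures and from using many more chips in parallel.

total_growth = 300_000
from_transistor_scaling = 12

remainder = total_growth / from_transistor_scaling
print(f"Architecture + chip count must account for ~{remainder:,.0f}x")  # ~25,000x

# e.g. an assumed ~25x from architecture (tensor cores, lower precision) would leave
# ~1,000x to come from scaling out to more chips
assumed_architecture_gain = 25
print(f"Implied chip-count scaling: ~{remainder / assumed_architecture_gain:,.0f}x")
```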
May 3, 2018 • 10 tweets • 3 min read
1/ Tweetstorm + New Blog: the music industry through the lens of Spotify, or why the consumer music market is about to grow >10x. ark-invest.com/research/music…
2/ “There are two ways to make money: bundling and unbundling.”
—Jim Barksdale, former Netscape CEO
Jul 21, 2017 • 13 tweets • 5 min read
1/ Congrats to @graphcoreai on its $30m round. It’s still early days, but this could be another ARM in the making! bloomberg.com/news/articles/…
2/ Today GPUs have a monopoly on deep learning training. But that’s set to change: DL ASICs could boost performance by 10-100x.