Our 6th installment covers one of the most exciting years I can remember. The #stateofai report has everything you *need* to know across research, industry, safety, and politics.
There’s lots in there, so here’s my director’s cut 🧵
2023 was of course the year of the LLM, with the world being stunned by @OpenAI’s GPT-4.
GPT-4 succeeded in beating every other LLM - both on classic AI benchmarks and on exams designed for humans.
We’re also seeing a move away from openness, amid safety and competition concerns.
@OpenAI published a very limited technical report for GPT-4, @Google published little on PaLM2, @AnthropicAI simply didn’t bother for Claude…or Claude 2.
However, @AIatMeta and others are keeping the open source flame burning, producing and releasing open LLMs that match many of GPT-3.5’s capabilities.
Judging by the leaderboards over at @HuggingFace, open source is more vibrant than ever, with downloads and model submissions rocketing to record highs.
Remarkably, in the last 30 days Llama models have been downloaded more than 32M times on Hugging Face 🚀
While we have many different benchmarks (largely academic) to assess the performance of LLM systems, it often feels like the eval to rule all evals is the one with the utmost scientific and engineering grounding: “vibes”
Beyond the excitement of the LLM vibesphere, researchers, including teams at @Microsoft, have been exploring small language models, finding that models trained on highly specialized datasets can rival competitors 50x their size.
This work might become all the more urgent if the team over at @EpochAIResearch are correct.
They’ve predicted that we risk exhausting the stock of high-quality language data in the next *two years* - prompting labs to explore alternative sources of training data.
All of this work means it’s a good time to be in the hardware business, especially if you’re @nvidia.
GPU demand drove them into the $1T market cap club and their chips are used 19x more in AI research than *all the alternatives combined*.
While @nvidia continues to ship new chips, their older GPUs exhibit remarkable lifetime value.
The V100, released in 2017, was the most popular GPU in AI research papers in 2022. It might cease to be used in 5 years, which means it’ll have served 10 years.
In perhaps the least surprising news at this point, ChatGPT is one of the fastest growing internet products ever.
But data from @sequoia shows there is reason to doubt the staying power of GenAI for the moment - with shaky retention rates across everything from image gen to AI partners.
Outside the world of consumer software, there are signs that GenAI could accelerate progress in the world of embodied AI.
@wayve_ai’s GAIA-1 displays impressive generalization and could act as a powerful tool for training and validating autonomous driving models.
The market for AI-first defense is roaring to life as militaries rush to modernize their capabilities in response to the asymmetric warfare seen in Ukraine.
However, the clash between new technology and old incumbents is making it hard for new entrants to get their foot in the door.
These successes aside, the weight of the venture industry is resting on the shoulders of GenAI, which is holding up the sky of the tech private markets like Atlas.
Without the GenAI boom, AI investments would’ve crashed by 40% versus last year.
The authors of the landmark paper that introduced transformer-based neural nets are living proof of this - the transformer mafia have collectively raised billions of dollars in 2023 alone.
We’ve updated our popular slides from last year :-)
The same is true of the DeepSpeech2 team at @Baidu_Inc's Silicon Valley AI Lab.
Their work on deep learning for speech recognition showed us the scaling laws that now underpin large-scale AI.
Much of the team went on to be founders or senior execs at leading ML companies.
Many of the most high-profile blockbuster fundraises weren’t led by traditional VC firms at all.
2023 was the year of corporate venture, with Big Tech putting its war chest to effective use.
Unsurprisingly, billions of dollars of investment and huge leaps forward in capabilities have placed AI at the top of policymakers’ agendas.
The world is clustering around a handful of regulatory approaches - ranging from the light-touch through to the highly restrictive.
Potential proposals for global governance have been floated, with an alphabet soup of institutional acronyms being invoked as precedent.
The UK’s AI Safety Summit, organized by @matthewclifford and others, may help start to crystallize some of this thinking.
Past #stateofai reports warned that safety was being neglected by the big labs.
2023 was the year of the x-risk debate, with the open vs. closed debate intensifying among researchers and the extinction risk making headlines.
…needless to say, not everyone agrees - with @ylecun and @pmarca emerging as the skeptics-in-chief.
Unsurprisingly, policymakers are alarmed and have been trying to build out their knowledge of potential risks directly.
The UK has moved first to set up a dedicated Frontier AI Taskforce led by @soundboy, and the US launched congressional investigations.
As ever, in the spirit of transparency, we graded last year’s predictions - we scored 5/9
✅ on LLM training, GenAI/audio, Big Tech going all in on AGI, alignment investment, and training data
❌ for multimodal research, biosafety lab regulation, and doom for semis start-ups
Here are our 10 predictions for the next 12 months! Covering:
- GenAI/film-making
- AI and elections
- Self-improving agents
- The return of IPOs
- $1 billion+ models
- Competition investigations
- Global governance
- Banks + GPUs
- Music
- Chip acquisitions
The report is a team effort, though this year we were one member short, with @soundboy stepping back to focus on the UK’s Frontier AI Taskforce.
Many thanks to @osebbouh for his 3rd year, along w/@corina_gurau and @chalmermagne for their debut appearances.
Summer is my cue to start pulling together narratives for @stateofaireport.
By '20, it was clear to me that biology was experiencing its "AI moment": a flurry of AI+bio papers and AlphaFold 2.
In summer '21, I dove deeper and crossed paths with Ali's work at @SFResearch...
In a preprint entitled "Deep neural language modeling enables functional protein generation across families" Ali's team showed that AI can learn the language of biology to create artificial proteins that are both functional and unseen in nature.
Without an efficient engine to transform our R&D spend into real-world companies and products, how are we to see British inventions improve lives, deliver value to our society and strengthen our economy?
<5% of the £24B raised by UK startups in 2022 went to spinouts.
I’ve collected data from more than 200 founders via spinout.fyi, a website I set up to monitor spinout performance.
Too many founders are stuck for months to years in an opaque negotiation with their university, wielding no bargaining power.
This is the largest open, free, and global dataset of its kind.
@spinoutfyi illuminates deal term practices from universities, exposing a) lengthy, opaque negotiations, b) very high value capture mechanisms (equity, royalties, milestones), c) low NPS 👎
On the @airstreet front, we looked closely at >300 opportunities and many more before that stage.
Fund 1 made 3 new and 5 follow-on investments.
We raised Fund 2, which made 2 new investments.
We set up 2 SPVs for 1 new and 1 follow-on investment.
Now, for some highlights!
Jan '22, 🔬 @Gandeeva_Tx came out of stealth with an inaugural $40M financing for their cryogenic electron microscopy + AI-first drug discovery platform.
Led by @cryoem_UBC, a world leader in cryoEM with lots of high-impact research to his name.
In its 5th year, the #stateofai report condenses what you *need* to know in AI research, industry, safety, and politics. This open-access report is our contribution to the AI ecosystem.
This year, research collectives such as @AiEleuther, @BigscienceW, @StabilityAI, and @open_fold have open sourced breakthrough AI language, text-to-image, and protein models developed by large centralized labs at a never-before-seen pace.
Here I show you the GPT model timeline:
Indeed, text-to-image models that have taken the AI Twitterverse by storm are the battleground for these research collectives.
Technology that was in the hands of the few is now in the hands of everyone with a laptop or smartphone.
The database includes 143 unique entries from 71 universities from around the world.
47% are based in the UK, 37% in Europe, and 11% in the US.
41% raised Seeds, 17% raised Series As, 2% raised Series Bs, 15% were pre-funding, 3% no longer exist, 3% IPO'd, 7% exited via M&A.
The spinouts in the database cover a wide range of products, which we summarised into 6 categories: Software 🤖, Hardware 🏭, Therapeutic 💊, Materials 🪙, Medical 🩻, and Diagnostics 🔬.