Build capable and efficient general-purpose AI systems at every scale.
Dec 2 • 7 tweets • 3 min read
New Liquid research: STAR -- Evolutionary Synthesis of Tailored Architectures.
At Liquid we design foundation models with two macro-objectives: maximize quality and efficiency. Balancing the two is challenging. To make progress toward this goal, we built a new algorithm: STAR.
Read more about it here: liquid.ai/research/autom…
We first developed a new design theory for computational units of modern AI systems. We then used it to devise an efficient encoding into architecture genomes, and applied evolutionary algorithms to discover hundreds of new architecture designs.
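The loop described above — encode an architecture as a genome, then search with an evolutionary algorithm — can be sketched in a few lines. This is a minimal illustrative sketch, not STAR's actual method: the unit names, width multipliers, and the toy fitness function are all assumptions standing in for a real train-and-evaluate objective.

```python
import random

# Hypothetical building blocks for a genome: each gene is a (unit, width) pair.
# These unit names are illustrative, not STAR's actual operator set.
UNITS = ["attention", "conv", "recurrence", "gated_mlp"]

def random_genome(depth=6):
    """Encode an architecture as a fixed-length list of (unit, width) genes."""
    return [(random.choice(UNITS), random.choice([1, 2, 4])) for _ in range(depth)]

def fitness(genome):
    """Toy stand-in for the quality/efficiency trade-off.
    A real system would decode, train, and benchmark the architecture."""
    quality = sum(w for _, w in genome)  # wider units as a crude quality proxy
    cost = sum(w * (2 if u == "attention" else 1) for u, w in genome)
    return quality - 0.4 * cost          # penalize expensive units

def mutate(genome, rate=0.3):
    """Resample each gene independently with the given probability."""
    return [(random.choice(UNITS), random.choice([1, 2, 4]))
            if random.random() < rate else gene for gene in genome]

def crossover(a, b):
    """Single-point crossover between two equal-length genomes."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def evolve(pop_size=20, generations=30, seed=0):
    """Keep an elite quarter each generation; refill with mutated offspring."""
    random.seed(seed)
    pop = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 4]
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
```

The fixed-length gene list is the simplest possible genome encoding; the research described here develops a design theory that makes such an encoding efficient for modern computational units.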
Sep 30 • 13 tweets • 5 min read
Today we introduce Liquid Foundation Models (LFMs) to the world with the first series of our Language LFMs: 1B, 3B, and 40B models. (1/n)
LFM-1B achieves state-of-the-art scores on public benchmarks in the 1B category. This is the first time a non-GPT architecture significantly outperforms transformer-based models at this size.