The Taylor Swift Scaling Laws (TS2L) take inspiration from Scaling Laws for transformer-based LLMs, and apply the same log-log regression methodology to model and understand components of Taylor's ticket prices.
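As a rough sketch of that methodology (with made-up numbers, not the actual TS2L data): fitting a power law price ≈ a · x^b is just a linear regression once both axes are in log space.

```python
import numpy as np

# Illustrative data only -- not real ticket prices.
# A power law price = a * x^b becomes linear in log-log space:
#   log(price) = log(a) + b * log(x)
x = np.array([1, 2, 4, 8, 16, 32])              # hypothetical scale variable
price = np.array([50, 80, 130, 210, 340, 560])  # hypothetical prices

b, log_a = np.polyfit(np.log(x), np.log(price), 1)  # slope, intercept
a = np.exp(log_a)
print(f"price ~ {a:.1f} * x^{b:.2f}")
```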


Full write-up hopefully coming soon, but I'm using cosmo-xl for text generation with my own prompt, retrieving from an in-memory vector DB with sentence-transformers embeddings, and using @sendbluedotco for iMessage.
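A minimal sketch of that retrieval step, under some assumptions: the corpus and the `all-MiniLM-L6-v2` model are stand-ins, and the "in-memory vector DB" is just a NumPy matrix queried by cosine similarity.

```python
import numpy as np
from sentence_transformers import SentenceTransformer

# Placeholder corpus -- in the real project this would be personal notes/messages.
docs = [
    "Reminder: dentist appointment on Friday",
    "Notes on the spatial interfaces blog post",
    "Grocery list: eggs, coffee, oat milk",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed model choice
doc_embs = model.encode(docs, normalize_embeddings=True)  # (n_docs, dim), unit vectors

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k most similar docs by cosine similarity (dot product of unit vectors)."""
    q = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_embs @ q
    top = np.argsort(-scores)[:k]
    return [docs[i] for i in top]

print(retrieve("what did I read about spatial UI?"))
```

The retrieved snippets would then be spliced into the generation prompt before calling the language model.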
The fact that LLMs generate text is not the point. LLMs are cheap, infinitely scalable black boxes for soft, human-like reasoning. That's the headline! The text I/O mode is just the API to this reasoning genie. It's a side effect of the training paradigm.
As computers become superhuman at understanding language, I think it'll become more and more foolish to build knowledge tools that depend solely on human authors to make connections between everything you know and read.
https://twitter.com/thesephist/status/1478171172814700545
By making abstract ideas concrete, good notation frees up our squishy biological brains to work with those abstractions the same way we work with sticks and stones and physical objects. This is why it helps to draw/write things down when we think.
1. Intelligence is "run-time adaptation", as opposed to "compile-time adaptation" of some system. The ability to learn and adapt to new/changing environments is sort of the ultimate evolutionary advantage.
My top recommended read on this topic is simply this blog on "Spatial Interfaces".
One of my goals for this project was to learn about full-text search systems and how a basic FTS engine works. So I wrote an FTS engine in Ink.
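The actual engine is written in Ink; the sketch below is only a Python illustration of the core idea — an inverted index mapping tokens to posting lists, queried by intersecting the lists for every query term.

```python
import re
from collections import defaultdict

# Minimal inverted-index sketch of the core FTS idea -- not the actual Ink implementation.
class SearchIndex:
    def __init__(self):
        self.index = defaultdict(set)  # token -> set of doc ids (posting list)
        self.docs = {}                 # doc id -> original text

    def tokenize(self, text):
        # Naive tokenizer: lowercase, split on runs of non-alphanumeric characters.
        return [t for t in re.split(r"[^a-z0-9]+", text.lower()) if t]

    def add(self, doc_id, text):
        self.docs[doc_id] = text
        for token in self.tokenize(text):
            self.index[token].add(doc_id)

    def search(self, query):
        # AND query: a document matches only if it contains every query token.
        tokens = self.tokenize(query)
        if not tokens:
            return []
        hits = set.intersection(*(self.index.get(t, set()) for t in tokens))
        return [self.docs[i] for i in sorted(hits)]


idx = SearchIndex()
idx.add(1, "Spatial interfaces make software feel like a place")
idx.add(2, "A full text search engine written in Ink")
print(idx.search("search engine"))  # -> ['A full text search engine written in Ink']
```

A real engine layers more on top of this — stemming, ranking (e.g. TF-IDF or BM25), and persistence — but the inverted index is the piece everything else hangs off of.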