the guy who invented the LSTM just dropped a new LLM architecture! (Sepp Hochreiter)
Its major component is a new parallelizable LSTM.
⚠️ one of the major weaknesses of prior LSTMs was their sequential nature (the steps can't all be done at once)
Everything we know about the xLSTM: 🧵
1/ Three major weaknesses of LSTMs that make Transformers better:
"Inability to revise storage decisions"
"Limited storage capacities"
"Lack of parallelizability due to memory mixing".
SEE THE GIF if you don't get it. LSTMs are sequential, which basically means you have to go through the green boxes (simplified) one after the other: you need the results from the prior box before you can move on.
Transformers don't do this. They parallelize operations across tokens, which is a really really big deal.
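to make that concrete, here's a minimal NumPy sketch (illustrative only, not the actual xLSTM; the shapes and weights are made up). the loop is the chain of green boxes: step t needs the hidden state from step t-1, so the T steps can't run at once. the attention version computes all tokens with a few big matmuls and no step-to-step dependency.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 6, 4                          # sequence length, hidden size
x = rng.normal(size=(T, d))          # token embeddings (toy data)

# --- LSTM-style recurrence: h_t depends on h_{t-1} ("memory mixing"),
# so the T steps form a chain that must run one after the other.
W, U = rng.normal(size=(d, d)), rng.normal(size=(d, d))
h = np.zeros(d)
hs = []
for t in range(T):                   # the green boxes, in order
    h = np.tanh(x[t] @ W + h @ U)    # needs the previous step's result
    hs.append(h)
hs = np.stack(hs)                    # (T, d)

# --- Attention: every token attends to every other token in one shot;
# no dependency between time steps, so it parallelizes across tokens.
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Q, K, V = x @ Wq, x @ Wk, x @ Wv
A = np.exp(Q @ K.T / np.sqrt(d))     # unnormalized scores (no stabilization,
A /= A.sum(axis=1, keepdims=True)    # this is just a sketch)
out = A @ V                          # all T outputs at once: (T, d)

print(hs.shape, out.shape)           # (6, 4) (6, 4)
```

as i understand the paper, this is exactly the dependency the xLSTM's mLSTM variant drops (no hidden-to-hidden connection), which is what makes it parallelizable.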
1/ first of all, @sama posted this cryptic tweet a few days ago.
that tweet contains the name of one of the two new gpt2-chatbot models.
can I confirm that it is from OpenAI? no. However, model creators need to work with @lmsysorg to add the model, and it seems strange for the LMSYS team to let someone pretend to be OpenAI.
how good are the mystery models? 🧵
🧵 megathread of speculation on "gpt2-chatbot": tuned for agentic capabilities?
some of my thoughts, some from reddit, some from other tweeters
my early impression is 👇
1/
there's a limit of 8 messages per day, so i didn't get to try it much, but it feels around GPT-4 level. i don't know yet if i would say better... (could be a placebo effect; i think it's too easy to delude yourself)
it sounds similar to, but different from, GPT-4's voice
as for agentic abilities...
2/ look at the screenshots i attached: it seems better than GPT-4 at planning out what needs to be done.
for instance, it comes up with potential sites to look at and potential search queries, while GPT-4 gives a much vaguer answer (see the top tweet)