for some reason, on desktop Twitter, it's always "load 34 tweets". Never 24 or 63. Just 34. twitter.com/i/web/status/1…
@Scobleizer's tweet is caught in this screenshot lol
Mar 17 • 10 tweets • 2 min read
You can't do everything and be everything.
Someone who puts 100% of their effort into A will beat you if you split your effort 50% on A and 50% on B.
Jack of all trades is master of none.
🧵
Yes, synergy is a thing; many Nobel Prize winners have hobbies and are great musicians and artists. That's not what I'm talking about here.
1/n
Mar 16 • 8 tweets • 3 min read
all my friends who read a lot when they were young are successful now
i think reading is probably the most important thing you can do in elementary school
if ur brain is like an LLM, you should maximize the # of tokens it sees while training
i read so much in elementary school that writing is very easy for me. i liken it to next token prediction.
SAT/ACT grammar came very easy to me, i just went off of what sounded right. idioms as well, even though my family aren't native english speakers. i'd just read all of them before.
Mar 15 • 7 tweets • 2 min read
“Money doesn’t buy happiness” is a coping mechanism.
Sure money isn’t 1:1 with happiness 💰 😆
But I'd rather be crying with 10M in the bank than with 10 bucks.
Don’t settle for less.
There are caveats to this claim, but I'm too lazy to write them out.
Mar 15 • 8 tweets • 4 min read
🧵 EVERY new #AI feature @Google launched, and how YOU can take advantage!
This is going to save you hundreds of hours.👇👇👇
I have always been bullish on Google even though everyone says they're behind. Google has the money and talent, but more importantly, it has existing users locked into its massive product ecosystem.
1: One-click PowerPoint! Put in a topic. The images seem to be Stable Diffusion. 1/n #ai
Feb 14 • 6 tweets • 3 min read
looks like @openai bought the domain ai.com and hooked it up to chatgpt.
maybe everyone already knew this and i'm just finding out lol...
accidentally found this out because i was curious who owned the domain.
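if you want to check this yourself, here's a minimal sketch using Python's requests library to follow the redirect chain from ai.com. where it lands is just whatever the domain points at when you run it; nothing guarantees it stays pointed at chatgpt.

import requests

# Follow ai.com's redirect chain and print the final landing URL.
# The destination can change whenever the domain's owner updates it.
resp = requests.get("https://ai.com", allow_redirects=True, timeout=10)
print(resp.url)  # at the time of this thread, it landed on chat.openai.com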
🧵If you're using GPT-3 for multilingual applications, read this!
i took a 555-word / 2928-character English text. It tokenizes to 706 GPT tokens.
Chinese version of the same text: 2170 tokens. (3x 🇬🇧)
Hindi: 4480 tokens (6x 🇬🇧)
Implications 👇
First off, I only tested this on a very small amount of text, so I'm not sure how general the trend is. The exact ratios will definitely vary for other texts.
Also, the Chinese and English texts are professionally translated parallel texts.
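if you want to reproduce this kind of comparison on your own texts, here's a minimal sketch using OpenAI's tiktoken library with the p50k_base encoding (used by text-davinci-003; older GPT-3 models use r50k_base). the sample sentences are placeholders, not the actual 555-word text from the thread.

import tiktoken

# GPT-3-era models like text-davinci-003 use the p50k_base encoding.
enc = tiktoken.get_encoding("p50k_base")

# Placeholder parallel sentences; swap in your own translated texts.
samples = {
    "English": "The quick brown fox jumps over the lazy dog.",
    "Chinese": "敏捷的棕色狐狸跳过了懒惰的狗。",
    "Hindi": "तेज़ भूरी लोमड़ी आलसी कुत्ते के ऊपर कूदती है।",
}

baseline = len(enc.encode(samples["English"]))
for lang, text in samples.items():
    n = len(enc.encode(text))
    print(f"{lang}: {n} tokens ({n / baseline:.1f}x English)")

the gap comes from the BPE vocabulary being trained mostly on English text, so non-Latin scripts get split into many byte-level tokens. in practice that means higher API cost and less usable context per request for those languages.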