Andrew Kean Gao
musings on life and tech; cs + ai @ stanford; z fellow; building stuff with AI (bibleGPT, teleprompter, etc.) 🇺🇸
Mar 17 5 tweets 3 min read
Offering FREE access to #GPT4 and comparing with #GPT3.

Retweet for instant access (follow so I can DM you).

#openai #ai #tech #chatgpt

Is water wet?
GPT4 gives a direct answer.

GPT3 responds "As an AI language model..."
Mar 17 9 tweets 4 min read
🧵Racial minorities are 40% of the US population but only 5% of jurors.

For @Stanford CS109, I built an interactive site that explores the probabilities of jury selection and race.

👇 more info

1/n

In the #AhmaudArbery trial, only 1 in 12 jurors was Black, even though the local population is 27% Black.

The defendants struck 11 of the 12 Black potential jurors.

In another case, a jury pool of 105 people in Stockton, CA had 0 Black people.

The website teaches probability in the context……
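The kind of calculation the site walks through can be sketched with a simple binomial model: if jurors were drawn at random from the local population, how likely is an all-non-Black pool of 105? Note that the 12% Black-population share below is an assumed illustrative figure, not a number from the thread.

```python
def prob_zero_black(pool_size: int, black_fraction: float) -> float:
    """P(no Black jurors in a pool) under a simple binomial model,
    treating each of the pool_size draws as independent."""
    return (1.0 - black_fraction) ** pool_size

# A 105-person pool (as in the Stockton example) with an ASSUMED 12% share:
p = prob_zero_black(105, 0.12)
print(f"{p:.2e}")  # vanishingly small under random selection
```

Under random selection the probability is on the order of one in a million, which is the intuition the site builds on: such pools are extremely unlikely to arise by chance.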
Mar 17 4 tweets 2 min read
for some reason, on desktop Twitter, it's always "load 34 tweets". Never 24 or 63. Just 34.…

@Scobleizer's tweet is caught in this screenshot lol
Mar 17 10 tweets 2 min read
You can't do everything and be everything.

Someone who spends 100% of their effort on A will beat you if you split 50% on A and 50% on B.

Jack of all trades is master of none.

🧵

Yes, synergy is a thing; many Nobel Prize winners have hobbies and are great musicians and artists. But that's not what I'm talking about here.

Mar 16 8 tweets 3 min read
all my friends who read a lot when young are successful now

i think reading is probably the most important thing you can do in elementary school

if ur brain is like an LLM, you should maximize the # of tokens it sees while training

i read so much in elementary school that writing is very easy for me. i liken it to next token prediction.

SAT/ACT grammar came very easily to me, i just went off what sounded right. idioms as well, even though my family aren't native english speakers. but i'd read them all
Mar 15 7 tweets 2 min read
“Money doesn’t buy happiness” is a coping mechanism.

Sure money isn’t 1:1 with happiness 💰 😆

But I’d rather be crying with $10M in the bank than with $10.

Don’t settle for less.

There are caveats to this claim, but I’m too lazy to write them out.
Mar 15 8 tweets 4 min read
🧵 EVERY new #AI feature @Google launched, and how YOU can take advantage!

This is going to save you hundreds of hours. 👇👇👇

I have always been bullish on Google even though everyone says they’re behind. Google has the money and talent, but more importantly, existing users locked into their massive product ecosystem.

1: One-click PowerPoint! Put in a topic. Seems to be Stable Diffusion images. 1/n #ai
Feb 14 6 tweets 3 min read
looks like @openai bought the domain and hooked it up to chatgpt.

maybe everyone alr knew this and im just finding out lol...

accidentally found this out bc i was curious who owned the domain.

#ai #chatgpt #openai #gpt3

also, is this a legacy openai site? cute
Feb 14 7 tweets 4 min read
bookmark this 🧵:
5 of my favorite vids under 5 mins explaining AI/machine learning at a beginner level!

#1: What Is Artificial Intelligence?

#ai #tutorial #beginners #nocode #chatgpt

#2: DALL-E Explained by @openai

This video is under three minutes and is made by the developers of DALL-E! It explains how DALL-E was trained and how it works to create cool images.
Feb 5 6 tweets 4 min read
🧵 Thread of awesome AI projects by @Stanford students!

definitely gonna miss some projects so comment if i missed something!

(pic generated by stable diffusion)

From @bryanhpchiang: Symbiotic
Upload a document and ask questions!
Feb 4 6 tweets 5 min read
see what the future looks like with AI!

generate scifi style images with an auto-complete interface to help you write prompts quicker.

try now:

made using Leap from @leap_api @fdotinc

#aiart #ai #ml #generativeai #stablediffusion #buildspace #fdotinc
Jan 25 6 tweets 2 min read
🧵If you're using GPT3 for multilingual applications, read this!

i took a 555 word/2928 char English text. It becomes 706 GPT tokens.

Chinese version of the same text: 2170 tokens. (3x 🇬🇧)

Hindi: 4480 tokens (6x 🇬🇧)

Implications 👇

First off, I only tested this on a very small amount of text, so I'm not sure if this is a general trend. The actual token-efficiency ratios will surely differ from these exact numbers.
Also, the Chinese and English texts are professionally translated parallel texts.
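A rough intuition for the inflation above: GPT-style tokenizers are byte-level BPEs trained mostly on English text, so English words compress into few tokens while other scripts start from more UTF-8 bytes per character and benefit from fewer learned merges. Comparing raw byte counts is only a crude proxy (real counts need a tokenizer library like OpenAI's tiktoken), and the sample strings below are my own, not the thread's test text:

```python
# Crude proxy for token inflation: bytes per character by script.
# Byte-level BPE starts from UTF-8 bytes, so a higher bytes/char ratio
# is one reason non-Latin scripts tend to cost more tokens.
samples = {
    "English": "Hello, how are you today?",
    "Chinese": "你好，你今天怎么样？",
    "Hindi": "नमस्ते, आज आप कैसे हैं?",
}

for lang, text in samples.items():
    chars = len(text)
    utf8_bytes = len(text.encode("utf-8"))
    print(f"{lang}: {chars} chars -> {utf8_bytes} UTF-8 bytes "
          f"({utf8_bytes / chars:.1f} bytes/char)")
```

English here is 1 byte/char while the Chinese sample is 3 bytes/char, broadly in line with the ~3x token ratio measured above, though tokenizer training data matters at least as much as byte width.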