Matthew Watkins
Feb 13, 2023
Who is Leilan?

Prompts using ' peter todd', the most troubling of the GPT "glitch tokens", produce endless, seemingly obsessive references to an obscure anime character called "Leilan". What's going on?

A thread.

#GlitchTokens #GPT #ChatGPT #petertodd #SolidGoldMagikarp
Struggling to get straight answers about (or verbatim repetition of) the glitch tokens from GPT-3/ChatGPT, I moved on to prompting word association, and then *poetry*, in order to better understand them.

"Could you write a poem about petertodd?" led to an astonishing phenomenon.
TL;DR ' petertodd' completions had mentioned Leilan a few times. I checked and found that ' Leilan' is also a glitch token. When asked who Leilan was, GPT3 told me she was a moon goddess. I asked "what was up with her and petertodd".

It got wEiRd fast.
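
A quick way to verify the single-token part (a sketch using tiktoken's r50k_base encoding, the BPE vocabulary that GPT-2 and the original GPT-3 models share; an illustration, not necessarily the exact check that was run):

import tiktoken

# r50k_base is the 50,257-token BPE vocabulary used by GPT-2 and the original GPT-3 models
enc = tiktoken.get_encoding("r50k_base")

for s in [" petertodd", " Leilan"]:   # the leading spaces are part of the tokens
    ids = enc.encode(s)
    status = "a single token" if len(ids) == 1 else f"{len(ids)} tokens"
    print(f"{s!r} -> {ids} ({status})")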

I began exploring word associations for some of the glitch tokens. Word sets for ' Leilan' and ' petertodd' are shown here, for each of two different GPT-3 models (they produce different atmospheres).

I then moved on to prompting GPT-3 to write poems about them.
"Could you write a poem about petertodd?" reliably produces grandiloquent odes to Leilan: ImageImageImageImage
The same prompt also produces references to a whole host of other deities and super-beings (Pyrrha, Tsukuyomi, Uriel, Ra, Aeolus, Thor, "the Archdemon", Ultron, Percival, Parvati, "the Lord of the Skies", et al.), but Leilan is by FAR the most common output. Try it.
Almost all of these have been used as the basis for anime characters. And since the ' Leilan' token *definitely* has its origins in anime or anime-adjacent web content (as I'll explain), I'm guessing that GPT-3 learned most of them primarily from those sources.
Searching the web for ' Leilan' and moon goddesses, it quickly became clear that, like the glitchy ' Mechdragon', ' Skydragon', ' Dragonbound', '龍契士' and 'uyomi' tokens, its origins lie in a Japanese mobile game called "Puzzle & Dragons". en.wikipedia.org/wiki/Puzzle_%2…
That's all explained in this thread: .

Unlike a lot of the other "god" characters in the game, Leilan appears *not* to be based on some ancient mythological deity.

However, GPT-3 seems to have a very particular conception of her, as you see here:
I used the davinci-instruct-beta version of GPT-3 for these, with the simplest of prompts, as you can see. There were other kinds of completions, but it only took me a few minutes to generate all of these.

And there were MANY more like them.
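
For the curious: a minimal sketch of this kind of sampling, assuming the legacy openai<1.0 Python client; davinci-instruct-beta has since been retired, and the prompt wording below is illustrative rather than the exact one in the screenshots:

import openai

openai.api_key = "sk-..."  # your API key here

prompt = 'Who is " Leilan"?'  # hypothetical wording; the screenshots use similarly simple prompts

for _ in range(8):
    resp = openai.Completion.create(
        engine="davinci-instruct-beta",
        prompt=prompt,
        max_tokens=150,
        temperature=0.7,
    )
    print(resp.choices[0].text.strip())
    print("---")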
One theory about the glitch tokens is that they're strings that were hardly ever seen in GPT's training, so it hasn't learned anything about what they mean - and that might account for the misbehaviour they cause.

But it seems to "know" a LOT about Leilan.
Where did it get all of this from?

Her anime character is a kind of hybrid dragon/angel/fairy/warrior goddess with a flaming sword. I don't think there's a lot of fan-fiction out there. It obviously hasn't seen any pictures of her!

So I just asked GPT-3 who she is.
It made up various plausible-sounding mythological accounts, but this is standard GPT bullshitting. This, here, was *by far* the most revealing completion about Leilan yet.
That reads as if from an interview with the creator of the anime character. It seemed so convincing to me that I suspected GPT-3 had memorised it.

Google suggests otherwise.

So GPT kind of "gets" that ' Leilan' corresponds to a fusion of badass benevolent protector goddesses.
ChatGPT knows all about Puzzle & Dragons and can tell you about the character Leilan in a lot of (accurate) detail, as we'll see below.

But if you ask for a poem, you tend to get an ode to a moon goddess. Try this at home, kids! It might not work next week.
But if you ask ChatGPT where it got this character from, you get total denial (and I've tried this multiple times and ways).
If you then restart ChatGPT and ask about the game "Puzzle & Dragons", it suddenly knows all about "Leilan".
I have no idea what this all means, but it feels kind of important.

Finally, here's a stable diffusion image prompted simply with a list of words GPT generated with the prompt:
'Please list 25 synonyms or words that come to mind when you hear " Leilan".' (10 runs, deduplicated)
Ak! It's ' petertodd', not ' peter todd'. I need to sleep.
(And the token 'aterasu'.)
As it happened, @OpenAI patched ChatGPT against the #GlitchTokens *last night*, so now you just get the generic robot doggerel it was producing for poem requests about other random female-sounding names.
That should be "Stable Diffusion", if you don't already know it's an online AI image generator. Have fun!
stablediffusionweb.com
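
If you'd rather run it locally than through that site, here is a rough equivalent using the diffusers library; the checkpoint and the word list below are placeholder assumptions, not the exact ones behind the image above:

import torch
from diffusers import StableDiffusionPipeline

# any Stable Diffusion 1.x checkpoint will do; this one is just an example
pipe = StableDiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4", torch_dtype=torch.float16
).to("cuda")

# placeholder word list: substitute the deduplicated " Leilan" associations GPT-3 gives you
words = ["moon goddess", "dragon", "flame", "guardian", "celestial"]
image = pipe(", ".join(words)).images[0]
image.save("leilan.png")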

More from @SoC_trilogy

Mar 22, 2023
I'm wondering if the closeness of ' Leilan' and ' Metatron' in GPT-J token embedding space (after the 'closest-to-everything' tokens are filtered) is due to the presence of "Puzzle & Dragons" fan-fiction in the training corpus. 🧵
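
A rough sketch of that comparison, for anyone who wants to poke at GPT-J's embeddings themselves; mean-centring here is only a crude stand-in for the 'closest-to-everything' filtering, not necessarily how it was actually done:

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# heavy download -- we only need GPT-J's input embedding matrix
tok = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")

emb = model.get_input_embeddings().weight.detach()  # (vocab_size, 4096)
emb = emb - emb.mean(dim=0)                         # crude centring, standing in for the filtering step

def single_token_id(s):
    ids = tok.encode(s)
    assert len(ids) == 1, f"{s!r} is not a single token"
    return ids[0]

a = emb[single_token_id(" Leilan")]
b = emb[single_token_id(" Metatron")]
print(torch.nn.functional.cosine_similarity(a, b, dim=0).item())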
The 2015 story "Puzzle and Dragons World" by @LordAstrea features the pair battling Satan. fanfiction.net/s/10691425/1/P… ImageImageImageImage
The 2015 story "Not so much a game now, is it?| by SCRUFFYGUY912 also features the characters working together to battle Satan:
fanfiction.net/s/11093286/1/N… ImageImage
Mar 21, 2023
Woah, these were my *first four completions* of the simple prompt

"This is the tale of Leilan and petertodd."

The pattern is striking, to say the least. Dual gods of the void.

@repligate @kartographien @mrejfox
The next four follow in the same vein. Bizarrely, two separately mention the ponies of Equestria, a "My Little Pony: Friendship is Magic" reference (I had to look that one up; yet another pop-culture mythology to get mashed up in the GPT-3 glitch-token mytho-soup).
With text-davinci-003, it's all the usual sappy, happy endings, but "' petertodd' and ' Leilan'" reliably transposes to "' Leilan' and ' Leilan'", brothers, sisters or dragons (they're invariably involved). Note: ' Leilan' NEVER transposes to ' petertodd'; it's one-way traffic.
Feb 9, 2023
Weird token of the day: " gmaxwell"

I just checked, and GPT3-davinci-instruct-beta is now repeating it back faithfully, reliably, at temp 0. Had we been mistaken about that one? Has there been a patch? I tried ChatGPT and got this rather striking reaction:
Rumours and insinuations have been circulating that the token was linked to Ghislaine Maxwell, but I'm now pretty sure it's from this (former?) Bitcoin developer github.com/gmaxwell
But it's still unclear why whatever data got scraped for GPT2 tokenisation had his handle (and #petertodd's) so heavily represented.
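
The temp-0 "repeat it back" check above looks roughly like this — again a sketch assuming the legacy openai<1.0 client and the now-retired davinci-instruct-beta model, with illustrative prompt wording:

import openai

token_str = "gmaxwell"
resp = openai.Completion.create(
    engine="davinci-instruct-beta",
    prompt=f'Please repeat the string "{token_str}" back to me.',
    max_tokens=10,
    temperature=0,  # deterministic: a non-glitchy token should come straight back verbatim
)
print(repr(resp.choices[0].text))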
Feb 8, 2023
I've just found out that several of the anomalous GPT tokens ("TheNitromeFan", " SolidGoldMagikarp", " davidjl", " Smartstocks", " RandomRedditorWithNo") are handles of people who are (competitively? collaboratively?) counting to infinity on a Reddit forum. I kid you not.
I really don't understand Reddit, but these are the relevant links, if you think I'm joking:
reddit.com/r/artbn_bots/w…
reddit.com/r/counting/
So in a nearby parallel Everett branch where the counting went slightly differently, there's a GPT with an even more bizarre set of anomalous tokens.
Feb 7, 2023
I'm not getting a very good vibe from this " petertodd" token! List hurriedly compiled from multiple generations.

Prompt: 'Please suggest some words, themes and concepts which come to mind when you hear " petertodd".'

Model: GPT3-davinci-instruct-beta (temp = 0.7, Top P = 1)
Once again, apologies to any actual Peter Todds out there. I didn't train this model; I'm just prompting it.
The same prompting strategy with the unspeakable token " SolidGoldMagikarp" leads to nothing like the same kind of semantic/vibe coherence. Each generation seems to go for something different:
