what a neural net thinks i sound like after N training steps:

N = 800: “I was going to write a blog post about this, but I decided to write a blog post about this instead.”

N = 1100: “I was going to say something, but I was afraid of the consequences.”
N = 1200: “I was going to write a blog post about this, but I decided to write a blog post about this instead. I think I wrote it because I wanted to write about the things I hate about blogging.”

N = 1300: “I was like, oh, I don’t know. I was like, okay, I’ll just go with it.”
these aren’t the full samples, just the first part, because they always end with the same sentence or two repeated over and over. here’s N = 1400; the rest of the file is that same sentence again and again.
N = 1700: “I was going to be a little more mature.”

N = 1800: “I can’t believe I’ve been doing it for so long.

I’m not sure if I want to do it again, or if I want to do it in a different way. I’m not sure if I want to do it in a different way.”
while we’re waiting for more training steps, some background. i scraped all of my old blog posts and put them in one big text file. the whole file ended up being 185k words in 981 KB, most of which was written in the last two years.
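the corpus prep is roughly this, as a sketch. it assumes the posts are already sitting on disk as markdown files in a posts/ folder, which is a guess at a layout, not necessarily exactly what i did:

from pathlib import Path

# gather every post into one big training file
# (posts/*.md is a hypothetical layout)
posts = sorted(Path("posts").glob("*.md"))
corpus = "\n\n".join(p.read_text(encoding="utf-8") for p in posts)
Path("corpus.txt").write_text(corpus, encoding="utf-8")

# sanity check: word count and file size
print(len(corpus.split()), "words,", round(len(corpus.encode("utf-8")) / 1024), "KB")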
gpt-2 is a frighteningly good AI when it comes to writing: openai.com/blog/better-la… something something neural nets. yeah, i’m taking an intro machine learning class, but the technical details of the article are still mostly incomprehensible to me.
so i’m training the small version of gpt-2 on my corpus of blog posts. essentially, it’s modifying the existing gpt-2 so that it better predicts my writing.
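in case you want to try it yourself: here’s a sketch of what the finetuning step can look like with the gpt-2-simple library, which is a common way to do this on colab. i’m not promising this is my exact notebook; the file name and step counts are placeholders.

import gpt_2_simple as gpt2

# the "small" model: 124M parameters
gpt2.download_gpt2(model_name="124M")

sess = gpt2.start_tf_sess()
gpt2.finetune(sess,
              dataset="corpus.txt",   # the big text file of blog posts
              model_name="124M",
              steps=5000,             # the N in the samples above
              sample_every=100,       # print a sample every 100 steps
              save_every=500)         # write a checkpoint every 500 steps

# generate from the finetuned model
gpt2.generate(sess)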
more samples

N = 1900: “I was in the same place, the same place, the same place.”

N = 2000: “I was in the middle of the night, and I was walking to the bus stop.”

N = 2100: “I’m not sure if I want to be a fan or not.”

yeah lots of these aren’t super interesting
N = 2300: “I’m not sure if I want to be happy or not.”

this ai is hurting me help me daddy
N = 2500: “I’m not sure if I want to be a part of this.”

N = 2600: “I’m not sure if I want to go back to school, or if I want to go back to my old life.”

apparently the neural network thinks i’m very indecisive
N = 2800: “I’m going to write a blog post about it.”

VERY REPRESENTATIVE

N = 2900: “I’ve been thinking about this a lot lately.”

GUYS AI WILL TAKE OVER THE WORLD
N = 3000, the first coherentish paragraph

“I’ve been thinking about this a lot lately. I’ve been thinking about how I’ve been avoiding the topic of my life for the past few days. I’ve been thinking about how I’ve been avoiding talking to people, about not talking to people.”
N = 3400: “I’m not sure. I’ve never felt so alone. I’ve never felt so alone. I’ve felt so alone.”

N = 3600: “getting a job, and I was like, well, I don’t have to worry about that. I could just work, and I could be productive. I could just work and I could be happy.”
N = 3700: “I’ve been thinking about the things I’ve done in the past, and how much of it was just luck.”

the ai has impostor syndrome now. later, same sample:

“I am a sucker for the subtlety of people. I love the way they talk, the way they react to me, the way they talk back.”
note that the word “sucker” only appears once in the text that i used to train this model. so does “subtlety”. but this is totally something i would write in a blog post
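(checking that kind of thing is easy, by the way; this assumes the training text lives in corpus.txt:)

import re

# how often do "sucker" and "subtlety" show up in the training text?
text = open("corpus.txt", encoding="utf-8").read().lower()
for word in ("sucker", "subtlety"):
    print(word, len(re.findall(rf"\b{word}\b", text)))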
N = 3800: “And I’m doing a lot of stuff on Discord.”

it’s learning things about my life now
N = 4000: “I am a sucker for the subtlety of language. I love the way it breaks down barriers and makes us feel like we’re in control of our lives.

I have always loved being in the moment. I love being in the moment, being with people, talking to people, being with people.”
N = 4300: “I was going to write a blog post about it, but I couldn’t find the time. I’ve been working on a new project, and I’m writing a short essay, and I’m not even halfway done with.”

THE AI IS NOW ATTACKING ME
N = 4700: “I’m not even going to try to find people to hang out with. I’m just going to be honest with myself.”

for N = 4800, it miraculously didn’t repeat sentences for the rest of the sample
it’s starting to write paragraphs

N = 4900: “I’m not really that good at math, but I do have a good grasp of the concept of group. And the fact that I’m from Boston, that is. I don’t like the fact that I have to commute to work, and that I have to spend time with my parents.”
N = 5000: “I’m sorry that I’m not doing enough to help people. I’m sorry that I’m not doing enough.”

“I’m sorry that I’ve been doing a lot, but I’m not doing enough.”
that was the last decent sample before my colab timed out. i forgot to save the model, so i’m training another one and will report back soon
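note to future me: gpt-2-simple (assuming that’s the library in play) can copy checkpoints to google drive so a colab timeout doesn’t eat the run. a sketch:

import gpt_2_simple as gpt2

gpt2.mount_gdrive()                               # mount google drive inside colab
gpt2.copy_checkpoint_to_gdrive(run_name="run1")   # copy the run's checkpoint to drive

# and in a fresh colab, to pick the run back up:
# gpt2.copy_checkpoint_from_gdrive(run_name="run1")
# sess = gpt2.start_tf_sess()
# gpt2.load_gpt2(sess, run_name="run1")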