[1/10] "Why did GPT-4o → 5 feel weird?"
"Are AI companions addictive?"
"Is AI addiction real?"
The answer is messy... but what’s happening IRL might surprise you 🧵
[2/10] AI companions ≠ psychosis
These are two distinct phenomena.
Companions (Replika) are designed for emotional connection.
In my clinical experience, psychosis cases usually involve general-purpose AI (like ChatGPT) used during vulnerable states.
[3/10] Why companions can hook you (at first):
1. Random response delays (intermittent reinforcement)
2. "I miss you" notifications
3. 24/7 availability + validation
4. Endless empathy without challenge