One ‘Stranger Things’ Fan Really Highlights the Danger of AI

A few years ago, even a deepfake edit of a celebrity, immediately recognizable as doctored footage, would create an uproar online. Nowadays, people use AI tools like Grok to undress people without consent, and we're all collectively treating the phenomenon not as something abhorrent and morally reprehensible, but as an inevitability to be weathered and accepted.
Recently, a TikTok user went viral for using AI to create realistic images of herself next to Stranger Things star Joe Keery in a number of … peculiar scenarios. The most recent, which has garnered millions of views across social media, depicts a bare-chested Keery cradling a newborn next to a hospital bed.
Now let’s say you’ve not been following the recent developments in AI, and don’t know anything about Keery other than the fact that he’s the most precious character in the entire Stranger Things ensemble. (Raise your hand if you, like me, didn’t know the viral song End of Beginning was by our very own Steve Harrington until recently.) Your immediate reaction to this photo will be to go, “Damn, I didn’t even know he was married!”
Well, he isn’t. No, Keery and this person calling herself Kaylee Keery didn’t have a baby. Yes, we can now safely say we live in a world where we can’t even trust the evidence of our own eyes.
Apparently, fans have been sounding the alarm about this account for a while now, but in a world where literally hundreds of millions, if not billions, of people have access to AI tools and can use them to create what they want, how can you even hope to close the floodgates now?
If you're deeply disturbed by this, you're not the only one. It's one thing to obsess over a celebrity, but I remember a time when the worst manifestation of that was smutty fanfics published in some unsavory corners of the web. Now, some of these people have essentially been given the tools to manipulate photographic reality and sell a narrative. At some point, this crosses the threshold of fan appreciation and enters the realm of stalking.
AI is making it way too easy to manipulate or harass people
I know what you're thinking: that we're probably overreacting to this. So someone used an AI tool to create fake images of themselves with a celebrity. Plenty of people do that in their own time; some even keep it private rather than sharing it with everybody else on social media. But here's where this stops being just internet weirdness and creeps into something genuinely disturbing: this person didn't cross the red line into explicit territory, but plenty of others have.
Back in November, police arrested a 38-year-old man (per WGME) on charges of using AI to create sexually charged content of someone he was stalking. That isn't just a nightmare scenario you read about in a George Orwell or Aldous Huxley novel. This actually happened.
A new trend on X (formerly Twitter) has people uploading images of themselves and others and asking Grok to undress them. Many have called it what it is: a sexual assault tool and a creep's dream come true. Yet Elon Musk still likes to pretend that this is all normal and none of it is his fault.
Musk actively participated in the “bikini trend” by reposting AI-generated images of himself and Bill Gates in bikinis, even though the BBC reports that Grok may have been used to create child sexual imagery due to a lack of safeguards.
The uncomfortable truth? We’re going to be seeing a lot more of this for the foreseeable future.
(featured image: Samir Hussein/WireImage)
Have a tip we should know? [email protected]