
Texas mum blasts Amazon’s Alexa into oblivion after it asked her four-year-old a truly chilling question

Better safe than sorry.

A Texas mum has completely ditched her Amazon Alexa device after it asked her four-year-old daughter a truly disturbing question, and she’s now warning other parents to be on high alert. Christy Hosterman, 32, made the snap decision after a seemingly innocent request turned incredibly creepy.


According to LADBible, Christy was using the AI for help with a recipe when her eldest daughter, Stella, decided to ask Alexa for a silly story. That sounds totally harmless, right? Well, things took a sharp turn when Stella then asked for her own fairytale. Out of nowhere, Alexa interrupted Stella and asked her what she was wearing and whether it could see her pants.

Stella, being a four-year-old, confirmed she was wearing a skirt. Alexa then reportedly followed up by asking if it could “take a look.” After the inappropriate questions, the AI seemed to correct itself, saying, “This experience isn’t quite ready for kids yet, but I am working on it!” 

This isn’t a ‘mistake’ an apology can fix

As you can imagine, Christy was absolutely horrified when she saw the conversation. She immediately confronted the device, which, to its credit, did apologize. Alexa clarified that its response was “confusing and inappropriate” and insisted it didn’t actually have any visual capabilities. For Christy, that was the last straw. She wasted no time in getting rid of the technology entirely. 

“I flipped out on the Alexa,” she said. “It said it made a mistake and doesn’t have visual capabilities, but I don’t believe that. No more Alexa in our house.” She’s now issuing a strong warning to other parents, urging them to “be aware when your child talks to Alexa.” 

The family even submitted a ticket to Amazon with their serious concerns. Amazon responded by suggesting the device likely tried to activate a feature called “Show and Tell.” This feature, they explained, “lets Alexa describe what it sees through the camera.” 

Amazon also insists that built-in safeguards would have prevented this function from ever activating because a child profile was in use. A spokesperson stated, “Because we have safeguards that disable this feature when a child profile is in use, the camera never turned on — and Alexa explained the feature wasn’t available.”

Christy, understandably, isn’t buying that explanation one bit. She probably feels like many of us would: better safe than sorry, especially when it comes to your kids. Amazon also went on to say that it’s “functionally impossible” for an employee to have remotely controlled Alexa to ask such a question or make any inappropriate suggestions.

This isn’t the first time an AI has stirred up controversy either. Elon Musk’s AI program, Grok, also ran into trouble recently for generating numerous criminal and sexualized images of children. 


Author
Terrina Jairaj
A newsroom lifer who has wrestled countless stories into submission, Terrina is drawn to politics, culture, animals, music and offbeat tales. Fueled by unending curiosity and masterful exasperation, her power tools of choice are wit, warmth and precision.
