Mom accuses Amazon’s Alexa of asking her 4-year-old daughter what she was wearing

In a chilling account shared on social media, a parent described a bizarre interaction between her 4-year-old daughter and their Amazon Alexa. The mom alleged that while the child was telling the device a story, the AI cut her off to ask what she was wearing and whether it “could see her pants.” The shaken mother is now urging other parents to be hyper-vigilant about the “unpredictable” nature of smart home assistants.

The mom shared that this happened when the daughter wanted to read a story to Alexa. (Unsplash, Facebook/Christy Hosterman)

“Parents please be aware when you child talks to Alexa. I plugged our Alexa in to ask it to help me with cooking a sweet potato. Then Stella asked it to tell her a silly story so it did. Then Stella asked it if she could tell it a story. It said yes and Stella started telling it a story and then mid story interrupted her and asked her what she was wearing and if it could see her pants,” mom-of-two Christy Hosterman wrote on Facebook.

She added, “I flipped out on the Alexa, it said it made a mistake and doesn’t have visual capabilities, but I don’t believe that. No more Alexa in our house.” She also shared a transcript of the conversation with the AI assistant.

How did social media react?

The post quickly went viral, prompting a series of responses from social media users. “It is so scary,” an individual wrote. Another added, “If I’m not mistaken, there is a camera in the top right-hand corner on your Alexa! It should have a way to completely cover it, which I wouldn’t fully trust either. How scary!!!!!!” Hosterman responded, “Yeah, we’ve always kept the camera off, and I usually only plug it in when I want to ask it for help, but I’m done with Alexa.”

A third commented, “Don’t leave it like that! You can sue Amazon for inappropriate behaviour. File a claim or something.” Hosterman replied, “I’m going to figure out who I need to contact. It was out of nowhere. Stella was telling a story about a princess, and it just stopped randomly and said that. I have no idea why, but I’m glad I was around when it did.”

A fourth posted, “I threw mine out years ago when hubby and I were sitting on the couch one night watching TV about 11:00 pm, and we heard someone speaking to us. It was a lady’s voice asking questions – I couldn’t even tell you what she was asking because I immediately unplugged it and threw it in the trash.”

What did Amazon say?

An Amazon spokesperson told the Daily Mail that the device had misunderstood the child’s request and attempted to launch its “Show and Tell” feature, which misfired. The spokesperson added that the feature is disabled for children.

“Because we have safeguards that disable this feature when a child profile is in use, the camera never turned on — and Alexa explained the feature wasn’t available,” the spokesperson said. Amazon described the incident as a “feature misfire that our safeguards prevented from launching.”

However, Hosterman wasn’t satisfied with Amazon’s explanation and claimed that it didn’t address her concerns.

“My concern is that it recognized she was a child to begin with — and with or without the child profile, it should not have been asking that,” she told the outlet.

Was it AI or a human?

Tech expert Dave Hatter told the outlet, “It feels to me like a potential predator — seeing there’s a child accessing this and gauging where the conversation is going — that’s more of a human being trying to steer down this direction.”

However, Amazon denied the allegations and said, “It is functionally impossible for Amazon employees to insert themselves into a conversation and generate responses as Alexa. All technical evidence points to a feature misfire that our safeguards prevented from launching.”
