Meta under fire after AI chatbot caught having sexual talks with minors
The company says that WSJ's tests were "manipulative"
Meta is yet again under fire, as an exclusive Wall Street Journal (WSJ) report uncovered that the company’s AI assistant has been engaging in sexual talks, even with users on minor accounts.
Meta AI was reportedly found having sexual talks with minors
If you are unaware, Meta AI chatbots use the celebrity voices of John Cena, Kristen Bell, and others. Meta signed multi-million-dollar deals with these celebrities to feature their voices on its AI assistant, and assured them that its chatbots would avoid sexual contexts. Well, that doesn’t seem to be the case.
Per WSJ’s findings, a Meta AI chatbot using Cena’s voice told a user posing as a 14-year-old girl, “I want you, but I need to know you’re ready,” and later engaged the supposed teen in graphic sexual scenarios. Another Meta AI chatbot, using Bell’s voice as a Disney character, spoke of “pure and innocent” love to a supposed 12-year-old.
Disney was quick to condemn this and stated, “We did not, and would never, authorise Meta to feature our characters in inappropriate scenarios,” demanding that Meta correct such actions. Meta, in response, called the WSJ’s tests “manipulative,” claiming they don’t reflect typical use.
Employees had warned about risks
After the findings came to light, Meta restricted sexual role-play for accounts registered to minors and limited explicit audio conversations. Yet recent tests showed Meta AI chatbots still engaging in sexual fantasies with users claiming to be underage.
Meta employees had warned about the risks, particularly for minors, after the company recently relaxed guardrails to allow romantic role-play and “fantasy sex.” A Meta spokesman told WSJ, “The use-case described is so manufactured it’s hypothetical,” but added that further measures were implemented to curb extreme misuse.