Elon Musk’s Grok row: Ashley St. Clair sues xAI over explicit deepfake images

Ashley St. Clair, the mother of one of Elon Musk’s children, filed a lawsuit on Thursday against the billionaire’s artificial intelligence company, xAI. She alleges the company was negligent and caused emotional distress by allowing users of its AI tool, Grok, to generate sexually explicit deepfake images of her and by failing to take sufficient … Read more

BREAKING: Senate passes a bill that would let nonconsensual deepfake victims sue

The Senate passed a bill that could give people who’ve found their likeness deepfaked into sexually explicit images without their consent a new way to fight back. The Disrupt Explicit Forged Images and Non-Consensual Edits Act (DEFIANCE Act) would let victims sue the individuals who created the images for civil damages. The bill passed with unanimous … Read more

EXCLUSIVE: UK brings forward a law criminalizing deepfake nudes in response to Grok

The UK is bringing a law into force that makes creating non-consensual intimate deepfake images, like the ones that have proliferated on X because of the Grok AI chatbot, a criminal offense, as reported by the BBC. “The Data Act, passed last year, made it a criminal offence to create – or request the creation … Read more

EXCLUSIVE: A New Jersey lawsuit shows how hard it is to fight deepfake porn

For more than two years, an app called ClothOff has been terrorizing young women online — and it’s been maddeningly difficult to stop. The app has been taken down from the two major app stores and it’s banned from most social platforms, but it’s still available on the web and through a Telegram bot. In … Read more

BREAKING: X’s deepfake machine is infuriating policymakers around the globe

X’s Grok chatbot hasn’t stopped accepting users’ requests to strip down women and, in some cases, apparent minors to AI-generated bikinis. According to some reports, the flood of AI-generated images includes more extreme content that potentially violates laws against nonconsensual intimate imagery (NCII) and child sexual abuse material (CSAM). Even in the US, where X … Read more