Artificial intelligence technology that has been used to create fake, nonconsensual nude photos of women is now being used to digitally cover women who posed in revealing clothing.
Conservative personalities like Ian Miles Cheong have shared before-and-after examples online. One viral post from Cheong displayed an AI-edited image of a sex worker, Isla David, placed in a modest knee-length white dress alongside fake children. Her body was shrunken, and Cheong wrote, “When given pictures of thirst traps, AI imagines what could’ve been if they’d been raised by strong fathers.” Cheong’s post has been viewed over 7 million times. Cheong did not respond to a request for comment.
In the original photo, David posed with a glass of whiskey in a sheer white shirt and underwear. No children were in the original photo.
There are numerous other viral posts featuring the same practice. One account devoted to the AI trend, which is being called “DignifAI,” has amassed more than 28,500 followers since it began posting on Jan. 31. That account has replied directly to women whose photos it has manipulated, posting the edited images along with mocking captions like “keep your dignity.”
More often, the technology has been used to portray women nude without their consent, with victims ranging from Taylor Swift to high school-aged girls around the world. But David told NBC News that the clothed edits were similarly an attempt to shame and humiliate her.
“It does not matter whether you are a woman like Taylor Swift who is denying access to her nude body or someone like me who is offering access to my nude body,” said David, a sex worker and sex educator based in Canada, in a phone interview. “It is a tool to ensure that women never retain sole autonomy over our body and our images online.”
It’s unclear what specific AI tools are being used to make the photos, or if creators are repurposing different AI tools. One website with a URL associated with the practice directs creators to an AI app that was created to allow women to try on clothes before purchasing them.
David, who said she was alerted to Cheong’s post by a follower, also noted that she has a third arm and six toes in the AI-altered image of her. She said Cheong’s post and caption were intended to humiliate her and her family, and that the technology poses existential threats to women online. The images have left her scouring the internet for reposts and warning her family about what they might see about her online. David said she found the edited images of her circulating on 4chan, where 404 Media reported the images originated before being posted on X.
The practice of re-dressing proudly sexual women in AI-generated, modest clothing and placing them into fake scenes of domestic child-rearing is emerging alongside the growing “tradwife” movement, which envisions a popular resurgence of 1950s-style gender roles. Some women have embraced the traditional aesthetic by choice, while AI is being used to force non-conforming women into the role of subservient mother and wife.
“It’s really a form of propaganda, and it’s not based in reality,” said Mike Stabile, the director of public affairs for the Free Speech Coalition, which lobbies on behalf of the adult industry.
“It’s about controlling representations of women in society. It’s that ultimately women can’t be trusted to make their own decisions about whether to take off their clothes or get tattoos or put on clothes,” Stabile said. “It’s infantilization, and ultimately these men, by and large, are going to make the decision. If you don’t want to do it, at least in their world they’re going to make that change.”
“I did not consent to my body being twisted into this parody of a conservative dream girl,” David said. “I didn’t consent to my image being used for this fantasy of ‘fixing’ me.”
For sex workers, David said that AI technology isn’t all bad. It can be used to respond to direct messages, storyboard content, and more. But AI technology can also be used to steal sex workers’ content and repurpose it, David said, ensuring that they aren’t paid fairly for their labor.
“AI threatens that lifeline and bodily autonomy, both of the workers and the nonconsenting women everywhere,” David said. “It’s terrifying to be a woman on the internet right now. You can’t win.”