Will The Taylor Swift Deepfake Scandal Force Congress To Get Serious About AI Pornography?


Last week, sexually explicit AI-generated images of Taylor Swift circulated on X. One post garnered tens of millions of views, with commenters reveling in the demeaning spectacle. While that post was eventually taken down, fake pornographic images of the singer spread to Reddit, Instagram, Facebook, and darker corners of the internet.

Per The Daily Mail, the pop singer is furious and considering whether to sue the deepfake porn site — known as Celeb Jihad — which published the image and others like it.

A source close to Swift told The Daily Mail, “Whether or not legal action will be taken is being decided but there is one thing that is clear: these fake AI generated images are abusive, offensive, exploitative, and done without Taylor’s consent and/or knowledge.”

But Swift is hardly the only victim of "deepfake" pornography, media digitally altered by artificial intelligence to replace one person's likeness with another's. For years now, it has been weaponized against women and children.

As loyal Swift fans rushed to report the images and the accounts posting them, reports emerged of a 14-year-old London girl who took her own life after boys at her school created and circulated fake pornographic images of her and her classmates. Last year, boys at a New Jersey high school likewise created and shared AI-generated pornography of more than 30 of their female classmates. And last spring, a Twitch influencer was caught watching deepfake pornography of fellow streamers.
