Your Daughter’s Face Could Be Hijacked For ‘Deepfake’ Porn

Fourteen-year-old Francesca’s life has changed forever.

An email sent by her high school’s principal to her family on Oct. 20, 2023, notified them that Francesca was one of more than 30 students whose images had been digitally altered to appear as synthetic sexually explicit media — sometimes referred to as “deepfake” pornography.  

Speaking to media, Francesca shared how she felt betrayed, saying, “We need to do something about this because it’s not OK, and people are making it seem like it is.” She’s right — something must be done.

The issue of image-based sexual abuse (IBSA) — whether it takes the form of nonconsensual AI or “deepfake” content, nonconsensual recording or sharing of explicit imagery, extortion or blackmail based on images, recorded sexual abuse, or any of its many other forms — can feel like something that happens to “others.” It’s a headline we scroll past. It can feel distant from our own lives. But that’s far from the truth.

If anyone has ever taken a video or photo of you and posted it online, even an innocent family photo or professional headshot, that’s all it takes.

You and your loved ones are personally at risk of having your images turned into sexually explicit “synthetic,” “nudified,” or “deepfake” content.

It doesn’t take a tech genius on the dark web to do this: the code and tools are freely available on popular open-source platforms like Microsoft’s GitHub and are shared widely online. In fact, GitHub hosts the source code to the software used to create 95 percent of

