Inside the Deepfake Porn Economy
Published on March 31, 2023 at 01:30AM
The nonconsensual deepfake economy has remained largely out of sight, but it's easily accessible, and some creators accept major credit cards. From a report: Digitally edited pornographic videos featuring the faces of hundreds of unconsenting women are attracting tens of millions of visitors on websites, one of which can be found at the top of Google search results. The people who create the videos charge as little as $5 to download thousands of clips featuring the faces of celebrities, and they accept payment via Visa, Mastercard and cryptocurrency. While such videos, often called deepfakes, have existed online for years, advances in artificial intelligence and the growing availability of the technology have made it easier -- and more lucrative -- to make nonconsensual sexually explicit material.

An NBC News review of two of the largest websites that host sexually explicit deepfake videos found that they were easily accessible through Google and that creators on the websites also used the online chat platform Discord to advertise videos for sale and the creation of custom videos. The deepfakes are created using AI software that can take an existing video and seamlessly replace one person's face with another's, even mirroring facial expressions. Some lighthearted deepfake videos of celebrities have gone viral, but the most common use is for sexually explicit videos. According to Sensity, an Amsterdam-based company that detects and monitors AI-developed synthetic media for industries like banking and fintech, 96% of deepfakes are sexually explicit and feature women who didn't consent to the creation of the content. Most deepfake videos are of female celebrities, but creators now also offer to make videos of anyone. One creator offered on Discord to make a 5-minute deepfake of a "personal girl," meaning anyone with fewer than 2 million Instagram followers, for $65.
Read more of this story at Slashdot.