Porn Deepfakes

From Privacy Wiki

Latest revision as of 14:05, 13 September 2020


Short Title: Software Pastes Celebrities' Faces Into Porn Videos
Location: Global
Date: September 2020
Solove Harm: Appropriation, Increased Accessibility
Information: Physical Characteristics, Identifying
Threat Actors: Deepfake creators, porn websites, XVideos, Xnxx, xHamster
Individuals Affected: Celebrities
High Risk Groups: Celebrities, Females
Tangible Harms:

Up to 1,000 deepfake videos were uploaded to porn sites every month as they became increasingly popular during 2020, and porn websites won't take down non-consensual deepfakes.

Description

In November 2017, a Reddit account called "deepfakes" posted pornographic clips made with software that pasted the faces of Hollywood actresses over those of the real performers. Nearly two years later, "deepfake" had become a generic noun for video manipulated or fabricated with artificial intelligence software. However, research in 2020 found that 96 percent of deepfakes were still pornographic. Deepfakes are examples of Appropriation.

According to researchers, there is a niche but thriving ecosystem of websites and forums where people share, discuss, and collaborate on pornographic deepfakes. Some are commercial ventures that run advertising around deepfake videos made by taking a pornographic clip and editing in a person's face without that individual's consent.

All of the people edited into the pornographic clips that Deeptrace found were women. Nonprofits have already reported that women journalists and political activists are being attacked or smeared with deepfakes.

Up to 1,000 deepfake videos were uploaded to porn sites every month as they became increasingly popular during 2020, and porn companies are still failing to remove them from their websites. Deepfake videos hosted on three of the biggest porn websites, XVideos, Xnxx, and xHamster, have been viewed millions of times. One 30-second video, which appears on all three of the above sites and uses actress Emma Watson's face, has been viewed more than 23 million times. This can be interpreted as an example of Increased Accessibility.

Breakdown

Threat: Porn makers using AI to paste celebrities' faces over those of the real performers to make videos more popular
At-Risk group: Female celebrities
Harm: Appropriation
Secondary Consequences: not known


Threat: Porn websites making deepfakes available to millions of viewers and refusing to take them down
At-Risk group: Female celebrities
Harm: Increased Accessibility
Secondary Consequences: not known

Laws and Regulations

Sources

https://www.wired.com/story/porn-sites-still-wont-take-down-non-consensual-deepfakes/