Published on 16/10/2019 | Written by Heather Wright
But forget politics, deepfakes remain true to their roots with 96 percent being porn…
While there’s plenty of concern around deepfakes and their possible impact on elections, including the 2020 US presidential elections, new research shows the majority of doctored videos are for a more base purpose – yup, they’re porn.
Deeptrace Labs, a Dutch startup which uses deep learning and computer vision to detect and monitor AI-generated synthetic videos – aka, deepfakes – scoured the web and came up with 14,678 deepfake videos across a range of platforms in the first half of this year, almost double the 7,964 it counted in December 2018.
The technology, which uses artificial neural networks to detect and adapt patterns in facial data to create videos of people appearing to do or say things they never did, has come in for increasing attention in recent months amid concerns about the potential threat to fair elections.
Deepfakes are a numbers game
But The State of Deepfakes: Landscape, Threats and Impact report shows fake news isn’t where deepfake creators’ energy is being spent. At least, not yet.
The study found that the vast majority of the deepfakes – 14,056 of them – were ‘non-consensual deepfake pornography’, with the faces of women, usually celebrities, dubbed onto existing porn clips. The report notes that deepfake pornography is a phenomenon exclusively targeting women. In contrast, the non-pornographic deepfake videos analysed on YouTube skewed more towards male subjects.
“We also found that the top four websites dedicated to deepfake pornography received more than 134 million views on videos targeting hundreds of female celebrities worldwide,” Deeptrace says.
“This significant viewership demonstrates a market for websites creating and hosting deepfake pornography, a trend that will continue to grow unless decisive action is taken.”
The top 10 most targeted women included one Australian, three American, two British and one Israeli actress, and three South Korean musicians.
But while porn may remain the key arena for deepfakes, Deeptrace reports that they’re also making a ‘significant’ impact on the political sphere, with deepfakes linked to an alleged government coverup in Gabon and to a political smear campaign in Malaysia.
John Villasenor, a UCLA professor, has warned that deepfakes are likely to be part of the 2020 election landscape – and could be ‘very influential’ even if they’re not particularly good deepfakes.
“As with so much in elections, deepfakes are a numbers game. While the presence of tampering in all but the most sophisticated deepfakes can be quickly identified, not everyone who views them will get that message.
“More fundamentally, not everyone wants to get that message. As can occur with other forms of online misinformation, deepfakes will be designed to amplify voter misconceptions, fears and suspicions, making what might seem outlandish and improbable to some people appear plausible and credible to others.”
To influence an election, a deepfake doesn’t need to convince everyone who sees it, Villasenor says. It just needs to undermine the targeted candidate’s credibility among enough voters to make a difference.
It’s not just political figures being faked. Sounding a warning for companies, Deeptrace notes one example where cybercriminals used synthetic voice audio to impersonate the CEO of a British firm’s German parent company. The CEO of the British company complied with the cybercriminals’ request to wire $243,000 to a Hungarian supplier, with the funds then moved on to other locations.
Driving the proliferation of fakery is the commoditisation of tools and services enabling non-experts to get creative.
So what’s being done to stop the deepfakes threat?
Google released a dataset of thousands of deepfake videos last month to help researchers in their detection efforts, while Facebook too is jumping on board with a $10 million deepfake detection challenge.
California has just enacted a new law banning video and audio that gives a false, damaging impression of a political candidate’s words or actions within 60 days of an election.
“Deepfakes are here to stay, and their impact is already being felt on a global scale,” Deeptrace founder, CEO and chief scientist Giorgio Patrini says.
“The speed of the developments surrounding deepfakes means this landscape is constantly shifting, with rapidly materialising threats resulting in increased scale and impact. It is essential that we are prepared to face these new challenges. Now is the time to act.”