- Face-swapping tech, once the domain of Hollywood-level VFX, is now available for free – and can be rendered on a GPU-powered PC.
- This has created an explosion of what is now known as deepfake porn. Videos featuring Bollywood actresses Priyanka Chopra, Shraddha Kapoor, and Deepika Padukone already exist.
- The AI-powered post-truth era has arrived, and anyone with a sizeable digital footprint of photos and videos is at risk.
A notorious new AI-powered face-swap application that borrows from open source machine learning libraries emerged a few months ago, and it could have wide-ranging implications for pornography, politics, and the media industry in India.
Deepfakes, a face-swapping technique that emerged out of Reddit late last year, uses deep learning algorithms from Google's open source machine learning library TensorFlow to perform a convincing face-swap in a video. The VFX technology that lets dead actors be resurrected in movies, on million-dollar budgets, is now being democratised. Today, anyone with a GPU-powered desktop PC can do it, and anyone with a large footprint on social media is at risk.
This phenomenon was first noted by Motherboard in December 2017, in a report that showed a very realistic fake porn video of Wonder Woman actor Gal Gadot. Back then, it still required a fair amount of technical heft. The technology has since exploded in popularity, owing to the availability of FakeApp, a community-developed desktop application launched in late January 2018. The app lets users build a large dataset of images of a person from a video, train a neural network on it, and procedurally swap the network-generated faces into the frames of another video to create a morphed video with a high degree of accuracy.
According to how-to guide estimates, it takes about half a day to make a Deepfake video. “At a minimum, your computer should have a good GPU with CUDA support. Failing this, you can rent cloud GPU(s) through services like Google Cloud Platform,” the guide states. Photos scraped from platforms such as Instagram and Facebook could be used to create these videos, reveals another Motherboard feature, which examined tools that enable one to find a doppelgänger to do a face-swap on.
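The workflow described above begins by turning a source video into a large set of training images, typically by sampling frames at regular intervals so the dataset covers varied poses and expressions. A minimal sketch of that sampling step, with hypothetical function and parameter names (this is illustrative, not FakeApp's actual code):

```python
# Illustrative sketch of the dataset-building step: choose which frames
# of a video to extract as training images, spaced a fixed number of
# seconds apart. All names here are hypothetical.

def frames_to_sample(total_frames: int, fps: float, seconds_between: float) -> list:
    """Return frame indices spaced roughly `seconds_between` seconds apart."""
    step = max(1, int(round(fps * seconds_between)))
    return list(range(0, total_frames, step))

# A 10-second clip at 30 fps, sampled once per second, yields 10 images.
indices = frames_to_sample(total_frames=300, fps=30.0, seconds_between=1.0)
print(len(indices))   # 10
print(indices[:3])    # [0, 30, 60]
```

In practice, a tool would then decode those frames (for instance with OpenCV's `cv2.VideoCapture`), crop and align the detected faces, and feed them to the training step.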
The first deepfakes featuring Indian celebrities are already out on the web. A Pornhub user called ‘blackandwhitepanda’ has posted two videos each of Priyanka Chopra and Deepika Padukone, and one video of Shraddha Kapoor, over the past month. According to his Pornhub profile, he is a 21-year-old male living in New Delhi, India. The videos come with a deepfake disclaimer, but are hugely disturbing given their non-consensual nature.
Following an outcry over the phenomenon, Reddit last week banned the deepfakes subreddit, which had grown to over 90,000 subscribers, along with other subreddits related to the app. One such subreddit offered a marketplace where users could buy and sell deepfakes for “bitcoin donations”. “This subreddit was banned due to a violation of our content policy, specifically our policy against involuntary pornography,” reads the notice on Reddit as of February 7.
However, r/fakeapp, a place for “SFW (Safe for Work) FakeApp creations”, still exists, and it links to the latest version of the app. A video tutorial published on YouTube a week ago explains how to use the application. It still demands considerable technical heft, judging by the account of a journalist at The Verge who tried a face-swap of Elon Musk and Jeff Bezos. A compilation of Nicolas Cage face-swaps, inserted into movies he never performed in, demonstrates how realistic the technology can get.
Rakesh Krishnan, Threat Analyst at Cloudsek, a Bengaluru-based cybersecurity startup, pointed us to two websites registered on January 31: “TheDeepFakeSociety” and a porn-dedicated domain called TheDeepFakeSocietyXXX, which maintains a list of celebrities who have been deepfaked. “Under the name and address section; domain name, the address of Privacy Protect LLC has been given. This itself notes that the actors had concealed their names in order to protect their identity,” he points out.
Other sites have since cropped up this month, such as dpfak.com, which maintains a list of celebrities and their body doubles, and deepfakes.cc (known to run a crypto-mining script), a forum dedicated to deepfakes. The forum, which has over 7,000 members as of this writing, has resources, request, and tech support subforums.
Coming Soon: Political Deepfakes
Politicians, with their huge media footprints in the form of photos, video, and audio footage, make easy targets for deepfakes. In late 2016, Adobe demoed VoCo, referred to as ‘Photoshop for audio’. At Siggraph 2017, a computer graphics conference, researchers showed a viral demo that turned audio clips of Obama’s speeches into a lip-synced video.
While we haven’t seen any deepfakes of Indian politicians yet, there are precedents of fake videos created with clever editing. An 11-second video clip of Delhi Chief Minister Arvind Kejriwal, released a year ago, shows him asking people to vote for the Congress. Kejriwal put out a video clarifying that he had made no such statement, blaming the Congress Party and the Akali Dal for the hoax. “From my old videos, they’ve cut and spliced it to show that I am appealing to vote for Congress. There is no such thing,” says Kejriwal in Hindi in the video.
“Quoting people out of context was just as effective,” says Pratik Sinha, founder of Altnews, a website dedicated to exposing fake news. You don’t need much technology to create disinformation campaigns, he says, citing an example involving Ravish Kumar, an NDTV India journalist, who spoke at the Press Club after the death of Gauri Lankesh. The footage was cleverly edited to misrepresent his position. “They cut everything and they started from apni party – as if he belonged to, and was speaking on behalf of, some political party,” says Sinha.
He cited another example of a fake video that went viral, in which Donald Trump holds a child and asks, “Who do you like the most?” The child, dressed to look like a Trump lookalike, says, “Modi.” This was done using Madlipz, an application that lets you record your own audio over a video. “Those kind of things are happening already. There is no sophistication in it, you take some video, give it a false narrative and push it out,” says Sinha. While it is still effective now, as more sections of the media cover fake news and people grow more media savvy, the impact of such rudimentary fakery will diminish, says Sinha, a coder before he founded Altnews. “That is when they will have to advance to more sophisticated kind of technologies.”
The weaponisation of this technology could provoke local and international conflict, targeting people or countries in ways limited only by imagination. “It’s difficult to clearly define all aspects of this crime. The possibility of abuse is in-built in such technology, and I can’t suggest anything with certainty to prepare for the proliferation that will follow, other than education and vigilance,” says Mishi Choudhary, Technology Lawyer and Managing Partner at Mishi Choudhary & Associates.
“The law does not respond with the deftness, speed and agility the new technology demands, and so will not come to our rescue,” she says, advising users to share pictures securely with intended recipients: friends, family, or lovers. “Public sharing without limitations to usage only leads us into an unpredictable quicksand,” she warns.
Updated at 12:50 pm on February 14, 2018 to change the category this story was published under.
Disclosure: FactorDaily is owned by SourceCode Media, which counts Accel Partners, Blume Ventures and Vijay Shekhar Sharma among its investors. Accel Partners is an early investor in Flipkart. Vijay Shekhar Sharma is the founder of Paytm. None of FactorDaily’s investors have any influence on its reporting about India’s technology and startup ecosystem.