Deepfakes are dangerous and now you only need your phone to create one

A deepfake is a video created to show people doing things they never did in real life, often with the aim of destroying someone's reputation. As the underlying technology improves, it becomes easier to produce videos that can fool the public. In fact, the Washington Post notes that this technology is now available to anyone with a smartphone: a mobile app can now accomplish what once required a powerful computer and a movie studio.

You can create deepfakes using your iPhone or Android device


Last year, an app called Reface was released for iOS and Android devices. With the app, users take a selfie and insert their face into videos, photos, GIFs, and memes, which can then be shared with one tap. The app promotes itself with the tagline "Be Anyone." Even more concerning is an app called Avatarify, which lets the user direct the movement of a person's face in a photograph. The user selects any photo and then, using the phone's selfie camera, controls the expressions and movements of the individual (or animal) in the image. The app is available for iOS only and has been installed over six million times since February.

Michigan State University professor Anjana Susarla says, "It's all very cute when we do this with grandpa's pictures. But you can take anyone's picture from social media and make manipulated images of them. That's what's concerning." Not all of these smartphone apps are so threatening, though. Wombo, available for both iOS and Android, creates a lip-syncing video; the user selects a photo and a song, and the app does the rest. Unlike more sophisticated deepfake programs, this app isn't going to ruin anyone's reputation. Wombo's CEO, Ben-Zion Benkhin, says that with his app, "You're not able to pick something that's super offensive or that could be misconstrued." And genealogy app MyHeritage, while admittedly creepy, takes photos of deceased relatives and animates them. Called "Deep Nostalgia," the feature has brought over 65 million people "back to life" over the past four weeks.

These apps all say that their images are meant for entertainment, satire, and recreating history. But deepfakes have a clear downside, and one way to limit it is to make sure viewers know that what they are seeing is not real. Gil Perry, the CEO of the company that powers MyHeritage's deepfakes, says, "You must make sure that the audience is aware this is synthetic media. We have to set the guidelines, the frameworks and the policies for the world to know what is good and what is bad."

Here is an example of what could happen as more people gain the ability to create passable deepfakes. Earlier this month, a woman in Pennsylvania was arrested after police accused her of sending deepfakes to her daughter's cheerleading coaches in a bid to get three of her daughter's rivals kicked off the team. The manipulated images and video appeared to show the girls naked, smoking, vaping, and drinking; all of it was fake.

Deborah Johnson, emeritus professor of applied ethics at the University of Virginia, states that "There's potential harm to the viewer. There's harm to the subject of the thing. And then there's a broader harm to society in undermining trust." In 2019, a bill that would have banned the use of deepfakes was killed off in Congress. Eventually, it will probably fall to Congress to figure out how to keep deepfakes from costing someone a job, a marriage, or friendships. But until the problem gets worse, or costs one of them an election, expect lawmakers to keep their hands off the issue.
