In Fight against Misinformation, Facebook May Have Cracked Deepfakes

Social media giant claims to have AI that detects doctored videos

By Jonny Lupsha, Wondrium Staff Writer

Deepfakes are doctored images and videos passed on as real and meant to deceive. Technology has advanced to the point that deepfakes are difficult to discern from real videos, causing misinformation concerns. Facebook claims to have made a breakthrough.

Technology is being used to create doctored images and videos with the intent of spreading misinformation. Photo by sergey causelove / Shutterstock

Researchers from Facebook recently claimed that they’ve developed an artificial intelligence that can identify and track the origin of a particularly pesky nuisance of the digital age: deepfakes. A deepfake is a doctored image—most often a video—of a prominent figure that is passed off as the real thing. One famous and funny example is Steve Buscemi’s face on Jennifer Lawrence’s body at an awards show. However, the technology has frightening implications: a forger could frame an innocent person for a crime or impersonate a world leader to issue a call to arms or a declaration of war.

In their video series Fighting Misinformation: Digital Media Literacy, Tara Susman-Peña and Nina Oduro, of the global development and education organization IREX, explained how to be aware of manipulated images online.

Check Yourself before You Wreck Someone Else

Politically motivated visual misinformation has run rampant in the age of social media. In one recent example, a widely circulated image of Canadian Prime Minister Justin Trudeau and former U.S. President Donald Trump appeared to show Trump’s hand outstretched for a handshake at the White House and Trudeau refusing it. In reality, the picture was snapped a fraction of a second before Trudeau extended his own hand to shake Trump’s.

In another photo, Parkland school shooting survivor Emma Gonzalez appears to be ripping a copy of the U.S. Constitution in a photo op. It was doctored from a Teen Vogue video in which Gonzalez ripped up a shooting target.

“How can you protect yourself from such forgeries [and] how can you know when your eyes are deceiving you?” Susman-Peña asked. “First, employ the technique Label to Disable. Take back your brain by pausing and putting language to any emotions you are feeling before you react to an inflammatory image.

“That way, you can reengage the rational part of your brain, analyze the situation, and make a choice before you like the image, comment on it, or share it.”

According to Susman-Peña, many deliberately altered images and videos arise from current events and breaking news, when tensions are high and everyone is racing to shape your first impression of a situation. She added that forgers are eager to exploit the public’s emotions, knowing full well that an upset person is less likely to stop and verify information before sharing it.

Sniffing out the Fakes

“If you’re suspicious about an image that you encounter online, examine it closely—look for any elements that seem out of place,” Oduro said. “Do shadows all point in the same direction, or do some seem inconsistent with others? If there’s signage in the photo, what language is it written in? Is it appropriate to the location supposedly shown in the image?”

Oduro said that one way to investigate an image is to perform a reverse image search. On Google, click “Images” in the top right corner, then click the camera icon and upload the suspicious picture to search by image rather than by keywords. This is easier to do on a computer than on a smartphone: the picture can be dragged into Google Images from its source, or saved to your computer and then uploaded. Once the results load, look for tools such as “Visually similar images” and “Pages that include matching images.”
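For readers comfortable with a little scripting, the same lookup can be reached by building the search URL directly, provided the suspicious image is already hosted somewhere online. The sketch below is a minimal Python example; it assumes Google’s long-standing but undocumented `searchbyimage` endpoint, which Google may change or redirect (for instance, to Google Lens) at any time, and the image URL shown is purely a placeholder.

```python
import urllib.parse

def build_reverse_search_url(image_url: str) -> str:
    """Build a Google reverse-image-search URL for an image hosted online.

    Assumes the undocumented "searchbyimage" endpoint; Google may
    redirect this to its current image-search interface.
    """
    # Percent-encode the image address so it survives as a query parameter.
    query = urllib.parse.urlencode({"image_url": image_url})
    return f"https://www.google.com/searchbyimage?{query}"

# Hypothetical example image; substitute the picture you want to check.
url = build_reverse_search_url("https://example.com/suspicious-photo.jpg")
print(url)
```

Opening the printed URL in a browser (e.g., with Python’s standard `webbrowser.open(url)`) lands on the same results page as the drag-and-drop method described above, where the “Visually similar images” and “Pages that include matching images” tools can be used as usual.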

“Many of the search results will have dates; look for the earliest date you can find, and also look for reputable news outlets,” Oduro said. “Compare the earliest image you find with the one that raised your suspicions. Has the image been altered? Is it being used in a way that’s consistent with its original meaning?”

By keeping our emotions in check before sharing a picture or video, we can analyze it objectively, stopping ourselves from being manipulated and from spreading misinformation.

Edited by Angela Shoemaker, Wondrium Daily