(dpa) – Many of the images and videos that go viral fastest are faked or taken out of context. In the fight against fakes, unfortunately, there’s no one tech solution.
“There’s no website and no tool that can ensure without doubt that a photo has not been manipulated,” says Martin Steinebach of the Fraunhofer Institute for Secure Information Technology.
But that doesn’t mean technology can’t help at all. “The easiest and most common form of disinformation is reusing old photos or videos in a new context,” he says.
Such archival material can have a legitimate use, for example when no images of a current event are available yet. But in that case those photos or videos should be labelled "archive" along with the original date. Using such imagery without context can mislead viewers or distort their perception of an event.
One thing you can do to research a photo you suspect of being misused or faked is to run it through a reverse image search engine, or to examine it with image forensics tools such as Fotoforensics.com or Forensically. Using existing video recordings in a new, false context is a common form of manipulation as well.
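Under the hood, reverse image search engines typically match re-uploaded copies of a photo with perceptual hashes, which stay similar even after resizing or recompression. Below is a minimal sketch of one such scheme, the average hash (aHash); it assumes images are already decoded into grayscale pixel grids (0-255), which in practice a library such as Pillow would handle.

```python
# Minimal average-hash (aHash) sketch. Reverse image search relies on
# perceptual hashes like this: near-identical images yield near-identical
# hashes, so a small Hamming distance suggests a recycled photo.

def average_hash(pixels, size=8):
    """Downscale to size x size by block averaging, then threshold at the mean."""
    h, w = len(pixels), len(pixels[0])
    cells = []
    for r in range(size):
        for c in range(size):
            # average all pixels falling into this block of the grid
            rows = range(r * h // size, (r + 1) * h // size)
            cols = range(c * w // size, (c + 1) * w // size)
            block = [pixels[i][j] for i in rows for j in cols]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    # one bit per cell: brighter than the image's own average or not
    return tuple(1 if v >= mean else 0 for v in cells)

def hamming(a, b):
    """Number of differing bits; small distances mean 'probably the same image'."""
    return sum(x != y for x, y in zip(a, b))
```

Because each bit is thresholded against the image's own mean brightness, a uniformly brightened or darkened copy produces nearly the same hash, while a genuinely different image does not.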
The US chapter of Amnesty International provides a search tool for suspicious YouTube videos on a page called Citizenevidence. When you paste in a URL, the site shows you metadata such as the place and time of the upload.
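Once you have the upload date, the check itself is simple: a clip uploaded before the event it supposedly documents must be recycled footage. A small sketch of that comparison, where the metadata dictionary and its field names are hypothetical stand-ins for whatever a lookup tool returns:

```python
# Sketch: flag videos whose upload date predates the event they supposedly
# show. The metadata dict and the "upload_date" field name are hypothetical,
# chosen for illustration.
from datetime import date

def predates_event(metadata, event_date):
    """True if the video was uploaded before the claimed event happened."""
    uploaded = date.fromisoformat(metadata["upload_date"])
    return uploaded < event_date

clip = {"upload_date": "2019-03-02", "title": "street protest"}
predates_event(clip, date(2020, 8, 15))  # an older clip recycled for a "new" event
```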
So-called deepfakes work completely differently. These are videos in which a celebrity's face is placed on someone else's body, or their speech is altered to put words in their mouth. "You can usually recognize them by typical faults such as frizzy hair and strangely shaped ears," says IT expert Pina Merkert.
The artificial intelligence (AI) used for altering videos is getting better and better. Spoken language can be simulated quite easily with Google's speech-synthesis system WaveNet.
A few years ago, creating deepfake videos would have required huge amounts of data, which only large companies could handle. Now the amount of data required has shrunk by a factor of four, making manipulation correspondingly easier.
Merkert recommends watching a few examples on YouTube to get a feel for deepfakes.