Facebook announced on Monday a ban on deepfakes: video or audio manipulated to mislead. For instance, a video purporting to show US President Donald Trump declaring war on China would be a deepfake, because he has done no such thing.

Facebook said it will delete a video if it has been edited beyond cosmetic adjustments in a way that would mislead viewers into thinking somebody said something they did not. It will also remove content that a machine learning algorithm has superimposed onto a video to make it appear authentic.

If a video is reported as a deepfake, Facebook will send it to one of its fact-checking partners, such as the Associated Press or AFP, where an employee will review it. “This policy does not extend to content that is parody or satire, or video that has been edited solely to omit or change the order of words,” the company wrote.

But perhaps Facebook needs to fight technology with technology.

Some blockchain companies promise to plug the gap at its source. One such solution, Amber Video, encrypts all footage captured with its app and logs “fingerprints” (cryptographic hashes of the data) on a blockchain in real time, so videos can always be traced back to their origin.

Amber Video claims it’s possible to trace the origin of a video even if it has been spliced into several different clips. “Think about a news piece, say a five-minute news video; there might be 20 soundbites and 15 B-roll shots. [With Amber Video’s software], each one of those elements will maintain its fingerprint all the way through to distribution,” Amber Video CEO Shamir Allibhai told Decrypt in November.
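Amber Video hasn’t published the code behind this, but the general technique, hashing content and anchoring the hashes to an append-only log, is simple to sketch. Below is a minimal, illustrative Python version: the segment data, the choice of SHA-256, and the in-memory hash-chained list standing in for a blockchain are all assumptions, not Amber’s actual design.

```python
import hashlib
import json
import time

SEGMENTS = [b"soundbite-01", b"broll-07", b"soundbite-02"]  # stand-in footage

def fingerprint(segment: bytes) -> str:
    """A 'fingerprint' here is just a SHA-256 digest of the raw segment data."""
    return hashlib.sha256(segment).hexdigest()

def log_fingerprint(chain: list, digest: str) -> None:
    """Append a timestamped, hash-chained record. A production system would
    anchor these entries on a public blockchain rather than a Python list."""
    prev = chain[-1]["entry_hash"] if chain else "0" * 64
    entry = {"digest": digest, "timestamp": time.time(), "prev": prev}
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    chain.append(entry)

# Fingerprint each element (soundbite, B-roll shot) separately as it is
# captured, so each clip can later be checked against the log on its own.
chain: list = []
for segment in SEGMENTS:
    log_fingerprint(chain, fingerprint(segment))
```

Because each soundbite and B-roll shot is hashed individually, a clip lifted out of the five-minute package can still be matched against the record made at capture time.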

If a video has been tampered with, it’s easy to flag it as fake. “You don't have to trust anyone,” said Allibhai.
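Continuing the sketch above, verification is trust-free in the sense Allibhai describes: anyone can re-hash a clip and compare the result with the logged digest, with no need to trust whoever is presenting the footage. The `verify_clip` helper and the sample clips are, again, illustrative.

```python
import hashlib

def verify_clip(clip: bytes, chain: list) -> bool:
    """Re-hash the clip and check whether that fingerprint was logged at
    capture time; any change to the pixels or audio alters the digest."""
    digest = hashlib.sha256(clip).hexdigest()
    return any(entry["digest"] == digest for entry in chain)

print(verify_clip(b"soundbite-01", chain))    # True: matches the capture-time log
print(verify_clip(b"soundbite-01!", chain))   # False: tampered, flag as fake
```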

But for now, Facebook is sticking with good old-fashioned humans, even though they’re worse at detecting fake news than algorithms.
