Could Blockchain Tech Solve Facebook’s Deepfake Issue?

Facebook has banned deepfakes and other manipulated video or audio from its platform to stop the spread of misinformation, the company said in a statement.

The social media giant said it will remove edited videos such as those that mislead people into thinking someone said something they did not say.

“This policy does not extend to content that is parody or satire, or video that has been edited solely to omit or change the order of words,” Facebook said. “The doctored video of Speaker Pelosi does not meet the standards of this policy and would not be removed. Only videos generated by artificial intelligence to depict people saying fictional things will be taken down.”

The company will send reported deepfakes to its third-party fact-checkers, such as the Associated Press and Agence France-Presse, for review.

Monika Bickert, head of Global Policy Management, said that although such videos are still rare on the internet, they represent a major challenge for the company.

However, some blockchain companies promise to address this challenge by tracking videos back to their source.

“Think about a news piece, say a five-minute news video; there might be 20 soundbites and 15 B-roll shots. [With Amber Video’s software], each one of those elements will maintain its fingerprint all the way through to distribution,” said Amber Video CEO Shamir Allibhai.
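Amber Video has not published its implementation, but the fingerprinting idea Allibhai describes can be sketched in a few lines: hash each media element (a soundbite, a B-roll clip) at capture time, record those hashes, and re-check them at distribution time so any edit becomes detectable. Below is a minimal, hypothetical Python sketch; the function names, the `news_package` directory, and the segment layout are illustrative assumptions, not Amber Video's actual software.

```python
import hashlib
from pathlib import Path


def fingerprint_segment(path: Path, chunk_size: int = 1 << 20) -> str:
    """Return a SHA-256 hex digest of one media segment (soundbite, B-roll clip)."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def fingerprint_story(segment_paths: list[Path]) -> dict[str, str]:
    """Map each segment's filename to its fingerprint, taken at capture/edit time."""
    return {p.name: fingerprint_segment(p) for p in segment_paths}


def verify_story(segment_paths: list[Path], recorded: dict[str, str]) -> list[str]:
    """Return the names of segments whose current hash no longer matches the record."""
    return [
        p.name
        for p in segment_paths
        if fingerprint_segment(p) != recorded.get(p.name)
    ]


if __name__ == "__main__":
    # Hypothetical five-minute news package made up of many short elements.
    segments = sorted(Path("news_package").glob("*.mp4"))
    record = fingerprint_story(segments)       # in practice, anchored somewhere tamper-evident
    tampered = verify_story(segments, record)  # re-checked later, at distribution time
    print("modified segments:", tampered or "none")
```

The blockchain angle, as such companies pitch it, is where the recorded fingerprints live: anchoring them on a public ledger would give the record a timestamp that the publisher itself cannot quietly rewrite.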
