Facebook announced Wednesday that it will team up with two non-profits to fight the spread of fake news on its platform.
The National Democratic Institute and The International Republican Institute will help Facebook understand “the risks that people may face and what we might be able to do to mitigate those” around election time, explains Facebook executive Katie Harbath.
According to Facebook’s VP of Communications, Facebook will not have the right to review or approve the non-profits’ findings prior to publication.
“We think it’s an important new model for partnerships between industry and academia,” said a Facebook spokesperson. “The last two years have taught us that the same Facebook tools that help politicians connect with their constituents…can also be misused to manipulate and deceive.”
Facebook is also setting up a “command center” at its headquarters in Menlo Park, California, to monitor upcoming elections in the US and Brazil.
Facebook’s announcement comes after months of criticism over its failure to police information on its platform during the 2016 elections.
According to Facebook CEO Mark Zuckerberg, blocking the spread of fake news is one of the site’s “top priorities” for 2018. Facebook says it has deleted nearly 1.3 billion fake accounts since last October.
In August, the site said it had removed more than 600 “inauthentic” pages, accounts, and groups linked to Iran and Russia for spreading fake news. Twitter recently announced the suspension of nearly 300 accounts for “coordinated manipulation” and Google said it removed 58 accounts tied to Iran.
“In 2016, our election security efforts prepared us for traditional cyberattacks like phishing, malware, and hacking,” wrote Zuckerberg. “We identified those and notified the government and those affected. What we didn’t expect were foreign actors launching coordinated information operations with networks of fake accounts spreading division and misinformation.”
Facebook is better prepared for these kinds of attacks in 2018, he added.
Author’s Note: This is definitely a step in the right direction for Facebook.
If reviewers from both sides of the political spectrum judge content, and each side has the power to veto that content, the result could be a legitimate system for filtering out fake news. But for the system to really work, Facebook needs to use more than just two organizations and err on the side of not censoring content.
Editor’s Note: This is finally the right approach, but I have little faith that Facebook will be able to get past its biases.