After an overwhelming backlash against his company, Facebook CEO Mark Zuckerberg announced plans to combat fake news. He said the social media platform would tackle misinformation by identifying those media outlets that were “trustworthy”.
“There’s too much sensationalism, misinformation and polarisation in the world today. Social media enables people to spread information faster than ever before, and if we don’t specifically tackle these problems, then we end up amplifying them,” he said last week.
“That’s why it’s important that News Feed promotes high quality news that helps build a sense of common ground,” he added.
Fighting Fake News
Zuckerberg’s statement and the company’s actions suggest a new approach to fighting fake news. Facebook’s founder had previously dismissed suggestions that Facebook played a part in sowing discord among US citizens during the 2016 US presidential election, calling the idea that his company may have been used by Russian trolls to influence the outcome “crazy”.
However, since then, Facebook has suspended nearly 500 accounts linked to an apparent Russian campaign of misinformation, intended to influence US public opinion.
But while the suspensions showed the company was serious about tackling misinformation, its latest attempt at filtering out ‘fake news’ is far less sophisticated.
Rather than relying on the machine learning and human review techniques the company claimed it was already using to remove fake news and false accounts, Facebook is asking the social network’s community to determine which outlets are most trustworthy, and will then prioritise those sources in its News Feed.
The company said it would reduce the amount of news people see and improve the quality of the news they do see.
New Facebook Screening Process
The new process will use surveys asking whether people are familiar with a news source and, if so, whether they trust it.
“The idea is that some news organizations are only trusted by their readers or watchers, and others are broadly trusted across society even by those who don’t follow them directly,” Zuckerberg said, adding that the social media firm will eliminate from the sample those who aren’t familiar with a source, so the output is a ratio of those who trust the source to those who are familiar with it.
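The calculation Zuckerberg describes can be sketched in a few lines of code. This is purely an illustrative reconstruction, not Facebook's actual implementation: the function name, data shape, and scoring logic are assumptions based only on his description (drop respondents unfamiliar with a source, then take the ratio of those who trust it to those who are familiar with it).

```python
# Illustrative sketch of the described scoring, NOT Facebook's real code.
# Each survey response is a pair: (familiar_with_source, trusts_source).

def trust_score(responses):
    """Return the ratio of trusting respondents to familiar respondents,
    ignoring anyone who is not familiar with the source."""
    familiar = [r for r in responses if r[0]]  # drop the unfamiliar
    if not familiar:
        return None  # no familiar respondents, so no meaningful score
    trusting = sum(1 for _, trusts in familiar if trusts)
    return trusting / len(familiar)

# Example: 10 respondents, 6 familiar with the outlet, 4 of whom trust it.
survey = [(True, True)] * 4 + [(True, False)] * 2 + [(False, False)] * 4
print(trust_score(survey))  # 4/6 ≈ 0.667
```

Note how the familiarity filter matters: a niche outlet known only to its own loyal readers can still score highly, because everyone outside that audience is simply excluded from its denominator.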
Because this is a small change that isn’t sophisticated and relies solely on user feedback, there are already claims that Facebook’s latest move is merely a PR stunt concocted to ease the backlash over fake news.
User feedback cannot be fully trusted: many users will simply ignore the survey questions, and trolls may deliberately rate an untrustworthy news source as trustworthy.