Facebook fact-checkers impact more than fake news


Facebook’s expanded fact-checking practice doesn’t overstep the line, but it does call public media literacy into question. 

In late October, Facebook announced its updated fact-checking practice, which allows its 33 fact-checking partners to debunk and down-rank headlines deemed false even if the story itself is verified. 

The blurred line between valid analysis and biased projection in a short headline has brought this ranking system under fire and cast doubt over Facebook’s role in journalistic practices. 

We live in a digital society where clickbait is the norm—but that doesn’t make it acceptable. While Facebook is technically an apolitical corporation, its public role in daily life still gives it an outsized responsibility to protect its users’ access to information. 

The public is interested in Facebook as an arbiter of knowledge because it appears to be the only entity with a large user base actively evaluating what people read. Twitter, for instance, is infamous for letting celebrity rumours and hate speech run rampant, putting its users at risk of consuming offensive material. 

Facebook’s fact-checking service is essential, but its ranking algorithms particularly affect the lifespans of smaller publications. It may consider outlets with smaller readerships less reliable, or down-rank an opinion piece for using biased words in its headline.

This isn’t anything new: the tech giant has been criticized for creating “news bubbles” in the past. 

For instance, its now-obsolete news sidebar highlighted stories on a rolling basis, using algorithms to mark them as important or relevant. Because Facebook uses technology to decide what its users see, feel, and understand, it informs and shapes their opinions by proxy. 

Facebook must be careful to walk the line between curating accurate reporting and curating public opinion. 

That said, the company didn’t choose this position: it has become a journalistic outlet whether it likes it or not. 

Facebook’s role as a custodian of information speaks to the larger issue of public media literacy—or a lack thereof. People should be taught where to get their news and how to interpret it critically and effectively. 

It’s not Facebook’s responsibility to teach people better content consumption practices, but it’s important for users to push back and express their discontent when its ranking systems reach the point of making them feel silenced. 

Facebook needs its users to stay relevant and impactful, which gives them the power to hold the service accountable for its actions. In return, Facebook should take its users’ concerns into account to win their trust. 

Other knowledge arbiters such as Wikipedia interact with their users, which has allowed the site to develop a more trustworthy, responsive policy since its inception.

Because Facebook’s fact-checking service impacts the reach that various publications and articles might garner, we can’t take the headlines we see at face value.

If the company’s journalistic curation fails to reflect public opinion, its users should log off and find their news elsewhere.   

—Journal Editorial Board


