Facebook’s erroneous suspension of environmental groups proves the onus to fact-check is on us

As social media sites boom, online misinformation is becoming more prevalent—and harmful. Facebook has vowed to fight “fake news” on countless occasions, yet continually fails to target the right people. As social media consumers, we need to stop relying on businesses like Facebook to point out misinformation and take fact-checking into our own hands.
Last week, Facebook launched a new initiative to promote climate science. Days later, it suspended numerous environmentalist groups from the platform, including activists opposing the Coastal GasLink pipeline cutting through Wet’suwet’en land. The suspensions were later chalked up to error.
This isn’t the first time Facebook has vowed to fight misinformation only to royally fail at the task. While its efforts are admirable, its review system doesn’t work nearly as well as it should.
Large climate activist groups like Greenpeace USA have the resources to fight Facebook’s suspensions, but smaller activists, especially Indigenous ones, rely on social platforms to mobilize and might not have PR teams watching their backs.
Facebook’s suspension spree wasn’t just a mistake—it was actively harmful to these groups.
Weeding out misinformation should theoretically be straightforward, yet some posts opposing climate action can be deemed opinion and spared the fact-check.
Facebook needs to decide what kind of company it wants to be: either it commits wholeheartedly to fighting fake news, or it abandons the responsibility altogether. If it wants to fact-check, it needs to do so in a way that doesn’t harm smaller activists like the environmentalists impacted last week, and doesn’t let false information off the hook under the guise of opinion.
Facebook could rectify this by following in Twitter’s footsteps and targeting larger accounts and government leaders, like US President Donald Trump. With large followings, these accounts are most responsible for the spread of fake news. Paying special attention to the information shared by these accounts would have the most significant effect and might prevent smaller activists from getting caught in the fake news crossfire. 
On the other hand, Facebook is a business and our trust of it should only go so far. At a certain point, we—as social media users—need to take matters into our own hands and be diligent about what we’re reading.
We can start by giving people the tools to fact-check from a young age. In school, we’re taught how to find reliable sources, but these sources are almost always scholarly. The online world is constantly changing; we need to adapt with it and recognize that social media—with all its misinformation—inevitably plays a role in children’s lives today. Navigating it should be taught accordingly.
Facebook needs to sort out its misinformation policy. In the meantime, we must become better fact-checkers and stop relying on social media platforms to do the work for us.
—Journal Editorial Board

All final editorial decisions are made by the Editor(s)-in-Chief and/or the Managing Editor. Authors should not be contacted, targeted, or harassed under any circumstances. If you have any grievances with this article, please direct your comments to journal_editors@ams.queensu.ca.
