Should We Care if Facebook Has an Anti-Conservative Bias?
I am a die-hard liberal. I spend my days advocating for LGBT rights and reproductive health care access, supporting higher taxes to fund improved government social safety net programs, and voting for politicians who support environmental protection and workers' rights. I often foam at the mouth after hearing the claptrap that comes out of some Fox News commentators.
Still, when I learned that Facebook may be suppressing conservative stories from appearing on its “trending” feed, I was very disappointed.
Former employees recently told tech blog Gizmodo that they routinely stopped topics like the Conservative Political Action Conference, Mitt Romney and Rand Paul from popping up on Facebook's sidebar of popular stories. At the same time, those workers claimed that they injected less-talked-about subjects like Charlie Hebdo and Black Lives Matter so as to suggest that they were trending.
“It was absolutely bias. We were doing it subjectively. It just depends on who the curator is and what time of day it is,” a former worker told Gizmodo. “Every once in awhile a Red State or conservative news source would have a story. But we would have to go and find the same story from a more neutral outlet that wasn’t as biased.”
Provided these claims are true (Facebook says they aren't), we should all be concerned about the suppression of conservative content. It is paramount that we be exposed to a broad swath of fact-based opinions, even if they differ from our own, so we can make informed judgments about the world around us.
According to the Pew Research Center, more than 60 percent of Americans get their news from social media sites like Facebook. The content of this news depends on the pages users choose to like and follow, of course, but also on what the social media community is supposedly discussing.
The trending feed should reflect these criteria, in theory making it one of the few spaces on Facebook with content that may conflict with our own personal views.
A study published in Science last year noted that the content we view on the site usually aligns with our own political ideologies. We can blame Facebook's algorithms to a point, but eventually we have to look in the mirror.
We're more likely to click on stories we agree with than on those with divergent messages. And our own personal choices limit the diverse viewpoints we're exposed to more than the technology does, researchers say.
Some may argue that certain opinions need to be suppressed because they’re actively hurtful or factually inaccurate. I’d like to make clear that my argument for inclusive information excludes bigotry — which typically relies on ignorance to target and cut down specific groups of people — and misinformation.
Social media clearly reflects our inability to listen to diverse opinions. And Facebook's own biases don't help.