
Facebook Responds To Accusations That It “Suppressed” Right-Wing Stories

Why do we expect neutrality from Facebook when we don’t expect it from media companies themselves?

Photo: AFP / Stringer

In a Gizmodo story published Monday morning, former curators of Facebook’s “trending” news section revealed that—contrary to the company’s FAQ—they routinely “suppressed” right-leaning stories from appearing in the influential feed. Monday afternoon, other former curators of the section contradicted the Gizmodo report on Fast Company. And Facebook executive Tom Stocky, whose team is responsible for the section, issued a strongly worded statement denying Gizmodo’s allegations:


“There are rigorous guidelines in place for the review team to ensure consistency and neutrality. These guidelines do not permit the suppression of political perspectives. Nor do they permit the prioritization of one viewpoint over another or one news outlet over another. These guidelines do not prohibit any news outlet from appearing in Trending Topics.”

Gizmodo followed up the report this afternoon with news that the U.S. Senate Commerce Committee is launching an inquiry into Facebook’s process for curating trending news.

As reported by the Wall Street Journal, John Cook, editor in chief of Gawker Media (which runs Gizmodo), is standing by the story’s claims that former workers deliberately kept stories with a conservative bent out of the section, including pieces from right-wing outlets like Drudge Report or Breitbart, unless they could be sourced from more “neutral” news organizations like the New York Times or CNN. Additionally, Gizmodo reported that stories about Facebook itself were allegedly K.O.’d from the roster of trending pieces, while news deemed “important” by the staff—like the Charlie Hebdo attacks in Paris or material related to the Syrian Civil War—was injected into the stream to project the image of a trustworthy news source.

“Depending on who was on shift, things would be blacklisted or trending,” the anonymous source told Gizmodo. “I’d come on shift and I’d discover that CPAC or Mitt Romney or Glenn Beck or popular conservative topics wouldn’t be trending because either the curator didn’t recognize the news topic or it was like they had a bias against Ted Cruz.”

Given Facebook’s prior attempts to “tinker with” our emotions, the Gizmodo report may not have come as much of a surprise. Yet this kind of censorship—if it is in fact occurring—would be troubling. As Louis Brandeis said, most of the time, “the remedy to be applied is more speech, not enforced silence.” And Facebook’s alleged suppression of these stories raises a number of major questions facing both media companies and social platforms. Like: As Facebook becomes the distribution method for an enormous amount of our daily news, featuring content from major publishers that lives on the platform itself, how much responsibility does Facebook have to communicate that information neutrally? Is this behavior more like a journalist covering up information that doesn’t fit a chosen, predetermined narrative? Or is it more like a newsstand choosing to carry one newspaper and not another?


Clearly, a private company’s censorship is different from government censorship, or even, as The A.V. Club points out, from the choices made daily in any newsroom. But when 600 million people get their news from Facebook every day—and participating in the site’s ecosystem is worth a ton of money in clicks to both digital publishers and e-commerce sites—the ethics start to get pretty complicated. One thing’s for sure: If Facebook wants its users to consume a product-driven, company-approved version of the news, it is welcome to offer it. But as of now, that certainly isn’t how the company presents trending news to its users:

“Trending shows you topics that have recently become popular on Facebook. The topics you see are based on a number of factors including engagement, timeliness, pages you've liked and your location.”

While Facebook’s description doesn’t explicitly rule out interference, it sure seems to imply that the stories that show up are the output of an algorithm, somehow derived from a community’s or an individual’s preferences and actions.

Facebook’s trending news section has been a work in progress since its launch in January 2014, and in that time, the company has generally kept the details of the project under wraps. Just last week, another report about the “trending” section from writer Michael Nunez shed some light on the operation, the journalists who maintained it, and how the company’s secrecy might be tied to the larger questions inherent in Facebook suppressing or promoting a given story:

One reason Facebook might want to keep the trending news operation faceless is that it wants to foster the illusion of a bias-free news ranking process—a network that sorts and selects news stories like an entirely apolitical machine. After all, the company’s entire media division … depends on people’s trust in the platform as a conduit for information. If an editorial team is deliberating over trending topics—just like a newspaper staff would talk about front-page news—Facebook risks losing its image as a non-partisan player in the media industry, a neutral pipeline for distributing content, rather than a selective and inherently flawed curator.

One of the things that’s most interesting to me in the conversation about Facebook’s approach, Gizmodo’s story, and the subsequent outrage (and ambivalence) is the fact that most of us don’t really even expect news companies themselves to be neutral anymore. Nor do we want them to! Sure, we expect certain professional standards of practice and integrity from newsgathering operations. But we choose, generally, to stay in media bubbles and social spheres of our own preference, building out informational paradigms that challenge us just enough while mostly affirming our worldviews.

Perhaps, faced with slanted reporting, partisanship, and opinion pieces from who-knows-who (not unlike this one), we’re desperate for an arbiter who deserves our trust, or is at least somewhat unbiased, or who doesn’t seem like part of the whole ugly media circus. It’s true that Facebook should do right by its users, whether by renaming the trending feature to something that sounds a little less impartial, by giving it less prominence in the design, or by simply highlighting—even celebrating—the fact that its quote-unquote trending feed is curated. But for anyone to look to any company run by Mark Zuckerberg—who has made at least some of his political leanings clear, and has a spotty track record when it comes to transparency—for that guidance is kind of silly.


Sure, the company presented a product as one thing when, in fact, it appears to have been something else. Ultimately, whether Facebook should share content creators’ journalistic responsibilities or should instead adopt some other set of public standards appropriate for a world-dominating supercorporation that manipulates us and logs all our personal data is still up in the air.

But the issue will become more and more pressing as Facebook looks to cut out the middlemen (read: publishers) who want access to its audience and simply create all the content itself. That’s when things are going to get really weird.
