Facebook’s News Algorithm Woes Continue; Mark Zuckerberg Insists They’re “Not a Media Company”

The Facebook news feed is one way that many people get their news, but Facebook isn’t a news outlet. They’re not, okay? They’re just not! Or at least, they really don’t want you to see them that way.

Remember the controversy over Facebook’s “Trending Topics” sidebar? Until a few months ago, I’m pretty sure no one was paying any conscious attention to which articles did and didn’t appear in that sidebar. (Okay, I know I wasn’t.) Then, after a self-identified former Facebook employee told news outlets that the “Trending Topics” sidebar had a liberal bias, scrutiny fell on the algorithms (and the people) that determine which articles end up “Trending” on Facebook. CEO Mark Zuckerberg had to meet with conservative figureheads to reassure the world that Facebook was an unbiased link-sharing source, and the company also announced plans to train its employees to be politically unbiased.

Even after all that, though, Facebook still must have thought that they needed to do more, because last week, they overhauled the entire “Trending Topics” team and removed almost all of the humans involved in deciding which stories appeared. Now, “Trending Topics” is run almost entirely by algorithm. How’s that been going?

Not great.

Today, the “Trending Topics” algorithm showcased a link to a story about Fox News firing Megyn Kelly. Except, uh, Kelly has not actually been fired from Fox News. The story also made the unconfirmed claim that Kelly is a secret supporter of Hillary Clinton, and that this is why she was supposedly fired (which, again, did not actually happen). The fraudulent story was being shared widely on Facebook, though, so the algorithm tossed it into “Trending Topics.” Without human oversight, the story stayed up long enough to embarrass Facebook, with many news commentators calling out how deplorable it was to see falsehoods promoted in the sidebar.

This is poor timing for Zuckerberg, who held a public live-stream event earlier today to discuss these exact issues at Facebook. According to Business Insider‘s report of the live-stream, Zuckerberg was asked about Facebook’s role as a news outlet and whether he saw the company in that light, to which he sighed, paused at length, then said “No.” He later elaborated: “When you think about a media company, you know, people are producing content, people are editing content, and that’s not us.”

Zuckerberg emphasized that Facebook is a “technology company” and “not a media company” multiple times in his response, and insisted that “every person gets to program their own Facebook experience, right? You choose who your friends are, who you want to follow, what businesses you want to follow, what other institutions, and in that way, one of the cool things I think about social media is that it’s the most diverse forms of media that has ever existed.”

To me, Zuckerberg’s response implies that responsibility for what appears on Facebook rests with its algorithms and with each individual user, who supposedly “gets to program their own Facebook experience.” This framing distances Facebook from any responsibility for the content that appears there. Because Facebook doesn’t identify as a news outlet, it doesn’t have to abide by the standards of a journalistic institution, which may also be why Facebook has seemingly no problem with giving the police access to a portal that facilitates the removal of posts, such as videos of police brutality.

Facebook is able to distance itself from these ethical questions by positioning itself as a “technology company” that simply aggregates links and content created elsewhere. This harks back to the Platonic ideal of an unbiased algorithm: a robot that shares “objective” news, free of human political biases. But, of course, as I say every single time I cover this topic, even algorithms have biases. Humans build them, and those humans decide what the algorithm will and won’t prioritize. There’s always going to be a bias involved, and that isn’t necessarily a bad thing.

As the most recent “Trending Topics” gaffe illustrates, there’s real value in having some human oversight to make sure the algorithm is sharing news that’s actually worthwhile. Is there any value in sharing a story that contains massive factual inaccuracies? Just because a lot of people are sharing a story doesn’t mean it’s inherently “good,” or even well-researched or factual, but an algorithm won’t necessarily know how to judge those qualities; it may only see how many “shares” a story gets. It all depends on how you define value, and different algorithms are going to have different standards for what’s “valuable” and what you “should” see.
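
To make that concrete, here’s a minimal toy sketch of the popularity-only ranking being described. To be clear, this is not Facebook’s actual code or data model; the `Story` fields and the human-review flag are hypothetical illustrations. The point is just that ranking purely by share count will happily surface a widely shared hoax, while even a simple editorial gate would hold it back:

```python
# A toy model of popularity-only trending vs. trending with human review.
# Purely illustrative; not Facebook's actual algorithm or data model.
from dataclasses import dataclass

@dataclass
class Story:
    headline: str
    shares: int                 # how widely the story is being shared
    passed_human_review: bool   # hypothetical editorial fact-check flag

stories = [
    Story("Accurate but modest local report", shares=4_200, passed_human_review=True),
    Story("Fabricated celebrity firing story", shares=310_000, passed_human_review=False),
]

def trending_by_shares(stories, top_n=1):
    """Popularity-only ranking: sees nothing but share counts."""
    return sorted(stories, key=lambda s: s.shares, reverse=True)[:top_n]

def trending_with_oversight(stories, top_n=1):
    """Same ranking, but only stories a human has vetted are eligible."""
    vetted = [s for s in stories if s.passed_human_review]
    return trending_by_shares(vetted, top_n)

print([s.headline for s in trending_by_shares(stories)])
# -> ['Fabricated celebrity firing story']  (the hoax wins on shares alone)
print([s.headline for s in trending_with_oversight(stories)])
# -> ['Accurate but modest local report']
```

In this toy example, “value” is simply whatever the ranking function measures; swap the key, and the “Trending” list changes.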

I don’t necessarily trust an algorithm to know what I “should” see any more than I trust a human employee at Facebook. I don’t even always trust myself to know what news is important; I try to follow as many different types of writing as I can, but it’s a process. It’s a process I think about a lot because it’s my job, but I don’t expect a non-journalist to know how to find stories or decide what to read every day. Most people probably consider Facebook, Twitter, and/or Reddit the equivalent of reading the front page of a newspaper; they all include breaking news, so what’s the difference?

The problem is that in the past, newspapers could be held accountable for making massive mistakes on the front page. If Facebook makes a mistake on their “front page,” they can blame it on an algorithm, or blame it on their own users, who shared a fake story enough times that it became a “Trending” piece. Sure, Facebook could just re-tool their algorithm and hope to prevent this from happening again in the future. But how could anyone hope to design an algorithm that would always know which news is the “right” news for you to see? Hiring a diverse editorial team for oversight seems like a better plan to me … but I guess that’s just what a human journalist would say.

(via BI, image via C_osett/Flickr)
