YouTube Offers The Most Inane Excuse for Why It Can’t Moderate Its Trending Section

After a video alleging that a Parkland shooting survivor was a “crisis actor” spent a while at #1 on YouTube’s hugely influential trending page, the streaming site foundered in trying to explain why actual humans couldn’t help out.

YouTube’s trending page has come under scrutiny of late, after its algorithm allowed content like Logan Paul’s video in Japan’s “Suicide Forest” to reach #1 and promoted actual fake news conspiracy videos (not Trump-style “Fake News”) in the wake of tragedies.

Of course, Internet denizens who click on and share awful stuff like this bear some responsibility, but here we get into serpent-eating-its-own-tail territory: tons of people visit YouTube’s trending page, so bad content gets clicked on all the more the higher it rises, and it gains a patina of respectability simply by being on the trending page in the first place. YouTube really should have a hand in moderating what shows up and is promoted on its most visible and most visited portal.

YouTube’s argument for why it can’t is … silly at first glance and terribly insincere after further consideration. As CNBC reports:

The Trending tab features videos algorithmically, factoring in things like view count, rate of growth in views, and the age of the video. YouTube’s position is that because it has country-specific Trending tabs all over the world that update approximately every 15 minutes, it would be impossible to have humans moderate that section.
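To make the mechanics concrete, here is a minimal sketch of a trending ranker along the lines the article describes: score each video by view count, rate of growth in views, and age, then re-rank periodically. The weights and formula are my own illustrative assumptions; YouTube’s actual algorithm is not public.

```python
import math
from dataclasses import dataclass

@dataclass
class Video:
    views: int               # total view count
    views_one_hour_ago: int  # used to estimate rate of growth
    age_hours: float         # age of the video

def trending_score(v: Video) -> float:
    # Hypothetical scoring: log-damp raw popularity, reward fast
    # recent growth, and decay the score as the video ages.
    growth = max(v.views - v.views_one_hour_ago, 0)
    return (math.log1p(v.views) + 2.0 * math.log1p(growth)) / (1.0 + v.age_hours / 24.0)

def trending_tab(videos, k=10):
    # Re-ranked on a schedule (roughly every 15 minutes, per the article).
    return sorted(videos, key=trending_score, reverse=True)[:k]
```

Under a scheme like this, a fresh video gaining views quickly outranks an older one with far more total views, which is exactly how a conspiracy video can rocket to #1 before anyone looks at it.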

CNBC spoke with an expert who poked massive holes in this assertion immediately:

“This is an absurd excuse,” Christo Wilson, an assistant professor in Northeastern University’s College of Computer and Information Science, tells CNBC.

“YouTube implemented the Trending algorithm, and if it is updating too fast to moderate, then the solution is to simply slow it down. This is a technical change that is well within YouTube’s control.”

In the same way that YouTube makes the decision to give Trending such a high-profile spot on its home page, it can make the decision to tweak how it works.

“If Trending videos were currently being picked by a team of people, those people would be getting fired after today,” Wilson says. “Why do we expect less from an algorithm?”

To say, in effect, “whoops, our algorithm goes too fast, so we’re not responsible for what we promote” is preposterous. As Wilson points out, this is entirely within YouTube’s power to change. They have access to the world’s finest computer engineers. This would not be a big tweak.

I understand YouTube’s reluctance to have a human team selecting these videos—they probably shouldn’t let moderators choose and curate the trending page. But it’s something else entirely for moderators to ensure that guidelines-violating content doesn’t trend in the first place.

I’ve worked as a moderator for social media sites, and I’ll be the first to say humans can be problematic moderators. We all come with opinions and biases, and no one is perfect. But we are, at this point, better equipped to judge context than machines. I’ve gone over thousands of pieces of content that were flagged by users or algorithms and turned out to be just fine once I evaluated them by the site’s standards. That’s why human moderators exist. People think of moderators as censorious, but they’re also working to rescue content that might otherwise be removed by a confused machine or an outraged viewer. It goes both ways.

YouTube should have a system in place where its trending section is slowed down so that this influential content can be evaluated by real people. Those people should be there to ensure, simply, that the content does not violate YouTube’s guidelines. It’s not political to say that a video accusing a Parkland survivor of being a crisis actor should not trend. That falls under the company’s harassment guidelines.
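The “slow it down and add review” idea sketched above is a simple gate: the algorithm proposes candidates, and a human reviewer approves them against the site’s guidelines before they go live. All the names below are hypothetical, and this is a sketch of the workflow, not of any real YouTube system.

```python
from collections import deque

class ReviewedTrending:
    """Trending tab where algorithmic picks wait for human sign-off."""

    def __init__(self):
        self.pending = deque()  # algorithm output awaiting human review
        self.live = []          # what actually appears on the trending tab

    def propose(self, video_id: str) -> None:
        # Called by the ranking algorithm each cycle.
        self.pending.append(video_id)

    def review(self, approve) -> None:
        # approve(video_id) -> bool is the human moderator's judgment:
        # does this video comply with the guidelines? Only approved
        # videos ever reach the public trending page.
        while self.pending:
            video_id = self.pending.popleft()
            if approve(video_id):
                self.live.append(video_id)
```

The cost is latency (videos trend minutes later than they otherwise would), which is precisely the trade-off Wilson argues YouTube could easily afford to make.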

YouTube cannot wait to take action retroactively when the damage is already done. And they don’t have to. They have Google’s practically infinite resources on hand to create all the tools that they need and hire teams of skillful people. To claim that their hands are tied here is disingenuous and flat-out wrong.

(via CNBC, image: Google)

Author
Kaila Hale-Stern
Kaila Hale-Stern (she/her) is a content director, editor, and writer who has been working in digital media for more than fifteen years. She started at TMS in 2016. She loves to write about TV—especially science fiction, fantasy, and mystery shows—and movies, with an emphasis on Marvel. Talk to her about fandom, queer representation, and Captain Kirk. Kaila has written for io9, Gizmodo, New York Magazine, The Awl, Wired, Cosmopolitan, and once published a Harlequin novel you'll never find.
