(Image: A woman's hand holds a smartphone in a pink case.)

The App That Lets People Search You via Your Face Is Real and TERRIFYING


The end of privacy … now there’s an app for that.

In some truly awesome “we are one step closer to dystopia” news, The New York Times this weekend profiled “The Secretive Company That Might End Privacy as We Know It.” If that headline isn’t terrifying enough, the content of the article certainly is. Meet Clearview AI: the real company and app that matches your face with billions of pictures posted online.

Clearview AI, a secretive company that initially refused to speak to the New York Times for the report, is run by Hoan Ton-That, an entrepreneur who, before getting into facial recognition, made an app that let you paste Donald Trump’s hair onto pictures. That, along with his other ventures, failed, but he ended up partnering with Richard Schwartz, a former aide to Rudy Giuliani during his time as mayor, to found Clearview.

What does Clearview AI do? It uses facial recognition to match a face against billions of images it claims to have scraped from public platforms like Facebook, YouTube, and more. This kind of mass image collection and scraping is the first of many creepy and extremely Not Okay things about Clearview AI. The platforms from which it claims to have scraped data prohibit such data mining, and Twitter expressly forbids it, but Clearview doesn’t seem to care.
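The Times piece doesn’t describe Clearview’s internals, but face search tools like this generally work by converting each photo into a numeric “embedding” and then finding the stored embeddings closest to a query photo. The sketch below is purely illustrative under that assumption, with random stand-in vectors and made-up photo IDs rather than any real model or data.

```python
# Illustrative sketch only: the article doesn't describe Clearview's internals.
# Assumes a face-recognition model has already turned each photo into a
# fixed-length "embedding" vector; here the embeddings are random stand-ins.
import numpy as np

rng = np.random.default_rng(0)

# Pretend database: one 128-dimensional embedding per scraped photo.
database = rng.normal(size=(1_000, 128))
labels = [f"photo_{i}" for i in range(1_000)]  # hypothetical photo IDs

# Pretend query: an embedding computed from the photo someone just snapped,
# simulated here as a noisy copy of photo_42.
query = database[42] + rng.normal(scale=0.05, size=128)

# Cosine similarity between the query and every stored embedding.
db_norm = database / np.linalg.norm(database, axis=1, keepdims=True)
q_norm = query / np.linalg.norm(query)
similarities = db_norm @ q_norm

# Return the closest matches; a real system would threshold these scores
# before claiming an identification.
top = np.argsort(similarities)[::-1][:3]
for i in top:
    print(labels[i], round(float(similarities[i]), 3))
```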

With this database and facial recognition powers in hand, Clearview allegedly tried to market itself for all sorts of horrible purposes, from vetting unwanted lobby guests to doing “extreme opposition research” for anti-Semitic politicians. But it really found a foothold with law enforcement in 2017 and ended up with more investors, including noted Bond villain Peter Thiel.

The next scary thing? This isn’t an app that’s merely in the works. It exists, and Clearview has sold it to over 600 federal and local law enforcement agencies. Those agencies have no idea how Clearview works! They don’t know how secure the sensitive data they’re handing over to Clearview is! But they’re using it anyway, because it’s full of pictures the company shouldn’t even have.

But the uses of Clearview are so tempting. Law enforcement has used facial recognition for many years, but only against public records like driver’s licenses and mugshots; Clearview lets it use everything. In an early case in Indiana in 2017, Clearview identified a criminal in 20 minutes who would not have been ID’d with conventional tools. But is that really a good thing?

Well, no. There’s little to no data about how accurate Clearview is as an ID tool, and facial recognition software is still flawed; it misidentifies darker-skinned individuals more often than white people. And that lack of accuracy makes the potential for abuse even worse.

There’s a reason that companies like Google have held back on creating or using programs like this: the potential for abuse is massive. Say a man sees a pretty girl on the street, she turns down a compliment, and he feels rejected. What’s to stop him from snapping a picture, finding her on social media, and harassing her, doxing her, or stalking and killing her?

The terrifying possibilities are endless, from governments that could use this to seek out dissidents and detractors to regular people who could use it as a means to harass, stalk, and abuse. And what if it identifies the wrong person? That’s also terrible!

Of course, there’s the argument that all this data is already public. We put ourselves in this position by giving Big Brother selfies of our faces from every angle as we document every aspect of our lives on social media. If we’ve learned nothing else from You, it should be how easy it is to use and manipulate what we willingly put online for nefarious purposes.

So, we have a company built on scraped data it should not have, and its tool is being used by law enforcement and security agencies all over the country. Is there a way to put the genie back in the bottle and keep this app out of the hands of the bad guys? It’s not easy, because Clearview is just one company. From the Times:

“It’s creepy what they’re doing, but there will be many more of these companies. There is no monopoly on math,” said Al Gidari, a privacy professor at Stanford Law School. “Absent a very strong federal privacy law, we’re all screwed.”

Oh good, we’re waiting on the government to step in. That will go well! But if it doesn’t, we really are screwed, and we’ll have to adjust to a new normal where we can be identified anywhere.

I can’t help but think about China, a country full of cameras, where facial ID is required by the government and used by apps like WeChat for everything from chatting to paying for things. It’s convenient and good for fighting crime, but it comes at the price of privacy and freedom, and is that what we want? Will we even have a choice?

(via: CNET, image by Matthew Henry from Burst)


Jessica Mason
Jessica Mason (she/her) is a writer based in Portland, Oregon with a focus on fandom, queer representation, and amazing women in film and television. She's a trained lawyer and opera singer as well as a mom and author.