How Clearview AI Is Using Facial Recognition
Clearview AI built a massive database of faces that it’s making available to law enforcement, and nobody’s stopping it.
Your Instagram pictures could be part of a facial recognition database that’s been made available to law enforcement agencies. That’s thanks to Clearview AI, a mysterious startup that has scraped billions of images from across the web, including from social media platforms like Instagram and Twitter.
Law enforcement has been using facial recognition for a while. But Clearview’s technology represents a scary step further than anything we’ve seen before, according to reporting from the New York Times. The secretive company says it’s created a database of over 3 billion images that have been scraped from all corners of the internet, including social networks like Facebook, Instagram, and YouTube. From just a snapshot or video still, Clearview claims its app lets a police officer identify a face and match it with publicly available information about the person, within just a few seconds.
Faced with these concerns, the world’s biggest tech companies are stepping up, sending cease-and-desist letters to Clearview that order the company to stop scraping their sites for our data. But it’s not clear how much power those companies have, or how invested they actually are in protecting our personal information. While some lawsuits against Clearview are also popping up, it’s not yet apparent how Clearview could be stopped. That has privacy advocates pointing to the need for a federal law regulating, or even outright banning, facial recognition in the United States.
So here’s how Clearview’s tool works. Say you have an image of a person, but you don’t know their name. You could input that photo into Clearview’s app, and it will turn up any image of the person that it has scraped from the internet, along with links to the websites those images came from. That could be a good amount of information.
Again, Clearview’s database reportedly includes more than 3 billion images taken from around the web. That’s much more than what law enforcement agencies typically have access to. The Times reports that the technology will work with images of faces from many different angles, while older facial recognition tools used by police departments might require the subject to be looking straight ahead, like in a mug shot.
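Clearview has not disclosed how its system works internally, but face search tools of this kind are commonly described as nearest-neighbor lookups over face "embeddings": each scraped photo is converted into a numeric vector, and a query photo is matched against the database by vector similarity. The sketch below illustrates that general idea only. The embeddings are random stand-ins (a real system would compute them with a trained neural network), and the URLs are invented placeholders.

```python
import numpy as np

rng = np.random.default_rng(42)

def normalize(v):
    """Scale a vector to unit length so dot products equal cosine similarity."""
    return v / np.linalg.norm(v)

# Hypothetical index: 128-dimensional face embeddings mapped to the pages the
# photos were scraped from. Random vectors stand in for real embeddings.
index = [
    (normalize(rng.standard_normal(128)), f"https://example.com/profile/{i}")
    for i in range(1000)
]

def search(query_embedding, top_k=3):
    """Return the top_k closest index entries by cosine similarity."""
    q = normalize(query_embedding)
    scored = [(float(q @ emb), url) for emb, url in index]
    scored.sort(reverse=True)
    return scored[:top_k]

# A probe embedding for "person 7", slightly perturbed to mimic a new
# snapshot of the same face taken under different conditions.
probe = index[7][0] + 0.05 * rng.standard_normal(128)
results = search(probe)
# results[0] is (similarity_score, url) for the closest match.
```

The point of the toy example is the asymmetry the article describes: once embeddings exist for billions of scraped photos, identifying a new face is just one cheap similarity query against the whole index.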
That means images from social media posts on platforms like Instagram could pop up — even images that were once publicly available but no longer are. And keep in mind: The tool doesn’t just surface pictures that you’ve taken and posted online. It will also turn up any photos posted of you, even those posted without your consent or knowledge.
“Clearview is a search engine for publicly available images,” Clearview CEO Hoan Ton-That told Recode in an email. “We do not and cannot index any images that are private or protected, such as those in a private Instagram account. A previously public image may or may not be searchable with Clearview depending on the time it was public and that would depend on the individual case.”
More than 600 law enforcement agencies have used Clearview AI in the past year, as have federal agencies like the Federal Bureau of Investigation (FBI) and the Department of Homeland Security (DHS), according to the Times. The FBI would not confirm to Recode that it had used Clearview’s tool, instead pointing to its June 2019 congressional testimony about its use of facial recognition more broadly. The DHS did not respond to Recode’s request for comment by the time of publication.
The tool has also been provided to some companies for security, though Ton-That wouldn’t tell Recode which ones. The Times has reported that “at least a handful of companies” have obtained licenses for Clearview’s technology “for security purposes.” Meanwhile, Ton-That told Recode: “We decline to comment due to confidentiality and other reasons.”
It’s also unclear who else, including foreign governments, Clearview is willing to do business with. Ton-That told the Times, “If it’s a country where it’s just governed terribly, or whatever, I don’t know if we’d feel comfortable, you know, selling to certain countries.” Nevertheless, BuzzFeed News recently reported that Clearview has plans to sell the tool to at least 22 countries, including some with concerning human rights records. Clearview told Recode that it did not currently have contracts outside the US and Canada. The Royal Canadian Mounted Police, the national police service in Canada, told Recode that it does not comment on specific investigative tools but that it researches emerging technologies.
Even the use of this tool by law enforcement alone raises questions. Imagine, for instance, the ethical problems at play with a police officer using Clearview to identify a protester. Or say the facial recognition doesn’t work as it should, and a false “match” leads to someone being arrested for a crime they didn’t commit.
But there’s also fear that Clearview’s technology could one day be made available to anyone, and such a development could destroy our expectation of being anonymous in public. It’s not difficult to imagine terrifying uses of this. Imagine if a nude picture of you was, at some point, posted online. With the snap of a phone camera, anyone with Clearview could potentially find that image in an instant. Or imagine you’re walking down the street, and someone decides they want to know where you live and whether you have kids. All they might need is the Clearview app.