Clearview AI's facial recognition settlement is a frightening window into our future

The tech company is one of the largest collectors of facial recognition images in the world. Will its product be used to restrict our civil rights?

The Republican Party might pursue an all-out assault on civil rights if the Supreme Court overturns federal abortion rights, and that assault will require a robust surveillance state.

As I wrote last week, the high-tech snooping that GOP-led states will need to enforce their draconian abortion bans sounds like it’s straight out of a sci-fi flick. I’m talking phone geolocation software that can notify vigilantes and law enforcement officials when you or anyone you know approaches an “unlawful” abortion clinic. Or software that can scour personal emails for any reference to the word “abortion” or other words that conjure right-wing suspicion.

Another powerful tool for fascist snoopers is facial recognition software. Its potential for misuse was at least partially curbed Monday with a court settlement stemming from a lawsuit filed by the American Civil Liberties Union against Clearview AI, a company specializing in facial recognition software.

In its lawsuit, filed in 2020, the ACLU alleged Clearview violated Illinois state law by failing to get consent from people before collecting or using their biometric information.

In a settlement Monday, the company agreed not to sell its product to most private companies and individuals in the United States. 

Clearview, which bills itself as “the world’s largest facial network,” scrapes many of its images from social media websites. The company reportedly told investors it’s on track to have 100 billion photos in its database within the next year, and company officials have said they want to make its software available to private groups.

But as part of the settlement, the company said it will largely restrict sales of its software to law enforcement agencies — prohibiting most sales to individual officers, businesses and private citizens. 

To be clear: The fact this technology is still available to law enforcement agencies shouldn’t inspire much confidence. Abortion rights supporters, for example, have pointed to Clearview’s cozy relationship with law enforcement groups for years as a sign of potential danger.

Writing about Clearview for the online magazine Lady Science in 2020, co-editor-in-chief Leila McNeill said:

Clinic staff, escorts, and patients rely on anonymity to provide some level of protection that allows us to leave the harassment and threat of violence behind when we leave the clinic. But when that anonymity is gone, the risk of targeted violence that follows us to our homes and workplaces escalates. Even if this app isn’t yet in the hands of the public, the friendly law enforcement could very possibly oblige the protesters.

It’s easy to imagine a scenario in which an officer friendly with anti-abortion groups runs a facial recognition scan on their behalf in order to help them hunt down a woman believed to have “unlawfully” terminated her pregnancy. Or, in a state like Texas where people who aid women in obtaining abortions can be punished, authorities could run a scan to track down an Uber driver who drove the woman to the clinic. 

The point is this: While the Clearview settlement is an important step in dialing back the power of the surveillance state, the potential for nefarious uses remains. And we all need to be wary — in this high-tech, increasingly anti-democratic world of ours — of emerging technology being used against us. Without proper scrutiny, the technology of today and tomorrow will be used to further restrict the freedom of people in America.