Facial Recognition Technology + Cameras = Tracking Our Every Move
Security cameras popped up everywhere. They tracked us. But at least no one knew who we were. Unless your mugshot was in a criminal database, you probably couldn’t be identified. You were just that man in the denim jacket or the woman with the big hat.
But that has all changed now. With facial recognition technology, almost anyone with access to the cameras can see where we are and who we’re with. They can know what we’re doing and what we’ve done. Privacy? What’s that?
Peter Thiel, co-founder of PayPal, was one of the first investors in Clearview AI, the company behind this technology. He’s a libertarian, yet he doesn’t seem too concerned about the average person’s freedom from being spied on.
Clearview Won’t Stop
Facial recognition technology (FRT) has been around for about 20 years, but only recently have people started using it to match faces to online photos. Clearview AI has scraped billions of images from Facebook, YouTube and other sites to allow law enforcement to identify suspects.
Over 600 law enforcement agencies now use the technology. But some large cities, including San Francisco, have banned police from using it, and New Jersey has barred its police force from doing so as well. Many of the companies whose images are being scraped say they have policies against it.
Facebook has demanded that Clearview stop using its pictures, but so far there has been no enforcement. Twitter says its rules specifically ban scraping for facial recognition.
Three Billion Photos
The company has compiled almost three billion photos. Combine that with an estimated 50 million surveillance cameras in the U.S., and it becomes easy to identify us. As soon as a camera captures our face, software can find out who we are.
It gets worse. Clearview can identify people who aren’t even looking at the camera. It figures out who you are even if you’re wearing glasses or a hat, or have blocked part of your face. It has identified suspected criminals by their tattoos.
The technology doesn’t always work, and the company won’t say how often it produces a false match. It has proven less accurate at matching people with darker skin.
People can request to have their photos removed from Clearview. However, doing so requires submitting a photo of yourself and a government-issued ID. How do we know that information won’t go into their database?
The app may become available to the public, whether by Clearview or through some new competitor popping up. This raises more privacy concerns. Anywhere you go in public, someone — not just law enforcement — can take a photo of you. They can instantly figure out who you are and find your personal details. Marketing, gaming and networking are all areas that could start using the technology. Manufacturers of body cameras are now adding FRT to their products.
Is this legal? Does FRT violate the First Amendment, the Fourth Amendment or due process? Clearview hired Paul D. Clement, who served as United States solicitor general under President George W. Bush, to draft a memo for potential customers assuring them that the technology violates neither the Constitution nor, he said, state biometric and privacy laws.
Rep. Jim Jordan (R-Ohio) has criticized the technology. He pointed out on NPR that when the IRS had a lot of power, it went after people based on their political beliefs. He said the government could use the technology to identify what types of rallies people attend, such as pro-life rallies.
“The government’s not allowed to walk into that rally and walk up to people and say, hey, show me your ID,” Jordan said. But now it can use a camera to scan everyone’s face, which he warned will have a chilling effect on free speech. He praised San Francisco for banning police use of the technology.
Identifying Criminals Based on Facial Features
The technology could also lead to supposedly identifying criminals based on facial features. This practice, known as physiognomy, has long been considered pseudoscience. But a study published in 2016 attempted to revive it.
Two leading Chinese scientists claimed that machine learning techniques can predict the likelihood that a person is a convicted criminal with nearly 90% accuracy using nothing but a driver’s license-style face photo. Their argument? “Criminal and non-criminal face images populate two quite distinctive manifolds. The variation among criminal faces is significantly greater than that of the non-criminal faces.” Oh.
Lawsuits against Clearview have started. A class action suit was filed alleging violations of Illinois’ biometric privacy law. It has a good chance of winning, since Facebook recently settled a lawsuit for $550 million over similar violations of privacy laws.
The Vermont Attorney General has sued Clearview, seeking to stop the company from collecting information about state residents and from selling that information.
Status of the Law
There is little federal law governing the use of FRT. The Privacy Act of 1974 regulates the FBI’s use of the technology. And while people have a right to privacy in their own homes, the Supreme Court made clear in Katz v. United States in 1967 that what a person knowingly exposes in public is not protected.
It remains to be seen how far state legislatures, Congress and the courts will go to address this. So far, the most many of us can do is opt out of Clearview’s database, which may not mean much if copycat companies do the same thing and don’t offer an opt-out. For most companies, this isn’t going to be about catching criminals. They want to identify you and send you targeted advertising, and few will give up such a good way to get us to spend more.