Clearview AI Demonstrates the Dangers of Facial Recognition
Facial recognition technologies (FRTs) are in the news again. This time, it is Clearview AI, a small company that until recently was virtually unknown to anyone outside the 600 law enforcement agencies using its technology to match people’s photos to their online presence.
Clearview AI has scraped millions of websites – including news, business, education, and employment sites, social networks, and even the Venmo digital payment service – and built a database of three billion images. A user of its facial recognition software can take a photo of a person, and the system will match it against the database and report where else that person has appeared on the web, reports The New York Times.
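At its core, this kind of matching works by converting each face into a numeric "embedding" vector and finding the most similar vectors in the database. The sketch below is purely illustrative and has no connection to Clearview AI's actual implementation: the embeddings are random stand-ins, and `example.com` URLs are hypothetical placeholders for scraped sources.

```python
import numpy as np

# Hypothetical database: one 128-dimensional face embedding per scraped image,
# plus the URL each image came from. A real system would produce embeddings
# with a trained face recognition model; random vectors stand in here.
rng = np.random.default_rng(42)
database = rng.normal(size=(1000, 128))
source_urls = [f"https://example.com/photo/{i}" for i in range(1000)]

def match_face(query: np.ndarray, top_k: int = 3) -> list[tuple[str, float]]:
    """Return the top_k most similar database entries by cosine similarity."""
    db_norm = database / np.linalg.norm(database, axis=1, keepdims=True)
    q_norm = query / np.linalg.norm(query)
    similarities = db_norm @ q_norm            # cosine similarity per entry
    best = np.argsort(similarities)[::-1][:top_k]
    return [(source_urls[i], float(similarities[i])) for i in best]

# A query embedding computed from a new photo of an indexed person would land
# near that person's stored vectors; a noisy copy simulates this.
query_embedding = database[7] + rng.normal(scale=0.05, size=128)
for url, score in match_face(query_embedding):
    print(url, round(score, 3))
```

Production systems replace the brute-force scan with an approximate nearest-neighbor index so that billions of vectors can be searched in milliseconds, but the principle is the same.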
The app is being used or piloted by 600 law enforcement agencies in the US and Canada, “ranging from local cops in Florida to the FBI and the Department of Homeland Security.” Reportedly, the app has helped to solve a few cases: “shoplifting, identity theft, credit card fraud, murder and child sexual exploitation cases.”
In one reported case where the tech was used to solve a crime (in 20 minutes!), the police identified the perpetrator by matching his face from a bystander’s video recording of the crime with another video on social media which included the attacker’s name in the caption.
The technology seems to be so good that it works even with less-than-perfect pictures: it can recognize photos of people whose facial features are obscured by hats or glasses, photos of people in profile, or photos with only a partial view of the face.
While it is reassuring that this technology has been helpful in solving crimes, it is alarming to think what else it can be used for.
So far, Clearview AI’s app is used only by law enforcement agencies. But what is stopping the company from making it available to just anyone, asks The New York Times. And what inventive applications might people come up with? How about using GANs (generative adversarial networks) to create deepfake pornography and using it to degrade women?
Imagine your name, home address, family, friends, likes/dislikes, and anything else you’ve ever shared/posted on the web being instantaneously available to anyone anywhere – through a picture they take of you without asking your permission or without you even being aware of it. “It would herald the end of public anonymity,” writes The New York Times.
I also like this quote, which neatly summarizes the dangers of using technologies such as FRTs in an uncontrolled manner: “It’s creepy what they’re doing, but there will be many more of these companies. There is no monopoly on math,” said Al Gidari, a privacy professor at Stanford Law School. “Absent a very strong federal privacy law, we’re all screwed.”
There is a reason why FRTs are banned in San Francisco, why the US Senate is considering a bill to limit the use of facial recognition technology by federal agencies, and why the EU is considering banning FRTs for up to five years (so that regulators can work out how to protect us, the public).
It is not acceptable to unleash technologies without thinking through the consequences. It is up to us, the business leaders, to ensure that we do no harm to the people consuming our technologies, to the communities we live in, to our societal structures, or to the environment.
Want to Know More?
To learn more about how technologies are violating basic human rights, read our note Amnesty International Calls Google and Facebook a Threat to Human Rights and the Amnesty International report “Surveillance Giants: How the Business Model of Google and Facebook Threatens Human Rights.”
We discuss the two-faced nature of AI in our note Google and IBM Are Calling for AI Regulation. (This is not anything specific to AI, by the way – all technologies can be used for good or bad, but we disagree with those claiming that technologies themselves are neutral.)
To learn more about the harms you can unleash on your consumers, employees, and partners and how to prevent those harms, consult Info-Tech’s blueprint Mitigate Machine Bias.