How Facial Recognition Monetizes Fear & Insecurity

Today’s blog covers some of the consequences that arise when businesses, such as PimEyes, use facial recognition technology to monetize people’s fears about their digital identity.


The internet is a prime example of a powerful tool that can be used for good or evil purposes. The discussion of issues, both anticipated and unintended, should not be mistaken for an indictment of the existence of any specific technology.

The use of facial recognition software has recently exploded. Some applications of this technology are benign, such as using the software to verify identity. However, other applications raise red flags.

What is PimEyes?

PimEyes is a search engine that uses facial recognition technology to search the internet for photos of a specific person. The company uses a monthly subscription model; pricing varies with the number of searches allowed and the number of alerts sent to the user when new photos are uploaded to the internet.

What problems can this cause?

Because PimEyes crawls the internet in search of images paired with a specific facial profile, users must upload a photo before performing a search. Three problems stand out when assessing the cultural impact of this type of technology.
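To make the mechanism concrete, here is a minimal sketch of how this kind of search generally works. This is an illustrative toy, not PimEyes's actual pipeline: the URLs, vectors, and threshold below are invented, and a real system would generate embeddings with a neural network and search billions of them.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search(query: np.ndarray, index: dict, threshold: float = 0.8) -> list:
    """Return URLs of indexed images whose face embedding matches the query."""
    return [url for url, emb in index.items()
            if cosine_similarity(query, emb) >= threshold]

# Hypothetical index: each crawled image is reduced to a numeric "facial
# profile" (embedding). Real embeddings come from a trained model.
index = {
    "site-a/photo1.jpg": np.array([0.9, 0.1, 0.0]),
    "site-b/photo2.jpg": np.array([0.0, 1.0, 0.2]),
}

# The uploaded photo is embedded the same way, then compared to the index.
query = np.array([0.85, 0.15, 0.05])
print(search(query, index))  # only photo1 is similar enough to match
```

The key point for the discussion that follows: once a face is reduced to a vector, matching it against every indexed image on the internet is a cheap, automatable lookup.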

1. PimEyes monetizes fear & insecurity

Why would anyone need to know and catalog every single personal image on the internet? Given the polarized nature of modern culture, it should come as no surprise that many people are concerned about their online reputations.

The permanence and public nature of the internet makes online reputation highly relevant and perpetually fluid. These realities create the perfect environment for companies – like PimEyes – to monetize the fear, insecurity, and curiosity that plague those concerned with their online reputations.

PimEyes even has an option for people to remove unwanted images from their search results.

2. PimEyes can be used to track anyone

Although PimEyes encourages users not to perform searches on third parties, there is functionally no way to prevent a user from uploading someone else’s image. This makes it easier than ever for people to virtually stalk one another.

The Washington Post reported on the dangers this presents. The company’s director, who ironically chose to remain anonymous, stated, “We don’t encourage people to search for other people — it is their own decision to break the rules.”

However, the decision to “break the rules” can have grave consequences. In the same article, the Post noted that anyone around the globe can use PimEyes or similar technologies to carry out ethically questionable conduct.

For example, researcher Conor Nolan deputized himself as a volunteer assistant to the FBI, scouring PimEyes for hours in search of possible suspects in the investigation that followed the events of January 6th.

However, Nolan lives across the pond in the UK and, aside from an interest in the investigation, appears to have no connection to the FBI or the events that occurred at the US Capitol. Despite stating he is “uncomfortable” with the technology, Nolan determined that, “ethics aside,” he would use it again if there was a sufficient “need.”

Using technology in this way brings us to a third concern: facial recognition search engines can incentivize stalking, undermining the social trust necessary for stable societies.

3. Facial recognition search engines can incentivize stalking

Aside from encouraging people to deputize themselves in the latest politically charged investigations, facial recognition can be used by abusers who prey on the insecurity and fear of their targets.

The Washington Post touched on these concerns, noting that social media posts routinely surface in PimEyes searches despite the company’s assurance to customers that it does not search social media sites.

Applying facial recognition in this manner can amplify the psychological power abusers have over their victims. Those in abuse situations often live in fear of their abusers tracking their physical whereabouts.

Those who believe that blocking an abuser on social media keeps important information – such as their physical location history – out of reach may be mistaken. The consequences of an abuser tracking a victim without the victim’s knowledge can range from unsettling to dangerous, depending on the specifics of each unique set of circumstances.

Concluding Thoughts

Trust is important and fragile. A society without trust would be a miserable place to live. Unfortunately, the unethical use of facial recognition has the potential to break apart the trust that holds society together.

Facial recognition technology itself is not inherently good or bad. However, it is important to understand how the application of this new technology impacts society. Like all tools, technology can be used for good or nefarious purposes. It is the responsibility of individuals and policymakers to ensure technologies are not abused.