Featured Article: Face-Recognition & Personal Data Concerns

Two Harvard students have developed a program that, when used with Meta’s smart glasses, can identify people without their knowledge, highlighting a potentially serious privacy risk.

Why? 

In the report on their research, the two Harvard students, AnhPhu Nguyen and Caine Ardayfio, said their goal was “to demonstrate the current capabilities of smart glasses, face search engines, LLMs, and public databases, raising awareness that extracting someone’s home address and other personal details from just their face on the street is possible today”.

Turning Glasses Into Facial Recognition Tools 

As part of a project they called I-XRAY, the students developed a program that demonstrates the potential privacy risks of using AI with smart glasses like Meta’s Ray-Ban models.  

Using their experimental system, the students report that they can stream live video from Meta’s Ray-Ban smart glasses (which are equipped with a camera) to a computer, where AI is used to spot when the glasses are looking at a face. The video is livestreamed from the glasses to Instagram and monitored on the computer. Detected faces are then passed to facial recognition tools, notably the PimEyes face search engine, to positively identify the strangers to whom the faces in the video belong.
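As a purely illustrative sketch (and not the researchers’ actual I-XRAY code), the face-spotting step of such a pipeline could look something like the following, using OpenCV’s bundled Haar-cascade detector on frames read from a video stream; the stream address and file name are placeholders.

```python
# Illustrative sketch only - not the researchers' I-XRAY code.
# Reads frames from a video stream and saves a crop of any detected face,
# using OpenCV's bundled Haar-cascade face detector.
import cv2

STREAM_URL = "https://example.com/live/stream.m3u8"  # placeholder livestream address

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(STREAM_URL)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Crop the detected face; a real pipeline would hand this image
        # on to a reverse face search step.
        cv2.imwrite("detected_face.jpg", frame[y:y + h, x:x + w])
cap.release()
```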

Once the program detects a person’s face, it can pull up images and publicly available personal information, including names, addresses, and more, within minutes.

The experiment shows how it’s possible to quickly access the personal details of random individuals simply by walking past them and capturing their faces on camera, highlighting how invasive and dangerous this technology could become if misused.

What Is PimEyes and How Was It Used?

PimEyes is a facial recognition search engine that allows users to upload an image of a face and find other images of that person across the web. It scans public databases, websites, and social media platforms to match facial features, making it possible to track someone’s online presence. In their experiment, the Harvard students used PimEyes to identify people in real time by integrating it with Meta’s Ray-Ban smart glasses: as the glasses recorded video, PimEyes was used to find additional images and public information about the individuals whose faces were captured.
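PimEyes’ matching technology is proprietary, but the basic idea behind any face search engine (turning faces into numerical embeddings and comparing them) can be illustrated with the open-source face_recognition library. The snippet below is a conceptual sketch with made-up file names, not an interface to PimEyes.

```python
# Conceptual sketch of face matching - not PimEyes' actual system or API.
# Compares a query face against one "gallery" photo by computing facial
# embeddings and measuring the distance between them.
import face_recognition

# Hypothetical file names, for illustration only.
query_image = face_recognition.load_image_file("face_from_stream.jpg")
gallery_image = face_recognition.load_image_file("photo_found_online.jpg")

query_encodings = face_recognition.face_encodings(query_image)
gallery_encodings = face_recognition.face_encodings(gallery_image)

if query_encodings and gallery_encodings:
    # Lower distance means more similar faces; 0.6 is the library's usual threshold.
    distance = face_recognition.face_distance(
        [gallery_encodings[0]], query_encodings[0]
    )[0]
    print(f"Distance: {distance:.3f} -> likely the same person: {distance < 0.6}")
```

A real face search engine applies the same kind of comparison at web scale, against millions of crawled images rather than a single photo.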

Leveraged Today’s LLMs 

The students said that what makes their I-XRAY system unique is that it operates entirely automatically, thanks to recent progress in AI large language models (LLMs). The system leverages the ability of LLMs to understand, process, and compile huge amounts of information from diverse sources, inferring relationships between online sources (such as linking a name from one article to another) and logically parsing a person’s identity and personal details from text. The students said it is this “synergy between LLMs and reverse face search” that “allows for fully automatic and comprehensive data extraction that was previously not possible with traditional methods alone”.
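The students’ actual prompts are not included in their report, but the role the LLM plays (reading several public snippets and inferring that they describe the same person) can be sketched generically. The example below assumes the OpenAI Python client; the model name, snippets, and prompt are illustrative placeholders rather than anything from the I-XRAY system.

```python
# Generic sketch of the LLM cross-referencing step - not the students' code or prompts.
# An LLM is asked whether a few (fictional) public text snippets refer to the
# same person and, if so, what can be inferred from them.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Fictional snippets standing in for text found via a reverse face search.
snippets = [
    "Alex Doe presented the winning project at the 2023 regional robotics fair.",
    "The fair's write-up pictured A. Doe, a student at Example University.",
    "Example University's newsletter lists Alex Doe among its student ambassadors.",
]

prompt = (
    "Do the following snippets refer to the same person? "
    "If so, briefly summarise what can reasonably be inferred about them.\n\n"
    + "\n".join(f"- {s}" for s in snippets)
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```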

Used ‘FastPeopleSearch’ To Get Other Personal Details From Names 

Worryingly, the students reported that once their system has an LLM-extracted name, it can use a ‘FastPeopleSearch’ lookup to identify the person’s home address, phone number, and relatives. FastPeopleSearch is a free online tool that allows users to find personal information about individuals, such as addresses, phone numbers, and even family members or associates. It aggregates publicly available data from various sources to offer these details.

To use it, you go to the FastPeopleSearch.com website (the site uses a security service that can block access, so a VPN may be needed), enter a person’s name, phone number, or address, and the tool searches its database and returns matching results. It’s often used for background checks, though it raises privacy concerns due to the accessibility of personal information.

Doesn’t Have To Be Smart Glasses 

The students have highlighted that although Meta’s smart glasses were used in their experiments, the same results could be achieved with their system using just a simple phone camera. This means that anyone could use this technology to identify, track, or access personal information about strangers in real time, without their knowledge or consent. This raises serious privacy and security concerns, especially around stalking, harassment, and ‘doxxing’.

Are We Ready For This? 

As one of the student researchers, AnhPhu Nguyen, asked in an ‘X’ post about the findings, “Are we ready for a world where our data is exposed at a glance?” Others commented “Fascinating but Dystopian” and “looks like some govt entity will try and get hold of this”.

How Can You Protect Yourself? 

As well as demonstrating how easily facial recognition systems can be built using publicly available technologies and data, the Harvard researchers have provided steps to help individuals protect their privacy, along with helpful links for doing so. They explained how people can remove their data from major facial recognition and people search engines like PimEyes and FastPeopleSearch. Both PimEyes and Facecheck.id offer free opt-out services, while major people search engines like FastPeopleSearch, CheckThem, and Instant Checkmate allow users to remove their information. Also, given the potential financial havoc if a person’s US social security number (SSN) is leaked or included in a data dump, the researchers recommend freezing credit and using two-factor authentication to prevent identity theft.

What Has Meta Said? 

Meta has reportedly said that the students’ experiment appears to involve them “simply using publicly-available facial recognition software on a computer that would work with photos taken on any camera, phone or recording device”, and has highlighted that its smart glasses are designed to comply with privacy laws, for example by including a visible light to indicate when they are recording.

What Does This Mean For Your Business? 

This could be an important experiment in that it highlights just how vulnerable personal data can be in the age of advanced AI and facial recognition technology. While Nguyen and Ardayfio’s research shows the remarkable capabilities of current technologies, it also serves as a wake-up call to the broader public. The ease with which private details, such as names and addresses, can be extracted from something as simple as a passing glance on the street is a stark reminder of the privacy risks we face. The fact that these tools can be used not just with specialised smart glasses, but also with everyday devices like smartphones, makes the issue even more pressing. 

For businesses, this raises important questions about the ethical and legal responsibilities of companies developing and deploying similar technologies. As facial recognition becomes more ubiquitous, organisations must navigate the fine line between innovation and privacy, ensuring that they not only comply with existing laws but also proactively address the potential misuse of their products. Meta’s response that its smart glasses are compliant with privacy regulations is likely to do very little to quell the growing concerns about how easily such technologies can be repurposed for invasive uses.

Looking ahead, as the use of AI and facial recognition continues to expand, so too will the need for stricter regulations and public awareness. Individuals, businesses, and governments alike must engage in a broader conversation about the balance between technological advancement and personal privacy to ensure that such powerful tools are not misused.
