
Discovery that Meta staff watch users’ Ray-Ban recordings raises privacy concerns

2026/03/05 00:00

The death of Google Glass in 2015 was a victory for social etiquette. We dubbed the wearers ‘Glassholes’, barred them from sensitive places, and collectively decided that a cyborg-eye aesthetic was a bridge too far. But a decade later, the surveillance state didn’t go away; it just learned how to dress better and rebranded as Ray-Ban.

In 2025 alone, Meta sold 7 million Ray-Ban smart glasses, tripling the combined sales of 2023 and 2024. By blending high-tech optics into the world’s most iconic frames, Meta solved the social friction of wearable tech. In doing so, it created a massive, decentralised surveillance network in which you are the primary subject, whether you like it or not.

The problem isn’t necessarily that your neighbour is recording their point of view of the sunset. The problem is what happens to that footage after it leaves the frame.

Recent investigative reports have pulled back the curtain on a global pipeline of data annotation that should stop every citizen cold. Meta’s AI doesn’t just learn in a vacuum; it is trained by human contractors.

Currently, workers in Nairobi, Kenya, are tasked with reviewing raw footage from the smart glasses to label objects and refine the algorithm.

These contractors aren’t just seeing metadata or anonymised blobs. They are watching the actual videos. Because the glasses are designed to be always ready, they capture the most intimate corners of human existence: people undressing, medical documents on a doctor’s desk, bank cards at a checkout, and even private moments in bathrooms or bedrooms.

Meta’s Ray-Ban smart glasses

While Meta claims to use automated blurring to protect privacy, whistleblowers from within these contracting firms report that the technology fails constantly. Identifiable faces and sensitive information are routinely exposed to low-wage workers halfway across the world.

The most chilling aspect of this business model is the consent gap, which takes on a predatory tone in the Global South. If you buy the glasses, you theoretically agree to a dense, jargon-filled privacy policy.

But the thousands of people you walk past in a day, the person in the gym locker room, the patient in a Lagos clinic, and the stranger on a Danfo bus have made no such choice.

In Africa, where data protection laws are often more aspirational than enforceable, this gap becomes a canyon. While the Nigeria Data Protection Act (NDPA) 2023 and Kenya’s Data Protection Act exist on paper, their enforcement mechanisms struggle to keep pace with rapid AI deployment.

In Nigeria, the National Data Protection Commission (NDPC) has shown teeth, fining Meta $220 million in 2024 for separate privacy violations, but for the average citizen, redress remains a distant dream.

An African lady wearing Meta’s Ray-Ban smart glasses. Photo credit: The Verge

Unlike the EU, where the GDPR creates a privacy-by-design requirement that can force tech giants to cripple their own features to stay legal, African regulators often face a lopsided battle. We are frequently treated as the testing ground or the annotation hub for technologies built elsewhere.

The irony is bitter: while Kenyan workers earn a few dollars an hour to watch footage of Europeans and Americans and, indeed, people all over the world in their most vulnerable moments, they, and their fellow Africans, are simultaneously being recorded by the same devices with even fewer safeguards.

This isn’t just an ethical nightmare; it is a global legal crisis. In early 2026, the European Union began sharpening its knives. Members of the European Parliament (MEPs) have submitted formal inquiries to the Commission, questioning how this architecture can coexist with European law.

However, in Nigeria and Kenya, the lack of adequacy status with the EU means that our data is often exported and processed with even less oversight.

Meta CEO rocking Meta’s Ray-Ban smart glasses

If Brussels moves toward enforcement, Meta might disable features in Europe, but it is unlikely to do so in markets where the regulatory cost of doing business is lower.

For Nigerians, this means we risk becoming a permanent grey zone of surveillance, a place where the algorithm is fed with our images, our intimate moments, and our private documents, all without a single signature.


As the installed base of these surveillance nodes grows, we are rapidly approaching a reality where private space is a relic of the pre-AI era.

Meta didn’t just make smart glasses; they made the world’s most efficient, unconsented casting call for a movie that never ends. The question for us in Africa is: who is watching our footage right now, and who, if anyone, is coming to save us?

The post Discovery that Meta staff watch users’ Ray-Ban recordings raises privacy concerns first appeared on Technext.
