
A minute of recorded phone call costs $0.50; humans are "selling" their lives to AI.

2026/03/23 17:30

Author: Su Yang , Tencent Technology

Edited by Xu Qingyang


AI is facing a data famine, and human life has become a business.

From Cape Town in South Africa to Chicago and Los Angeles in the United States, and Ranchi in India, tens of thousands of people around the world are selling snippets of their daily lives to tech companies, including videos of washing dishes, walking footsteps, recordings of phone calls, and cooking actions.

These ordinary, everyday chores are becoming "industrial raw materials" that can be clearly priced and sold piecemeal, serving as "teaching materials" for training AI: a two-hour video of washing dishes can be exchanged for $80, and a phone call recording is worth $0.50 per minute.

Silicon Valley's thirst for real-person data has spurred a booming data market industry.

But the money wasn't free.

Those gig trainers who sign contracts often unknowingly relinquish irrevocable authorization: their voices may be used forever in AI customer service, their faces may appear in facial recognition databases halfway around the world, and the systems they train may one day cost them their jobs.

This is a story about survival and calculation. On one hand, there's the pressure of making a living; on the other, there are unseen risks to the future. And in this data "gold rush," who are the real winners?

01. Humans break down life into "retail" segments.

Jacobs Love, a 27-year-old South African, films his footsteps and the scenery along his daily outings to feed the seagulls. He then uploads the videos to Kled AI, an app that pays users for data to train AI models. A short "city navigation" clip of about ten seconds can earn him $14.

Sahir Tiga, a 22-year-old Indian student, has an even simpler way of making money: keeping his phone's microphone constantly on. Using an app called Silencio, he allows others to access his phone's microphone to capture noise from restaurants and traffic at intersections. To earn even more, he even goes to hotel lobbies to record ambient sounds that haven't yet been captured. He can earn over $100 a month, enough to cover his expenses.

Lameria Hill, an 18-year-old welding apprentice from Chicago, chose to sell something more private: recordings of his phone calls with friends and family, to a platform called Neon Mobile, at $0.50 per minute.

The content uploaded by AI gig trainers is incredibly diverse; they are at the forefront of this global data gold rush.

And the gig work extends far beyond these examples.

In Los Angeles, from Santa Monica to Los Feliz, hundreds of people are wearing cameras strapped to their heads and hands while doing housework. They're making coffee, scrubbing toilets, watering plants, washing dishes—everything is being recorded.

Salvador Alciga received a head-mounted phone stand from Instawork, went home, and filmed himself washing dishes and wiping the stove in front of the camera, narrating what he was doing in Spanish or English. He earned $80 from two hours of video recording.

“I have to do housework anyway,” he said. “Now I can even earn money by doing housework.”

02. It all stems from the data famine of AI.

The reason these seemingly low-tech snippets of daily life are valuable is because AI is on the verge of "starving."

Large language models like ChatGPT and Gemini require massive amounts of learning material to keep improving. But the websites behind the most commonly used training corpora, such as C4 and RefinedWeb, are increasingly restricting AI companies from using their content.

The nonprofit research firm Epoch AI predicts that by 2026, AI companies will run out of fresh text resources available for training. While some labs are beginning to experiment with letting AI generate its own data to "learn on its own," this approach could lead to a decline in model quality and ultimately, model collapse.

Against this backdrop, data marketplace platforms such as Kled AI and Silencio have suddenly become popular.

Bok Klein Tisselink, Professor of Economics at King's College London, points out that gig AI training is an emerging job category with significant growth potential. AI companies can also effectively avoid copyright disputes by paying for user-authorized data. Simply scraping content from the internet could easily lead to lawsuits.

AI researcher Venjamin Veselovski also stated, "Currently, human data remains the best source for AI to break out of its own patterns and learn new things."

Simply put, no matter how fast machines learn, they cannot truly advance without real human data. This is especially evident in the physical world.

Anders Baker, VP of AI Robotics at Universal Robots, pointed out that most of the training data collected in AI labs is unsuitable for real-world deployment, and robots simply cannot learn tasks that require "hands-on" interaction based solely on visual feedback. For robots to truly master skills like opening doors, washing dishes, and folding clothes, they must rely on repeated demonstrations by real humans in real-world environments.

Jason Saltzman, head of insights at CB Insights, summarized: "The model can't yet judge what is right and wrong on its own, nor can it figure out what the real situation is. Humans have to teach it all."

For this reason, some countries have set up specialized "arm farms"—in fixed facilities, a large number of people record first-person videos of tasks such as opening doors and folding clothes, providing real-world operational demonstrations for AI.

Alciga recorded himself putting clothes into the washing machine as part of his gig.

Data shows that human data collection companies like Sunain have over 1,400 contributors in Los Angeles, stretching from Culver City in the west to Pasadena in the east. Sunain co-founder Shahbaz Magsi says that Los Angeles' housing types, lifestyles, and population diversity are "unparalleled."

CB Insights predicts that the global data collection and labeling market could reach $17 billion by 2030. Goldman Sachs, on the other hand, predicts that the humanoid robot market could reach $38 billion by 2035.

It was precisely because of these promising prospects that capital began to pour in.

San Francisco-based Encord saw its physical AI business revenue grow tenfold last year and secured $60 million in funding this February. Meta-backed Scale AI has already collected 100,000 hours of robotic video. Its competitor, Micro1, employs 1,000 people in 60 countries specifically to record videos of household chores.

03. There is no privacy, and no going back.

For those who signed up to feed data to AI, the money didn't come free.

Hill has mixed feelings about his experience. He earned $300 by selling 11 hours of calls on Neon Mobile, but the app frequently crashed, and withdrawals were often delayed. "Neon has always been rather suspicious to me," he said, "but I've always used it to earn some easy extra cash."

Soon, trouble began.

In September 2025, just weeks after Neon Mobile's launch, TechCrunch reported a security vulnerability that allowed anyone online to directly access users' phone numbers, call recordings, and text messages. Hill said Neon never notified him of this. Now he's worried about how his voice might be used.

That's not even the worst part.

New York actor Adam Coy sold his likeness to AI video editor Captions (now Mirage) for $1,000 in 2024. He included many protective clauses in the contract: his image could not be used for political purposes, nor could it be used to sell alcohol, tobacco, or pornography, and the license would expire after one year.

But soon after, his friends started forwarding him videos that were going viral online. In the videos, his face and his voice were promoting an unproven health product for pregnant women.

“Explaining this to others makes me feel incredibly embarrassed,” Coy said. “Those comments read strangely, because they were commenting on my appearance, but that’s not the real me at all.”

What upset Coy even more was his initial reasoning when he decided to sell his likeness: since most models scrape data and images from the internet anyway, he figured he might as well make some money himself. Looking back now, it seems like a joke. He has never touched such platforms since.

Enrico Bonadio, a law professor at City St George's, University of London, points out that many platforms' agreements allow for "doing almost anything with that material permanently, without having to pay again." Contributors, on the other hand, have "virtually no practical way to withdraw their consent or renegotiate."

More troubling still, even when a platform claims to "de-identify" data, biometric features such as voice and face are inherently difficult to truly anonymize.

04. The trap of full licensing contracts

You might think you've just "rented" a few recordings, but the fine print in the contract could hide a bigger trap.

When users share data on Neon Mobile or Kled AI, they grant the platform a "full license": global, exclusive, irrevocable, transferable, and royalty-free. In other words, the platform can permanently sell, use, publicly display, and store the content, and even create "derivative works" from it.

Kled AI founder Avi Patel defended their agreement, stating that it only allows data to be used for AI training and research. The company vets buyers, avoiding industries with "questionable intentions" and organizations that might misuse the data. He said, "The entire business relies on user trust."

But how reliable is such a guarantee? Professor Bonadio points out that the contract allows the platform and its customers to "do almost anything."

What's more problematic is that once your data is sold, you have no idea where it goes. Jennifer King, a data privacy researcher at Stanford University's Human-Centered AI Institute, says these platforms don't clearly explain how or where the data will be used. Consumers "face the risk of their data being reused in ways they don't like, don't understand, or didn't anticipate, with virtually no recourse."

Scholar Laura Kittel's experience is telling. She was looking for work with nonprofits and government agencies when a friend recommended Mercor. When the contract arrived, she read it carefully: the terms required her to grant royalty-free rights to her existing and future academic papers, as well as any intellectual property that might benefit an unspecified client.

“I think this is a bit too much,” she said.

She wanted to amend the contract, but an AI assistant named "Melvin" replied by email: the terms could not be changed, and if she didn't accept them, she was free to leave.

Mercor later explained that the contract applied only to contributors' own creations that were actually used during a project; work that went unused was not covered. But for Kittel, the unpleasant feeling lingered.

05. Who is the real winner?

Mark Graham, professor of internet geography at Oxford University and co-author of "Feeding the Machine," acknowledges that the money can be useful in the short term for people in developing countries, but warns: "Structurally, this kind of work is unstable, lacks prospects, and is actually a dead end."

He said the AI market is driven by "a race to the bottom on wages" and "temporary demand for human data."

Once demand changes, "workers receive no protection, learn no transferable skills, and have no safety guarantees." The ultimate winners are those "Northern Hemisphere platforms that capture all lasting value."

In other words, every penny that gig workers earn today is helping AI become stronger and smarter. And when AI becomes powerful enough, those who trained it may be the very first to be replaced.

As DoorDash Tasks General Manager Ethan Beatty said, "These are real-world problems we've been solving for over a decade, and we've realized that the same capabilities that have helped us can help other businesses as well."

What DoorDash and similar companies are doing is turning delivery drivers' work experience into data assets and selling it to any company that needs to train AI.

Uber is doing the same thing.

Last October, Uber added a digital task category to its driver app, allowing drivers to upload restaurant menus and record multilingual audio samples. Its Uber AI Solutions division has expanded to 30 countries, offering annotation, translation, and model training services.

Both companies are following the path pioneered by Scale AI: using distributed remote workers to create new datasets and validate AI outputs. The only difference is that Uber and DoorDash have millions of workers at their disposal who can be deployed directly to any corner of the real world.

Alciga's friend once confronted him: "You're the problem." What his friend meant was: you're teaching AI to do things only humans can do, aren't you making things worse?

Alciga replied that new technologies always bring fear and change, but they also create new types of work, such as his latest gig. "People still need people," he said.

The question is, when AI no longer needs "humans," will that need still exist?
