
Thousands of people around the world are selling their identities to train AI, but at what cost?

深潮TechFlow
4 hours ago
When the data thirst of AI companies combines with global economic disparities, it is creating an unequal transaction.

Author: The Guardian

Translator: TechFlow

TechFlow Introduction: This investigative report reveals a rapidly growing gray industry: thousands of people globally are earning money for AI training by selling their voices, faces, call records, and daily videos.

This is not a vague discussion about privacy controversies, but an investigation with real people, real amounts, and real consequences—an actor who sold his face later saw "himself" promoting an unknown medical product on Instagram, with comments evaluating his "appearance."


The full text is as follows:

One morning last year, Jacobus Louw, who lives in Cape Town, South Africa, went out for his usual walk, feeding seagulls along the way. This time, though, he recorded a few videos, capturing his footsteps and the views as he walked along the sidewalk. The footage earned him $14—roughly ten times the country's hourly minimum wage, and about half a week's food budget for the 27-year-old.

This was a "city navigation" task Louw completed on Kled AI, an app that pays users to upload photos, videos, and other data for training AI models. In just a few weeks, Louw earned $50 by uploading photos and videos from his daily life.

Thousands of miles away, in Ranchi, India, 22-year-old student Sahil Tigga regularly earns money from Silencio, an app that crowdsources audio data for AI training by accessing his phone's microphone to capture ambient noise from restaurants or busy intersections. He also uploads recordings of his own voice. Sahil often seeks out distinctive settings not yet captured on the Silencio map, such as hotel lobbies. He makes over $100 a month this way, enough to cover all his dining expenses.

In Chicago, 18-year-old welding apprentice Ramelio Hill has sold his private phone chat records with friends and family to Neon Mobile—an interactive AI training platform that pays $0.50 per minute—earning hundreds of dollars. For Hill, the math was simple: he felt that tech companies already had access to a vast amount of his private data, so he might as well get a piece of the pie himself.

These "AI training gigs"—uploading surrounding scenes, personal photos, videos, and audio—stand at the forefront of a global gold rush for new data. As Silicon Valley's thirst for high-quality human data outstrips what can be scraped from the open internet, a thriving data-marketplace sector has emerged to bridge the gap. From Cape Town to Chicago, thousands of people are licensing their biometric identities and private data, piece by piece, to train the next generation of AI.

But this new gig economy comes at a cost. Behind the few dollars exchanged, these trainers are fueling an industry that may ultimately render their skills obsolete, while exposing themselves to future risks of deepfakes, identity theft, and digital exploitation—of which they are just beginning to become aware.

Keeping the AI Gears Turning

AI language models like ChatGPT and Gemini require vast amounts of learning material to keep improving, but they are facing a data drought. Websites hosting roughly a quarter of the highest-quality data in the most commonly used training corpora—C4, RefinedWeb, and Dolma—now restrict generative AI companies from using their content for training. Researchers estimate that AI companies could exhaust the supply of fresh, high-quality public text as early as 2026. Some labs have begun training on synthetic data generated by AI itself, but this recursive process can degrade models into producing erroneous "garbage"—a failure mode known as model collapse.


This is where applications like Kled AI and Silencio step in. In these data marketplaces, millions of people feed and train AI by selling their identity data. Beyond Kled AI, Silencio, and Neon Mobile, AI trainers have many options: Luel AI, backed by the well-known incubator Y Combinator, buys multilingual dialogue material at about $0.15 per minute; ElevenLabs lets you digitally clone your voice and offer it to others at a base rate of $0.02 per minute.
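As a rough sketch, the per-minute rates quoted above can be converted into gross hourly payouts. The rates come from the article; the hourly figures are simple arithmetic, not numbers reported by the platforms themselves:

```python
# Per-minute payout rates as quoted in the article (USD).
# The hourly projections below are plain arithmetic for illustration,
# not figures published by the platforms.
RATES_USD_PER_MINUTE = {
    "Neon Mobile (call audio)": 0.50,
    "Luel AI (multilingual dialogue)": 0.15,
    "ElevenLabs (voice clone, base rate)": 0.02,
}

def hourly_payout(rate_per_minute: float) -> float:
    """Gross payout for one hour of recorded data at the quoted rate."""
    return rate_per_minute * 60

for platform, rate in RATES_USD_PER_MINUTE.items():
    print(f"{platform}: ${hourly_payout(rate):.2f}/hour")
```

Even at the top quoted rate, an hour of recorded data grosses only $30, which helps explain why these gigs supplement rather than replace local income.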

Bouke Klein Teeselink, an economics professor at King's College London, states that AI training gigs are an emerging job category that will see significant growth.

AI companies understand that paying people for data authorization helps avoid copyright disputes that could arise from completely relying on web scraping content, Teeselink explains. AI researcher Veniamin Veselovsky adds that these companies also need high-quality data to model new, improved behaviors for their systems. "Currently, human data is the gold standard for sampling from outside the model distributions," Veselovsky elaborates.

The humans driving these machines—especially those in developing countries—often need the money and have little choice. For many AI training gig workers, the work is a pragmatic response to economic disparity. In countries with high unemployment and depreciating local currencies, earning dollars is often more stable and lucrative than local jobs. Some who struggle to find entry-level work turn to AI training to make a living. Even in wealthier countries, rising living costs have made selling one's own data a logical financial choice.

Cape Town AI trainer Louw is acutely aware of the privacy costs. Although his income is unstable and insufficient to cover all his monthly expenses, he is willing to accept these conditions to earn money. Suffering from a neurological disease for years, he couldn't find work, but the money he made in the AI data market (including Kled AI) allowed him to save up $500 to enroll in a spa training course to become a massage therapist.

"For a South African, receiving dollars is worth more than others imagine," Louw says.

Mark Graham, an Oxford University professor of internet geography and co-author of the book "Feeding the Machine," acknowledges that for individuals in developing countries this money can have real short-term significance, but he warns, "Structurally, this work is unstable, offers no upward mobility, and is essentially a dead end."

Graham adds that the AI data market relies on "the downward pressure on wages" and "the temporary demand for human data." Once this demand shifts, "workers will have no security, no transferable skills, and no safety net."

Graham states that the only winners are "platforms in the Global North, which seize all the enduring value."


Full Authorization

Chicago-based AI trainer Hill feels conflicted about selling his private phone calls to Neon Mobile. About 11 hours of call content earned him $200, but he says the app frequently goes offline and delays payments. "Neon has always seemed suspicious to me, but I kept using it just to earn some extra pocket money to pay bills," Hill says.

Now he's starting to reconsider whether this money was truly so easy. Last September, just weeks after Neon Mobile launched, it went offline after TechCrunch discovered a security flaw that allowed anyone to access users' phone numbers, call recordings, and text histories. Hill says Neon Mobile never notified him of this, and now he is worried that his voice may be misused online.

Jennifer King, a data privacy researcher at Stanford University's Human-Centered AI Institute, is concerned that the AI data market does not clarify how and where user data will be used. She adds that without knowledge of their rights or the ability to negotiate, "consumers face the risk of their data being reused in ways they dislike, do not understand, or did not foresee, with almost no redress at that point."

When AI trainers share data on Neon Mobile and Kled AI, they grant a full authorization (global, exclusive, irrevocable, transferable, and royalty-free) allowing the platform to sell, use, publicly display, and store their likeness, and even create derivative works based on it.

Avi Patel, founder of Kled AI, states that his company limits data usage to AI training and research purposes. "The entire business model relies on user trust. If contributors feel their data might be misused, the platform cannot operate," he says, mentioning that the company reviews buyers before selling datasets to avoid working with "questionable intent" organizations, such as those in the adult industry and "government entities" they believe might use the data in ways that violate that trust.

Neon Mobile did not respond to a request for comment.

Enrico Bonadio, a law professor at City, University of London, points out that these agreements allow platforms and their clients "to do almost anything with the material, perpetually, without extra payment, and contributors have no real way to withdraw consent or renegotiate."

More concerning are the risks of trainers' data being used to create deepfakes and enable identity impersonation. Although data marketplaces claim to strip personally identifying information (such as names and locations) before selling, biometric data is inherently difficult to anonymize in any meaningful way, Bonadio adds.

Regret of the Sellers

Even if AI trainers could negotiate more detailed protective clauses on data usage, they may still come to regret the sale. In 2024, New York actor Adam Coy sold his likeness for $1,000 to Captions, an AI video-editing company since renamed Mirage. His one-year agreement stipulated that his likeness would not be used for any political purpose, nor to promote alcohol, tobacco, or adult content.

Captions did not respond to a request for comment.

Soon after, Coy's friends began sharing videos they found online featuring his face and voice, which garnered millions of views. In one Instagram video, Coy's AI doppelgänger claimed to be a "vagina doctor," promoting unverified medical supplements for pregnant and postpartum women.

"Explaining this to others makes me feel embarrassed," Coy says.

"The comments section is strange because people are commenting on my appearance, but that is not me at all," Coy adds. "When I made the decision (to sell my likeness), my thought was that most models are going to have their data and likeness scraped from the internet anyway, so why not get paid."

Coy says he hasn't taken any more AI data gigs since then. He says he would consider doing so again only if a company offered significant compensation.
