
The Hidden Workforce Behind AI
While much of the public discourse around artificial intelligence focuses on glossy product launches and billion-dollar valuations, a quieter and more troubling story is unfolding in the shadows of the tech industry. According to a first-person account published by WIRED on May 11, 2026, Hollywood screenwriters who lost their jobs to the rise of generative AI are now secretly training the very systems that replaced them. The article, written by screenwriter Ruth Fowler, details how she and many of her colleagues have turned to AI gig work as a last resort — a kind of 'new waiting tables' for creative professionals.
Fowler reports that over eight months, she completed 20 contracts with five different AI platforms. The work involves annotating, evaluating, and curating text and video data to improve large language models and video-generation tools. The pay is often below minimum wage once the unpaid time spent reading guidelines and dealing with technical glitches is factored in. Fowler describes the work as 'soul-crushing,' a sentiment echoed by many in the anonymous online forums where these gig workers compare notes.
Life as an AI Annotator
The specifics of these gigs are revealing. Fowler writes that she was hired to rank AI-generated movie summaries, correct grammar in synthetic dialogue, and even write short scripts that could be used as training data for models designed to generate TV episodes. The platforms she worked for include major names in the AI ecosystem, although she does not name them to avoid violating nondisclosure agreements. 'These are not fly-by-night startups; they are some of the biggest companies in the world,' she writes. 'And they are using our desperation to build their products.'

The contracts are short, often just a few hours of work, with payment routed through online platforms like Upwork or specialized marketplaces. Rates vary wildly: some projects pay $15 per hour, others as little as $8. There is no health insurance, no job security, and no guarantee of steady work. Fowler notes that she spends hours each week searching for new gigs, applying to dozens of listings only to be rejected or ghosted. The competition is fierce because thousands of other displaced creative workers are doing the same.
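The gap between the advertised rate and what workers actually earn comes down to unpaid overhead. A minimal sketch makes the arithmetic concrete; the hours and rates below are invented for illustration, not figures from the article:

```python
# Hypothetical illustration of how unpaid overhead (reading guidelines,
# troubleshooting glitches, hunting for the next gig) drags the
# effective rate below the nominal one.

def effective_hourly_rate(nominal_rate, paid_hours, unpaid_hours):
    """Total pay divided by total time actually spent."""
    total_pay = nominal_rate * paid_hours
    total_hours = paid_hours + unpaid_hours
    return total_pay / total_hours

# A $15/hr contract with 4 billable hours plus 2 unpaid hours of
# guideline-reading and gig-hunting:
rate = effective_hourly_rate(15.0, paid_hours=4, unpaid_hours=2)
print(f"${rate:.2f}/hr")  # 15 * 4 / 6 = $10.00/hr
```

At an $8 nominal rate, the same two hours of overhead would pull the effective wage to roughly $5.33, well below the minimum wage in most US states.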
Why This Matters for the AI Ecosystem
This hidden economy is not just a human tragedy; it poses a direct risk to the quality and safety of AI systems. If the people training these models are burned out, underpaid, and disengaged, the data they produce may be unreliable. Studies have shown that low-wage crowdworkers often take shortcuts, provide inconsistent ratings, or even submit random answers just to maximize their earnings per hour. When these flawed judgments are fed into training pipelines, the resulting AI models can exhibit biases, factual errors, and poor language understanding.
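One standard way pipelines catch the inconsistent or random ratings described above is inter-annotator agreement. The sketch below implements Cohen's kappa, which scores agreement between two raters against what chance alone would produce; a kappa near or below zero is a red flag for a disengaged rater. The labels are invented for illustration:

```python
# Sketch of a data-quality check: Cohen's kappa between two annotators.
# Kappa near 1 means strong agreement; near 0, agreement no better than
# chance (e.g. a rater clicking at random to maximize throughput).

from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two equal-length label sequences."""
    assert len(a) == len(b)
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum((ca[k] / n) * (cb[k] / n) for k in ca)     # chance agreement
    return (po - pe) / (1 - pe)

careful = ["good", "bad", "good", "good", "bad", "good"]
engaged = ["good", "bad", "good", "bad", "bad", "good"]
random_ = ["bad", "good", "bad", "good", "good", "bad"]

print(cohens_kappa(careful, engaged))  # well above chance
print(cohens_kappa(careful, random_))  # at or below chance
```

Agreement statistics like this only detect the symptom, though; they say nothing about whether the underlying pay and conditions produce raters who can afford to be careful.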
Moreover, the secrecy around this workforce makes it difficult to audit. The NDAs that bind workers like Fowler prevent outsiders from understanding exactly how training data is collected and labeled. This lack of transparency is especially concerning as governments around the world consider regulations for AI training data. The European Union's AI Act, for example, requires that training datasets be documented and examined for biases, yet enforcement will depend heavily on companies voluntarily disclosing their supply chains.
The Ethical Implications

There is an irony that cannot be ignored: the same creative professionals whose work was undermined by AI are now being paid a pittance to make that AI better. Fowler argues that this is not a temporary transition but a structural shift in the labor market. 'Hollywood used to have a middle class of writers,' she writes. 'Now there are only the few at the top who still have staff jobs, and the rest of us are training our replacements.'
Some lawmakers are beginning to take notice. California gubernatorial candidate Tom Steyer recently proposed a jobs guarantee for workers displaced by AI, as WIRED has also reported. But such proposals remain far from implementation, and the gig economy continues to expand. Meanwhile, the AI industry shows no sign of slowing its appetite for human-labeled data. As long as there is a surplus of unemployed writers, they will remain a cheap resource for the very technology that eliminated their careers.
What to Watch For
In the coming months, expect increased scrutiny of AI training data labor practices. Investigative journalists and labor organizers are beginning to document these conditions, and class-action lawsuits could emerge if workers challenge their classification as independent contractors. For companies building AI models, the risk is real: a public revelation of exploitative training practices could trigger consumer backlash and regulatory penalties.
For tech professionals and developers reading this, the lesson is clear: the human cost of AI is not an abstract problem. Every dataset that powers a chatbot or video generator was likely labeled by someone like Fowler: underpaid, overworked, and anonymous. The next time you deploy a model, consider whether its capabilities came at the expense of the people who taught it.