Women in rural India review hundreds of violent and pornographic videos each day to train AI systems. They classify content flagged by algorithms for global technology companies. Many workers report lasting psychological harm from the job.
Daily Exposure to Disturbing Material
Monsumi Murmu, 26, works from her village in Jharkhand state. She views up to 800 videos and images daily. Her job requires watching scenes of violence and abuse to help algorithms learn.
According to The Guardian, Murmu struggled to sleep in her first months. She saw images in her dreams. Now she describes feeling blank. She says the job has done something to her.
Raina Singh, 24, took a similar role in Uttar Pradesh. Her tasks shifted from text screening to flagging child abuse on adult platforms. Later she categorized pornographic content for hours. She says the work left her disgusted and disconnected from intimacy.
Mental Health Risks Go Unaddressed
Research shows content moderation can cause lasting psychological stress. Workers report intrusive thoughts, anxiety, and sleep problems. A December study found traumatic stress to be the most pronounced risk, and that existing support systems often fail to prevent secondary trauma.
Milagros Miceli, a sociologist leading the Data Workers' Inquiry, says the work belongs in the category of dangerous labor. She compares it to lethal industries.
A Growing Workforce With Few Protections
An estimated 70,000 people in India worked in data annotation by 2021. About 80% come from rural or marginalized backgrounds, and women make up half or more of this workforce. Tech firms locate operations in smaller cities where labor costs are lower.
Job listings rarely explain the actual tasks. Workers sign contracts under vague titles such as "data annotation" and learn what the work really involves only after training begins. Researcher Priyam Vadaliya says this creates an expectation of gratitude that discourages complaints.
The Guardian spoke to eight companies. Only two said they provide psychological support. Others argued the work is not demanding enough to require mental healthcare. Strict non-disclosure agreements prevent workers from discussing their jobs, even with family. Violating these agreements can lead to termination or legal action.
Murmu earns about £260 a month. She fears unemployment more than the distress. To cope, she takes long walks in the forest. She says she does not know if it fixes anything, but she feels a little better.