Read the case study carefully and answer ALL the questions that follow.
Kenyan Moderators Decry Toll of Training of AI Models
By: N. Rowe, August, The Guardian
Employees describe the psychological trauma of reading and viewing graphic content, low pay and abrupt dismissals.
The images pop up in Mophat Okinyi's mind when he's alone, or when he's about to sleep. Okinyi, a former content moderator for OpenAI's ChatGPT in Nairobi, Kenya, is one of four people in that role who have filed a petition to the Kenyan government calling for an investigation into what they describe as exploitative conditions for contractors reviewing the content that powers artificial intelligence programs.
"It has really damaged my mental health," said Okinyi. He said he would view up to text passages a day, many depicting graphic sexual violence. He recalls he started avoiding people after having read texts about rapists and found himself projecting paranoid narratives onto people around him. Then last year, his wife told him he was a changed man, and left. She was pregnant at the time. "I lost my family," he said.
The petition filed by the moderators relates to a contract between OpenAI and Sama, a data annotation services company headquartered in California that employs content moderators around the world. While employed by Sama in Nairobi to review content for OpenAI, the content moderators allege they suffered psychological trauma, low pay and abrupt dismissal.
The moderators in Nairobi working on Sama's OpenAI account were tasked with reviewing texts, and some images, many depicting graphic scenes of violence, self-harm, murder, rape, necrophilia, child abuse, bestiality and incest, the petitioners say.
The moderators say they weren't adequately warned about the brutality of some of the text and images they would be tasked with reviewing, and were offered no or inadequate psychological support. Workers were paid between $ and $ an hour, according to a Sama spokesperson.
"When the contract with OpenAI was terminated eight months early, we felt that we were left without an income, while dealing on the other hand with serious trauma," said petitioner Richard Mathenge. Immediately after the contract ended, petitioner Alex Kairu was offered a new role by Sama, labelling images of cars, but his mental health was deteriorating. He wishes someone had followed up to ask: "What are you dealing with? What are you going through?"
OpenAI declined to comment for this story.
A Sama management spokesperson said moderators had access to licensed mental health therapists and received medical benefits to reimburse psychiatrists.
In regard to the allegations of abrupt dismissal, the Sama management spokesperson said the company gave full notice to employees that it was pulling out of the ChatGPT project, and that employees were given the opportunity to participate in another project. "We are in agreement with those who call for fair and just employment, as it aligns with our mission that providing meaningful, dignified, living-wage work is the best way to permanently lift people out of poverty, and believe that we would already be compliant with any legislation or requirements that may be enacted in this space," the Sama management spokesperson said.
The human labour powering AI's boom
Since ChatGPT arrived on the scene at the end of last year, the potential for generative AI to leave whole industries obsolete has petrified professionals. That fear of automated supply chains and sentient machines has overshadowed concerns in another arena: the human labour powering AI's boom.
Bots like ChatGPT are examples of large language models, a type of AI algorithm that teaches computers to learn by example. To teach Bard, Bing or ChatGPT to recognise prompts that would generate harmful materials, algorithms must be fed examples of hate speech, violence and sexual abuse. The work of feeding the algorithms examples is a growing business, and the data collection and labelling industry is expected to grow to over $ billion, according to Global Data, a data analytics and consultancy firm.
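To make the paragraph above concrete: the moderators' output is essentially (text, label) pairs, which are then used to train a classifier. The sketch below is purely illustrative and is not OpenAI's actual pipeline or data; it shows the general shape of human-labelled training examples and a trivial keyword model learned from them, with deliberately benign placeholder texts.

```python
# Illustrative sketch only -- NOT OpenAI's pipeline. Human annotators
# produce (text, label) pairs; a model is then trained on those pairs.

# Placeholder labels of the kind human moderators would assign.
labeled_examples = [
    ("a friendly greeting", "safe"),
    ("a recipe for soup", "safe"),
    ("a threatening message", "harmful"),
    ("an abusive insult", "harmful"),
]

def train_keyword_model(examples):
    """Learn the set of words that appear only in 'harmful' examples."""
    harmful_words, safe_words = set(), set()
    for text, label in examples:
        words = set(text.split())
        (harmful_words if label == "harmful" else safe_words).update(words)
    # Words seen in harmful texts but never in safe ones.
    return harmful_words - safe_words

def classify(model, text):
    """Flag any text that contains a learned 'harmful' word."""
    return "harmful" if model & set(text.split()) else "safe"

model = train_keyword_model(labeled_examples)
print(classify(model, "a threatening note"))  # flags on "threatening"
```

Real systems use far larger labelled datasets and statistical models rather than keyword sets, but the dependence on human-reviewed examples, which is the subject of the petition, is the same.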
Much of that labelling work is performed thousands of miles from Silicon Valley, in East Africa, India, the Philippines, and even by refugees living in Kenya's Dadaab and Lebanon's Shatila camps, places with a large pool of multilingual workers who are willing to do the work for a fraction of the cost, said Srravya Chandhiramowuli, a researcher of data annotation at the University of London.
Nairobi in recent years has become a global hotspot for such work. An ongoing economic crisis, matched with Nairobi's high rate of English speakers and mix of international workers from across Africa, makes it a hub for cheap, multilingual and educated workers.
The economic conditions allowed Sama to recruit young, educated Kenyans, desperate for work, said Ma