“I understand how important it is to monitor my own mental health, particularly since my psychological symptoms are primarily only apparent to me,” the document reads. “If I believe I may need any type of healthcare services beyond those provided by [Accenture], or if I am advised by a counselor to do so, I will seek them.”
“Strict adherence to all the requirements in this document is mandatory,” it reads. “Failure to meet the requirements would amount to serious misconduct and for Accenture employees may warrant disciplinary action up to and including termination.”
In other words, if you start suffering mental health problems from working for Accenture, and you don't get medical help for yourself, Accenture can terminate you. Lovely. They can make you sick, and if you don't make yourself well, they'll fire you. This is just their way of shifting the duty of care from the employer to the employee.
The linked prior investigation refers to “toxic torts”: laws that allow people to sue employers and homebuilders if they expose the plaintiff to unhealthy levels of a dangerous chemical.
If an employer makes their employee physically ill, the employer is legally responsible. But, it seems, Accenture believes that mental illness is the employee's legal responsibility.
It's astounding how much leeway these companies have. This kind of thing is unthinkable in other parts of the world; they would be sued in a heartbeat.
PTSD is poorly understood, and there's no research to identify a "safe" level of exposure to the infinity of potentially damaging images that moderators might be exposed to from a YouTube or Facebook input pipeline.
The anecdotal rates of injury are appalling: I've seen numbers like 25-30% of moderators with crippling levels of injury after a few months. It's like being paid to bathe in poison, with the company's assurance that it hopes you'll recover eventually. There's no good evidence that employee assistance programs are effective prevention/treatment, either.
Edit: I just did a bit of homework, and apparently there has been some research on vicarious trauma and PTSD in emergency dispatchers. Rates of diagnosable PTSD were 18-24%, in a job that arguably has less constant, severe exposure and more control over outcomes.
Edit 2: The U.S. Occupational Safety and Health Administration (OSHA) has no applicable standards, only a guideline for exposure to "critical incident stress" for first-responders.
The author of this piece, Casey Newton, also wrote an excellent, in-depth article (with a companion video) last year about similar issues faced by Facebook moderators, which I highly recommend to anyone interested in this subject.
Article: The Trauma Floor - The secret lives of Facebook moderators in America
Video: Inside the traumatic life of a Facebook moderator
If you're into email newsletters, I highly recommend subscribing to his as well: The Interface.
He sends it out in the (North American) evening every Monday-Thursday, and it always includes a lot of links to interesting articles about the tech industry, social media, etc.
Here's yesterday's, as an example: https://www.getrevue.co/profile/caseynewton/issues/sundar-pichai-says-as-little-as-possible-222682
Well, giving each employed moderator more than 9 minutes a day for their wellness checks with counselors would probably be a good start. And not forcing them to sign NDAs about their traumatic experiences working there would certainly be a step up. :|
I personally think that employed content moderators should be treated almost exactly like 911 operators, who usually work on rotating schedules (2-3 days on, 1-3 days off), with lots of counseling and time off available to them, etc.
The author wrote a bit of a follow-up to this report in his newsletter today, with some more info (including that Facebook moderators in Europe were sent the same document) and five suggestions for companies that he thinks would help the situation: How tech companies should address their workers’ PTSD