The Trauma Floor: The Secret Lives of Facebook Moderators in America

Original author: Casey Newton


Content warning: this article describes racism, violence, and mental illness.

Chloe's panic attacks began after she watched a man die.

She had spent the past three and a half weeks in training, trying to harden herself against the daily barrage of disturbing posts: hate speech, violent attacks, graphic pornography. In a few days she will become a full-time FB content moderator, a role her employer, the professional services firm Cognizant, vaguely calls a "process executive."

For this stage of training, Chloe must moderate FB posts in front of the other trainees. When her turn comes, she walks to the front of the room, where a monitor displays a video that has been posted to the world's largest social network. None of the trainees have seen it before, Chloe included. She presses play.

The video depicts a man being murdered. Someone stabs him dozens of times while he screams and begs for his life. Chloe's job is to tell the room whether the video should be removed. She knows that section 13 of FB's community standards prohibits videos that depict the murder of one or more people. As Chloe explains this to the class, she hears her own voice trembling.

Returning to her seat, Chloe feels an overwhelming urge to sob. Another trainee steps up to review the next post, but Chloe cannot concentrate. She leaves the room and begins crying so hard that she can barely breathe.

No one tries to comfort her. This is the job she was hired to do. And for the 1,000 people like Chloe who moderate FB content at the Phoenix office, and the 15,000 moderators around the world, today is just another day at work.



Over the past three months, I interviewed a dozen current and former Cognizant employees in Phoenix. All had signed non-disclosure agreements in which they pledged not to discuss their work for FB - or even to acknowledge that FB is a Cognizant client. The secrecy is meant to protect employees from users who might be angry over moderation decisions and seek out a known FB contractor to settle scores. The NDAs are also meant to prevent contractor employees from sharing FB users' personal information, a pressing concern at a time when the company's privacy practices are under intense scrutiny.

But the moderators told me that this secrecy also insulates Cognizant and FB from criticism of working conditions at these companies. They are forced to keep the emotional toll of the job to themselves, even from loved ones, which deepens their isolation and anxiety. To protect them from retaliation by both their employers and FB users, I agreed to use pseudonyms for everyone named here, with the exception of Cognizant's vice president of business process services, Bob Duncan, and FB's director of global partner management, Mark Davidson.

Collectively, employees describe a workplace perpetually on the brink of chaos. It is an environment where workers cope by trading dark jokes about suicide and smoking weed on breaks to numb their emotions. An employee can be fired for making just a few mistakes a week, and those who remain live in fear of former colleagues who might return seeking revenge.

It is a place where, in contrast to the perks FB lavishes on its own employees, team leaders micromanage moderators down to their bathroom and prayer breaks; where workers, desperate for a hit of dopamine amid the misery, have been found having sex in stairwells and in the lactation room; where people develop anxiety while still in training and suffer trauma symptoms long after they leave; and where the counseling Cognizant offers ends the moment an employee is fired.

Moderators told me that the conspiracy videos and memes they see every day gradually push their own views toward the fringe. One auditor now promotes flat-Earth theories. A former employee told me he has begun to question certain aspects of the Holocaust. Another former employee, who said he has mapped every escape route out of his own house and sleeps with a gun by his bed, told me: "I no longer believe 9/11 was a terrorist attack."

Chloe cries for a while in the break room, and then in the bathroom, but she begins to worry that she is missing too much of the training. She had been desperate for the job when she applied: she had recently graduated from college, and had few other options. As a full-time moderator she will earn $15 an hour - $4 more than the minimum wage in Arizona, where she lives, and more than most nearby retail jobs pay.

The tears stop and her breathing steadies. When she returns, one of her colleagues is reviewing another violent video. A drone is firing on people from the air. Chloe watches the figures on the screen crumple and die.

She leaves the room again.

Eventually a supervisor finds her in the bathroom and offers her a weak hug. Cognizant makes a counselor available to employees, but only part-time, and he has not yet arrived at the office. Chloe waits more than half an hour to see him.

When the counselor sees her, he explains that she has had a panic attack. He tells her that once she graduates, she will have more control over the videos than she had in the training room. She can pause them, he says, or watch them without sound. Focus on your breathing, he says. Make sure you don't get too absorbed in what you're watching.

"He told me not to worry - that I could probably still do the job," Chloe says. Then she corrects herself: "He said: don't worry, you can do this job."



On May 3rd, 2017, Mark Zuckerberg announced an expansion of FB's community operations team. The new hires, joining the existing force of 4,500 moderators, would be responsible for reviewing every post reported for violating community standards. By the end of 2018, in response to criticism over the prevalence of violent and exploitative content on the social network, FB had more than 30,000 employees working on safety and security - about half of them content moderators.

Some moderators are full-time FB employees, but FB relies heavily on contractors. Ellen Silver, FB's vice president of operations, wrote in a blog post last year that using contractors allowed FB "to scale globally" - moderators can work around the clock, evaluating posts in more than 50 languages at 20 sites around the world.

Using contractors also has a practical advantage for FB: it is radically cheaper. The median FB employee earns $240,000 a year in salary, bonuses, and stock; a moderator working for Cognizant in Arizona will earn just $28,800 a year. The arrangement helps FB maintain its high margins. Last quarter the company posted a profit of $6.9 billion on revenue of $16.9 billion. And although Zuckerberg had warned investors that FB's investment in safety would reduce the company's profitability, profits were up 61 percent over the previous year.

Since 2014, when Adrian Chen detailed the harsh working conditions of social network moderators for Wired, FB has been sensitive to criticism that it traumatizes its lowest-paid workers. In her blog post, Silver said that FB assesses whether prospective moderators "can cope with violent imagery," and screens them for their ability to withstand it.

Bob Duncan, who oversees Cognizant's content moderation operations in North America, says recruiters explain the graphic nature of the job to candidates in detail. "We share examples of the kinds of things they may see, so that they have an understanding," he says. "The intention of all of that is to ensure people understand it. And if they decide the work isn't right for them, they can make the appropriate decisions."

Until recently, most FB content moderation took place outside the United States. But as the demand for labor grew, FB expanded its presence in its home country, opening sites in California, Arizona, Texas, and Florida.

The United States is the company's home market and one of the countries where its social networks are most popular, Davidson says. American moderators are more likely to have the cultural context needed to evaluate US content that may involve hate speech or bullying, he says, because such posts often rely on country-specific slang.

FB also worked with its vendors to build what Davidson describes as "state-of-the-art facilities that would replicate an FB office and convey the same look and feel. That was important, because there's sometimes a perception in the market that our people sit in dark, dirty basements, lit only by the glow of their screens. That really isn't the case."

The Cognizant site in Phoenix is indeed neither dark nor dirty. Insofar as it offers rows of desks with computers on them, it resembles other FB offices. But while FB employees in Menlo Park work in spacious, sun-drenched complexes designed by architect Frank Gehry, its contractors in Arizona are crammed into crowded rooms, where long lines for the handful of toilets can eat up most of a break. And while FB employees enjoy great flexibility in structuring their days, Cognizant workers' time is scheduled down to the second.



A moderator named Miguel arrives for the day shift shortly before it begins at 7AM. He is one of about 300 workers who will gradually filter into their workplaces in an office occupying two floors of a Phoenix office park.

Security guards watch the entrance, making sure no disgruntled former employees get inside, nor any FB users who might come to confront moderators over deleted posts. Miguel badges into the office and heads for the lockers. There are barely enough lockers to go around, so some employees leave their belongings in them overnight to make sure they will have one the next day.

The lockers occupy a narrow hallway that fills with people during breaks. To protect the privacy of the FB users whose posts they review, workers are required to store their phones in the lockers during their shifts.

Pens and paper are also forbidden, so that Miguel won't be tempted to write down an FB user's personal information. The policy extends even to scraps of paper such as gum wrappers. Small items, such as hand lotion, must be carried in clear plastic bags so that managers can always see them.

To accommodate four daily shifts - and high staff turnover - most people are not assigned a permanent desk on what Cognizant calls the "production floor." Miguel finds an open workstation and logs into a piece of software called the Single Review Tool, or SRT. When he is ready to work, he clicks a button labeled "resume reviewing" and dives into the queue of posts.

Last April, a year after many of the documents had been published in the Guardian, FB made public the community standards by which it attempts to govern the 2.3 billion users who visit the service each month. In the months that followed, Motherboard and Radiolab published detailed investigations into the challenges of moderating such an enormous volume of content.

Among those challenges: the sheer volume of posts; the need to train a global army of low-paid workers to apply a single set of rules consistently; near-daily changes and clarifications to those rules; moderators' lack of cultural or political context; posts whose missing context makes them ambiguous; and frequent disagreements among moderators over which rules apply in which cases.

Despite the difficulty of codifying its policies, FB requires Cognizant and its other contractors to prize a metric called "accuracy" above all else. Accuracy here means that when FB employees audit a subset of the contractors' decisions, they agree with them. The company has set an accuracy target of 95 percent - and never quite hits it. Cognizant has never sustained that level for long; it usually hovers a bit below or above 90 percent, and at the time of publication it was around 92.

Miguel applies the policies diligently - even though, he says, they don't always make sense to him. A post reading "my favorite n-----" is allowed to stay up, because under the policy it counts as "explicitly positive content." "All autistic people should be sterilized" strikes him as offensive, but it stays up too: autism is not a "protected characteristic" the way race and gender are, so it doesn't violate the policy. ("Sterilize all men," on the other hand, would have to be removed.)

In January, FB distributed a policy update stating that moderators should take into account recent romantic upheaval in a user's life when evaluating posts expressing hatred toward a gender. "I hate all men" has always violated the policy. But "I just broke up with my boyfriend and I hate all men" no longer does.

Miguel works through the posts in his queue. They arrive in no particular order. Here is a racist joke. Here is a man having sex with a farm animal. Here is a murder video recorded by a drug cartel. Some of the posts Miguel reviews are on FB, where he says he often encounters bullying and hate speech; others are from Instagram, where users can post under pseudonyms, and where violence, nudity, and sexual activity are more common.

For each post, Miguel must make two separate judgments. First, he must determine whether the post violates the community standards. Then he must select the correct reason why it does. If he correctly determines that a post should be removed but picks the "wrong" reason, it counts against his accuracy score.

Miguel is very good at his job. He tries to make the right call on every post, purging FB of its worst content while protecting the maximum amount of legitimate (if uncomfortable) speech. He spends no more than 30 seconds on each post, and reviews up to 400 posts a day.

When Miguel has a question, he raises his hand, and a "subject matter expert" (SME) - a contractor considered to have deeper knowledge of FB's policies, who earns $1 more per hour than Miguel - walks over to help. This costs Miguel time, and while he has no formal minimum quota, managers track his productivity and ask him to explain himself when his daily count drops below 200.

Of the 1,500 or so decisions Miguel makes in a week, FB randomly selects 50 or 60 for audit. Those posts are reviewed by a second Cognizant employee on the quality assurance team, known as QA, who also earns $1 more per hour than Miguel. FB employees then audit a subset of the QA decisions, and from this chain of audits an accuracy score emerges.

Miguel is skeptical of the resulting accuracy figure. "Accuracy is only judged by agreement. If the auditor and I both allow the sale of cocaine, that counts as an 'accurate' decision, simply because we agreed," he says. "That number is nonsense."



FB's fixation on accuracy developed after years of criticism over its handling of moderation. With billions of posts appearing on the network every day, FB feels pressure from all sides. In some cases the company has been criticized for doing too little - as when UN investigators found that the social network had been complicit in spreading hate speech during the persecution of the Rohingya in Myanmar. In others it has been criticized for overreach - as when a moderator removed a post that quoted the US Declaration of Independence. (Thomas Jefferson was ultimately granted a posthumous exemption from FB's rules barring users from phrases such as "Indian savages.")

One reason moderators struggle to hit their accuracy target is that, for any given enforcement decision, they have to weigh several sources of truth.

The canonical source for enforcement is FB's public community guidelines, which consist of two sets of documents: the published version, and longer internal guidelines that go into more detail on complex questions. These are supplemented by a 15,000-word secondary document called "Known Questions," which offers additional commentary and guidance on thorny moderation questions - something like the Talmud to the guidelines' Torah. "Known Questions" once occupied a single lengthy document that moderators had to cross-reference daily; last year it was folded into the internal guidelines to make it easier to search.

A third source of truth is the discussion moderators have among themselves. During breaking news events, such as mass shootings, moderators try to reach a consensus on whether a video meets the criteria for deletion or should merely be marked as "disturbing." But sometimes, moderators say, that consensus is wrong, and managers have to walk the floor explaining the correct decision.

The fourth source is perhaps the most problematic: FB's own internal tools for distributing information. While official policy changes typically arrive every other Wednesday, incremental guidance on developing issues goes out almost daily. Much of it is posted on Workplace, the enterprise version of FB that the company launched in 2016. Like FB itself, Workplace has an algorithmic news feed that surfaces posts based on engagement. During a breaking news event such as a mass shooting, managers often post conflicting instructions about how to moderate particular pieces of content, and Workplace displays them out of chronological order. Six current and former employees told me they had made moderation mistakes after seeing an outdated post at the top of the feed. At times, it feels as if FB's own product is working against them. The irony is not lost on the moderators.

"It happened all the time," says Diana, a former moderator. "It was horrible - one of the most personally frustrating obstacles to doing my job correctly." During national tragedies, such as the 2017 Las Vegas shooting, managers would first tell moderators to remove a video - and then, in a separate post a few hours later, tell them to leave it up. Which decision a moderator made depended on which Workplace post happened to appear first.

"It was a total mess," Diana says. "We were supposed to be making careful decisions, and this was wrecking our stats."

Workplace posts about policy changes are sometimes supplemented by slide decks sent to Cognizant workers on specific topics - often tied to grim anniversaries, such as the Parkland shooting. But moderators told me that these presentations and other supporting materials often contain embarrassing errors. Over the past year, internal communications have confused members of the House of Representatives with senators, given the wrong date for an election, and misspelled the name of the Parkland school where the shooting took place.

And despite the ever-shifting rulebook, moderators are granted almost no margin for error. The job feels like a high-stakes video game in which you start with 100 points - a perfect accuracy score - and then scratch and claw to keep as many of them as you can. Fall below 95, and your job is at risk.

If a QA reviewer marks one of Miguel's decisions as wrong, Miguel can appeal. Getting QA to agree with you is known as "getting the point back." In the short term, an error is whatever QA says it is, so moderators have every incentive to contest each decision. Recently, Cognizant made getting a point back even harder, requiring an SME to approve the appeal before it is sent to QA.

Disputed calls sometimes make their way up to FB itself. But every moderator I interviewed said that Cognizant managers discourage employees from escalating questions to the client, apparently fearing that too many queries will annoy FB.

As a result, Cognizant has taken to inventing policy on the fly. Although the guidelines do not directly address images of sexual strangulation, for example, three former moderators told me that a team leader declared such images permissible as long as the fingers did not visibly press into the skin of the neck.

Before workers are fired, they are offered coaching and placed in a remedial program meant to ensure they have fully mastered the guidelines. But this often serves as a pretext for pushing employees out, three former moderators told me. Sometimes contractors who have missed too many points appeal their scores all the way to FB, which makes the final call. But the company, I was told, does not always get through the backlog of appeals before the employee in question has been fired.

Officially, moderators are prohibited from approaching QA reviewers to lobby them to reverse a decision. But it happens regularly anyway, two former QA employees told me.

One of them, Randy, would sometimes return to his car at the end of a shift to find moderators waiting for him. Five or six times a year, someone would try to intimidate him into changing a ruling. "They would confront me in the parking lot and threaten to beat me up," he says. "No one was ever polite or respectful about it. It was always: 'You audited me wrong! That was a breast! The areola was fully visible!'"

Fearing for his safety, Randy began bringing a concealed gun to work. Fired employees regularly threatened to return and settle scores with former colleagues, and Randy believed some of them were serious. A former coworker told me he knew about the gun Randy carried, and approved of the decision, worried that on-site security would not be enough in the event of an attack.

Duncan of Cognizant told me the company would investigate the various safety and management issues I described to him. He said bringing a gun to work was against policy, and that if management had known about it, they would have intervened and taken action against the employee.

Randy quit after a year. He never had occasion to fire the gun, but his anxiety has not let go. "Part of the reason I left was that I didn't feel safe even in my own home," he says.



Before Miguel can take a break, he must click a button in a browser extension to let the company know. ("That's standard procedure in this industry," Davidson told me. "So you can track your workforce and know where everyone is.")

Miguel is allowed two 15-minute breaks and one 30-minute lunch. During breaks he usually finds long lines for the restrooms. The office's several hundred employees share just one urinal and two stalls in the men's room, and three stalls in the women's. Cognizant eventually allowed workers to use restrooms on other floors, but getting there and back costs Miguel precious minutes. By the time he has used the bathroom and fought his way through the crowd to his locker, he might have five minutes left to glance at his phone before returning to his desk.

Miguel is also allotted nine minutes of "wellness time" a day, which he is supposed to use if he feels traumatized and needs to step away from his desk. Several moderators told me they used their wellness time to go to the restroom when the lines were shorter. But management eventually caught on, and ordered employees not to use wellness time for bathroom trips. (Recently a group of FB moderators employed by Accenture in Austin complained about "inhumane" conditions surrounding breaks; FB attributed the problem to a misunderstanding of its policies.)

In Phoenix, Muslim workers who used wellness time to perform one of their five daily prayers were ordered to stop and to pray on their regular breaks instead, current and former employees told me. The workers I spoke with could not understand why prayer did not count as a legitimate use of wellness time. Cognizant declined to comment on the incidents, though a person familiar with one case told me that a worker had requested 40 minutes a day for prayer, which the company considered excessive.

Cognizant employees are told to manage the stress of the job by seeing the on-site counselor when he is available, by calling a hotline, and through an employee assistance program that offers a handful of therapy sessions. Yoga and other therapeutic activities have recently been added to the workweek. But apart from occasional visits to the counselor, six employees told me, these resources are inadequate. Instead, they said, they cope with the stress in other ways: with sex, drugs, and offensive jokes.

Among the places Cognizant employees have been found having sex: the bathroom stalls, a stairwell, the parking lot, and the lactation room. In early 2018, security sent managers a memo about the behavior, a person familiar with the matter told me. In response, managers removed the locks from the lactation room door and from several other private rooms. (The lactation room is now locked again; employees who want to use it must first get a key from an administrator.)

Former moderator Sarah said the secrecy surrounding the work, combined with its difficulty, forges powerful bonds between employees. "You get really close to your coworkers really quickly," she says. "If you're not allowed to talk to your friends or family about your job, it distances you from them. And it makes you feel closer to the people at work. It feels like an emotional bond, even though it's really just a relationship built on shared trauma."

Workers also cope with drugs and alcohol, both on and off the job site. One former moderator, Lee, told me he used a marijuana vape at work almost daily. During breaks, he says, small groups of employees often went outside to smoke. (Medical marijuana use is legal in Arizona.)

"I can't even count how many people I've smoked with," Lee says. "It's so sad, looking back - it breaks my heart. We'd go down, get stoned, and go back to work. That's not professional. Imagine the moderators of the world's biggest social network doing that on the job, while they're moderating content..." He trails off.

Lee, who worked as a moderator for about a year, was one of several employees who told me the office was dominated by gallows humor. Workers would compete to send each other the most racist or offensive memes, he said, in an effort to lighten the mood. As a member of an ethnic minority, Lee was a frequent target, and he accepted the racist jokes as well-intentioned. But over time, he began to worry about his mental health.

"We were doing something that was darkening our souls - call it what you want," he says. "What else can you do in a situation like that? The one thing that makes us laugh is actually damaging us. I had to watch what jokes I told in public. I kept accidentally saying offensive things - and then suddenly remembering I was in the grocery store, and you can't say that there."

Jokes about self-harm were also common. Sarah once heard a colleague, asked by the counselor how he was doing, answer: "I drink to forget all this." (The counselor did not invite the colleague to talk about it further.) On particularly bad days, Sarah says, people talk about it being time to "go hang out on the roof" - the joke being that one day Cognizant workers will throw themselves off it.

One day, Sarah said, moderators looked up from their computers to see a man standing on the roof of a neighboring office building. Many of them had watched suicides that began exactly this way. The moderators jumped up and rushed to the windows.

But the man did not jump. Eventually everyone realized he was a coworker, taking a break.



Like most of the former moderators I spoke with, Chloe quit after about a year.

Among other things, she had grown worried about the spread of conspiracy theories among her colleagues. One QA employee often talked about his belief that the Earth is flat, and "was actively trying to recruit other people" into his worldview, another moderator told me. One of Miguel's colleagues once mentioned the "Holohoax," which Miguel took to mean the man was a Holocaust denier.

Conspiracy theories often found a receptive audience in the office, six moderators told me. After the Parkland shooting last year, moderators were initially horrified by the attack. But the more conspiracy posts about it they saw on FB and Instagram, the more some of Chloe's colleagues began to voice doubts.

"People really started to believe these posts they were supposed to be moderating," she says. "They were saying: Oh gosh, they weren't really there. Look at this CNN report about David Hogg - he's too old to be a high schooler. People would Google things instead of working, going down the rabbit hole on all these conspiracy theories. We told them: Guys, no, this is the crazy stuff we're supposed to be moderating. What are you doing?"

Mostly, though, Chloe worries about the job's long-term effects on mental health. Several moderators told me they experienced symptoms of secondary traumatic stress, a condition that can result from witnessing trauma experienced by others. The condition, whose symptoms can mirror those of post-traumatic stress disorder (PTSD), is common among doctors, psychotherapists, and social workers. Sufferers report anxiety, sleep loss, and dissociation.

Last year, a former FB moderator in California sued the company, claiming that her work for the contractor Pro Unlimited had given her PTSD. Her lawyers said she was "seeking to protect herself from the dangers of psychological trauma resulting from FB's failure to provide a safe workplace for the thousands of contractors who are entrusted to provide a safe environment for FB users." The case is ongoing.

Chloe experienced trauma symptoms for several months after leaving the job. She had a panic attack in a movie theater during the film Mother!, when a massacre on screen recalled the very first video she had encountered during training. Another time, she was sleeping on the couch when she heard automatic gunfire and panicked: someone in the house had turned on a TV show with a shootout. She says she started "freaking out and begging them to turn it off."

The attacks made her think of her colleagues, especially the ones who never made it through training. "A lot of people just can't get through the training period," she says. "They go through those four weeks and then they're fired. They could end up with the exact same experience I had, and they'll have no access to a counselor at all."

Davidson told me that last week the FB began monitoring a test group of moderators, measuring what the company calls their “resilience”: their ability to recover after viewing traumatic material and keep doing their job. He says the company hopes to extend the test to all moderators around the world.



Randy also quit after about a year. Like Chloe, he was deeply affected by the video of a man being stabbed. The victim was about his age, and he remembers hearing the dying man call out for his mother.

“I see it every day,” Randy says. “I developed a fear of knives. I like to cook, but now it's very hard for me to go back into the kitchen and be around knives.”

The work also changed his view of the world. After watching so many videos claiming that the 9/11 attacks were not the work of terrorists, he began to believe them. Conspiracy videos about the Las Vegas shooting were also very convincing, he says, and he now believes several shooters were involved in the attack (the FBI concluded it was the work of a lone gunman).

Now Randy sleeps with a gun at his side. He mentally rehearses how he would flee his house in case of an attack. When he wakes up in the morning, he walks through the house with the gun in hand, searching for intruders.

He recently started seeing a new psychotherapist after being diagnosed with PTSD and an anxiety disorder.

“I'm completely overwhelmed,” says Randy. “My mental state swings back and forth. One day I'm perfectly happy; the next I'm like a zombie. It's not that I'm depressed, I'm just stuck.”

He adds: “I don't think it's possible to do this work without developing acute anxiety or PTSD.”

The moderators I spoke with often complain that the on-site psychologists are passive, waiting for workers to recognize the signs of anxiety and depression themselves and come asking for help.

“They did absolutely nothing for us,” says Lee. “They just expected us to figure out on our own when we were breaking down. Most of the workers there are slowly degrading without even noticing it. That's what kills me.”



Last week, when I told FB representatives about my conversations with the moderators, the company invited me to Phoenix to see the office for myself. It was the first time the FB had allowed a journalist to visit one of its moderation offices in the United States since the company began building dedicated sites there two years ago. The representative who met me on site said the stories I had been told do not reflect the everyday experience of most contractors, either in Phoenix or in other offices around the world.

One source told me that on the eve of my arrival, motivational posters had been hung on the walls of the office center where Cognizant is located. Overall, the place turned out to be far more colorful than I expected. A neon chart on the wall laid out a month of activities that looked like a cross between a summer-camp schedule and a retirement-home calendar: yoga, animal therapy, meditation, and an event inspired by the movie Mean Girls called “On Wednesdays we wear pink.” The day of my visit was the last day of “random acts of kindness week,” during which employees were encouraged to write inspirational phrases on colored paper and attach them to the walls along with candy.

After talking with managers from Cognizant and Facebook, I interview five employees who volunteered to speak with me. They file into the meeting room one by one, accompanied by the head of the office. With their boss sitting beside them, they acknowledge the difficulties of the work, but tell me they feel safe and supported, and believe the job offers a path to better-paid positions, if not at the FB itself, then at Cognizant.

Brad, a policy manager, tells me that most of the content he and his colleagues moderate is essentially harmless, and asks me not to exaggerate the mental-health risks of the work.

“It feels like we're being bombarded with these images and other content, but it's actually the opposite,” says Brad, who has worked in the office for nearly two years. “Most of what we see is quite mild. It's people overreacting. It's people complaining about photos and videos they simply don't want to see, not because there's anything wrong with the content. Most of what we see is just that.”

When I ask about the difficulty of applying the policy, a moderator named Michael says he is constantly confronted with hard decisions. “There are infinite possibilities for what the next job will be, and that creates a sense of chaos,” he says. “But it's also what keeps the work interesting. You'll never work a whole day already knowing the answer to every question.”

In any case, says Michael, he likes this work better than his previous job at Walmart, where customers often yelled at him. “Nobody yells at me here,” he says.

The moderators leave the room one by one, and I am introduced to two full-time psychologists, one of whom runs the psychological support program in this office. They ask me not to use their real names. They say they check in on every employee daily, and that the combination of on-site services, a hotline, and employee assistance programs is enough to protect employees' well-being.

When asked about the risk of PTSD, the psychologist, whom I will call Logan, tells me about another psychological phenomenon: post-traumatic growth, in which survivors of traumatic events emerge stronger from their experience. He cites as an example Malala Yousafzai, the education activist who was shot in the head as a teenager by a militant of Tehrik-i-Taliban Pakistan.

“That was an extremely traumatic event for her,” says Logan. “But she seems to have come out of it stronger and more resilient. She won the Nobel Peace Prize. So there are many examples of people who live through hard times and become stronger for it.”

The visit ends with a tour, during which I walk the floor and talk with other employees. I am struck by how young they are: almost everyone is in their twenties or early thirties. While I walk around the office, all work is paused so that I cannot accidentally see FB users' personal information, so the employees chat amiably with their neighbors as I pass. I notice the posters. One, from Cognizant, bears the cryptic slogan “empathy at scale.” Another, quoting FB chief operating officer Sheryl Sandberg, asks: “What would you do if you weren't afraid?”

I immediately think of Randy and his gun.



Everyone I met in the office expresses deep concern for the employees and seems to be doing everything possible for them within the system they all have to work in. The FB is proud that it pays contractors at least 20% above the minimum wage at all of its moderation offices, provides full medical insurance, and offers better psychological support than the largest call centers do.

And yet, the more moderators I interviewed, the more I came to doubt the wisdom of using a call-center model to organize content moderation. The model has long been standard among the tech giants: Twitter uses it, as do Google and therefore YouTube. Beyond cost savings, outsourcing lets the giants expand quickly into new markets and new languages. But the strategy also places critical questions of speech and safety in the hands of people who are paid as if they were answering customer calls for an electronics retailer.

All the moderators I interviewed take enormous pride in their work and talk about it with the utmost seriousness. They just want FB employees to think of them as colleagues and to treat them with something resembling equality.

“If we weren't doing this job, the FB would look terrible,” says Lee. “We review all of this for them. And yes, hell, sometimes we make the wrong call. But people don't understand that actual human beings are working here.”

That people don't understand this is, of course, by design. The FB would rather talk up its advances in artificial intelligence and the prospect that, thanks to them, it will one day rely less on human moderators.

But given the limits of the technology and the endless variety of human speech, that day seems very far off. In the meantime, the call-center model of content moderation takes a heavy toll on many of the people who work in it. Standing on the front lines of a platform with billions of users, they perform a function critical to modern civilization while earning less than half of what others on those front lines are paid. They do the work as long as they can, and then they quit, and the non-disclosure agreements ensure that they retreat as deep into the shadows as possible.

And from the FB's point of view, it is as if they never worked there at all. Which, technically, they didn't.
