Three months in hell

What I learned from three months of content moderation for Facebook in Berlin



Read the German version here.


Germany has become one of Facebook's most important hubs for content moderation, fueled in part by a controversial new law that requires social media companies to remove hate speech and violence effectively. In Berlin and Essen, more than 1,000 people work as Facebook content moderators, most of them employed by the outsourcing company Arvato, a subsidiary of Bertelsmann, one of Germany's most powerful companies. Yet the work and the rules of content moderation are kept secret. In a year-long investigation, our reporters Hannes Grassegger and Till Krause spoke to dozens of current and former content moderators working for Facebook in Germany and wrote several award-winning reports (»Inside Facebook«) that made Facebook's working conditions and deletion rules (»The secret rules of Facebook«) public. Recently they were contacted by Burcu Gültekin Punsmann, a former employee who, for the first time, gives a personal account of her work as a content moderator. We have slightly shortened and edited her piece for clarity.

As a newcomer to Berlin in July 2017, I found myself in a job in content moderation at Arvato. Curiosity was a main driver. I accepted the very unappealing job offer and entered a world whose existence I had never suspected. I was recruited as an outsourced reviewer and became one of the thousands of Facebook Community Operations team members around the world working in some 40 languages. Berlin, which draws well-educated, multilingual, cheap labor from all over the world, has recently developed, as I would learn, into a new center for Facebook content moderation, as the German government toughened its legislation against hate speech.

I quit the job after three months. Today I feel the need to take time to reflect on this very intense professional and personal experience. This is mainly an exercise in self-reflection and learning. I also consider it a way to dissociate myself from the very violent content I handled on a daily basis. Through my account I wish to add transparency and to contribute to discussions on content moderation practices. I don't intend to violate the employee non-disclosure agreement I signed, and I will not dig into the policies developed by Facebook. Through my relatively short work experience, I learned that these policies are constantly re-evaluated, highly dynamic and reactive. From my vantage point, I could not perceive well enough the factors that shape these sets of policies.

I enjoyed the training and induction period. I was happy to be in a very international context, which I was used to from my previous roles in international development. The difference here was that everyone was a migrant, mostly young, with very diverse backgrounds. This eclecticism definitely attracted me. The paradox of working behind high walls of confidentiality for a social media platform that connects people and the world made the job unique and certainly aroused my curiosity.

Facebook and most social media platforms rely on a user- and community-based regulation system: no one scrutinizes content before it is uploaded. Users can report posts they find inappropriate on the platform. The content moderator basically handles these reports, called tickets. With more than 2 billion users worldwide, the amount of content generated on Facebook is massive. As far as I know, some 6.5 million reports are generated on average every week. The work never stops; it continues seven days a week with almost no interruption.

I have difficulty conceptualizing the role I played. Was I acting as a censor, restricting freedom of speech? I don't think that describes it exactly. I respected the freedom to offend and to make use of the most creative forms of expression. Working in the Turkish market, I handled a lot of reports that reflected the ideological divides in Turkish society, disputes over historical narratives and posts attacking religion, some with humor, some very serious. I was happy to witness how the freedom to challenge and transgress values, ideas and beliefs could be energizing and creative.

Indeed, I dealt more often with behavior than with speech. I would say that the pictures and texts were in most cases not a form of self-expression but rather depicted attitudes. These behaviors incorporated a lot of violence and cruelty, ranging from hate speech to sadism, from bullying to self-harm. Without euphemism: the content is very violent. I had been exposed to real-world violence before, having worked in peacebuilding and humanitarian aid in conflict-torn contexts. Yet from my own experience as a user, I was far from imagining that violence could be so prevalent on social media.

A minority of these posts were of a criminal nature. This disturbed me even more: these attitudes, behaviors and forms of expression would easily convince any non-specialist that society had gone insane. I witnessed behavior that respected no social norm and was pushed into a world where the notions of intimacy and decency vanish, where there is no respect for privacy. Did these social media tools subconsciously encourage people to overcome all inhibition by destroying social filters and moral barriers? Many people on social media showed behavior outside any social norm, yet it cannot be considered anti-social, since the individual who posts is looking for followers, 'likes' and social interaction. The purpose seems to be to attract attention. I have often wondered what would happen if people behaved in a similar way in public space and adopted these attitudes on public transportation, in cafés or in parks.

I had to quit because I was particularly disturbed by what I saw as signs of professional deformation in myself: a kind of hypervigilance (especially about the risks to my family). I was dreaming about the job, and my own perception of reality shifted in a most concerning way. The terrible Las Vegas shooting suddenly seemed entirely normal to me. I also realized that my work schedule could not be made compatible with my family life. I have a seven-year-old daughter, and during my late shifts I could barely see her.

At first the job seemed like a personal challenge, a test of my capacity to adapt to a radically different, extremely rigid and constraining work environment. I found myself in a factory world, part of a global digital proletariat. I was one of a cohort of more than 700 people, in a closed environment with no communication with the outside world. My productive time and breaks were precisely calculated; I was tied to my workstation and could leave the production line only for a few minutes at a time. I don't know whether we were producing anything, but I got the sense that we were helping to keep a multi-billion-dollar industry running.

The content moderator performs a serious job of analysis. The task is genuinely difficult and context-sensitive. The moderator not only has to decide whether reported posts should be removed or kept on the platform but must also navigate a highly complex hierarchy of actions. These mental operations are considered too complex for algorithms. Nevertheless, moderators are expected to act like computers. The drive for uniformity and standardization, together with strict productivity metrics, leaves little room for human judgment and intuition. At the end of the ramp-up process, a moderator is expected to handle approximately 1,300 reports every day, which leaves him or her on average only a few seconds to reach a decision on each report. The intellectually challenging task tends to become an automated action, almost a reflex. Repetition triggers a sense of frustration and alienation. Reflection is not encouraged; the agent has very limited initiative and can only collect examples to address policy loopholes. Such standardization is meant to contribute to objectivity. The agent should not make any assumptions and should refrain from questioning the intentions of the person behind a post. The abuse needs to be clear and in the frame. I nevertheless think that more human judgment could have helped to prevent the cases that later blew up in the press.

The tight work schedule reduces social interaction among staff to a minimum. Although the moderator is part of a team and sits in an open office, the virtual space is encapsulating and engenders a feeling of isolation. The impact of the content viewed is thereby amplified. The agent in the queue (the production line) receives tickets (reports) at random. Texts, pictures and videos keep flowing. There is no way to know beforehand what will pop up on the screen. The content is very diverse. No time is left for a mental transition; it is entirely impossible to prepare oneself psychologically. One never knows what one will run into. It sometimes takes a few seconds to understand what a post is about. The agent is in a continual state of stress. The speed reduces the complex analytical process to a succession of automatisms. The moderator reacts. An endless repetition. It becomes difficult to disconnect at the end of the eight-hour shift. The task creates a kind of dependency, even addiction. Even in my dreams I kept repeating the same task. I was glad at least that I did not visualize the content.

During the long hours at my workstation, I wished I were a social worker or a psychologist. I sometimes had the impression that I was working in the emergency services, at other times in law enforcement. I felt powerless, as I couldn't intervene. Was this a digital reflection of society? I tried to convince myself that I was exposed only to its most marginal and radical segments. Violence seemed so widespread. It was good that I was not working on reports from the society I lived in; I felt all the more sorry for my colleagues working on German-language reports.

And this is not a job that everyone can do. We were asked in the recruitment interview (which took only 15 minutes) whether we thought we could handle the content. It is not easy to know beforehand. I was probably not the least prepared: I had previously worked under stressful conditions in peacebuilding and humanitarian aid. Above all, I was more experienced and older than most of my colleagues. I felt much more sorry for my young colleagues. This system is built on a young, migrant workforce; my colleagues were 28 years old on average.

As a new recruit, I took part in a half-day workshop organized by the psychological support consultant. I understood that this service had not been available from the very beginning. I found the workshop not focused enough, but it is of course a positive development, better than nothing. It at least gives workers the chance to leave their workstations and talk to someone. I didn't ask for any personal consultation, and I can't say precisely how many people do. It was good to know that we had such a resource. I felt how scared and concerned some of my colleagues were about the long-term effects of the job. In such a situation it is very important to talk and share, to 'debrief' each other. We didn't have any space for it. We sometimes had these discussions on public transportation, in English. Still, I felt quite uneasy about the possible 'side effects' of our conversations on other passengers.

As for verbal violence, the agent quickly becomes an expert in slurs and ways of cursing. The vocabulary in use is actually quite limited; I sometimes found long posts devoted entirely to cursing at one person fascinating. Trying to differentiate cursing from sexual exploitation was another major source of difficulty. It had never occurred to me before how sexually suggestive and gendered the vocabulary of cursing could be. Bullying is particularly widespread and a source of headaches for the content moderator. It is not limited to teenagers. Seeing how cruel and pitiless one person could be to another profoundly disturbed me. Many posts seemed to be motivated only by the intention to hurt. Many were expressions of pure sadism.

A lot of content displays extreme graphic violence, which makes the job of the content moderator very stressful. Everyone has to find his or her own way of coping with it. I realized how much experience I had gained when I compared myself with new recruits. Under constant stress, the intellect tries to retake control in order to overcome extreme feelings of anger and disgust. This effort is in itself exhausting. I intuitively learned to adopt a kind of medical approach to the human body, observing innards and tissues. I felt as if I were working in forensics while reviewing content depicting violent death. I sometimes had to replay a video several times or focus on the details of a picture, as the policies can be quite precise and require proximity to the scene of the crime or accident.

The mental process became much more difficult in cases where I couldn't help feeling empathy. The display of blood and of mutilated and charred bodies is sheer horror. I learned how to overcome my disgust and stand it. I felt empathy only when I found something connecting me with the world of the living: the small details, which I tried not to notice, that would humanize the corpse and overcome my reflex of repulsion. Witnessing the distress of survivors had a powerful effect as well. In this respect, the worst of all were videos of torture, especially of child and animal abuse.

Working in the Turkish market, I got an insider's view of the conflicts of the Middle East. Conflicts that particularly polarize a society are reflected on social media. I noticed that the conflicts in Iraq and Syria were spilling over into social media in Turkey. I felt as if I were embedded with the different sides of the Kurdish conflict, navigating between the insurgency and the special units. Sympathizers often seem to be behind the online organization of the war effort. Fighters use online technologies to document their war and produce a variety of material depicting comradeship, exalting bravery and mocking or dehumanizing the enemy. Most of this content is probably released without the approval of the chain of command. This experience illustrated the absurdity of the conflict. I found consolation in the fact that both sides were interacting on social media and sharing the same cultural references.

For a few weeks our queues were flooded with pictures and videos of the massacres of the Rohingya people in Myanmar. The postings served advocacy and awareness-raising purposes and were accompanied by denunciations of the crimes and calls to action. The content was considered newsworthy. Some of these pictures went viral and circulated as gore content. I noticed that horror does not necessarily support advocacy.

Videos of beheadings are the most feared among content moderators, and I have indeed seen quite a lot of them. The content is of course very graphic. But for me, the abundance of blood prevented empathy; the death is fast. I struggled not to feel sick. During the period of the Muslim Feast of Sacrifice, I had to handle a lot of content showing cattle slaughter. I was disturbed to realize that it affected me not much differently than human beheadings did. Perhaps even a bit more, as the animals' agony lasts longer.

ISIS was not the only source of beheading videos. A lot of content originated from the war against drug cartels in Latin America, from Myanmar, and even from the southern borderlands of my own country. The ISIS videos, though, were more sophisticated: the mise en scène and background music almost made the content seem unreal, as long as I couldn't discern the victims' last facial expressions. The real issue is why these videos circulate. What is the motivation behind posting a beheading video? Very few of the ISIS videos I saw were posted by ISIS-linked sources. Are they posted merely out of sadism and a desire to show violence? Or simply to attract attention with shocking content?

The only power a content moderator has is to delete a post. On rare occasions, I could send a notification to the person pointing out that the post was cruel. I wished I could have used that function more often. I often dreamed of being able to communicate with the person behind a post, but there was no time. There is a need to educate society, and that requires a collective effort. The protection that content moderators try to offer is only enabled by the reports generated by users. The degree of secrecy and confidentiality around the practices of content moderation doesn't help. The burden and responsibility cannot rest on the technology companies alone. But they can support the development of online communities.

Photo: Reuters