The Verge writes a story (an exposé?) on the Facebook-moderation industry.
It goes through the standard ways it maltreats its employees: low pay, limited bathroom breaks, awful managers – and then into some not-so-standard ones. Mods have to read (or watch) all of the worst things people post on Facebook, from conspiracy theories to snuff videos. The story talks about the psychological trauma this inflicts:
It’s an environment where workers cope by telling dark jokes about committing suicide, then smoke weed during breaks to numb their emotions…where employees, desperate for a dopamine rush amid the misery, have been found having sex inside stairwells and a room reserved for lactating mothers…
It’s a place where the conspiracy videos and memes that they see each day gradually lead them to embrace fringe views. One auditor walks the floor promoting the idea that the Earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust. Another former employee, who told me he has mapped every escape route out of his house and sleeps with a gun at his side, said: “I no longer believe 9/11 was a terrorist attack.”
One of the commenters on Reddit asked “Has this guy ever worked in a restaurant?” and, uh, fair. I don’t want to speculate on how much weed-smoking or sex-in-stairwell-having is due to a psychological reaction to the trauma of awful Facebook material vs. ordinary shenanigans. But it sure does seem traumatic.
Other than that, the article caught my attention for a few reasons.
First, because I recently wrote a post that was a little dismissive of moderators, and made it sound like an easy problem. I think the version I described – moderation of a single website’s text-only comment section – is an easier problem than moderating all of Facebook and whatever horrible snuff videos people post there. But if any Facebook moderators, or anyone else in a similar situation, read that post and thought I was selling them short, I’m sorry.
Second, because the article gives a good explanation of why Facebook moderators’ job is so much harder and more unpleasant than my job or the jobs of the mods I work with: they are asked to apply a set of rules so arcane that the article likens them to the Talmud, then have their decisions nitpicked to death – with career consequences for them if higher-ups think their judgment calls on edge cases were wrong.
While I was writing the article on the Culture War Thread, several of the CW moderators told me that the hard part of their job wasn’t keeping the Thread up and running and well-moderated, it was dealing with the constant hectoring that they had made the wrong decision. If they banned someone, people would say the ban was unfair and they were tyrants and they hated freedom of speech. If they didn’t ban someone, people would say they tolerated racism and bullying and abuse, or that they were biased and would have banned the person if they’d been on the other side.
Me, I handle that by not caring. I’ve made it clear that this blog is my own fiefdom to run the way I like, and that disagreeing with the way I want a comment section to look is a perfectly reasonable decision – which should be followed by going somewhere other than my blog’s comment section. Most of my commenters have been respectful of that, I think it’s worked out very well, and my experience moderating thousands of comments per week is basically a breeze.
Obviously this gets harder when you have hundreds of different moderators, none of whom are necessarily preselected for matching Facebook HQ’s vision of “good judgment”. It also gets harder when you’re a big company that wants to keep users, and your PR department warns you against telling malcontents to “go take a hike”. It gets harder still when you host X0% of all online discussion, you’re one step away from being a utility or a branch of government or something, and you have a moral responsibility to shape the world’s conversation in a responsible way – plus various Congressmen who will punish you if you don’t. The way Facebook handles moderation seems dehumanizing, but I don’t know what the alternative is, given the pressures they’re under.
(I don’t know if this excuses sites like the New York Times saying they can’t afford moderators; I would hope they would hire one or two trusted people, then stand by their decisions no matter what.)
Third, I felt there was a weird tension in this article, and after writing that last paragraph I think I know what it is. This was a good piece of investigative reporting, digging up many genuinely outrageous things. But most of them are necessary and unavoidable responses to the last good piece of investigative reporting, and all the outrageous things it dug up. Everything The Verge is complaining about is Facebook’s attempt to defend itself against publications like The Verge.
Take, for example, the ban on phones, writing utensils, and gum wrappers.
The Verge brings this up as an example of the totalitarian and dehumanizing environment that Facebook moderators experience. But I imagine that if an employee had written down (or used their phone to take a picture of) some personal details of a Facebook user, The Verge (or some identical publication) would have run a report on how Facebook hired contractors who didn’t even take basic precautions to protect user privacy.
And what about the absolutist, infinitely-nitpicky rules that every moderator has to follow (and be double- and triple-checked to have followed) on each decision? Again, totalitarian and dehumanizing, no argument there. But if a moderator screwed up – if one of them banned a breastfeeding picture as “explicit”, and the Facebook Talmud hadn’t included twelve pages of exceptions and counterexceptions for when breasts were and weren’t allowed – I imagine reporters would be on that story in a split second. They would be mocking Facebook’s “lame excuse” that it was just one moderator acting alone and not company policy, and leading the demands for Facebook to put “procedures” in place to ensure it never happens again.
If I sound a little bitter about this, it’s because I spent four years working at a psychiatric hospital, helping create the most dehumanizing and totalitarian environment possible. It wasn’t a lot of fun. But you could trace every single rule to somebody’s lawsuit or investigative report, and to some judge or jury or news-reading public that decided it was outrageous that a psychiatric hospital hadn’t had a procedure in place to prevent whatever occurred from occurring. Some patient in Florida hit another patient with their book and it caused brain damage? Well, that’s it, nobody in a psych hospital can ever have a hardcover book again. Some patient in Delaware used a computer to send a threatening email to his wife? That’s it, psych patients can never use the Internet unless supervised one-on-one by a trained Internet supervisor with a college degree in Psychiatric Internet Supervision, which your institution cannot afford. Some patient in Oregon managed to hang herself in the fifteen-minute interval between nurses opening her door at night to ask “ARE YOU REALLY ASLEEP OR ARE YOU TRYING TO COMMIT SUICIDE IN THERE?” Guess nurses will have to do that every ten minutes now. It was all awful, and it all created a climate of constant misery, and it was all 100% understandable under the circumstances.
I’m not saying nobody should ever be allowed to do investigative reporting or complain about problems. But I would support some kind of anti-irony rule, where you’re not allowed to make extra money writing another outrage-bait article about the outrages your first outrage-bait article caused.
But maybe this is unfair. “Complete safety from scandal, or humanizing work environment – pick one” doesn’t seem quite right. High-paid workers sometimes manage to do sensitive jobs while still getting a little leeway. When I worked in the psychiatric hospital, I could occasionally use my personal authority to suspend the stupidest and most dehumanizing rules. I don’t know if they just figured that medical school selected for people who could be trusted with decision-making power, or if I was high-ranking enough that everyone figured my scalp would be enough to satisfy the hordes if I got it wrong. But it sometimes went okay.
And lawyers demonstrate a different way that strict rules can coexist with a humanizing environment; they have to navigate the most complicated law code there is, but I don’t get the impression that they feel dehumanized by their job.
(but maybe if the government put as much effort into preventing innocent people from going to jail as Facebook puts into preventing negative publicity, lawyers would feel just as dehumanized.)
It seems like The Verge’s preferred solution, a move away from “the call center model” of moderation, might have whatever anti-dehumanization virtue doctors and lawyers have. Overall I’m not sure how this works, but it prevents me from being as snarky as I would be otherwise.
(except that I worry in practice this will look like “restrict the Facebook moderation industry to people with college degrees”, and we should think hard before we act like this is doing society any favors)
Last, I find this article interesting because it presents a pessimistic view of information spread. Normal people who are exposed to conspiracy theories – without any social connection to the person spouting them, or any pre-existing psychological vulnerabilities that make them seek the conspiracy theories out – end up believing them, or at least suspecting they might be true. This surprises me a little. If it’s true, how come more people haven’t been infected? How come Facebook moderators don’t believe the debunking of the conspiracy theories instead? Is it just that nobody ever reports those for mod review? Or is this whole phenomenon just an artifact of every large workplace (the article says “hundreds” of people work at Cognizant) having one or two conspiracy buffs, and in this case the reporter hunted them down because it made a better story?
Just to be on the safe side, every time someone shares an SSC link, report it as violating the Facebook terms of service. We’ll make rationalists out of these people yet!