Social media: Keeping horrors off Facebook
“Imagine putting up with your commute, heading into work, and sitting in a cubicle watching murders, child abuse, drug use, and suicides all day,” said Mike Murphy in Qz.com. That’s the job Facebook is hiring for after a string of horrific videos appeared on the social network in recent weeks. CEO Mark Zuckerberg has pledged to hire 3,000 more people to monitor the site for disturbing content, in addition to the 4,500 workers who already investigate videos reported by users. Having more moderators should help the social network spot and take down the worst videos much more quickly, like the grisly video that appeared on Facebook Live last month of a man in Thailand killing his 11-month-old daughter and then committing suicide. But while Facebook’s nearly 2 billion users may be spared such horrors, what about the 7,500 workers whose job it is to regularly watch the worst of humanity?
“It is well documented that these jobs are, psychologically speaking, among the worst in tech,” said Jason Koebler in Vice.com. Unbeknownst to many users, armies of contract workers are responsible for keeping all manner of terrible images off the web’s most popular sites, and they typically work long hours for low pay. Many of those workers have described suffering serious psychological damage from the images they’ve seen. Yet, “when asked why they accept such jobs, most moderators utter some variation of ‘Someone has to do it.’” Facebook and other companies are investing heavily in artificial intelligence that would make the job of content moderation less dependent on humans. The company now uses a database of known child pornography, for instance, to automatically remove some posts. But the technology still largely relies on human judgment to differentiate between, say, an action movie and a video of a real-life shooting. “Right now, there simply is no psychologically ‘safe’ way to keep Facebook clean.”
It’s good that Facebook is finally doing something, said Emily Dreyfuss in Wired.com. “But what if such reflections had taken place more meaningfully before pushing video so heavily on the site?” Surely the social network could have anticipated that, with so many users, it was inevitable that a few deranged people would use Facebook Live to broadcast acts of violence. Maybe Facebook should hire an in-house ethicist to work through the implications of its products before they launch, rather than after. Maybe it’s also time to start regulating Facebook as we do the public airwaves, said Cale Guthrie Weissman in FastCompany.com. Facebook has long insisted that it’s not a media company, but “if such acts of violence happened that often on your local TV news broadcast, that station would probably be taken off the air and face hefty fines.” ■