Inside Facebook’s Secret Rulebook for Global Political Speech

2019-1-1 21:46 | Posted by: 刘海明 | Views: 109 | Comments: 0 | Original author: Max Fisher | Source: NYT



Under fire for stirring up distrust and violence, the social network has vowed to police its users. But leaked documents raise serious questions about its approach.

MENLO PARK, Calif. — In a glass conference room at its California headquarters, Facebook is taking on the bonfires of hate and misinformation it has helped fuel across the world, one post at a time.

The social network has drawn criticism for undermining democracy and for provoking bloodshed in societies small and large.

But for Facebook, it’s also a business problem.

The company, which makes about $5 billion in profit per quarter, has to show that it is serious about removing dangerous content. It must also continue to attract more users from more countries and try to keep them on the site longer.

How can Facebook monitor billions of posts per day in over 100 languages, all without disturbing the endless expansion that is core to its business? The company’s solution: a network of workers using a maze of PowerPoint slides spelling out what’s forbidden.


Every other Tuesday morning, several dozen Facebook employees gather over breakfast to come up with the rules, hashing out what the site’s two billion users should be allowed to say. The guidelines that emerge from these meetings are sent out to 7,500-plus moderators around the world. (After publication of this article, Facebook said it had increased that number to around 15,000.)

The closely held rules are extensive, and they make the company a far more powerful arbiter of global speech than has been publicly recognized or acknowledged by the company itself, The New York Times has found.


The Times was provided with more than 1,400 pages from the rulebooks by an employee who said he feared that the company was exercising too much power, with too little oversight — and making too many mistakes.

An examination of the files revealed numerous gaps, biases and outright errors. As Facebook employees grope for the right answers, they have allowed extremist language to flourish in some countries while censoring mainstream speech in others.


Moderators were once told, for example, to remove fund-raising appeals for volcano victims in Indonesia because a co-sponsor of the drive was on Facebook’s internal list of banned groups. In Myanmar, a paperwork error allowed a prominent extremist group, accused of fomenting genocide, to stay on the platform for months. In India, moderators were mistakenly told to flag for possible removal comments critical of religion.

[Image] The ruins of a home set upon by a Buddhist mob in a deadly attack in Sri Lanka last March. Facebook has been accused of accelerating violence in the country. Credit: Adam Dean for The New York Times

The Facebook employees who meet to set the guidelines, mostly young engineers and lawyers, try to distill highly complex issues into simple yes-or-no rules. Then the company outsources much of the actual post-by-post moderation to companies that enlist largely unskilled workers, many hired out of call centers.

Those moderators, at times relying on Google Translate, have mere seconds to recall countless rules and apply them to the hundreds of posts that dash across their screens each day. When is a reference to “jihad,” for example, forbidden? When is a “crying laughter” emoji a warning sign?

Moderators express frustration at rules they say don’t always make sense and sometimes require them to leave up posts they fear could lead to violence. “You feel like you killed someone by not acting,” one said, speaking on the condition of anonymity because he had signed a nondisclosure agreement.

Facebook executives say they are working diligently to rid the platform of dangerous posts.

“It’s not our place to correct people’s speech, but we do want to enforce our community standards on our platform,” said Sara Su, a senior engineer on the News Feed. “When you’re in our community, we want to make sure that we’re balancing freedom of expression and safety.”

Monika Bickert, Facebook’s head of global policy management, said that the primary goal was to prevent harm, and that to a great extent, the company had been successful. But perfection, she said, is not possible.

“We have billions of posts every day, we’re identifying more and more potential violations using our technical systems,” Ms. Bickert said. “At that scale, even if you’re 99 percent accurate, you’re going to have a lot of mistakes.”
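To put that figure in concrete terms (an illustrative calculation, not one from the article): if the platform's human and automated systems together made one billion enforcement decisions in a day, a 99 percent accuracy rate would still leave 1,000,000,000 × 0.01 = 10,000,000 mistaken calls that day.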

When is it support for terrorism? Is “martyr” a forbidden word? Moderators are given guides to help them decide.
