By David Talbot and Nikki Bourassa via Medium
Violence at a white nationalist rally in Charlottesville, VA, and recent revelations about the spread of disinformation and divisive messages on social media platforms have increased interest in how these platforms set content policies and then detect and remove material deemed to violate those rules. In the United States, this process is not subject to any regulation or disclosure requirements.
On Sept. 19, the Berkman Klein Center for Internet & Society hosted a public lunch talk with Monika Bickert, the Head of Global Policy Management at Facebook. The public event was followed by a meeting at which members of the Berkman Klein Center community explored broader research questions related to the challenges of keeping tabs on the daily social media interactions of hundreds of millions of people, including whether and how that should be done.
The day was hosted by the Center’s Harmful Speech Online Project. Questions surrounding the algorithmic management of online content, and how those processes impact media and information quality, are also a core focus of the Center’s Ethics and Governance of AI Initiative.
During the public lunch talk, Jonathan Zittrain, professor of law and computer science at Harvard University and faculty director of the Berkman Klein Center, asked Bickert about Facebook policies today and how they might be designed in the future.
Bickert said 2 billion people use the site, and 1.3 billion use it every day. Eighty-five percent of users live outside the United States and converse in “dozens and dozens of languages,” she said. (Facebook later provided a specific number: 40.) Particularly large user communities thrive in India, Turkey, and Indonesia, she added.
The “policy” in Bickert’s title refers to Facebook’s policies or rules defining what material Facebook prohibits (including hate speech, certain kinds of graphically violent images, and terrorist content). The exact means by which the company judges content have not been made publicly available, although some internal training documents detailing past policies were leaked to The Guardian. The company has issued Community Standards that broadly describe its process, and Bickert responded to the leak with a public editorial of her own.
If a user or an automated system flags content as violating policies, the content is sent to a human reviewer for a final decision. Facebook says it is in the process of hiring 3,000 new reviewers, which will bring the total number of content reviewers to 7,500. These employees judge flagged posts by reviewing them against Facebook’s internal policies.
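For readers who want a concrete picture of that flag-then-review flow, the sketch below models it as a simple queue in Python. It is an illustration only; the class and function names, the queue-based design, and the stand-in reviewer are assumptions made for this post, not details Facebook has disclosed.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Callable, Optional
import queue

# Hypothetical verdict labels; Facebook's actual decision categories are not public in full.
class Verdict(Enum):
    KEEP = "keep"
    REMOVE = "remove"

@dataclass
class FlaggedPost:
    post_id: str
    text: str
    flagged_by: str               # "user" or "automated"
    verdict: Optional[Verdict] = None

class ReviewPipeline:
    """Minimal sketch: both user reports and automated flags feed one review queue."""

    def __init__(self) -> None:
        self._queue: "queue.Queue[FlaggedPost]" = queue.Queue()

    def flag(self, post: FlaggedPost) -> None:
        # User reports and automated classifiers enqueue content the same way.
        self._queue.put(post)

    def review_next(self, reviewer: Callable[[FlaggedPost], Verdict]) -> Optional[FlaggedPost]:
        # A human reviewer applies the internal policies and makes the final call.
        if self._queue.empty():
            return None
        post = self._queue.get()
        post.verdict = reviewer(post)
        return post

if __name__ == "__main__":
    pipeline = ReviewPipeline()
    pipeline.flag(FlaggedPost("p1", "some reported text", flagged_by="user"))
    # A toy stand-in for a trained reviewer: remove posts containing a banned phrase.
    decided = pipeline.review_next(
        lambda p: Verdict.REMOVE if "banned phrase" in p.text else Verdict.KEEP
    )
    print(decided)
```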
Bickert called the policies reported by The Guardian a “snapshot in time” because “our policies are always changing.” She added: “We have a mini ‘legislative session’… every two weeks where we discuss proposed policy changes” with legal, engineering, and company public policy team members present.
“I’ve been in this role just over four years and I would say that in general [Facebook’s policies] have gotten more and more restrictive and that’s true not just at Facebook but for all the large social media companies,” she said.
Zittrain asked: “Given Facebook’s primacy at the moment — it’s a big social network — does it seem right that decisions like this should repose with Facebook in its discretion — it’s a private business; it responds to market; it’s got its policies — or is there some other source that would be almost a relief? It’s like ‘You know what, world? You set the standards, just tell us what to do, dammit, we’ll do it.’”
Bickert stopped short of endorsing an imposition of external standards. But she said the company does reach out to get opinions from experts outside the company, including from a safety advisory board, as the company revises policies and even makes decisions on particular pieces of content. “We are always looking at how we can do this in a more transparent way,” she said.
“In the area of terrorism for instance … we have a group of academics around the world; we have on our team some people who were in the world of counterterrorism,” she said. “It’s very much a conversation with people in the community as we make these decisions.”
This post originally appeared on Medium on October 19, 2017.