
evelyn douek

  • The war in Ukraine highlights the limits of Facebook’s oversight board

    March 16, 2022

    Weeks into the war in Ukraine, Facebook's parent company Meta is poised to tap its oversight board for guidance about a policy shift allowing users in Ukraine to post some calls for violence against Russian invaders. It would mark the first time the panel has formally weighed in on the tech giant's flurry of actions in response to the war, and it could shape its rules on violent rhetoric moving forward. ... The oversight board has effectively been unable to weigh in on any of the tech giant's massive policy shifts in recent weeks, and now that it will, it's on "a very small slice of what Facebook's doing," said Evelyn Douek, a lecturer at Harvard Law School.

  • Media law review raises thorny freedom of expression issues

    March 15, 2022

    Anjum Rahman knows more than most about the harmful effects of media content. Over the years, Rahman - an accountant by trade, who also founded the Inclusive Aotearoa Collective Tāhono and acts as spokesperson for the Islamic Women’s Council - has been a leading voice from the Muslim community speaking about the harms caused by online extremism. She’s received a fair deal of pushback for this, ranging from fairly civil to downright abusive. ... As Harvard Law School’s Evelyn Douek puts it, regulators should be wary of building content moderation policy on the assumption that techies at online media platforms can simply “nerd harder” and stop the spread of all harmful content, without any trade-offs, if they just put their mind to it.

  • Beware the Never-Ending Disinformation Emergency

    March 11, 2022

    “If you put up this whole interview,” Donald Trump said during a podcast livestream on Wednesday afternoon, “let’s see what happens when Instagram and Facebook and Twitter and all of them take it down.” Trump named the wrong platforms; the podcast, Full Send, a mildly Rogan-esque bro-fest, was streaming on YouTube. But otherwise his prediction made sense, because during the interview he reiterated his claim that he, not Joe Biden, was the rightful winner of the 2020 election. “The election fraud was massive,” he said during one of several riffs on the theme. “I call it ‘the crime of the century.’ We’re doing a book on it.” ... “It’s mostly been a one-way ratchet,” says Evelyn Douek, a doctoral candidate at Harvard Law School who studies content moderation. “It rarely goes back the other way; it always tightens and tightens. To date, we haven’t seen loosening at the end of periods of risk.”

  • Spotify, Joe Rogan and the Wild West of online audio

    February 7, 2022

    Neil Young was five years old when, in 1951, he was partially paralysed by polio. Joni Mitchell was nine when she was hospitalised by the same illness around the same time. Both grew up to become famous singers—and, lately, prominent campaigners against anti-vaccine misinformation. The two musicians, followed by a handful of others, have withdrawn their music from the world’s biggest streaming service in protest at a podcast that gave airtime to anti-vaxxers. ... “It’s always been baffling to me how podcasts have flown under the content-moderation radar,” says Evelyn Douek of Harvard Law School. “It’s a massive blind-spot.” It could also prove to be a pricey one. As audio platforms host more user-generated content, the moderation task will expand. It will probably involve lots of human moderators; automating the process with artificial intelligence, as Facebook and others are doing, is even harder for audio than it is for text, images or video. Software firms’ valuations “have long been driven by the notion that there’s no marginal cost”, says Mr Page. “Content moderation might be their first.”

  • Spotify’s Joe Rogan Mess

    February 4, 2022

    For Spotify, the last month has seen a cascade of controversies around its exclusive podcast, The Joe Rogan Experience. Is it time for the streaming service to rethink its role as a podcast publisher? And is it even possible to moderate podcast misinformation? Guest: Evelyn Douek, lecturer at Harvard Law School, and affiliate at the Berkman Klein Center for Internet & Society

  • What the Joe Rogan podcast controversy says about the online misinformation ecosystem

    January 21, 2022

    An open letter urging Spotify to crack down on COVID-19 misinformation has gained the signatures of more than a thousand doctors, scientists and health professionals spurred by growing concerns over anti-vaccine rhetoric on the audio app's hit podcast, The Joe Rogan Experience. ... "Wherever you have users generating content, you're going to have all of the same content moderation issues and controversies that you have in any other space," said Evelyn Douek, a research fellow at Columbia University's Knight First Amendment Institute.

  • Pregnancy apps have become a battleground of vaccine misinformation

    January 3, 2022

    For generations of parents, Heidi Murkoff’s 1984 pregnancy guide “What to Expect When You’re Expecting” has been a trusty companion, offering calm, scientifically informed advice for a nerve-wracking nine months. These days, of course, there’s an app for that: What to Expect’s “Pregnancy & Baby Tracker,” which offers personalized articles, videos, graphics of your baby’s development, and other features based on your due date. ... Any platform that allows users to interact or create content eventually will face questions about how to deal with offensive speech, said evelyn douek, a lecturer at Harvard Law School who researches online content moderation. Smaller sites tend to lack the resources to moderate users’ discussions as effectively as the large platforms, she added, and can become hubs of misinformation.

  • TikTok, Snap, YouTube to defend how they protect kids online in congressional hearing

    October 26, 2021

    TikTok, Snapchat and YouTube, all social media sites popular with teens and young adults, will answer to Congress on Tuesday about how well they protect kids online. It’s the first time testifying before the legislative body for both TikTok and Snap, the parent company of Snapchat, despite their popularity and Congress’s increasing focus on tech industry practices. By contrast, Facebook representatives have testified 30 times in the past four years, and Twitter executives including CEO Jack Dorsey have testified on Capitol Hill 18 times total. ... “Facebook is just not the only game in town,” said Harvard Law School lecturer Evelyn Douek, who studies the regulation of online speech. “If we’re going to talk about teen users, we should talk about the platforms that teens actually use. Which is TikTok, Snapchat and YouTube.”

  • YouTube, TikTok And Snap Go To Congress Tuesday. Here’s Why Their Fates May Vary.

    October 25, 2021

    Long before President Trump earned widespread bans across social media, Snap enacted one of the earliest measures to curb his reach, booting him from its Discover feed of curated content in June 2020. ... On YouTube, though, the former president faced a different outcome. Six days after the riot, the site “temporarily suspended” him and has since said it would allow him to return at some point. And he never gave TikTok the chance to punish him... Each app’s unique reaction to Trump reflects their disparate approaches to moderating their sites, as well as their different relationships with U.S. politics—dynamics that will go under the spotlight on Tuesday morning when the three companies appear in Congress. ... Extending the investigation beyond Facebook could indicate that politicians may finally renew efforts to draft new regulations for social media. “There’s real political momentum,” says Evelyn Douek, a Harvard Law School lecturer who studies online speech and misinformation. “And that inevitably means you have to go to the other platforms, particularly to where the teens are,” she says, making Snap, TikTok and YouTube “the natural choices.”

  • TikTok and Snapchat are testifying for the first time. Their peers are in the double-digits.

    October 21, 2021

    TikTok and Snapchat will testify before Congress next week for the first time, spokespeople for the companies confirmed Wednesday, as Senate lawmakers broaden their investigation into how social media platforms are affecting kids’ safety. Meanwhile, Facebook CEO Mark Zuckerberg is being called by the same panel to appear on Capitol Hill for what would be his eighth time in four years, and the 31st time for any Facebook executive in that same time span, spokesman Andy Stone confirmed. The disparity highlights how lawmakers’ oversight of Silicon Valley companies has fixated on a few major platforms, most notably Facebook. ... But some researchers such as Harvard Law School lecturer Evelyn Douek have argued that Congress’ myopic focus has made it all but turn a blind eye to several of the world’s most influential sites, including TikTok, Snapchat and YouTube, which will also testify next week. Douek, who has challenged lawmakers to summon YouTube CEO Susan Wojcicki in particular to testify for the first time, celebrated the lineup for the next hearing.

  • 
    Is it time to swipe left on social media?

    October 12, 2021

    Leaked revelations about Instagram’s impact on teens have united Republicans and Democrats in considering legal reforms, say Harvard Law School scholars.

  • 1 Billion TikTok Users Understand What Congress Doesn’t

    October 12, 2021

    An article by Evelyn Douek. Many people think of TikTok as a dance app. And although it is an app full of dancing, it’s also a juggernaut experiencing astronomical growth. In July, TikTok—a short-form video-sharing app powered by an uncannily good recommendation algorithm and owned by the Chinese company ByteDance—became the only social-media mobile app other than those from Facebook to ever pass 3 billion downloads. At the end of last month, TikTok announced it had more than 1 billion monthly users. It was, by some counts, the most downloaded app in 2020 and remained so into 2021. Not bad for an app launched only in 2016! Of the social-media platforms around today, TikTok is the likeliest to represent the future. Its user base is mostly young people. But if you look for TikTok in news coverage, you’re more likely to find it in the lifestyle, culture, or even food section than you are on the front page. It’s not that social-media platforms aren’t newsworthy—Facebook consistently dominates headlines. But TikTok is all too often regarded as an unserious thing to write or read about. That’s a mistake, and it’s one that Congress is making as well.

  • YouTube bans all anti-vaccine misinformation

    September 29, 2021

    YouTube said on Wednesday that it was banning the accounts of several prominent anti-vaccine activists from its platform, including those of Joseph Mercola and Robert F. Kennedy Jr., as part of an effort to remove all content that falsely claims that approved vaccines are dangerous. ...“One platform’s policies affect enforcement across all the others because of the way networks work across services,” said Evelyn Douek, a lecturer at Harvard Law School who focuses on online speech and misinformation. “YouTube is one of the most highly linked domains on Facebook, for example.”

  • Facebook targets harmful real networks, using playbook against fakes

    September 17, 2021

    Facebook (FB.O) is taking a more aggressive approach to shut down coordinated groups of real-user accounts engaging in certain harmful activities on its platform, using the same strategy its security teams take against campaigns using fake accounts, the company told Reuters. The new approach, reported here for the first time, uses the tactics usually taken by Facebook's security teams for wholesale shutdowns of networks engaged in influence operations that use false accounts to manipulate public debate, such as Russian troll farms. ... An expansion of Facebook's network disruption models to affect authentic accounts raises further questions about how changes might impact types of public debate, online movements and campaign tactics across the political spectrum. "A lot of the time problematic behavior will look very close to social movements," said Evelyn Douek, a Harvard Law lecturer who studies platform governance. "It's going to hinge on this definition of harm ... but obviously people's definitions of harm can be quite subjective and nebulous."

  • More Content Moderation Is Not Always Better

    June 2, 2021

    An op-ed by Evelyn Douek. Content moderation is eating the world. Platforms’ rule-sets are exploding, their services are peppered with labels, and tens of thousands of users are given the boot in regular fell swoops. No platform is immune from demands that it step in and impose guardrails on user-generated content. This trend is not new, but the unique circumstances of a global public health emergency and the pressure around the US 2020 election put it into overdrive. Now, as parts of the world start to emerge from the pandemic, and the internet’s troll in chief is relegated to a little-visited blog, the question is whether the past year has been the start of the tumble down the dreaded slippery content moderation slope or a state of exception that will come to an end. There will surely never be a return to the old days, when platforms such as Facebook and Twitter tried to wash their hands of the bulk of what happened on their sites with faith that internet users, as a global community, would magically govern themselves. But a slow and steady march toward a future where ever more problems are sought to be addressed by trying to erase content from the face of the internet is also a simplistic and ineffective approach to complicated issues.

  • Parler is back in Apple’s App Store, with a promise to crack down on hate speech

    May 18, 2021

    Parler, the conservative-friendly “free speech” social media app, is back in the Apple App Store. But like anything involving social media and free speech, its return is complicated. Beginning on Monday, Parler is available for download on iPhones and iPads. This comes around four months after Parler was banned or limited by Apple, Amazon, Google, and virtually every other major tech company for allowing some of its users to openly organize violence following the 2020 US election — namely at the January 6 US Capitol insurrection... “It’s going to be an interesting story to watch in a number of ways,” said Evelyn Douek, a lecturer at Harvard Law School who studies content moderation online. “There’s the story of what happens with Parler itself, whether it does get more serious about policing hate speech and violent content, and then what does it mean for Apple to get into the content moderation game.”

  • Facebook’s Oversight Board: an imperfect solution to a complex problem

    May 17, 2021

    For years, Facebook has grappled with the thorny question of who should be the ultimate “arbiter of truth” when it comes to moderating content on its near ubiquitous social media platform. Mark Zuckerberg, chief executive and ultimate decision maker, has agreed that it should not be him — semi-outsourcing the problem to a group of his own making, called the Facebook Oversight Board...It is currently funded — through a $130m trust — by Facebook. And it has binding authority over a very narrow type of case: whether a removed piece of content should be reinstated or an offensive post should come down, and whether users should remain banned. It only hears a handful of these a year. “It’s still looking at that very narrow slice of what content moderation is, namely how Facebook treats individual posts,” says Evelyn Douek, a lecturer at Harvard Law School. “It doesn’t look at things like groups, pages, down-ranking decisions, how the news feed works in terms of prioritisation, how Facebook treats entire accounts.”

  • Facebook Oversight Board upholds Donald Trump’s suspension from the platform

    May 11, 2021

    Facebook can keep blocking former President Donald Trump from using its platform but must revisit the decision within six months, the social network's court-like Oversight Board said Wednesday. The landmark decision affirmed the company's decision to issue the suspension after the January 6 US Capitol riots. The board said it concluded that Trump's posts on January 6, which praised the rioters, "severely violated" Facebook's policies and "created an environment where a serious risk of violence was possible." ... The decision to bar Trump from Facebook, if made permanent, could have vast implications, wrote Evelyn Douek, a researcher of online speech and platform moderation at Harvard Law School. "There is no greater question in content moderation right now than whether Trump's deplatforming represents the start of a new era in how companies police their platforms, or whether it will be merely an aberration," Douek wrote in January. "What one platform does can ripple across the internet, as other platforms draft in the wake of first movers and fall like dominoes in banning the same accounts or content. For all these reasons, the board's decision on Trump's case could affect far more than one Facebook page."

  • Somebody Has to Do It

    May 7, 2021

    An op-ed by Evelyn Douek. No one has set any clear standard about how badly a politician can break Facebook’s rules before getting kicked off the platform, and yesterday the company’s wannabe court missed a chance to fill the void. In a decision anticipated with the fervor that might attend a high-profile Supreme Court ruling, the Facebook oversight board told the platform that, while it might have been right to ban then-President Donald Trump on January 7 for his role in stoking the Capitol riot and because of the risk of continuing violence, the ongoing “indefinite” nature of the ban is not justified. The board gave Facebook six months to go back to the drawing board and work out what to do with Trump’s account now. But this is the exact question Facebook had asked the board to settle. The board respectfully declined. In fact, the board’s decision resolved essentially nothing—except that Facebook wasn’t exactly wrong on January 7—and leaves open the possibility that this whole charade will happen again before the year is out.

  • Facebook tried to outsource its decision about Trump. The Oversight Board said not so fast.

    May 7, 2021

    Facebook tried to pass the buck on former president Donald Trump, but the buck got passed right back. For several years, Facebook chief executive Mark Zuckerberg has pushed the idea that he and his company shouldn’t be in the position of creating the rules of the road to govern the personal expression of billions of people. He went so far as to dedicate $130 million to fund an independent panel of outside experts to which the company could outsource the thorniest decisions about what types of content — and voices — should be allowed to stay up on Facebook...But on Wednesday, the 20-member panel punted the decision back to Facebook, recommending the company decide within six months whether to permanently ban or restore Trump’s account. He is currently suspended “indefinitely,” a one-off penalty outside Facebook’s usual rules. The board, set up to act as a “Supreme Court-like” body to police Facebook’s content decisions, scolded the company for trying to pass it off, too...Others said it was the board refusing to do its job. “Their role is to constrain Facebook’s, and Mark Zuckerberg’s, discretion,” wrote Evelyn Douek, a Harvard Law School lecturer, in a Lawfare article Wednesday. “The [Facebook Oversight Board] has declined to do that almost entirely, and did not even provide meaningful parameters of the policies it calls on Facebook to develop.”

  • Trump Is Mark Zuckerberg’s Problem. Again.

    May 6, 2021

    Facebook’s Oversight Board on Wednesday upheld the social network’s temporary suspension of Donald Trump but declined to decide when, or whether, that ban should be lifted. The decision dashed the former president’s hopes for a swift reinstatement by a body charged with reviewing the platform’s content moderation practices. But it also sent a message that the scope of the board’s power is limited and that the ultimate responsibility for these questions still lies with Mark Zuckerberg and company...Evelyn Douek, a lecturer at Harvard Law School who has chronicled the board’s evolution, told me she’s seen evidence in its early decisions that “the board is chafing against the very limited remit that Facebook has given it so far.” She would like it to go farther in pushing for transparency. For instance, she said, it could call on Facebook to reveal the Trump ban’s impact on an internal metric that it calls “violence and incitement trends.” The board did take a small step in that direction on Wednesday. Among its policy recommendations, it called for Facebook to undertake a “comprehensive review of its potential contribution to the narrative of electoral fraud and the exacerbated tensions that culminated in the violence in the United States on January 6.”

  • It’s Not Over. The Oversight Board’s Trump Decision is Just the Start.

    May 6, 2021

    An article by Evelyn Douek. The long international nightmare is not over. By now, you will have read that on Wednesday the Facebook Oversight Board (FOB) upheld Facebook’s Jan. 7 restriction on former President Donald Trump’s account, largely on the basis of the ongoing violence at the time of the posts that led to the ban. But the FOB did not settle the matter once and for all: It punted the question of what to do with the account now back to Facebook. It dinged Facebook’s “indefinite” ban as a “vague, standardless penalty”—“indefinite,” according to the FOB, is very much not synonymous with “permanent.” Now, Facebook has six months to conduct a review of what to do with Trump’s account. The decision is meaty and educational. It contains a number of recommendations which, if Facebook follows them, will significantly improve the clarity and mitigate the arbitrariness of Facebook’s decision-making. It is also an attempt to split the baby—not letting Trump back on, but also not demanding a permanent ban of his account—and to avoid the inevitable controversy that would have attended any final decision.

  • All In with Chris Hayes, 5/4/21

    May 5, 2021

    The Republican Party appears to be signaling they want to essentially excommunicate from leadership Congresswoman Liz Cheney of Wyoming. Tennessee Republicans want to ban lessons on systemic racism in schools. President Joe Biden aims to vaccinate 70 percent of adults by July 4th. Tomorrow, Facebook’s oversight board is set to announce whether Donald Trump can return to the company’s platforms. Guests: London Lamar, Michael Lewis, Evelyn Douek, Adam Conner.

  • Facebook and Trump are at a turning point in their long, tortured relationship

    May 4, 2021

    On Jan. 6, as an angry mob stormed the U.S. Capitol, President Donald Trump posted on Facebook that his supporters should “remember this day forever.” “These are the things and events that happen when a sacred landslide election victory is so unceremoniously and viciously stripped away from great patriots who have been badly and unfairly treated for so long,” he said in a post. In response, Facebook did something it had resisted for years: banned Trump’s account indefinitely for inciting violence...The Oversight Board is evaluating the determination — which Facebook says was made during extenuating circumstances — at the company’s request. Facebook says the rulings of the independent, 20-member body are binding. The company does have a hand in picking board members, which include a Nobel laureate and a former Danish prime minister, and paying them through a separate trust. “This is just the start of an experiment, but it can’t be where it ends,” said Evelyn Douek, a lecturer on free speech issues at Harvard Law School. “In some sense, we are all playing Facebook’s game by taking the Board seriously as a legitimate institution. On the other hand, no one has a better alternative right now.”

  • As Indians Face A COVID-19 Crisis, Facebook Temporarily Hid Posts With #ResignModi

    April 29, 2021

    Facebook temporarily hid posts calling for the resignation of Indian Prime Minister Narendra Modi, marking the platform's latest foray in a series of controversial decisions affecting free speech in a country experiencing a full-blown COVID-19 crisis. On Wednesday, the world’s largest social network said that posts with the hashtag or text #ResignModi “are temporarily hidden here” because “some content in those posts goes against our Community Standards.” Because the posts were hidden, it’s unclear what content violated the rules of a company whose executives have often expressed a commitment to open expression... “In the context of a highly politicized environment and an ongoing emergency, it’s very concerning that Facebook isn’t being more transparent about this and is not commenting,” said evelyn douek, a lecturer at Harvard Law School. “This appears to be core political speech at a very critical time.”

  • On Social Media, American-Style Free Speech Is Dead

    April 27, 2021

    American social media platforms have long sought to present themselves as venues for unfettered free expression...The pandemic made a mockery of that idea...Evelyn Douek is a doctoral student at Harvard Law School and an affiliate at the Berkman Klein Center for Internet and Society. As an Australian scholar, she brings an international perspective to free speech questions. In a new article published by the Columbia Law Review, she argues that the pandemic exposed the hollowness of social media platforms’ claims to American-style free speech absolutism. It’s time, she writes, to recognize that “the First Amendment–inflected approach to online speech governance that dominated the early internet no longer holds. Instead, platforms are now firmly in the business of balancing societal interests.” In a conversation last week, Douek explained why the shift to a more “proportional” approach to online speech is a good thing, why platforms must provide more transparency into their moderation systems, and the perils of confusing onions for boobs. The interview has been condensed and edited for clarity.

  • What Facebook Did for Chauvin’s Trial Should Happen All the Time

    April 21, 2021

    An op-ed by Evelyn Douek. On Monday, Facebook vowed that its staff was “working around the clock” to identify and restrict posts that could lead to unrest or violence after a verdict was announced in the murder trial of the former Minneapolis police officer Derek Chauvin. In a blog post, the company promised to remove “content that praises, celebrates or mocks” the death of George Floyd. Most of the company’s statement amounted to pinky-swearing to really, really enforce its existing community standards, which have long prohibited bullying, hate speech, and incitements to violence. Buried in the post was something less humdrum, though: “As we have done in emergency situations in the past,” declared Monika Bickert, the company’s vice president of content policy, “we may also limit the spread of content that our systems predict is likely to violate our Community Standards in the areas of hate speech, graphic violence, and violence and incitement.” Translation: Facebook might turn down the dial on toxic content for a little while. Which raises some questions: Facebook has a toxic-content dial? If so, which level is it set at on a typical day? On a scale of one to 10, is the toxicity level usually a five—or does it go all the way up to 11?

  • Apple to Reinstate Parler, the App at Center of Online-Speech Debate

    April 20, 2021

    Apple Inc. plans to make the social-media app Parler available through its App Store again, the computer and smartphone company said in a letter to lawmakers on Monday. Apple removed Parler from its app store in January, citing objectionable content. In a letter to Sen. Mike Lee of Utah and Rep. Ken Buck of Colorado, both Republicans, Apple said Monday that a revised version of the Parler app with improved content moderation would be approved for release to Apple users...Apple had previously denied an earlier attempt by Parler to seek reinstatement. Evelyn Douek, a Harvard Law School lecturer who studies content moderation, said that tech platforms, including Apple, need to provide clearer guidelines as to what content is acceptable. “If Apple wants to get into the game of playing gatekeeper on the basis of content, it should be a lot more transparent about its requirements,” Ms. Douek said.

  • Trump faces a narrow path to victory against Facebook suspension

    April 12, 2021

    If former President Donald Trump manages to get back on Facebook and Instagram this month, his win will rest on a series of close calls. Facebook’s oversight board is expected to rule in the coming weeks on whether to uphold or overturn Trump’s indefinite suspension from the platforms, which the company imposed after the Jan. 6 Capitol riots over fears he might incite further violence...The oversight board’s decisions so far would seem to offer favorable omens for Trump: It has ruled against Facebook and ordered content restored in almost every case it has reviewed since its launch before the 2020 U.S. elections...The early rulings showed that the board values free expression “very highly,” said Evelyn Douek, a lecturer at Harvard Law School who has closely followed the oversight board’s work. “They put a lot of weight on the importance of voice and the importance of free expression and free speech and they really put the onus on Facebook to heavily justify any restrictions that they wanted,” she said.

  • If Mark Zuckerberg won’t fix Facebook’s algorithms problem, who will?

    March 29, 2021

    All eyes are on Facebook’s oversight board, which is expected to decide in the next few weeks if former President Donald Trump will be allowed back on Facebook. But some critics — and at least one member — of the independent decision-making group say the board has more important responsibilities than individual content moderation decisions like banning Trump. They want it to have oversight over Facebook’s core design and algorithms...When it comes to Facebook’s fundamental design and the content it prioritizes and promotes to users, all the board can do right now is make recommendations. Some say that’s a problem. “The jurisdiction that Facebook has currently given it is way too narrow,” Evelyn Douek, a lecturer at Harvard Law School who analyzes social media content moderation policies, told Recode. “If it’s going to have any meaningful impact at all and actually do any good, [the oversight board] needs to have a much broader remit and be able to look at the design of the platform and a bunch of those systems behind what leads to the individual pieces of content in question.”

  • Tech CEOs testify: Same old, or whole new world?

    March 26, 2021

    We’ve seen it all before: A congressional panel is hauling the CEOs of some of the world’s most influential tech companies to answer for their purported misdeeds. But House Energy and Commerce leaders say this time, things will be different: They are serious about legislating around issues of online extremism, including by targeting tech’s liability shield, Section 230...Roughly one month out from the Facebook Oversight Board’s ruling on whether to let Trump back onto the platform, a wider battle is already swirling around the group of outside experts: Can they create a global standard for free speech on the world’s largest social network? ... “The Board is applying international human rights law to Facebook as if it was a country. That’s impossible,” Evelyn Douek, an online content expert at Harvard, told Mark. “It’s the first body that’s using international human rights law to make content decisions. Now that we’re getting down to brass tacks, it’s difficult.”

  • The Technology 202: Where is YouTube CEO Susan Wojcicki?

    March 24, 2021

    YouTube videos are a critical source of online misinformation, yet they often get a pass in broader discussions about the dangers of social media. Even in Congress. YouTube CEO Susan Wojcicki has never had to appear alongside other social media executives for a Capitol Hill grilling, and she will not be in attendance on Thursday when Congress questions top tech executives for the first time since the Jan. 6 Capitol attacks... “There have been hearings where you can’t count on one hand the number of questions about YouTube, which is ridiculous given the level of impact,” said Evelyn Douek, a lecturer at Harvard Law School who researches online speech...YouTube has massive influence over Americans' media consumption. YouTube has the highest reach of any platform among American adults, with 73 percent of Americans reporting they use the platform in a 2019 Pew Research Center survey. Facebook is the only social network that comes even close to YouTube's reach, with 69 percent of Americans reporting they use it...That's problematic because YouTube faces unique challenges in detecting and removing disinformation and extremism from its platform. It's technically more challenging and time-consuming to comb through videos than to simply search for terms in text. “I do not understand why she hasn’t been called,” Douek said. “There is no question that Jack Dorsey should have to answer that she shouldn’t.”

  • Amazon Is Pushing Readers Down A “Rabbit Hole” Of Conspiracy Theories About The Coronavirus

    March 16, 2021

    Conspiracy theorist David Icke’s lies about COVID-19 caused Facebook, Twitter, YouTube, and Spotify to ban him. But on Amazon, Icke, who believes in the existence of lizard people, is recommended reading. ... Unlike other platforms, Amazon has not taken steps to remove COVID-19 misinformation entirely, or at least from its recommendation systems. Amazon’s approach means it’s profiting from sales of the conspiracy theory books, said evelyn douek, a lecturer at Harvard Law School who studies global regulation of online speech. “There's a strong argument that if you're making money off it, you should take more responsibility,” said douek.

  • Why Facebook’s Temporary News Ban in Australia Didn’t Go Far Enough

    February 26, 2021

    An amazing thing happened last week when Facebook banned links to news articles in Australia. The Australian Broadcasting Corporation’s long-overlooked news app became the country’s hottest. The ABC app jumped from around 1,000 daily downloads to more than 15,000 in a day last week, according to mobile intelligence firm Apptopia. And by the time anyone looked up, it occupied the top spot on the country’s iOS and Google Play app stores. Facebook enacted its ban to protest an Australian law that would make the company pay news publishers. But instead of crushing ABC, the ban set it free...There are parts of the world that primarily get their news from Facebook, but it’s unlikely they’ll simply give up on news if it disappears from the platform. And though the dearth of news coverage did lead to spikes in engagement for politicians on Facebook in Australia — presumably from people seeking to fill the void — an absence of news links over time would change people’s behavior on Facebook. Scrolling for something to get mad about may well give way to curiosity about friends and family. Many in Australia have indeed been thrilled about their news-free news feeds. When Harvard lecturer Evelyn Douek asked her Australian friends what they thought, they could barely contain their enthusiasm. “It’s great actually,” wrote one. “Peaceful,” wrote another. “Who in the world would rely on Facebook for news???” wrote one more. News publishers, for one, still do rely on Facebook for news. And Facebook still relies on them.

  • India Has Its Own Alternative To Twitter. It’s Filled With Hate.

    February 24, 2021

    In early February, politicians from India’s ruling Bharatiya Janata Party started signing up for a social network that almost nobody had heard of...The timing wasn’t coincidental. For days, India’s government had been locked in a fierce tug-of-war with Twitter, which defied a legal order to block accounts critical of India’s Hindu nationalist government, including those belonging to journalists and an investigative news magazine. In response, India’s IT ministry threatened to send Twitter officials to jail. Amid the standoff, government officials promoted Koo as a nationalist alternative, free from American influence...As the global internet splinters, and mainstream platforms like Facebook and Twitter square off against nation states and fitfully crack down on hate speech, nationalist alternatives are springing up to host it, something that experts say is a growing trend. “This content wants to find new homes,” evelyn douek, a lecturer at Harvard Law School who studies global regulation of online speech, told BuzzFeed News. Hate speech, disinformation, harassment, and incitement that mainstream platforms have been grappling with for years are particularly problematic on platforms like Koo, she said, because those sites come under less scrutiny. “These problems come to every platform in the end,” douek said, “but with the proliferation of these alternatives, there’s likely to be far less attention and pressure on them. It also creates the possibility that there will be a global internet that has one kind of discourse, and completely alternative conversations happening on national platforms in parallel.”

  • The Internet Is Splintering

    February 18, 2021

    Each country has its own car safety regulations and tax codes. But should every country also decide its own bounds for appropriate online expression? If you have a quick answer, let me ask you to think again. We probably don’t want internet companies deciding on the freedoms of billions of people, but we may not want governments to have unquestioned authority, either...Evelyn Douek, a lecturer at Harvard Law School, told me that even when countries like Germany pass laws about online speech, it’s still the responsibility of internet companies to interpret whether millions of posts are on the right side of the law. That goes for the United States, too, where companies are largely left to decide their own bounds of acceptable online expression. Countries and international bodies should “do more to establish more clear guard rails and processes for internet platforms,” Douek said, but “they’re never going to take decision making out of these platforms.”

  • Inside the Making of Facebook’s Supreme Court

    February 12, 2021

    On a morning in May, 2019, forty-three lawyers, academics, and media experts gathered in the windowless basement of the NoMad New York hotel for a private meeting...Since its founding, in 2004, Facebook had modelled itself as a haven of free expression on the Internet. But in the past few years, as conspiracy theories, hate speech, and disinformation have spread on the platform, critics have come to worry that the company poses a danger to democracy. Facebook promised to change that with the Oversight Board...The idea for the Oversight Board came from Noah Feldman, a fifty-year-old professor at Harvard Law School, who has written a biography of James Madison and helped draft the interim Iraqi constitution. In 2018, Feldman was staying with his college friend Sheryl Sandberg, the chief operating officer of Facebook, at her home in Menlo Park, California. One day, Feldman was riding a bike in the neighboring hills when, he said, “it suddenly hit me: Facebook needs a Supreme Court.” ... Currently, users can appeal cases in which Facebook has removed a post, called “take-downs,” but not those in which it has left one up, or “keep-ups.” The problem is that many of Facebook’s most pressing issues—conspiracy theories, disinformation, hate speech—involve keep-ups...“This is a big change from what you promised,” Evelyn Douek, a Harvard graduate student who consulted with the team, fumed, during one meeting. “This is the opposite of what was promised.” Users also currently can’t appeal cases on such issues as political advertising, the company’s algorithms, or the deplatforming of users or group pages. The board can take cases on these matters, including keep-ups, only if they are referred by Facebook, a system that, Douek told me, “stacks the deck” in Facebook’s favor.

  • Why Is Big Tech Policing Free Speech? Because the Government Isn’t

    January 26, 2021

    In the months leading up to the November election, the social media platform Parler attracted millions of new users by promising something competitors, increasingly, did not: unfettered free speech...The giants of social media — Facebook, Twitter, YouTube, Instagram — had more stringent rules. And while they still amplified huge amounts of far-right content, they had started using warning labels and deletions to clamp down on misinformation about Covid-19 and false claims of electoral fraud, including in posts by President Trump...Why, for example, hasn’t Facebook suspended the accounts of other leaders who have used the platform to spread lies and bolster their power, like the president of the Philippines, Rodrigo Duterte? A spokesman said suspending Trump was “a response to a specific situation based on risk” — but so is every decision, and the risks can be just as high overseas. “It’s really media and public pressure that is the difference between Trump coming down and Duterte staying up,” says Evelyn Douek, a lecturer at Harvard Law School. “But the winds of public opinion are a terrible basis for free-speech decisions! Maybe it seems like it’s working right now. But in the longer run, how do you think unpopular dissidents and minorities will fare?” ... “I’m afraid that the technology has upended the possibility of a well-functioning, responsible speech environment,” the Harvard law professor Jack Goldsmith says. “It used to be we had masses of speech in a reasonable range, and some extreme speech we could tolerate. Now we have a lot more extreme speech coming from lots of outlets and mouthpieces, and it’s more injurious and harder to regulate.”

  • Trump Wants Back on Facebook. This Star-Studded Jury Might Let Him.

    January 25, 2021

    They meet mostly on Zoom, but I prefer to picture the members of this court, or council, or whatever it is, wearing reflective suits and hovering via hologram around a glowing table. The members include two people who were reportedly on presidential shortlists for the U.S. Supreme Court, along with a Yemeni Nobel Peace Prize laureate, a British Pulitzer winner, Colombia’s leading human rights lawyer and a former prime minister of Denmark. The 20 of them come, in all, from 18 countries on six continents, and speak 27 languages among them. This is the Oversight Board, a hitherto obscure body that will, over the next 87 days, rule on one of the most important questions in the world: Should Donald J. Trump be permitted to return to Facebook and reconnect with his millions of followers? ... The board will seriously examine the Trump question, guided by Facebook’s own rules as well as international human rights law. If Facebook accepts its rulings, as it has pledged to do, as well as the board’s broader guidance, the company will endow this obscure panel with a new kind of legitimacy. “Either it’s nothing, or it’s the New World Order,” said a lecturer at Harvard Law School who studies content moderation, Evelyn Douek, who pushed Facebook to send the Trump case to the Oversight Board...Noah Feldman, the Felix Frankfurter Professor of Law at Harvard Law School, who first brought the notion of a Facebook Supreme Court to the company, said he thought conservatives dismayed by the recent crackdown might be surprised to find an ally in this new international institution. “They may come to realize that the Oversight Board is more responsive to freedom of expression concerns than any platform can be, given real world politics,” he said.

  • Judge Refuses To Reinstate Parler After Amazon Shut It Down

    January 22, 2021

    A federal judge has refused to restore the social media site Parler after Amazon kicked the company off of its Web-hosting services over content seen as inciting violence. The decision is a blow to Parler, an upstart that has won over Trump loyalists for its relatively hands-off approach to moderating content. The company sued Amazon over its ban, demanding reinstatement. U.S. District Judge Barbara Rothstein sided with Amazon, which argued that Parler would not take down posts threatening public safety even in the wake of the attack on the U.S. Capitol and that it is within Amazon's rights to punish the company over its refusal...Evelyn Douek, a lecturer at Harvard Law School, predicts more battles over online speech will erupt between sites that choose a hands-off approach and Web hosts that demand a more aggressive stance. And that troubles her. "Is that the right place for content moderation to be occurring?" Douek asked. "It's harder to bring accountability to those choices when we don't even know who's making them or why they're being made." In other words, when a Web host has a problem with content on a client's site, usually these discussions are hashed out between the two parties, far from the public light. And Web hosts, unlike social media platforms, are not used to explaining these decisions publicly.

  • Facebook outsources its decision to ban Trump to oversight board

    January 22, 2021

    Facebook said it would refer its decision to indefinitely suspend former president Donald Trump’s Facebook and Instagram accounts to the oversight board, an independent Facebook-funded body composed of international experts that has the power to review — and potentially overturn — the company’s content decisions. The referral would amount to the first major case for the board, which took the first spate of cases it would consider in December. It will also be a key test for a model of external governance for Silicon Valley’s decision-making, one that is being closely watched by experts and policymakers around the world...Facebook said it would continue to suspend Trump’s accounts pending the board’s decision. The social network also asked the board to review its policies on world leaders, whose speech Facebook and other social media companies consider to be newsworthy or in the public interest, and therefore they are given more latitude to make inflammatory comments than everyday users. “This is clearly the right decision,” said Evelyn Douek, a lecturer at Harvard Law School who had argued, along with other academics, that the oversight board should take up the case. “Had Facebook failed to send the case to the board, it would have suggested that even it wasn’t taking its oversight board seriously. Just waiting to see how [Zuckerberg] happened to feel about whether the Trump suspension should be made permanent is not a good model for how these kinds of important decisions should get made.”

  • Inside Twitter’s Decision to Cut Off Trump

    January 19, 2021

    Jack Dorsey, Twitter’s chief executive, was working remotely on a private island in French Polynesia frequented by celebrities escaping the paparazzi when a phone call interrupted him on Jan. 6. On the line was Vijaya Gadde, Twitter’s top lawyer and safety expert, with an update from the real world. She said she and other company executives had decided to lock President Trump’s account, temporarily, to prevent him from posting statements that might provoke more violence after a mob stormed the U.S. Capitol that day. Mr. Dorsey was concerned about the move, said two people with knowledge of the call...But he had delegated moderation decisions to Ms. Gadde, 46, and usually deferred to her — and he did so again...Since Mr. Trump was barred, many of Mr. Dorsey’s concerns about the move have been realized. Twitter has been embroiled in a furious debate over tech power and the companies’ lack of accountability. Lawmakers such as Representative Devin Nunes, a Republican from California, have railed against Twitter, while Silicon Valley venture capitalists, First Amendment scholars and the American Civil Liberties Union have also criticized the company. At the same time, activists around the world have accused Twitter of following a double standard by cutting off Mr. Trump but not autocrats elsewhere who use the platform to bully opponents. “This is a phenomenal exercise of power to de-platform the president of the United States,” said Evelyn Douek, a lecturer at Harvard Law School who focuses on online speech. “It should set off a broader reckoning.”

  • The Moderation War Is Coming to Spotify, Substack, and Clubhouse

    January 15, 2021

    Glenn Greenwald was pissed. The Columbia Journalism Review had just asked whether Substack should remove the writer Andrew Sullivan from its service. And having recently joined the email newsletter platform himself, Greenwald attacked. “It was only a matter of time before people started demanding Substack be censored,” he said, taking it a step further than the CJR. Last October, Greenwald left The Intercept, a publication he founded, claiming the publication’s editors, who previously hadn’t touched his work, “censored” him ahead of the 2020 election. So he moved to Substack, which advertises itself as a home for independent writing...Substack, Spotify, and Clubhouse’s current perspective on content moderation mirror how Twitter, Facebook, and Google once viewed the practice. Twitter executives initially called themselves “the free speech wing of the free speech party.” Facebook insisted it had no business touching political content. YouTube allowed Alex Jones and other wingnuts to build misinformation empires on its service. Now, Substack CEO Chris Best — reflecting the smaller platforms’ attitude on moderation — told CJR that if you’re looking for him to take an “editorial position” you should find another service. After initially resisting aggressive content moderation (aside from no-brainers like child porn), the bigger platforms have slowly relented. “They are agreeing that they need to moderate more aggressively and in more ways than they used to,” Evelyn Douek, a Harvard Law School lecturer who studies content moderation, told OneZero. And if past is prologue, their path to the current state is worth revisiting.

  • The Lawfare Podcast: Jonathan Zittrain on the Great Deplatforming

    January 14, 2021

    Yesterday, January 13, the House of Representatives impeached President Trump a second time for encouraging the violent riot in the Capitol Building on January 6. And yet, the impeachment is probably less of a crushing blow to the president than something else that’s happened in recent days: the loss of his Twitter account. After a few very eventful weeks, Lawfare's Arbiters of Truth series on disinformation is back. Evelyn Douek and Quinta Jurecic spoke with Jonathan Zittrain, the George Bemis Professor of International Law at Harvard Law School, about the decision by Twitter, Facebook and a whole host of other platforms to ban the president in the wake of the Capitol riot. Jonathan, Evelyn and Quinta take a step back and situate what’s happening within the broader story of internet governance. They talked about how to understand the bans in the context of the internet’s now not-so-brief history, how platforms make these decisions and, of course, Section 230 of the Communications Decency Act. Listeners might also be interested in Zittrain's February 2020 Tanner Lecture, "Between Suffocation and Abdication: Three Eras of Governing Digital Platforms," which touches on some of the same ideas discussed in the podcast.

  • YouTube suspends Trump, days after Twitter and Facebook

    January 14, 2021

    YouTube suspended President Trump from uploading new videos to his official account for at least a week, making the decision days after fellow social media giants Twitter and Facebook shut the president out of his accounts because of concerns his posts will incite violence. The Google-owned video site was the last of the major social media networks to suspend Trump after the attack on the U.S. Capitol...The resistance to removing videos completely has helped allow YouTube to fly under the radar as other social media sites take the heat for allowing misinformation to proliferate on their sites, said Harvard Law School lecturer Evelyn Douek...Researchers tend to focus on text-based Twitter and Facebook, Douek said, because video can be more time consuming and labor intensive to sift through. That doesn’t mean there is less misinformation floating around on YouTube, and, in fact, the company has been accused of allowing people to become radicalized on the site by promoting conspiracy theory videos, she added. YouTube’s policy of laying out rules and using a strike system to enforce them is better than ad hoc decision-making by executives, Douek said. “My view of content moderation is companies should have really clear rules they set out in advance and stick to, regardless of political or public pressure,” she said. “We don’t just want these platforms to be operating as content cartels and moving in lockstep and doing what everyone else is doing.”

  • Blocking the president

    January 13, 2021

    Harvard Law experts Yochai Benkler and evelyn douek weigh in on the suspension of President Trump’s social media accounts and potential First Amendment implications.

  • Parler, a Platform Favored by Trump Fans, Struggles for Survival

    January 12, 2021

    Parler launched in 2018 as a freewheeling social-media site for users fed up with the rules on Facebook and Twitter, and it quickly won fans from supporters of President Trump. On Monday, it went dark, felled by blowback over its more permissive approach. Amazon.com Inc. abruptly ended web-hosting services to the company, effectively halting its operations, prompting Parler to sue Amazon in Seattle federal court. Other tech partners also acted, crippling operations. Driving the decision was last week’s mob attack on the U.S. Capitol...Amazon explained its decision to shut off services to Parler by citing 98 instances of violent content on the platform that it said violated its rules. Parler said it removed all 98 items, in some instances before Amazon reported them. Determining whether discourse on Parler or the platform’s moderation of it was fundamentally worse than on other forums is “kind of an impossible question, empirically and philosophically,” said Evelyn Douek, a Harvard Law School lecturer who studies content moderation. While a case can be made for app stores and other internet infrastructure providers taking action against platforms that don’t take moderation requirements seriously, Ms. Douek said, the speed of tech companies’ action against Parler raised questions. “If 98 is the threshold, has AWS looked at the rest of the internet?” she said. “Or at Amazon?”