People
evelyn douek
-
Weeks into the war in Ukraine, Facebook's parent company Meta is poised to tap its oversight board for guidance about a policy shift allowing users in Ukraine to post some calls for violence against Russian invaders. It would mark the first time the panel has formally weighed in on the tech giant's flurry of actions in response to the war, and it could shape the company's rules on violent rhetoric moving forward. ... The oversight board has effectively been unable to weigh in on any of the tech giant's massive policy shifts in recent weeks, and now that it will, it's on "a very small slice of what Facebook's doing," said Evelyn Douek, a lecturer at Harvard Law School.
-
Media law review raises thorny freedom of expression issues
March 15, 2022
Anjum Rahman knows more than most about the harmful effects of media content. Over the years, Rahman - an accountant by trade, who also founded the Inclusive Aotearoa Collective Tāhono and acts as spokesperson for the Islamic Women’s Council - has been a leading voice from the Muslim community speaking about the harms caused by online extremism. She’s received a good deal of pushback for this, ranging from fairly civil to downright abusive. ... As Harvard Law School’s Evelyn Douek puts it, regulators should be wary of building content moderation policy on the assumption that techies at online media platforms can simply “nerd harder” and stop the spread of all harmful content, without any trade-offs, if they just put their mind to it.
-
Beware the Never-Ending Disinformation Emergency
March 11, 2022
“If you put up this whole interview,” Donald Trump said during a podcast livestream on Wednesday afternoon, “let’s see what happens when Instagram and Facebook and Twitter and all of them take it down.” Trump named the wrong platforms; the podcast, Full Send, a mildly Rogan-esque bro-fest, was streaming on YouTube. But otherwise his prediction made sense, because during the interview he reiterated his claim that he, not Joe Biden, was the rightful winner of the 2020 election. “The election fraud was massive,” he said during one of several riffs on the theme. “I call it ‘the crime of the century.’ We’re doing a book on it.” ... “It’s mostly been a one-way ratchet,” says Evelyn Douek, a doctoral candidate at Harvard Law School who studies content moderation. “It rarely goes back the other way; it always tightens and tightens. To date, we haven’t seen loosening at the end of periods of risk.”
-
Spotify, Joe Rogan and the Wild West of online audio
February 7, 2022
Neil Young was five years old when, in 1951, he was partially paralysed by polio. Joni Mitchell was nine when she was hospitalised by the same illness around the same time. Both grew up to become famous singers—and, lately, prominent campaigners against anti-vaccine misinformation. The two musicians, followed by a handful of others, have withdrawn their music from the world’s biggest streaming service in protest at a podcast that gave airtime to anti-vaxxers. ... “It’s always been baffling to me how podcasts have flown under the content-moderation radar,” says Evelyn Douek of Harvard Law School. “It’s a massive blind-spot.” It could also prove to be a pricey one. As audio platforms host more user-generated content, the moderation task will expand. It will probably involve lots of human moderators; automating the process with artificial intelligence, as Facebook and others are doing, is even harder for audio than it is for text, images or video. Software firms’ valuations “have long been driven by the notion that there’s no marginal cost”, says Mr Page. “Content moderation might be their first.”
-
Spotify’s Joe Rogan Mess
February 4, 2022
For Spotify, the last month has seen a cascade of controversies around its exclusive podcast, The Joe Rogan Experience. Is it time for the streaming service to rethink its role as a podcast publisher? And is it even possible to moderate podcast misinformation? Guest: Evelyn Douek, lecturer at Harvard Law School, and affiliate at the Berkman Klein Center for Internet & Society
-
What the Joe Rogan podcast controversy says about the online misinformation ecosystem
January 21, 2022
An open letter urging Spotify to crack down on COVID-19 misinformation has gained the signatures of more than a thousand doctors, scientists and health professionals spurred by growing concerns over anti-vaccine rhetoric on the audio app's hit podcast, The Joe Rogan Experience. ... "Wherever you have users generating content, you're going to have all of the same content moderation issues and controversies that you have in any other space," said Evelyn Douek, a research fellow at Columbia University's Knight First Amendment Institute.
-
For generations of parents, Heidi Murkoff’s 1984 pregnancy guide “What to Expect When You’re Expecting” has been a trusty companion, offering calm, scientifically informed advice for a nerve-wracking nine months. These days, of course, there’s an app for that: What to Expect’s “Pregnancy & Baby Tracker,” which offers personalized articles, videos, graphics of your baby’s development, and other features based on your due date. ... Any platform that allows users to interact or create content eventually will face questions about how to deal with offensive speech, said evelyn douek, a lecturer at Harvard Law School who researches online content moderation. Smaller sites tend to lack the resources to moderate users’ discussions as effectively as the large platforms, she added, and can become hubs of misinformation.
-
TikTok, Snap, YouTube to defend how they protect kids online in congressional hearing
October 26, 2021
TikTok, Snapchat and YouTube, all social media sites popular with teens and young adults, will answer to Congress on Tuesday about how well they protect kids online. Neither TikTok nor Snap, the parent company of Snapchat, has testified before the legislative body until now, despite their popularity and Congress’s increasing focus on tech industry practices. By contrast, Facebook representatives have testified 30 times in the past four years, and Twitter executives including CEO Jack Dorsey have testified on Capitol Hill 18 times total. ... “Facebook is just not the only game in town,” said Harvard Law School lecturer Evelyn Douek, who studies the regulation of online speech. “If we’re going to talk about teen users, we should talk about the platforms that teens actually use. Which is TikTok, Snapchat and YouTube.”
-
Long before President Trump earned widespread bans across social media, Snap enacted one of the earliest measures to curb his reach, booting him from its Discover feed of curated content in June 2020. ... On YouTube, though, the former president faced a different outcome. Six days after the riot, the site “temporarily suspended” him and has since said it would allow him to return at some point. And he never gave TikTok the chance to punish him... The apps’ distinct reactions to Trump reflect their disparate approaches to moderating their sites, as well as their different relationships with U.S. politics—dynamics that will come under the spotlight on Tuesday morning when the three companies appear in Congress. ... Extending the investigation beyond Facebook suggests that politicians may finally renew efforts to draft new regulations for social media. “There’s real political momentum,” says Evelyn Douek, a Harvard Law School lecturer who studies online speech and misinformation. “And that inevitably means you have to go to the other platforms, particularly to where the teens are,” she says, making Snap, TikTok and YouTube “the natural choices.”
-
TikTok and Snapchat are testifying for the first time. Their peers are in the double-digits.
October 21, 2021
TikTok and Snapchat will testify before Congress next week for the first time, spokespeople for the companies confirmed Wednesday, as Senate lawmakers broaden their investigation into how social media platforms are affecting kids’ safety. Meanwhile, Facebook CEO Mark Zuckerberg is being called by the same panel to appear on Capitol Hill for what would be his eighth time in four years, and the 31st time for any Facebook executive in that same time span, spokesman Andy Stone confirmed. The disparity highlights how lawmakers’ oversight of Silicon Valley companies has fixated on a few major platforms, most notably Facebook. ... But some researchers such as Harvard Law School lecturer Evelyn Douek have argued that Congress’s myopic focus has left it all but blind to several of the world’s most influential sites, including TikTok, Snapchat and YouTube, which will also testify next week. Douek, who has challenged lawmakers to summon YouTube CEO Susan Wojcicki in particular to testify for the first time, celebrated the lineup for the next hearing.
-
Is it time to swipe left on social media?
October 12, 2021
Leaked revelations about Instagram’s impact on teens have united Republicans and Democrats in considering legal reforms, say Harvard Law School scholars.
-
1 Billion TikTok Users Understand What Congress Doesn’t
October 12, 2021
An article by Evelyn Douek: Many people think of TikTok as a dance app. And although it is an app full of dancing, it’s also a juggernaut experiencing astronomical growth. In July, TikTok—a short-form video-sharing app powered by an uncannily good recommendation algorithm and owned by the Chinese company ByteDance—became the only social-media mobile app other than those from Facebook to ever pass 3 billion downloads. At the end of last month, TikTok announced it had more than 1 billion monthly users. It was, by some counts, the most downloaded app in 2020 and remained so into 2021. Not bad for an app launched only in 2016! Of the social-media platforms around today, TikTok is the likeliest to represent the future. Its user base is mostly young people. But if you look for TikTok in news coverage, you’re more likely to find it in the lifestyle, culture, or even food section than you are on the front page. It’s not that social-media platforms aren’t newsworthy—Facebook consistently dominates headlines. But TikTok is all too often regarded as an unserious thing to write or read about. That’s a mistake, and it’s one that Congress is making as well.
-
YouTube bans all anti-vaccine misinformation
September 29, 2021
YouTube said on Wednesday that it was banning the accounts of several prominent anti-vaccine activists from its platform, including those of Joseph Mercola and Robert F. Kennedy Jr., as part of an effort to remove all content that falsely claims that approved vaccines are dangerous. ...“One platform’s policies affect enforcement across all the others because of the way networks work across services,” said Evelyn Douek, a lecturer at Harvard Law School who focuses on online speech and misinformation. “YouTube is one of the most highly linked domains on Facebook, for example.”
-
Facebook targets harmful real networks, using playbook against fakes
September 17, 2021
Facebook (FB.O) is taking a more aggressive approach to shut down coordinated groups of real-user accounts engaging in certain harmful activities on its platform, using the same strategy its security teams take against campaigns using fake accounts, the company told Reuters. The new approach, reported here for the first time, uses the tactics Facebook's security teams usually deploy for wholesale shutdowns of networks engaged in influence operations that use fake accounts to manipulate public debate, such as Russian troll farms. ... Expanding Facebook's network disruption models to cover authentic accounts raises further questions about how the changes might affect public debate, online movements and campaign tactics across the political spectrum. "A lot of the time problematic behavior will look very close to social movements," said Evelyn Douek, a Harvard Law lecturer who studies platform governance. "It's going to hinge on this definition of harm ... but obviously people's definitions of harm can be quite subjective and nebulous."
-
More Content Moderation Is Not Always Better
June 2, 2021
An op-ed by Evelyn Douek: Content moderation is eating the world. Platforms’ rule-sets are exploding, their services are peppered with labels, and tens of thousands of users are given the boot in regular fell swoops. No platform is immune from demands that it step in and impose guardrails on user-generated content. This trend is not new, but the unique circumstances of a global public health emergency and the pressure around the US 2020 election put it into overdrive. Now, as parts of the world start to emerge from the pandemic, and the internet’s troll in chief is relegated to a little-visited blog, the question is whether the past year has been the start of the tumble down the dreaded slippery content moderation slope or a state of exception that will come to an end. There will surely never be a return to the old days, when platforms such as Facebook and Twitter tried to wash their hands of the bulk of what happened on their sites with faith that internet users, as a global community, would magically govern themselves. But a slow and steady march toward a future where ever more problems are addressed by trying to erase content from the face of the internet is also a simplistic and ineffective approach to complicated issues.
-
Parler, the conservative-friendly “free speech” social media app, is back in the Apple App Store. But like anything involving social media and free speech, its return is complicated. Beginning on Monday, Parler is available for download on iPhones and iPads. This comes around four months after Parler was banned or limited by Apple, Amazon, Google, and virtually every other major tech company for allowing some of its users to openly organize violence following the 2020 US election — namely at the January 6 US Capitol insurrection... “It’s going to be an interesting story to watch in a number of ways,” said Evelyn Douek, a lecturer at Harvard Law School who studies content moderation online. “There’s the story of what happens with Parler itself, whether it does get more serious about policing hate speech and violent content, and then what does it mean for Apple to get into the content moderation game.”
-
For years, Facebook has grappled with the thorny question of who should be the ultimate “arbiter of truth” when it comes to moderating content on its near-ubiquitous social media platform. Mark Zuckerberg, chief executive and ultimate decision maker, has agreed that it should not be him — semi-outsourcing the problem to a group of his own making, called the Facebook Oversight Board. ... It is currently funded — through a $130m trust — by Facebook. And it has binding authority over a very narrow type of case: whether a removed piece of content should be reinstated or an offensive post should come down, and whether users should remain banned. It only hears a handful of these a year. “It’s still looking at that very narrow slice of what content moderation is, namely how Facebook treats individual posts,” says Evelyn Douek, a lecturer at Harvard Law School. “It doesn’t look at things like groups, pages, down-ranking decisions, how the news feed works in terms of prioritisation, how Facebook treats entire accounts.”
-
Facebook can keep blocking former President Donald Trump from using its platform but must revisit the decision within six months, the social network's court-like Oversight Board said Wednesday. The landmark ruling affirmed the company's decision to issue the suspension after the January 6 US Capitol riots. The board said it concluded that Trump's posts on January 6, which praised the rioters, "severely violated" Facebook's policies and "created an environment where a serious risk of violence was possible." ... The decision to bar Trump from Facebook, if made permanent, could have vast implications, wrote Evelyn Douek, a researcher of online speech and platform moderation at Harvard Law School. "There is no greater question in content moderation right now than whether Trump's deplatforming represents the start of a new era in how companies police their platforms, or whether it will be merely an aberration," Douek wrote in January. "What one platform does can ripple across the internet, as other platforms draft in the wake of first movers and fall like dominoes in banning the same accounts or content. For all these reasons, the board's decision on Trump's case could affect far more than one Facebook page."