Jonathan Zittrain

  • On the bookshelf

    December 15, 2020

    In the unusual year of 2020, Harvard Law authors continued to do what they always have: Write.

  • Imagine a world in which AI is in your home, at work, everywhere

    December 7, 2020

    Imagine a robot trained to think, respond, and behave using you as a model. Now imagine it assuming one of your roles in life, at home or perhaps at work. Would you trust it to do the right thing in a morally fraught situation? That’s a question worth pondering as artificial intelligence increasingly becomes part of our everyday lives, from helping us navigate city streets to selecting a movie or song we might enjoy — services that have gotten more use in this era of social distancing. It’s playing an even larger cultural role with its use in systems for elections, policing, and health care. ... Jonathan Zittrain, Berkman Klein Center director and Harvard Law School’s George Bemis Professor of International Law, said it’s important that we think hard about whether and where AI’s adoption might be a good thing and, if so, how best to proceed. Zittrain said that AI’s spread is different from prior waves of technology because AI shifts the decision-making away from humans and toward machines and their programmers. And, where technologies like the internet were developed largely in the open by government and academia, the frontiers of AI are being pushed forward largely by private corporations, which shield the business secrets of how their technologies operate. The effort to understand AI’s place in society and its impact on all of our lives — not to mention whether and how it should be regulated — will take input from all areas of society, Zittrain said. “It’s almost a social-cultural question, and we have to recognize that, with the issues involved, it’s everybody’s responsibility to think it through,” Zittrain said.

  • ‘The Connected Parent’ offers guidance, insight into digital parenting

    November 16, 2020

    “The Connected Parent,” a new book by John Palfrey ’01 and Urs Gasser LL.M. ’03, is a practical guide for addressing concerns brought on by the COVID-19 pandemic and navigating an increasingly digital world.

  • Simulating responses to election disinformation

    October 14, 2020

    In an effort to combat multiple potential vectors of attack on the 2020 U.S. election, two Berkman Klein Center affiliates have published a package of “tabletop exercises,” freely available to decisionmakers and the public to simulate realistic scenarios in which disinformation threatens to disrupt the 2020 election.

  • ‘We need to be more imaginative about cybersecurity than we are right now’

    October 7, 2020

    In the “good old days” of cybersecurity risk, we only had to worry about being hacked or downloading malware. But the stakes have ramped up considerably in the past decade, say Berkman Klein directors James Mickens and Jonathan Zittrain.

  • The law is ‘tested and illuminated during this pandemic’

    September 16, 2020

    In the first colloquium of a sweeping new series, “COVID-19 and the Law,” five Harvard Law faculty members grappled with the challenges, limitations, and opportunities of governmental powers during a public health crisis.

  • “Column” launches to modernize public notices

    September 9, 2020

    A group of prominent media veterans is advising a team of millennials who are launching a new company called "Column," which modernizes the placement of public notices. Why it matters: Public notices have been one of the biggest and most reliable revenue streams for local newspapers for centuries. Amid the pandemic, they are becoming more important to local papers that are seeing regular local advertising dry up. Catch up quick: Public notices are legally required updates from the government to citizens about different types of legal or regulatory proceedings, like ordinances, foreclosures, municipal budgets, adoptions, and public meetings. As with obituaries, the exact amount of revenue they deliver to the newspaper industry is debated, but they've long been considered a critical part of local newspaper businesses, due to a longstanding legal requirement for local governments to place them in newspapers. ... Details: The company has a full-time team of nine and is supported and advised by many notable media experts, including David Chavern, CEO of the News Media Alliance, and Nancy Gibbs, the faculty director of Harvard's Shorenstein Center on Media, Politics and Public Policy, and the former editor in chief of Time Magazine. It receives occasional informal advice from Marty Baron, executive editor of the Washington Post. The company is advised by several Harvard professors and entrepreneurs, including Jonathan Zittrain, co-founder of the Berkman Klein Center for Internet & Society at Harvard Law School, and Henry Ward, founder and CEO of Carta.

  • Are We Already Living in a Tech Dystopia?

    September 8, 2020

    For the most part, fictional characters rarely recognize when they’re trapped in a dystopia. ... To them, that dystopia is just life. Which suggests that—were we, at this moment, living in a dystopia ourselves—we might not even notice it. ... Is this, right here, the tech-dystopia we were worried about? For this week’s Giz Asks, we reached out to a number of experts with differing opinions. ... Jonathan Zittrain: "Yes, in the sense that so many of us feel rightly that instead of empowering us, technology is used against us—most especially when it presents itself as merely here to help. I’ve started thinking of some of our most promising tech, including machine learning, as like asbestos: it’s baked wholesale into other products and services so we don’t even know we’re using it; it becomes pervasive because it seems to work so well, and without any overlay of public interest on its installation; it’s really hard to account for, much less remove, once it’s in place; and it carries with it the possibility of deep injury both now and down the line. I’m not anti-tech. But I worry greatly about the deployment of such power with so little thought given to, and so few boundaries against, its misuse, whether now or later. More care and public-minded oversight goes into someone’s plans for an addition to a house than to what can be or become a multi-billion dollar, multi-billion-user online platform. And while thanks to their power, and the trust placed in them by their clients, we recognize structural engineers, architects, lawyers, and doctors as members of learned professions—with duties to their clients and to the public that transcend a mere business agreement—we have yet to see that applied to people in data science, software engineering, and related fields who might be in a position to recognize and prevent harm as it coalesces. There are ways out of this. But it first requires abandoning the resignation that so many have come to about business as usual."

  • HLS librarians, remote but not distant: ‘We’re here for you’

    August 13, 2020

    Since going remote in March, the Harvard Law School Library has reimagined what it means to provide services to the HLS community.

  • Testing Is on the Brink of Paralysis. That’s Very Bad News.

    July 17, 2020

    An article by Margaret Bourdeaux, Beth Cameron, and Jonathan Zittrain: As Covid-19 cases surge to their highest levels in dozens of states, the nation’s testing effort is on the brink of paralysis because of widespread delays in getting back results. And that is very bad news, because even if testing is robust, the pandemic cannot be controlled without rapid results. This is the latest failure in our national response to the worst pandemic in a century. Since the Trump administration has abdicated responsibility, governors must join forces to meet this threat before the cataclysm that Florida is experiencing becomes the reality across the country. Testing should be the governors’ first order of business. Despite President Trump’s boast early this month that testing “is so massive and so good,” the United States’ two largest commercial testing companies, Quest Diagnostics and LabCorp, have found themselves overwhelmed and unable to return results promptly. Delays averaging a week or longer for all but top-priority hospital patients and symptomatic health care workers are disastrous for efforts to slow the spread of the virus. Without rapid results, it is impossible to isolate new infections quickly enough to douse flare-ups before they grow. Slow diagnosis incapacitates contact tracing, which entails not only isolating those who test positive but also alerting the infected person’s contacts quickly so they can quarantine, too, and avoid exposing others to the virus unwittingly. Among those who waited an absurdly long time for her results was the mayor of Atlanta, Keisha Lance Bottoms. “We FINALLY received our test results taken 8 days before,” she tweeted last week. “One person in my house was positive then. By the time we tested again, 1 week later, 3 of us had COVID. If we had known sooner, we would have immediately quarantined.”

  • Twitter’s Least-Bad Option for Dealing With Donald Trump

    June 26, 2020

    An article by Jonathan Zittrain: On Tuesday, President Donald Trump began his day as he usually does—by tweeting. In this case, Trump fired off a threat of using “serious force” against hypothetical protesters setting up an “autonomous zone” in Washington, D.C. Twitter, in response, hid the tweet but did not delete it, requiring readers to click through a notice that says the tweet violated the platform’s policy “against abusive behavior, specifically, the presence of a threat of harm against an identifiable group.” Twitter’s placement of such a “public interest notice” on a tweet from the president of the United States was just the latest salvo in the company’s struggle to contend with Trump’s gleefully out-of-bounds behavior. But any response from Twitter is going to be the least bad option rather than a genuinely good one. This is because Trump himself has demolished the norms that would make a genuinely good response possible in the first place. The truth is that every plausible configuration of social media in 2020 is unpalatable. Although we don’t have consensus about what we want, no one would ask for what we currently have: a world in which two unelected entrepreneurs are in a position to monitor billions of expressions a day, serve as arbiters of truth, and decide what messages are amplified or demoted. This is the power that Twitter’s Jack Dorsey and Facebook’s Mark Zuckerberg have, and they may well experience their own discomfort with it. Nor, though, would many of us wish for such powerful people to stand idly by when, at no risk to themselves, they could intervene to prevent misery and violence in the physical world, by, say, helping to counter dangerous misinformation or preventing the incitement of violence.

  • Is Digital Contact Tracing Over Before It Began?

    June 26, 2020

    An article by Jonathan Zittrain: Last month I wrote a short essay covering some of the issues around standing up contact tracing across the U.S., as part of a test/trace/quarantine regime that would accompany the ending of a general lockdown to prevent the spread of the coronavirus pandemic. ... In the intervening month, some things have remained the same. As before, tech companies and startups continue to develop exposure notification apps and frameworks. And there remains no federally coordinated effort to test, trace, and isolate — it’s up to states and their respective municipalities to handle anything that will happen. Some localities continue to spin up ambitious contact tracing programs, while others remain greatly constrained. As Margaret Bourdeaux explains, for example: “In Massachusetts, many of the 351 local boards of health are unaccredited, and most have only the most rudimentary digital access to accomplish the most basic public health goals of testing and contact tracing in their communities.” She cites Georgetown’s Alexandra Phelan: “Truly the amount of US COVID19 response activities that rely solely on the fax machine would horrify you.” There remain any number of well-considered plans that depend on a staged, deliberate reopening based on testing, tracing, and supported isolation, such as ones from Harvard’s Safra Center (“We need to massively scale-up testing, contact tracing, isolation, and quarantine — together with providing the resources to make these possible for all individuals”), the Center for American Progress (calling for “instantaneous contact tracing and isolation of individuals who were in close proximity to a positive case”), and the American Enterprise Institute (“We need to harness the power of technology and drive additional resources to our state and local public-health departments, which are on the front lines of case identification and contact tracing”).

  • João Marinotti ’20 wants to know how the world works

    May 27, 2020

    “I’ve always had a passion for engaging in my curiosity,” says João Marinotti ’20, a linguist turned lawyer whose work focuses on sustainability, business, property, and private law.

  • Entering the Minefield of Digital Contact Tracing

    May 26, 2020

    An article by Jonathan Zittrain: People across America and the world remain under strong advisories or outright orders to shelter in place, and economies largely shut down, as part of an ongoing effort to flatten the curve of the most virulent pandemic since 1918. The economic effects have been predictably staggering, with no clear end in sight. Until a vaccine or other transformative medical intervention is developed, the broad consensus of experts is that the only way out of mass sheltering in place, if hospital occupancy curves are to remain flattened, entails waiting for most of the current cases to resolve, and then cautiously and incrementally reopening. That would mean a sequence of allowing people out; promptly testing anyone showing symptoms — and even some who are not; identifying recent proximate contacts of those who test positive; and then getting in touch with those contacts and, if circumstances dictate, asking or demanding that they individually shelter until the disease either manifests or not. The idea is to promptly prune branches of further disease transmission in order to keep its reproductive factor non-exponential.

  • Summations: Reflections from the Class of 2020

    May 20, 2020

    Members of the Class of 2020 reflect on their interests and share experiences they will take from their time at Harvard Law.

  • A start-up is using photos to ID you. Big tech can stop it from happening again.

    April 16, 2020

    An article by Jonathan Zittrain and John Bowers: Earlier this year, the public learned that a tiny start-up called Clearview AI was offering a big service. Clearview subscribers could give the company a photo they had just taken of someone and get links to other photos of the same person, often revealing information like who they are and where they live. A little tweaking and the service might simply identify people over any live feed aimed at any street, hallway or classroom. Though it has been marketed as a one-stop warrantless law enforcement tool, Clearview’s client list is also reported to include casinos, gyms, supermarkets, sporting leagues and wealthy parents curious about their kids’ dates. The upshot? The fundamental comfort — and liberty — of being able to walk down a street or enter a supermarket or stadium without the authorities, or fellow strangers, immediately knowing who you are is about to evaporate without any public debate about whether that’s okay. It’s as if someone invented glasses that could see through walls, sold them to a select few, and everyone else inexplicably shrugged. Now, the Wall Street Journal reports that Clearview AI is “in discussions with state agencies about using its technology to track patients infected by the coronavirus, according to people familiar with the matter.” It’s a savvy move, aimed at turning a rogue actor into a hero.

  • Cyberlaw Clinic turns 20

    April 9, 2020

    It was 1999 and the dot-com bubble was about to burst. Corporations were scrambling to address new legal challenges online. Napster was testing the music industry. And at Harvard Law School, the Berkman Klein Center was creating a clinical teaching program specializing in cyberlaw.

  • An Ambitious Reading of Facebook’s Content Regulation White Paper

    March 9, 2020

    An article by John Bowers and Jonathan Zittrain: Corporate pronouncements are usually anodyne. And at first glance one might think the same of Facebook’s recent white paper, authored by Monika Bickert, who manages the company’s content policies, offering up some perspectives on the emerging debate around governmental regulation of platforms’ content moderation systems. After all, by the paper’s own terms it’s simply offering up some questions to consider rather than concrete suggestions for resolving debates around platforms’ treatment of such things as anti-vax narratives, coordinated harassment, and political disinformation. But a careful read shows it to be a helpful document, both as a reflection of the contentious present moment around online speech, and because it takes seriously some options for “content governance” that, if pursued fully, would represent a moonshot for platform accountability premised on the partial but substantial, and long-term, devolution of Facebook’s policymaking authority.

  • Mike Bloomberg tweeted a doctored debate video. Is it political spin or disinformation?

    February 21, 2020

    Following his lackluster performance in Wednesday’s Democratic presidential debate, former New York Mayor Mike Bloomberg tweeted out a doctored video that made it look like he had a hugely successful moment on the debate stage, even though he didn’t. ... Take what happened earlier this month: Trump tweeted out a video that had been edited to make it look like Speaker of the House Nancy Pelosi was ripping up the president’s State of the Union speech during touching moments, such as the introduction of a Tuskegee airman. That’s not what transpired: Pelosi did rip up the speech, but only at the end of the full address. Jonathan Zittrain, a legal expert at Harvard, argues that tweet shouldn’t be taken down, even though it’s misleading, because it’s protected by free speech. “It’s political expression that could be said to be rearranging the video sequence in order to make a point that ripping up the speech at the end was, in effect, ripping up every topic that the speech had covered,” he wrote on Medium on February 10. “And to show it in a video conveys a message far more powerful than just saying it — something First Amendment values protect and celebrate, at least if people aren’t mistakenly thinking it is real,” Zittrain wrote.

  • A man of letters: The Antonin Scalia Collection opens at Harvard Law School

    February 11, 2020

    The Harvard Law School Library has announced the public release of the first batch of papers and other items from the Antonin Scalia Collection. His papers were donated by the Scalia family following the influential justice's death in 2016.

  • A World Without Privacy Will Revive the Masquerade

    February 11, 2020

    An article by Jonathan Zittrain: Twenty years ago at a Silicon Valley product launch, Sun Microsystems CEO Scott McNealy dismissed concern about digital privacy as a red herring: “You have zero privacy anyway. Get over it.” “Zero privacy” was meant to placate us, suggesting that we have a fixed amount of stuff about ourselves that we’d like to keep private. Once we realized that stuff had already been exposed and, yet, the world still turned, we would see that it was no big deal. But what poses as unsentimental truth telling isn’t cynical enough about the parlous state of our privacy. That’s because the barrel of privacy invasion has no bottom. The rallying cry for privacy should begin with the strangely heartening fact that it can always get worse. Even now there’s something yet to lose, something often worth fiercely defending. For a recent example, consider Clearview AI: a tiny, secretive startup that became the subject of a recent investigation by Kashmir Hill in The New York Times.