    Just over a year ago, with support from the William and Flora Hewlett Foundation, the Berkman Center for Internet & Society at Harvard University convened a diverse group of security and policy experts from academia, civil society, and the U.S. intelligence community to begin to work through some of the particularly vexing and enduring problems of surveillance and cybersecurity.

    Suppose a laptop were found at the apartment of one of the perpetrators of last year’s Paris attacks. It’s searched by the authorities pursuant to a warrant, and they find a file on the laptop that’s a set of instructions for carrying out the attacks.

    June 2014 saw a media uproar about Facebook's emotional contagion study, published in the Proceedings of the National Academy of Sciences. In conjunction with researchers at Cornell, Facebook designed an experiment that altered the Facebook News Feed to explore whether emotions can spread through Facebook. These feeds, the primary activity and content list on Facebook, are populated according to a proprietary algorithm. In the experiment, the algorithm for a random subset of users was manipulated to display either proportionately more negative or proportionately more positive emotional content; a control group saw content selected by the unmodified algorithm. The study met vocal opposition not solely for manipulating the moods of Facebook users, but also because users neither volunteered nor opted in to the research and were not informed of their participation. It is a motivating example of the moral, legal, and technical questions raised when algorithms permeate society. This case explains the parameters of the experiment, the reaction in the media, and the legal issues introduced (including Federal Trade Commission standards for commercial practices and the US Department of Health and Human Services Policy for the Protection of Human Research Subjects, informally known as the "Common Rule"). To encourage examination of the issues algorithms present in a number of different scenarios, the case uses six hypothetical situations that ask participants to ponder the use of algorithms in different settings, including print media, charity, and business, among others. These hypothetical scenarios present varying aspects of the expanding role algorithms play, and are designed to elicit more meaningful discussion and to force participants to address the complicated issues surrounding algorithms through nuanced arguments. After briefly analyzing and debating all six scenarios, participants delve deeper into one hypothetical, using their position on that issue to inform their stance on the Facebook emotional contagion study. The exercise concludes with a class-wide debate of the ethics surrounding the study.
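
    The design described above boils down to randomized assignment plus per-arm filtering of feed items. The sketch below is a toy illustration of that structure only, not Facebook's actual code: the condition names, the 10% withhold rate, and the is_positive/is_negative post labels are all assumptions.

        import random

        # Toy sketch of the experiment's structure: randomized assignment to
        # one of three arms, plus per-arm filtering of feed items. This is an
        # illustration only, not Facebook's code; the condition names, the 10%
        # withhold rate, and the is_positive/is_negative labels are assumptions.

        CONDITIONS = ["reduce_positive", "reduce_negative", "control"]

        def assign_condition(user_id: int) -> str:
            """Deterministically assign each user to one experimental arm."""
            return CONDITIONS[user_id % len(CONDITIONS)]

        def filter_feed(user_id: int, feed: list, withhold_rate: float = 0.1) -> list:
            """Withhold a fraction of emotional posts according to the user's arm."""
            arm = assign_condition(user_id)
            rng = random.Random(user_id)  # reproducible per-user randomness
            kept = []
            for post in feed:
                skew_negative = arm == "reduce_positive" and post["is_positive"]
                skew_positive = arm == "reduce_negative" and post["is_negative"]
                if (skew_negative or skew_positive) and rng.random() < withhold_rate:
                    continue  # withhold this post from the user's feed
                kept.append(post)
            return kept

        feed = [
            {"text": "great day!", "is_positive": True, "is_negative": False},
            {"text": "awful news", "is_positive": False, "is_negative": True},
        ]
        print(filter_feed(user_id=7, feed=feed))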

    This publication is the second annual report of the Internet Monitor project at the Berkman Center for Internet & Society at Harvard University. As with the inaugural report, this year’s edition is a collaborative effort of the extended Berkman community. Internet Monitor 2014: Reflections on the Digital World includes nearly three dozen contributions from friends and colleagues around the world that highlight and discuss some of the most compelling events and trends in the digitally networked environment over the past year. The result, intended for a general interest audience, brings together reflection and analysis on a broad range of issues and regions — from an examination of Europe’s “right to be forgotten” to a review of the current state of mobile security to an exploration of a new wave of movements attempting to counter hate speech online — and offers it up for debate and discussion. Our goal remains not to provide a definitive assessment of the “state of the Internet” but rather to provide a rich compendium of commentary on the year’s developments with respect to the online space. Last year’s report examined the dynamics of Internet controls and online activity through the actions of government, corporations, and civil society. We focus this year on the interplay between technological platforms and policy; growing tensions between protecting personal privacy and using big data for social good; the implications of digital communications tools for public discourse and collective action; and current debates around the future of Internet governance. The report reflects the diversity of ideas and input the Internet Monitor project seeks to invite. Some of the contributions are descriptive; others prescriptive. Some contain purely factual observations; others offer personal opinion. In addition to those in traditional essay format, contributions this year include a speculative fiction story exploring what our increasingly data-driven world might bring, a selection of “visual thinking” illustrations that accompany a number of essays, a “Year in Review” timeline that highlights many of the year’s most fascinating Internet-related news stories (an interactive version of which is available at netmonitor.org), and a slightly tongue-in-cheek “By the Numbers” section that offers a look at the year’s important digital statistics. We believe that each contribution offers insights, and hope they provoke further reflection, conversation, and debate in both offline and online settings around the globe.

    In these edited remarks originally given at ROFLCon in May 2012, Jonathan Zittrain muses on the nature of memes and their relationships to their creators as well as to broader culture and to politics. The distributed environment of the internet allows memes to morph and become distanced from their original intentions. As meme culture becomes more and more assimilated into popular culture, subcultures like those of Reddit or 4chan have begun to re-conceptualize their own role from just meme propagators to cultural producers. Memes can gain commercial appeal, much to the chagrin of their creators. More strangely, memes can gain political traction and affiliation, like American conservative commentator Bill O’Reilly’s ‘You can’t explain that’ or Anonymous’ ‘Low Orbit Ion Cannon’. Can meme culture survive becoming not just the property of geeks and nerds, but part of the commercial and political world?

    In this article, the author discusses aspects of smart technology that might have disarmed the Islamic State (ISIS), the insurgent group in Iraq, without bombs or bullets. Topics include remotely disabling weaponry with a kill switch similar to the one on the iPhone, which has resulted in reduced iPhone thefts; examples of fail-safe mechanisms that could be built using basic signature-and-authentication technologies; and the implications of a failed kill-switch strategy.
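
    To make the signature-and-authentication idea concrete, here is a minimal sketch of a fail-safe of the kind the article contemplates: a device stays operable only while it holds a fresh authorization token signed with a key it trusts. The shared-secret scheme, the token format, and the seven-day validity window are assumptions for illustration, not a hardened design.

        import hashlib
        import hmac
        import time

        # Minimal sketch of a signature-based fail-safe: the device stays
        # operable only while it holds a fresh authorization token signed with
        # a key it trusts, and disables itself otherwise. The shared-secret
        # scheme, token format, and 7-day window are illustrative assumptions.

        AUTHORITY_KEY = b"secret-provisioned-at-manufacture"  # hypothetical
        VALIDITY_SECONDS = 7 * 24 * 3600

        def sign_authorization(device_id: str, issued_at: int) -> str:
            """Authority side: sign (device, timestamp) with the trusted key."""
            msg = f"{device_id}|{issued_at}".encode()
            return hmac.new(AUTHORITY_KEY, msg, hashlib.sha256).hexdigest()

        def device_is_enabled(device_id: str, issued_at: int, signature: str) -> bool:
            """Device side: fail safe, so forged or stale tokens disable it."""
            expected = sign_authorization(device_id, issued_at)
            if not hmac.compare_digest(expected, signature):
                return False  # forged or corrupted authorization
            return time.time() - issued_at <= VALIDITY_SECONDS  # stale token: off

        # A freshly issued token keeps the device enabled; an expired one does not.
        now = int(time.time())
        old = now - 30 * 24 * 3600
        assert device_is_enabled("device-0042", now, sign_authorization("device-0042", now))
        assert not device_is_enabled("device-0042", old, sign_authorization("device-0042", old))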

    The scary future of digital gerrymandering—and how to prevent it.

    It has become increasingly common for a reader to follow a URL cited in a court opinion or a law review article, only to be met with an error message because the resource has been moved from its original online address. This form of reference rot, commonly referred to as ‘linkrot’, has arisen from the disconnect between the transience of online materials and the permanence of legal citation, and will only become more prevalent as scholarly materials move online. The present paper, written by Jonathan Zittrain, Kendra Albert and Lawrence Lessig, explores the pervasiveness of linkrot in academic and legal citations, finding that more than 70% of the URLs within the Harvard Law Review and other journals, and 50% of the URLs within United States Supreme Court opinions, do not link to the originally cited information. In light of these results, a solution is proposed for authors and editors of new scholarship that involves libraries undertaking the distributed, long-term preservation of link contents.
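
    The measurement behind figures like these reduces to requesting each cited URL and recording failures. A minimal sketch using only the Python standard library follows; note that a page that still responds may nonetheless have changed since it was cited, a form of reference rot that a status check like this cannot detect. The example URLs are placeholders.

        import urllib.error
        import urllib.request

        # Minimal linkrot check in the spirit of the study: fetch each cited
        # URL and record whether it still resolves. A 200 response does not
        # prove the originally cited content survives, so a status check like
        # this undercounts reference rot. Example URLs are placeholders.

        def check_url(url: str, timeout: float = 10.0) -> str:
            request = urllib.request.Request(url, headers={"User-Agent": "linkrot-check"})
            try:
                with urllib.request.urlopen(request, timeout=timeout) as response:
                    return f"ok ({response.status})"
            except urllib.error.HTTPError as err:
                return f"http error {err.code}"
            except urllib.error.URLError as err:
                return f"unreachable ({err.reason})"

        cited_urls = [
            "https://example.com/",
            "http://no-such-host.invalid/cited-page",
        ]
        for url in cited_urls:
            print(f"{check_url(url):>24}  {url}")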

    What is the Web? What makes it work? And is it dying? This paper is drawn from a talk delivered by Prof. Zittrain to the Royal Society Discussion Meeting 'Web science: a new frontier' in September 2010. It covers key questions about the way the Web works, and how an understanding of its past can help those theorizing about its future. The original Web allowed users to display and send information from their individual computers, and organized the resources of the Internet with uniform resource locators. In the 20 years since then, the Web has evolved and now faces new challenges from commercial and geopolitical forces. Meeting those challenges requires a return to the spirit of the early Web, exploiting the power of the Web's users and its distributed nature. The future of the Web rests in projects that preserve that spirit, and in the Web science that helps make them possible.

    This briefing document was developed as part of a March 30, 2012 workshop entitled “Public Networks for Public Safety: A Workshop on the Present and Future of Mesh Networking,” hosted by the Berkman Center for Internet & Society at Harvard University. The event provided a starting point for conversation about whether mesh networks could be adopted within consumer technologies to enhance public safety communications and empower and connect the public while simultaneously improving public safety. Participants in this initial convening included members of government agencies, academia, the telecommunications industry, and civil society organizations; their helpful inputs were integral to the final version of this document.

    Building on the dialogue at this gathering, this briefing document seeks to:

    • sketch a broad overview of mobile ad hoc networks (MANETs) and mesh technologies;
    • identify critical technical issues and questions regarding the communications effectiveness of those technologies;
    • explain how public safety communications relate to mesh and offer a synopsis of current regulations affecting those communications;
    • describe a set of basic use cases that emerged from the conference;
    • map out stakeholders at the technical, regulatory, legal, and social levels, and associated interests, points of connection, and potential challenges;
    • catalog select examples and, where possible, highlight potential next steps and areas for short-term action; and
    • summarize key takeaways from the conference, with an emphasis on shared principles or best practices that might inform participants’ diverse efforts to improve communications affordances for the public and the public safety community.

    The paper also synthesizes several strains of workshop discussion that probed big-picture framing concerns that could inform the present and future of mesh. Specifically, it puts forth two related but distinct models for mesh: mesh in a technical sense and mesh as a metaphor or social-layer construct, with a particular emphasis on the need for further conceptual development with regard to “social mesh.” The final section emphasizes key takeaways from the event, highlighting core principles and best practices that might both provide a theoretical underpinning for the future conceptual development of mesh networking technologies and social mesh models, respectively, and inform the real-world development of communications systems that involve either definition of mesh.

    The Berkman Center thanks all of the workshop attendees both for their participation during the event and for comments offered during the development of this briefing document. Berkman Center Project Coordinator Alicia Solow-Niederman worked closely with Professor Jonathan Zittrain to plan and execute this event as well as to produce this briefing document. Berkman Center Research Assistants Andrew Crocker and Kevin Tsai provided exceptional research and contributions to this briefing document, and June Casey contributed indispensable support with background research.
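
    To make the technical sense of mesh concrete, the toy sketch below simulates a message flooding through an ad hoc network: each node relays to its neighbors, so delivery requires no central infrastructure and degrades gracefully as nodes drop out. The topology and device names are invented for illustration.

        from collections import deque

        # Toy model of the "technical mesh" described above: nodes relay a
        # message to their neighbors (simple flooding), so delivery needs no
        # central infrastructure. The topology and names are invented.

        mesh = {
            "phone_a": {"phone_b", "laptop"},
            "phone_b": {"phone_a", "laptop", "router"},
            "laptop":  {"phone_a", "phone_b"},
            "router":  {"phone_b", "phone_c"},
            "phone_c": {"router"},
        }

        def flood(origin: str, graph: dict) -> dict:
            """Breadth-first flood: hop count at which each node hears the message."""
            heard = {origin: 0}
            queue = deque([origin])
            while queue:
                node = queue.popleft()
                for neighbor in graph[node]:
                    if neighbor not in heard:  # each node relays only once
                        heard[neighbor] = heard[node] + 1
                        queue.append(neighbor)
            return heard

        print(flood("phone_a", mesh))
        # Dropping the router still leaves phone_a, phone_b, and laptop
        # connected, but partitions phone_c: mesh resilience has limits.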

    When people took to the streets across the UK in the summer of 2011, the Prime Minister suggested restricting access to digital and social media in order to limit their use in organizing. The resulting debate complemented speculation on the effects of social media in the Arab Spring and the widespread critique of President Mubarak's decision to shut off the Internet and mobile phone systems completely in Egypt.

    In 1977, 22-year-old Steve Jobs introduced the world to one of the first self-contained personal computers, the Apple II. The machine was a bold departure from previous products built to perform specific tasks: turn it on, and there was only a blinking cursor awaiting further instruction. Some owners were inspired to program the machines themselves, but others could load up software written and shared or sold by others more skilled or inspired.

    Wikileaks is a self-described “not-for-profit media organization,” launched in 2006 for the purposes of disseminating original documents from anonymous sources and leakers. Its website says: “Wikileaks will accept restricted or censored material of political, ethical, diplomatic or historical significance. We do not accept rumor, opinion, other kinds of first hand accounts or material that is publicly available elsewhere.”

    In August 2010, selected faculty and researchers at the Berkman Center for Internet & Society at Harvard University began an independent, exploratory study analyzing ICANN’s decision-making processes and communications with its stakeholders. The study focused on developing a framework and recommendations for understanding and improving ICANN’s accountability and transparency, and was undertaken as part of ICANN’s first Accountability and Transparency Review. On November 4, 2010, the Berkman team’s independent report was publicly posted alongside ICANN’s Accountability and Transparency Review Team's Draft Proposed Recommendations for Public Comment. The Executive Summary below outlines key Findings and Recommendations for Improvement. In addition to this Final Report, associated research materials, resources, and other supplementary inputs gathered in the course of the Berkman team’s work are also available.

    1. Problem Statement: In recent years, ICANN has taken important actions — ranging from significant policy changes to formal reviews — to improve its accountability, transparency, and the quality of its decision making. Despite considerable efforts and acknowledged improvements, ICANN continues to struggle with making decisions that the global Internet community can support.

    2. Independent Review of Transparency and Accountability at ICANN: As part of a larger independent review process, faculty and researchers from the Berkman Center for Internet & Society have taken on the challenge of researching ICANN’s current efforts to improve accountability via mechanisms of transparency, public participation, and corporate governance, and of analyzing key problems and issues across these areas.

    3. Findings and Assessment: In-depth research into the three focus areas of this report reveals a highly complex picture with many interacting variables that make fact-finding challenging and render simple solutions impossible. With this complexity in mind, and referring to the main text of the report for a more granular analysis, the findings and assessments of this report can be condensed as follows. ICANN’s performance regarding transparency is currently not meeting its potential across all areas reviewed and shows deficits along a number of dimensions; it calls for clearly defined improvements at the level of policy, information design, and decision making. ICANN has made significant progress in improving its public participation mechanisms and gets high marks regarding its overall trajectory in this regard. Remaining concerns about the practical impact of public participation on Board decisions are best addressed by increasing the visibility and traceability of individual inputs, in order to clarify how these inputs ultimately factor into ICANN decision-making processes. ICANN’s greatest challenge ahead, despite significant recent efforts, remains corporate and Board governance. Proposed measures identified in this report aim to increase efficiency, transparency, and accountability within the current context and in the absence of standard accountability mechanisms.

    4. Recommendations: There is no straightforward way to address the various challenges ICANN faces. The approach underlying this report’s recommendations takes an evolutionary rather than revolutionary perspective, aimed at continually improving ICANN’s accountability step by step, based on lessons learned, through a series of measured interventions, reinforced by monitoring and subsequent re-evaluation. For each of the three focal areas covered in this report and for each of the key issues addressed, this report suggests ways in which the status quo can be improved. Some of these recommendations can be implemented quickly; others require policy changes; still others call for more in-depth research, consultation, and deliberation among the involved stakeholders. The recommendations vary in kind and orientation. They encourage the adoption of best practices where available and experimentation with approaches and tools where feasible. Several are aimed at improving information processing, creation, distribution, and responsiveness at different levels of the organization.

    This essay focuses not on government encroachment but on the scripts in which journalists themselves are trapped, and on a very different kind of fear. Many professional journalists of good will and undisputed talent have drifted to a place where they are routinely parties to the absurd and prisoners to threats not as readily grasped as those from official censors.

    This essay makes the case that Internet-based platforms can be evaluated along two dimensions: the first between generative and sterile, indicating openness to further contribution and development from outsiders, and the second between hierarchy and polyarchy, indicating the ease with which those affected by the platform can escape its umbrella. The quadrants formed by these two dimensions can help us to understand patterns in the development of new technologies and platforms, and to brainstorm the widest array of solutions to problems arising under them, particularly problems involving enforcement of regulations against bad actors. Using cybersecurity as a principal example, the essay explains why current national security approaches to Internet vulnerabilities are unduly narrow, and how focusing attention on the "fourth quadrant" can broaden the range of options.

    Popular imagination holds that the turf of a state’s foreign embassy is a little patch of its homeland. Enter the American Embassy in Beijing and you are in the United States. Indeed, in many contexts – such as resistance to search and seizure by a host country’s authorities – there is an inviolability to diplomatic outposts. These arrangements have been central to diplomacy for decades so that diplomats can perform their work without fear of harassment and coercion. Complementing a state’s oasis on foreign territory is the ability to get there and back unharried. Diplomats are routinely granted immunity from detention as they travel, and la valise diplomatique – the diplomatic pouch – is a packet that cannot be seized, or in most cases even inspected, as it moves about. Each pouch is a link between a country and its outposts dispersed in alien territory around the world. Citizens and their digital packets deserve much the same treatment as they traverse the global Internet. Just as states expect to conduct their official business on foreign soil without interference, so citizens should be able to lead digitally mediated – and increasingly distributed – lives without fear that their links to their online selves can be arbitrarily abridged or surveilled by their Internet Service Providers or any other party. Just as the sanctity of the embassy and la valise diplomatique is vital to the practice of international diplomacy, the ability of our personal bits to travel about the net unhindered is central to the lives we increasingly live online.

    This frame differs from the usual criteria for debating the merits of net neutrality. It does not focus on what makes for more efficient provision of broadband services to end users. It is unaffected by what sorts of bundling of services by a local ISP might intrigue the ISP’s subscribers. It does not examine the costs and benefits of faraway content providers being asked to bargain for access to that local ISP’s customers. Instead, it recognizes that Internet users establish outposts far and wide, and that a new status quo of distributed selfhood is quickly taking hold.

    Robert Faris and Jonathan Zittrain chart the highs and lows for free expression online in 2009: from the triumph over Green Dam to cyber attacks.

    Nature regulars give their recommendations for relaxed, inspiring holiday reading and viewing — from climate-change history to Isaac Newton the detective.

    A high-profile copyright activist is fighting for traditional publishers to stop criminalizing their own readers, explains Jonathan Zittrain.

    I recently wrote a book about the future of the Internet. The book's thesis is that the mainstream computing environment we've experienced for the past 30-plus years—dating from the introduction of the first mainstream personal computer, the Apple II, in 1977—is an anomaly. The basic building blocks of modern IT are PCs that anyone can reprogram, connected to an Internet that unquestioningly routes bits between two arbitrary points. This has led to a generative revolution where novel and disruptive technologies have come from obscure backwaters—and conquered. While incumbents bet on (or were) gated-community networks like CompuServe, Prodigy, and AOL, or makers of "smart appliances" such as dedicated word processors and video-game consoles, dark-horse candidates like the Internet and the PC emerged unexpectedly and triumphed, helped along by commercial forces that belatedly hopped on their bandwagons.

    The Internet, perhaps the most important technological development of the past 30 years, succeeded unexpectedly. It started out in an experimental backwater, nurtured far from the mainstream. It was spawned with no business plan and with no CEO leading the charge. Instead, a group of researchers—nerds, really—had the very un-entrepreneurial idea to develop a set of free and open technical protocols to move data from one place to another. The PC, which I think of as a companion technology to the Internet, was likewise groomed as the hobbyhorse of passionate nerds who (at least initially) shared their designs. Both the Internet and the PC were released unfinished, and because they were open technologies, businesses and inventors could use them as a springboard for innovation. New applications were deployed to use them without needing the permission of their vendors.

    Ubiquitous computing means network connectivity everywhere, linking devices and systems as small as a drawing pin and as large as a worldwide product distribution chain. What could happen when people are so readily networked? This paper explores issues arising from two possible emerging models of ubiquitous human computing: fungible networked brainpower and collective personal vital sign monitoring.

  • Jonathan L. Zittrain, Perfect Enforcement on Tomorrow's Internet, in Regulating Technologies: Legal Futures, Regulatory Frames and Technological Fixes (Roger Brownsword & Karen Yeung eds., 2008).

    This extraordinary book explains the engine that has catapulted the Internet from backwater to ubiquity—and reveals that it is sputtering precisely because of its runaway success. With the unwitting help of its users, the generative Internet is on a path to a lockdown, ending its cycle of innovation—and facilitating unsettling new kinds of control. iPods, iPhones, Xboxes, and TiVos represent the first wave of Internet-centered products that can’t be easily modified by anyone except their vendors or selected partners. These “tethered appliances” have already been used in remarkable but little-known ways: car GPS systems have been reconfigured at the demand of law enforcement to eavesdrop on the occupants at all times, and digital video recorders have been ordered to self-destruct thanks to a lawsuit against the manufacturer thousands of miles away. New Web 2.0 platforms like Google mash-ups and Facebook are rightly touted—but their applications can be similarly monitored and eliminated from a central source. As tethered appliances and applications eclipse the PC, the very nature of the Internet—its “generativity,” or innovative character—is at risk. The Internet’s current trajectory is one of lost opportunity. Its salvation, Zittrain argues, lies in the hands of its millions of users. Drawing on generative technologies like Wikipedia that have so far survived their own successes, this book shows how to develop new technologies and social structures that allow users to work creatively and collaboratively, participate in solutions, and become true “netizens.”

  • Access Denied: The Practice and Policy of Global Internet Filtering (Ronald Deibert, John G. Palfrey, Rafal Rohozinski & Jonathan L. Zittrain eds., MIT Press 2008).

    Access Denied documents and analyzes Internet filtering practices in more than three dozen countries, offering the first rigorously conducted study of an accelerating trend.

    We assess the impact of spam that touts stocks upon the trading activity of those stocks and sketch how profitable such spamming might be for spammers and how harmful it is to those who heed advice in stock-touting e-mails. We find convincing evidence that stock prices are being manipulated through spam. We suggest that the effectiveness of spammed stock touting calls into question prevailing models of securities regulation that rely principally on the proper labeling of information and disclosure of conflicts of interest as means of protecting consumers, and we propose several regulatory and industry interventions. Based on a large sample of touted stocks listed on the Pink Sheets quotation system and a large sample of spam emails touting stocks, we find that stocks experience a significantly positive return on days prior to heavy touting via spam. Volume of trading responds positively and significantly to heavy touting. For a stock that is touted at some point during our sample period, the probability of it being the most actively traded stock in our sample jumps from 4% on a day when there is no touting activity to 70% on a day when there is touting activity. Returns in the days following touting are significantly negative. The evidence accords with a hypothesis that spammers "buy low and spam high," purchasing penny stocks with comparatively low liquidity, then touting them - perhaps immediately after an independently occurring upward tick in price, or after having caused the uptick themselves by engaging in preparatory purchasing - in order to increase or maintain trading activity and price enough to unload their positions at a profit. We find that prolific spamming greatly affects the trading volume of a targeted stock, drumming up buyers to prevent the spammer's initial selling from depressing the stock's price. Subsequent selling by the spammer (or others) while this buying pressure subsides results in negative returns following touting. Before brokerage fees, the average investor who buys a stock on the day it is most heavily touted and sells it 2 days after the touting ends will lose close to 5.5%. For those touted stocks with above-average levels of touting, a spammer who buys on the day before unleashing touts and sells on the day his or her touting is the heaviest, on average, will earn 4.29% before transaction costs. The underlying data and interactive charts showing price and volume changes are also made available.
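
    The headline numbers reduce to simple event-window arithmetic. In the sketch below, the price path is invented (chosen so the two trades land near the paper's reported averages); only the buy-and-sell timing follows the paper's description.

        # Worked illustration of the event-window returns reported above. The
        # price path is invented, chosen so the two trades land near the
        # paper's averages; only the buy/sell timing follows its description.

        prices = {
            "day_before_touting": 0.070,  # spammer accumulates quietly
            "heaviest_touting":   0.073,  # spam-driven buying lifts the price
            "two_days_after":     0.069,  # price sags once touting stops
        }

        def simple_return(buy: float, sell: float) -> float:
            return (sell - buy) / buy

        # Victim: buys on the heaviest touting day, sells two days after it ends.
        victim = simple_return(prices["heaviest_touting"], prices["two_days_after"])

        # Spammer: buys the day before touting, sells into the heaviest touting.
        spammer = simple_return(prices["day_before_touting"], prices["heaviest_touting"])

        print(f"victim return:  {victim:+.2%}")   # about -5.5%, as in the paper
        print(f"spammer return: {spammer:+.2%}")  # about +4.29%, as in the paper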
