  • This essay responds to Orin S. Kerr, Searches and Seizures in a Digital World, 119 Harv. L. Rev. 531 (2005), http://ssrn.com/abstract=697541. Professor Kerr has published a thorough and careful article on the application of the Fourth Amendment to searches of computers in private hands - a treatment that has previously escaped the attentions of legal academia. Such a treatment is perhaps so overdue that it has been overtaken by two phenomena: first, the emergence of an overriding concern within the United States about terrorism; and second, changes in the way people engage in and store their most private digital communications and artifacts. The first phenomenon has foregrounded a challenge by the President to the very notion that certain kinds of searches and seizures may be proscribed or regulated by Congress or the judiciary. The second phenomenon, grounded in the mass public availability of always-on Internet broadband, is leading to the routine entrustment of most private data to the custody of third parties - something orthogonal to a doctrinal framework in which the custodian of the matter searched, rather than the person who is the real target of interest of a search, is typically the only one capable of meaningfully asserting Fourth Amendment rights to prevent a search or the use of its fruits. Together, these phenomena make the application of the Fourth Amendment to the standard searches of home computers - searches that, to be sure, are still conducted regularly by national and local law enforcement - an interesting exercise that is yet overshadowed by greatly increased government hunger for private information of all sorts, both individual and aggregate, and by rapid developments in networked technology that will be used to satisfy that hunger. Perhaps most important, these factors transform Professor Kerr's view that a search occurs for Fourth Amendment purposes only when its results are exposed to human eyes: such a notion goes from unremarkably unobjectionable - police are permitted to mirror a suspect's hard drive entirely and then are constitutionally limited as they perform searches on the copy - to dangerous to any notion of limited government powers. Professor Kerr appreciates this as a troublesome result - indeed, downright creepy - but does not dwell upon it beyond suggesting that the copying of data might be viewed as a seizure if not a search, at least so long as it involves some physical touching or temporary commandeering of the machine. This view should be amplified: If remote vacuum-cleaner approaches are used to record and store potentially all Internet and telephone communications for later searching, with no Fourth Amendment barrier to the initial information-gathering activity in the field, the government will be in a position to perform comprehensive secret surveillance of the public without any structurally enforceable barrier, because it will no longer have to demand information in individual cases from third parties or intrude upon the physical premises or possessions of a search target in order to gather information of interest. The acts of intruding upon a suspect's demesnes or compelling cooperation from a third party are natural triggers for judicial process or public objection. If the government has all necessary information for a search already in its possession, then we rely only upon its self-restraint in choosing the scope and depth of otherwise unmonitorable searching.
    This is precisely the self-restraint that the Fourth Amendment eschews for intrusive government searches by requiring outside monitoring by disinterested magistrates - or individually exigent circumstances in which such monitoring can be bypassed. Taken together, the current areas of expansion of surveillance appear permanent rather than exigent, and sweeping rather than focused, causing the justifications behind special needs exceptions to swamp the baseline protections established for criminal investigations. This expansion stands to remove the structural safeguards designed to forestall the abuse of power by a government that knows our secrets.

  • Derek E. Bambauer, Ronald J. Deibert, John F. Palfrey, Rafal Rohozinski, Nart Villeneuve & Jonathan L. Zittrain, Internet Filtering in China in 2004-2005: A Country Study (Apr. 15, 2005).

    China's Internet filtering regime is the most sophisticated effort of its kind in the world. Compared to similar efforts in other states, China's filtering regime is pervasive, sophisticated, and effective. It comprises multiple levels of legal regulation and technical control. It involves numerous state agencies and thousands of public and private personnel. It censors content transmitted through multiple methods, including Web pages, Web logs, on-line discussion forums, university bulletin board systems, and e-mail messages. Our testing found efforts to prevent access to a wide range of sensitive materials, from pornography to religious material to political dissent. We sought to determine the degree to which China filters sites on topics that the Chinese government finds sensitive, and found that the state does so extensively. Chinese citizens seeking access to Web sites containing content related to Taiwanese and Tibetan independence, Falun Gong, the Dalai Lama, the Tiananmen Square incident, opposition political parties, or a variety of anti-Communist movements will frequently find themselves blocked. Contrary to anecdote, we found that most major American media sites, such as CNN, MSNBC, and ABC, are generally available in China (though the BBC remains blocked). Moreover, most sites we tested in our global list's human rights and anonymizer categories are accessible as well. While it is difficult to describe this widespread filtering with precision, our research documents a system that imposes strong controls on its citizens' ability to view and to publish Internet content. This report was produced by the OpenNet Initiative, a partnership among the Advanced Network Research Group, Cambridge Security Programme at Cambridge University, the Citizen Lab at the Munk Centre for International Studies, University of Toronto, and the Berkman Center for Internet & Society at Harvard Law School.

  • Jonathan L. Zittrain, Normative Principles for the Evaluation of Free and Proprietary Software, 71 U. Chi. L. Rev. 265 (2004).

    The production of most mass-market software can be grouped roughly according to free and proprietary development models. These models differ greatly from one another, and their associated licenses tend to insist that new software inherit the characteristics of older software from which it may be derived. Thus the success of one model or another can become self-perpetuating, as older free software is incorporated into later free software and proprietary software is embedded within successive proprietary versions. The competition between the two models is fierce, and the battle between them is no longer simply confined to the market. Claims of improper use of proprietary code within the free GNU/Linux operating system have resulted in multi-billion dollar litigation. This article explains the ways in which free and proprietary software are at odds, and offers a framework by which to assess their value - a prerequisite to determining the extent to which the legal system should take more than a passing, mechanical interest in the doctrinal claims now being pressed against GNU/Linux specifically and free software generally.

  • Use this short secondary text with any torts casebook to give your students a demonstration of the practical dimensions of tort law alongside the doctrine they are already learning.

  • The authors are collecting data on the methods, scope, and depth of selective barriers to Internet access through Chinese networks. Tests from May 2002 through November 2002 indicate at least four distinct and independently operable methods of Internet filtering, with a documentable leap in filtering sophistication beginning in September 2002. The authors document thousands of sites rendered inaccessible using the most common and longstanding filtering practice. These sites were found through connections to the Internet by telephone dial-up link and through proxy servers in China. Once so connected, the authors attempted to access approximately two hundred thousand web sites. The authors tracked 19,032 web sites that were inaccessible from China on multiple occasions while remaining accessible from the United States. Such sites contained information about news, politics, health, commerce, and entertainment. The authors conclude (1) that the Chinese government maintains an active interest in preventing users from viewing certain web content, both sexually explicit and non-sexually explicit; (2) that it has managed to configure overlapping nationwide systems to effectively - if at times irregularly - block such content from users who do not regularly seek to circumvent such blocking; and (3) that such blocking systems are becoming more refined even as they are likely more labor- and technology-intensive to maintain than cruder predecessors.

  • We collected data on the methods, scope, and depth of selective barriers to Internet usage through networks in China. Tests conducted from May through November 2002 indicated at least four distinct and independently operable Internet filtering methods - Web server IP address, DNS server IP address, keyword, and DNS redirection - with a quantifiable leap in filtering sophistication beginning in September 2002.
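
    A minimal sketch of the kind of vantage-point comparison such testing implies appears below. It is illustrative only, not the authors' actual methodology or tooling; the test domain and resolver addresses are placeholders, and it assumes the third-party dnspython library is available.

```python
# Illustrative sketch only -- not the authors' testing harness. It shows how one
# might distinguish two of the filtering methods named above (DNS redirection vs.
# Web server IP address blocking) by comparing how a domain resolves and whether
# its server answers. Resolver IPs and the test domain are placeholders.
import urllib.request

import dns.resolver  # dnspython (third-party); assumed available

TEST_DOMAIN = "example.org"        # placeholder for a site on the test list
OUTSIDE_RESOLVER = "8.8.8.8"       # a resolver outside the filtered network
INSIDE_RESOLVER = "203.0.113.53"   # placeholder: a resolver reachable inside it


def resolve(domain: str, resolver_ip: str) -> set[str]:
    """Return the A records a given resolver hands back for the domain."""
    res = dns.resolver.Resolver(configure=False)
    res.nameservers = [resolver_ip]
    try:
        return {rr.address for rr in res.resolve(domain, "A", lifetime=5)}
    except Exception:
        return set()


def reachable(ip: str, host: str) -> bool:
    """Attempt a plain HTTP fetch against a specific IP, preserving the Host header."""
    try:
        req = urllib.request.Request(f"http://{ip}/", headers={"Host": host})
        urllib.request.urlopen(req, timeout=5)
        return True
    except Exception:
        return False


outside = resolve(TEST_DOMAIN, OUTSIDE_RESOLVER)
inside = resolve(TEST_DOMAIN, INSIDE_RESOLVER)

if inside and inside != outside:
    print("Resolvers disagree: consistent with DNS redirection.")
elif outside and not any(reachable(ip, TEST_DOMAIN) for ip in outside):
    print("Correct DNS answers, but the server is unreachable: consistent with "
          "Web server IP address blocking.")
else:
    print("No filtering signature observed for this domain from this vantage point.")
```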

  • Jonathan L. Zittrain, Be Careful What You Ask For: Reconciling a Global Internet and Local Law, in Who Rules the Net? Internet Governance and Jurisdiction (Adam Thierer & Wayne Crews eds., Cato Institute 2003).

    This book considers the threats to free speech and online commerce posed by governments around the world attempting to impose their territorial statutes and standards within cyberspace.

  • The first session focused on the role, value, and limits of scientific and technical data and information in the public domain. This was followed in the second session by an overview of the pressures on the public domain.

  • The authors are studying Internet filtering worldwide, primarily in the context of affirmative actions by governments to restrict the sites viewed by their respective users. We are also interested in requests or demands to private parties that they assist in preventing particular citizens' exposure to locally unwanted or illicit data or activities. A well-known example is the attempt by the French judiciary, acting on the complaint of a French NGO, to prevent those on French territory from viewing Yahoo auctions that include the display of Nazi memorabilia, and Yahoo's response.

  • Lawrence Lessig, Jonathan L. Zittrain, Charles R. Nesson, William F. Fisher & Yochai Benkler, Internet Law (Found. Press 2002).

  • In the spring of 1998, the U.S. government told the Internet: Govern yourself. This unfocused order - a blandishment, really, expressed as an awkward "statement of policy" by the Department of Commerce, carrying no direct force of law - came about because the management of obscure but critical centralized Internet functions was at a political crossroads. This essay reviews Milton Mueller's book Ruling the Root, and the ways in which it accounts for what happened both before and after that crossroads.

  • Jonathan L. Zittrain, ICANN: Between the Public and the Private, in The Best in E-Commerce Law (Warren Agin ed., 2001).

  • Jonathan L. Zittrain, What the Publisher Can Teach the Patient: Intellectual Property and Privacy in an Era of Trusted Privication, 52 Stan. L. Rev. 1201 (2000).

  • Current tax law makes it difficult to enforce sales taxes on most Internet commerce and has generated considerable policy debate. In this paper we analyze the costs and benefits of enforcing such taxes, including revenue losses, competition with retail, externalities, distribution, and compliance costs. The results suggest that the costs of not enforcing taxes are quite modest and will remain so for several years. At the same time, compliance costs are also likely to be low. There are benefits to nurturing the Internet, but they tend to diminish over time. When tax costs and benefits take this form, a moratorium provides a natural compromise.

  • Current tax law makes it difficult to enforce sales taxes on most Internet commerce and has generated considerable policy debate. In this paper we analyze the costs and benefits of enforcing such taxes, including revenue losses, competition with retail, externalities, distribution, and compliance costs. The results suggest that the costs of not enforcing taxes are somewhat modest and will remain so for several years. At the same time, compliance costs and the benefits of nurturing the Internet diminish over time. When tax costs and benefits take this form, a moratorium provides a natural compromise.

  • The model of university as producer of knowledge-as-product-for-sale is a closed one. Knowledge is treated as property to be copyrighted, patented, classified, licensed, and litigated. Under this closed model, creative work cannot progress without negotiations about license fees (the ambit of legal "fair use" at a minimum). As faculty become work-for-hire, money becomes the currency of the campus, and legality the dominant feature of relationships. Under this model, the nature of Harvard will change fundamentally - for the worse, I think. The community of scholars at the heart of the academy trades riches for a comfortable, secure environment in which to think, research, and teach. This community, composed of intellectuals who do not hold money paramount, will be oppressed by a commercial/legal environment. The Berkman Center aspires to demonstrate a different model - open IT, we call it. We encourage cooperative work dedicated to the open domain. Faculty, students, staff, alumni, relatives, and friends are permitted and encouraged (though not required) to work together in the public interest. Intellectual community and creative process is our product, knowledge the by-product. This approach galvanizes spirit and produces educational works of great distinction and wide public utility. Furthermore, this model maintains the community of scholars while avoiding the meanness of money and licenses. It will enhance the prestige of the institutions that contribute and become part of it. But there are questions.

  • Jonathan Zittrain, The Un-Microsoft Un-Remedy: Law Can Prevent the Problem It Can't Patch, 31 Conn. L. Rev. 1361 (1999).

    Microsoft has brilliantly exploited its current control of the personal computer operating system (OS) market to grant itself advantages towards controlling tomorrow's operating system market as well. This is made possible by the control Microsoft has asserted over user "defaults," a power Microsoft possesses thanks to a combination of (1) Windows' high market share, (2) the "network effects" that make switching to an alternative so difficult for any given consumer or computer manufacturer, and (3) software copyright, which largely prevents competitors from generating software that defeats network effects. The author suggests a much-reduced term of copyright for computer software--from 95 years to around five years--as a means of preventing antitrust problems before they arise.

  • Chief Justice Earl Warren was a great sports fan and watched a lot of sports on TV. Perhaps for that reason, he was appalled at the idea of cameras in the courtroom. Thirty years ago, he brought the Supreme Court within one vote of declaring televised criminal trials to be unconstitutional. The Hollywoodization of trials would be their undoing, he wrote, because the search for truth would be subverted into the search for ratings. He worried that trial participants would behave differently under the glare of publicity, and that those watching trials on TV would get a distorted view of the American justice system. They might even be driven to doubt it.