Niva Elkin-Koren & Maayan Perel, Accountability in Algorithmic Copyright Enforcement, 20 Stan. Tech. L. Rev. (forthcoming 2017).
Recent years have witnessed a growing use of algorithmic law enforcement by online intermediaries. Because they facilitate the distribution of online content, online intermediaries offer a natural point of control for monitoring access to illegitimate content, which makes them ideal partners for performing civil and criminal enforcement. Copyright law has been at the forefront of algorithmic law enforcement since the late 1990s, when the Digital Millennium Copyright Act (DMCA) conferred safe harbor protection on online intermediaries that remove allegedly infringing content upon notice. Over the past two decades, the Notice and Takedown (N&TD) regime has become ubiquitous and embedded in the system design of all major intermediaries: major copyright owners increasingly deploy robots to send immense volumes of takedown requests – a practice recently endorsed by the Ninth Circuit in Lenz v. Universal Music Corp. – while major online intermediaries, in response, use algorithms to filter, block, and disable access to allegedly infringing content automatically, with little or no human intervention. Algorithmic enforcement by online intermediaries reflects a fundamental shift in our traditional system of governance. It effectively converges law enforcement and adjudication powers in the hands of a small number of mega-platforms: profit-maximizing, and possibly biased, private entities. Yet, notwithstanding their critical role in shaping access to online content and facilitating public discourse, intermediaries are hardly held accountable for algorithmic enforcement. We simply do not know which allegedly infringing material triggers the algorithms, how decisions regarding content restrictions are made, who makes those decisions, and how targeted users might affect them. Lessons drawn from algorithmic copyright enforcement by online intermediaries offer a valuable case study for addressing these concerns.
As we demonstrate, algorithmic copyright enforcement by online intermediaries lacks sufficient measures to assure accountability – namely, the extent to which decision makers are expected to justify their choices, are answerable for their actions, and are held responsible for their failures and wrongdoings. This Article proposes a novel framework for analyzing accountability in algorithmic enforcement based on three factors: transparency, due process, and public oversight. It identifies the accountability deficiencies in algorithmic copyright enforcement and maps the barriers to enhancing accountability, including technical barriers of non-transparency and machine learning, legal barriers that disrupt the development of algorithmic literacy, and practical barriers. Finally, the Article explores current and possible strategies for enhancing accountability by increasing public scrutiny and promoting transparency in algorithmic copyright enforcement.
Niva Elkin-Koren & Maayan Perel, Black Box Tinkering: Beyond Transparency in Algorithmic Enforcement, Fla. L. Rev. (forthcoming 2017).
The pervasive growth of algorithmic enforcement magnifies current debates over the virtues of transparency. Not only does using code to conduct robust online enforcement amplify the settled problem of magnitude, or "too much information," often associated with present-day disclosures; it also imposes additional practical difficulties on relying on transparency as an adequate check on algorithmic enforcement. In this Essay we explore the virtues of black box tinkering methodology as a means of generating accountability in algorithmic systems of online enforcement. Given the far-reaching implications of algorithmic enforcement of online content for public discourse and fundamental rights, we advocate active public engagement in checking the practices of automatic enforcement systems. Accordingly, we explain the inadequacy of transparency in generating public oversight. First, the complex computer code that underlies algorithms is very difficult to read, follow, and predict, as it is inherently non-transparent and capable of evolving according to different patterns of data. Second, mandatory transparency requirements are irrelevant to many private implementations of algorithmic governance, which are subject to trade secrecy. Third, algorithmic governance is so robust that, even without mandatory transparency, it is impossible to review all the information already disclosed. Fourth, when algorithms are called on to replace humans in making determinations that involve discretion, transparency about the algorithms' inputs (the facts) and outputs (the outcomes) is not enough to allow adequate oversight, because a given legal outcome does not necessarily yield sufficient information about the reasoning behind it. We then establish the benefits of black box tinkering as a proactive methodology that encourages social activism, drawing on a recent study of copyright enforcement practices by online intermediaries.
That study sought to test systematically how hosting websites implement copyright policy by examining the conduct of popular local image-sharing and video-sharing platforms. Specifically, different types of infringing, non-infringing, and fair use materials were uploaded to various hosting facilities, each upload designed to trace the choices made by the black box system throughout its enforcement process. The study's findings demonstrate that hosting platforms are inconsistent, and therefore unpredictable, in detecting online infringement and enforcing copyright: some platforms allow content that is filtered by others; some strictly respond to any notice requesting removal of content, even when the content is clearly non-infringing, while others fail to remove content upon notice of alleged infringement. Moreover, many online mechanisms of algorithmic copyright enforcement do very little to minimize errors and to ensure that interested parties do not abuse the system to silence legitimate speech and over-enforce copyright. Finally, the findings indicate that online platforms do not make a full effort to secure due process and to allow affected individuals to follow, and promptly respond to, proceedings that manage their online submissions. Based on these findings, we conclude that black box tinkering methodology could offer an invaluable grasp of algorithmic enforcement practices on the ground. We therefore evaluate the possible legal implications of this methodology and propose means to address them.
Niva Elkin-Koren, The New Frontiers of User Rights, 32 Am. U. Int’l L. Rev. 1 (2016).