via Wired

By Susan Crawford

[Image: bird's-eye view of a large city in black and white, showing streets and cars. Credit: Diane Bentley Raymond/Getty Images]

The current state of rules for use of facial recognition technology is, literally, all over the map. Next month, the city council in Portland, Oregon, will hold a public meeting about blocking use of the technology by private companies, as well as by the government. San Francisco; Oakland, California; and Somerville, Massachusetts, have already banned the use of facial recognition technology by city agencies; Seattle's police stopped using it last year; and Detroit has said facial recognition can be used only in connection with investigation of violent crimes and home invasions (and not in real time).

State governments have their own rules too. In October, California joined New Hampshire and Oregon in prohibiting law enforcement from using facial recognition and other biometric tracking technology in body cameras. Illinois passed a law that permits individuals to sue over the collection and use of a range of biometric data, including fingerprints and retinal scans as well as facial recognition technology. Washington and Texas have laws similar to the one in Illinois, but don’t allow for private suits.

In other words, we’re headed for a major clash. The potential benefits of facial recognition, and biometric data generally, are just too great for governments and corporations to pass up. Existing bans of public-sector use that are based on its present, inaccurate, and discriminatory implementations likely won’t be sustainable long-term as the technology improves. At the same time, completely unfettered use of private biometric systems seems incompatible with American values. We’re not China, or at least not yet.

This situation is crying out for policy development: Government needs to act to determine where the lines of appropriate use should be drawn. This is not likely to happen on the federal level, though, anytime soon: Even as pressure from activists builds, Congress has so far been unable to pass even a basic federal online privacy law; this month’s House Oversight Committee hearing on facial recognition has just been punted to next year. (A proposed bipartisan bill to constrain the use of the technology by federal law enforcement officers would address just a sliver of the issues raised by the use of biometric identifiers.) That leaves the issues to be worked out in different ways in different places, as a patchwork of local laws. Tech and telecom companies often moan about just this sort of outcome, complaining that it makes compliance difficult and drives up production costs—but in this case, it’s a good thing.

When federal policy is absent, ham-handed, or hopelessly captured by industry, local governments can act as testing grounds for new ideas, providing proof that the status quo can change. This is not a new idea: As Supreme Court Justice Louis Brandeis wrote in 1932, a “state may, if its citizens choose, serve as a laboratory; and try novel social and economic experiments without risk to the rest of the country.” That approach—of using local laws as laboratory trials—worked when it came to spreading the power grid across the country. States and localities led the way in making electricity a publicly governed utility. The same thing happened in health care: Former Massachusetts Governor Mitt Romney has said that “without Romneycare [in Massachusetts] we wouldn’t have had Obamacare.”

The patchwork can work for tech too. In October, the federal appeals court for the District of Columbia circuit issued a 186-page opinion allowing states to continue to impose their own “open internet” laws and executive orders in the absence of any federal regulation of high-speed internet access. As telecom commentator Harold Feld wrote, this gives the industry “significant incentive to stop fooling around and offer real concessions to get some sort of federal law on the books.” In other words, the patchwork is usefully painful for companies: The agony stimulates them to come to the table.

Similarly, as I described earlier this year in my book, Fiber: The Coming Tech Revolution―and Why America Might Miss It, hundreds of cities and localities across the country have taken their destinies into their own hands by calling for the construction of fiber-optic internet access networks. They're not waiting for the federal government to act to make world-class fiber a basic element of a thriving life. Instead, the ubiquitous, reasonably priced public option that cities have been pushing will—someday—shame national policymakers into action. It's clearly possible to have sensible communications policy, but it takes action at the local level to make it happen.

So we should be glad to have all these local takes on the ethics of biometric data use. Thank goodness that Somerville, with its public-sector ban, applies a different logic than, say, Plano, Texas, which has enthusiastically adopted facial recognition technology with little public oversight. Thank goodness Portland is looking at a wholesale ban on commercial facial recognition technology within its borders. All of these places can do the hard work of figuring out where use of facial recognition and other biometric data by either private companies or public bodies is unethical, inappropriate, or immoral.

As more Somervilles, Planos, and Portlands decide on their different approaches to biometric identifiers, the public will continue to focus on this issue—and that will keep the pressure on both companies and government to reach a much-needed, national consensus on the use of biometric data. The hope is that someday, when all the good arguments are on the table and the pain of vendor compliance with a continued patchwork is too great to bear, the federal government will be shamed by the existence of good local laboratory test cases into adopting strong, basic rules for data use.

These might include: sharply constraining real-time use (as opposed to forensic or investigative use with a warrant in the criminal justice system) of biometrics for any purpose; permitting easy opt-outs from the use of biometric data for commercial purposes; greatly limiting the retention of all biometric data; requiring continued, intrusive auditing of (and public reporting about) the use of biometric data by both companies and government; swiftly punishing misuse of this data; and prohibiting biometric use in particular contexts that are prone to discriminatory activities, such as selecting people for particular jobs, insuring them, or admitting them to educational programs. That list is just a start. We have a great deal of policy work to do.

If we end up with sensible national policies constraining the use of biometric data—which is by no means certain—it will largely be thanks to the role of local government in America.
