Using the Tools of the Trade: The Campaign to Stop Killer Robots
Last summer, after two years at Harvard Law School, I elected to take a leave of absence to join President Obama’s re-election campaign. My decision had less to do with any affinity for the President and more to do with my disillusionment with law school in general. I had enrolled with aspirations to enter public service, believing that by simply attending classes in the same buildings where Charles Hamilton Houston, the famed civil rights lawyer, had once studied, I’d follow in his footsteps.
After a month of lectures about water property lines, chicken sexing, and figuring out whether a tomato was a fruit or a vegetable, I began to question whether law school was really the right choice for me. If my goal was to combat systemic inequities, could an education that focused on how to work within the status quo—rather than challenge it—be the best path? As Audre Lorde famously put it, the master’s tools will never dismantle the master’s house. But is that true?
In London recently, I had the opportunity to find out. I traveled there with a team from the International Human Rights Clinic, which I joined after returning to HLS in January. For the past few months, we had been working on the controversial topic of fully autonomous weapons, which are essentially drones that can target and kill without any human intervention. These weapons don’t exist yet, but technology is moving rapidly in that direction, and precursors are already in use.
A coalition of nongovernmental organizations (NGOs) had gathered to launch a campaign to ban these “killer robots,” and I was there with my clinical supervisor, Bonnie Docherty, also a senior arms researcher at Human Rights Watch, to participate in it. At a pre-launch forum for campaigners, Docherty was busy giving a presentation in one room while I slipped into a session on the ethics involved with fully autonomous weapons.
“And this brings me to my last point of the presentation,” said science philosopher and robot ethicist Peter Asaro, sitting in his swivel chair. “These fully autonomous weapon systems – killer robots – will have the authority to both target and eliminate anyone, with no regard for due process.” Asaro’s allusion to basic legal doctrine surprised me. I glanced around the room, wanting to make sure no one else wanted to speak before I raised my hand.
“What do you mean by ‘due process’?” I asked, then paused to collect my thoughts. Truth be told, two years after Professor John Goldberg first lectured section 3 on the subject, my command of the doctrine was rusty. “Don’t get me wrong,” I continued. “I reckon it’s a salient point, and I agree that due process should enter the calculus of our critique, but just how much process is due? How do we go about defining it on an international scale?”
Asaro nodded and reclined in his chair. “Leave it to the lawyer to ask the tough questions. Look, more research is definitely needed – our work isn’t done yet –”
“And that’s why we’re all here,” Jody Williams interjected from her seat beside me. Sixteen years earlier, Williams had won the Nobel Peace Prize for her work banning landmines. The “we” she referred to that day was a collection of some of the world’s leading disarmament and human rights activists—and me, a 3L.
Over the next two days in London, I had the privilege to both witness and collaborate in the launch of a campaign we hope will lead to a treaty to ban fully autonomous weapons preemptively. It was both fascinating and inspiring to see international activists and renowned scientists working together. The challenge was great: develop arguments that use hard scientific research but appeal to a broad audience while maintaining language that conveys righteous indignation over international law violations. Watching it all come together, I felt I’d come a long way from pondering chicken sexing in torts class.
That first day, coalition members discussed every aspect of the campaign – from logo design to the language they hoped to see in an eventual treaty. Then, on day two, they moved on to the media phase. Reporters from a range of news outlets filed into the Frontline Club for a press conference, while campaigners tweeted about the launch under the hashtag “#killerrobots.” Within hours, stories about the campaign appeared everywhere from CNET to the Huffington Post to the BBC. There was a photo-op with a friendly robot outside of Parliament, followed by a briefing in the Palace of Westminster, where the House of Commons and House of Lords meet.
Reflecting on the trip, I keep coming back to my conversation with Docherty at the end of our first full day in London.
“How’d the ethics discussion go?” she asked after Peter Asaro’s presentation at the NGO forum concluded.

“It went well, I think,” I told her. “But I’m sorta confused about this issue of due process in international humanitarian law. Can you walk me through it?”
As she explained it, the utility of the master’s tools became more and more evident to me. A graduate of Harvard Law School herself, Docherty has for years been using those same tools to advocate for human rights and civilian protection in war, notably during the successful campaign to ban cluster munitions. In a room full of political activists, ethicists, and scientists, I could see the important role lawyers play in building the frameworks that protect human and civil rights worldwide. It may have taken a while, but thanks to the International Human Rights Clinic, I now know how to begin using these tools, and I’m ready to get started.
Read the Clinic’s joint report with Human Rights Watch on fully autonomous weapons here. Learn more about the Campaign to Stop Killer Robots here.