At a talk hosted by the Berkman Center for Internet & Society on June 23, Mitali Thakor, a PhD student in MIT’s HASTS program and a Berkman affiliate, discussed her findings on techniques and strategies for preventing and prosecuting child exploitation and human trafficking, and how new digital approaches to addressing these issues affect young people online.

Thakor opened her talk with an example of the kind of policing she studies: An NGO based in the Netherlands designed an avatar, named Sweetie, that looked like a 10-year-old girl from the Philippines. Posing as the young girl in chat rooms, the group used the avatar to collect the email and IP addresses of sexual predators. After the group handed this identifying information over to Interpol, the International Criminal Police Organization, predators from around the world were brought to justice, and police in the Philippines cracked down on child exploitation within the country.

On the surface, said Thakor, this seems like a good thing. But, she said, the purpose of her research is to show that it’s not that simple.

“It’s really hard to be in the position where I’m critiquing the people trying to do good activist work but I think that’s really necessary,” she said in an interview after the discussion. “There was so much lost in how people understood the issue, and that’s where I really started getting concerned.”

According to Thakor, the current system—centered on law enforcement and made up of NGOs, large corporations such as Microsoft, and the U.N.—reinforces larger societal problems that together contribute to human trafficking.

“It’s not about the individual person who is engaging in risky behavior. It’s about a larger culture,” she said.

According to Thakor, trafficking and exploitation result from misguided immigration policies and labor laws that contribute to massive structural poverty and inequality.

In her dissertation, Thakor hopes to present a critical perspective on the current system for dealing with human trafficking by using a feminist lens and focusing on the rights of young people online. Her research will also examine the technology that goes into projects to catch predators and missing children.

“Can we look at people in positions of power [and] understand what motivates them to do certain things?” Thakor said. “That’s a very feminist project, to think about projects of race, gender, privilege in the way that technologies are designed. Technology is not neutral. It never is. And so software is never neutral.”

Her research combines ideas on immigration, sex work, technology and the digital space occupied by young people. She’s interested not only in the process of catching sexual predators and finding lost children, but also in how adult decision-makers talk about young people when they’re policing these spaces.

“I think we’re going towards a model where there’s a lot of technophobia and techno panics around young people’s use of the Internet,” she said, using as an example the Malaysian government’s extreme censorship of anything it considers morally reprehensible.

Because her research is focused on how we talk about children and their digital lives, Thakor said she’s forced to consider other issues related to the technology, such as the potential for exploitation and even surveillance. When companies develop facial recognition tools that can recognize a young nude body on Facebook, where else will they use that software? What assumptions are they making about the young skin they’re trying to identify? And with the Internet an ever-larger part of young people’s lives, how are these developers talking about the children they are trying to protect? Within the use of technology lie underlying questions of surveillance, marginalization and universal rights.

“There’s a lot of work that needs to be done on re-conceptualizing children, childhood and young people in digital spaces in terms of healthy sexuality and a harm reduction model,” she said. “A lot of the policies today are still based on much older ideas of sex education and access to information and how we think of young people. If you have a fear-based model of sexuality and sex education, then that’s just going to get replicated with Internet policy.”