Gone are the days of mundane status updates, “likes,” and keeping tabs on high school friends. Today, Americans, exhausted by a wave of scandals revealing the platforms’ mishandling of personal data, amplification of disinformation, and opaque advertising practices, have decidedly soured on social media apps like Facebook, Instagram, and Twitter. Still, 69% of Americans say they use Facebook, and nearly 119 million log onto Instagram, a photo-sharing app, at least once a month.
In September, Facebook’s reputation took another hit when The Wall Street Journal published internal documents showing, among other revelations, that the company knew more than it let on about the detrimental impact of Instagram, which it owns, on some teenagers’ mental health. In the leaked reports, which Facebook has said were incomplete and taken out of context, the company’s researchers found that young people, and teenage girls in particular, often described feeling more anxious or depressed when using Instagram, and that the app made body image issues worse for one in three girls.
Worse, Facebook has largely declined to address the issues it uncovered, said Frances Haugen, a whistleblower and former product manager at the company, while testifying before a bipartisan Senate committee last week about the documents she had provided to the Journal. The result has been renewed scrutiny and calls for regulation of the powerful social media platforms that dominate so much of our lives.
“Anyone who has been paying attention to the social media debate over the last decade or half-decade wouldn’t be totally shocked by any of the harms that were detailed in the Facebook Files,” says evelyn douek LL.M. ’17, lecturer on law and S.J.D. candidate at Harvard Law School. “It’s not news that these platforms have potentially harmful effects on minors, or that they privileged certain high-profile users. But what was really different in this case was that it was based on Facebook’s own internal research, and it had the voices of employees inside the company saying, ‘We are not doing the things that we are saying we are doing publicly.’”
John Palfrey ’01, president of the MacArthur Foundation and HLS visiting professor of law, says that the leaks confirmed what he had found in his own research at the Berkman Klein Center for Internet & Society. “Our data show that for most young people, using social media is a net positive, a welcome part of their social and emotional life. But it is also clear that for some young people, particularly those who are suffering from eating disorders, body image issues, or fear of missing out on socialization, it can be extremely damaging.”
Those problems can be magnified, says douek, by Instagram’s algorithms, which are built to decipher a user’s interests and then feed them increasingly extreme content to keep them coming back to the app. “A user will go on the platform and start looking at certain kinds of content — say they look at the occasional diet product — and then the platform works out that they like this kind of thing. It keeps serving more of it so they stay on the platform longer and it can collect more advertising revenue,” she says. That targeting means a user could end up seeing more and more weight-loss and fitness-related images over time, exacerbating existing mental health and self-esteem challenges.
Instagram, of course, is not the first communications vehicle to come under fire for its impact on America’s children, nor is it the first time Congress has questioned how to respect free enterprise while protecting minors.
“Decades ago, as television became ubiquitous and kids accrued their screen time there, the government pushed broadcast networks to temper kids’ programming with educational content, and to shift material not suitable for kids to later in the evening,” says Jonathan Zittrain ’95, George Bemis Professor of International Law and faculty director of the Berkman Klein Center for Internet & Society. “These restrictions seem quaint today, even as services like Instagram exert a far stronger pull on kids’ attention, and tailor what they offer not to any educational or public interest goal, but to dependency.”
And yet, Zittrain adds, nothing Facebook is doing is necessarily illegal. Current U.S. law requires parental consent to collect data about children under 13 but is mostly hands-off with everyone else. “At least for older kids, this is all not only within the bounds of the law, but restrictions or guidance of the sort weakly deployed for television would have to weather First Amendment challenges if applied to social media.”
douek agrees. “One of the things that this current situation is revealing is that Facebook can do a lot of things within the law that it knows are potentially harmful,” she says.
Palfrey says that in the early days of the internet, Americans — and Congress — were loath to introduce regulations that could diminish free speech or stifle innovation around the nascent technology. Today, he says, it is time to “stop thinking about internet companies as something that must be unregulated, and move into a mode where we see them as something akin to public utilities.” Indeed, Palfrey urges a move toward “ensuring a degree of disclosure of the things happening on platforms that have a public policy implication.”
douek says that even if there is general agreement that platforms harm some teenagers, that won’t necessarily mean an easy accord on solutions. Because the First Amendment bars content-based limitations on speech (that is, restrictions based on subject matter or viewpoint), “It’s really difficult to tell platforms that they can’t host certain kinds of content. You can’t outlaw ‘thinspiration’ or diet content or anything like that,” she says.
Yet ideas for reform are plentiful. Some, like Haugen, the Facebook whistleblower, have suggested altering Section 230 of the Communications Decency Act, which shields social media companies from liability for what their users say, to remove protections for platforms’ decisions about the design of their algorithms. douek favors other alternatives.
“Nate Persily of Stanford has a proposal for getting external researchers access to this internal data, which Facebook employees are using to do the studies referenced in the Facebook Files,” she says. “We’re still operating on the basis of imperfect information right now. We have these internal studies that have been leaked; Facebook says it’s not the full picture, but Facebook won’t release the full picture. We need to work out exactly what the problems are before we try and solve them.”
douek says Congress could also opt to enact privacy laws to ban behavior-based content targeting. In other words, it could try to reduce or eliminate the power the apps’ algorithms have to wield users’ data to determine which content they see.
How we ultimately decide to regulate these platforms is critical, says Palfrey. “The class I teach at Harvard Law School encourages students to ask themselves, ‘How do you determine what the policy goals should be? And then what are the right mechanisms to regulate that?’” he says. “I think we need to do more to protect people, and in particular, the most vulnerable — young people, which is what Facebook’s own data is screaming.”
Even if Congress chooses not to act right away — or opts to do more research — douek says people shouldn’t discount the power of public shaming in bringing social media companies to heel. She points to Facebook delaying the launch of Instagram Kids, which was intended for children under age 13, after The Wall Street Journal’s articles.
Still, douek cautions against relying entirely on platforms to regulate themselves. “Experience so far has shown the limits of relying on self-regulation. Facebook has a profit motive, and it can’t be blamed for that — it’s a private company,” she says. “But too often, Facebook has shown that it will choose growth above mitigating the risks that its platform creates.”
Zittrain says that the Facebook Files, and Haugen’s testimony, were another invitation to the public to ask questions about what social media is doing to minors — and the rest of us. “We’ve all been subjects in an evolving massive-scale experiment about how our wants and behaviors can be exquisitely shaped and reinforced, and it’s fair to ask what that’s doing to us — and to demand that the data that could help answer that question not be cornered by the companies doing the experimentation,” he says.
At the end of the day, says douek, it’s simple: “Regulators are going to have to step up and do something if we are really going to change the status quo.”