Co-author of the best-selling “Nudge,” about influencing people’s behavior for their benefit, Professor Cass Sunstein ’78 has just published a new book titled “#republic: Divided Democracy in the Age of Social Media.” It seeks to bring together people who are increasingly polarized as they consume information that conforms to their political beliefs and social groups. Sunstein, who served as administrator of the White House Office of Information and Regulatory Affairs in the Obama administration, argues in the book that self-insulation facilitated by the internet and social media has harmful consequences for our democracy—one of several topics he covered in a recent interview with the Bulletin.

Your earlier books “Republic.com” (2001) and “Republic.com 2.0” (2007) addressed some of the same issues you take up in the new book. What’s changed since their publication?

The rise of social media! Facebook and Twitter now have a major role in political debate, and that adds a new element to our democratic system. Most of all, it simplifies the creation of echo chambers, which is a real problem. The earlier books explored the problem of self-sorting, but with social media, that problem really requires a different kind of analysis. There is a real connection here with the problem of mutual misunderstanding and with failures of problem-solving.

Can you expand on the effects of the echo chambers and the dangers?

If you live in an echo chamber, you won’t exactly expand your horizons, and you are likely to get a narrow and distorted understanding of both politics and culture. Take the issue of regulation: If you think that it is necessary to control powerful interests, or instead that it is a harmful way of expanding government power, you are not going to be able to make progress on, for example, safety on the highways. And if many people live in echo chambers, it won’t be easy to solve social problems.

Call it Hamilton’s nightmare: Hamilton prized the “jarring of parties,” but that was because he thought it would promote circumspection and deliberation. Sometimes that happens, but echo chambers make it less likely. People need to learn from one another. There’s a ton of information out there and it is best if it is shared across “tribes.” If you speak to like-minded others, you will probably get more confident and more extreme—and your group will get more unified. A unified, confident and extreme group is not likely to play well with others. And when there are many such groups, self-government may not function so well. Some of the problems in modern American government are a product of relentless self-sorting.

How can people be “nudged” to consume information that’s outside of their echo chamber?

Facebook could help by improving its News Feed. How about a serendipity button, by which people could choose to see a random sample of perspectives, and also topics? Or an opposing viewpoints button, by which people could choose to see views that they disagree with? Lots of people are working on creative ideas of this kind.

Some research indicates that people are unwilling to accept facts that diverge from their preconceived notions. What influence do you think alternative information will have on people who have already formed opinions about issues?

If you are really committed to a view—say, that dropped objects fall, or that the Holocaust happened—you won’t be affected by alternative views. But on many questions, people do listen, especially if those who offer new views have some credibility (say, because they are experts, or because they would not be expected to have those views). If people have deeply entrenched views about climate change, for example, of course they will be skeptical of people who cast doubt on those views. But on many issues, our views are not so entrenched, and some sources have real credibility to us—say, because they are evidently specialists on climate science, or because something in their background and history suggests that we ought to listen carefully to them. Consider here the idea of “surprising validators”: not an elegant term, but the point is that when people hear someone unexpectedly endorse a particular perspective (whether left or right), they start to listen more closely.

For decades, the government mandated a Fairness Doctrine that required broadcasters to air opposing views. Does government have a role in ensuring that people are exposed to different viewpoints? What about the responsibility of private entities that share information with a wide audience?

I think the government should stay out of the censorship business, though subsidizing public broadcasting seems pretty reasonable to me. Facebook does have public responsibilities, and it seems to be thinking pretty hard about them. It should not just be providing people with information cocoons.

As someone who worked in the Obama administration, what information do you expose yourself to that may be contrary to your worldview, and how does this exposure affect you?

I regularly read The Wall Street Journal, National Review, and The Weekly Standard—and the nonpolitical American Economic Review, The Economic Journal, Judgment and Decision Making, Journal of Risk and Uncertainty, and The Quarterly Journal of Economics, which have diverse findings on various issues—and [the writings of] law professors who don’t much like the Obama administration. My own views are pretty eclectic (meaning that I diverge from Democratic orthodoxy on many issues), and I try to read a lot from left, right, and center.

What can be done within the limits of the First Amendment to neutralize fake news?

You’d need a whole law review article on that one; the doctrine is not simple. For good reason, there are sharp constitutional constraints on the government’s ability to censor news, even if it deems that news fake. The main responses should come from the private sector, not government. To be sure, the free speech principle allows controls on libel (within limits).