After receiving his PhD in political science from Duke University in 2009, Brendan Nyhan came to U-M to study health policy as a Robert Wood Johnson Foundation Scholar. His mentors were the Ford School’s Paula Lantz and Rick Hall. Nyhan was then hired by Dartmouth College, and by 2016 he was promoted to professor of government. He came back to U-M in 2018, this time to the Ford School. In addition to his research on (mis)information in contemporary society, Nyhan contributes to “The Upshot” at the New York Times; is a co-founder of Bright Line Watch, which monitors the state of American democracy; and is a 2018 Andrew Carnegie Fellow. State & Hill sat down with Nyhan to learn what drives him, what he thinks of his new home, and whether Russians are really controlling us on Facebook.
State & Hill: How did your interest in public policy begin, and how did it grow?
Brendan Nyhan: I have always been interested in public policy. After college, I helped start a fact-checking website called Spinsanity, where we debunked false and misleading claims about topics ranging from the 9/11 terrorist attacks to the war in Iraq. In graduate school, I began to study political misperceptions — why people so often believe claims about politics and public policy that are false or unsupported by the best available evidence. During my time as a postdoc at Michigan, I turned my focus to misperceptions about health care reform and vaccines.
Why, in fact, do people believe misperceptions?
People tend to be biased toward information that reinforces their point of view or partisan affiliation and biased against information that contradicts their preconceptions. In recent years, we have seen elites and media outlets become more aggressive in exploiting those tendencies to spread rumors and false claims. That’s why we have to study where people’s information comes from in the first place.
Illustration: Dan Page / The iSpot
What have you discovered?
It turns out that “echo chambers” are rare: most people actually have a relatively balanced information diet. For instance, in the period just before the 2016 election, only 10 percent of the public consumed approximately 60 percent of the “fake news.” That group is relatively small, but it is also highly politically active. As a result, it could have a disproportionate effect on our political system. I also worry that people who consume this kind of toxic content are likely to feel more negatively toward the opposing party and to further spread the misinformation they encounter.
Why did you choose to come back to the University of Michigan?
I had been here for my postdoc, and I wanted to come back. At Michigan, people work across disciplinary lines and collaborate around common problems. I am now part of a grant proposal with people from all over the campus, for instance, to address the role of information in democracies. Collaboration happens within Ford, too. And the students here are amazing. I am excited to be teaching them in a class next term on misperceptions and conspiracy theories in public policy.
Your research on misinformation has been widely published. What have you discovered about the current state of “fake news”?
Research suggests that the panic over fake news was overblown. It seems not to have swayed the 2016 election. The Russians can’t control us via Facebook, though they may have tried. Still, it is clear that online misinformation is a chronic problem that we must manage carefully. In particular, there are important policy decisions to be made about the online platforms given the influence they have.
Wait. The Russians can’t control us via Facebook? Some of us have been operating on a casual assumption that they can.
The idea that the Russians swung the 2016 election is not currently supported by credible evidence. We’ve been able to quantify how many people saw Russian information operations on Facebook; their reach and scope were limited. However, the problems of “fake news” and Russian disinformation could worsen if we do nothing. The key is to address these threats in a manner that reflects our democratic values. What Facebook and Google are doing to counter fake news, for instance, affects the flow of information on a global scale. The online platforms are making de facto public policy. We must therefore strike a balance between preventing misuse of online platforms and respecting free speech. A lot of misinformation comes from people expressing sincere beliefs. Do we want to empower private companies to suppress them?
What is next for you in your research?
I’m excited to contribute to the science of information that is currently emerging. Digital tools allow us to monitor people’s news consumption minute-by-minute, which could change how we understand the flow of information in our democracy.
It seems that, at the Ford School, you have found a great home for investigating these issues.
I am especially excited about the leadership of the Ford School. Michael Barr, Liz Gerber, and Paula Lantz are doing incredible things. We are well positioned to take advantage of Michigan’s strengths and become an even more important force in Ann Arbor and nationally. This is a great institution.
—Story by David Pratt
This article appeared in the Winter 2019 issue of State & Hill, the magazine of the Ford School.