Misinformation…Democracy’s Downfall?
Munther Dahleh: Welcome to Data Nation. I'm Munther Dahleh, and I'm the director of MIT's Institute for Data, Systems, and Society. Today, on Data Nation, Liberty and Scott are getting the facts on misinformation in politics.
Scott: Misinformation, everyone has an opinion on it. Both sides of the political aisle are accusing each other of spewing fake news. [00:00:30] We aren't here to debate what's real news versus fake news. We want to know what the data really says about political misinformation. I mean, is this as big of a threat as we're making it out to be?
Liberty: Really quick, we should clarify some of these terms before we get started. There's misinformation and disinformation. Misinformation is information that is incorrect, but it's not intended to be wrong. It's just sloppily put out there, no fact-checking, whereas disinformation [00:01:00] is information that is intentionally meant to deceive people. It's put out there incorrectly on purpose. It can be really easy to think that only people not paying attention fall victim to political mis- or disinformation, but the truth is misinformation is getting really hard to identify, and disinformation is almost impossible to identify because the point of it is to deceive you. So it's really likely that, at one point or another, regardless [00:01:30] of how data- or tech-savvy you are, we've all believed false information.
Scott: We saw this during the summer of 2020 during the Black Lives Matter protests. At the height of the protests, social media had a powerful influence on misinformation. Platforms like Instagram, Facebook, Snapchat, and everything in between allowed random people to spread whatever information they wanted with no controls, and not just random people, random entities that posed as people. No matter what you believed, if you were on social media at this time, you were fed [00:02:00] this misinformation or at least exposed to it.
Conservatives attempted to paint the Black Lives Matter protestors as violent and unfounded. For example, there was a story circulating that they had been designated a terrorist group, a completely unfounded accusation. There were other pictures circulating accusing protestors of beating elderly people, only for it to be revealed later that these photos had been taken in South Africa years ago, totally unrelated to all this. Yet people latched onto them, spread them, believed them, and actively protested and [00:02:30] organized around them. These are the types of things that were inflaming 2020 at the time.
Liberty: On the other hand though, no one is innocent in this misinformation game. A lot of these protests did eventually become violent as protestors started fires or were looting. There was a massive story on the left side of social media saying that it was strictly white supremacists posing as BLM protesters who were acting violent, but that's simply not true. I mean, although there were some white supremacists who did go to these [00:03:00] protests and incite racial tensions through violence, there were many BLM supporters who were charged with doing the exact same thing.
Scott: There are times when misinformation is obvious to most people. For example, there was a post circulating around that the government had lowered the age of consent to four. That sounds ridiculous. A little bit of critical thinking and some common sense will basically tell you that that's not the case. The problem is there are many times when the misinformation isn't so obvious in the moment, like in the case of [00:03:30] the Black Lives Matter protests.
Liberty: The question is who is really spreading this misinformation and, in some ways more importantly, disinformation, and why are they doing it? We decided to talk about this with Katie Harbath. Katie is the former director of public policy at Facebook, the director of technology and democracy for the International Republican Institute, and the founder and CEO of Anchor Change.
I want to understand a little bit about who we think is spreading this the most. We throw around [00:04:00] these terms like fake news and disinformation. Who is doing it the most? Who's most at fault?
Katie Harbath: It's really a wide range of actors, and it also really depends on the topics that you're talking about. There are some elected officials who are spreading mis- and disinformation. There are foreign actors, foreign adversaries, both state-backed, that are trying to do this, but also ones that just believe in the causes of Russia or China or whatever and are [00:04:30] trying to push it. There are people who are pushing this to make money. They're looking for clicks. They're trying to get people to go to their websites to make money. In fact, after 2016, the story at first was not about Russian misinformation. It was about Macedonian teenagers who were sharing it in order to make money off of advertising. There are political groups who are doing it. There's people who do it just for fun. I hate to say it, but there are.
Liberty: The trolls of the world.
Katie Harbath: Right. There's all sorts of those ones, so I [00:05:00] don't really think that I can pinpoint it to any one specific group that is doing it more than the other. It really depends on context and everything, and so it's quite a complicated landscape.
Scott: Facebook isn't the source or the originator or the inventor of political misinformation. These types of things have been around for hundreds, if not thousands, of years. Some examples in US politics: just look at a pamphlet about the British in the Revolutionary War. Look at a pamphlet about the North or South [00:05:30] during the Civil War. Look at World War II propaganda posters on both sides. This is a common thing, especially in and around politics.
Katie, here's my question, and I bet you get this a lot. Is social media an increasing factor in misinformation and disinformation? Is it worse because of social media?
Katie Harbath: I do think social media certainly played a role in changing the landscape of what mis- and disinformation look like. They're certainly not the only players. I don't believe that if [00:06:00] you were to shut down Facebook tomorrow that this problem goes away. I think it continues. One of the trends that I'm seeing, in fact, and is something that keeps me up at night is how decentralized our internet consumption is getting. It is no longer just on the big platforms. You're seeing it through podcasts, through traditional radio, through media, through things like Twitter spaces. The content is becoming much more live or it's ephemeral, and it disappears after 24 hours.
These present a lot more [00:06:30] challenges in terms of trying to track the potential mis- and disinformation out there and to combat it, and that's not even getting into how a lot more of people's behavior is moving to messaging apps. Some of those are encrypted, like WhatsApp, and so then you can't even look at the content, and you have to think more about the design of those apps to try to prevent some of the spread of this content.
Liberty: There's clearly a lot of platforms that people can use to spread misinformation and disinformation, and it's so decentralized [00:07:00] that it makes it a lot harder to control, but what about what we can control or could control, which is the social media giants? Because if they pick a side, the other side is in trouble. I mean, look at Ukraine and Russia, or look at the Arab Spring, which never would've happened without social media. What if they choose what we think is the wrong side in the future?
Katie Harbath: Most of these types of situations when it comes to thinking about mis- and disinformation are a lot more complicated and gray, and there isn't necessarily that clear enemy against it. You are a hundred percent right [00:07:30] though in that we have to be looking at and worried about platforms that are not coming from democratic societies. In fact, two of the most popular ones, TikTok and Telegram, are not run by American companies nor are they based necessarily in democratic regimes. TikTok, obviously, their parent company is based in China. Telegram is a little unique. Their CEO is Russian, but he's based in... I think they're based in London and Saudi Arabia... or Dubai. Sorry. Dubai is I think where Telegram [00:08:00] is based out of.
It is something I worry about, the geopolitics of tech and how it continues to be weaponized and used not just from the decisions that the tech companies are making, but also the decisions that their home countries are making or other countries might make to shut those platforms down. You look again at what Putin did in terms of shutting down a lot of western platforms so that the Russian people can't use them to get news and information. It's this really delicate [00:08:30] balancing act that I think we're going to be continuing to move into with these tech companies and thinking about it.
Liberty: Everything makes this sound like the biggest, scariest thing. We heard on both sides in 2016 how fake news was affecting either side. A lot of people believe that 2016 was won through fake news. People believe 2020 was stolen through fake news. Is this actually affecting elections? Do you believe that the disinformation and misinformation that was spread in 2016 affected the election? Is [00:09:00] this something that we need to be worried about actually affecting the outcome of the election in the future?
Katie Harbath: I have to be honest. I don't know yet, and I don't know if we've really given ourselves enough space and time, or that we've really seen the studies, to really understand how much of a factor it was. It certainly was a factor, but I don't know that I can honestly say that I think the only reason Trump won was because of it. There were so many other factors that went into it. Everything from how he [00:09:30] ran his campaign, how he ran it digitally compared to how Hillary Clinton did it, the mood of the nation, all of those types of things I think went into that, but it's been really hard to untangle. We need more time and space, I think, to actually look at that objectively and understand how much it really did impact it.
Liberty: The 2016 and 2020 elections were certainly not the first events in American history where mis- and disinformation were used, [00:10:00] as you said, Scott, but they were the first events in current American politics that caused a lot of people to start wondering like, "Whoa, is this misinformation hurting democracy?"
Scott: Yeah. The example of political misinformation that really brought the fake news to the forefront was the conversation around Cambridge Analytica and their involvement in the Trump campaign in 2016. Cambridge Analytica was founded by a bunch of data scientists in the UK, and they had this app that was collecting tens of thousands of interviews [00:10:30] of Americans back in 2014, and they were using that information to build models and targeting for the Donald Trump campaign.
Liberty: Yeah. I mean the idea was to profile and tailor advertisements and other information to these users as potential voters based upon psychological similarities, so then, what happens, these users looked at their feeds and they saw information, fake or otherwise, coming from only one political side, the conservative side. Cambridge Analytica was painted as [00:11:00] this dark-arts, dark-ops firm that stole Americans' privacy for the benefit of the highest-paying conservative candidate. They were painted as stealing the rightful position of the liberal candidate away through manipulation, fraud, big-data hacking, and black arts.
Scott: Yeah, but, to be clear, under Facebook's guidelines, there were hundreds, if not thousands, of apps that were doing the same thing, grabbing interviews from people on the internet and using that information for targeting. The only reason why Cambridge Analytica [00:11:30] made headlines is because they were paid by the Donald Trump campaign. They were headquartered in the UK, and the founder was pretty mysterious. These are tactics normally done by every campaign. They were just a little bit more interesting about it.
Liberty: Exactly. I mean, if Cambridge Analytica had been a company that sold people refrigerators or anything else except being connected to Trump, no one would've cared, but Cambridge Analytica was hailed as the usurper and Trump's secret weapon. In 2008 and 2012, Barack Obama's campaign also [00:12:00] profiled prospective voters using information obtained through social media just as Cambridge Analytica attempted to do. As you know, a lot of entities are still doing that. I would say the difference is that, unlike Cambridge Analytica, they're doing it effectively. I guess the question is should we still be scared of the fake news, the misinformation and disinformation coming at us like a hailstorm because it hasn't stopped?
Scott: Yeah. No. That is the big question. As someone who's been working [00:12:30] in this industry for a while, I hope we're not scared of it. I hope we understand exactly how it works, and I think the good companies out there are doing it transparently, but the real question is what works? What actually works? What doesn't make headlines?
To that end, we've got this great guest, Adam Berinsky. He's the Mitsui professor of political science at MIT. He also serves as the director of the MIT Political Experiments Research Lab. This is where we can actually scientifically figure out what does move voters, what demotivates them, what motivates them, what makes [00:13:00] them tick. We can apply some science here, not some sensational headlines.
Adam, thanks for joining us. This is great. I always like to talk to other practitioners like yourself who can apply a little bit of a scientific lens to some of these social sciences that we're dealing with in politics. One of the things I want to break down with you is who spreads misinformation and disinformation, how much of a role social media platforms play, and then also who determines what exactly is misinformation and what is disinformation.
Adam Berinsky: Where you stand depends on where you sit [00:13:30] in terms of what you're going to find to be an acceptable answer for that. There is no single best, true answer to say this is the person who should be the authority of truth. That makes things very complicated. We have lots of different actors in this space, but I think the primary people that people turn to in this country are fact checkers. These are people whose job it is to research claims to see whether they're supported by the evidence that's available online, so one answer [00:14:00] is to say it's fact checkers.
Now, the problem is that not all citizens accept those answers. Some people say, "Fact checkers are biased." Now, it doesn't matter whether or not they're actually biased, but if you as a citizen believe that they're biased, you're not going to take their word as authority, and that becomes a problem. Now, I think that the answer depends on what you think it should be versus what it actually is. In an ideal world, those two things would [00:14:30] be the same. Think about the role of expertise.
Scott: I had a front-row seat to 2016. I was the data science director for a major Republican candidate. I remember looking at the information coming in and, while my guy didn't win, so that tells you who I worked for, I remember seeing all this stuff after 2016 about Russian disinformation. I remember looking at the old polling data and seeing that anywhere from 20 to 40% of Democrats think that Russia helped Donald Trump steal the election in 2016. Was [00:15:00] Russia actually trying to mess with the election, did it sway the 2016 election, and do we need to be afraid of interference in the 2024 election?
Adam Berinsky: Like you, I'm skeptical that this actually was the cause of this. Thinking about the Cambridge Analytica story that we heard, thinking about, oh, they have these great psychological profiles of folks and they can use targeted ads, I find it very hard to believe that they could move a lot of votes based on just what I know about the effectiveness of [00:15:30] microtargeting, the difficulty of political persuasion. On the other hand, you didn't need to move a lot of votes. We're talking less than a hundred thousand votes across the three states there. What I would say, and this goes, too, looking ahead, is that it probably does not have a huge effect on a direct election.
However, what it does have an effect on is, as I said, this larger political climate. Basically, do you believe in elections? Do you trust elections? What [00:16:00] we found is that there've been huge effects there in terms of changes in the percentages of Republicans who trust elections to be free and fair in the wake of the last two elections. Going forward, I would say that that's the bigger problem. Should we be scared of fake news moving forward? Absolutely, not because it's going to swing any particular election, but because of the damage it's doing to trust.
Scott: Trust in our institutions.
Adam Berinsky: Yep. [00:16:30] Yeah. As you say, it's not just on the Republican side. I live in Cambridge, Mass., which is probably 102% Democrat, and so, from the people I talk to, I hear all sorts of stories like, "Oh, not only did Russia try to interfere in the election, they actually hacked voting machines and flipped votes in Wisconsin." I'm like, "Not really."
Scott: That's not how it works.
Adam Berinsky: Right. It's a case where this larger question about [00:17:00] trust in government, trust in institutions is at stake.
Liberty: Based upon which political party you fall into or which news channel you watch, it's always the opposing party or the other source spreading the mis- and disinformation, but is there any quantitative data, real evidence, that one political party tends to believe and cultivate misinformation and fake news more than the other?
Adam Berinsky: There are a couple of aspects to that. The first is: do [00:17:30] Republicans and Democrats, as ordinary citizens, genuinely believe different things? The answer is yes. If you ask questions about the economy, how it's performing under a Republican president, Republicans are more likely to say it's doing well. Democrats are more likely to say that it's doing poorly. Looking at the same factual situation, they come to different conclusions about the state of the economy.
That extends to political rumors as well, [00:18:00] where you see differences between Democrats and Republicans, but then the tricky question that you raise as well is what causes that? Are Republicans and Democrats less discerning, or are they being fed different kinds of information by their political leaders? I think that the balance of academic research shows that it's much more the case that people are listening to political leaders and it's the leaders who are at fault.
Liberty: Adam, how big of an issue [00:18:30] is this really, and what can we do to fix it and mitigate the amount of misinformation and disinformation that's circulating?
Adam Berinsky: What I would say is that I know a lot more about how much trouble we're in than what we can do to solve it. I think that part of it is going to be can we find authorities that people from both political parties, at least in this country, can agree are the authorities? One promising thing that I've done with some other colleagues, [00:19:00] and there's a bunch of folks who are looking into this, is the notion of crowdsourcing. Basically, can you get ordinary people to agree, have a bunch of different folks doing the rating, some Democrats, some Republicans, and have that crowdsourced rating say, "People like you say that this is the case." Perhaps there is a role there for authority.
Another thing that I found is looking at what people [00:19:30] think, in the hypothetical, who should be dealing with the misinformation problem. They don't love fact checkers. They don't like social media companies, but if we look at subject matter experts, so like doctors on medical information, expert scientists on different kinds of policy matters, they're much more trusting of that. What I would say is that we don't want to assume that people have faith in different authority figures, but there are some authority figures that people do have faith [00:20:00] in.
Scott: Liberty, I don't know how you feel after the last set of conversations, but I feel a little bit better because, while political misinformation is a big deal, I don't know that it's really wrecking American politics and massively changing the outcomes. That's not to say it's not a problem, but after these discussions, what do you think?
Liberty: I think the good thing is that America is not doomed. Our democracy is still intact. Listen, Katie and Adam are experts in this field, and neither is convinced [00:20:30] that any of this fake news changed the outcome of the election. What they do seem worried about is how this affects the general trust in our current political system. While it's something that I think we need to think about, we certainly shouldn't be terrified that this is going to ruin elections in the future, at least not as things stand now.
Scott: What do you think we can do? That's the question we've asked both the guests. That's something you and I have discussed quite a bit. I think a good place to start is that social media is a place where you can get instant gratification, instant news. [00:21:00] Everyone just needs to take a deep breath before they share, before they react, and apply some critical thinking and research. I know it seems absurd, but with a little research, you might find what people think is true really didn't happen at all. I think that's the takeaway here: not everything sensational is true, and let's just calm down and think this through a little bit.
Liberty: Yeah, and, I mean, given the fact that fake news has been around since way before even the Revolutionary War, I think it's probably going to be around forever, because we haven't evolved that much since then.
[00:21:30] Thanks so much for listening to this episode of Data Nation. This podcast is brought to you by MIT's Institute for Data, Systems, and Society. If you want to learn more about what IDSS does, please follow us, @mitidss, on Twitter or visit our website at idss.mit.edu.