In This Episode
How dangerous is disinformation on the Internet and what can we do about it? We learned in 2016 how viral lies and conspiracy theories can impact our elections, but they can also lead to violence abroad. We look at Myanmar, where hate campaigns on Facebook helped fuel ethnic cleansing.
Host Ben Rhodes talks to the tech watchers in Southeast Asia who spotted disinformation and warned Facebook, and asks leading experts and activists about how to regulate Big Tech.
Transcript
EPISODE 4 – DISINFORMATION
In July, twenty-sixteen, Hillary Clinton seemed like she was on her way to the White House.
She was leading Donald Trump in the polls by a wide margin. Any other year? Her staff woulda felt pretty confident.
But around the same time…
[MUSIC BEGINS]
…they also found themselves dealing with something… unprecedented.
CNBC REPORT: The Russian government and hackers penetrated the computer network of the Democratic National Committee and gained access to the entire database.
Jake Sullivan was Hillary’s top policy advisor. He and others suspected the DNC hack was part of a larger information war being waged by Russia… to tear down Hillary Clinton and get Donald Trump elected.
And even as Trump egged Russia on…
TRUMP: Russia if you’re listening, I hope you’re able to find the 30,000 emails that are missing.
…A lot of people thought Jake and the Clinton campaign were crazy. Including, sometimes, themselves.
SULLIVAN: We had a sense that maybe we were conspiracy theorists, you know? Maybe, this couldn’t possibly be happening because we hadn’t seen anything quite like it before, and because we would go talk to the senior producers of networks and the editors of major newspapers and tell them, and they’d look at us like we were wearing tinfoil hats.
What they were seeing went way beyond the hacked emails.
Like Facebook groups, set up by fake accounts, spreading lies about Hillary’s civil rights record, urging African Americans not to vote.
Crackpot social-media conspiracy theories about Hillary getting caught in some kind of criminal act by President Obama, who had made her his pawn.
Online videos suggesting she was suffering brain damage from a fall in twenty-twelve.
WATSON ARCHIVE: Is Hillary on the verge of a mental breakdown due to stress, or are her strange outbursts linked to a medical condition?
Eventually, these fake stories would get picked up by outlets like Fox News. Or at least by guests when they *appeared* on Fox News. Like Rudy Giuliani.
CLIP: GIULIANI: So go online and put down “Hillary Clinton, illness,” take a look at the videos for yourself.
Then the rest of the media would report on the controversies. Which made it all seem…reasonable, even mainstream.
It was a symphony of lies. And the evidence pointed to an invisible conductor: Russia.
[MUSIC ENDS]
But still… a hostile nation trying to determine the outcome of a U.S. election… deploying social media platforms most often used to spread cat videos? To a lot of people, it seemed like a conspiracy theory.
SULLIVAN: So we had moments of genuine self-doubt about how extensive and systematic this was, or maybe we were, you know, making more out of it than it was.
Turns out… they weren’t.
[CLIPS]
NEWS RUSSIA: US intelligence has concluded that Russian President Vladimir Putin ordered his military to help Donald Trump win the election.
RUSSIA: The 37-page indictment says the Russians, beginning in 2014, used fake social media posts and advertisements to sway opinion in favor of then-candidate Donald Trump.
RUSSIA: Twelve Russian intelligence officers are charged with hacking into the email servers of the Democratic party, and the Hillary Clinton campaign.
[THEME MUSIC BEGINS]
I’m Ben Rhodes, and welcome back to “Missing America.” A look at the political diseases sweeping across the world in the absence of American leadership.
This week: Disinformation.
In the years after the election, the world learned just how much Russia had created many of those fake websites, fake Facebook groups, and fake news. And you can be sure they’re still doing that…as I speak.
But Russia isn’t the only actor using online lies to create real-world chaos. From Southeast Asia to the United States, disinformation is a growing part of the political landscape. Today, we’ll hear about how quickly it infected one country:
PETERSEN: The Internet had barely started taking off in Myanmar. And still there was sort of this really powerful effect of social media just amplifying something that anybody could have posted.
And we’ll learn what the U.S. could do about it — if our own President wasn’t a constant source of disinformation himself.
Leading the fight, in a virtual war. On this episode of “Missing America.”
[MUSIC ENDS]
ACT ONE
On January 5, twenty-seventeen, I was sitting in the Oval Office. Watching leaders of the U.S. intelligence community brief President Obama and Vice President Biden about the scale of Russia’s intervention in our election.
I wasn’t surprised.
I’d seen the Russian government’s disinformation machine at work, two years earlier — during their war with Ukraine.
CBS NEWS CLIP: New video shows the fiery wreckage just moments after the downing of Malaysia Airlines flight 17…
You may remember the story. A civilian passenger plane was shot down — probably by accident. It’d been hit by a Russian missile. Over the part of Ukraine controlled by Russian-backed separatists. Advised by the Russian military.
But Russia’s foreign ministry quickly deflected the blame. With an avalanche of lies.
NEWS CLIP: Russia has repeatedly denied any involvement in the shooting down of MH17, and the Kremlin has promoted many alternative theories on what happened.
They said a *Ukrainian* combat aircraft did it. *Ukrainian* surface-to-air missiles were in the area. The missile was fired from territory controlled by the *Ukrainian* government.
NEWS CLIP: RUSSIAN MILITARY PRESS CONFERENCE W/ENGLISH TRANSLATOR: This is conclusive evidence that Ukraine was not just involved in this tragedy, but it also manipulated the international investigation.
Never mind the stories contradicted each other. Never mind they could all be debunked.
The official investigation would drag on for over a year before reporting all the facts. Meanwhile, Russia’s state-run news channels and fake social media accounts? Flooded the internet with disinformation within days….and the lies moved faster and farther than the truth.
I remember being frustrated. Our government had no way to stop this disinformation…we had little capacity to fight back.
Russia seemed to have thousands of people pumping out fake news. They were well-funded. They were willing to lie. I had a small press staff, a handful of official Twitter accounts, and an obligation to back up any assertion with, you know, *actual evidence.*
No wonder that in our twenty-sixteen election, Russia was able to pull off the intelligence coup of the century. But since then we’ve learned the damage that can be done… even without Russia’s resources.
SULLIVAN: What’s interesting to watch now, as you look at these disinformation campaigns — basically in every major political context now — is how simple and replicable and yet hard to stop this strategy is. That’s what’s so alarming, is that it’s not some super sophisticated, highly resourced, “can-only-be-done-once-every-few-years-effectively” kind of operation. It’s something that can be copied, scaled, adapted and executed basically on a moment’s notice for exploiting political divisions in any society, in any context.
To get a sense of the danger involved… look no further than Myanmar, in Southeast Asia. Maybe the purest example of how much damage social media disinformation can do. And how quickly…
And how the institutions most able to stop it…don’t.
[MUSIC BEGINS]
You may still know Myanmar as “Burma,” its British colonial name.
For decades, it was ruled by a military dictatorship that kept the country cut off from the rest of the world. Think: North Korea, only twice the size.
The military also cracked down repeatedly on an ethnic Muslim group called the Rohingya in Myanmar’s Rakhine State. They were a convenient minority against whom to rally the country’s Buddhist majority. Muslims and Buddhists would clash. Then the military would crack down on the side of the Buddhists. There would be deaths. And mass displacement.
CNN NEWS CLIP: Simmering tension between Rohingya and the larger Buddhist community exploded in June of 2012, killing hundreds of people and leaving thousands of homes burned to the ground.
By the mid-twenty-tens, that tension still existed.
But in other ways, Myanmar was changing. The government had opened up space for independent media and civil society. Political prisoners were released. One was even elected to lead the country, routing the military’s chosen candidate. I remember traveling to Myanmar with President Obama around this time, and tens of thousands of people greeted his motorcade. It was a hopeful time… democracy was taking root.
Jes Petersen is a Danish social entrepreneur – using technology to empower people. He moved to the country just in time to watch the political transformation.
PETERSEN: But there’s also another reason that I think for me personally made it a really interesting time to be in Myanmar. And that was sort of the rise of technology. Where until I suppose early 2014, mid 2014, barely anybody in Myanmar had access to the Internet. So, SIM cards were super expensive. There was not really any 3G or 4G. Fixed internet connections were just not a thing for anybody, perhaps, except for the wealthy elite.
Until… as part of its democratic opening, the government allowed two telecom companies to set up mobile networks across the country.
In twenty-fourteen, they both launched.
PETERSEN: And so basically overnight, the cost of a SIM card dropped from hundreds and in some cases thousands of dollars, to basically the equivalent of a dollar and a half. So you could take that SIM card and pop it in a smartphone to go on the Internet, which is something that most people hadn’t been able to do before.
Saijai Liangpunsakul was also working to help young Burmese use tech to start companies and create jobs.
She remembers twenty-fourteen as an optimistic moment for Myanmar. After all, just a few years earlier, the Arab Spring movement had used Twitter to help topple a dictator in Egypt.
LIANGPUNSAKUL: When the whole country have access to Internet, there’s so much hope and excitement. You know, people see technology as a force for civic engagement. So at that time, technology is hope for the country.
That hope turned out to be short-lived.
And a big reason…. was Facebook.
[MUSIC ENDS]
See, in Myanmar, people largely accessed the internet from smartphones. But many had never owned a smartphone before.
They’d also never downloaded apps from an app store — they didn’t know what an app store was.
So whoever sold them the phone would load it with an app or two. One of which was almost always… Facebook.
LIANGPUNSAKUL: When people go to the shop, Facebook is often installed on the phone. So that was the first application that people, you know, have access to. From having no information about what is Internet at all. So it’s become… Facebook is internet in Myanmar.
RHODES (ON TAPE): So Facebook and the Internet are indistinguishable.
LIANGPUNSAKUL: Yeah.
It bears repeating: in Myanmar, Facebook *is* the internet. Many people never use any other website or app — Facebook is their entire online experience.
Imagine living your whole life having access to little information other than government propaganda. And then, overnight, in 2014, you think you have access to all of the information in the world. And it’s….Facebook.
LIANGPUNSAKUL: A lot of people that I talked to, first time when they saw things on Facebook, they thought that everything on Facebook, it’s the same as newspaper. Right? There must be, whatever put on Facebook, must be true.
Saijai says she had friends who thought that, like a newspaper, every article on Facebook had been vetted by some kind of Facebook editor. And a fact-checking staff.
LIANGPUNSAKUL: So that was people’s perception of Facebook, because they never exposed to technology. So it’s really easy for bad actors to use that against them.
Sure enough, in a country still divided by ethnic conflict? Bad actors *did* use Facebook to throw fuel on the fire. Almost immediately.
[OMINOUS MUSIC BEGINS]
Jes Petersen.
PETERSEN: Back in the mid of 2014, I think it was, in July, in Mandalay, in the north of Myanmar — There was a blog post that popped up on the Internet accusing a Muslim shop owner, falsely, of raping a young Buddhist girl. What happened very quickly was that the content from this blog post moved onto Facebook, where it started spreading like wildfire. Lots of people started taking to the streets. There were riots that broke out between Muslim and Buddhist groups…
AL JAZEERA NEWS CLIP: [CROWD CHANTING] REPORTER: The crowd threatened to kill all Muslims, saying they want to get rid of them. They want to avenge the death of a Buddhist man who was killed by Muslims during riots that started on Tuesday…
PETERSEN: …What happened eventually was that the government resorted to shut down Facebook in Mandalay and around Mandalay, to sort of kill the conversation about all of this until the riots died down and stopped. And I think, you know, this was at a time when the internet had barely started taking off in Myanmar. And still, there was sort of this really powerful effect of social media just amplifying something that anybody could have posted.
The hate speech and conspiracy theories continued. Month after month, for years. Jes walked me through another moment when violence threatened to erupt:
PETERSEN: In 2017, there were sort of two sets of chain messages going around Facebook. And one of these chain messages were warning people that Muslim extremists were planning a terrorist attack. Sort of nonspecific, just broadly, “Be careful in the next few days. There are Muslim jihadists who are planning a terrorist attack.” And at the same time, another chain of messages were going around Facebook Messenger, warning Muslims that Buddhists were planning an anti-Muslim movement and riots. And obviously, it was quickly pretty clear to people monitoring this, that this was likely to come from the same place — and to have an aim of sort of creating tension, and riots, and perhaps violence between Muslim and Buddhist groups.
Who was behind all this? What was the motive?
The New York Times would later report that Myanmar’s military played a central role — part of a concerted campaign to stir up Buddhist nationalism and a deeper distrust of the Muslim Rohingya.
But at the time? All Petersen knew was this was a disaster waiting to happen.
PETERSEN: What happened then was that my colleagues and I and a bunch of other organizations and people in our community here in Yangon reached out to Facebook about this and told them, “Listen, folks, um, you should take a look at this. Because these messages are going around. And we know from before that this can have really serious consequences. So we think you should put a stop to it.”
Here’s the good news: Facebook took down the messages. It deactivated the fake accounts that spread them. And just like that? —
[OMINOUS MUSIC ENDS]
— this one disaster was averted.
But Jes knew there was bound to be another one. Because the incident he’d flagged pointed to this much bigger problem:
The fact that he’d had to alert Facebook about it in the first place.
See, at the time, Facebook deployed around seventy-five hundred human moderators to keep an eye out for fake news in countries across the globe.
But almost none of them… were monitoring Myanmar. Saijai Liangpunsakul.
LIANGPUNSAKUL: I think like at the beginning, Facebook see Myanmar as a new market that hasn’t been explored. Right? So they came to the country without thinking of “What are the impact of Facebook for the population?” Like if you look at the data, around 2014, Facebook has only about two or three content moderators for the country.
RHODES: For the whole country?
LIANGPUNSAKUL: For the whole country.
RHODES: Two or three people…
LIANGPUNSAKUL: Two or three people.
RHODES: …For a country of 60 million people.
LIANGPUNSAKUL: Yeah; two or three people who monitor hate content, review the content on Facebook.
None of those monitors actually lived in Myanmar, by the way. The company didn’t even have an office there.
In other words, Facebook had become the dominant source of information in the entire country. But civilians — many of whom were totally new to the internet — were expected to police the country’s millions of posts.
Not surprisingly, that wasn’t enough to stop the disinformation. It continued on Facebook all through the summer of twenty-seventeen. Ginning up hatred on both sides of the conflict.
LIANGPUNSAKUL: I think what scary is not just what they said, they also call for action in that message, right? It’s not just, ‘I hate Rohingya’ or ‘Rohingya a dog.’ They also asked people to do something about that. That’s the scary part.
With predictable consequences.
[MUSIC BEGINS]
BBC NEWS CLIP: The world’s fastest-growing humanitarian crisis, as thousands of Rohingya refugees are spending a fourth night stranded near the border with Bangladesh.
In August, Rohingya militants attacked police checkpoints.
In response, Buddhist extremists and the military attacked Rohingya villages — burning their homes, killing people, and driving hundreds of thousands of men, women and children out of Myanmar.
The U.N. called it “a textbook case of ethnic cleansing.”
A director at Human Rights Watch in Asia? Placed the blame partly on Facebook, for letting its platform become a clearinghouse for propaganda.
He called the company “absentee landlords.”
Since then, Facebook’s been shamed into hiring more content monitors in Myanmar. And it’s made efforts to flag and limit posts that might be disinformation.
But it still doesn’t have an office in the country, where monitors could be attuned to what’s happening on the ground.
And most importantly, Facebook’s algorithms are still designed to favor the sensationalist posts that tend to attract views… mainlining hate into users’ newsfeeds.
Meanwhile, even more of Myanmar’s people have joined Facebook. And there’s an election coming up.
LIANGPUNSAKUL: I think technology play a big role in the upcoming 2020 election. And…I’m a bit scared, actually. From what we saw with what happened with Rohingya. Because in this upcoming 2020 election, we know that most of the young people will be on social media. All the political party will use social media to get their message across, and we know that there going to be a lot of bad actors who are playing a big role on this.
Myanmar’s an extreme case of the damage disinformation can do. But wherever you live in the world….the chances are, you are consuming disinformation on social media.
So if companies like Facebook won’t protect us from disinformation… and if more and more governments are propagating it… what’s to be done?
That’s where we come in. Stay with us.
[MUSIC ENDS]
— AD —
ACT TWO
[MUSIC BEGINS]
So… when I tell you the United States needs to lead the fight against disinformation… I know it might be hard to fathom why.
Especially since it’s now pretty well known our own President’s re-election campaign is busy spreading disinformation. On a daily basis.
MSNBC NEWS CLIP: The Times reports that a consultant for Team Trump has created a fake campaign website for Biden, that’s unflattering to say the least. And in the last 3 months that fake website has become the most popular Biden website on the internet.
But I have to remind you: Facebook? Twitter? Google? All these platforms Russia used to hack our last election? The same platforms Myanmar’s military used to spread hateful lies?
They’re almost all created and run by American companies.
It’s not only our responsibility to rein in the Frankenstein monster we’ve created. We’re the ones best-positioned to do it.
Even if a lot of the best ideas for how to do it… happen to come from overseas.
[MUSIC ENDS]
Marietje Schaake spent ten years representing the Netherlands in the European Parliament. And her advice to us… is to start doing something Europe’s already taken steps towards doing: Regulate social media platforms.
Because… why wouldn’t we?
SCHAAKE: In many ways it is almost unprecedented and hard to imagine how such a powerful sector – billions and billions of dollars in revenue and profit – has remained so un-regulated. And I think regulation is inevitable, not for its own sake, but to preserve the rule of law online, to preserve democratic rights and human rights online. And logically, Americans should lead in these regulations if they believe that those values are integral to their quality of life.
There’s a big obstacle to regulating these companies, though:
America’s current President is as eager to weaponize regulations… as he is disinformation itself. Remember his executive order this May, aimed at regulating social media?
TRUMP: They try to silence views that they disagree with by selectively applying a fact check…This censorship and bias is a threat to freedom itself…Therefore today I’m signing an executive order to protect and uphold the free speech and rights of the American people.
Of course, Trump’s order was intended to cow social media companies. So they’d let more right-wing disinformation — including his own — continue to spread, unchecked. A wild west where any attempt to clamp down on hateful lies is called “censorship.”
Trump’s order probably won’t take effect — crafting regulations is Congress’s job. But in hearings with social media CEOs, some members of Congress barely seemed to understand how these platforms work in the first place.
[AUDIO CLIPS]
HATCH: How do you sustain a business model in which users don’t pay for your service?
ZUCKERBERG: Senator, we run ads.
HATCH: I see… that’s great.
FISCHER: So, how many data categories do you store (hesitating) on the categories that you collect?
ZUCKERBERG: Senator, can you clarify what you mean?
BLUNT: Now you understand this better than I do, but maybe you can explain to me why that is complicated.
So in this fight, progressives will have to make sure that Congress understands the scale of the problem and the need for action. And then, Schaake says, we have to frame the debate over regulation carefully. Not as an attempt to censor… but to protect.
SCHAAKE: The way in which I think about the role of government is not to regulate a platform, or what people call “regulate the Internet,” but to actually regulate FOR free expression. To regulate AGAINST discrimination. To regulate FOR fair competition. And these — nondiscrimination, freedom of expression, fair competition — they are actually not that controversial in our society.
And how do you regulate a platform like Facebook “FOR” freedom of expression, without *restricting* that expression?
Schaake says: You don’t regulate the speech. You regulate the algorithms that determine what kind of speech is rewarded.
DUAN: I think we can’t live in the world anymore where we think that technology is something that is neutral.
Paul Duan agrees with Schaake. He used to work in Silicon Valley. Now he’s a tech activist and entrepreneur in Paris.
DUAN: So here I’ll give you one example: If you look at Facebook’s news ranking algorithms, or if you look at the way that Google will rank news or do that on YouTube, very often they will say that the algorithm is neutral because it’s just promoting the videos that the A.I. has decided will have the most engagement and will bring in the most ad revenue. But in reality, this is not neutral. Right? Because the content that tends to drive more engagement tends to be content that is more controversial. Content that is more sensationalistic.
RHODES: Conspiracy theories, yeah.
DUAN: Right. And so in this case, you have a direct conflict between what the market is optimizing for and what we may say are the values of our societies.
In other words, social media algorithms push disinformation to maximize profits, regardless of whether it’s destroying society. So regulating these algorithms is in our public interest. And not just our own…
Because disinformation is an international plague.
[MUSIC BEGINS]
In 2019, EU countries created something called the Rapid Alert System, to identify and squash disinformation campaigns before they can spread. The system was triggered a few months ago, when Europe was flooded with disinformation about the coronavirus.
This kind of international cooperation… is gonna be increasingly important.
So says Graham Brookie, director of the Digital Forensic Research Lab, which exposes and explains disinformation campaigns worldwide.
BROOKIE: Disinformation is a collective problem, and it doesn’t respect our neatly defined borders. So with the problem of disinformation, we’re always looking for areas by which we can create global standards. So whether that’s, you know, data protection in Europe that the companies have to apply to the entire world, or whether that’s, you know, telecoms policy in Brazil that now has to be applied to Europe or to India or wherever.
A couple of years ago, the European Union drafted a version of this kind of international set of standards. The “E.U. Code of Practice” was supposed to compel social media platforms in Europe to police disinformation more vigorously. And respond to it faster.
Facebook, Twitter, and other platforms all signed on to the Code. But it hasn’t had much effect. Not least because compliance… is voluntary.
[MUSIC ENDS]
Similarly, a slew of big companies – voluntarily – pulled their advertising from Facebook this summer. It was a boycott, after Facebook let lies spread across the platform in the wake of the Black Lives Matter protests.
But what’s preventing them from plunging their advertising dollars right back into Facebook as soon as Black Lives Matter isn’t front page news?
Mark Zuckerberg himself reportedly told his staffers advertisers would return quote “soon enough.”
NEWS CLIP: FACEBOOK MEETING: Civil rights groups say that they remain unconvinced that Facebook is doing enough to combat hate speech on its platform. Representatives discussed the company’s handling of hate speech with Mark Zuckerberg and other executives on Tuesday. Many of them called the meeting quote: disappointing…
Just last week, a self-described militia on Facebook posted a “call to arms” against BLM protestors in Kenosha, Wisconsin.
Just like Jes Petersen in Myanmar, users warned Facebook the posts could lead to violence. And Facebook did finally take down the militia’s page… After this happened:
KENOSHA NEWS: Two people were killed during a Black Lives Matter protest in Kenosha, Wisconsin.
ANCHOR: Investigators say it may have been a vigilante attack carried out by a young white man…
This is why regulation needs to come from the government — and the regulations better have teeth.
Jake Sullivan…
SULLIVAN: Well, particularly for the social media platforms, it seems to me that we have to be talking seriously about removing the immunity they have under U.S. law for being held liable for the content on their sites. Facebook is receiving massive amounts of revenue from the advertising it’s selling on every one of these pages being loaded, as these conspiracy theories and information-warfare operations are being spread, and they should have to take some responsibility for it.
[SUSPENSEFUL MUSIC BEGINS]
Now imagine applying that principle internationally.
How might Facebook have responded differently, in Myanmar… if it could’ve been held legally liable for spreading hate that contributed to the ethnic cleansing of an entire people?
So how can America lead if we have a “Diss”-informer-in-Chief sitting in the White House?
It won’t surprise you that I believe… we cannot solve this problem so long as Donald Trump is President.
But I also know… people around the world are more aware than ever that disinformation is not a tinfoil-hat conspiracy theory.
They’re more aware than ever that social media is not always good for us. And that means, if this election goes our way, there’s hope to make social media better. Jake Sullivan:
SULLIVAN: In a way, we’ve been a few years behind the curve of Europe and we’re just now catching up on everything from privacy to extremist content online to disinformation. And so I do think there is a real opportunity, if you got a different president, to have a conversation with Europeans about some common set of regulatory approaches that would allow us to have people be able to enjoy and make use of social media. But that would curb some of these excesses and these abuses.
Second of all, I’ve said it before on this show, and it bears repeating: getting rid of Trump alone won’t solve our problems. That’s just the beginning of our work.
Remember where we started? Russia’s attack on the 2016 election. That happened in part because our government didn’t have the right tools to respond.
But the hard truth? We were an easy mark.
We were polarized. We had right wing media eager to echo the disinformation. We had a mainstream media that couldn’t resist reporting on it. Russia just poured gasoline on fires that were already there. Graham Brookie.
BROOKIE: …the Internet’s not written in stone. You’ve got to engage at all times. And the other side is really, really good at engaging at all times on stuff that’s easy and like vitriolic in a lot of times and viral. I mean, the term going viral is an informative term here.
So how do you beat vitriolic content that’s going viral? It’s not just regulation.
BROOKIE: The first thing is you’ve got to have a better story. Just full stop. If you don’t have a better story, you’re competing with a backward-looking negative story about how dark and nasty the world is and how scary other people are.
Sure, government has a role to play. But so do citizens…by rejecting disinformation and lies…by resisting the hopelessness and apathy that disinformation breeds…and by telling stories that recognize our shared humanity….
[HOPEFUL MUSIC BEGINS]
At our best, that’s what America has always done…
OBAMA: You think about the United States of America. We have a really good story called the Declaration of Independence. “We hold these truths to be self-evident that all men are created equal; that we’re endowed with certain unalienable rights; that among these are life, liberty, and the pursuit of happiness.” …It was just a good story that they were telling about what *could* be. And then people were attracted to that story. And it led to independence, and it led to immigrants from around the world who wanted that vision for themselves.
We get to choose what stories to believe in…what stories we tell….what stories we spread. After all, whether we live in America or Russia or Myanmar, we’re all just people – with the choice to be kind or cruel…to do good or evil…whatever the platform.
Saijai leaves us with this:
LIANGPUNSAKUL: I do think that it’s really normal when the world have some new thing come in? We we don’t know how to respond to that. There’s no law or regulation with Facebook, right? But I do believe the power of the citizen and government and civil society. If we come together, we will find a way to respond to this.
She’s right. Even for Americans, we’re still in the early days of social media. Facebook has only *existed* for sixteen years. But we have to develop the antibodies against disinformation… before it’s too late.
[MUSIC ENDS]
[THEME MUSIC BEGINS]
This week we talked about how disinformation made the sectarian resentment in one country worse.
Next week, we’ll explain how sectarian resentment builds in the first place — and why we’re seeing more and more of it around the world.
AYYUB: Overnight, perfectly all right neighbors, perfectly normal people… turned savages. People who you call your best friends, your neighbors!
As Trump and the GOP polarize America… how can we lead the charge against dangerous polarization abroad? The problem, and some solutions… on our next episode.
[CREDITS MUSIC BEGINS]
Missing America is written and hosted by me, Ben Rhodes.
It’s a production of Crooked Media.
The show is produced by Andrea Gardner-Bernstein.
Rico Gagliano is our story editor.
Austin Fisher is our associate producer.
Sound design and mixing by Daniel Ramirez.
Production support and research from Nimi Uberoi and Sydney Rapp.
Fact checking by Justin Klozco.
Original music by Marty Fowler.
The executive producers are Sarah Geismer, Lyra Smith, and Tanya Somanader.
Special thanks to Alison Falzetta, Tommy Vietor, Jon Lovett and Jon Favreau.
Thanks for listening.
[CREDITS MUSIC FADES]