Podcast

Chinese Charm Attempts to Alter American Political Opinion

Posted: 18th March 2019
By: ZANE POKORNY

There’s an increasing awareness of foreign influence on American institutions through social media. U.S. intelligence agencies have asserted that Russians made a concerted effort to disrupt and influence the 2016 presidential election, and there’s widespread evidence that Russia continues to sow the seeds of discord with the aim of eroding Westerners’ trust and confidence in their political systems and social norms.

Recorded Future’s Insikt Group recently published findings from their research into Chinese efforts to sway public opinion via social media, and how their goals and tactics are markedly different from those of the Russians.

We welcome back Recorded Future’s Priscilla Moriuchi to the show. She shares Insikt Group’s findings and helps put it all into broader perspective.

This podcast was produced in partnership with the CyberWire.

For those of you who’d prefer to read, here’s the transcript:

This is Recorded Future, inside threat intelligence for cybersecurity.

Dave Bittner:

Hello everyone, and welcome to episode 99 of the Recorded Future podcast. I’m Dave Bittner from the CyberWire.

There is an increasing awareness of foreign influence on American institutions through social media. U.S. intelligence agencies have asserted that Russians made a concerted effort to disrupt and influence the 2016 presidential election, and there’s widespread evidence that Russia continues to sow the seeds of discord with the aim of eroding Westerners’ trust and confidence in their political systems and social norms.

Recorded Future’s Insikt Group recently published findings from their research into Chinese efforts to sway public opinion via social media, and how Chinese goals and tactics are markedly different from those of the Russians.

We welcome back Recorded Future’s Priscilla Moriuchi to the show. She shares Insikt Group’s findings and helps put it all into broader perspective. Stay with us.

Priscilla Moriuchi:

Since the 2016 U.S. presidential election, there’s been a lot of work done: research, reporting, and resources devoted to understanding the role that Russian disinformation, or influence operations, played in the outcome of that election. As a result, there exists this implicit assumption that other state-run influence campaigns must look the same and operate in the same manner. We wanted to test that assumption, because we didn’t believe that all influence campaigns, especially their social media aspects, were necessarily the same no matter which country was behind them.

We tested it by studying the social media influence operations of China, another country that’s widely documented as engaging in real-world influence operations against the United States, going back decades. We had two fundamental research questions. One, how do Chinese state-run influence operations, specifically their social media operations, differ from the Russian ones, and in what ways? And two, what can we learn from that?

At a high level, we examined the campaigns. What we found was that both countries’ campaigns are driven by their strategic goals. Russia’s strategic goals are more disruptive. They want to undermine faith in the American democratic process, raise support for pro-Russian policies, and undermine Western alliance systems like the EU. But for China, the goals are different. They seek a larger role and greater influence in the current international system and to propagate the so-called “Chinese Dream,” the idea that China’s rise is a rising tide that lifts all boats, that China’s rise is good for the whole world. And those goals drive the methodology of their social media influence campaigns.

Dave Bittner:

Let’s start out by digging into the Russian model a bit, because I think that’s probably the one that most people are familiar with, and it has certainly gotten a lot of attention lately, particularly after all the allegations and conclusions by U.S. intelligence organizations about Russian influence in our election. So let’s just go through some of the details. What do they do, and how effective has it been?

Priscilla Moriuchi:

Sure, if I could step back just one second, though, because I want to make sure to define some of the terms so that people understand what it is that we’re talking about. It’s important, especially for the China model, to distinguish influence and propaganda from state-administered or state-sponsored media.

So, the terms that we’re using are either “social media operations” or “influence operations.” We’re mostly sticking with “social media operations” because we pull from a number of terms. One of the most important for the discussion we’re having today is a term that French researchers came up with called “information manipulation.” It’s essentially the intentional and massive dissemination of false or biased news for hostile political purposes.

There are three things to keep in mind that are very important to understanding the Chinese campaigns. Information manipulation campaigns, or social media operations, consist of one, a coordinated campaign; two, the diffusion of false information or information that is knowingly distorted; and three, the political intention to cause harm to the targets of your campaign.

For this research, we wanted to state that up front because it’s important to keep these concepts of coordination, distortion, and harm at the forefront when we discuss influence campaigns. They’re the features that distinguish these campaigns from regular propaganda or regular state-run media. So we’ll get that out of the way so that people understand we’re talking about coordinated campaigns in which the information is consciously distorted by the initiators, in this case the Chinese or Russian state, with the intent to cause harm to the target, which in this case is Americans.

If we step back and look at the Russian influence model as a baseline, we go back to the 2016 U.S. presidential election, the gold standard, I think, or at least the first major attempt, for Russian operations. The methodology they used was, first, to use a nominally private company, the Internet Research Agency. I say nominally private because it was run by a man who was well connected and had longstanding ties to President Putin. The funding around that organization is very opaque, and many believe it’s connected to the Russian state and maybe even Russian intelligence. So the first element is the use of this nominally private cover company.

Second was an evolution in content. If you look at the content disseminated beginning in 2015 and around the U.S. presidential election, much of it is what we would’ve called “fake news,” or demonstrably false information. As we’ve studied the progression of content being distributed by Russian influence operations, we’ve seen that content evolve from this kind of fake news to the propagation of what we call hyper-partisan but legitimate content. In this case, these are just sharply polarized perspectives on legitimate U.S. news stories.

These come from legitimate U.S. news sources like Fox News, CNN, and CNBC, as well as from hyper-partisan sites that are nonetheless well-read among conservative communities, like the Hannity website or Breitbart. The shift that we’re seeing is that the vast majority of these posts are actual, real news stories; they just present a sharply polarized perspective on those facts. That’s a demonstrable change from the tactic that was used in 2016.

The third technique that the Russians used very effectively is memes, which is quite unique to Russian operations. We’ve seen that documented in the Oxford University report on Russian disinformation campaigns, which drew on information provided by the Senate and showed how the Russians are able to propagate their messages via memes quite effectively.

And lastly, the operations are designed to destabilize, erode trust, promote chaos, and sow discontent. The tactics that Russia has utilized are quite distinctive as well. They express a clear preference for one candidate. We saw this, obviously, in 2016, where the U.S. Intelligence Community assessed that the Russian disinformation campaign had a preference for Donald Trump. We saw this at Recorded Future in our research on the U.S. midterms, where Russian influence accounts across social media would express a clear preference for one candidate in congressional, senatorial, or gubernatorial races. They would target that candidate’s opponents, support that candidate’s policies, and reinforce those themes across all the social media platforms that they had.

These operations also had a real-world impact and, I would say, a real-world intent: to suppress voter turnout and even propagate some secessionist messages, the idea that we, as whatever party, Republicans or Democrats, are so different from the other party and the other Americans that we simply can’t live together. Overall, these long-term goals of Russia’s, which are disruptive and destructive, dictate the model; their social media influence model therefore utilizes disruptive and destructive techniques.

Dave Bittner:

Now let me ask you, do you have any sense for what prompted the shift from disinformation to amplification?

Priscilla Moriuchi:

We believe that some of the outright fake news was not propagating as well, or getting the traction that it had gotten in 2016, because of the research, the reporting, and the knowledge that people now had about what fake news was.

Dave Bittner:

So people had been inoculated because of the detection of that … They had their guard up against it, perhaps?

Priscilla Moriuchi:

Yeah, I believe so. And there was minimal fake news in the 2018 campaign. So still some, but very minimal. And instead, it’s just this propagation and this echo chamber of the messages that they want to get across. Those messages are already out there on real U.S. news sites. All the Russian bots had to do was supercharge them and amplify them.

Dave Bittner:

Well, let’s move on to China. What’s the difference here? What do they do to advance their particular goals?

Priscilla Moriuchi:

Sure. So China, most people don’t realize, is actually what we would call the “grandfather of social media influence operations.” It’s something that they’ve been developing and testing and imposing upon their own domestic population and domestic social media services for decades, since the late 1990s. So we took that knowledge and the techniques that China used on its own domestic population and examined those to see whether they applied to China’s foreign influence operations.

The Chinese use a number of tactics. One is outright censorship, domestically. They have the ability to censor based on topic, keyword, or URL, a number of different capacities to just outright block or censor topics. Second, they block platforms or services; Facebook and Twitter, for example, are outright blocked in China. Or they require social media companies to comply with state censorship and blocking regulations, which domestic social media does. And last, they literally employ people, in what’s known as the “50 Cent Party,” with estimates ranging from about half a million to two million people, employed by the Chinese government to essentially flood Chinese social media with pro-regime or distracting comments. It’s called astroturfing: basically, faking a grassroots movement. It’s a technique that’s used quite effectively in the domestic sphere.
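For intuition, here’s a toy Python sketch of the kind of keyword- and URL-based blocking described above. The blocklists are hypothetical placeholders, and real censorship systems are far more sophisticated and not publicly documented; this only illustrates the basic mechanic.

```python
# Toy illustration of keyword- and URL-based filtering, as described above.
# The blocklists are hypothetical placeholders, not real censorship rules.
BLOCKED_KEYWORDS = {"banned topic", "banned term"}  # hypothetical keywords
BLOCKED_DOMAINS = {"facebook.com", "twitter.com"}   # platforms noted above

def is_blocked(post_text: str, linked_url: str = "") -> bool:
    """Return True if a post mentions a blocked keyword or links to a blocked domain."""
    text = post_text.lower()
    if any(keyword in text for keyword in BLOCKED_KEYWORDS):
        return True
    if any(domain in linked_url.lower() for domain in BLOCKED_DOMAINS):
        return True
    return False

print(is_blocked("a post about some banned topic"))               # True (keyword match)
print(is_blocked("a harmless post", "https://twitter.com/user"))  # True (blocked domain)
print(is_blocked("a harmless post"))                              # False
```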

So in the foreign sphere, we looked specifically at English-language, Chinese state-run media posts. As we talked about with Russia, China’s strategic goals are driven and established by the state and require coordination by the state. They center on greater influence in the international system and on propagating this Chinese Dream, which includes propagating a positive image of the Communist Party and the Chinese state, and their role in economic globalization.

We took that theory, that Chinese goals drive their tactics and that state-run media, as the main propaganda and influence outlets, are therefore the ones responsible for influencing foreign populations, and we examined state-run media. In China, there is really no media that’s not state-run, so we looked at the outlets that were connected to the intelligence and security services or that had a demonstrated and longstanding presence in the English-language sphere.

There were six publications we looked at, and we examined their social media operations across a number of platforms. These included Xinhua, People’s Daily, China Global Television Network, and a few others.

We looked at these accounts and the messages that they were putting out, and we discovered, first, that all of these accounts put out an overwhelmingly positive message. We used what’s called sentiment analysis to weigh the sentiment that these accounts were projecting on social media. All of the accounts were putting out a saccharine, glowing account of China that supports the positive image that, strategically, China is trying to propagate throughout the world.
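As a rough illustration of this kind of sentiment scoring (not Insikt Group’s actual tooling, which the transcript doesn’t detail), here’s a minimal Python sketch using NLTK’s off-the-shelf VADER analyzer on a few hypothetical posts:

```python
# Minimal sentiment-analysis sketch using NLTK's VADER analyzer.
# The posts are hypothetical examples, not data from the report.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

posts = [
    "China's breathtaking landscapes and rich heritage welcome the world!",
    "Another record-breaking year for high-speed rail and clean energy.",
    "The withdrawal from the agreement raises the risk of war.",
]

analyzer = SentimentIntensityAnalyzer()
for post in posts:
    # 'compound' ranges from -1 (most negative) to +1 (most positive)
    score = analyzer.polarity_scores(post)["compound"]
    print(f"{score:+.2f}  {post}")
```

Averaging the compound scores across all of an account’s posts gives a rough measure of how uniformly positive its messaging is.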

Dave Bittner:

What kinds of messages are we talking about here? Is this a tourism message, that China is a beautiful country, those sorts of things? Or what’s the spectrum?

Priscilla Moriuchi:

The spectrum is everything from, “China is a beautiful country with appealing cultural traditions and heritage,” all the way through the positive impact that China is having in the world on science, technology, and sports, to the last one, which I think is maybe the most insidious aspect of the influence: breaking news.

Another thing that we saw really heavily messaged among these Chinese influence accounts is that they were trying to establish these Chinese state-run media outlets as a wire news service, propagating a biased and distorted view as if it were simply China’s perspective on the global news. From what we saw, we really feel that was one of the most widely propagated messages, and really the most damaging, as well.

Dave Bittner:

Now, when you say breaking news, is this the same news that other organizations would be breaking, but with a distinctly Chinese tilt to it?

Priscilla Moriuchi:

Yeah, exactly. Especially when it comes to stories on policies or issues that China has a particular position on. So if we look at the trade issues, the trade war that the United States is engaged in with China at the moment, or the Iranian nuclear deal, the JCPOA, the stories and messages propagated by these influence accounts would have you believe that these accounts were just reporting on the news. But when you looked at the message or the content, it was advocating very specifically for China’s particular position.

So, in the trade war, the message was that China is the responsible player, the country that’s in favor of equal, fair global trade, and that the U.S. is irresponsible, the negative influence. And on the Iranian side, with the JCPOA, there was the propagation of a questioning message: “Now that the U.S. has pulled out, is war inevitable? This is the United States’ fault. There was an agreement. The U.S. pulled out. And if there’s a war, it’s the United States’ fault.”

Dave Bittner:

And it’s all wrapped in this veil of positivity?

Priscilla Moriuchi:

Yeah. It’s this overwhelmingly positive view of China, China’s policies, and the Chinese state. Whatever China does, the message is this intentionally distorted and biased narrative portraying a utopian, saccharine view of the Chinese government and party. And to go back to our definitions, the intent of these social media operations and information manipulation is to cause harm: political or social harm to their targets, which in this case are the United States and Americans.

Dave Bittner:

That’s interesting, because my first response is: if you’re portraying yourself in the best light, is it accurate to perceive that as causing harm?

Priscilla Moriuchi:

We don’t think it’s the best light. It’s a biased and distorted light. There’s a difference between, say, a tourism board in the state of California putting out a message that says, “Come visit California, we’re amazing,” and Chinese state-run media distributing a constant flow of distorted messages about China, claiming there are only good things about China, for the purpose of swaying the world’s opinion about an authoritarian, repressive, dictatorial regime.

So again, that’s why there’s a thin line, and the thin line goes back to that intent and the coordination of the campaign. We’re talking about a thin line between, like you said, putting out the best possible perspective on yourself, marketing yourself, and intentionally distorting the news and information about yourself to cause harm to your target.

Dave Bittner:

It’s interesting. Come for the beautiful vistas and the cute, cuddly pictures of panda bears, and stay for the propaganda about our political goals.

Priscilla Moriuchi:

Right. Exactly. Studies have shown, and academics have documented for quite a while, that part of this larger goal set, and we see it in these social media operations, is to exploit American openness in order to advance China’s own goals on a global playing field that’s not level. That’s what we see with these social media operations.

Dave Bittner:

What’s your sense when you contrast the two, the Russian efforts and the Chinese efforts? Is there any sense of which is more successful at achieving its goals?

Priscilla Moriuchi:

It’s difficult to say. We did one apples-to-apples comparison: we looked at audience engagement numbers on Instagram. We used some of the numbers provided by the Senate in the New Knowledge report on Russian disinformation campaigns, specifically on the Russian Internet Research Agency accounts’ use of Instagram, and we looked at just two Chinese accounts’ use of Instagram. We compared the same length of time: a generic four-month period in the Russian disinformation campaign against the four-month period we had data on for China. So, all the Russian accounts versus only two Chinese accounts, and what we saw was pretty staggering. These two Chinese accounts generated roughly one sixth of the total audience engagement, or total rough impact, of the entire Russian IRA-associated campaign targeting the United States on Instagram.

And that’s just the two accounts that we profiled. If you extrapolate the numbers from an engagement, or rough-impact, perspective to the six accounts in the study and the numerous other state-run media accounts that were not part of it, you could argue that the Chinese and Russian campaigns are having at least the same impact in support of their goals. That said, I think the impact of both is quite difficult to quantify, in part because Russia’s goals are different; they’re more divisive and discordant than China’s goals, which are more positive and much longer term.
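To make the arithmetic concrete, here’s a back-of-the-envelope sketch of that comparison in Python. The engagement totals below are illustrative placeholders, not the report’s actual figures, which aren’t given in this transcript:

```python
# Back-of-the-envelope version of the engagement comparison described above.
# Both totals are hypothetical placeholders covering the same four-month window.
ira_total_engagement = 6_000_000  # hypothetical: all IRA accounts on Instagram
two_chinese_accounts = 1_000_000  # hypothetical: two Chinese state-media accounts

share = two_chinese_accounts / ira_total_engagement
print(f"Two accounts vs. the entire IRA campaign: {share:.0%}")  # roughly one sixth

# Extrapolating the per-account rate to all six outlets in the study:
per_account = two_chinese_accounts / 2
print(f"Estimated six-account engagement: {per_account * 6:,.0f}")
```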

Dave Bittner:

I guess, in some ways, with the Russians, there’s not a whole lot of ambiguity as to what they’re up to. The negativity is right there in your face. And the Chinese method seems to be a little more, I don’t know, subversive or subtle.

Priscilla Moriuchi:

Yeah, that’s right. I mean, with China, it’s about changing the way that you think, as an American, about China and about what China is trying to do in the world. And with Russia, their social media campaign appears to be designed to divide the American public so that Russia can take strategic advantage of the weakness that results from that.

Dave Bittner:

Yeah, change the way you think about yourself as an American, dividing us, and that sort of thing. Pitting us against each other.

Priscilla Moriuchi:

Right.

Dave Bittner:

Yeah, that’s really an interesting contrast. So, what are the take-homes for you? When you gather up the information that you all brought together here, what’s the bottom line?

Priscilla Moriuchi:

One bottom line is that social media operations, influence operations, especially state-run ones, are not “one size fits all.” Each country has its own strategic goals, and those goals drive the methodologies it uses. We see that in the examination of China and Russia. Also, the impacts of these campaigns are different, and they’re hard to measure. Lastly, our big message is: don’t be an enabler, as I call it.

There have been studies done, and my favorite is one by RAND, talking about why influence campaigns and state-run propaganda are effective. We believe that, as users of social media, we are the ones responsible for propagating content, and for understanding what the content is that we’re looking at, why it’s out there, and whether or not we choose to amplify it. People are poor judges, really, of true versus false information. They don’t necessarily remember that some information was false; it will stick in their brain anyway.

Also, information overload leads people to take shortcuts in determining whether a message is truthful or trustworthy. Familiar themes and messages, even if they are false, can become appealing if you hear them enough, and they can drive your belief in, and trust of, certain sources. And peripheral cues, such as the appearance of objectivity, can increase the credibility of messages. China has seized on this by billing its state-run media as a wire news service, even though those stories are not objective, which we have demonstrated, and even though the Chinese state has said explicitly that its messaging is not objective.

So, our last appeal is, don’t enable these types of campaigns. Be more critical, accumulate as much knowledge as you can, as a user of social media, and have discretion when you choose to post and repropagate a message. Make sure you understand what that message is and who it’s coming from, and that there could be an impact beyond just yourself.

Dave Bittner:

Our thanks to Recorded Future’s Priscilla Moriuchi for joining us. The report is titled, “Beyond Hybrid War: How China Exploits Social Media to Sway American Opinion.” You can find it on the Recorded Future website. It’s in the blog section.

Don’t forget to sign up for the Recorded Future Cyber Daily email, where every day you’ll receive the top results for trending technical indicators that are crossing the web, cyber news, targeted industries, threat actors, exploited vulnerabilities, malware, suspicious IP addresses, and much more. You can find that at recordedfuture.com/intel.

We hope you’ve enjoyed the show and that you’ll subscribe and help spread the word among your colleagues and online. The Recorded Future podcast team includes Coordinating Producer Zane Pokorny, Executive Producer Greg Barrette. The show is produced by the CyberWire, with Editor John Petrik, Executive Producer Peter Kilpe, and I’m Dave Bittner.

Thanks for listening.
