
The Russian Firehose of Falsehood Propaganda


Since its 2008 incursion into Georgia (if not before), there has been a remarkable evolution in Russia’s approach to propaganda. This new approach was on full display during the country’s 2014 annexation of the Crimean peninsula. 

It continues to be demonstrated in support of ongoing conflicts in Ukraine and Syria and in pursuit of nefarious and long-term goals in Russia’s “near abroad” and against NATO allies.

In some ways, the current Russian approach to propaganda builds on Soviet Cold War–era techniques, with an emphasis on obfuscation and on getting targets to act in the interests of the propagandist without realizing that they have done so.

In other ways, it is completely new and driven by the characteristics of the contemporary information environment. Russia has taken advantage of technology and available media in ways that would have been inconceivable during the Cold War. Its tools and channels now include the Internet, social media, and the evolving landscape of professional and amateur journalism and media outlets.

We characterize the contemporary Russian model for propaganda as “the firehose of falsehood” because of two of its distinctive features: high numbers of channels and messages and a shameless willingness to disseminate partial truths or outright fictions. In the words of one observer, “[N]ew Russian propaganda entertains, confuses and overwhelms the audience.”

Contemporary Russian propaganda has at least two other distinctive features. It is also rapid, continuous, and repetitive, and it lacks commitment to consistency. Interestingly, several of these features run directly counter to the conventional wisdom on effective influence and communication from government or defense sources, which traditionally emphasize the importance of truth, credibility, and the avoidance of contradiction.

Despite ignoring these traditional principles, Russia seems to have enjoyed some success under its contemporary propaganda model, either through more direct persuasion and influence or by engaging in obfuscation, confusion, and the disruption or diminution of truthful reporting and messaging.

We offer several possible explanations for the effectiveness of Russia’s firehose of falsehood. Our observations draw from a concise, but not exhaustive, review of the literature on influence
and persuasion, as well as experimental research from the field of psychology. We explore the four identified features of the Russian propaganda model and show how and under what circumstances each might contribute to effectiveness.

Many successful aspects of Russian propaganda have surprising foundations in the psychology literature, so we conclude with a brief discussion of possible approaches from the same field for responding to or competing with such an approach.

Russian Propaganda Is High-Volume and Multichannel 
Russian propaganda is produced in incredibly large volumes and
is broadcast or otherwise distributed via a large number of channels. This propaganda includes text, video, audio, and still imagery propagated via the Internet, social media, satellite television, and traditional radio and television broadcasting. The producers and disseminators include a substantial force of paid Internet “trolls” who also often attack or undermine views or information that runs counter to Russian themes, doing so through online chat rooms, discussion forums, and comments sections on news and other websites.

Radio Free Europe/Radio Liberty reports that “there are thousands of fake accounts on Twitter, Facebook, LiveJournal, and vKontakte” maintained by Russian propagandists. According to a former paid Russian Internet troll, the trolls are on duty 24 hours a day, in 12-hour shifts, and each has a daily quota of 135 posted comments of at least 200 characters.

RT (formerly Russia Today) is one of Russia’s primary multimedia news providers. With a budget of more than $300 million per year, it broadcasts in English, French, German, Spanish, Russian, and several Eastern European languages. The channel is particularly popular online, where it claims more than a billion page views. If true, that would make it the most-watched news source on the Internet.

In addition to acknowledged Russian sources like RT, there are dozens of proxy news sites presenting Russian propaganda, but with their affiliation with Russia disguised or downplayed.

Experimental research shows that, to achieve success in  disseminating propaganda, the variety of sources matters:
• Multiple sources are more persuasive than a single source, especially if those sources contain different arguments that point to the same conclusion.
• Receiving the same or similar message from multiple sources is more persuasive.
• People assume that information from multiple sources is likely to be based on different perspectives and is thus worth greater consideration.

The number and volume of sources also matter:
• Endorsement by a large number of users boosts consumer trust, reliance, and confidence in the information, often with little attention paid to the credibility of those making the endorsements.
• When consumer interest is low, the persuasiveness of a message can depend more on the number of arguments supporting it than on the quality of those arguments.

Finally, the views of others matter, especially if the message comes from a source that shares characteristics with the recipient:
• Communications from groups to which the recipient belongs are more likely to be perceived as credible. The same applies when the source is perceived as similar to the recipient. If a propaganda channel is (or purports to be) from a group the recipient identifies with, it is more likely to be persuasive.
• Credibility can be social; that is, people are more likely to perceive a source as credible if others perceive the source as credible. This effect is even stronger when there is not enough information available to assess the trustworthiness of the source.
• When information volume is low, recipients tend to favor expert sources, but when information volume is high, recipients tend to favor information from other users. 
• In online forums, comments attacking a proponent’s expertise or trustworthiness diminish credibility and decrease the likelihood that readers will take action based on what they have read.

The experimental psychology literature suggests that, all other things being equal, messages received in greater volume and from more sources will be more persuasive. Quantity does indeed have a quality all its own. High volume can deliver other benefits that are relevant in the Russian propaganda context.

First, high volume can consume the attention and other available bandwidth of potential audiences, drowning out competing messages.
Second, high volume can overwhelm competing messages in a flood of disagreement.
Third, multiple channels increase the chances that target audiences are exposed to the message.
Fourth, receiving a message via multiple modes and from multiple sources increases the message’s perceived credibility, especially if a disseminating source is one with which an audience member identifies. 

Russian Propaganda Is Rapid, Continuous, and Repetitive
Contemporary Russian propaganda is continuous and very responsive to events. Due to their lack of commitment to objective reality (discussed later), Russian propagandists do not need to wait to check facts or verify claims; they just disseminate an interpretation of emergent events that appears to best favor their themes and objectives.

This allows them to be remarkably responsive and nimble, often broadcasting the first “news” of events (and, with similar frequency, the first news of nonevents, or things that have not actually happened). They will also repeat and recycle disinformation.

The January 14, 2016, edition of Weekly Disinformation Review reported the reemergence of several previously debunked Russian propaganda stories, including that Polish President Andrzej Duda was insisting that Ukraine return former Polish territory, that Islamic State fighters were joining pro-Ukrainian forces, and that there was a Western-backed coup in Kiev, Ukraine’s capital.

Sometimes, Russian propaganda is picked up and rebroadcast
by legitimate news outlets; more frequently, social media repeats
the themes, messages, or falsehoods introduced by one of Russia’s many dissemination channels. For example, German news sources rebroadcast Russian disinformation about atrocities in Ukraine in early 2014, and Russian disinformation about EU plans to deny visas to young Ukrainian men was repeated with such frequency in Ukrainian media that the Ukrainian general staff felt compelled to
post a rebuttal.

The experimental psychology literature tells us that first impressions are very resilient: An individual is more likely to accept the first information received on a topic and then favor this information when faced with conflicting messages.

Furthermore, repetition leads to familiarity, and familiarity leads to acceptance:
• Repeated exposure to a statement has been shown to increase its acceptance as true.
• The “illusory truth effect” is well documented, whereby people rate statements as more truthful, valid, and believable when they have encountered those statements previously than when they are new statements.
• When people are less interested in a topic, they are more likely to accept familiarity brought about by repetition as an indicator that the information (repeated to the point of familiarity) is correct.
• When processing information, consumers may save time and energy by using a frequency heuristic, that is, favoring information they have heard more frequently.
• Even with preposterous stories and urban legends, those who have heard them multiple times are more likely to believe that they are true.
• If an individual is already familiar with an argument or claim (has seen it before, for example), they process it less carefully, often failing to discriminate weak arguments from strong arguments.

Russian propaganda has the agility to be first, which affords propagandists the opportunity to create the first impression. Then, the combination of high-volume, multichannel, and continuous messaging makes Russian themes more likely to be familiar to their audiences, which gives them a boost in terms of perceived credibility, expertise, and trustworthiness.

Russian Propaganda Makes No Commitment to Objective Reality
It may come as little surprise that the psychology literature supports the persuasive potential of high-volume, diverse channels and sources, along with rapidity and repetition. These aspects of Russian propaganda make intuitive sense. One would expect any influence effort to enjoy greater success if it is backed by a willingness to invest in additional volume and channels and if its architects find ways to increase the frequency and responsiveness of messages.

This next characteristic, however, flies in the face of intuition and conventional wisdom, which can be paraphrased as “The truth always wins.”

Contemporary Russian propaganda makes little or no commitment to the truth. This is not to say that all of it is false. Quite the contrary: It often contains a significant fraction of the truth. Sometimes, however, events reported in Russian propaganda are wholly manufactured, like the 2014 social media campaign to create panic about an explosion and chemical plume in St. Mary’s Parish, Louisiana, that never happened.

Russian propaganda has relied on manufactured evidence—often photographic. Some of these images are easily exposed as fake due to poor photo editing, such as discrepancies of scale, or the availability of the original (pre-altered) image.

Russian propagandists have been caught hiring actors to portray victims of manufactured atrocities or crimes for news reports (as was the case when Viktoria Schmidt pretended to have been attacked by Syrian refugees in Germany for Russia’s Zvezda TV network), or faking on-scene news reporting (as shown in a leaked video in which “reporter” Maria Katasonova is revealed to be in a darkened room with explosion sounds playing in the background rather than on a battlefield in Donetsk when a light is switched on during the recording).

In addition to manufacturing information, Russian propagandists often manufacture sources. Russian news channels, such as
RT and Sputnik News, are more like a blend of infotainment and
disinformation than fact-checked journalism, though their formats
intentionally take the appearance of proper news programs. Russian news channels and other forms of media also misquote credible sources or cite a more credible source as the origin of a
selected falsehood.

For example, RT stated that blogger Brown Moses (a staunch critic of Syria’s Assad regime whose real name is Eliot Higgins) had provided analysis of footage suggesting that chemical weapon attacks on August 21, 2013, had been perpetrated by Syrian rebels. In fact, Higgins’s analysis concluded that the Syrian government was responsible for the attacks and that the footage had been faked to shift the blame.

Similarly, several scholars and journalists, including Edward Lucas, Luke Harding, and Don Jensen, have reported that books that they did not write—and containing views clearly contrary to their own—had been published in Russian under their names. “The Kremlin’s spin machine wants to portray Russia as a besieged fortress surrounded by malevolent outsiders,” said Lucas of his misattributed volume, How the West Lost to Putin.

Why might this disinformation be effective?

First, people are often cognitively lazy. Due to information overload (especially on the Internet), they use a number of different heuristics and shortcuts to determine whether new information is trustworthy.

Second, people are often poor at discriminating true information
from false information—or remembering that they have done so previously. The following are a few examples from the literature:
• In a phenomenon known as the “sleeper effect,” low-credibility sources manifest greater persuasive impact with the passage of time. While people make initial assessments of the credibility of a source, in remembering, information is often dissociated from its source. Thus, information from a questionable source may be remembered as true, with the source forgotten.
• Information that is initially assumed valid but is later retracted or proven false can continue to shape people’s memory and influence their reasoning.
• Even when people are aware that some sources (such as political campaign rhetoric) have the potential to contain misinformation, they still show a poor ability to discriminate between information that is false and information that is correct. Familiar themes or messages can be appealing even if these themes and messages are false.

Information that connects with group identities or familiar narratives—or that arouses emotion—can be particularly persuasive. The literature describes the effects of this approach:
• Someone is more likely to accept information when it is consistent with other messages that the person believes to be true.
• People suffer from “confirmation bias”: They view news and opinions that confirm existing beliefs as more credible than other news and opinions, regardless of the quality of the arguments.
• Someone who is already misinformed (that is, believes something that is not true) is less likely to accept evidence that goes against those misinformed beliefs.
• People whose peer group is affected by an event are much more likely to accept conspiracy theories about that event.
• Stories or accounts that create emotional arousal in the recipient (e.g., disgust, fear, happiness) are much more likely to be passed on, whether they are true or not.
• Angry messages are more persuasive to angry audiences.

False statements are more likely to be accepted if backed by evidence, even if that evidence is false:
• The presence of evidence can override the effects of source credibility on perceived veracity of statements.
• In courtroom simulations, witnesses who provide more details—even trivial details—are judged to be more credible.

Finally, source credibility is often assessed based on “peripheral
cues,” which may or may not conform to the reality of the situation. A broadcast that looks like a news broadcast, even if it is actually a propaganda broadcast, may be accorded the same degree of credibility as an actual news broadcast.

Findings from the field of psychology show how peripheral cues can increase the credibility of propaganda:
• Peripheral cues, such as the appearance of expertise or the format of information, lead people to accept—with little reflection—that the information comes from a credible source.
• Expertise and trustworthiness are the two primary dimensions of credibility, and these qualities may be evaluated based on visual cues, such as format, appearance, or simple claims of expertise.
• Online news sites are perceived as more credible than other online formats, regardless of the veracity of the content.

The Russian firehose of falsehood takes advantage of all five of these factors. A certain proportion of falsehood in Russian propaganda may just be accepted by audiences because they do not recognize it as false or because various cues lead them to assign it greater credibility than they should. This proportion actually increases over time, with people forgetting that they have rejected certain offered “facts.”

The proportion of falsehoods accepted increases even more when the disinformation is consistent with narratives or preconceptions held by various audiences. Where evidence is presented or seemingly credible sources disseminate the falsehoods, the messages are even more likely to be accepted. This is why Russian faux-news propaganda channels, such as RT and Sputnik, are so insidious. Visually, they look like news programs, and the persons appearing on them are represented as journalists and experts, making audience members much more likely to ascribe credibility to the misinformation these sources are disseminating.

Russian Propaganda Is Not Committed to Consistency
The final distinctive characteristic of Russian propaganda is that it is not committed to consistency.

First, different propaganda media do not necessarily broadcast the exact same themes or messages.
Second, different channels do not necessarily broadcast the same account of contested events.
Third, different channels or representatives show no fear of “changing their tune.” If one falsehood or misrepresentation is exposed or is not well received, the propagandists will discard it and move on to a new (though not necessarily more plausible) explanation.

One example of such behavior is the string of accounts offered for the downing of Malaysia Airlines Flight MH17. Russian sources have offered numerous theories about how the aircraft came to be shot down and by whom, very few of which are plausible. Lack of commitment to consistency is also apparent in statements from Russian President Vladimir Putin.

For example, he first denied that the “little green men” in Crimea were Russian soldiers but later admitted that they were. Similarly, he at first denied any desire to see Crimea join Russia, but then he admitted that that had been his plan all along.

Again, this flies in the face of the conventional wisdom on influence and persuasion. If sources are not consistent, how can they be credible? Inconsistency can indeed cost credibility, for example, when recipients make an effort to scrutinize inconsistent messages from the same source.

However, the literature in experimental psychology also shows that audiences can overlook contradictions under certain circumstances:
• Contradictions can prompt a desire to understand why a shift in opinion or messages occurred. When a seemingly
strong argument for a shift is provided or assumed (e.g., more thought is given or more information is obtained), the new message can have a greater persuasive impact.
• When a source appears to have considered different perspectives, consumer attitudinal confidence is greater. A source who changes his or her opinion or message may be perceived as having given greater consideration to the topic, thereby influencing recipient confidence in the newest message. 

Potential losses in credibility due to inconsistency are potentially offset by synergies with other characteristics of contemporary propaganda. As noted earlier in the discussion of multiple channels, the presentation of multiple arguments by multiple sources is
more persuasive than either the presentation of multiple arguments
by one source or the presentation of one argument by multiple
sources.

These losses can also be offset by peripheral cues that enforce perceptions of credibility, trustworthiness, or legitimacy. Even if a channel or individual propagandist changes accounts of events from one day to the next, viewers are likely to evaluate the credibility of the new account without giving too much weight to the prior, “mistaken” account, provided that there are peripheral
cues suggesting the source is credible.

While the psychology literature suggests that the Russian propaganda enterprise suffers little when channels are inconsistent with each other, or when a single channel is internally inconsistent, it is unclear how inconsistency accumulates for a single prominent figure.

While inconsistent accounts by different propagandists on RT, for example, might be excused as the views of different journalists or changes due to updated information, the fabrications of Vladimir Putin have been unambiguously attributed to him, which cannot be good for his personal credibility.

Of course, perhaps many people have a low baseline expectation of the veracity of statements by politicians and world leaders. To the extent that this is the case, Putin’s fabrications, though more egregious than the routine, might be perceived as just more of what is expected from politicians in general and might not constrain his future influence potential.

What Can Be Done to Counter the Firehose of Falsehood?
Experimental research in psychology suggests that the features of the contemporary Russian propaganda model have the potential to be highly effective. Even those features that run counter to conventional wisdom on effective influence (e.g., the importance of veracity and consistency) receive some support in the literature.

If the Russian approach to propaganda is effective, then what can be done about it? We conclude with a few thoughts about how NATO, the United States, or other opponents of the firehose of falsehood might better compete. The first step is to recognize that this is a nontrivial challenge.

Indeed, the very factors that make the firehose of falsehood effective also make it quite difficult to counter: For example, the high volume and multitude of channels for Russian propaganda offer proportionately limited yield if one channel is taken off the air (or offline) or if a single misleading voice is discredited.

The persuasive benefits that Russian propagandists gain from presenting the first version of events (which then must be dislodged by true accounts at much greater effort) could be removed if the true accounts were instead presented first. But while credible and professional journalists are still checking their facts, the Russian firehose of falsehood is already flowing: It takes less time to make up facts than it does to verify them.

We are not optimistic about the effectiveness of traditional counterpropaganda efforts. Certainly, some effort must be made to point out falsehoods and inconsistencies, but the same psychological evidence that shows how falsehood and inconsistency gain traction also tells us that retractions and refutations are seldom effective. Especially after a significant amount of time has passed, people will have trouble recalling which information they have received is the disinformation and which is the truth. Put simply, our first suggestion is don’t expect to counter the firehose of falsehood with the squirt gun of truth.

To the extent that efforts to directly counter or refute Russian propaganda are necessary, there are some best practices available—also drawn from the field of psychology—that can and should be employed. Three factors have been shown to increase the (limited) effectiveness of retractions and refutations:
(1) warnings at the time of initial exposure to misinformation,
(2) repetition of the retraction or refutation, and
(3) corrections that provide an alternative story to help fill the resulting gap in understanding when false “facts” are removed.

Forewarning is perhaps more effective than retraction or refutation of propaganda that has already been received. The research suggests two possible avenues:
• Propagandists gain advantage by offering the first impression, which is hard to overcome. If, however, potential audiences have already been primed with correct information, the disinformation finds itself in the same role as a retraction or refutation: disadvantaged relative to what is already known.
• When people resist persuasion or influence, that act reinforces their preexisting beliefs.

It may be more productive to highlight the ways in which Russian propagandists attempt to manipulate audiences, rather than fighting the specific manipulations. In practice, getting in front of misinformation and raising awareness of misinformation might involve more robust and more widely publicized efforts to “out” Russian propaganda sources and the nature of their efforts. Alternatively, it could take the form of sanctions, fines, or other barriers against the practice of propaganda under the guise of journalism. The UK communications regulator, Ofcom, has sanctioned RT for biased or misleading programs, but more is needed.

Our second suggestion is to find ways to help put raincoats on those at whom the firehose of falsehood is being directed. Another possibility is to focus on countering the effects of Russian propaganda, rather than the propaganda itself. The propagandists are working to accomplish something. The goal may be a change in attitudes, behaviors, or both. Identify those desired effects and then work to counter the effects that run contrary to your goals.

For example, suppose the goal of a set of Russian propaganda products is to undermine the willingness of citizens in NATO countries to respond to Russian aggression. Rather than trying to block, refute, or undermine the propaganda, focus instead on countering its objective. This could be accomplished through efforts to, for example, boost support for a response to Russian aggression, promote solidarity and identity with threatened NATO partners, or reaffirm international commitments.

Thinking about the problem in this way leads to several positive developments. It encourages prioritization: Do not worry so much about countering propaganda that contributes to effects that are not of concern. This view also opens up the aperture.

Rather than just trying to counter disinformation with other information, it might be possible to thwart desired effects with other capabilities—or to simply apply information efforts to redirecting behaviors or attitudes without ever directly engaging with the propaganda. That leads to our third suggestion: Don’t direct your flow of information directly back at the firehose of falsehood; instead, point your stream at whatever the firehose is aimed at, and try to push that audience in more productive directions.

That metaphor and mindset leads us to our fourth suggestion for responding to Russian propaganda: Compete! If Russian propaganda aims to achieve certain effects, it can be countered by preventing or diminishing those effects.

Yet, the tools of the Russian propagandists may not be available due to resource constraints or policy, legal, or ethical barriers. Although it may be difficult or impossible to directly refute Russian propaganda, both NATO and the United States have a range of capabilities to inform, influence, and persuade selected target audiences. Increase the flow of persuasive information and start to compete, seeking to generate effects that support U.S. and NATO objectives.

Our fifth and final suggestion for addressing the challenge of Russian propaganda is to use various technical means to turn off (or turn down) the flow. If the firehose of falsehood is being employed as part of active hostilities, or if counterpropaganda efforts escalate to include the use of a wider range of information warfare capabilities, then jamming, corrupting, degrading, destroying, usurping, or otherwise interfering with the ability of the propagandists to broadcast and disseminate their messages could diminish the impact of their efforts.

Anything from aggressive enforcement of terms of service agreements with Internet providers and social media services to electronic warfare or cyberspace operations could lower
the volume—and the impact—of Russian propaganda. 

References

Alba, Joseph W., and Howard Marmorstein, “The Effects of Frequency Knowledge on Consumer Decision Making,” Journal of Consumer Research, Vol. 14, No. 1, June 1987, pp. 14–25.

Balmforth, Tom, “You Pay, I Say: Website Says It Exposed Russian TV Fakery,” Radio Free Europe/Radio Liberty, February 4, 2016. As of June 1, 2016.

Bell, Brad E., and Elizabeth F. Loftus, “Trivial Persuasion in the Courtroom:  The Power of (a Few) Minor Details,” Journal of Personality and Social Psychology, Vol. 56, No. 5, May 1989, pp. 669–679.

Bertolin, Giorgio, “Conceptualizing Russian Information Operations: Info-War and Infiltration in the Context of Hybrid Warfare,” IO Sphere, Summer 2015, pp. 10–11. As of June 1, 2016.

Chen, Adrian, “The Agency,” New York Times Magazine, June 2, 2015.

Claypool, Heather M., Diane M. Mackie, Teresa Garcia-Marques, Ashley McIntosh, and Ashton Udall, “The Effects of Personal Relevance and Repetition on Persuasive Processing,” Social Cognition, Vol. 22, No. 3, June 2004, pp. 310–335.

Davis, Julia, “Russia’s Top 100 Lies About Ukraine,” The Examiner, August 11, 2014.

DeSteno, David, Richard E. Petty, Derek D. Rucker, Duane T. Wegener, and Julia Braverman, “Discrete Emotions and Persuasion: The Role of Emotion-Induced Expectancies,” Journal of Personality and Social Psychology, Vol. 86, No. 1, January 2004, pp. 43–56.

Disinformation Review, “Weekly Disinformation Review,” January 14, 2016. 

Ecker, Ullrich K. H., Stephan Lewandowsky, Olivia Fenton, and Kelsey Martin, “Do People Keep Believing Because They Want to? Preexisting Attitudes and Continued Influence of Misinformation,” Memory and Cognition, Vol. 42, No. 2, 2014, pp. 292–304.

Flanagin, Andrew J., and Miriam J. Metzger, “The Role of Site Features, User Attributes, and Information Verification Behaviors on the Perceived Credibility of Web-Based Information,” New Media and Society, Vol. 9, No. 2, April 2007, pp. 319–342.

Garcia-Marques, Teresa, and Diane M. Mackie, “The Feeling of Familiarity as a Regulator of Persuasive Processing,” Social Cognition, Vol. 19, No. 1, 2001, pp. 9–34.

Goble, Paul A., “Top 10 Fakes of Russian Propaganda About Ukraine in 2015,” Euromaidan Press, December 26, 2015. As of June 1, 2016.

Harkins, Stephen G., and Richard E. Petty, “The Multiple Source Effect in Persuasion: The Effects of Distraction,” Personality and Social Psychology Bulletin, Vol. 7, No. 4, December 1981, pp. 627–635.

Henkel, Linda A., and Mark E. Mattson, “Reading Is Believing: The Truth Effect and Source Credibility,” Consciousness and Cognition, Vol. 20, No. 4, December 2011, pp. 1705–1721.

Hughes, Michael G., Jennifer A. Griffith, Thomas A. Zeni, Matthew L. Arsenault, Olivia D. Copper, Genevieve Johnson, Jay H. Hardy, Shane Connelly, and Michael D. Mumford, “Discrediting in a Message Board Forum: The Effects of Social Support and Attacks on Expertise and Trustworthiness,” Journal of Computer-Mediated Communication, Vol. 19, No. 3, April 2014, pp. 325–341.

Jackson, Jasper, “RT Sanctioned by Ofcom Over Series of Misleading and Biased Articles,” The Guardian, September 21, 2015. As of June 1, 2016.

Kelley, Michael B., and Brett LoGiurato, “Russia’s Military Tells a Very Different Story About What Happened to MH17,” Business Insider, July 21, 2014. As of June 1, 2016.

Lau, Richard R., “Negativity in Political Perception,” Political Behavior, Vol. 4, No. 4, December 1982, pp. 353–377.

Lelich, Milan, “Victims of Russian Propaganda,” New Eastern Europe, July 25, 2014. As of June 1, 2016.

Lewandowsky, Stephan, Ullrich K. H. Ecker, Colleen M. Seifert, Norbert Schwarz, and John Cook, “Misinformation and Its Correction: Continued Influence and Successful Debiasing,” Psychological Science in the Public Interest, Vol. 13, No. 3, December 2012, pp. 106–131.

Lucas, Edward, “Russia Has Published Books I Didn’t Write!” Daily Beast, August 20, 2015. As of June 1, 2016.

McCroskey, James C., and Thomas J. Young, “Ethos and Credibility: The Construct and Its Measurement After Three Decades,” Central States Speech Journal, Vol. 32, No. 1, 1981, pp. 24–34.

Metzger, Miriam J., and Andrew J. Flanagin, “Credibility and Trust of
Information in Online Environments: The Use of Cognitive Heuristics,” Journal of Pragmatics, Vol. 59, Part B, December 2013, pp. 210–220.

Miller, James, “Russian Media: Conspiracy Theories and Reading Comprehension Issues,” The Interpreter, September 18, 2013. As of June 1, 2016.

Muñoz, Arturo, U.S. Military Information Operations in Afghanistan: Effectiveness of Psychological Operations 2001–2010, Santa Monica, Calif.: RAND Corporation, MG-1060, 2012. As of June 1, 2016.

Oliker, Olga, “Russia’s New Military Doctrine: Same as the Old Doctrine, Mostly,” Washington Post, January 15, 2015.

Paul, Christopher, Strategic Communication: Origins, Concepts, and Current Debates, Santa Barbara, Calif.: Praeger Security International, 2011.

Petty, Richard E., John T. Cacioppo, Alan J. Strathman, and Joseph R. Priester, “To Think or Not to Think: Exploring Two Routes to Persuasion,” in Timothy C. Brock and Melanie C. Green, eds., Persuasion: Psychological Insights and Perspectives, 2nd ed., Thousand Oaks, Calif.: Sage Publications, 2005, pp. 81–116.

Pifer, Steven, “Putin, Lies and His ‘Little Green Men,’” CNN, March 20, 2015.

Pomerantsev, Peter, and Michael Weiss, The Menace of Unreality: How the Kremlin Weaponizes Information, Culture and Money, New York: Institute of Modern Russia and The Interpreter, 2014.

Pornpitakpan, Chanthika, “The Persuasiveness of Source Credibility: A Critical Review of Five Decades’ Evidence,” Journal of Applied Social Psychology, Vol. 34, No. 2, February 2004, pp. 243–281.

Reich, Taly, and Zakary L. Tormala, “When Contradictions Foster Persuasion: An Attributional Perspective,” Journal of Experimental Social Psychology, Vol. 49, No. 3, May 2013, pp. 426–439.

Rucker, Derek D., Richard E. Petty, and Pablo Briñol, “What’s in a Frame Anyway? A Meta-Cognitive Analysis of the Impact of One Versus Two Sided Message Framing on Attitude Certainty,” Journal of Consumer Psychology, Vol. 18, No. 2, April 2008, pp. 137–149.

Smith, Oli, “Watch: Russia’s Fake Ukraine War Report Exposed in Putin PR Disaster,” Express, August 24, 2015. As of June 1, 2016.

Tree, Jean E. Fox, and Mary Susan Eldon, “Retelling Urban Legends,” American Journal of Psychology, Vol. 120, No. 3, Fall 2007, pp. 459–476.

U.S. Department of Defense, Defense Science Board, Report of the Defense Science Board Task Force on Strategic Communication, Washington, D.C., January 2008. As of June 1, 2016.

Van Prooijen, Jan-Willem, and Eric van Dijk, “When Consequence Size Predicts Belief in Conspiracy Theories: The Moderating Role of Perspective Taking,” Journal of Experimental Social Psychology, Vol. 44, November 2014, pp. 63–73.

Volchek, Dmitry, and Daisy Sindelar, “One Professional Russian Troll Tells All,” Radio Free Europe/Radio Liberty, March 25, 2015. As of June 1, 2016.

Ziegler, René, Michael Diehl, Rafael Zigon, and Torsten Fett, “Source Consistency, Distinctiveness, and Consensus: The Three Dimensions of the Kelley ANOVA Model of Persuasion,” Personality and Social Psychology Bulletin, Vol. 30, No. 3, March 2004, pp. 352–364.

Notes

Olga Oliker, “Russia’s New Military Doctrine: Same as the Old Doctrine, Mostly,” Washington Post, January 15, 2015.

Giorgio Bertolin, “Conceptualizing Russian Information Operations: Info-War and Infiltration in the Context of Hybrid Warfare,” IO Sphere, Summer 2015, p. 10.

See Adrian Chen, “The Agency,” New York Times Magazine, June 2, 2015, and Peter Pomerantsev and Michael Weiss, The Menace of Unreality: How the Kremlin Weaponizes Information, Culture and Money, New York: Institute of Modern Russia and The Interpreter, 2014.

Dmitry Volchek and Daisy Sindelar, “One Professional Russian Troll Tells All,” Radio Free Europe/Radio Liberty, March 25, 2015.

The point on the sleeper effect and credibility is from Pornpitakpan, 2004, and Henkel and Mattson, 2011. See also Lewandowsky et al., 2012, and Ullrich K. H. Ecker, Stephan Lewandowsky, Olivia Fenton, and Kelsey Martin, “Do People Keep Believing Because They Want to? Preexisting Attitudes and Continued Influence of Misinformation,” Memory and Cognition, Vol. 42, No. 2, 2014. The point on information that is later retracted or proven false is from Ecker et al., 2014. See also Lewandowsky et al., 2012. The point on awareness of potential misinformation is from Lewandowsky et al., 2012.

These points on evidence and credibility are from, respectively, Pornpitakpan, 2004, and Brad E. Bell and Elizabeth F. Loftus, “Trivial Persuasion in the Courtroom: The Power of (a Few) Minor Details,” Journal of Personality and Social Psychology, Vol. 56, No. 5, May 1989.

René Ziegler, Michael Diehl, Rafael Zigon, and Torsten Fett, “Source Consistency, Distinctiveness, and Consensus: The Three Dimensions of the Kelley ANOVA Model of Persuasion,” Personality and Social Psychology Bulletin, Vol. 30, No. 3, March 2004.

Source:
(PDF) The Russian “Firehose of Falsehood” Propaganda - www.rand.org 


This document and the trademark(s) contained herein are protected by law. This representation of RAND intellectual property is provided for noncommercial use only.

The RAND Corporation is a research organization that develops solutions to public policy challenges to help make communities throughout the world safer and more secure, healthier and more prosperous.

The RAND Corporation
RAND is nonprofit, nonpartisan, and committed to the public interest. RAND’s publications do not necessarily reflect the opinions of its research clients and sponsors. RAND® is a registered trademark. All Rights Reserved.

Unauthorized posting of this publication online is prohibited. Permission is given to duplicate this document for personal use only, as long as it is unaltered and complete. Permission is required from RAND to reproduce, or reuse in another form, any of its research documents for commercial use. For information on reprint and linking permissions, please visit www.rand.org/pubs/permissions.

About the Authors:

Christopher Paul is a senior social scientist at RAND and a professor at the Pardee RAND Graduate School. He is also an adjunct faculty member in the Center for Economic Development in the Heinz College at Carnegie Mellon University.

He focuses on developing methodological competencies for comparative historical and case-study approaches, evaluation research, various forms of quantitative analysis, and survey research. He has published on such topics as insurgency and counterinsurgency, building
international partner capacity, and information operations and strategic communication.

Miriam Matthews is a behavioral and social scientist at RAND and a professor at the Pardee RAND Graduate School. She conducts research in the areas of political psychology, international conflict, and diversity and multiculturalism.

She has published on the factors that contribute to negative intergroup attitudes, the influence of acculturation ideologies, the effects of threats on political attitudes, and the origins of support for anti-Western jihad.