50 S S Q W 2017
Commanding the Trend:
Social Media as Information Warfare
Lt Col Jarred Prier, USAF
Lt Col Jarred Prier, USAF, currently serves as director of operations for the 20th Bomb Squadron. He completed a USAF fellowship at the Walsh School of Foreign Service at Georgetown University and earned a master’s degree from the School of Advanced Air and Space Studies at Air University, Maxwell Air Force Base, Alabama. Prier also holds a master of science degree in international relations from Troy University, Alabama. This article evolved from his thesis.

Abstract
This article demonstrates how social media is a tool for modern information-age warfare. It builds on analysis of three distinct topics: social networking, propaganda, and news and information sharing. Two case studies are used to show how state and nonstate actors use social media to employ time-tested propaganda techniques to yield far-reaching results. The spread of the propaganda message is accomplished by tapping into an existing narrative, then amplifying that message with a network of automatic “bot” accounts to force the social media platform algorithm to recognize that message as a trending topic. The first case study analyzes Islamic State (IS) as a nonstate actor, while the second case observes Russia as a state actor, with each providing evidence of successful influence operations using social media. Coercion and persuasion will continue to be decisive factors in information warfare as more countries attempt to build influence operations on social media.
✵ ✵ ✵ ✵ ✵
For years, analysts in the defense and intelligence communities have warned lawmakers and the American public of the risks of a cyber Pearl Harbor. The fear of a widespread cyber-based attack loomed over the country following intrusions against Yahoo! email accounts in 2012, Sony Studios in 2014, and even the United States government Office of Personnel Management (OPM) in 2015. The average American likely did not understand exactly how, or for what purposes, US adversaries
were operating within the cyber domain, but the implications of future attacks were not difficult to imagine. Enemies of the United States could target vulnerable power grids, stock markets, train switches, academic institutions, banks, and communications systems in the opening salvos of this new type of warfare.1
In contrast to more traditional forms of cyberattack, cyber operations today target people within a society, influencing their beliefs as well as behaviors, and diminishing trust in the government. US adversaries now seek to control and exploit the trend mechanism on social media to harm US interests, discredit public and private institutions, and sow domestic strife. “Commanding the trend” represents a relatively novel and increasingly dangerous means of persuasion within social media. Thus, instead of attacking the military or economic infrastructure, state and nonstate actors outside the United States can access regular streams of online information via social media to influence networked groups within the United States. This article analyzes how two US adversaries hijacked social media using four factors associated with command of the trend. First, it provides a basis for commanding the trend in social media by analyzing social media as a tool for obtaining and spreading information. It then looks more specifically at how US adversaries use social media to command the trend and target US citizens with malicious propaganda. Next, the two most prominent, recent case studies provide evidence of how nonstate and state actors use social media to counter the United States. The first case study covers IS from 2014 to 2016 by examining the group’s use of social media for recruiting, spreading propaganda, and proliferating terror threats. The second case describes the pattern of Russian hacking, espionage, disinformation, and manipulation of social media with a particular focus on the United States presidential election of 2016. Evidence for this second case study comes from nearly two years of research on Twitter accounts believed to be part of a Russian information warfare network. The article concludes with implications and predictions of how social media will continue to develop, what can be expected in the future, and how the United States can respond to the growing threat of adversaries commanding the trend.
Commanding the Trend in Social Media
The adaptation of social media as a tool of modern warfare should not be surprising. Internet technology evolved to meet the needs of
information-age warfare around 2006 with the dawn of Web 2.0, which allowed internet users to create content instead of just consuming online material. The individual could now decide what was important and read only that, on demand. Not only could users select what news they wanted to see, but they could also use the medium to create news based on their opinions.2 The social nature of humans ultimately led to virtual networking. As such, traditional forms of media were bound to give way to a more tailorable form of communication. US adversaries were quick to find ways to exploit the openness of the internet, eventually developing techniques to employ social media networks as a tool to spread propaganda. Social media creates a point of injection for propaganda and has become the nexus of information operations and cyber warfare. To understand this we must examine the important concept of the social media trend and look briefly into the fundamentals of propaganda. Also important is the spread of news on social media, specifically the spread of “fake news,” and how propaganda penetrates mainstream media outlets.
Trending Social Media
Social media sites like Twitter and Facebook employ an algorithm to analyze words, phrases, or hashtags to create a list of topics sorted in order of popularity. This “trend list” is a quick way to review the most discussed topics at a given time. According to a 2011 study on social media, a trending topic “will capture the attention of a large audience for a short time” and thus “contributes to agenda setting mechanisms.”3
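How a platform computes that list matters, because the mechanism can be gamed. The exact ranking algorithms are proprietary, but a minimal sketch of the core idea, flagging hashtags whose short-term volume spikes well above their own recent baseline, might look like the following (the thresholds and function here are illustrative assumptions, not Twitter’s or Facebook’s actual logic):

```python
from collections import Counter, deque

# Minimal illustrative trend detector: a hashtag "trends" when its volume in
# the current time window spikes well above its own recent baseline.
WINDOW_HISTORY = 6    # how many past windows form the baseline
SPIKE_RATIO = 3.0     # current volume must exceed 3x the baseline average
MIN_VOLUME = 100      # ignore hashtags with trivial absolute volume

history: dict[str, deque] = {}   # hashtag -> volumes seen in past windows

def update_trends(window_counts: Counter) -> list[str]:
    """Return hashtags spiking above baseline, most popular first."""
    trending = []
    for tag, volume in window_counts.items():
        past = history.setdefault(tag, deque(maxlen=WINDOW_HISTORY))
        baseline = sum(past) / len(past) if past else 0.0
        if volume >= MIN_VOLUME and volume > SPIKE_RATIO * max(baseline, 1.0):
            trending.append(tag)
        past.append(volume)
    return sorted(trending, key=lambda t: window_counts[t], reverse=True)
```

A detector of this shape rewards sudden, coordinated bursts of activity over steady conversation, which is precisely the property the techniques described below exploit.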
Using existing online networks in conjunction with automatic “bot” accounts, foreign agents can insert propaganda into a social media platform, create a trend, and rapidly disseminate a message faster and cheaper than through any other medium. Social media facilitates the spread of a narrative outside a particular social cluster of true believers by commanding the trend. It hinges on four factors: (1) a message that fits an existing, even if obscure, narrative; (2) a group of true believers predisposed to the message; (3) a relatively small team of agents or cyber warriors; and (4) a network of automated “bot” accounts.
The existing narrative and the true believers who subscribe to it are endogenous, so any propaganda must fit that narrative to penetrate the network of true believers. Usually, the cyber team is responsible for crafting the specific message for dissemination. The cyber team then generates
videos, memes, or fake news, often in collusion with the true believers. To achieve the effective spread of propaganda, the true believers, the cyber team, and the bot network combine efforts to take command of the trend. Thus, an adversary in the information age can influence the population using a variety of propaganda techniques, primarily through social media combined with online news sources and traditional forms of media.
A trending topic transcends networks and becomes the mechanism for the spread of information across social clusters. Here the focus is primarily on Twitter, a “microblogging” site where each post is limited to 140 characters.4 Facebook also has a trends list, but it is less visible than the Twitter trends list, and the two applications serve different purposes. Facebook maintains a function of bringing friends and families together. On Facebook, your connections are typically more intimate than those you would expect on Twitter, which focuses less on bringing people together and more on bringing ideas together. As a microblog, Twitter’s core notion is to share your thoughts and feelings about the world around you with a group of people who share similar interests. The individuals who follow each other may not be friends but could be a team of like-minded academics, journalists, sports fans, or politicos. When a person tweets, that tweet can be viewed by anyone who follows that person, or anyone who searches for that topic using Twitter’s search tool. Additionally, anyone can “retweet” someone else’s tweet, which broadcasts the original to a new audience. Twitter makes real-time idea and event sharing possible on a global scale.5 Another method for quick referencing on Twitter is the “hashtag,” a word or phrase prefixed with the “#” symbol. A tweet using a hashtag is visible to anyone who clicks on that link, along with all of the other tweets using the same hashtag.
A trend can spread a message to a wide group outside of a person’s typical social network. Moreover, malicious actors can use trends to spread a message using multiple forms of media on multiple platforms, with the ultimate goal of garnering coverage in the mainstream media. Command of the trend is a powerful method of spreading information whereby, according to an article in the Guardian, “you can take an existing trending topic, such as fake news, and then weaponise it. You can turn it against the very media that uncovered it.”6
Because Twitter is an idea-sharing platform, it is very popular for rapidly
spreading information, especially among journalists and academics;
however, malicious users have also taken to Twitter for the same benefits in recent years. At one time, groups like al-Qaeda preferred creating websites, but now, “Twitter has emerged as the internet application most preferred by terrorists, even more popular than self-designed websites or Facebook.”7 Twitter makes it easy to spread a message to both supporters and foes outside of a particular network. Groups trying to disseminate a message as widely as possible can rely on the trend function to reach across multiple networks.
Three methods help control what is trending on social media: trend distribution, trend hijacking, and trend creation. The first method is relatively easy and requires the least amount of resources. Trend distribution is simply applying a message to every trending topic. For example, someone could tweet a picture of the president with a message in the form of a meme—a stylistic device that applies culturally relevant humor to a photo or video—along with the unrelated hashtag #SuperBowl. Anyone who clicks on that trend list expecting to see something about football will see that meme of the president. Trend hijacking requires more resources in the form of either more followers spreading the message or a network of “bots” (autonomous programs that can interact with computer systems or users) designed to spread the message automatically. Of the three methods to gain command of the trend, trend creation requires the most effort. It necessitates either money to promote a trend or knowledge of the social media environment around the topic, and most likely, a network of several automatic bot accounts.
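As a concrete illustration of the cheapest method, the toy loop below performs trend distribution by attaching one prepared message to every currently trending topic. fetch_trending and post are hypothetical stand-ins for a platform client; no real API is implied:

```python
import time

# Toy simulation of trend distribution: ride every trending topic with the
# same message, related or not.
def fetch_trending() -> list[str]:
    # Hypothetical stand-in; a real operator would query the platform here.
    return ["#SuperBowl", "#MondayMotivation", "#BreakingNews"]

def post(text: str) -> None:
    print("POSTED:", text)   # stand-in for an authenticated posting call

MESSAGE = "Look at this meme of the president!"

for hashtag in fetch_trending():
    post(f"{MESSAGE} {hashtag}")
    time.sleep(2)   # space out posts so the activity looks less automated
```

Because the message piggybacks on traffic that already exists, this method needs no followers at all; everyone browsing the tagged topic sees it.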
Bot accounts are non-human accounts that automatically tweet and retweet based on a set of programmed rules. In 2014, Twitter estimated that only 5 percent of accounts were bots; that number has grown along with the total users and now tops 15 percent.8 Some of the accounts are “news bots,” which just retweet the trending topics. Some of the accounts are for advertising purposes and try to dominate conversations to generate revenue through clicks on links. Some bots are trolls, which, like their human counterparts, tweet to disrupt civil conversation.
For malicious actors seeking to influence a population through trends on social media, the best way to establish trends is to build a network of bot accounts programmed to tweet at various intervals, respond to certain words, or retweet when directed by a master account. Figure 1 illustrates the basics of a bot network. The top of the chain is a small
core group. at team is composed of human-controlled accounts with
a large number of followers. e accounts are typically adversary cyber
warriors or true believers with a large following. Under the core group
is the bot network. Bots tend to follow each other and the core group.
Below the bot network is a group consisting of the true believers with-
out a large following. ese human-controlled accounts are a part of
the network, but they appear to be outsiders because of the weaker
links between the accounts. e bottom group lacks a large following,
but they do follow the core group, sometimes follow bot accounts, and
seldom follow each other.
Figure 1. Illustration of a bot network (tiers: small core group; larger groups with strong ties; small outsiders with weak ties)
Enough bots working together can quickly start a trend or take over a trend, but bot accounts themselves can only bridge the structural hole between networks, not completely change a narrative. To change a narrative, to conduct an effective influence operation, requires a group to combine a well-coordinated bot campaign with essential elements of propaganda.
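As a toy illustration of that division of labor, the simulation below has one master message echoed by a fleet of bots at jittered intervals. The account count and delays are invented for illustration, not measurements of any real network:

```python
import random

# Illustrative simulation of the figure 1 hierarchy: a master account issues
# one branded message, and a fleet of bots echoes it at randomized delays.
class Bot:
    def __init__(self, name: str):
        self.name = name

    def retweet(self, message: str, at_second: float) -> tuple[float, str]:
        return (at_second, f"{self.name} RT: {message}")

master_message = "#ExampleNarrative Our side is winning"
bots = [Bot(f"bot_{i:03d}") for i in range(70)]

# Each bot fires within five minutes, spaced unevenly to mimic humans.
timeline = sorted(
    bot.retweet(master_message, at_second=random.uniform(0, 300))
    for bot in bots
)
print(f"{len(timeline)} retweets inside {timeline[-1][0]:.0f} seconds")
```

Layered on top of organic retweets from true believers, a burst like this produces exactly the kind of sudden volume spike a trend detector rewards.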
Propaganda Primer
Messaging designed to influence behavior has been around for centuries but became easier as methods of mass communication enabled wider dissemination of propaganda. Observing the rise of mass media and its presence in daily life, French philosopher Jacques Ellul noted the simplicity of propaganda in 1965. According to Ellul, “Propaganda ceases where simple dialogue begins.”9 That said, it is worth noting Eric Hoffer’s observation that propaganda on its own cannot force its way into unwilling minds, nor can it inculcate something wholly new.10 For propaganda to function, it needs a previously existing narrative to build upon, as well as a network of true believers who already buy into the underlying theme. Social media helps the propagandist spread the message through an established network. A person is inclined to believe information on social media because the people he chooses to follow share things that fit his existing beliefs. That person, in turn, is likely to share the information with others in his network who are like-minded and predisposed to the message. With enough shares, a particular social network accepts the propaganda storyline as fact.
But up to this point, the effects are relatively localized. The most effective propaganda campaigns are not confined just to those predisposed to the message. Essentially, propaganda permeates everyday experiences, and the individual targeted with a massive media blitz will never fully understand that the ideas he has are not entirely his own. A modern example of this phenomenon was observable during the Arab Spring as propaganda spread on Facebook “helped middle-class Egyptians understand that they were not alone in their frustration.”11 In short, propaganda is simpler to grasp if everyone around a person seems to share the same emotions on a particular subject. Even a general discussion among the crowd can provide the illusion that propaganda is information.12
In other words, propaganda creates heuristics, shortcuts the mind uses to simplify problem solving by relying on quickly accessible data. The availability heuristic weighs the amount and frequency of information received, as well as the recentness of the information, as more informative than the source or accuracy of the information.13 Essentially, the mind creates a shortcut based on the most—or most recent—information available, simply because it can be remembered easily. Often, the availability heuristic manifests itself in information received through media coverage. The availability heuristic is important to understanding individual opinion formation and how propaganda can exploit the shortcuts our minds make to form opinions. The lines in figure 2 show formation
of opinions temporally, with bold arrows influencing a final opinion more than light arrows. The circled containers indicate a penetration point for propaganda exploitation. As previously described, mass media enables rapid spread of propaganda, which feeds the availability heuristic. The internet makes it possible to flood the average person’s daily intake of information, which aids the spread of propaganda.
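A toy scoring rule makes the exploit explicit: if recall is driven only by how often and how recently a claim was seen, with accuracy playing no part, a heavily repeated rumor outranks a single accurate correction. The weights below are illustrative assumptions, not empirical values:

```python
from dataclasses import dataclass

@dataclass
class Story:
    claim: str
    exposures: int     # how many times the person encountered it
    days_ago: float    # recency of the latest exposure
    accurate: bool     # ignored by the heuristic, by design

def availability_score(s: Story) -> float:
    # Toy availability heuristic: frequency and recency drive recall;
    # source quality and accuracy contribute nothing.
    return s.exposures * (1.0 / (1.0 + s.days_ago))

stories = [
    Story("viral rumor", exposures=40, days_ago=0.5, accurate=False),
    Story("wire-service correction", exposures=2, days_ago=3.0, accurate=True),
]
for s in sorted(stories, key=availability_score, reverse=True):
    print(f"{availability_score(s):6.1f}  {s.claim}")
# Output ranks the rumor (~26.7) far above the correction (0.5).
```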
One of the primary principles of propaganda is that the message must resonate with the target. Therefore, when presented with information that is within your belief structure, your bias is confirmed and you accept the propaganda. If it is outside of your network, you may initially reject the story, but the volume of information may create an availability heuristic in your mind. Over time, the propaganda becomes normalized—and even believable. It is confirmed when a fake news story is reported by the mainstream media, which has become reliant on social media for spreading and receiving news.
Figure 2. Model of individual opinion formation. (Reproduced by permission
from Alan D. Monroe, Public Opinion in America [New York: Dodd, Mead, and
Co., 1975], 147.)
[Figure 2 diagram elements, arranged from past to present: cultural values; social characteristics of parents; values held by parents; values internalized by peers; values internalized by the individual; identification with groups; social characteristics of the individual; opinions expressed by peers; prior opinions; events in the world; media coverage of events; perception of events; opinion.]
Figure 3 maps the process of how propaganda can penetrate a network that is not predisposed to the message. This outside network is a group that is ideologically opposed to the group of true believers. The outside network is likely aware of the existing narrative but does not necessarily subscribe to the underlying beliefs that support the narrative.
Figure 3. Process map of how propaganda spreads via the trend. [Diagram elements: an idea/message drawn from an existing narrative is tweeted by the cyber team, true believers, and the bot network, producing a trend that reaches journalists and the outside network; optional other resources include media which supports propaganda, state-owned media, other social media platforms, hacked emails, forged documents/videos, memes, fake news, real news supporting propaganda, threats, and power demonstration videos/photos.]
Command of the trend enables the contemporary propaganda model to create a “firehose of information” that permits the insertion of false narratives over time and at all times.14 Trending items produce the illusion of reality, in some cases even being reported by journalists. Because untruths can spread so quickly now, the internet has created “both deliberate and unwitting propaganda” since the early 1990s through the proliferation of rumors passed as legitimate news.15 The normalization of these types of rumors over time, combined with the rapidity and volume of new false narratives over social media, opened the door for fake news.
The availability heuristic and the firehose of disinformation can slowly alter opinions as propaganda crosses networks by way of the trend, but
the amount of inuence will likely be minimal unless it comes from a
source that a nonbeliever nds trustworthy. An individual may see the
propaganda and believe the message is popular because it is trending but
still not buy into the message itself. Instead, the individual will likely
turn to a trusted source of news to test the validity of the propaganda.
erefore, we must now analyze modern journalism to determine how com-
mand of the trend can transform propaganda from fake news to real news.
Social Networks and Social Media
Currently, 72 percent of Americans get digital news primarily from a mobile device, and people now prefer online news sources to print sources by a two-to-one ratio.16 The news consumer now selects from an abundance of options besides a local newspaper, based on how the consumer perceives the credibility of the resource. As social media usage has become more widespread, users have become ensconced within specific, self-selected groups, which means that news and views are shared nearly exclusively with like-minded users. In network terminology, this group phenomenon is called homophily. More colloquially, it reflects the concept that “birds of a feather flock together.” Homophily within social media creates an aura of expertise and trustworthiness where those factors would not normally exist. Along the lines of social networking and propaganda, people are more willing to believe things that fit into their worldview. Once source credibility is established, there is a tendency to accept that source as an expert on other issues as well, even if the issue is unrelated to the area of originally perceived expertise.17 Ultimately, this “echo chamber” can promote the scenario in which your friend is “just as much a source of insightful analysis on the nuances of U.S. foreign policy towards Iran as regional scholars, arms control experts, or journalists covering the State Department.”18
If social media facilitates self-reinforcing networks of like-minded users, how can a propaganda message traverse networks where there are no overlapping nodes? A weak link between two networks may exist where a single user bridges them, but that link is based only on a single topic and can be easily severed. Thus, to employ social media effectively as a tool of propaganda, an adversary cannot rely on individual weak links between networks. Instead, an adversary must exploit a feature within the social media platform that enables cross-network data sharing on a massive scale: the trending topics list. Trends are visible to everyone. Regardless of who follows whom on a given social media
platform, all users see the topics algorithmically generated by the platform as being the most popular topics at that particular moment. Given this universal and unavoidable visibility, “popular topics contribute to the collective awareness of what is trending and at times can also affect the public agenda of the community.”19 In this manner, a trending topic can bridge the gap between clusters of social networks. A malicious actor can quickly spread propaganda by injecting a narrative onto the trend list.
The combination of networking on social media, propaganda, and reliance on unverifiable online news sources introduces the possibility of completely falsified news stories entering the mainstream of public consciousness. This phenomenon, commonly called fake news, has generated significant criticism from both sides of the American political spectrum, with some labeling any contrary viewpoints fake. In reality, fake news consists of more than just bad headlines, buried ledes, or poorly sourced stories.20 Fake news is a particular form of propaganda composed of a false story disguised as news. On social media, this becomes particularly dangerous because of the viral spread of sensationalized fake news stories.
A prime example of fake news and social media came from the most shared news stories on Facebook during the 2016 US presidential election. The source of the fake news was a supposedly patriotic American news blog called “End the Fed,” a website run by Romanian businessperson Ovidiu Drobota. One story stating that the pope endorsed Donald Trump for president received over one million shares on Facebook alone, not to mention shares on Twitter.21 Other fake news stories from that site and others received more shares in late 2016 than did traditional mainstream news sources (see figure 4).22
It is important to recognize that more people were exposed to those fake news stories than what is reflected in the “shares” data. In some cases, people would just see the story in a Facebook or Twitter feed; in many cases, people actively sought out news from those sources, which are fiction at best and foreign propaganda at worst. Over time, those fake news sources become trusted sources for some people. As people learn to trust those sources, legitimate news outlets become less trustworthy. A 2016 poll by Gallup showed American trust in mass media is at an all-time low.23
Figure 4. Total Facebook engagements for top 20 election stories. [Chart summary, 2016: across the periods Feb.–April, May–July, and Aug.–Nov. 8, engagements for mainstream news fall from about 12 million to 7.3 million, while engagements for fake news rise from about 3 million to 8.7 million.]
When news is tailorable to one’s taste and new stories are popping up around the world every second, mainstream journalists have to change their methods to compete with other sources of news. Therefore, if social media is becoming a source for spreading news and information, journalists will try to keep up by using social media to spread their stories and to acquire information first. According to an Indiana University School of Journalism study, the most common use of social media for journalists is to check for breaking news.24 As a result, mainstream journalists tend to use tweets as a legitimate source, especially when there is a lack of more valid or confirmed sources.25 Overreliance on social media for breaking news can become problematic in the midst of an ongoing information operation. If an adversary takes control of a trend on Twitter, the trend is likely to be noticed by mainstream media journalists who may provide legitimacy to a false story—essentially turning fake news into real news. This is the initial setup for how social media became extremely influential via an adversary’s propaganda. IS and Russia successfully manipulated social media, particularly Twitter. Although they had different objectives, the tools and techniques were similar. Both foreign actors used command of the trend to spread propaganda that influenced the emotions, opinions, and behavior of US citizens in a manner antithetical to US interests. In essence, IS and Russia hijacked social media through propaganda narratives, true believers, cyber warriors, and a bot network.
Hijacking Social Media—the Case of IS
IS could be considered either a large terrorist organization or a very fragile state with a weak army. However, the perception of IS varies. To believers, IS is a religious caliphate, but much of the rest of the world assumes it is a terrorist group that represents a perversion of faith. IS managed to master the art of manipulation because a single message simultaneously targeted potential allies and foes alike. Its use of social media is a case study in effective propaganda techniques that bolstered recruiting, increased brand recognition, and spread terror with minimal effort. It quickly became the first organization to use social media effectively to achieve its goals.
Although IS may use terrorism as a tactic, the organization behaves differently than any other terrorist organization in the world.26 The differences are apparent in every aspect, from operations to recruiting to governing. The last factor is the key discriminator. As a descendant of al-Qaeda in Iraq, the group struggled to find its way after the death of leader Abu Musab al-Zarqawi in 2006; under the leadership of Abu Bakr al-Baghdadi the group has established clear lines of authority, taxation and educational systems, trade markets, even policing and a judiciary (covering civil, criminal, and religious complaints).27 Gaining and holding land is just a part of what IS believes is the destiny of the organization and its followers. Certainly, the desire is to create a caliphate,28 but its ultimate purpose is more apocalyptic in nature: IS seeks to usher in the end of the world.29 Its members believe that their actions will bring the forces of the world to attack their caliphate and result in the imminent defeat of the infidel army in the Syrian town of Dabiq, thus triggering the end of the world and the final purge of evil.30 IS is a revolutionary force with doomsday cult beliefs.31
To advance the organization’s objectives, IS used messages that spread its propaganda on social media to a broad audience and that fit within a narrative of strength for the supporter and a narrative of terror for the adversary. In other words, IS cyber warriors combined propaganda with command of the trend to accomplish three things with one message. First, they demonstrated the weakness and incompetence of the international community to fight them online and on the battlefield. Second, they injected terror into the mainstream media. Finally and most importantly, they recruited new fighters to join them on the battlefield in Iraq and Syria—and online.
Islamic State Commanding the Trend
Through a combination of ingenious marketing and cyber mastery, IS bolstered its message around the world. First, the group refined IS branding. The organization projects a very specific image to the world that affects the viewer differently based on beliefs. To a follower, the images that are shared via social media demonstrate strength and power. To the nonfollower, the images are grotesque and horrifying. In other words, no matter what IS puts out in social media the result is a win for the organization because the same message successfully targets two different groups. The amplification of those messages by creating trends on Twitter is guaranteed to get further attention once the tweet falls into the mainstream media. Thus, IS is capable of using relatively small numbers of Twitter users (see table 1) to project an aura of strength.
The method for expanding the reach of a single IS tweet or hashtag involves a network of legitimate retweets combined with bots and unwitting Twitter users. While IS does maintain a strong network of true believers, the numbers are relatively small and spread thinly across the Middle East. Therefore, IS must game the system and rig Twitter for a message to go viral. One high-tech method for creating a bot network was a mobile app called “Dawn of Glad Tidings.” The app, designed by IS cyber warriors, provides updates on IS activities and spiritual guidance to the user. When users download the app, they create an account that links to their Twitter account, which then gives the app generous permissions, allowing the app to tweet using that user’s account.32 The app then retweets on behalf of the user when a master account sends an IS-branded tweet.
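A rough, purely illustrative sketch of that flow appears below. The class and method names are hypothetical; they do not correspond to the real app’s code or to any actual Twitter API, and only illustrate the “generous permissions” pattern described above:

```python
# Hypothetical sketch of an app that posts on behalf of consenting users.
class LinkedAccount:
    def __init__(self, handle: str, write_token: str):
        self.handle = handle
        self.write_token = write_token   # granted when the app was installed

    def post(self, text: str) -> None:
        # Stand-in for an authorized posting call using the stored token.
        print(f"[{self.handle}] {text}")

class BroadcastApp:
    def __init__(self):
        self.accounts: list[LinkedAccount] = []

    def register(self, account: LinkedAccount) -> None:
        self.accounts.append(account)    # user installs app, grants access

    def broadcast(self, branded_tweet: str) -> None:
        # The master account's signal triggers every linked account at once.
        for account in self.accounts:
            account.post(branded_tweet)

app = BroadcastApp()
app.register(LinkedAccount("@follower_1", write_token="token-1"))
app.register(LinkedAccount("@follower_2", write_token="token-2"))
app.broadcast("#ExampleBrandedMessage")
```

The point is scale: every install silently adds one more authorized mouthpiece, so a single trigger can produce thousands of synchronized tweets.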
Over time, the hashtag generates enough tweets to start localized trends. Once the trend surfaces, it is broadcast over trend-monitoring networks, like the Arabic Twitter account @ActiveHashtags.33 That causes the hashtag to gather more attention across the region and then be retweeted by real followers and other bot accounts. The final step in the process is when the trend goes global.
Table 1. Snapshot of Islamic State Twitter activity

Twitter-related activity studied                | Related statistics
Estimated number of overt IS Twitter accounts   | 46,000
Number of “bot” accounts                        | 6,216
Average number of tweets per day per user       | 7.3
Average number of followers                     | 1,004
Most common year accounts created               | 2014
Top languages                                   | Arabic (73%), English (18%), French (6%)
Top locations (a)                               | “Islamic State,” Syria, Iraq, Saudi Arabia

Source: J. M. Berger and Jonathon Morgan, “The ISIS Twitter Census,” Brookings Institution, accessed 20 March 2015, https://www.brookings.edu/research/the-isis-twitter-census-defining-and-describing-the-population-of-isis-supporters-on-twitter/.
(a) Based on location-enabled users and self-defined account locations.
Worldwide trends on Twitter have been a boon for IS. Creating and hijacking trends garnered attention for the group that would otherwise have gone unnoticed on social media. The peak of IS trend hijacking was during the World Cup in 2014—as one of the world’s most popular sporting events, it was no surprise that the hashtag #WorldCup2014 trended globally on Twitter nonstop during the tournament. At one point though, nearly every tweet under this hashtag had something to do with IS instead of soccer. The network of IS supporters and bot accounts hijacked the trend. Because people were using the hashtag to discuss the matches and advertisers were using the trend for marketing, Twitter struggled to stop the trend and the subsequent IS propaganda effort.
In fact, IS cyber warriors and true believers foiled most of the early attempts by Twitter to stop IS from using their platform to spread propaganda. Twitter’s initial reaction was to suspend accounts that violated the terms of the user agreement. The result was creative user names by IS supporters; for example, a user named @jihadISIS42 was created after @jihadISIS41 was suspended, which was set up after @jihadISIS40 was suspended.34 Each new account demonstrated a deep dedication to the cause that, when combined with the seemingly significant presence on social media, presented the group as dominating social media.
In the case of #WorldCup2014, IS took command of the trend by
hijacking, using the opportunity to push recruiting messages, and making
terror threats against the tournament venues in Brazil. Additionally, the co-opted hashtag often directed users to other hashtags in what was ultimately a successful attempt to generate worldwide trends of other IS-related themes. One successful hashtag-creation effort was #StevensHeadinObamasHands, which included memes of President Barack Obama and IS-held American journalist Steven Sotloff. The implication was that the president of the United States did not care to or was powerless to stop the murder of an American citizen. Once again, IS appeared to be disproportionately powerful because of the command of the trend.
Due to the organization’s aggressive communications strategy and branding, the IS social media presence consistently outperforms similar jihadist groups in the region that have the same number of, or more, followers.35 Unlike al-Qaeda, which largely limited its online activity to websites, IS wanted to communicate with a broader audience—it wants to communicate directly to the whole world. In addition to spreading terror threats, the appearance of the group as a powerful state appealed to a group of true believers who turned to IS as new recruits to fight in Iraq and Syria. IS used social media from 2014 to 2016 to demonstrate power, sow fear in the international audience, and recruit the true believers. All the while, they used the true believers’ following on social media to boost their trends on social media. However, the group currently finds itself altering its modus operandi due to the recent loss of territories in Iraq and Syria, combined with a spate of successful terrorist-style attacks in Europe. The ongoing worry for counterterrorism experts is finally beginning to come to fruition: the recruit staying home to fight instead of joining IS overseas.
After years of maintaining a significant presence on social media, IS is using Twitter less now for official communication. The reasoning is likely twofold. First, the group has lost territory in Iraq and Syria and is adjusting its strategies. Second, Twitter has removed over 600,000 IS-related accounts consisting of bots, cyber warriors, and true believers.36 Additionally, Twitter has adjusted its program to find terror-related videos, memes, and photos soon after an account from the IS network posts the propaganda. The reason IS seemed so powerful is that, when viewed through the lens of terrorist groups, it advertised using weaponized social media campaigns. Its intense social media presence, ghastly videos, massive
recruiting, and victories against Iraqi security forces made IS seem disproportionately stronger than it was.
In summation, IS serves as a model for any nonstate group attempting
to use social media for cyber coercion. Table 2 summarizes its use of the
four requirements to gain command of the trend based on the analysis
within this case study.
Table 2. Islamic State case study analysis

Requirement           | Example
Propaganda narratives | 1. IS is strong; everyone else is weak. 2. True believers should join the cause.
True believers        | Muslims believing in the caliphate of al-Baghdadi
Cyber warriors        | Propaganda makers, video editors, app programmers, recruiters, and spiritual leaders using low- and high-tech tools to advertise IS on social media
Bot network           | Unwitting victims of spiritual-guidance app “Dawn of Glad Tidings”
At the same time IS was weaponizing Twitter, Russia was using it to simultaneously cause confusion and garner support for its invasion of Crimea. Soon, Russia’s command of the trend would be used to target the 2016 United States presidential election.
Russia: Masters of Manipulation
Russia is no stranger to information warfare. The original technique of Soviet actors was through aktivnyye meropriyatiya (active measures) and dezinformatsiya (disinformation). According to a 1987 State Department report on Soviet information warfare, “active measures are distinct both from espionage and counterintelligence and from traditional diplomatic and informational activities. The goal of active measures is to influence opinions and/or actions of individuals, governments, and/or publics.”37
In other words, Soviet agents would try to weave propaganda into an existing narrative to smear countries or individual candidates. Active measures are designed, as retired KGB general Oleg Kalugin once explained, “to drive wedges in the Western community alliances of all sorts, particularly NATO, to sow discord among allies, to weaken the United States in the eyes of the people in Europe, Asia, Africa, Latin America, and thus to prepare ground in case the war really occurs.” Editor, translator, and analyst of Russian Federation trends Michael Weiss says,
“The most common subcategory of active measures is dezinformatsiya, or disinformation: feverish, if believable, lies cooked up by Moscow Centre and planted in friendly media outlets to make democratic nations look sinister.”38
The techniques Russia uses today are similar to those used during the Cold War, but dissemination is more widespread through social media. Recently, the Russian minister of defense acknowledged the existence of their cyber warriors in a speech to the Russian parliament, announcing that Russia formed a new branch of the military consisting of information warfare troops.39 The Internet Research Agency, as it was called in 2015, now seems to be the information warfare branch he openly admitted to. This army of professional trolls’ mission is to fight online. The Russian trolls have a variety of state resources at their disposal, including a vast intelligence network to assist their cyber warriors. The additional tools available to Russia also include RT (Russia Today) and Sputnik, the Kremlin-financed television news networks broadcasting in multiple languages around the world. Before the trolls begin their activities on social media, the cyber warrior hackers first provide hacked information to WikiLeaks, which, according to CIA director Mike Pompeo, is a “non-state hostile intelligence service abetted by state actors like Russia.”40 In intelligence terms, WikiLeaks operates as a “cutout” for Russian intelligence operations—a place to spread intelligence information through an outside organization—similar to the Soviets’ use of universities to publish propaganda studies in the 1980s.41 The trolls then take command of the trend to spread the hacked information on Twitter, referencing WikiLeaks and links to RT news within their tweets.
These Russian efforts would be impossible without an existing network of American true believers willing to spread the message. The Russian trolls and the bot accounts amplified the voices of the true believers in addition to inserting propaganda into that network. Then, the combined effects of Russian and American Twitter accounts took command of the trend to spread disinformation across networks.
The cyber trolls produced several hoaxes in the United States and Europe, like the Louisiana hoax, according to Adrian Chen in his article “The Agency” in the New York Times Magazine.42 Protests of police departments throughout the United States during the summer of 2015 provided several opportunities to manipulate narratives via social media, and it is likely Russian trolls hijacked some of the Black Lives Matter–related
trends to spread disinformation and accuse journalists of failing to cover important issues.43 The Russian trolls said the idea was to spread fear, discrediting institutions—especially American media—while making President Obama look powerless and Russian president Vladimir Putin more favorable.44
Several hijacked hashtags in 2015 attempted to discredit the Obama administration while spreading racist memes and hoaxes aimed at the African American community. In other words, the Russian trolls seemed to target multiple groups to generate anger and create chaos. One particularly effective Twitter hoax occurred as racial unrest fell on the University of Missouri campus that fall.
#PrayforMizzou
On the night of 11 November 2015, #PrayforMizzou began trending on Twitter.45 The trend was a result of protests at the University of Missouri campus over racial issues; however, “news” slowly started developing within the hashtag that altered the meaning and soon shot the hashtag to the top of the trend list. The news was that the KKK was marching through Columbia and the Mizzou campus. One user, display name “Jermaine” (@Fanfan1911), warned residents, “The cops are marching with the KKK! They beat up my little brother! Watch out!” Jermaine’s tweet included a picture of a black child with a severely bruised face; it was retweeted hundreds of times. Additionally, Jermaine and a handful of other users continued tweeting and retweeting images and stories of KKK and neo-Nazis in Columbia, chastising the media for not covering the racists creating havoc on campus.
Looking at Jermaine’s followers, and the followers of his followers, one could observe that the original tweeters all followed and retweeted each other. Those users also seemed to be retweeted automatically by approximately 70 bots. These bots also used the trend-distribution technique, which used all of the trending hashtags at that time within their tweets, not just #PrayforMizzou. Spaced evenly, and with retweets of real people who were observing the Mizzou hashtag, the numbers quickly escalated to thousands of tweets within a few minutes. The plot was smoothly executed and evaded the algorithms Twitter designed to catch bot tweeting, mainly because the Mizzou hashtag was being used outside of that attack. The narrative was set as the trend was hijacked, and the hoax was underway.
The rapidly spreading image of a bruised little boy was generating legitimate outrage across the country and around the world. However, a quick Google image search for “bruised black child” revealed the picture that “Jermaine” attached to the tweet was a picture of an African American child who was beaten by police in Ohio over one year earlier. The image and the narrative were part of a larger plot to spread fear and distrust. It worked.
The University of Missouri student body president tweeted a warning to stay off the streets and lock doors because “KKK members were confirmed on campus.” National news networks broke their coverage to get a local feed from camera crews roaming Columbia and the campus looking for signs of violence. As journalists continued to search for signs of Klan members, anchors read tweets describing shootings, stabbings, and cross burnings. In the end, the stories were all false.
Shortly after the disinformation campaign at Mizzou, @Fanfan1911 changed his display name from Jermaine to “FanFan,” and the profile picture of a young black male changed to the image of a German iron cross. For the next few months, FanFan’s tweets were all in German and consisted of spreading rumors about Syrian refugees. Russian active measures in Europe around this time were widely reported, and the account that previously tweeted disinformation regarding Mizzou now focused on messages that were anti-Islamic, anti–European Union, and anti–German Chancellor Angela Merkel. His tweets reached a crescendo after reports of women being raped on New Year’s Eve 2016. Some of the reports were false, including a high-profile case of a 13-year-old ethnic-Russian girl living in Berlin who falsely claimed that she was abducted and raped by refugees.46 Once again, Russian propaganda dominated the narrative.47
Similar to previous disinformation campaigns on Twitter, the Russian trolls were able to spread the information because of an underlying fear and an existing narrative that they were able to exploit. The trolls used trend-hijacking techniques in concurrence with reporting by Russian state-funded television network Russia Today. To generate more attention to the Russian anti-Merkel narrative in European media, Russian foreign minister Sergey Lavrov accused German authorities of a “politically correct cover-up” in the case of the Russian teen.48 Because of the Russian propaganda push, the anti-immigration narrative began spreading across traditional European media.49 In fact, a magazine in
Poland devoted an entire issue to the topic of Muslim immigration with a disturbing cover photo entitled “Islamic Rape of Europe.”50
In addition to the German tweets, FanFan began tweeting in English again in the spring of 2016. His tweets and the tweets of other Russian trolls were spreading in America. The narrative they spread was developing a symbiotic relationship with American right-wing news organizations like Breitbart and its followers on social media—a group of true believers in the Russian propaganda narrative.
Additionally, the troll network already seeded various social media platforms with pages designed for spreading disinformation.51 Seemingly patriotic American Facebook pages linked articles to RT, legitimate American news sources advocating a right-leaning perspective, Breitbart, right-wing conspiracy sites like InfoWars, and “non-factual news” sites like the Conservative Tribune and Gateway Pundit. The Facebook pages also linked to Russia-run sites with nothing but false news stories. Based on anti-Obama sentiment, the Facebook pages were popular among conservative users but not getting broad exposure. Before 2016, Russian active measures were also used in European elections, most notably the “Brexit” campaign. One European expert on Russia quoted in the Atlantic article “War Goes Viral” summarized Putin’s intent as “not to make you love Putin”; instead “the aim is to make you disbelieve anything. A disbelieving, fragile, unconscious audience is much easier to manipulate.”52 Active measures enable manipulation. Smearing political candidates, hacking, the spread of disinformation, and hoaxes all contribute to a breakdown of public trust in institutions.
As the 2016 US presidential campaign began in earnest, much of the online animosity was now directed at Obama’s potential successor: Hillary Clinton. She became a rallying cry for Trump supporters and a force-multiplying tool for the Russian trolls.
Inuencing the 2016 Presidential Election
According to the Oce of Director of National Intelligence (ODNI)
Report on Russian Inuence during the 2016 presidential election,
“Moscows inuence campaign followed a messaging strategy that blends
covert intelligence operations—such as cyber activity—with overt ef-
forts by Russian Government agencies, state funded media, third-party
intermediaries, and paid social media users, or ‘trolls.’ ”
53
In the case of
the 2016 election, Russian propaganda easily meshed with right-wing
networks known as the “alt-right” and also with supporters of Senator Bernie Sanders in the left wing of the Democratic Party. Hillary Clinton had been a target of conservative groups since she first came into the national spotlight as first lady in the 1990s.54 Thus, groups on the left and right presented strong opposition to her candidacy in 2016, which meant Russian trolls already had a narrative to build upon and a network of true believers on social media to spread their propaganda.
In a September 2016 speech, Clinton described half of candidate Trump’s supporters as “deplorables.” She went on to say that the other half of Trump’s supporters were just people who felt the system had left them behind, who needed support and empathy. Clearly, she was not referring to all of Trump’s supporters as deplorable, but the narrative quickly changed after social media users began referring to themselves as “Deplorable” in their screen names.
Before the “basket of deplorables” comment, the trolls primarily used an algorithm to rapidly respond to a tweet from Donald Trump. Those tweets were prominently displayed directly under Trump’s tweet if a user clicked on the original. Those users became powerful voices with large followings; Trump himself frequently retweeted many of those users.55 However, after the Clinton speech, a “people search” on Twitter for “deplorable” was all one needed to suddenly gain a network of followers numbering between 3,000 and 70,000. Once again, FanFan’s name changed—this time to “Deplorable Lucy”—and the profile picture became a white, middle-aged female with a Trump logo at the bottom of the picture. The FanFan follower count went from just over 1,000 to 11,000 within a few days. His original network from the Mizzou and European campaigns changed as well: tracing his follower trail again led to the same groups of people in the same network, and they were all now defined by the “Deplorable” brand. In short, they were now completely in unison with a vast network of other Russian trolls, actual American citizens, and bot accounts from both countries on Twitter. With a large network consisting of Russian trolls, true believers, and bots, it suddenly became easier to get topics trending with a barrage of tweets. The Russian trolls could employ the previously used tactics of bot tweets and hashtag hijacking, but now they had the capability to create trends.
Besides creating trends, the trolls could relay strategy under the radar using Twitter. That is to say, a message could be delivered in the form of a picture that did not include any words. The lack of words would
spread the message to the followers in a timeline, but retweets would not develop any trends—only that network of followers or someone actively observing the network saw the messages. Often, anonymous users discussed the tactics behind the trend creation on the social media site 4Chan or on the bulletin board called “/pol/” and subsequently coordinated the trend within the Deplorable Network on Twitter. The most effective trends derived from this strategy came in the days following the release of the “Access Hollywood” tape from 2005 in which Trump had made vulgar remarks.56 The Deplorable Network distributed the corresponding strategy throughout the network to drown out negative attention to Trump on Twitter. Coinciding with the implementation of the strategy to mask anti-Trump comments on Twitter, WikiLeaks began releasing Clinton campaign chairman John Podesta’s stolen emails.57 The emails themselves revealed nothing truly controversial, but the narrative that the trending hashtag created was powerful. First, the issue of hacked emails developed into a narrative conflating Podesta’s emails with the issue of Clinton’s use of a private email server while she was secretary of state. The Clinton server was likely never hacked, but the problem of email loomed over her candidacy.
Secondly, the Podesta email narrative took routine issues and made them seem scandalous. The most common theme: bring discredit to the mainstream media. Podesta, like any campaign manager in modern politics, communicated with members of the press. Emails communicating with reporters were distributed via trending tweets with links to fake news websites. The fake news distorted the stolen emails into conspiracies of media “rigging” of the election to support Hillary Clinton. The corruption narrative also plagued the Democratic National Committee (DNC), which experienced a hack by Russian sources earlier in the year, revealed by WikiLeaks.58
A month after the election, a man drove from his home in North Carolina to Washington, DC, to uncover the truth behind another news story he read online. He arrived at Comet Ping-Pong, a pizza restaurant, with an AR-15, prepared to free children from an underground child sex trafficking ring in the restaurant. After searching the store, he found no children. The story was a hoax. One of the emails stolen from John Podesta was an invitation to a party at the home of a friend that promised good pizza from Comet Ping Pong and a pool to entertain the kids. Fake news sites reported the email as code for a pedophilic sex party; it
was widely distributed via the trending #PodestaEmail hashtag and an
associated new hashtag, #PizzaGate.
The #PizzaGate hoax, along with all of the other false and quasi-false narratives, became common within right-wing media as another indication of the immorality of Clinton and her staff. Often, the mainstream media would latch onto a story with unsavory backgrounds and false pretenses, thus giving more credibility to all of the fake news; however, the narrative from the #PizzaGate hoax followed the common propaganda narrative that the media was trying to cover up the truth and that the government failed to investigate the crimes. Ultimately, that is what drove the man to inquire into the fake news for himself.59
Finally, the stolen emails went beyond sharing on social media. The trend became so sensational that traditional media outlets chose to cover the Podesta email story, which gave credibility to the fake news and the associated online conspiracy theories promulgated by the Deplorable Network. The WikiLeaks release of the Podesta emails was the peak of Russian command of the trend during the 2016 election. Nearly every day #PodestaEmail trended as a new batch of supposedly scandalous hacked emails made their way into the mainstream press.
By analyzing the followers of a suspected Russian troll, a picture emerges regarding the structure of the network that was active during the 2016 election. The core group in the Deplorable Network consisted of Russian trolls and popular American right-wing accounts like Jack Posobiec, Mike Cernovich, and InfoWars editor Paul Joseph Watson. The network also consisted of two bot accounts, while the remaining nodes are individual accounts likely consisting of human-managed accounts. In total, the Deplorable Network was approximately 200,000 Twitter accounts consisting of Russian trolls, true believers, and bots. Based on my analysis, the bot network appeared to be between 16,000 and 34,000 accounts.60 The cohesiveness of the group indicates how a coordinated effort can create a trend in a way that a less cohesive network could not accomplish. To conduct cyberattacks using social media as information warfare, an organization must have a vast network of bot accounts to take command of the trend. With unknown factors like the impact of fake news, the true results of the Russian influence operation will likely never be known. As Ellul said, experiments undertaken to gauge the effectiveness of propaganda will never work because the tests cannot reproduce the real propaganda situation.61 The concept itself
Jarred Prier
74 S S Q W 2017
is marred by the fact that much of the social media support Trump re-
ceived was through real American true believers tweeting. However, two
numbers will stand out from the 2016 election: 2.8 million and 80,000.
Hillary Clinton won the popular vote by 2.8 million votes, and Donald
Trump won the electoral vote via a combination of just over 80,000
votes in three key states. One could easily make the case—as many on
the left have done—that Clinton lost because of the Russian inuence.
62
Conversely, one could also argue she was destined to lose because of a
botched campaign combined with a growing sense of disenchantment
with the American political system. However, one cannot dispute the
fact that Russia launched a massive cyberwarfare campaign to inuence
the 2016 presidential election.
63
For the most part, the Russian trolls grew savvier with their techniques as the influence operation in the United States progressed. Some users, however, like FanFan, were sloppy with their tradecraft and obvious to anyone monitoring. The trolls were occasionally careless with their IP address locations as well. Following the first presidential debate, the #TrumpWon hashtag quickly became the number one trend globally. Using the TrendMap application, one could quickly see that the worldwide hashtag appeared to originate in Saint Petersburg, Russia. Russian trolls gave obvious support to Donald Trump and proved that social media could be used to create chaos on a massive scale, discredit any politician, and divide American society.
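The underlying idea of a tool like TrendMap can be sketched simply: take the earliest tweets carrying a hashtag and see which location dominates. The records and field layout below are hypothetical, and in practice only a small fraction of tweets carry usable location data.

from collections import Counter
from datetime import datetime

# Hypothetical records: (hashtag, city, UTC timestamp) for geotagged tweets.
tweets = [
    ("#TrumpWon", "Saint Petersburg", datetime(2016, 9, 27, 4, 1)),
    ("#TrumpWon", "Saint Petersburg", datetime(2016, 9, 27, 4, 2)),
    ("#TrumpWon", "New York", datetime(2016, 9, 27, 4, 40)),
]

def apparent_origin(records, tag, first_n=100):
    # Sort the tweets carrying the hashtag by time, then see which city
    # dominates the earliest N. Early concentration in one place hints
    # at a coordinated launch rather than organic spread.
    early = sorted((r for r in records if r[0] == tag), key=lambda r: r[2])
    cities = Counter(city for _, city, _ in early[:first_n])
    return cities.most_common(1)[0][0] if cities else None

print(apparent_origin(tweets, "#TrumpWon"))  # -> Saint Petersburg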
Adrian Chen, the New York Times reporter who originally uncovered the troll network in Saint Petersburg in 2015, went back to Russia in the summer of 2016. Russian activists he interviewed claimed that the purpose of the trolls "was not to brainwash readers, but to overwhelm social media with a flood of fake content, seeding doubt and paranoia, and destroying the possibility of using the Internet as a democratic space."64 The troll farm used similar techniques to drown out anti-Putin trends on Russian social media in addition to pumping out disinformation to the United States.
A Congressional Research Service study summarized the Russian troll operation succinctly in a January 2017 report: "Cyber tools were also used [by Russia] to create psychological effects in the American population. The likely collateral effects of these activities include compromising the fidelity of information, sowing discord and doubt in the American public about the validity of intelligence community reports, and prompting questions about the democratic process itself."65
For Russia, information warfare is a specialized type of war, and modern tools make social media the weapon. According to a former Obama administration senior official, Russians regard the information sphere as a domain of warfare on a sliding scale of conflict that always exists between the US and Russia.66 This perspective was on display during a Russian national security conference, "Infoforum 2016." Andrey Krutskih, a senior Kremlin advisor, compared Russia's information warfare to a nuclear bomb, which would "allow Russia to talk to Americans as equals," in the same way that Soviet testing of the atomic bomb did in 1949.67
Table 3. Russia case study analysis in 2016 election

Propaganda narratives:
• Anything discrediting to Hillary Clinton
• News media hides information
• Politicians are rigging the system
• Global elite trying to destroy the world
• Globalism is taking jobs and destroying cultures
• Refugees are terrorists
• Russian foreign policy is strong on antiterrorism
• Democrats and some Republicans want WWIII with Russia

True believers: Alt-right, some Bernie Sanders supporters, followers of InfoWars and Breitbart, 4Chan and /pol/ users

Cyber warriors: Hackers and professional trolls

Bot network: Large, sophisticated network that leveraged cyber warrior and true believer accounts to create the "Deplorable Network"
From 2015 to 2016, the Russian trolling modus operandi followed a logical path from small stories designed to create panic and sow seeds of doubt to a social media machine that IS could only imagine. In warfare strategy, narrative manipulation through social media cyber operations is the current embodiment of taking the fight directly to the people. The 2016 election proved that using social media to influence political outcomes, as opposed to violence or Cold War-like posturing, is a highly effective strategy in modern information warfare. That strategy will likely continue as the technology develops and more actors gain the ability to take command of the trend.
The Future of Weaponized Social Media
Smear campaigns have been around since the beginning of politics, but this article has illustrated novel techniques recently employed by a terrorist group and a foreign state actor, with each attack gaining popularity and credibility after trending on Twitter. The attacks, often under the guise of a "whistleblower" campaign, make routine political actions seem scandalous. Additionally, WikiLeaks advertises that it has never published anything requiring retraction because everything it posts is supposedly authentic stolen material. Just like the Podesta email releases, several politicians and business leaders around the world have fallen victim to this type of attack.
Recall the 2014 North Korean hacking of Sony Studios. Lost in the explosive nature of the hacking story is that the fallout at the company came not from the hacking itself but from the release of embarrassing emails from Sony senior management, as well as the salaries of every employee at Sony. The uproar over the content of the emails dominated social media, often fed by salacious stories like the RT headline: "Leaked Sony emails exhibit wealthy elites maneuvering to get child into Ivy League school." Ultimately, Sony fired a senior executive because of the content of her emails.68
In another example, from May 2017, nine gigabytes of email stolen from French presidential candidate Emmanuel Macron's campaign were released online and verified by WikiLeaks. Subsequently, the hashtag #MacronLeaks trended to number one worldwide. It was an influence operation resembling the #PodestaEmail campaign, with a supporting cast of some of the same actors. During the weeks preceding the French election, many accounts within the Deplorable Network changed their names to support Macron's opponent, Marine Le Pen. These accounts mostly tweet in English and still engage in American political topics as well as French issues.69 Some of the accounts also tweet in French, and a new network of French-tweeting bot accounts uses the same methods as the Deplorable Network to take command of the trend.
In his book Out of the Mountains, David Kilcullen describes a future comprising large, coastal urban areas filled with potential threats, all connected.70 The implications of his prediction are twofold. First, networks of malicious nonstate actors can band together to hijack social media using a template similar to that of IS. Although these groups may not have the power to create global trends, they can certainly create chaos with smaller numbers by hijacking trends and creating local trends. With minimal resources, a small group can create a bot network to amplify its message. Second, scores of people with exposure to social media are vulnerable to online propaganda efforts. In this regard, state actors can use the Russian playbook.
Russia will likely continue to dominate this new battlespace. It has intelligence assets, hackers, cyber warrior trolls, massive bot networks, state-owned news networks with global reach, and established networks within the countries it seeks to attack via social media. Most importantly, the Russians have a history of spreading propaganda. After the 2016 elections in the United States, Russian trolls again worked toward influencing European elections. Currently, Russian trolls are active in France, the Balkans, and the Czech Republic using active measures and coercive social media messages.71 It is clear that other countries are attempting to build capabilities to match the Russian cyber troll influence. Already, Turkey, Iran, and Venezuela are noted as having bot networks and cyber warriors similar to Russian trolls.72 For these states, a popular use of trolls in the social media battlespace is stoking nationalism and controlling the narrative within their own borders. For example, the fake Twitter followers of Venezuelan president Nicolás Maduro number so many that he is now the "third-most-retweeted public figure in the world, behind only the king of Saudi Arabia and the pope."73
With a large enough bot network, states can also control messages outside of social media using similar techniques. One method is a malicious twist on "search engine optimization": bot accounts repeatedly perform a search and click a particular web page among the results, inflating its click count. The search engine algorithm then prioritizes that page in response to subsequent searches using the same keyword. A Google search for "ODNI Report" is illustrative: in March 2017, the top Google results were RT articles lambasting the intelligence assessment that named the Russian government as the perpetrator behind the 2016 election interference.
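The feedback loop described here can be reduced to a toy model. Real search engines weigh far more signals than clicks; the sketch below assumes a click-dominated ranker purely for illustration, and the URLs are stand-ins.

from collections import Counter

def rank(pages: Counter) -> list:
    # Toy ranker: order results purely by accumulated clicks.
    return [url for url, _ in pages.most_common()]

clicks = Counter({
    "dni.gov/ica-2017": 5000,      # the actual assessment (stand-in URL)
    "rt.com/odni-rebuttal": 1200,  # hostile coverage (stand-in URL)
})
print(rank(clicks)[0])  # dni.gov/ica-2017 leads initially

# A bot network repeatedly "searches" the keyword and clicks its
# favored result...
clicks["rt.com/odni-rebuttal"] += 10_000
print(rank(clicks)[0])  # ...and the hostile page now tops the results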
Techniques like search engine optimization and command of the trend will become common in future wars to sow discord and spread false information, with the aim of causing the other side to change its course of action. These online weapons should frighten every leader in a democracy. Perhaps most frightening is the Oxford Internet Institute Unit for Propaganda's discovery that "hundreds of thousands of 'sleeper bots'" exist on Twitter.74 These bots are accounts that are active but have not yet started tweeting. Researchers do not know who owns the accounts or what will trigger them. The ease of use and the large numbers of active and sleeper bots indicate a high likelihood that social media will continue to be used for propaganda, especially as more and more state and nonstate organizations realize the impact they can make on an adversary.
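In principle, such dormant accounts have a recognizable signature: old, silent, yet following many others. A heuristic filter might look like the following sketch; the thresholds and profile fields are illustrative assumptions, not documented platform criteria.

from dataclasses import dataclass

@dataclass
class Profile:
    handle: str
    age_days: int     # time since registration
    tweet_count: int
    following: int

def possible_sleeper(p: Profile) -> bool:
    # Flag old, silent accounts that nonetheless follow many others,
    # a pattern consistent with pre-positioned amplification capacity.
    return p.age_days > 365 and p.tweet_count == 0 and p.following > 200

cohort = [
    Profile("quiet_account_88", 900, 0, 1500),
    Profile("normal_user", 900, 4200, 310),
]
print([p.handle for p in cohort if possible_sleeper(p)])  # ['quiet_account_88']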
Thus far, the United States' response has been relatively weak. For one, the US government does not prioritize information operations the way it did during the Cold War. When President Eisenhower started the United States Information Agency (USIA), the objective was to compete with Soviet propaganda around the world. The mission statement of USIA clarified its role: "The purpose of the United States Information Agency shall be to submit evidence to peoples of other nations by means of communication techniques that the objectives and policies of the United States are in harmony with and will advance their legitimate aspirations for freedom, progress, and peace."75
Knowing what we know now about Russian disinformation active measures, USIA was never truly equipped to fight an information war. The agency became a public diplomacy platform with a positive message rather than a Soviet-style campaign of negative smear tactics. Accordingly, several questions arose: Should USIA spread propaganda? Should it seek out and attempt to remove negative publicity about the US? Should it slander opponents? Most importantly, should it do any or all of these things when the American public could be influenced by a message intended for an international audience?76
Those problems persist today because the government has lacked a centralized information authority since the mission of USIA was relegated to the Department of State. Several failed attempts to counter IS on Twitter show the US government's weakness when trying to use social media as a weapon. One example is the Center for Strategic Counterterrorism Communications, created in 2010, which started the program "Think Again, Turn Away." The State Department awarded a $575,046 contract to a Virginia-based consulting firm to manage the project.77 The intent was to curb the appeal of IS by creating a counternarrative to the IS message on social media. Unfortunately, the Twitter campaign had undesirable consequences after the account sent tweets arguing the finer points of the Islamic faith with IS sympathizers. Rita Katz best summarized the failure: "In order to counter a problem, one must first study it before adopting a solution. Had the people behind 'Think Again, Turn Away' understood jihadists' mindsets and reasons for their behavior, they would have known that their project of counter-messaging would not only be a waste of taxpayer money but ultimately be counterproductive."78
In the end, the "Think Again, Turn Away" campaign was almost comical: it could not communicate effectively with any audience and severely discounted the importance of its message. Jacques Ellul noted that democracies are prone to problems with outward communication through propaganda. Because democracies rely on presenting an image of fairness and truth, "propaganda made by democracies is ineffective, paralyzed, mediocre."79 The United States was ill equipped to combat Soviet active measures during the Cold War, and it remains unable to compete in social media influence operations.
Unfortunately, countering Russian influence operations has taken on a partisan slant within the United States. Many downplay the Russian role in the 2016 election, while others appear so blinded by the Russian operation that they cannot see the underlying conditions that allowed the narrative to spread in the first place.80 With the two parties unable to reach a consensus on what happened or the impact of the operation, they fail to realize that as technology improves and proliferates around the world, disinformation campaigns and influence operations will become the norm. The attack in a future information war could target either political party and come from any of the several countries attempting to build an online army in the mold of Russia's trolls and bot network.
Conclusion
In the 1987 book The Truth Twisters, Richard Deacon laments the future of independent thinking, as computers "could become the most dangerous hypnotic influence in the future. . . . [T]he effect of a reliance on computerology, of allowing oneself to be manipulated and controlled by it, is certainly hypnotic in that the mind allows itself to accept whatever the computer tells it."81 He believed that such technology could lead one to commit treason without realizing any manipulation. Propaganda is a powerful tool, and, used effectively, it has proven able to manipulate populations on a massive scale. Using social media to take command of the trend makes the spread of propaganda easier than ever before for both state and nonstate actors.
Fortunately, social media companies are taking steps to combat malicious use. Facebook has been at the forefront of tech companies taking action to increase awareness of fake news and provide a process for removing the links from the website.82 Also, although Facebook trends are less important to information warfare than Twitter trends, the website has taken measures to ensure that humans are involved in making the trends list. Furthermore, Twitter has started discreetly removing unsavory trends within minutes of their rise in popularity. However, adversaries adapt, and Twitter trolls have attempted to regain command of the trend by misspelling a previous trend once it is taken out of circulation. Still, even if the misspelled word regains a spot on the trend list, the message is diminished.
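Catching a deliberately misspelled revival of a removed trend is essentially a fuzzy string-matching problem. The following sketch uses Python's standard library; the removal list and similarity threshold are illustrative, not a description of Twitter's actual moderation tooling.

from difflib import SequenceMatcher

removed = ["#podestaemail", "#pizzagate"]  # trends already taken down

def matches_removed(candidate: str, threshold: float = 0.8) -> bool:
    # True if a new trend is a near-duplicate of a removed one, which
    # catches simple transpositions and single-letter swaps.
    c = candidate.lower()
    return any(SequenceMatcher(None, c, r).ratio() >= threshold
               for r in removed)

print(matches_removed("#PodestaEmial"))  # True: transposed letters
print(matches_removed("#WorldSeries"))   # False: genuinely new trend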
The measures enacted by Facebook and Twitter are important for preventing future wars in the information domain. However, Twitter will also continue to have problems with trend hijacking and bot networks. As demonstrated by #PrayforMizzou and #WorldCup2014, real events happening around the world will maintain popularity as well-intentioned users want to talk about the issues. In reality, removing the trends function could end the use of social media as a weapon, but doing so could also devalue the usability of Twitter. Rooting out bot accounts would have an equal effect, since that would nearly eliminate the possibility of trend creation. Unfortunately, it would also have an adverse impact on advertising firms that rely on Twitter to generate revenue for their products.
With social media companies balancing the interests of their businesses against the betterment of society, other institutions must respond to the malicious use of social media. In particular, the credibility of the press has been called into question by social media influence campaigns, and news organizations should respond accordingly. For instance, news outlets should adopt social media policies for their employees that encourage the use of social media but discourage reliance on Twitter as a source. This will require a culture shift within the press and, fortunately, has gathered significant attention at universities researching the media's role in the influence operation. It is worth noting that the French press did not cover the content of the Macron leaks; instead, journalists covered the hacking and influence operation without giving any credibility to the leaked information.
Finally, our elected officials must move past the partisan divide over Russian influence in the 2016 election. This involves two things: first, both parties must recognize what happened, neither minimizing nor overplaying Russian active measures. Second, and most importantly, politicians must commit to not using active measures to their own benefit. Certainly, the appeal of free negative advertising will tempt any politician, but a foreign influence operation damages more than just the other party; it damages our democratic ideals. Senator John McCain summarized this sentiment well at a CNN town hall: "Have no doubt, what the Russians tried to do to our election could have destroyed democracy. That's why we've got to pay . . . a lot more attention to the Russians."83
This was not the cyber war we were promised. Predictions of a catastrophic cyberattack dominated policy discussion, but few realized that social media could be used as a weapon against the minds of the population. IS and Russia are models for this future war that uses social media to influence people directly. As technology improves, techniques are refined, and internet connectivity proliferates around the world, one saying will ring true: he who controls the trend will control the narrative, and ultimately, the narrative controls the will of the people.
Notes
1. Elisabeth Bumiller and Thom Shanker, "Panetta Warns of Dire Threat of Cyberattack on U.S.," New York Times, 11 October 2012, http://www.nytimes.com/2012/10/12/world/panetta-warns-of-dire-threat-of-cyberattack.html?mcubz=0/.
2. Jeremy Scott-Joynt, "What Myspace Means to Murdoch," BBC News Analysis, 19 July 2005, http://news.bbc.co.uk/2/hi/business/4697671.stm.
3. Sitaram Asur, Bernardo A. Huberman, Gabor Szabo, and Chunyan Wang, "Trends in Social Media: Persistence and Decay" (unpublished manuscript, submitted to Cornell University Library arXiv 7 February 2011), 1, https://arxiv.org/abs/1102.1402?context=physics.
4. "Blog" is short for "web log." A blog is a way to share your thoughts via the internet. A microblog is a blog with a character limit to the text.
5. Rani Molla, "Social Studies: Twitter vs. Facebook," Bloomberg Gadfly, 12 February 2016, https://www.bloomberg.com/gadfly/articles/2016-02-12/social-studies-comparing-twitter-with-facebook-in-charts.
6. Carole Cadwalladr, "Robert Mercer: The Big Data Billionaire Waging War on the Mainstream Media," Guardian, 26 February 2017, https://www.theguardian.com/politics/2017/feb/26/robert-mercer-breitbart-war-on-media-steve-bannon-donald-trump-nigel-farage.
7. Gabriel Weimann, Terrorism in Cyberspace: The Next Generation (Washington, DC: Woodrow Wilson Center Press, 2015), 138.
8. Alex Lubben, "Twitter's Users Are 15 Percent Robot, but That's Not Necessarily a Bad Thing," VICE News, 12 March 2017, https://news.vice.com/story/twitters-users-are-15-percent-robot-but-thats-not-necessarily-a-bad-thing.
9. Jacques Ellul, Propaganda: The Formation of Men's Attitudes (New York: Knopf, 1965), 6.
10. Eric Hoffer, The True Believer: Thoughts on the Nature of Mass Movements (New York: Harper and Row, 1951), 105.
11. Thomas Rid, Cyber War Will Not Take Place (New York: Oxford University Press, 2013), 132.
12. Ellul, Propaganda, 85.
13. Daniel Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2011), 87.
14. Christopher Paul and Miriam Matthews, The Russian "Firehose of Falsehood" Propaganda Model, RAND Report PE-198-OSD (Santa Monica, CA: RAND, 2016), 4, https://www.rand.org/pubs/perspectives/PE198.html.
15. Garth Jowett and Victoria O'Donnell, Propaganda & Persuasion, 5th ed. (Thousand Oaks, CA: SAGE, 2012), 159.
16. Katerina Eva Matsa and Kristine Lu, "10 Facts about the Changing Digital News Landscape," Pew Research Center, 14 September 2016, http://www.pewresearch.org/fact-tank/2016/09/14/facts-about-the-changing-digital-news-landscape/.
17. Jowett and O'Donnell, Propaganda & Persuasion, 300.
18. Tom Hashemi, "The Business of Ideas Is in Trouble: Re-injecting Facts into a Post-truth World," War on the Rocks, 9 December 2016, https://warontherocks.com/2016/12/the-business-of-ideas-is-in-trouble-re-injecting-facts-into-a-post-truth-world/.
19. Asur, Huberman, Szabo, and Wang, "Trends in Social Media," 1.
20. Merriam-Webster Dictionary Online, s.v. "lede," accessed 10 October 2017, https://www.merriam-webster.com/dictionary/lede. "The introductory section of a news story that is intended to entice the reader to read the full story."
21. Tess Townsend, "The Bizarre Truth behind the Biggest Pro-Trump Facebook Hoaxes," Inc.com, 21 November 2016, https://www.inc.com/tess-townsend/ending-fed-trump-facebook.html.
22. Craig Silverman, "This Analysis Shows How Viral Fake Election News Stories Outperformed Real News on Facebook," BuzzFeed News, 16 November 2016, https://www.buzzfeed.com/craigsilverman/viral-fake-election-news-outperformed-real-news-on-facebook?utm_term=.qwWdA0G8G#.fcEv1Qono.
23. Art Swift, "Americans' Trust in Mass Media Sinks to New Low," Gallup, 14 September 2016, http://news.gallup.com/poll/195542/americans-trust-mass-media-sinks-new-low.aspx.
24. Andrea Peterson, "Three Charts that Explain how U.S. Journalists Use Social Media," Washington Post, 6 May 2014, https://www.washingtonpost.com/news/the-switch/wp/2014/05/06/three-charts-that-explain-how-u-s-journalists-use-social-media/?utm_term=.9cdd82cb8fa7.
25. Weimann, Terrorism in Cyberspace, 138.
26. Audrey Kurth Cronin, "ISIS Is Not a Terrorist Group," Foreign Affairs (March/April 2015), https://www.foreignaffairs.com/articles/middle-east/isis-not-terrorist-group.
27. Stephen M. Walt, "ISIS as Revolutionary State," Foreign Affairs (November/December 2015): 42, https://www.belfercenter.org/publication/isis-revolutionary-state.
28. Caliphate is defined as "a form of Islamic government led by a caliph, a person considered a political and religious successor to the Islamic prophet, Muhammad, and a leader of the entire Muslim community." Source: Wadad Kadi and Aram A. Shahin, "Caliph, caliphate," in The Princeton Encyclopedia of Islamic Political Thought, ed. Gerhard Bowering, Patricia Crone, Wadad Kadi, Devin J. Stewart, Muhammad Qasim Zaman, and Mahan Mirza (Princeton, NJ: Princeton University Press, 2013), 81–86, http://www.jstor.org/stable/j.ctt1r2g6m.8.
29. Graeme Wood, "What ISIS Really Wants," Atlantic, March 2015, 3, https://www.theatlantic.com/magazine/archive/2015/03/what-isis-really-wants/384980/.
30. Dabiq is also the name of the ISIS magazine, which is available electronically and spread via social media.
31. Walt, "ISIS as Revolutionary State," 43.
32. J. M. Berger, "How ISIS Games Twitter," Atlantic, 16 June 2014, https://www.theatlantic.com/international/archive/2014/06/isis-iraq-twitter-social-media-strategy/372856/.
33. Ibid.
34. "Terrorist Use of Social Media: Policy and Legal Challenges," roundtable forum (Washington, DC: Council on Foreign Relations, 14 October 2015).
35. Berger, "How ISIS Games Twitter."
36. Carleton English, "Twitter Continues to Wage its Own War against ISIS," New York Post, 21 March 2017, http://nypost.com/2017/03/21/twitter-continues-to-wage-its-own-war-against-isis/.
37. United States Department of State, Soviet Influence Activities: A Report on Active Measures and Propaganda, 1986–87 (Washington, DC: Bureau of Public Affairs, 1987), viii.
38. Natasha Bertrand, "It Looks Like Russia Hired Internet Trolls to Pose as Pro-Trump Americans," Business Insider, 27 July 2016, http://www.businessinsider.com/russia-internet-trolls-and-donald-trump-2016-7.
39. Vladimir Isachenkov, "Russia Military Acknowledges New Branch: Info Warfare Troops," AP News, 22 February 2017, https://www.apnews.com/8b7532462dd0495d9f756c9ae7d23c.
40. Richard Gonzalez, "CIA Director Pompeo Denounces WikiLeaks as 'Hostile Intelligence Service,'" NPR, 23 April 2017, http://www.npr.org/sections/thetwo-way/2017/04/13/523849965/cia-director-pompeo-denounces-wikileaks-as-hostile-intelligence-service.
41. Malcolm Nance, The Plot to Hack America: How Putin's Cyberspies and WikiLeaks Tried to Steal the 2016 Election (New York: Skyhorse Publishing, 2016), Kindle edition, 1,839.
42. Adrian Chen, "The Agency," New York Times Magazine, 2 June 2015, https://www.nytimes.com/2015/06/07/magazine/the-agency.html. On 11 September 2014, the small town of St. Mary Parish, Louisiana, was thrown briefly into a panic when residents began hearing reports through text, social media, and local television stations that a nearby chemical plant fire was spreading toxic fumes that would soon endanger the whole town. The entire narrative was based on falsified, but very real-looking, online news stories, hashtag manipulation, and mass texts (SMS) to various numbers with the local area code and dialing prefix. The actual source of the news was not the chemical factory; it was a nondescript building in St. Petersburg, Russia, where an army of online cyber-warrior trolls seeks to distribute false information.
43. Statement of Clint Watts, Foreign Policy Research Institute fellow, in "Disinformation: A Primer in Russian Active Measures and Influence Campaigns," testimony before the Senate Intelligence Committee, 115th Cong., 1st sess., 30 March 2017, https://www.intelligence.senate.gov/sites/default/files/documents/os-cwatts-033017.pdf.
44. Chen, "The Agency."
45. Because of the Adrian Chen article, I observed particular tweeting patterns of certain individuals involved in a hoax on the campus of the University of Missouri that seemed to match the methods of the Russian trolls interviewed by Chen. I mention only one particular user in this article, but I also monitored a dozen or so accounts that contributed to that hoax. Each account followed a pattern that also happened to align with noted Russian influence operations in Europe and eventually in the US presidential election. I describe that transition in the article. From those accounts, I built a database of suspected Russian bot accounts to build a network map. The Mizzou hoax was a trend-hijacking effort launched by actors who later proved to match the Russian modus operandi of using cyber trolls originally observed by Adrian Chen and confirmed by the Office of the Director of National Intelligence (ODNI) report and Foreign Policy Research Institute fellow Clint Watts in his testimony before the Senate Intelligence Committee (note 43).
46. Nadine Schmidt and Tim Hume, "Berlin Teen Admits Fabricating Migrant Gang-Rape Story, Official Says," CNN, 1 February 2016, http://www.cnn.com/2016/02/01/europe/germany-teen-migrant-rape-false/index.html.
47. Judy Dempsey, "Russia's Manipulation of Germany's Refugee Problems," Carnegie Europe, 28 January 2016, http://carnegieeurope.eu/strategiceurope/?fa=62611.
48. Schmidt and Hume, "Berlin Teen Admits Fabricating Migrant Gang-Rape Story."
49. Barbara Tasch, "'The Aim Is to Weaken the West': The Inside Story of How Russian Propagandists Are Waging War on Europe," Business Insider, 2 February 2017, http://www.businessinsider.com/russia-propaganda-campaign-weakening-europe-2017-1?r=UK&IR=T.
50. Harriet Sherwood, "Polish Magazine's 'Islamic Rape of Europe' Cover Sparks Outrage," Guardian, 18 February 2016, https://www.theguardian.com/world/2016/feb/18/polish-magazines-islamic-rape-of-europe-cover-sparks-outrage.
51. Chen, "The Agency."
52. Robinson Meyer, "War Goes Viral: How Social Media Is Being Weaponized across the World," Atlantic, 18 October 2016, https://www.theatlantic.com/magazine/archive/2016/11/war-goes-viral/501125/.
53. Office of the Director of National Intelligence (ODNI), Intelligence Community Assessment Report, Assessing Russian Activities and Intentions in Recent US Elections, 6 January 2017, ii, https://www.dni.gov/files/documents/ICA_2017_01.pdf.
54. Hanna Rosin, "Among the Hillary Haters," Atlantic, 1 March 2015, 63, https://www.theatlantic.com/magazine/archive/2015/03/among-the-hillary-haters/384976/.
55. K. Thor Jensen, "Inside Donald Trump's Twitter-Bot Fan Club," New York Magazine, 15 June 2016, http://nymag.com/selectall/2016/06/inside-donald-trumps-twitter-bot-fan-club.html.
56. David A. Fahrenthold, "Trump Recorded Having Extremely Lewd Conversation about Women in 2005," Washington Post, 8 October 2016, https://www.washingtonpost.com/politics/trump-recorded-having-extremely-lewd-conversation-about-women-in-2005/2016/10/07/3b9ce776-8cb4-11e6-bf8a-3d26847eeed4_story.html.
57. "The Podesta Emails," Politico LiveBlog, accessed 6 December 2016, http://www.politico.com/live-blog-updates/2016/10/john-podesta-hillary-clinton-emails-wikileaks-000011.
58. ODNI Report, 2.
59. Faiz Siddiqui and Susan Svrluga, "N.C. Man Told Police He Went to D.C. Pizzeria with Gun to Investigate Conspiracy Theory," Washington Post, 5 December 2016, https://www.washingtonpost.com/news/local/wp/2016/12/04/d-c-police-respond-to-report-of-a-man-with-a-gun-at-comet-ping-pong-restaurant/?utm_term=.c33057f66007.
60. This count is based on analysis of the followers of followers of suspected troll accounts and bots. The study was conducted 15 March 2017. The number of accounts appears to have declined dramatically since May 2017, following the French election, implying that Twitter suspended some of the accounts. Unfortunately, software limitations prevent this analysis from being more accurate. Additionally, it is nearly impossible to derive the exact number of Russian accounts from that network using my available resources.
61. Ellul, Propaganda, 6.
62. Many on the left have mischaracterized the attack as "Russian hacking of the election," which has in turn conflated the issue of the John Podesta email theft with a hacking of the actual election systems. To be clear: there is no evidence of any sort of hack on any ballot-counting systems, only evidence outlined in this paper of two hacks (Democratic National Committee and Podesta) combined with an influence/information operation.
63. ODNI Report, 1.
64. Adrian Chen, "The Real Paranoia-Inducing Purpose of Russian Hacks," New Yorker, 27 July 2016, https://www.newyorker.com/news/news-desk/the-real-paranoia-inducing-purpose-of-russian-hacks.
65. Catherine Theohary and Cory Welt, "Russia and the U.S. Presidential Election," CRS Report no. IN10635 (Washington, DC: Congressional Research Service, 2017).
66. David Ignatius, "Russia's Radical New Strategy for Information Warfare," Washington Post, 18 January 2017, https://www.washingtonpost.com/blogs/post-partisan/wp/2017/01/18/russias-radical-new-strategy-for-information-warfare/?utm_term=.da53e31d7aaa.
67. Ibid.
68. "Ex-Sony Chief Amy Pascal Acknowledges She Was Fired," NBCNews.com, 12 February 2015, https://www.nbcnews.com/storyline/sony-hack/ex-sony-chief-amy-pascal-acknowledges-she-was-fired-n305281.
69. The political left in the United States seems to have a large group of bot accounts forming around the "Resist" movement. It is unclear whether those accounts are foreign cyber warriors or bots, but external actors can certainly feed off the underlying narratives and tap into existing networks of true believers.
70. David Kilcullen, Out of the Mountains: The Coming Age of the Urban Guerrilla (New York: Oxford University Press, 2013), 231.
71. Anthony Faiola, "As Cold War Turns to Information War, a New Fake News Police Combats Disinformation," Washington Post, 22 January 2017, https://www.washingtonpost.com/world/europe/as-cold-war-turns-to-information-war-a-new-fake-news-police/2017/01/18/9bf496-d80e-11e6-a0e6-d502d6751bc8_story.html?utm_term=.7c99cc2fadd5.
72. Meyer, "War Goes Viral."
73. Ibid.
74. Cadwalladr, "Robert Mercer: The Big Data Billionaire Waging War on the Mainstream Media," 1.8.
75. Malcolm Mitchell, Propaganda, Polls, and Public Opinion: Are the People Manipulated? (Englewood Cliffs, NJ: Prentice-Hall, 1977), 12.
76. Ibid., 13.
77. Rebecca Carroll, "The State Department Is Fighting with ISIL on Twitter," Defense One, 25 June 2014, http://www.defenseone.com/technology/2014/06/state-department-fighting-isil-twitter/87286/.
78. Rita Katz, "The State Department's Twitter War with ISIS Is Embarrassing," Time, 16 September 2014, http://time.com/3387065/isis-twitter-war-state-department/.
79. Ellul, Propaganda, 241.
80. Adrian Chen, "The Propaganda about Russian Propaganda," New Yorker, 1 December 2016, https://www.newyorker.com/news/news-desk/the-propaganda-about-russian-propaganda.
81. Richard Deacon, The Truth Twisters (London: Macdonald, 1987), 95.
82. Michelle Castillo, "Facebook Found Fake Accounts Leaking Stolen Info to Sway Presidential Election," CNBC.com, 27 April 2017, https://www.cnbc.com/2017/04/27/facebook-found-efforts-to-sway-presidential-election-elect-trump.html.
83. Eric Bradner, "At CNN Town Hall, McCain and Graham Give Their View of Trump's Presidency so Far," CNN, 2 March 2017, http://www.cnn.com/2017/03/01/politics/john-mccain-lindsey-graham-town-hall/index.html.