The Future of War Will Be ‘Liked’
In the social media age, what you share is deciding what happens on the battlefield.
It was, perhaps, the strangest demand in political history:
“The middle photo is taken from Hungarian porn. Stop using fake photos to ‘trick’ people into supporting your lost cause.”
This Nov. 18, 2014, tweet from a now-defunct Twitter account run by the U.S. State Department offered an early glimpse into a new front in the future of war: trolling. The message was the outgrowth of an effort the department had launched in 2011 to track and counter terrorist propaganda, first against al Qaeda, then against the fast-growing Islamic State that had spun out of its Iraqi remains.
The campaign may have sounded sensible, but it soon backfired. Instead of cheering on the online battle against extremism, Twitter users piled on with more questions than the staid Foggy Bottom bureaucrats manning the account were prepared to answer. “How did the State Dept. know it was Hungarian porn?” @SpaSuzy asked. “dude … it’s really weird you know so much about hungarian porn,” @7thhorse added.
After an avalanche of criticism, the State Department decided it was inappropriate for the U.S. government to get stuck in the muck of social media—better to stick to airstrikes—and pulled the plug on the Twitter account.
Four years later, such scruples seem almost quaint. In an era in which President Donald Trump didn’t just rise to power through his deft use of the same medium, but then even used it to fire his first secretary of state, the old notions that government should stay above the social media fray have evaporated. Facebook, YouTube, and Twitter have become crucial battlegrounds for politics, war, and even truth itself. Social media has emerged as an arena in which virality—how far and wide a message spreads—trumps veracity. In this domain, attention is power. Win enough of it and you can reshape the very fabric of reality.
A generation ago, the new notion of what was called “cyberwar”—the hacking of networks—began to take conflict into a new domain. Today, what we call “like wars”—the hacking of the people and ideas on those networks—marks the latest twist in the ever-evolving nature of warfare.
On the surface, many of these battles waged on social media can seem like mere propaganda, and an often silly version at that: teenaged trolling transposed onto the global stage. In August 2017, for example, the Ukrainian government’s official Twitter account attacked Russia with a mocking South Park GIF; in June 2018, the Israeli Embassy in Washington, D.C., answered a fire-and-brimstone threat by Iran’s supreme leader, Ayatollah Ali Khamenei, with a Mean Girls meme; and in May 2018, the U.S. Air Force cracked jokes about airstrikes in Afghanistan while the Taliban returned the favor by poking fun at former U.S. commander David Petraeus’s illicit love affair.
The goal for all such actors is not merely the lulz but to ridicule their foes and expand their influence, in a world where online sway can drive real-world power. Yet beneath it all, a more serious side of conflict also plays out, its ammunition the bevy of images taken from actual battles. Today, nearly all our moves are tracked, in everything from election campaigns to military ones.
Some of this material is intentional: selfies taken in the midst of battle, observers watching events unfold with smartphone in hand. Other material is captured incidentally: images that lie in the frame or information embedded invisibly, from the geolocation of CIA black sites revealed by guards’ use of exercise apps to the metadata that accompanies every online post. The result is that the smallest of firefights is observed by a global audience, while terrorist attacks are even livestreamed by the killers themselves. Open-source intelligence analysts then use these same digital breadcrumbs to reveal new secrets, documenting war crimes that would otherwise go untracked or assessing the strength of enemy formations that would otherwise go unobserved. It works for both good and ill: Terrorists use this information to win new recruits; human rights activists use it to highlight the plight of civilians caught in harm’s way and even steer rescues toward them. During the 2016-2017 Battle of Mosul—the most livestreamed and hashtagged siege in history—thousands of virtual observers waited for each new snippet of content, spinning it to all of these ends at once.
These battles that play out in the digital shadows are not just about unveiling secrets but burying truths—and even shaping hearts, minds, and actions. Russian sockpuppets and botnets, for instance, did quite a bit more than simply meddle in the 2016 U.S. presidential election. They used a mix of old-school information operations and new digital marketing techniques to spark real-world protests, steer multiple U.S. news cycles, and influence voters in one of the closest elections in modern history. Using solely online means, they infiltrated U.S. political communities so completely that flesh-and-blood American voters soon began to repeat scripts written in St. Petersburg and still think them their own. Internationally, these Russian information offensives have stirred anti-NATO sentiments in Germany by inventing atrocities out of thin air; laid the pretext for potential invasions of Estonia, Latvia, and Lithuania by fueling the political antipathy of ethnic Russian minorities; and done the same for the very real invasion of Ukraine. And these are just the operations we know about.
Such online skirmishes may appear insignificant compared with real fights conducted with real weapons, but they have become just as important. As Gen. Stanley McChrystal, the highly decorated former commander of Joint Special Operations Command, stated at a military conference in 2017, for the foreseeable future what happens on social media will be crucial to the outcome of any debate, battle, or war. The reason, he explained, is that battles are now being waged over truth itself. In these fights, “the line between reality and perception will be blurred,” he said. “Separating fact from fiction will be tough for governments but almost impossible for populations.”
McChrystal’s comments may seem to echo the ravings of the notorious conspiracy theorist Alex Jones, whose website Infowars uses the tagline, “There’s a war on for your mind!” But that doesn’t make them any less truthful. With our personal and political understanding of the world increasingly filtered through online sources, images and ideas distributed and created on social media may become more important than objective facts. As McChrystal put it, “Shaping the perception of which side is right or which side is winning will be more important than actually which side is right or winning.”
Indeed, the messages coursing through social media today shape not just the perceived outcomes of conflicts but the very choices leaders make during both political and military campaigns. Russia, for instance, has crafted its information operations into a potent, nimble weapon that can target U.S. voters or pinpoint artillery strikes in Ukraine, using what happens in the online world to geolocate soldiers—and then message their looming death right before the cannons fire. Social media even shapes the overall flow of battle. A 2016 study by the American University professor Thomas Zeitzoff of the Israel Defense Forces’ 2012 air campaign against Hamas in the Gaza Strip found that the conflict followed the pace set on Twitter; the tempo of operations and targeting shifted depending on which side was dominating the online conversation at the time. The military officers and civilian leaders were watching their social media feeds and reacting accordingly.
Sometimes, social media posts can even spark new fights, especially when they play to long-standing tensions or hatreds. The Sri Lankan government blamed viral Facebook rumors for stirring up the hatred that led to a brutal assault on the country’s Muslim minority this March. In June, false reports circulating among India’s 200 million WhatsApp users spurred a spate of lynchings. Meanwhile, racist messages and rumors shared on Facebook continue to fuel the ongoing ethnic cleansing of the Rohingya Muslim minority in Myanmar.
Mounting evidence suggests that these online tugs of war may not just start fights and mass killings but also make conflicts harder to end. Criminologists who study the spike of murders in cities such as Chicago note how an increasing share of gang violence is attributable to social media trash-talking. Sometimes, the spark is a disrespectful emoji; other times, it’s a long-forgotten post, dug up in a moment of escalating tensions. Unlike an altercation in the street (or a traditional negotiation between diplomats), it doesn’t matter if the original insult was made a year ago or hundreds of miles away. All that matters is that the world is watching and the internet never forgets. It’s easy to see how a similar dynamic will haunt future cease-fire negotiations, whether at the end of an insurgency or the conclusion of a major interstate war. There are always some people intent on keeping the violence going. And online, they never lose their voice.
Daunting as all this may seem, however, social media has only just begun to shape the future of war. Only half the world is online, and the tools of “like wars” today are like the biplanes of early air war. Indeed, new machine intelligence is making it ever harder for humans to discern truth from lies and is possibly reshaping our conception of reality itself. Over the last year, the techniques needed to create “deep fakes”—hyper-realistic digital forgeries generated by advanced artificial intelligence neural networks—have become increasingly accessible. This technology, currently used mostly by cutting-edge computer scientists and inventive pornographers, will soon flood the internet with pitch-perfect voice imitations, photo-realistic video fabrications, and vast networks of chattering bots indistinguishable from their human counterparts. And like everything else, deep fakes are also likely to be weaponized, in both elections and battles. We’ve already had a taste of it: In its run to seize Mosul, the Islamic State used a mix of real and fake news to help spur the retreat of Iraqi Army units. Even U.S. information war units now train at sowing false digital trails to misdirect their foes. We may one day even face the prospect of a digital Gulf of Tonkin, in which the very case for a real war is built wholly on AI-constructed lies.
These changes reshape the speed, experience, and even the reach of conflict. In the social media age, every election, every conflict, and every battle is simultaneously global and local. Even as the physical experience of war grows more alien to the average Westerner with each generation, it has also become more personal than ever. Our choices of what to “like” and share (or not) shape not only the outcomes of elections and battles but also what our friends, family, and the wider world treat as real. You may not be interested in like wars, but the future of war and politics is very much interested in you—and your clicks.