Disinformation Is Drowning Democracy
In the new age of lies, law, not tech, is the answer.
From India to Indonesia to Brazil, democracy is being compromised by online domestic disinformation campaigns from political parties seeking to gain an advantage. Democratic institutions have not been able to keep up and have instead deferred to tech firms, trusting them to referee online behavior. But this is a task far beyond the limited capabilities and narrow motivations of companies such as Facebook and Twitter. If the democratic recession is to end, democratic institutions need to create new rules and hold the responsible parties to account.
As internet penetration rapidly increases, new users, from retirees in Arizona to villagers in Uttar Pradesh, are ill-equipped to navigate their new information environments and are being exploited by domestic political parties and interest groups. The digital campaigning strategies used to exploit them are remarkably similar across the globe.
Political parties such as the Bharatiya Janata Party (BJP) in India and the Great Indonesia Movement Party in Indonesia often operate through surrogates and consultancies to keep their activities at arm’s length. They deploy armies of real humans managing news sites, YouTube channels, online accounts, and WhatsApp groups to produce and disseminate hyperpartisan content.
Using data analytics, their messages are targeted, primarily through WhatsApp, to resonate with specific demographics. The content is polarizing, drawing on religious symbolism, nationalism, and moral narratives to vilify certain ethnic groups and political opponents. These tactics have been so successful that a plethora of political consultancies and data analytics firms beyond the infamous Cambridge Analytica, including Rasmussen, SCL Group, and Targeted Victory, now offer similar services to political parties and candidates across the world.
Democratic discourse in India has been marked in recent years by online fake news and dangerous speech: rumors about kidnappings and the mistreatment of cows, as well as doctored images and videos of politicians. The nationalist BJP led the way there in 2014, and other parties are playing catch-up.
Disinformation and hyperpartisan content are created and disseminated en masse by political parties and their associates, as exposed by whistleblowers and former employees, including Shivam Shankar Singh of the BJP’s notorious social media operation, the IT Cell; journalists such as Swati Chaturvedi; and influencers such as Dhruv Rathee. False and hyperpartisan content is often created by paid IT Cell staff or surrogate groups and targeted at communities down to the local level through customized WhatsApp groups. BJP President Amit Shah has directed state units to once again compile voters’ phone numbers into WhatsApp groups for every polling station.
As in other countries, it’s becoming increasingly tough for Indians to distinguish between organic democratic discourse and material promoted by political parties and their associates. So-called astroturfing has been around for a long time, but the internet allows it to be done more effectively and deceptively than ever.
Conventional political ads are tied into this nexus. Online campaigning is big business, with over $350,000 spent on political Facebook ads in India between March 2 and 16 alone, ahead of the six-week general election that began April 11. On both Google Ads and Facebook, political advertising is often channeled through pages and consultancies such as Nation with NaMo, My First Vote for Modi, and Digitant Consulting Pvt. Ltd., none of which outline their relationship with the government.
The same problems hit Brazil last year, with several parties benefiting from misinformation in the lead-up to the October elections. An investigation found political consultancies and businessmen with links to now-President Jair Bolsonaro’s campaign funded disinformation campaigns on WhatsApp that targeted false and hyperpartisan messages to unsuspecting citizens. As in India, the messages were crafted to go viral and targeted political opponents and the mainstream media.
Late last year, an investigation into disinformation in Indonesia revealed that both political camps were using shadow teams, producing and disseminating disinformation through proxies kept at arm’s length. Reuters also uncovered groups of freelancers called “buzzers” who spread falsehoods and stoked religious divisions. Mafindo, an organization focused on combating fake news and improving digital literacy, found that political fake news and disinformation shot up by 61 percent between December 2018 and this January, ahead of elections on April 17.
And that’s the main problem, and the main difference from conventional political advertising. Whether or not the content is partisan, users remain ignorant of who is behind the information they see and why it is being artificially amplified. Anonymous pamphlets and rumors are hardly new in politics, but social media has been a force multiplier for such behavior. Citizens have no reason to suspect that political parties are distorting what they see, which allows parties to evade accountability from voters for spreading harmful, false, or toxic messages, particularly when those messages can be micro-targeted.
The response from politicians, governments, and the media has been to call on Facebook and other tech firms to prevent misuse of their platforms by removing content or banning political accounts. India’s Election Commission brought tech platforms together to sign a voluntary code of ethics to support fair and transparent elections there. Tech platforms have doubled down on combating coordinated and inauthentic content in major markets ahead of elections and have grown bolder in countering bad behavior, including by naming and shaming the political parties involved.
But tech companies should not be running point on elections without input from existing democratic institutions. They cannot truly alter political incentives simply by removing and blocking content; that only leads to a cat-and-mouse game with bad actors. Nor do they have the democratic mandate to play such a critical role and sanction parties, leaving the companies open to allegations of political bias. There are also serious limits to their capacity to monitor complex political networks across languages and cultures in distant places.
The key role must be played by national election commissions. Beyond signing codes of ethics with tech platforms, election commissions need to modernize their own regulations to create codes of conduct for online campaigns that prevent parties from operating through proxies. Campaign finance and data protection regulation also needs an update. Groups purchasing political ads online should be forced to disclose their relationships with campaigns, and electoral data should not be used for micro-targeting. Citizens should know when the information they’re receiving actually comes from a political party and its associates.
Here, tech companies and nongovernmental organizations can assist in investigations and in developing attribution tests to determine when a proxy is or could be linked to a political party. The penalties for misleading the public should be severe and should come from existing public institutions. Just as community liaison electoral officials are citizens’ first point of contact for reporting electoral fraud, social media liaison officers could help electoral agencies gather evidence and direct action against political parties.
This will require tech platforms to come to the table as well. Banning online political ads, as Google did in Canada, might be justified in the short term, but it is not a healthy long-term solution for platforms that are central to civic debate.
The work of democracy cannot be left to global tech companies alone. That responsibility rests on the public’s shoulders and those of democratic institutions. If they fail to modernize our election regulations, the democratic recession will only continue.