
For Facebook, It’s All About the Bottom Line

Mark Zuckerberg’s commitment to free speech always concealed corporate interests first.

Facebook co-founder, Chairman, and CEO Mark Zuckerberg testifies before the House Energy and Commerce Committee in the Rayburn House Office Building on Capitol Hill on April 11, 2018. Chip Somodevilla/Getty Images

When Mark Zuckerberg took the stage at Georgetown University in late 2019, the heat was already on him. His company was getting ripped apart in the media, with journalists and experts examining every facet of its practices and every policy call he and his lieutenants had made.

Yet despite it all, Zuckerberg shocked the world. His company would stand for “free speech,” he proclaimed. Even politicians who intentionally spread lies in their own political interest would not be stopped or censored by his company.

The depressing truth is that the reasoning behind Zuckerberg’s posture, and the company’s underlying policy position, which it claims is defined by free speech, is not about free speech at all. It is about money.

In 2019, he was under plenty of pressure to change. He had already announced that the company would set up the Oversight Board, a body nominally external to Facebook designed to adjudicate a narrow set of content-policy decisions facing the company, a proposal many criticized as a corporate regime destined to fail. Add to that the backdrop of the Russian disinformation operations that came to light in 2017, the year I left the company as a policy advisor, and that may well have swung the 2016 presidential election. It was on the back of these events that many Democrats were up in arms over Facebook’s questionable privacy, advertising, and corporate-development practices.

There was intense political intrigue over the company’s handling of political affairs, too; in the aftermath of the 2016 Gizmodo report and related reporting about an alleged liberal spin within the company’s ranks, U.S. President Donald Trump and other Republicans were spurring on claims of anti-conservative bias, suggesting that Facebook suppressed conservative viewpoints as a rule in algorithms like the News Feed recommendation engine. This even resulted in the Trump Justice Department, led at the time by Attorney General Jeff Sessions, teaming up with state attorneys general to advance a sweeping regulatory inquiry into the possibility of bringing an antitrust suit against the company.

But perhaps most difficult for Facebook to contend with were tensions with the left; after proof of the Russian infiltration of Facebook’s targeted advertising regime in 2016, Democrats, weary and wary of past and future instances of politically harmful misinformation tropes and disinformation campaigns involving deepfakes, half-truths, and coordinated ad dissemination, were eyeing a slate of robust regulatory reforms as soon as they could get back into political power.

Zuckerberg was thus balancing on a political tightrope; threading the needle between conservatives and liberals, either of whom could win the then-upcoming 2020 presidential election, could not have been an easy task. He was contending with several weighty themes all at once: the most sacred of American civil liberties in freedom of political expression, the economic fortunes of the masses, the regulatory fate of the internet, and indeed the U.S. presidential election itself. And yet, he chose to do nothing, in the name of “free speech.”

What he perhaps did not see was that this fateful decision would help lead to the horrific events the world witnessed in Washington on Wednesday: a mob that struck at the heart of American democracy, the killing of a police officer, and the pitiable deaths of four deluded individuals. But perhaps he should have seen that coming. After all, Facebook had already been blamed for stirring up anti-Rohingya mobs in Myanmar, violence against Muslims in Sri Lanka, and the spread of conspiracy theories at home.

But consider Zuckerberg’s calculus: Above all else, according to the norms of American corporate life, he must serve the interests of his public company’s shareholders. He is their chief executive, and under his fiduciary duties he must do what is financially right by them. And he has both a personal financial and, perhaps, a psychological incentive in ensuring his company’s ongoing dominance of the social media market.

Seen through this lens, Zuckerberg’s thought process immediately crystallizes. In 2019, there was tremendous uncertainty over the outcome of the presidential election that would follow just 12 months later, an election that could also determine crucial regulatory outcomes for Facebook and the internet industry. He did not want to raise Trump’s ire; the president had layered various threats on Facebook beyond claims of anti-conservative bias, including the suggestion that the company presented a “very antitrust situation.” Perhaps he hoped that by not angering the president, he could depend on the Republicans to end the threats and give his company a regulatory free pass. And perhaps even more critically, the company did not want to tick off the president’s massive following, much of which is highly active on Facebook’s platforms, and thereby severely diminish on- and off-platform engagement among political conservatives in the United States.

If Zuckerberg had said that Facebook would begin to set clear rules of the road for itself and begin to look more harshly on disinformation, hateful conduct, incitement to violence, and other forms of offending content, it is also likely that he would have set off on a regulatory slippery slope. Governments the world over want Facebook to establish public rules that it closely polices. Stating publicly where those rules lie would have invited the chance of an open, global regulatory inquiry into where Facebook and similar companies should paint those lines—and with it, more questions about a company whose internal practices have not generally stood up to the light of day.

Zuckerberg has today responded to yesterday’s appalling events by suspending Trump’s account for at least two weeks, perhaps indefinitely. But this may be too little, too late; it was Zuckerberg’s fateful decision to harbor even the most divisive content on his platform that ultimately helped produce such a divided culture in our national politics. We have now seen an army of Trump followers holding stakes and hateful banners under the pillars of the Capitol building, and we can trace the dollars back to one company. Even the decision to de-platform Trump is a calculation that, it can be argued, served the company’s interests on this day, given Trump’s election loss and relative concession, and the force of public anger.

What must come now is a public reckoning for Facebook, followed in earnest by sweeping regulatory change: legislative reforms that curb the spread of offending content online by incentivizing more aggressive corporate moderation through Section 230 reform, and that address the company’s and the broader industry’s commercial overreaches through consumer privacy, market competition, and algorithmic transparency measures. Under the incoming Biden-Harris administration, and with anticipated cooperation in both the House and Senate, such reforms could well be in the offing. Nothing else will convince the company or its executives. True democracies favor free markets and capitalistic activity, but only to the point that the markets don’t crack democracy itself.

Dipayan Ghosh is co-director of the Digital Platforms & Democracy Project and Shorenstein Fellow at Harvard Kennedy School. He was a technology and economic policy advisor for former President Barack Obama, and until recently worked on U.S. privacy and public policy issues at Facebook.