Can Facebook — and the Republic — Solve the Fake News Problem?
Incendiary false news reports proliferated on the platform during the U.S. election campaign.
This election season, Facebook turned into a cesspool of false news. A fake story about Pope Francis endorsing Donald Trump spread like wildfire. An article purportedly describing how an FBI agent investigating Hillary Clinton’s email server had killed his wife and shot himself flew across the platform. Neither story was true — and now the social media giant faces an outcry that it effectively aided and abetted Donald Trump’s electoral victory last week.
Mark Zuckerberg, Facebook’s founder and CEO, has called the notion “crazy,” but he’s clearly sensitive to the criticism at a time when nearly half of Americans consume news through Facebook. What’s more, the entire business rationale of the social media site is to generate advertising revenue — dollars predicated on the notion that what Facebook tells its users, they take to the bank.
This week, the company moved to cut off fake news outlets from its advertising platform — likely driven in part by a BuzzFeed exposé detailing how a group of Macedonian teens set up a cottage industry of right-wing Facebook pages trafficking in viral fake news and pulling down enough advertising to fund a comfortable teenage existence in a poor Balkan state. But Zuckerberg faces a much larger problem than strangling the ill-gotten ad revenue of a few Balkan propagandists.
“Facebook needs to be known as a place that can have influence on how people think and behave,” said Mike Ananny, a professor at the University of Southern California who studies news and social media. “If they didn’t, then why would advertisers give them money?”
Facebook newsfeeds are posts, photos, and articles curated just for users by a company-designed algorithm. Zuckerberg says the goal is to deliver the content that is most “meaningful” — in practice, that means content that your extended network has interacted with in some way by commenting on it, liking it, or sharing the material.
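To see why a feed ranked on engagement rewards the sensational, consider a minimal sketch in Python. This is a hypothetical illustration, not Facebook’s code: the Post fields, the interaction weights, and the scoring rule are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    # Weight deeper interactions (comments, shares) above passive likes.
    # The weights here are made up; the real system is not public.
    return post.likes * 1.0 + post.comments * 2.0 + post.shares * 3.0

def rank_feed(posts: list[Post]) -> list[Post]:
    # Order the feed by engagement, a stand-in for "meaningful" content.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Sober policy analysis", likes=40, comments=3, shares=1),
    Post("Pope endorses Trump!", likes=900, comments=450, shares=700),
])
print([p.text for p in feed])  # the viral false story ranks first
```

Whatever the real weights, any ranking rule of this shape pushes the most-shared item to the top with no regard for whether it is true.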
But that model has created a tinderbox for far-right conflagrations. In the month leading up to Nov. 10, Breitbart, the house organ of the alt-right, beat a who’s-who of establishment media outlets in total Facebook interactions. Only CNN and Fox outperformed the upstart website — whose success this election season propelled the site’s former CEO Steve Bannon, an operative with ties to white nationalist movements, into a senior advisory position in the Trump White House.
“As we’ve learned in this election, bullshit is highly engaging,” said Bobby Goodlatte, a former Facebook designer.
It’s not just Facebook’s problem. On Monday, a Google search for “final election results” turned up a fake article in Google News results claiming Trump had won the popular vote. In response, Google also cracked down on ad revenue for such sites, banning from its advertising network any site that would “misrepresent, misstate, or conceal information.”
Privately, Facebook executives are reportedly concerned about the degree to which fake news and racist and alt-right memes on the platform influenced voting decisions, and are looking for ways to better deal with fake and extreme right-wing political content. One group of employees believes the company hasn’t gone far enough and has formed a renegade task force to tackle the problem, according to BuzzFeed.
Over the weekend, Zuckerberg indicated that he would be unlikely to crack down hard on fake news on the platform. “Identifying the ‘truth’ is complicated,” he wrote on his page. “I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves.” Users flagging suspicious content, he seems to suggest, could be enough to weed out the rot.
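In its simplest form, that flag-and-weed approach might look like the following sketch. It is hypothetical: the thresholds, parameters, and function name are invented, and nothing here reflects Facebook’s actual moderation system.

```python
def should_demote(flag_count: int, view_count: int,
                  min_flags: int = 100, min_flag_rate: float = 0.01) -> bool:
    # Demote a story only when user reports are both numerous and
    # frequent relative to views, so a handful of hostile flags
    # cannot bury a legitimate article.
    if view_count == 0:
        return False
    return flag_count >= min_flags and flag_count / view_count >= min_flag_rate

# A story seen 50,000 times and flagged 600 times gets demoted;
# the same flag count on 10 million views does not.
print(should_demote(600, 50_000))      # True
print(should_demote(600, 10_000_000))  # False
```

The catch is that a rule like this only works if most of the flags are cast in good faith.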
In leaning on users to police the feed, Zuckerberg is betting on human nature. “Our goal is to show people the content they will find most meaningful, and people want accurate news,” he wrote. “In my experience, people are good, and even if you may not feel that way today, believing in people leads to better results over the long term.”
It’s not clear why Zuckerberg believes accurate news is in demand. “We don’t know whether people want accurate news,” Ananny said. “The National Enquirer should have gone out of business a long time ago if that was true.”
For its part, Google’s initial efforts to weed out noxious and fake news items come down to tweaking some code. “The goal of Search is to provide the most relevant and useful results for our users,” a spokesperson for the company said in a statement. “In this case we clearly didn’t get it right, but we are continually working to improve our algorithms.”
But for both Facebook and Google, algorithms only go so far in solving the fake news problem. Computers seem to have trouble determining what is and isn’t fake news, and Facebook has in recent months indicated it isn’t inclined to keep humans in the loop. That hasn’t gone well.
In August, the company fired a group of journalists who had been selecting so-called “trending stories” for prominent placement on the site, after allegations arose that they had a left-wing bias. They were replaced by an algorithm — which immediately promoted a false story claiming Fox News anchor Megyn Kelly had been fired for supporting Clinton. (She hadn’t been, of course.)
Zuckerberg insists his platform isn’t littered with false information — “more than 99 percent of what people see is authentic” — but it’s impossible to validate his claims.
For researchers like Ananny, studying Facebook is an exercise in frustration. The company provides little access to its data and is ferociously protective of how its algorithm works. While political scientists have determined that Facebook can influence voter turnout, the finer dynamics of how the platform shapes voter behavior remain unknown.
Researchers fear that social media companies such as Facebook are accelerating the trend begun almost two decades ago of herding news consumers into like-minded enclaves where they engage with content that only reinforces their existing views.
It’s a notion that Facebook denies, but tech thinkers argue that the so-called “filter bubble” shields internet users from opposing viewpoints. A 2016 study of Facebook, for example, found that users tend to “promote their favored narratives” and “form polarized groups.” Whether a given piece of content agrees with their pre-existing beliefs “helps to account for users’ decisions” about whether to share content — that most prized action on the world’s largest social media platform.
“Engagement is the holy grail for Facebook,” Ananny said. “But engagement is not what republics need in order to live together.”