Google can't encrypt its phone data as well as Apple. That's bad news for its customers -- and good news for the government.
Google’s mobile operating system runs on more phones around the world than Apple’s. And like Apple, Google has publicly embraced the kind of encryption designed to make it difficult for law enforcement to crack — and that has led Washington to accuse Silicon Valley of effectively helping terrorists.
That conflict escalated on Tuesday when a federal judge ordered Apple to unlock an iPhone 5c phone belonging to Syed Rizwan Farook who, with his wife, killed 14 people in a December shooting spree at a San Bernardino, California, community center.
The FBI isn’t simply going after Apple because the Islamist militant used an iPhone. Instead, Apple, not Google, is in the government’s crosshairs because of the little-known fact that Apple’s mobile software uses far stronger encryption than Google’s. That’s potentially bad news for the hundreds of millions of people who use Android phones, but it’s a boon for the law enforcement personnel who say encrypted communications prevented them from stopping the Paris attacks and could enable more terror strikes in the future.
“The phones of the rich keep them safe, and the phones of the poor leave them vulnerable,” said Christopher Soghoian, the principal technologist at the American Civil Liberties Union.
Together, Google and Apple dominate the mobile phone market, but Android devices are more popular than iPhones. Some 51 percent of American phones run Android, compared to the 44 percent that use iOS, and Android devices are much cheaper because they’re sold by a variety of manufacturers, from Samsung at the high end to Alcatel at the low end. An unlocked iPhone 6s starts at $649; an Alcatel can be had for $174.
But the sheer number of Android devices, many of them bargain models, also makes it harder for Google to implement strong encryption, according to Soghoian. He said that creates a “digital security divide” in which cheaper Android phones come with worse security.
Google did not respond to requests for comment on its security procedures. The company’s CEO, however, has publicly backed Apple in its fight with the government.
“Forcing companies to enable hacking could compromise users’ privacy,” the executive, Sundar Pichai, said on Twitter Wednesday. If the government prevails in the case, Pichai said it “could be a troubling precedent.”
Apple, by contrast, builds its phones and has full control over their software. When Apple discovers security problems with iOS, for example, it can push an update directly to its users. Google, on the other hand, is forced to issue updates to Android in a roundabout way. The company first updates the code, then sends it to device manufacturers, who make their own tweaks to the software. That software is then sent to mobile carriers, who make their own changes to it. Only then is it made available to users.
As a result, Android users consistently use out-of-date software riddled with security problems, according to security experts. “Unfortunately something has gone wrong with the provision of security updates in the Android market,” a trio of University of Cambridge computer scientists wrote in an October paper. “Many smartphones are sold on 12-24 month contracts, and yet our data shows few Android devices receive many security updates.” That’s welcomed by law enforcement and intelligence agencies, since old software is far easier to penetrate than newer versions, but potentially bad news for consumers.
For the moment, law enforcement personnel around the world say terror groups are increasingly using encrypted messaging tools and devices. In an interview with Yahoo News published Wednesday, National Security Agency chief Michael Rogers said that the Islamic State-affiliated attackers who struck Paris in November had encrypted some of their communications, hindering his agency’s ability to stop the attack.
“Clearly, had we known, Paris would not have happened,” Rogers said.
Apple, unsurprisingly, takes a different view. The company says it will fight the California order — which it fears will set a legal precedent in future cases — to protect the privacy of all its users. “While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products,” Apple CEO Tim Cook wrote in a statement. “And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.”
Apple doesn’t like to admit it, but its embrace of encryption is as much a business model decision as it is a principled embrace of user privacy. In 2013, classified National Security Agency documents leaked by whistleblower Edward Snowden showed how American tech companies worked closely with the U.S. government to eavesdrop or monitor consumers worldwide.
In the wake of those revelations, U.S. tech firms say many of their foreign clients questioned whether their data could be safe on an American server. Cisco said deals were slower to close post-Snowden, and IBM poured $1.2 billion into an initiative to build data centers abroad to address concerns about storing data within U.S. borders.
To quell those fears and buck the NSA, Silicon Valley also embraced strong encryption to an unprecedented degree. “Apple’s motives are clear, if not clearly expressed,” James Lewis, a senior fellow at the Center for Strategic and International Studies, wrote Wednesday. “The Snowden revelations damaged the brand of all American technology products.”
Apple has fought back with ever-stronger security measures, leaving Google struggling to catch up. In 2014, for example, Apple said it would scramble all of the information held on iPhones, making it accessible only to the user with a unique passcode, and Google quickly said it would follow suit. But Google soon backtracked: manufacturers of Android phones realized that if forced to implement strong encryption, their devices would take huge performance hits and appear less appealing to users.
As a result, Google did not force Android phones to use encryption, and was candid about the reasons. “Due to performance issues on some Android partner devices we are not yet at encryption by default,” the company said in a March statement.
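The idea behind the passcode-based device encryption described above can be illustrated with a toy sketch. This is not Apple’s actual design (real devices use AES with a key entangled with a secret fused into the hardware, which forces brute-force guessing to run on the phone itself); the XOR cipher below is a stand-in meant only to show why stored data is unreadable without the right passcode:

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes, length: int) -> bytes:
    """Stretch a short passcode into key material with PBKDF2.

    Real devices also mix in a hardware-unique secret, so the key
    cannot be derived off-device from the passcode alone.
    """
    seed = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)
    # Expand the seed to the required length by hashing counter blocks
    # (a toy key-stream generator, not a production KDF).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for AES: XOR the data with the key stream."""
    return bytes(b ^ k for b, k in zip(data, key))

salt = os.urandom(16)
secret = b"contacts, messages, photos"

key = derive_key("1234", salt, len(secret))
ciphertext = xor_cipher(secret, key)

assert ciphertext != secret                    # stored data is scrambled
assert xor_cipher(ciphertext, key) == secret   # correct passcode recovers it

wrong_key = derive_key("0000", salt, len(secret))
assert xor_cipher(ciphertext, wrong_key) != secret  # wrong passcode fails
```

The performance cost Android manufacturers balked at comes from running real ciphers like AES over every block of storage, a burden cheap phones without dedicated crypto hardware feel acutely.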
To be sure, Google has moved to implement encryption measures similar to Apple’s. But the fact remains that Android users encrypt their data at lower rates than iPhone users. According to a November report from Manhattan District Attorney Cy Vance, an outspoken opponent of encryption, 91 percent of Apple users ran a version of iOS that turns on encryption by default as of October. By contrast, according to Google’s own figures, a mere 34 percent of Android users run a version of Android even capable of full encryption, and that version does not encrypt by default.
(Google’s own Nexus phones do encrypt data by default. The most recent version of Android also encrypts by default, but only on phones that meet certain performance targets, and a mere 1.2 percent of Android users have installed that version.)
This issue of default encryption weighs heavily on the minds of law enforcement. Authorities have long feared that criminals and terrorists use encryption to hide communications, but only in recent years has such technology become available on a mass scale. Highly popular messaging platforms such as Facebook’s WhatsApp, for example, use end-to-end encryption that even the NSA would likely have to invest significant resources to crack.
Law enforcement officials argue that firms like Apple have given criminals cover, without their even asking for it, by encrypting communications. WhatsApp, for example, extends the protection of powerful mathematics to grandmothers and to sophisticated hackers defrauding Ukrainian bank customers alike.
Tuesday’s court order is something of a public relations victory for Apple, which can now say to its consumers that the U.S. government was forced to go to court in order to defeat its security features.
By the same token, the Justice Department has placed Apple in a difficult position, forcing the company to defend the privacy of a dead terrorist who helped kill 14 people. With this case, the Justice Department has begun to abandon its previous pledges of compromise and dialogue with Silicon Valley over encryption, moving instead to compel Apple to rewrite its own code.