Sam Altman Has a Plan to Tame the AI He Unleashed
Worldcoin trades cryptocurrency for eyeball scans, creating a global ID database and giving privacy experts the willies.
Sam Altman is no stranger to technological innovation and the oft-resultant fear. His newest project carries a far heavier dose of both.
Altman is the CEO of OpenAI—the company whose chatbot ChatGPT has taken the world by storm—pushing the boundaries of artificial intelligence and raising concerns about the technology’s potential harms.
Lately, however, the conversation has been all about Worldcoin, an initiative backed by another of Altman’s companies, Tools for Humanity. It aims to create a “globally inclusive identity and financial network, owned by the majority of humanity,” according to a white paper describing the project. What that means in practice is a unique digital identifier for each person on Earth, known as a World ID, tied to a bespoke cryptocurrency called WLD. Together, the company says, they can be used to authenticate human beings online in the age of rapidly proliferating AI bots while also providing a pathway to a universal basic income in a global economy disrupted by AI. Worldcoin is starting out with a sizable war chest, having raised $115 million from some of Silicon Valley’s biggest investors, a funding round it announced in May.
The controversy around the project stems from the way it works. Worldcoin creates its unique identifier by scanning people’s eyeballs using an orb shaped like an oversized webcam. The iris images are then mapped to a numerical code unique to each individual that serves as their World ID. “The iris code that the Orb outputs is compared against all other iris codes previously generated by all other Orbs,” Worldcoin’s website states.
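Worldcoin has not published the exact matching logic, but iris-recognition systems conventionally encode each iris as a fixed-length bit string and flag two codes as the same person when their normalized Hamming distance falls below a threshold. A minimal illustrative sketch of that deduplication idea (the bit lengths and threshold here are invented for illustration, not Worldcoin's parameters):

```python
# Toy sketch of iris-code deduplication, NOT Worldcoin's implementation.
# Each "iris code" is a bit string; two codes are treated as the same
# person if the fraction of differing bits is below a threshold.

def hamming_distance(a: str, b: str) -> float:
    """Fraction of differing bits between two equal-length bit strings."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

def is_duplicate(new_code: str, registry: list[str], threshold: float = 0.32) -> bool:
    """True if new_code matches any previously enrolled code."""
    return any(hamming_distance(new_code, seen) < threshold for seen in registry)

registry = ["1010110011", "0001111000"]
print(is_duplicate("1010110111", registry))  # one bit off the first code -> True
print(is_duplicate("0110001101", registry))  # far from both codes -> False
```

The comparison against every previously enrolled code is what the Worldcoin site describes; real systems would use indexed lookups rather than a linear scan at this scale.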
Although the project officially launched two weeks ago, it has been collecting iris images for more than two years and has thus far signed up more than 2.2 million users across dozens of countries, per a live ticker on the Worldcoin website. In countries where the laws (or lack thereof) allow it, it is also offering prospective users 25 WLD each—currently worth about $50—for lining their eyes up to the orb. WLD is not available in the United States, where, in the words of Altman and Tools for Humanity CEO Alex Blania, the “rules are less clear.”
Some countries are already pushing back. France, Germany, and the United Kingdom have all announced investigations into Worldcoin’s data collection practices in the two weeks since it launched. Kenya went a step further last week, suspending the project’s ability to sign up new users in the country. Police in Nairobi also reportedly raided Worldcoin’s warehouses in the country, seizing equipment and documents. Blania said in a tweet last week that the company had “paused World ID verifications in Kenya as we continue to work with local regulators to address their questions.”
In a statement to Foreign Policy, a spokesperson for the Worldcoin Foundation—a Cayman Islands-registered nonprofit that oversees Worldcoin—attributed the pause to an “overwhelming” demand in Kenya that saw “tens of thousands” line up in the country to get their irises scanned. “Out of an abundance of caution and in an effort to mitigate crowd volume, verification services have been temporarily paused,” the spokesperson said. “During the pause, the team will develop an onboarding program that encompasses more robust crowd control measures and work with local officials to increase understanding of the privacy measures and commitments Worldcoin implements, not only in Kenya, but everywhere.”
Worldcoin does not link the World ID to any personal information and deletes iris images after collecting them unless users opt in to having theirs stored in encrypted form, the spokesperson added. While companies and governments can integrate the World ID with their own systems, Worldcoin said it uses a layer of encryption, called a “zero-knowledge proof,” that allows third parties to verify a user without revealing any underlying information.
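The core property of a zero-knowledge proof is that a verifier becomes convinced the prover holds a secret without the secret ever being transmitted. Worldcoin's production system is far more elaborate, but a classical textbook example of the same idea is Schnorr identification, sketched here with deliberately tiny, insecure parameters:

```python
# Toy zero-knowledge proof (Schnorr identification), NOT Worldcoin's
# actual protocol. The prover convinces the verifier it knows a secret x
# without ever sending x.
import secrets

P = 2039   # small safe prime: P = 2*Q + 1 (toy size, not secure)
Q = 1019   # prime order of the subgroup generated by G
G = 4      # generator of the order-Q subgroup mod P

def keygen():
    x = secrets.randbelow(Q - 1) + 1   # secret key
    y = pow(G, x, P)                   # public key
    return x, y

def prove(x):
    """Prover: commit to a random r; answer the verifier's challenge later."""
    r = secrets.randbelow(Q - 1) + 1
    t = pow(G, r, P)                   # commitment, reveals nothing about x
    def respond(c):
        return (r + c * x) % Q         # response; r masks the secret x
    return t, respond

def verify(y, t, c, s):
    """Verifier accepts iff G^s == t * y^c (mod P)."""
    return pow(G, s, P) == (t * pow(y, c, P)) % P

x, y = keygen()
t, respond = prove(x)
c = secrets.randbelow(Q)               # verifier's random challenge
s = respond(c)
print(verify(y, t, c, s))              # True: accepted, x never transmitted
```

The check works because G^s = G^(r + c*x) = t * y^c (mod P); a prover who does not know x cannot answer a random challenge correctly except by luck.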
Blania and Worldcoin have previously said that the cryptocurrency is in part an incentive to entice users in the initial stages, but they have also presented it as a potential pathway to receive income and benefits in the future. Worldcoin does not currently monetize its offerings, but some of its high-profile backers have presented it as an alternative to existing online verification tools such as CAPTCHA.
But digital privacy experts remain skeptical. “People have been creeped out by this for very understandable reasons,” said Katitza Rodriguez, the Electronic Frontier Foundation’s policy director for global privacy. “Even if they do have meaningful technical privacy protections, people should not be induced with promises of money to ‘consent’ to something that they don’t understand.”
The central thrust of Worldcoin’s argument is that in an age of increasingly sophisticated AI-enabled bots and deepfakes, humans will need a universal “proof of personhood” to distinguish themselves online, and businesses will need that proof to avoid being overrun by fake netizens. But that entire premise may be misguided, said Nathalie Maréchal, co-director of the privacy and data project at the nonprofit Center for Democracy and Technology. “A technology or a product exists to solve a problem, so you have to define the problem that you’re attempting to solve,” she said. “In this case, the problem that Worldcoin is attempting to solve is that there is no single sign-on for the entire world, and I don’t think it’s desirable to have a single sign-on for the entire world.”
And despite Worldcoin’s iris deletion mechanisms, linking digital identity to biometrics will likely pose several deeper privacy issues that go beyond the project itself. The rush to register for a World ID “normalizes the use of iris scanning in other contexts, most of which are not particularly privacy-protective and not necessarily in the subject’s interest,” Rodriguez said.
It also raises deeper philosophical and ethical questions about the AI revolution itself. “The idea that widespread generative AI means that we should prove our humanity to express ourselves online is very troubling,” Maréchal said. “In the same way that we should be able to walk down a public street without having facial recognition cameras identifying us and being able to constitute records of what we do as we go about our day, we should similarly be free from that same kind of surveillance online.”
Altman is in a uniquely odd position: His ChatGPT is revolutionizing artificial intelligence, and now he is scanning eyeballs to guard against the threats of runaway AI. “The fact that many of the same people who are behind the explosion in generative AI are also setting themselves up as the ones who have the solution, that should make us suspicious,” Maréchal said.
Rishi Iyengar is a reporter at Foreign Policy. Twitter: @Iyengarish