The Real Payoff From Artificial Intelligence Is Still a Decade Off
The robot revolution hasn't started yet.
It has been 21 years since IBM’s Deep Blue supercomputer checkmated chess champion Garry Kasparov, marking a historic moment in the development of artificial intelligence technologies. Since then, artificial intelligence has invaded everyday objects, such as cell phones, cars, fridges, and televisions. But the world economy seems to have little to show for the proliferation of smartness. Among advanced economies, productivity growth is slower now than at any time in the past five decades. National GDPs and standards of living, meanwhile, have been relatively stagnant for years.
This situation poses something of a riddle: Previous waves of technical innovation have come with rising productivity and, in turn, leaps forward in economic growth and well-being. For example, once electricity became widespread in the United States in the 20th century, labor productivity started growing at an annual rate of 4 percent—almost four times higher than the current rate.
There are two schools of thought about today’s productivity puzzle. On the one hand are techno-pessimists, such as Northwestern University professor Robert Gordon, who believe that today’s technologies are the issue. The six innovations that powered economic growth from 1870 to 1970—electricity, urban sanitation, chemicals, pharmaceuticals, the internal combustion engine, and modern communications technologies—the thinking goes, were simply more transformative than, say, Siri.
On the other hand are techno-optimists who counter that today’s innovations—cloud computing, big data, and the “internet of things,” which are at the heart of the artificial intelligence revolution—are, indeed, transformative and that their benefits are already being enjoyed by firms and consumers around the world. The problem, scholars such as British economists Jonathan Haskel and Stian Westlake argue, is that national accounting statistics simply cannot capture those benefits. The concept of GDP first emerged in the 1930s to measure economies that were primarily devoted to the production of tangible goods. Intangible goods and services, by contrast, increasingly dominate today’s economies. If GDP figures properly tallied the intangible economy, the argument goes, then productivity growth would look much better.
There is some truth in both theories; certainly, electricity changed the structure of work and home life in ways that Google Home has not. It is likewise true that GDP does not count free online services such as Google, Facebook, and YouTube that massively contribute to the well-being of consumers. But there might be a third, more straightforward, solution to the productivity riddle—one that even reconciles the other two. Simply put, the latest revolution is not showing up in national statistics because it has not yet really begun.
In reality, it takes a considerable amount of time for firms to make good use of new technologies, especially general-purpose technologies, as economists Erik Brynjolfsson, Daniel Rock, and Chad Syverson showed in a working paper for the National Bureau of Economic Research. In fact, it is only after a sufficient stock of the new technology and complementary innovations (both tangible and intangible) has been built up that a technological revolution shows up in the numbers. And that usually takes a quarter century at least.
General-purpose technologies, as the economists Boyan Jovanovic and Peter Rousseau have written, are innovations that are pervasive, improve over time, and spawn further innovation. They have spurred economic revolutions since the 19th century. The steam engine drove the first wave of industrialization in the 19th century; electricity powered the second wave from the 1890s to the 1930s; and information technologies brought the third, which started in the 1970s and culminated with the explosion of the internet in the 2000s, thus paving the way for the fourth industrial revolution, which is currently underway. Its key driver is artificial intelligence, which makes robots smart, facilitates the analysis of big data, allows for the customization of almost any product, and enables control of sophisticated industrial processes.
Because general-purpose technologies can be used in so many ways, their adoption takes a long time to reach critical mass. It took more than two decades for electricity to surpass steam (in terms of share of total horsepower in manufacturing), for example, and almost four decades to become the undisputed source of power generation. That makes sense: To make use of electricity, governments had to invest in nationwide electric grids; entrepreneurs had to invent complementary technologies like light bulbs, cables, and switches; bureaucrats had to agree on standards such as the voltage of the current and the shape of the plug; and ultimately, businesses had to create saleable products compatible with the new source of power.
A similar process took place with modern information and communications technology. It took about two decades for such equipment to surpass 1 percent of all capital stock. Then, between 1991 and 2001, the share rose to 5 percent before jumping again to 8 percent in 2008, where it has roughly stabilized. Likewise, the first modern boost to the artificial intelligence revolution occurred in 2011, when the IBM Watson computer system won $1 million on Jeopardy! The next high spot came five years later, when the AlphaGo computer system developed by the Google DeepMind team beat Lee Sedol, one of the world's greatest Go players, by four games to one. Since then, major breakthroughs have involved skin cancer classification and speech recognition, but artificial intelligence-related activities have mostly remained the prerogative of tech giants or highly specialized fast-growing start-ups.
Jovanovic and Rousseau documented a matching pattern in productivity statistics. In the historical cases of electricity and IT, productivity growth remained sluggish in the first 25 years after the emergence of a new general-purpose technology. A decade-long acceleration followed, during which growth jumped to around 4 percent before decelerating once again to roughly 1 percent. The lag in productivity growth is not surprising. It takes time for the true potential of a general-purpose technology to become clear, and it takes more time thereafter for firms to decide how to adapt production processes accordingly.
It is thus not surprising that the world can be experiencing both a boom in artificial intelligence and a bust in productivity growth. The good news is that, as I write, business activities related to artificial intelligence are accelerating. The number of active, venture-backed private companies in the United States that are developing artificial-intelligence systems is 14 times larger now than in 2000. Similarly, industrial robots—many of them programmed with new artificial intelligence—are more pervasive than ever. Between 2003 and 2010, the number of industrial robots worldwide remained roughly stagnant. The figure nearly doubled between 2010 and 2014. By 2020, the stock of robots is expected to be almost three times larger than it was in 2014.
But many artificial intelligence projects are still in the research and development phase. That means that there are a lot of intangible investments (such as software, databases, design, training, and so on) related to this sector, but not goods that national accounts would capture. To see how intangibles are becoming dominant even in traditional sectors, look at the car industry. Software content rose from 7 percent of a vehicle's value in 2000 to 10 percent in 2010. That figure is expected to jump to 30 percent by 2030. Statistical offices are working hard to update the way they build their national accounts, but until radical accounting reforms are adopted, productivity might remain (apparently) stagnant, even if new technologies become truly widespread and a real boon to the economy.
To be sure, all the ferment in the artificial intelligence sector has probably led to a mismatch between expectations and reality. The Organisation for Economic Co-operation and Development has reported that new technologies developed at the global technological frontier are spreading across countries faster than ever, but that they are taking more time to be adopted by a mass of firms within any given economy. Many small companies are still struggling with the third industrial revolution; artificial intelligence is certainly not a priority for them. And for a while, such adoption will be an economic drain. Companies must invest money, time, and managerial attention in digital assets and capabilities. In many cases, they must duplicate costs to experiment with new processes and models while still preserving their traditional procedures. Take autonomous cars, for example. Even though they are not yet commercially available, they already absorb a lot of resources and attention.
But be patient. If history is any guide, the payoff from artificial intelligence will come at some point—probably not before 2030. So, until then, use the time to learn skills robots will not yet be able to master.