(Un)Productivity in the Digital Age

Mitko Grigorov


This paper examines why productivity, measured as output per hour worked, has not increased significantly during the current Digital Revolution, despite rapid technological progress and an influx of new inventions. The failure of technological progress to bring immediate increases in productivity and standards of living is paradoxical from an economic point of view. This paper presents statistical data on productivity and gross domestic product (GDP) growth across a number of economies for the past 40 years. It then reviews several theories, from economics and from the history of science and technology, about the current lower-than-expected productivity, including its possible relation to the initial delay in productivity growth during the Industrial Revolution of the late eighteenth century and the Technological Revolution of the early twentieth century. Finally, it presents several explanations for the delay in productivity growth that are specific to the Digital Revolution.

Productivity Statistics

For once, our own intuition aligns with economic models in telling us that new technologies should translate into increases in productivity, faster-than-normal GDP growth, and ultimately higher standards of living. The digital revolution has heralded an age of improved communications, flexible work arrangements, increased automation, and more efficient distribution of labor (or, in some cases, substitution of labor with technology). And yet the numbers – the crux of the digital age – tell an entirely different story. Despite the enormous growth in computing power and a myriad of technological inventions, productivity has largely stalled.[1]

Understanding the factors affecting productivity growth is crucial, as productivity is the most accurate measure of how fast standards of living improve.[2] Economic theory states that the more the average worker produces, the more the average worker should take home. This relationship holds true even in an era in which, as Thomas Piketty (2014) has recently argued, a disproportionate amount of total income goes to the owners of assets and capital rather than to the providers of labor.[3]

We can divide the period between 1970 and the present day into two parts (dotted line in Figure 1) of relatively equal length, with the second covering the time when the Internet and advances in communications technology became an important part of the economy. For this analysis, we use data for the G7 countries (the United States, Japan, Germany, France, the United Kingdom, Italy, and Canada) for two main reasons: 1) these data are reliable; 2) in both periods, the G7 countries accounted for a significant share (between 60 and 70 percent) of the global economy. Productivity grew by 2.6 percent on average between 1970 and 1990.[4] At that pace, productivity – and by proxy the standard of living – doubles every 27 years. The difference between this and the more recent period, covering 1991 to 2013, is staggering. In the last 23 years productivity grew by only 1.7 percent on average, which translates into a doubling of the standard of living every 41 years.[5] To put it another way, the data from the former period suggest productivity would increase eight times during one’s lifetime; the data from the latter period, only four times.[6] While it is true that the second period encompasses the Dot-Com Bust of 2001 and the Global Financial Crisis that started in 2008, the first period includes financial troubles of no less calamity: two oil crises in the 1970s and the notorious market crash of 1987. In fact, when we look at productivity and GDP growth in Figure 1, we see that, with the exception of the downturn during the Global Financial Crisis, the second period actually seems less volatile. We observe a similar trend for real GDP, which grew by 3.3 percent between 1970 and 1990 (doubling every 21.5 years) and by 1.9 percent between 1991 and today, doubling every 36.5 years.[7] To quote Nobel Laureate and economics heavyweight Robert Solow, you can see the computer age everywhere but in the productivity statistics.[8]
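The doubling times quoted above follow from compound growth: a quantity growing at a constant annual rate g doubles in ln 2 / ln(1 + g) years. A minimal sketch of the arithmetic (the growth rates are the OECD and IMF figures cited above; small differences from the quoted doubling times reflect rounding in the source data):

```python
import math

def doubling_time(growth_rate_pct):
    """Years for a quantity to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + growth_rate_pct / 100)

for label, g in [("Productivity, 1970-1990", 2.6),
                 ("Productivity, 1991-2013", 1.7),
                 ("Real GDP, 1970-1990", 3.3),
                 ("Real GDP, 1991-2013", 1.9)]:
    print(f"{label}: {g}% per year -> doubles every {doubling_time(g):.1f} years")

# Over an 80-year lifetime, 2.6% growth compounds to roughly an
# eightfold increase, while 1.7% yields roughly a fourfold one.
print(f"{1.026 ** 80:.1f}x vs {1.017 ** 80:.1f}x")
```

The same formula underlies the familiar "rule of 70" shorthand, which approximates the doubling time as 70 divided by the percentage growth rate.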

Literature Review

Some economists and historians of economics, looking at the United States, argue that productivity and GDP growth were already slow in the first period (1970-1990) compared to the decades between the end of World War II and 1970.[9] However, productivity growth in the United States between 1946 and 1970 was actually comparable to average G7 productivity growth in the 1970-1990 period. In the same post-war era, productivity grew even faster in the other G7 economies (in France, for example, close to 5 percent), but this should be interpreted with caution: those countries were rebuilding infrastructure destroyed in the war, and given the low starting point and the influx of financial resources through the Marshall Plan, it is natural that growth rates would be higher.

Robert Gordon also observed that productivity growth in the United States actually increased slightly in the most recent period (1.79 percent between 1990 and 2013, Table 1), especially between 1995 and 2004 (an average of 2.29 percent per year).[10] However, the United States is an outlier in this regard and should not be considered in isolation. As Table 1 shows, the United States was one of only two countries for which productivity growth was lower in 1970-90 than in 1990-2013. For most countries for which reliable productivity data are available, the decrease in productivity growth in the last two decades compared to the two preceding ones is astounding. Nor is the downturn confined to countries that started at a very low level of economic development and were largely playing catch-up between 1970 and 1990, such as South Korea. The differences for countries that were at a similar level of technological development as the United States in 1990, including France and Japan, are just as pronounced.

So why has the advent of personal computers, the Internet, cell phones, email, smartphones, efficient word processors and data analysis tools, cheap videoconferencing, and a myriad of other inventions failed to translate into higher productivity and income growth? Gordon argues that technology in general is subject to diminishing returns in its ability to increase economic growth.[11] He also notes that it is highly unlikely that the next couple of decades will match the inventions of the previous ones. If this view holds, productivity growth will remain low in the near future, at around 1.3 percent per year.[12] The problem with explaining this delay in productivity growth through diminishing returns to technology is that such an explanation relies on purely economic theories. Yet the delay is paradoxical from the point of view of current economic theories, which treat technology as disembodied, so that any change in technology should translate instantaneously into changes in productivity.[13] By using the standard economic model to describe the delay, we end up in the contradictory situation of explaining what we observe with a theory that does not agree with what we observe.

Gordon, however, notes that the years from 1906 to 1928 – at the height of the Second Industrial, or Technological, Revolution – also saw slow growth despite important inventions such as electricity, cars, paved roads, plumbing, and running water. If delayed economic progress is a natural consequence of intense technological progress, perhaps the curse of diminishing returns to technology is not inevitable. Such a view is reinforced when we consider the two preceding industrial revolutions. Crafts and Harley and Antras and Voth explore the British Industrial Revolution of 1770-1860.[14] [15] Their models show that productivity growth during the Industrial Revolution was relatively slow, especially in its initial stages. Antras and Voth’s model shows a moderate acceleration after 1800.

Several historians of technology, such as David and Rosenberg, have hypothesized that this delay could be the result of the slow diffusion of new technologies and of the time it took to learn how best to take advantage of them even after they had been adopted.[16] [17] They focus on the Second Industrial Revolution and the diffusion of electrically powered engines in industrial America. Such technologies diffused only with great delay, and even then it took time for managers and workers to learn how best to take advantage of the newer, superior technology. Building on these ideas and using a quantitative model of technological diffusion, Atkeson and Kehoe show that several decades could pass before a sustained increase in the pace of technical change leads to corresponding increases in productivity.[18]

Thus a delay in productivity growth occurred during both the First and the Second Industrial Revolutions. But while we can explain the first by the enormous social change it brought as people moved from villages and agriculture to towns and industry, and we can argue that the protracted diffusion of a completely new technology slowed productivity in the second, we are still at pains to explain what is causing the delay during the current Digital Revolution.

Checks to Productivity in the Digital Age

Given the limitations of the literature on the issue, this section offers several explanations for the delay in productivity growth. To begin with, there is all that “noise.” We just don’t have the information we need; instead, we are inundated with data, most of which are of little relevance to our work. The more of it there is, the lower our ability to find what we really need. To put it another way, we are looking for the same old needle in an ever-growing haystack. This is not just poor knowledge management; it is a natural outcome of a system that has grown infinitely more complex. We call the Digital Age “the age of information,” but with the same validity we might as well call it “the age of disinformation.” This abundance of information (whether relevant or not) forces us into very narrow specializations, making us less capable of keeping track of the big picture. The second issue is that along with improved access to resources (human, data, capital), the computer age offers innumerable ways of wasting time, in which we happily engage. From news and social networks to games, videos, dating sites, and online shopping, the Internet seems built to distract us from the task at hand.

As noted, several respected studies have disproven the claim that productivity grew at astronomical rates during the previous two industrial revolutions, so calling the digital age the “Third Industrial Revolution” might be more accurate than most people would think, though for the wrong reasons.[19] [20] Much like the enormous social change of the Industrial Revolution in the late eighteenth and early nineteenth centuries, there is a comparable economic change at hand as labor is reallocated from manufacturing to services. However, the services sector has a slightly different production function than manufacturing. While efficiency in manufacturing depends only on the producer, efficiency in the service sector depends on both the producer and the consumer. If a worker is performing an operation on an assembly line that produces running shoes (say, gluing the tongue to the rest of the shoe), whoever ends up wearing the shoes has no effect on how quickly (or slowly) and with what quality the worker attaches the tongue. The two exist in separate places in time and space. This is not the case with many services.

Unlike in manufacturing, the producers and consumers of services interact in the same temporal, and often spatial, environment; you share time and space with your service provider when getting a haircut, for example. Let us look at the restaurant business.[21] If our argument holds, we should expect productivity levels to differ markedly between the services and manufacturing sectors. This is indeed the case. In fact, according to the Bureau of Labor Statistics, productivity in the restaurant industry has not increased at all in the past ten years, a rather worrying development. Productivity in some other service sectors, such as retail, has increased somewhat (around 20 percent in the past ten years), but this increase is nowhere near the gains in the manufacturing sector, where productivity has soared by 50 percent (Figure 2). This is an important finding because, per David and Rosenberg, we would expect a larger lag in manufacturing productivity growth than in services, as it takes longer for new technologies to be incorporated there.[22] [23] Thus, the diffusion of new technologies alone cannot explain the lag in productivity growth.

So why is productivity so sluggish in services, and in the restaurant sector in particular? In the average restaurant, patrons spend time on their phones before ordering, which, apart from increasing the time until ordering, also creates inefficiencies by making the server return to the table several times.[24] Then the customers take pictures of themselves, the food, and the restaurant, often asking staff members to take group photos (and retake them), further wasting time. They then spend more time on their phones before asking for the check, and even more after paying the bill. In the end, the serving time per table might almost double because of these inefficiencies.[25] This betrays one of the major aspects of today’s changing market: while new technologies might increase the productivity of the producers of services, they also generate a countervailing movement that decreases “productivity” on the consumer’s side of the process. Producers hence invest more input for the same level of output because the nature of demand in the digital age has changed.[26] In the end, many of the new technologies that we expect to increase productivity end up decreasing it.[27]
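A toy calculation (all numbers hypothetical, chosen only to illustrate the mechanism) shows how consumer-side delays translate into lower measured producer productivity even when the producer's own effort is unchanged:

```python
def tables_served(shift_minutes, minutes_per_table):
    """Tables a server can turn over in one shift."""
    return shift_minutes // minutes_per_table

# Hypothetical figures: an 8-hour (480-minute) shift, 60 minutes per table
# without phone-related delays, 110 minutes when delays nearly double
# serving time, as the restaurant anecdote above suggests.
without_delays = tables_served(480, 60)   # 8 tables per shift
with_delays = tables_served(480, 110)     # 4 tables per shift

# Output per hour of labor falls by half although the server works
# exactly as before; the extra "input" came from the consumer's side.
print(without_delays, with_delays)
```

Measured output per work hour drops from 1.0 to 0.5 tables, which is how consumer behavior alone can depress a service sector's productivity statistics.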
A fourth issue is the information-packaging problem. Information used to be packaged for cumulative consumption: one would read article one, then proceed to article two, and so on, moving up one’s personal learning curve. In the digital age, information is packaged into singular how-to articles that deal with discrete, concrete tasks. The learning curve is not one of knowledge but of knowledge management. To put it another way, before the digital revolution it was about what one knew; now it is about how quickly one can Google it. But our brains continue to work the way they have for centuries: to internalize something, we need to build it on top of something we already know. This is difficult when every unit of information on the Internet (and, by extension, in the way we deal with knowledge management) is self-contained and somewhat detached from the rest of reality. As a result, many organizations struggle, without much success, to find effective ways to manage knowledge, as there is a widening gap between how our strategies assume knowledge is packaged and the new digital reality.[28] Marshall McLuhan would have been proud: in the Digital Age, the medium truly is the message.

Fifth, we will look at the re-adjustment problem: the time spent adjusting to new versions of the same technology or to the latest fad. This happens when a company updates from Vista to Windows 7, only to change to Windows 8 a couple of years later, or switches to a new graphics editor or email/messaging client, or when one learns a new programming language for marginal improvements. With so many updates and new versions, there is a learning curve even if the new version is more intuitive and user-friendly. This notion is consistent with Atkeson and Kehoe, whose model shows that the large stock of built-up knowledge in the old economy prior to the Technological Revolution was the reason the transition was slow.[29]

This is true to an even greater degree when we consider management practices in the Digital Age. We will call this last issue the “Great War” management problem. World War I saw an unprecedented advancement of military technology. The machine gun became the main defensive weapon, tanks started rolling menacingly across the fields of Europe, and aircraft were introduced, first for reconnaissance and the occasional bomb, later as fighters and for air-to-ground operations. Despite all these advances in technology, military strategy had remained largely unchanged since the Franco-Prussian War of 1870. This led to a virtual stalemate, a prolonged war of attrition, and countless casualties. What does this have to do with management in the digital age? Technology has again grown so fast that it has outpaced our strategies. While we function at the edge of the future, management is still rooted deep in the past. The sheer number of work hours spent in unproductive meetings, preparing unnecessary reports, and completing outdated administrative tasks is staggering. All of these allocate labor to unproductive activities or break the flow of productive labor. This problem is akin to what Robert Merton described as the “displacement of goals” in a bureaucracy.[30] The rules and procedures that initially served to prevent administrative and financial chaos become goals in their own right. The bureaucrat works toward rules and regulations as an immediate goal, much the way many of today’s managers work toward maintaining structures that are dated and found wanting in the Digital Age.


This paper explored why productivity has not increased significantly during the current Digital Revolution, despite rapid and intense technological progress. It presented statistical data on productivity and GDP growth for several economies over the past 40 years. It also reviewed several theories from economics and from the history of science and technology. Acknowledging the gap in the existing literature, it offered essay-style explanations for delayed productivity growth.

The digital world is expanding at an ever-increasing rate. It has taken on a life of its own, causing the gears of supply and demand to be just a bit out of sync. It is likely that technology will continue to grow without a corresponding growth in productivity, a rather unsettling thought for many an economist. But while we are busy updating our economic theories, we should also learn to build our great ant colonies on the shifting quicksand of global information. For what we do know is that, although industrial revolutions bring enormous positive change and benefits to society by expanding the production function, they do so only with a lag, which can be as long as four decades. Productivity did not pick up until after 1800, although the Industrial Revolution started as early as 1760. The same was true of the Technological Revolution that started around 1900: productivity only increased significantly after World War II. The initial stage of the current Digital Age started in the late 1970s, although its effects were not felt until the early 1990s. Thus we might currently be near the turning point when productivity will again pick up and, as was the case with the previous two Industrial Revolutions, finally lead to a more equitable distribution of wealth.

Notes & References

  1. Brynjolfsson, E. (1993). The productivity paradox of information technology. Communications of the ACM, 36(12), 66-77. 
  2. Martínez-García, E. (2013). Technological progress is key to improving world living standards. Economic Letter, Federal Reserve Bank of Dallas.
  3. Piketty, T. (2014). Capital in the Twenty-first Century: Harvard University Press. Although outside of the scope of this brief paper, the relationship between technological progress and equality is an important one. In the current Digital Revolution, much like during the Industrial Revolution and the Technological Revolution, the distribution of the fruits of production has been – at least initially – skewed toward the owners of capital
  4. Organisation for Economic Co-operation and Development. (2014). Productivity growth. Retrieved Nov. 12, 2014, from OECD Statistics. For the purpose of this paper productivity is measured using the classical method first described by Robert Solow in his seminal paper in 1957. Some other methods such as potential total factor productivity, shifts in the world technological frontier, and Malmquist indices (Growiec, 2010) are bound to produce slightly different results, but the main message about the differences between the periods should be consistent regardless of the measurement.
  5. Ibid., 2014.
  6. It should also be mentioned that during both periods the fruits from productivity growth were distributed unequally with the top end of the income distribution benefiting the most. At the same time, real wages (how much stuff the average worker can buy with  what he or she takes home) have largely stagnated.
  7. International Monetary Fund. (2014). Real GDP growth. Retrieved Nov. 12, 2014, from World Economic Outlook database.
  8. Solow, R. M. (1987). We’d better watch out. New York Times Book Review, 36.
  9. Gordon, R. J. (2000). Interpreting the “one big wave” in US long-term productivity growth: Springer.
  10. Op. cit., OECD, 2014.
  11. Gordon, R. J. (2012). Is US economic growth over? Faltering innovation confronts the six headwinds: National Bureau of Economic Research.
  12. Ibid.
  13. Atkeson, A., & Kehoe, P. J. (2001). The transition to a new economy after the second industrial revolution: National Bureau of Economic Research.
  14. Crafts, N. F., & Harley, C. K. (1992). Output growth and the British industrial revolution: a restatement of the Crafts‐Harley view. The Economic History Review, 45(4), 703-730.
  15. Antras, P., & Voth, H.-J. (2003). Factor prices and productivity growth during the British industrial revolution. Explorations in Economic history, 40(1), 52-77.
  16. David, P. A. (1990). The dynamo and the computer: an historical perspective on the modern productivity paradox. The American Economic Review, 355-361.
  17. Rosenberg, N. (1976). On technological expectations. The Economic Journal, 523-535. 
  18. Op. cit., Atkeson & Kehoe (2001).
  19. Op. cit., Crafts & Harley (1992).
  20. Op. cit., Antras & Voth (2003).
  21. According to the National Restaurant Association, this is a 700 billion dollar industry with 14 million employees and close to a million locations in the United States alone.
  22. Op. cit., David (1990).
  23. Op. cit., Rosenberg (1976).
  24. Hu, E. (April, 2014). Restaurants: The Modern-day lab for our smartphone-obsessed ways. Blog posted to http://www.npr.org/blogs/
  25. Ibid.
  26. We should note that if we bring utility into the equation, things could look a bit different. Perhaps customers are getting more utility from restaurants by taking pictures with the food or making the experience part of their daily online routine. And perhaps we get more satisfaction from a presentation that is to our liking no matter the information being transferred. So at the end, while the producer’s surplus (what the producers benefit from “selling” their product) is being diminished, the consumer’s surplus might be increased to counter those losses.
  27. Bernstein, E. S. (2012). The transparency paradox: A role for privacy in organizational learning and operational control. Administrative Science Quarterly, 57(2), 181-216.
  28. Romano, N. C., & Nunamaker Jr, J. F. (2001). Meeting analysis: Findings from research and practice. Paper presented at the System Sciences, 2001. Proceedings of the 34th Annual Hawaii International Conference.
  29. Op. cit., Atkeson & Kehoe (2001).
  30. Merton, R. K. (1962). Bureaucratic structure and personality. Reader on Bureaucracy.
Mitko (BC’10, DC’11) is currently an analyst at the Research Department of the International Monetary Fund. He previously worked as a consultant at the World Bank and a news writer and editor at Fox News. His research interests include the effects of industrial revolutions on productivity and growth, as well as the development of digital and crypto currencies. Mitko fondly remembers his time at SAIS Bologna, where he served as one of the editors of this journal and played the bass in SAIS’s own “Justin and the Junk Bonds.”