Universities have grown rapidly in recent decades. Higher-education institutions around the world now employ 15 million researchers, up from 4 million in 1980, and these researchers publish five times as many papers every year. Governments have increased expenditure on the sector accordingly. This rapid expansion is justified, in part, by sound economic logic: universities are expected to produce intellectual and scientific breakthroughs that can be used by businesses, governments and ordinary people. Such ideas are placed in the public domain, available to all. So, in theory, universities should be an excellent source of productivity growth.
In practice, however, the massive expansion of higher education has coincided with a slowdown in productivity. Output per hour worked in the rich world grew by 4% per year in the 1950s and 1960s; in the decade before the COVID-19 pandemic, 1% per year was the norm. Despite a wave of innovation in artificial intelligence (AI), productivity growth remains weak – less than 1% per year, according to a rough estimate – which is bad news for economic growth. A new paper by five economists – Ashish Arora, Sharon Belenzon, Larisa C. Cioaca, Lia Sheer and Hansen Zhang – suggests that the rapid growth of universities and the stagnant productivity of the rich world may be two sides of the same coin.
To understand why, turn to history. Higher education played a minor role in innovation in the post-war period; businesses bore more of the responsibility for achieving scientific breakthroughs. In the US during the 1950s, companies spent four times more than universities on research. Firms like AT&T, the telecoms company, and General Electric, the industrial giant, were as intellectual as they were profitable. In the 1960s the research and development (R&D) unit of DuPont, a chemicals company, published more articles in the Journal of the American Chemical Society than the Massachusetts Institute of Technology and Caltech combined. Ten or more researchers at Bell Labs, once part of AT&T, did work that won Nobel prizes.
Strict anti-monopoly laws help explain the emergence of these huge corporate laboratories: such rules often made it difficult for one firm to acquire another's inventions by simply buying it, so businesses had little choice but to develop ideas themselves. The golden age of the corporate lab ended when competition policy became lax in the 1970s and 1980s. In addition, the growth of university research convinced many bosses that they no longer needed to spend the money themselves. Today only a few companies in big tech and pharma offer anything comparable to the DuPont of yesteryear.
The new paper by Mr Arora and his colleagues, along with a 2019 paper by a different group of authors, makes a subtle but devastating suggestion: when it comes to delivering productivity gains, the old, big-business model of science worked better than the new, university-led one. The authors draw on a vast range of data, from counts of PhDs to analyses of citations. To identify a causal relationship between public science and corporate R&D, they use a complex methodology that involves analyzing changes in the federal budget. Broadly, they find that scientific breakthroughs by public institutions over many years "get little or no response from established corporations": such breakthroughs have no effect on corporations' own publications, their patents or the number of scientists they employ, with the life sciences being the exception. And this, in turn, points to a smaller impact on economy-wide productivity.
Why do companies struggle to make use of ideas produced by universities? The loss of the corporate lab is part of the answer. Such institutions were home to a vibrant mix of thinkers and doers. In the 1940s, Bell Labs had the interdisciplinary team of chemists, metallurgists and physicists needed to solve the overlapping theoretical and practical problems involved in developing the transistor. That cross-cutting expertise is now largely gone. Another part of the answer relates to universities themselves. Freed from the demands of corporate overlords, researchers focus more on satisfying the curiosity of geeks or raising citation counts than on finding breakthroughs that will change the world or make money. In moderation, research for its own sake is not a bad thing; some important technologies, such as penicillin, were discovered almost by accident. But if everyone is arguing over how many angels can dance on the head of a pin, the economy suffers.
Even when higher-education institutions do work more relevant to the real world, the results are troubling. The authors believe that as universities produce more newly minted PhD graduates, companies find it easier to produce new inventions. Yet universities' patents have the opposite effect, leading corporations to produce fewer patents themselves. Existing businesses, worried about competition from university spin-offs, may cut back on R&D in the relevant area. Although no one knows for sure how these opposing effects balance out, the authors point to a net decline of about 1.5% per year in corporate patenting. In other words, the vast fiscal resources devoted to public science may actually be making businesses across the rich world less innovative.
If you're so smart, why aren't you rich?
Perhaps, over time, universities and the corporate sector will learn to work together more profitably. Tougher competition policy may force businesses to behave a little more as they did in the post-war period and to strengthen their internal research. Corporate researchers, rather than universities, are driving the current boom in generative AI, so in some cases corporate labs have already risen from the ashes. At some point, however, governments will need to ask themselves tough questions. In a world of weak economic growth, lavish public support for universities may come to look like an unaffordable luxury.
© 2023, The Economist Newspaper Limited. All rights reserved. From The Economist, published under license. Original content can be found at www.economist.com


