The false promise of America's CHIPS Act

Nov 26, 2022 - Last updated at Nov 26, 2022

WASHINGTON, DC  —  The US Congress recently approved the CHIPS and Science Act, which allocates over $50 billion to strengthen the semiconductor industry in the hope of making the United States self-sufficient. And US Trade Representative Katherine Tai said that President Joe Biden’s administration should be “replicating” the CHIPS Act for other industries “as the key to American competitiveness”.

Semiconductors are certainly essential to a modern economy, and it makes sense to diversify sources. But it is doubtful that the CHIPS Act will achieve its stated goals, much less that it should be used as a model for similar support to other industries.

The law is flawed in many ways. Subsidies to support research and development account for only 21 per cent of the planned expenditures, with the rest going to support physical plant construction. Yet the US comparative advantage internationally is unquestionably in R&D. Building manufacturing facilities will not accelerate chip development.

“Moore’s Law” still holds: the number of transistors on an integrated circuit doubles roughly every two years. Five-nanometre chips are now in production at the most advanced semiconductor fabrication plants (“fabs”), and research on the next generation is well under way. Each new generation of chips requires new fabs to produce it.

Fabs are mind-bogglingly complex, hugely expensive, and require many machines that foreign companies are often best positioned to provide. In his book Chip War, Chris Miller of Tufts University points out that the Dutch company ASML, for example, has the technology and organisation to produce the extreme ultraviolet lithography machines necessary to churn out the most advanced chips. One such machine requires 457,329 parts, which are themselves produced by companies in different countries.

Policymakers have termed the sorts of measures supported by the CHIPS Act “industrial policy”. But the term encompasses all sorts of policies and programmes adopted to support economic, and especially industrial, activity. In the US, industrial policy has primarily consisted of measures that support private-sector economic activity, such as investment in R&D and transportation infrastructure. According to one recent assessment, while policymakers have been successful in supporting basic research and activities that enable improved productivity across the private sector, they have done poorly in identifying and favouring individual firms and industries.

Reliance on the private sector underpins the US economy’s historically high rates of innovation and productivity growth, a model that was spectacularly successful in developing the semiconductor industry. And successful efforts in other countries to catch up with the US industry have entailed integrating into it, rather than replicating it. The costs of replication are simply too high. “A facility to fabricate the most advanced logic chips costs twice as much as an aircraft carrier,” Miller notes, “but will only be cutting-edge for a couple of years”.

At the same time, Miller notes, there are questions about whether the industry’s future lies in further Moore’s Law advances or in developing more specialised chips, as Intel and Google are doing.

There are other major concerns. It is estimated that establishing the plants needed to produce the chips used in 2019 would require $1.2 trillion in upfront costs, plus another $125 billion annually, and that of course does not include the costs of R&D, innovation, and building fabs for new state-of-the-art chips. There is no way that the US can achieve self-sufficiency in production of chips now on the market, much less master the technological frontier by itself.

Moreover, the government officials who select which companies’ plans merit support must be at least as skilled as the private-sector applicants. Yet a “skills gap” is reported across the board: electrical engineers, software developers, print technicians, production specialists, and many more are in short supply.

These issues arise in most new economic activities. Because R&D is focused on seeking new results, it is necessarily risky. That in itself points to a comparative advantage for the private sector: the incentives for excessive caution are much greater in the public sector, where failure is more visible and success less rewarded.

Already, chip demand has softened markedly. The industry is cyclical, and a downturn should not be too surprising. Intel, for example, has already reported lower sales and earnings in 2022, and delivery of new chips has been delayed because of glitches. New chip production is likely to come on line just when overall demand is falling.

It is understandable that the US wants to maintain its technological lead in chips. But subsidising large existing firms and government management of the industry almost certainly will not achieve that goal.

A far preferable approach would be to support the global industry among friendly countries, encourage competition, increase the number of immigrants with the needed skills, support the expansion of qualified training facilities and capacity, increase incentives for students to get appropriate training, and allocate more resources to R&D. These are the types of measures that have served the US economy so well in the past. In the semiconductor race, too, they offer far greater promise of success than the provisions of the CHIPS Act do.


Anne O. Krueger, a former World Bank chief economist and former first deputy managing director of the International Monetary Fund, is senior research professor of International Economics at the Johns Hopkins University School of Advanced International Studies and senior fellow at the Centre for International Development at Stanford University. Copyright: Project Syndicate, 2022. www.project-syndicate.org
