
Nestled at the foot of Utah’s Wasatch Mountains, the IM Flash plant is a paragon of American high-tech manufacturing. Robots glide along the ceiling, moving silicon wafers the size of dinner plates between hulking machines that deposit and etch microscopic layers of material to build the most advanced memory chips in the world. For the 1,700 technicians and scientists who tend to the robots and troubleshoot problems in the delicate manufacturing process, the jobs offer generous pay and benefits.
For Intel and Micron Technology, the two American companies that jointly own and operate IM Flash, the venture allows both of them to sell cutting-edge, 3-D memory chips while sharing the multibillion-dollar costs of a modern semiconductor factory. In many ways, however, the IM Flash plant is an outlier. “While companies based in the U.S. still dominate chip sales worldwide, only about 13% of the world’s chip manufacturing capacity was in this country in 2015, down from 30% in 1990,” reports The New York Times (Feb. 27, 2017).
“It’s quite a bit more expensive to build a factory in the U.S.,” says Intel. The firm (which also has factories in Ireland, Israel and China) estimates that the extra cost for an American plant is more than $2 billion. Chip makers are hopeful that President Trump, who has promised large corporate tax cuts and a tougher approach to trade with China, will help them. The chip industry spends one-fifth of its revenue on R&D.
Looming in the background is China, which is currently a bit player in the industry but has committed to spend upward of $100 billion to create a world-class chip industry. “China is the largest market for us on the planet,” says an Intel OM exec. “It made sense to locate some production in China.”
Classroom discussion questions:
1. Why has chip manufacturing declined in the U.S.?
2. Why is this a critical industry to retain?