OM in the News: One Way to Power New AI Data Centers

Where will the energy come from to power the hundreds of new data centers popping up to run artificial-intelligence workloads? “In the battle for AI dominance, every engine of the economy is getting recruited into the fight—including jet engines,” writes The Wall Street Journal (Feb. 18, 2026).

Jet engines are a natural fit. Power equipment giants GE Vernova, Siemens Energy, and Mitsubishi Heavy Industries already sell power turbines—known as aeroderivatives—that are modeled after these very jet engines. Aircraft engine companies such as GE Aerospace, Howmet Aerospace and Woodward also sell land-based aeroderivative turbines or components.

Yet designing a new aeroderivative turbine, which keeps as much of the original jet engine as possible, is a roughly 18-month undertaking. Converting an existing plane’s jet engine into a power-generating turbine, by contrast, takes only 30 to 45 days. (There are two main modifications to convert an aircraft engine to a land-based natural gas turbine. One is replacing the fuel nozzles to burn natural gas instead of jet fuel. The other is replacing the large fan on the front of the flight engine with a much smaller one.)

Retired aircraft at an Air Force base near Tucson, Ariz.

A company can remanufacture jet-engine parts with a few years of remaining life for use in power turbines, where they can operate for many additional years. Narrow-body jet engines experience higher stress from repeated takeoffs and landings. Power turbines can run as peakers—turning on only when demand surges—or continuously as baseload. Either way, they accumulate less wear and tear.

About 1,600 commercial aircraft engines are retired every year. If a third of those engines get converted into turbines, that would represent about 13 GW of capacity, or more than a quarter of the existing global natural gas turbine capacity.
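The article’s capacity math can be sanity-checked with a quick back-of-the-envelope calculation. The per-engine output below is our own assumption (the article gives no per-unit figure); roughly 25 MW is a typical aeroderivative rating, in the range of common units such as GE’s LM2500.

```python
# Rough check of the article's conversion estimate.
retired_per_year = 1600            # commercial aircraft engines retired annually
converted = retired_per_year / 3   # suppose a third are converted
mw_per_turbine = 25                # assumed output per converted engine (MW)

capacity_gw = converted * mw_per_turbine / 1000
print(f"~{capacity_gw:.1f} GW of capacity")
```

At that assumed rating, a third of the retired fleet works out to roughly 13 GW, consistent with the article’s figure.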

AI-obsessed tech giants are planning to spend more than $700 billion in capital expenditures this year. The lure of that cash pile will generate a lot of creativity in the power sector.

Classroom discussion questions:

  1. Why is there a need to convert jet engines?
  2. Discuss the growth of data centers and the demands they create. (See our recent post on that topic.)

OM in the News: The Memory-Chip Shortage

Memory is one of the tech world’s most ubiquitous and essential components, and it comes in two major types. DRAM handles fleeting, immediate tasks like running apps. The other kind, NAND flash memory, provides long-term storage for photos, videos and other data. Contract prices for DRAM and NAND flash have risen sevenfold in the past year.

Facing soaring memory-chip prices, the world’s biggest electronics companies are staring at a list of unpalatable responses: (1) charging consumers more, (2) eating the costs or (3) rejiggering product specs. Such is the supply-chain disruption wrought by the global drive into AI, which requires fleets of data centers with servers needing gargantuan amounts of memory, reports The Wall Street Journal (Feb. 13, 2026).

The memory crunch comes at an inopportune time for companies like Nintendo.

That has caused supply to dry up for the makers of smartphones, PCs, gaming consoles and various other electronic gadgets, and triggered a price run-up, beginning early last year, larger than any seen before.

Dell has raised prices for some commercial laptops by as much as 30%, while budget PCs from rival Acer now carry several gigabytes less of multitasking memory. Chinese smartphone maker Xiaomi recently discontinued the lower-memory variant of its new midtier device and raised prices. To summarize: A tough year for smartphones, PCs and game consoles is getting worse, with projected shipment declines now deepening. PCs, with memory representing as much as 30% of their total cost, are particularly vulnerable.

With investments into AI infrastructure remaining hot, the prospects of memory prices falling soon don’t appear high. Supply is expected to remain tight through 2028.

Classroom discussion questions:

  1. What is the underlying issue?
  2. What can manufacturers of PCs, smartphones, and game consoles do to protect themselves?

 

OM in the News: AI Push Is Costing a Lot More Than the Moon Landing

It’s bigger than the railroad expansion of the 1850s, the Apollo space program that put astronauts on the moon in the 1960s and the decadeslong build-out of the U.S. interstate highway system that ended in the 1970s.

We’re talking about the data centers now being built and financed by some of the world’s biggest companies in the artificial-intelligence boom. Four U.S. tech giants—Microsoft, Meta, Amazon, and Google—are planning to spend $670 billion to build out AI infrastructure this year alone as they scramble to increase the computing power needed to operate and scale their AI-related endeavors.

And if you compare this spending to some of the biggest capital efforts in U.S. history by percentage of gross domestic product, you can see exactly how staggering the figures are, reports The Wall Street Journal (Feb. 9, 2026). In fact, it’s dwarfed only by the Louisiana Purchase, completed in 1803, which doubled the size of the U.S. and consumed 3% of GDP. (The AI buildout is projected at 2.1% of GDP, while railroads in the 1850s were 2%, the U.S. highway system was 0.4%, and the Apollo space program was 0.2%.)
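As a quick consistency check (our own sketch, not from the article): dividing the $670 billion outlay by the 2.1%-of-GDP share implies a U.S. GDP of roughly $32 trillion, which is in the right ballpark for 2026, so the article’s two numbers hang together.

```python
# Do the article's two figures agree? Capex divided by its share of GDP
# should recover total GDP.
capex_billion = 670    # Big Four AI capital spending, 2026
share_of_gdp = 0.021   # 2.1% per the article

implied_gdp_trillion = capex_billion / share_of_gdp / 1000
print(f"Implied U.S. GDP: ${implied_gdp_trillion:.1f} trillion")
```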

The four companies’ capital spending has been increasing as a percentage of their annual revenue the past few years. In 2026, Meta’s spending could amount to more than 50% of its sales for the first time ever.

How is this build-out an OM issue? First, as we discuss in Chapter 2, these four companies are betting that they will attain competitive advantage by competing on low-cost and response. Second, our chapter on sustainability (Supp. 5) points out the costs of carbon footprints, which data centers generate heavily. Third, as we note in the chapter on location strategies (Ch. 8), the centers locate where power is cheap and plentiful.

As of late 2025, Northern Virginia has 64 data centers under construction, solidifying its position as the world’s largest data center market. The region hosts over 550 existing facilities. They consume massive amounts of power, comparable to the total usage of large states like Minnesota.

Classroom discussion issues:

  1. Discuss the plusses and minuses of this massive construction trend.
  2. What do the builders hope to obtain?

OM in the News: The Environmental Cost of Quizzing AI

Every time you ask Google’s Gemini a query, it takes the same amount of energy as watching 9 seconds of TV. So says Google’s new report detailing the energy consumption, emissions and water use of its generative AI that users turn to every day for everything from writing tips to fact checking. A single Gemini text query emits 0.03 grams of carbon dioxide equivalent and consumes about 5 drops of water.
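The TV comparison is easy to verify with a little arithmetic. Google’s report put the median Gemini text prompt at about 0.24 watt-hours; the 100-watt television draw below is our own assumption for a typical set.

```python
# How many seconds of TV does one Gemini query equal?
query_wh = 0.24   # energy per median Gemini text prompt (Google's report)
tv_watts = 100    # assumed power draw of a typical TV

tv_seconds = query_wh * 3600 / tv_watts   # Wh -> joules, then divide by watts
print(f"{tv_seconds:.1f} seconds of TV")
```

That works out to about 8.6 seconds, close to the 9 seconds quoted above.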


The tech giant appears to be looking to ease brewing anxieties that frequent use of generative AI such as Gemini is detrimental to the environment.

Global demand for AI is ramping up rapidly, writes The Wall Street Journal (Aug. 21, 2025). Electricity demand from data centers worldwide is set to more than double by 2030 to about 945 terawatt-hours, which is more than Japan’s total electricity consumption. A single AI-focused data center can use as much electricity as a small city of 100,000 and as much water as a large neighborhood. But the largest ones, which haven’t been completed yet, could consume 20 times as much. It’s a particular problem in the U.S., where data centers account for half of electricity demand growth over the next five years.

OpenAI Chief Executive Sam Altman, when asked how much energy a ChatGPT query uses, responded “the average query uses about the amount an oven would use in just over one second, and 1/15 of a teaspoon of water.”

The type of query we feed to generative AI also matters, however. Energy demands can be dampened by trimming back-and-forth exchanges and making prompts a little simpler and easier to process. Shorter, more concise prompts, along with smaller AI models, can dramatically reduce energy use.

Tech giants are announcing many new clean-energy power agreements to fuel their AI ambitions, including Google, which recently announced new power deals from geothermal to hydropower. It also plans on an advanced nuclear reactor project in Tennessee.

It’s important for tech companies to divulge how frequently their AI is receiving queries. If it’s being used by one person, emissions are lower, but that’s different if it’s billions of people at 30 data centers across the world.

Classroom discussion questions:

  1. Why is the growth of AI searches an OM issue?
  2. How can this growth be contained, or minimized?

OM in the News: Locating an AI Data Center Means Huge Power Needs

Meta Platforms just scooped up 2,700 acres of Louisiana farmland for what would be its largest-ever data center, built over flat rice fields in one of the poorest corners of the state. At 4 million square feet (about 70 football fields), Meta’s data center will cost $10 billion and sit on more acreage than L.S.U.’s campus in Baton Rouge, which has more than 34,000 students. CEO Zuckerberg says the site will be used to train future versions of Meta’s open source AI models and will be “so large it would cover a significant part of Manhattan.”

Building advanced artificial-intelligence systems will take city-sized amounts of power, which has turbocharged electricity demand projections for the first time this century, reports The Wall Street Journal (March 31, 2025). 

Construction at the site of Meta’s new data center in Holly Ridge, La.

Tech companies are pressing into unexpected parts of the country, far from traditional data-center markets such as Northern Virginia. They are hunting for huge swaths of flat land with access to natural gas and transmission lines, landing them on the doorstep of oil-and-gas country. To meet the voracious power needs of the project and other growth, Entergy intends to spend $3.2 billion to build three natural gas-fired power plants, tapping the state’s vast gas reserves.

In tiny Holly Ridge, La., hundreds of pieces of construction equipment are rolling past, with 5,000 construction workers on the way. Meta will bring money, jobs and local tax revenue. But the project also threatens to burden electricity customers across much of Louisiana with higher costs if demand from the tech giant eventually dries up.

L.S.U. estimates Meta could use 15% of Louisiana’s current electricity generation. That is worrisome to other utility customers largely because of the mismatch between the 40-to-50-year lifespan of gas-fired power plants and Entergy’s 15-year deal with Meta.
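The mismatch can be made concrete with a quick sketch: for a 40-to-50-year plant and a 15-year contract, well over half of the plant’s life falls outside the Meta deal.

```python
# Share of a gas plant's life not covered by the 15-year Meta contract.
contract_years = 15

for plant_life in (40, 50):
    uncovered = plant_life - contract_years
    share = uncovered / plant_life
    print(f"{plant_life}-year plant: {uncovered} years ({share:.1%}) uncovered")
```

That leaves 62.5% to 70% of each plant’s useful life to be paid for after the contract ends, which is the exposure ratepayers worry about.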

Meta’s permanent jobs—around 500—are fewer than the thousands that might have accompanied an auto factory. Still, for a region with a median household income of $53,000, the impact will be meaningful. Average salaries at Meta are projected at $82,000.

As we discuss in Chapter 8, Location Strategies, states often must offer financial incentives to draw major new employers. To woo Meta, Louisiana approved a sales-tax exemption for data-center equipment and helped procure more land from local farmers.

Classroom discussion questions:

  1. Are the incentives offered Meta unusual or risky?
  2. Why are data centers and their current technologies controversial?

OM in the News: Artificial Intelligence vs. Sustainability

Google just released its environmental report. It doesn’t make for comforting reading. Despite the tech giant’s best efforts to operate its business sustainably, GHG emissions rose 13% from a year earlier and are up almost 50% compared to a 2019 baseline.

The reason? Artificial intelligence, or rather the expansion of data centers required to service its insatiable appetite. But as The Wall Street Journal (July 8, 2024) writes, Google isn’t alone in this. The sustainability reports of other tech firms tell similar stories.

Of course, it is not just the power demands of data centers that are driving up the emissions numbers; constructing the infrastructure is also carbon-heavy. So we won’t know whether the efficiencies AI brings truly offset its environmental costs until those centers are all up and running.

Google and Microsoft have vowed to slash emissions by the end of the decade, but new disclosures show their numbers are moving in the wrong direction. The AI boom is substantially responsible for the lack of progress. Large language models like ChatGPT are powered by energy-intensive data centers, and AI is projected to increase electricity demands from data centers by 50% by 2027.

To address the issue, they’re getting creative. Amazon Web Services is pursuing a deal to buy energy directly from a nuclear power plant on the East Coast. Microsoft has eyed small-scale nuclear, too, and unlike many of its peers, it is an enthusiastic purchaser of carbon offsets. Google’s sustainability report was accompanied by an announcement that it had partnered with BlackRock to build a one-gigawatt pipeline of solar capacity in Taiwan. The company also touted its data center efficiency metrics, saying Google-owned data centers are 1.8 times more energy efficient than average.

Despite these efforts, now that the numbers are trickling in, it’s becoming clear that the growth of AI has presented real challenges to tech companies that have long sought to position themselves as climate leaders.

Classroom discussion questions:

  1. High tech firms have long promoted their sustainability goals. What can they do now that AI is demanding massive new sources of power?
  2. How is this an operations management issue?

OM in the News: Amazon, Incentives, and Electricity

 

When officials in Montgomery County responded to a FOIA request on their bid, they delivered a 10-page document of incentives — with every line of text redacted.

We have blogged a few times about Amazon’s quest for incentives in locating its new HQ2, promising a potential of 50,000 jobs. The few bids that have become public are breathtaking financial packages that indicate just how much states are willing to pony up to woo Amazon. Maryland put together an $8.5 billion bid, and New Jersey got legislative approval to offer $7 billion in tax credits and incentives to pick Newark.

Others are not as forthcoming with how taxpayers’ money will be spent. “We are not releasing documents related to Amazon HQ2. We are not subject to F.O.I.A.,” said the Miami-Dade Beacon Council. Requests by The New York Times (Aug. 5, 2018) to Austin, Atlanta and Indianapolis met with similar responses. The photo reflects the response from Montgomery County, Md.

But today’s post is not about HQ2. It is about Amazon’s cloud computing business—its fastest growing and most profitable division. Data centers come with a lot of ongoing expenses, the biggest of which is electricity. Over the past two years, Amazon added dozens of new data centers with vast fields of servers running 24/7. In at least two states, it has also negotiated with utilities and politicians to stick other people with the bills for millions of dollars of electricity.

Amazon stands out for its success in offloading its power costs and also because it dominates America’s cloud business, writes Bloomberg Businessweek (Aug. 27, 2018). It has gone from nonexistent to using 2 percent of U.S. electricity! Although data centers typically yield few new jobs, politicians desperate to make up for fading manufacturing businesses have worked closely with utility companies to land Amazon data centers. In Virginia, where Amazon operates at least 29 such centers and is planning 11 more, the company’s 78-page application for a special rate agreement has two versions—a heavily redacted public one and another under seal with state regulators.

This is certainly an interesting topic for classroom discussion when covering Chapter 8.

Classroom discussion questions:

  1. What are the plusses and minuses of providing such incentives?
  2. What is the alternative?