OM in the News: AI Push Is Costing a Lot More Than the Moon Landing

It’s bigger than the railroad expansion of the 1850s, the Apollo space program that put astronauts on the moon in the 1960s, and the decades-long build-out of the U.S. interstate highway system that ended in the 1970s.

We’re talking about the data centers now being built and financed by some of the world’s biggest companies in the artificial-intelligence boom. Four U.S. tech giants—Microsoft, Meta, Amazon, and Google—are planning to spend $670 billion to build out AI infrastructure this year alone as they scramble to increase the computing power needed to operate and scale their AI-related endeavors.

And if you compare this spending to some of the biggest capital efforts in U.S. history by percentage of gross domestic product, you can see exactly how staggering the figures are, reports The Wall Street Journal (Feb. 9, 2026). In fact, it’s dwarfed only by the Louisiana Purchase, completed in 1803, which doubled the size of the U.S. and consumed 3% of GDP. (The AI buildout is projected at 2.1% of GDP, while railroads in the 1850s were 2%, the U.S. highway system was 0.4%, and the Apollo space program was 0.2%.)
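As a quick sanity check of these percentages, here is a minimal back-of-envelope calculation. The $670 billion figure and the historical benchmarks are from the article; the U.S. GDP value of roughly $31 trillion is an assumption, not a figure the article gives.

```python
# Back-of-envelope check of the WSJ comparison.
US_GDP = 31e12      # assumed 2026 U.S. GDP, in dollars (not from the article)
AI_CAPEX = 670e9    # planned 2026 AI infrastructure spending, from the article

ai_share = AI_CAPEX / US_GDP * 100   # percent of GDP

# Historical benchmarks (percent of GDP, as reported in the article)
benchmarks = {
    "Louisiana Purchase (1803)": 3.0,
    "Railroads (1850s)": 2.0,
    "Interstate highways": 0.4,
    "Apollo program": 0.2,
}

print(f"AI buildout: {ai_share:.1f}% of GDP")  # ~2.2%, close to the 2.1% projection
```

Under that assumed GDP, the planned spending works out to about 2.2% of GDP, consistent with the article's 2.1% projection and smaller only than the Louisiana Purchase.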

The four companies’ capital spending has been increasing as a percentage of their annual revenue the past few years. In 2026, Meta’s spending could amount to more than 50% of its sales for the first time ever.

How is this build-out an OM issue? First, as we discuss in Chapter 2, these four companies are betting that they will attain competitive advantage by competing on low cost and response. Second, our chapter on sustainability (Supp. 5) points out the costs of carbon footprints, which data centers generate heavily. Third, as we note in the chapter on location strategies (Ch. 8), data centers locate where power is cheap and plentiful.

As of late 2025, Northern Virginia has 64 data centers under construction, solidifying its position as the world’s largest data center market. The region hosts over 550 existing facilities, which together consume massive amounts of power—comparable to the total usage of a large state like Minnesota.

Classroom discussion issues:

  1. Discuss the pluses and minuses of this massive construction trend.
  2. What do the builders hope to obtain?

OM in the News: Digital Twins and Nuclear Fusion

Digital twins, which we cover in Module F (Simulations and Digital Twins), are a big topic at Nvidia and Siemens as the two companies work together to make nuclear fusion a commercial reality. In that module (see p. 847), we define a digital twin as: “an electronic virtual replica of an operation that allows organizations to mimic how a product, process, or system will perform.”

Workers at Commonwealth Fusion Systems’ campus in Devens, Mass.

Fusion engineers at Commonwealth Fusion Systems (CFS), the fusion company partnering with Nvidia and Siemens, will use the digital twin to run simulations, ultimately to hasten the goal of producing fusion energy at a commercial scale. CFS “will be able to compress years of manual experimentation into weeks” with the AI assistance, said its CEO.
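The idea of experimenting on a virtual replica before touching the physical system can be illustrated with a toy sketch. Everything below is hypothetical, a minimal two-station production-line "twin" invented for illustration; CFS's actual twin of its fusion machine is vastly more complex and calibrated against real sensor data.

```python
import random

# Toy "digital twin": a virtual replica of a two-station serial production
# line, used to mimic how the process will perform under changed settings.
def simulate_line(cycle_a, cycle_b, hours=8, trials=1000, seed=42):
    """Estimate average daily output when each station's cycle time
    (minutes per unit) varies randomly by +/-10% around its nominal value."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        # The slower station paces the line in each simulated run.
        bottleneck = max(cycle_a * rng.uniform(0.9, 1.1),
                         cycle_b * rng.uniform(0.9, 1.1))
        total += hours * 60 / bottleneck
    return total / trials

# "Experiment" virtually before changing the physical line:
baseline = simulate_line(cycle_a=2.0, cycle_b=3.0)  # units per 8-hour day
upgraded = simulate_line(cycle_a=2.0, cycle_b=2.2)  # a faster station B
```

Running hundreds of such virtual experiments in seconds, rather than reconfiguring real equipment for each trial, is the compression of "years of manual experimentation into weeks" that CFS describes, scaled down to a classroom example.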

Nuclear fission, which splits atoms to produce energy, is already in use in power plants, reports The Wall Street Journal (Jan. 7, 2026). But many companies see fusion, the energy process that powers the sun by joining atoms together, as a longer-term bet because it can provide much more energy in a cleaner process. Nuclear energy appeals to tech giants because it releases minimal carbon emissions while providing round-the-clock power—particularly as they look to fuel their AI ambitions.

CFS said it was working with Google on an AI project, and explained that the effort has created something like a co-pilot for its fusion machine, while the digital twin plan “is the virtual airplane.” Google also recently signed a power purchase agreement with CFS to secure energy from what could be the first grid-scale fusion plant.

“The race is on for AI. Everyone is trying to get to the next frontier,” said Nvidia’s CEO.

Classroom discussion questions:

  1. Provide other examples of how digital twins can be used.
  2. Why is this fusion project so important as an OM tool?

OM in the News: The AI Industry’s 100-Hour Workweeks

The explosive growth of artificial intelligence has forced leading tech companies to rethink their human resource strategies and job design, reports The Wall Street Journal (Oct. 23, 2025). As the demand for rapid innovation intensifies, organizations like Google, Microsoft, Meta, and Anthropic are relying on small, highly skilled teams to push the boundaries of AI development. These teams often work 80 to 100 hours per week, far exceeding the traditional schedules we discuss in Chapter 10, as they race to keep up with the pace of technological change.

Several researchers compared the circumstances to war. “We’re basically trying to speedrun 20 years of scientific progress in two years,” said one Anthropic scientist. “Extraordinary advances in AI systems are happening every few months. It’s the most interesting scientific question in the world right now.”

This environment has led to a redefinition of job roles and expectations. Rather than adhering to standard 9-to-5 or even the demanding “9-9-6” (9 a.m. to 9 p.m., six days a week) schedules, some AI workers describe “0-0-2” routines—working around the clock with minimal breaks. The pressure is especially acute for those directly involved in developing new AI models, where the unpredictability of research outcomes and the speed of breakthroughs require constant adaptability.

To support these extreme demands, companies are adapting their HR strategies. Some provide weekend meals and ensure continuous staffing, while others appoint rotating “captains” to monitor model outputs and oversee product development. These measures aim to sustain productivity and manage burnout, acknowledging that the traditional boundaries between work and personal life have blurred for many in the field.

Job design in this context emphasizes autonomy, intrinsic motivation, and a sense of mission. Many top AI researchers are driven not just by compensation but by the excitement of discovery and the belief that their work is shaping a pivotal moment in history. This self-motivation reduces the need for formalized overtime requirements, as employees willingly invest extra hours to stay ahead in the competitive landscape.

But this also raises concerns about sustainability and well-being. While some workers have become wealthy from their efforts, most have little time to enjoy their success or maintain relationships outside of work. The model raises questions about long-term retention and the potential need for more balanced, human-centered HR strategies as AI becomes further integrated into mainstream business operations.

Classroom discussion questions:

  1. Your comments on the 100-hour workweek?
  2. Is this a valid human resource strategy?

OM in the News: The Environmental Cost of Quizzing AI

Every time you ask Google’s Gemini a query, it takes the same amount of energy as watching 9 seconds of TV. So says Google’s new report detailing the energy consumption, emissions and water use of its generative AI that users turn to every day for everything from writing tips to fact checking. A single Gemini text query emits 0.03 grams of carbon dioxide equivalent and consumes about 5 drops of water.

Google appears to be looking to ease brewing anxieties about AI searches: that frequent use of generative AI such as Gemini can be detrimental to the environment.

Global demand for AI is ramping up rapidly, writes The Wall Street Journal (Aug. 21, 2025). Electricity demand from data centers worldwide is set to more than double by 2030 to about 945 terawatt-hours, which is more than Japan’s total electricity consumption. A single AI-focused data center can use as much electricity as a small city of 100,000 and as much water as a large neighborhood. But the largest centers, which haven’t been completed yet, could consume 20 times as much. It’s a particular problem in the U.S., where data centers will make up half of electricity demand growth over the next five years.

OpenAI Chief Executive Sam Altman, when asked how much energy a ChatGPT query uses, responded “the average query uses about the amount an oven would use in just over one second, and 1/15 of a teaspoon of water.”

The type of query we feed to generative AI also matters, however. Energy demands can be dampened if we remove some of the back-and-forth with the model and make our prompts a little simpler and easier to understand. Shorter, more concise prompts, along with using smaller AI models, can dramatically reduce energy use.

Tech giants are announcing many new clean-energy power agreements to fuel their AI ambitions, including Google, which recently announced new power deals ranging from geothermal to hydropower. It also plans an advanced nuclear reactor project in Tennessee.

It’s also important for tech companies to divulge how frequently their AI receives queries. Per-query emissions look small when one person is using a model, but the picture changes when billions of people are querying it at 30 data centers across the world.
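The volume point can be made concrete with a rough scaling calculation. The per-query footprint below is from Google's report as cited above; the daily query volume is purely an assumed number for illustration.

```python
# Scale Google's reported per-query footprint (0.03 g CO2e per Gemini text
# query) to heavy global use.
G_CO2E_PER_QUERY = 0.03    # grams CO2-equivalent per query, from the report
QUERIES_PER_DAY = 1e9      # assumed: one billion queries per day (illustrative)

tonnes_per_year = G_CO2E_PER_QUERY * QUERIES_PER_DAY * 365 / 1e6  # grams -> tonnes
print(f"{tonnes_per_year:,.0f} tonnes CO2e per year")  # ~10,950 tonnes
```

A footprint that is trivial per query thus adds up to thousands of tonnes of CO2e annually once usage reaches the billions, which is why disclosure of query volume matters as much as the per-query figure.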

Classroom discussion questions:

  1. Why is the growth of AI searches an OM issue?
  2. How can this growth be contained, or minimized?

OM in the News: Artificial Intelligence vs. Sustainability

Google just released its environmental report, and it doesn’t make for comforting reading. Despite the tech giant’s best efforts to operate its business sustainably, greenhouse-gas (GHG) emissions rose 13% from a year earlier and are up almost 50% compared with a 2019 baseline.

The reason? Artificial intelligence. Or rather, the expansion of data centers required to feed its insatiable appetite for computing power. But as The Wall Street Journal (July 8, 2024) writes, Google isn’t alone in this. The sustainability reports of other tech firms tell similar stories.

Of course, it is not just the power demands of data centers that are driving up the emissions numbers; constructing the infrastructure is also carbon-heavy. So we won’t know whether the efficiencies AI brings truly offset its environmental costs until those centers are all up and running.

Google and Microsoft have vowed to slash emissions by the end of the decade, but new disclosures show their numbers are moving in the wrong direction. The AI boom is substantially responsible for the lack of progress. Large language models like ChatGPT are powered by energy-intensive data centers, and AI is projected to increase electricity demands from data centers by 50% by 2027.

To address the issue, they’re getting creative. Amazon Web Services is pursuing a deal to buy energy directly from a nuclear power plant on the East Coast. Microsoft has eyed small-scale nuclear, too, and unlike many of its peers, it is an enthusiastic purchaser of carbon offsets. Google’s sustainability report was accompanied by an announcement that it had partnered with BlackRock to build a one-gigawatt pipeline of solar capacity in Taiwan. The company also touted its data center efficiency metrics, saying Google-owned data centers are 1.8 times more energy efficient than average.

Despite these efforts, now that the numbers are trickling in, it’s becoming clear that the growth of AI has presented real challenges to tech companies that have long sought to position themselves as climate leaders.

Classroom discussion questions:

  1. High tech firms have long promoted their sustainability goals. What can they do now that AI is demanding massive new sources of power?
  2. How is this an operations management issue?

OM in the News: The End of Employees?

UPS employees at a facility in N.H. pack jet-engine parts bound for Pratt & Whitney factories. The work used to be done by Pratt employees

No one in the airline industry comes close to Virgin America on a measurement of efficiency called revenue per employee. That’s because baggage delivery, maintenance, reservations, catering and many other jobs aren’t done by employees. “We will outsource every job we can that is not customer-facing,” says Virgin’s CEO.

“Never before have American companies tried so hard to employ so few people,” writes The Wall Street Journal (Feb. 3, 2017). The outsourcing wave that moved apparel-making jobs to China and call-center operations to India is now just as likely to happen inside companies across the U.S. and in almost every industry. This “contractor model” is so prevalent that Google, ranked as the best place to work for 7 of the past 10 years, has roughly equal numbers of outsourced workers and full-time employees. About 70,000 temps, vendors and contractors test drive Google’s cars, review legal documents, make products easier to use, manage marketing and data projects, and do other jobs. (They wear red badges, while regular employees wear white ones).

The biggest allure of outsourcing employees, of course, is more control over costs. Contractors help businesses keep their in-house staffing lean and flexible enough to adapt to new ideas or changes in demand. At large firms, 20-50% of the total workforce often is outsourced. Bank of America, Verizon, P&G, and FedEx have thousands of contractors each. In oil, gas and pharmaceuticals, outside workers can outnumber employees by at least 2 to 1.

Janitorial work and cafeteria services disappeared from most company payrolls long ago. But a similar shift is under way for higher-paying, white-collar jobs such as research scientist, recruiter, operations manager and loan underwriter. Few companies or economists expect this trend to reverse. Moving noncore jobs out of a company allows it to devote more time and energy to the things it does best. Businesses currently spend about $1 trillion a year on outsourcing.

Classroom discussion questions:

  1. What are the disadvantages of this massive outsourcing?
  2. Would students want to take “contract” jobs?

OM in the News: Sensors and Sustainability–A New Look at Autos

Cheap, powerful, microscopic sensors are ubiquitously entering the $2 trillion automotive industry, reports Diamandis.com (Sept. 21, 2015). It’s a big, inefficient, wasteful and dangerous industry. Here’s how the annual numbers stack up for the U.S.:

  • 33,000 lives lost and a million injuries.
  • $230 billion in accident costs–about 2-3% of GDP.
  • 50 billion hours (or $1 trillion) of people’s time–around 8% of GDP.
  • 50 billion gallons of imported gasoline (12-15% of the USA’s CO2 emissions).

Autonomous cars appear to be coming fast. Google is leading the way, but Apple, Tesla, Uber and every major car company are following. Today, Google’s self-driving cars have driven far more than 1.5 million miles, safely and fully autonomously. Google’s cars are made possible by their suite of sensors. One in particular is a 64-beam Velodyne LIDAR (light detection and ranging) sensor that, combined with cameras, sonar and GPS, collects and analyzes 750 Mb of data per second. The car knows everything that’s happening within 100 meters of the sensor.

The impact: In 20 years there will be more than 54 million autonomous cars on the road, meaning:

  • Saved Lives: Autonomous cars don’t drive drunk, don’t text and don’t fall asleep at the wheel.
  • Reclaiming Land: You can fit 8 times more autonomous cars on our roads. Today, in the U.S. we devote over 10% of the urban land to parking spaces and to our paved highways and roads.
  • Saved Energy: Today we devote close to 25% of all our energy to personal transportation, and cars account for 25% of our greenhouse gases. If cars don’t crash, you don’t need a 5,000-lb SUV driving around a 100-lb passenger.
  • Saved Money/Higher Productivity: Trading out 4,000-lb. cars for lighter electric cars that don’t crash will save 90% on a person’s automotive transportation bill–plus regain 1-2 hours of daily productivity, reclaiming hundreds of billions of dollars in the U.S. economy.
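The "hundreds of billions" productivity claim can be sanity-checked with a rough calculation. Every input below is an illustrative assumption; the article supplies only the 1-2 hour range.

```python
# Rough sanity check of the reclaimed-productivity claim. All inputs are
# assumptions chosen for illustration, not figures from the article.
COMMUTERS = 120e6       # assumed U.S. commuters who would ride autonomously
HOURS_REGAINED = 1.5    # midpoint of the article's 1-2 hours per day
VALUE_PER_HOUR = 15.0   # assumed dollar value of a regained hour
COMMUTE_DAYS = 250      # assumed commuting days per year

annual_value = COMMUTERS * HOURS_REGAINED * VALUE_PER_HOUR * COMMUTE_DAYS
print(f"${annual_value / 1e9:,.0f} billion per year")  # $675 billion
```

Even with modest assumptions, the regained time is worth several hundred billion dollars a year, which makes the article's order-of-magnitude claim plausible.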

Classroom discussion questions:

  1. In what other ways are sensors revolutionizing operations management?
  2. What are the downsides of autonomous cars?

OM in the News: Google-Style Perks Go Mainstream

Alterra Pest Control’s full-size basketball court is one of several perks

Alterra’s offices house an NCAA-regulation-size basketball court, a TruGolf simulator and a 90-inch TV tuned to ESPN. Food trucks come to treat the staff to lunch, and fridges stocked with free bottles of Propel water dot the office. Alterra doesn’t make software, computer chips or driverless cars. The Utah company sells pest-control services. But its managers want employees to feel as cosseted as any in Silicon Valley. As companies try to put themselves on a path to faster growth, reports The Wall Street Journal (Aug. 5, 2015), some are mimicking the workplace practices—and lavish perks—at technology behemoths like Google Inc. and Facebook. 

In industries as varied as insurance, electrical contracting and auto loans, managers are spending millions on office upgrades and amenities like free food and comped vacations, claiming that such trappings elevate jobs in unglamorous sectors, helping to recruit employees and convince high performers to stay. Employers have begun paying more attention to their workplaces over the past two years, adding amenities that wouldn’t have occurred to them a decade ago.

Alterra now invests more than 10% of profits in food, events and amenities each year, and claims it is paying off. Employees hailing from competing pest-control companies increase their sales by 70% in their first full year at Alterra, and 96% stay at the company for at least a full year. The firm says the writings of Zappos.com CEO Tony Hsieh prompted its focus on employee happiness, which Hsieh claims breeds corporate success. “It’s not fair that they have all the fun,” says Alterra’s chief executive.

Classroom discussion questions:

  1. Do rock climbing walls, bean bag chairs, and free mochas make a company more productive?
  2. What alternative incentives can companies offer?