OM in the News: AI’s Big Manufacturing Productivity Gains

The efficiency and productivity improvements AI can deliver through automation and digitalization will help bridge manufacturing’s workforce gap, writes Industry Week (March 13, 2026).

All signs point to AI following the path of the PC revolution decades ago, delivering enhanced productivity and profitability. Productivity soared when PCs became interconnected across organizations, and manufacturing will see the same breakthrough with “embedded AI,” which eases workforce bottlenecks with targeted solutions. On the shop floor, for example, predictive-maintenance AI (see Chapter 17) can analyze sensor data to forecast equipment failures and avoid labor-sapping downtime.
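To make the predictive-maintenance idea concrete, here is a minimal sketch of one common approach: flag a machine when a sensor reading drifts far above its recent baseline. The vibration values, window size and threshold below are hypothetical illustrations, not any vendor’s actual system.

```python
# Flag a machine for service when its latest vibration reading drifts
# well above the mean of its recent history. All values are hypothetical.
from statistics import mean, stdev

def needs_service(readings, window=20, sigma=3.0):
    """True if the newest reading exceeds baseline mean + sigma * stdev."""
    if len(readings) <= window:
        return False                      # not enough history for a baseline
    baseline = readings[-window - 1:-1]   # recent history, excluding the newest
    threshold = mean(baseline) + sigma * stdev(baseline)
    return readings[-1] > threshold

vibration = [0.41, 0.39, 0.40, 0.42, 0.38] * 5 + [0.95]   # sudden spike at the end
if needs_service(vibration):
    print("Schedule maintenance before the machine fails.")
```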

AI vision systems (Chapter 7) can catch defects on production lines at a pace beyond human capability, without the repetition-induced fatigue that drives employee turnover. Collaborative robots (cobots) and autonomous mobile robots transport material and can assist with assembly and repetitive operations. AI’s coding capabilities extend to numerical control and other industrial equipment, shortening setup times and raising productivity in hard-to-fill technical positions.

The interaction of embedded AI, agent-based AI, and machine learning across different areas of an organization holds the greatest promise in solving long-term labor shortages. AI can already let a customer snap a photo of a damaged part and identify it for replacement. Its real power will manifest when AI can also determine the part’s inventory status and locations, establish shipping terms and timing, add the part to the procurement queue to replenish once it’s sold, alert engineering that a design change for a chronic defect may be in order, and propose alternative designs.
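Here is a hypothetical sketch of how such a cross-system chain could be wired together. Every function below is a stand-in for a real enterprise system (vision model, ERP, shipping, procurement, quality records); none of the names or numbers come from an actual product.

```python
# Each helper is a placeholder for a real system call; names are invented.

def identify_part(photo):        return "valve-A13"                        # vision model
def check_inventory(part):       return {"on_hand": 4, "site": "Plant 2"}  # ERP lookup
def quote_shipping(part, stock): return "2-day ground"                     # logistics
def queue_replenishment(part):   print(f"{part} added to procurement queue")
def defect_rate(part):           return 0.08                               # quality records
def alert_engineering(part):     print(f"Engineering alerted: review {part} design")

def handle_damaged_part_photo(photo):
    part = identify_part(photo)
    stock = check_inventory(part)
    terms = quote_shipping(part, stock)
    queue_replenishment(part)              # replenish what is about to ship
    if defect_rate(part) > 0.05:           # chronic defect? flag a design change
        alert_engineering(part)
    return f"Replacement {part} ships from {stock['site']} ({terms})"

print(handle_damaged_part_photo(photo="customer_upload.jpg"))
```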

Here is a current example of AI working across systems: semiconductor giant AMD is using generative AI to track down the root causes of delivery delays, transforming a complex, specialist-dependent, labor-intensive manual process into faster issue resolution and better decision-making. The system cuts the time needed for what was a 14-step, 20-to-30-minute process by 90%, saving more than 3,100 staff hours a year.

Also coming soon to these intelligent product-recommendation engines is the ability to parse tender documents that can run 50 pages and extract multiple configurable products for sales quotes. That not only saves time but also enables junior staff to handle work that previously required experienced hands.
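As a toy illustration of the extraction step, the sketch below pulls product lines out of tender text with a simple pattern match. The document format is invented; a production system would use a language model trained on real tender layouts rather than a regular expression.

```python
import re

# Invented tender snippet; the "Qty x Model (options)" pattern is an
# assumption made purely to show the structure of the extraction task.
tender_text = """
Section 4.2 Required equipment:
  12 x PUMP-300 (stainless, 3 kW)
   4 x VALVE-A13 (flanged)
"""

line_items = re.findall(r"(\d+)\s*x\s*([A-Z0-9-]+)\s*\(([^)]*)\)", tender_text)
for qty, model, config in line_items:
    print(f"Quote line: {qty} x {model}, options: {config}")
```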

Classroom discussion questions:

  1. What can AI do to improve a procurement system?
  2. What does “embedded AI” mean?

OM in the News: AI Is Mining Our Trash for Treasure

Here’s a job the computers can take without much complaint: sorting recyclables. For humans, it is a foul, laborious job that entails standing over a conveyor belt, plucking beer cans and detergent bottles from a stream of refuse. The job pays little and is hard to fill.

At one recycling facility near Hartford, machines are taking over the dirtiest jobs, reports The Wall Street Journal (Jan. 8, 2026). A few workers remain on the line, mostly to watch for hazardous items. Otherwise, the system of conveyors, magnets, optical sorters and pneumatic blocks runs largely unmanned. The technology lets the facility sort up to 60 tons an hour of curbside recycling into cleanly separated bales of paper, plastic, aluminum cans and other materials. The material is sold to mills, manufacturers and remelt facilities, which pay more for cleaner bales.

AI is used to instantly spot recyclables and send instructions to machinery down the line to remove them.

Watching over it all are computers that analyze material as it passes by at 7 mph. The devices use AI to identify recyclables, flag food-grade material, gauge items’ mass, assess market value and calculate points at which a robotic claw might best clasp each piece.
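The per-item decision the article describes can be sketched in a few lines. The material prices, belt coordinates and weight below are hypothetical placeholders, and the simple value rule stands in for whatever logic a real sorter uses.

```python
PRICE_PER_TON = {"aluminum": 1500, "PET": 250, "cardboard": 90}   # hypothetical prices

def route_item(material, grams, x_m, y_m):
    """Decide what to do with one item the camera has just identified."""
    value = PRICE_PER_TON.get(material, 0) * grams / 907_185   # grams per U.S. ton
    grasp_point = (x_m, y_m)              # where a claw or air jet should act
    action = "eject to bale" if value > 0 else "let pass to residue"
    return material, round(value, 4), grasp_point, action

print(route_item("aluminum", grams=15, x_m=0.42, y_m=1.10))
# ('aluminum', 0.0248, (0.42, 1.1), 'eject to bale')
```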

The U.S. 50% aluminum tariff has lifted demand for scrap metal, while pulp mill closures have left box makers more reliant than ever on old corrugated containers. And consumer goods companies want to reclaim their bottles as states adopt extended producer responsibility laws aimed at reducing plastic pollution.

Part of the problem is Americans’ poor recycling habits. A lot of beer cans and delivery boxes never even make it to sorting centers. A study of Virginia’s waste stream found that 28% of it was recyclable, yet the state remained stuck at a recycling rate of about 7% no matter how much it spent teaching people how and what to recycle.

The big breakthrough in recycling technology has been combining vision-recognition systems with pneumatic blocks. Puffs of air that knock items off the line have proved much faster and more accurate than robotic pickers, which are limited to about 40 items a minute, compared with thousands for a pneumatic system.
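One reason air jets scale so well is that each ejection is just a timed valve pulse computed from belt speed; there is no arm to move and return. The distance and speed in this sketch are illustrative assumptions.

```python
BELT_SPEED_M_PER_S = 3.1      # about 7 mph, per the article
CAMERA_TO_JET_M = 1.5         # hypothetical distance from camera to the air-jet bank

def valve_fire_time(detected_at_s):
    """Time to pulse the valve so the air jet meets the item the camera saw."""
    return detected_at_s + CAMERA_TO_JET_M / BELT_SPEED_M_PER_S

print(f"Fire valve at t = {valve_fire_time(10.0):.2f} s")   # ~10.48 s
```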

Classroom discussion questions:

  1. Why has recycling been so inefficient?
  2. Should job loss through automation be a concern?

OM in the News: McDonald’s Gives Its Restaurants an AI Makeover

McDonald’s is giving its 43,000 restaurants a technology makeover, starting with internet-connected kitchen equipment, artificial intelligence-enabled drive-throughs and AI-powered tools for managers, reports The Wall Street Journal (March 5, 2025).

McDonald’s is introducing new technology in part to drive better experiences for its crews. “Our restaurants, frankly, can be very stressful,” said the CIO.

The goal? Better experiences for customers and for workers, who today contend with issues ranging from broken machines to wrong orders. To accomplish that, McDonald’s tapped Google Cloud to bring more computing power to each of its restaurants, giving them the ability to process and analyze data on-site. The setup, known as edge computing, can be a faster, cheaper option than sending data to the cloud, especially in far-flung locations with less reliable cloud connections.
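A back-of-envelope sketch of the edge-versus-cloud tradeoff: when a response deadline is tighter than the cloud round trip, only on-site processing works. The latency figures below are hypothetical illustrations, not McDonald’s measurements.

```python
CLOUD_ROUND_TRIP_MS = 180   # hypothetical: ship data out and wait for an answer
EDGE_PROCESS_MS = 15        # hypothetical: analyze on a server in the restaurant

def must_run_at_edge(deadline_ms):
    """True when only on-site processing can meet the response deadline."""
    return EDGE_PROCESS_MS <= deadline_ms < CLOUD_ROUND_TRIP_MS

print(must_run_at_edge(deadline_ms=100))   # True: a 100 ms budget rules out the cloud
```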

McDonald’s is also exploring the use of computer vision (see Chapter 7), the form of AI behind facial recognition, in store-mounted cameras to determine whether orders are accurate before they’re handed to customers.

Additionally, edge computing will power voice AI at the drive-through. It will also help restaurant managers oversee in-store operations via a “generative AI virtual manager,” which handles administrative tasks such as shift scheduling on managers’ behalf.

AI will be able to help McDonald’s tailor its promotions and offers by using customer data such as prior purchasing history, and even linking it with weather data. “A customer who we know loves our sweet treats could get an offer through the app for a McFlurry on a hot summer day,” said the firm’s CIO.
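The CIO’s example amounts to a targeting rule that joins purchase history with weather data. The category name and temperature threshold in this sketch are invented for illustration.

```python
def pick_offer(favorite_category, temperature_f):
    """Join a customer's purchase history with today's weather."""
    if favorite_category == "sweet treats" and temperature_f >= 85:
        return "McFlurry coupon via the app"
    return None

print(pick_offer("sweet treats", temperature_f=92))   # McFlurry coupon via the app
```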

Despite its first-mover advantage, McDonald’s will still face challenges including cost and the difficulty of rolling out the same technology across franchises and corporate-owned locations. But, compared with some of its quick-service restaurant peers, McDonald’s has been relatively aggressive at investing in new digital technologies. That, combined with the vast amount of data it has collected on its customers, gives the fast-food giant a leg up on figuring out how to improve customer loyalty.

Classroom discussion questions:

  1. What is “edge computing” and why is it a powerful tool for OM?
  2. Summarize the technology makeover being undertaken. Why is the firm going down this expensive path?

OM in the News: Making Vaccine Bottles

Combating the Covid-19 pandemic is at the top of the global agenda. Vaccinating populations around the globe means producing 8 billion doses—and that assumes just one dose for every person in the world. In addition to the availability of the vaccine itself, a decisive factor in the race against time is the availability of the glass vials. Producers of the vials are massively ramping up their production so as not to become the proverbial bottleneck in the supply chain, reports New Equipment Digest (April 6, 2021).


However, medical-grade vaccine vials are not standard glass tubes. They are made of a special borosilicate glass and require customized production lines. The glass must be resistant to a wide range of chemicals and temperature changes and must not contaminate medicines. Any interaction between the container and the liquid inside must be prevented, as any chemical interference could affect the vaccine. Even the smallest scratch, crack or fissure can render an entire batch unusable, contaminate the line during the filling process or even lead to a machine standstill.

The demands on manufacturers are enormous: it is not only a matter of producing large quantities quickly but also of maintaining particularly high quality standards. What is needed is very fast quality control with high reliability in defect detection. One solution is vision systems, our topic on page 296 in Chapter 7. Powerful cameras can capture images of 120 vials per minute, inspecting each for dimensional accuracy and surface condition with very high precision. Defects such as cracks, scratches, chips, inclusions or stains as small as 0.1 square millimeters are detected. Intelligent software enables accurate defect description, analysis and classification. Testing takes place at various points in the manufacturing process, such as directly after the vials have been formed or shortly before packaging.
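The accept/reject decision reduces to a size test against that 0.1-square-millimeter figure. In the sketch below, the list of detected defect areas stands in for the camera software’s output, and treating the detection limit as the rejection threshold is an assumption for illustration.

```python
DEFECT_LIMIT_MM2 = 0.1   # smallest defect size the article says is detectable

def vial_passes(defect_areas_mm2):
    """Accept the vial only if every detected defect is below the limit."""
    return all(area < DEFECT_LIMIT_MM2 for area in defect_areas_mm2)

print(vial_passes([0.02, 0.05]))   # True: ship it
print(vial_passes([0.02, 0.40]))   # False: pull it before filling
```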

Classroom discussion questions:

  1. What are vision systems and why are they a useful OM tool?
  2. Which of the quality control tools in Chapter 6 (Figure 6.6) of your Heizer/Render/Munson text could vial producers employ?

OM in the News: Tyson’s Computer Vision Technology Improves Inventory Accuracy

Tyson is rolling out a computer-vision-enabled inventory tracking system at facilities where it packs chicken into trays for grocery stores, writes Supply Chain Dive (Feb. 11, 2020). The system can read SKU information and weight, replacing what Tyson described as communication by hand gestures followed by manual inventory entry. By the end of the year, Tyson’s automated inventory tracking technology will combine computer vision, machine learning and edge computing to expand its speed and processing capability.
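A minimal sketch of what replacing manual entry looks like: every camera read of a tray becomes an inventory record automatically. The SKU code and weights here are hypothetical.

```python
from collections import defaultdict

inventory = defaultdict(lambda: {"trays": 0, "kg": 0.0})

def record_tray(sku, weight_kg):
    """Log one tray the vision system has identified and weighed."""
    inventory[sku]["trays"] += 1
    inventory[sku]["kg"] = round(inventory[sku]["kg"] + weight_kg, 3)

record_tray("CHK-BRST-12", 0.68)
record_tray("CHK-BRST-12", 0.71)
print(dict(inventory))   # {'CHK-BRST-12': {'trays': 2, 'kg': 1.39}}
```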

Automated inventory tracking using computer vision led to a double-digit increase in inventory accuracy in the 3 facilities currently using the technology. The company plans to expand the program to all 10 of its poultry plants.

Though cold, wet storage environments make implementing new technologies difficult, the payoff of real-time accurate inventory information is already evident for Tyson. The company recently opened the Tyson Manufacturing Automation Center, where it works with manufacturers and suppliers to develop new technologies and trains employees to use them. The company has spent $215 million on new technologies in the last 5 years.

Precise, real-time inventory visibility can increase how often Tyson fulfills grocery customers’ orders on time and in full, even in the best of times. But inventory management is particularly key in times of uncertainty, and Tyson is dealing with plenty. Shifting global trade policy, a fire at an important Tyson facility, and African swine fever all distorted usual supply and demand patterns, making a reliable forecast next to impossible.

Major meat companies are leaning toward similar monitoring technologies and automation, whether in production or processing. Cargill is starting to use computer vision to track animal health in dairy operations. But a more consumer-facing application inspired Tyson’s work: similar technology powers Amazon’s cashierless stores, which led executives to explore applying it in poultry plants.

Classroom discussion questions:

  1. Describe what a vision system is (See Chapter 7 of your Heizer/Render/Munson text).
  2. How will this help Tyson control inventory?

OM in the News: Vision-Automation Technology Is Taking Over the Factory Floor

Humans overseeing the toppings at a German frozen-pizza plant, a task now within the reach of technology.

Robots that see underpin the future of self-driving cars, humanoid robots and autonomous drones, writes The Wall Street Journal (Sept. 14, 2018). Now, food manufacturers are combining advances in laser vision with artificial-intelligence software so that automated arms can carry out more-complex tasks, such as slicing chicken cutlets precisely or inspecting toppings on machine-made pizzas.

Being able to see is a major frontier in robotics and automation—crossing it is key to autonomous vehicles that can navigate obstacles, humanoid robots that can more closely integrate with humans and drones that can fly more safely. Companies worldwide are investing in computer-vision-based technology. Intel recently bought Mobileye for $15 billion, in part for the Israeli company’s vision-based driver-assistance technology.

Food manufacturers have been early adopters of new technologies from canning to bread slicers, and vision automation has been used for years for tasks such as reading bar codes and sorting packaged products. Leaders are finding the technology valuable because robot eyes outpace the human eye at certain tasks. Now technical improvements, tougher materials and declining prices mean Tyson can integrate vision technology in its new $300 million chicken-processing plant. The technology helps optimize the use of each part of the bird.

Advances so far allow vision technology to ensure frozen pizzas have the correct toppings. Other applications include the ultrasonic slicing of cheese, cutting bread rolls with water jets and picking pancakes off a production line. Car makers, historically the biggest users of vision technology, are using it for emergency braking and scanning road signs; logistics companies deploy it to identify packages more quickly; and consumer-electronics companies use it to position liquid-crystal display screens more precisely than is possible with the naked eye.

Classroom discussion questions:

1. Why are vision systems becoming an important OM tool?

2. Will driver-assistance technology really eliminate the need for drivers? Why? When?