Prof. Howard Weiss shares his insights on the power of Excel’s Solver.
I have posted previously on The OM Blog about how important it is, in the operations course, to help students develop their Excel skills. Today I will introduce students to nonlinear programming with Excel's Solver for trend analysis models, a topic in Chapter 4 of your Heizer/Render/Munson text. The approach shows students exactly what is being optimized: the sum of squared errors.
Trend Analysis
Example 8 from Chapter 4 illustrates the Solver process. The initial intercept of 10 and slope of 10 yield the forecasts in column D. The errors and squared errors follow from the forecasts and demands, with the sum of squared errors shown in cell F12.
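The spreadsheet layout can be mirrored in a few lines of Python. This is a sketch, not the spreadsheet itself, and it assumes the power-demand series from Example 8 (years 1 through 7), which is not listed above:

```python
# Mirror of the spreadsheet in Example 8: forecast = intercept + slope * t,
# then errors, squared errors, and their sum (the value in cell F12).
# Demand data assumed from Example 8 of the Heizer/Render/Munson text.
periods = [1, 2, 3, 4, 5, 6, 7]
demand = [74, 79, 80, 90, 105, 142, 122]

def sum_squared_errors(intercept, slope):
    """Equivalent of cell F12: sum of (demand - forecast)^2 over all periods."""
    sse = 0.0
    for t, d in zip(periods, demand):
        forecast = intercept + slope * t   # column D
        error = d - forecast               # error column
        sse += error ** 2                  # squared-error column
    return sse

# The starting point used above: intercept = 10, slope = 10.
print(sum_squared_errors(10, 10))
```

Changing the intercept and slope arguments changes the sum of squared errors, which is exactly the relationship Solver exploits.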
The sum of squared errors in F12 is Solver's objective. The changing cells are the intercept and slope, there are no constraints, and the solving method is GRG Nonlinear. In addition, for least squares the "Make Unconstrained Variables Non-Negative" box needs to be unchecked, since slopes/trends can be negative in forecasting (although not in this example).
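GRG Nonlinear is a reduced-gradient method; with no constraints, its behavior amounts to gradient-based descent on the objective. A plain gradient-descent sketch (not Solver's actual algorithm), again assuming Example 8's demand data, shows the idea of starting from the initial intercept and slope of 10 and walking downhill on the sum of squared errors:

```python
# Plain gradient descent on the sum of squared errors, as a stand-in for
# Solver's GRG Nonlinear search. Demand data assumed from Example 8.
periods = [1, 2, 3, 4, 5, 6, 7]
demand = [74, 79, 80, 90, 105, 142, 122]

a, b = 10.0, 10.0   # starting values of the changing cells
lr = 0.005          # step size; small enough to be stable for this problem
for _ in range(20000):
    errors = [d - (a + b * t) for t, d in zip(periods, demand)]
    grad_a = -2 * sum(errors)                                  # d(SSE)/d(intercept)
    grad_b = -2 * sum(e * t for e, t in zip(errors, periods))  # d(SSE)/d(slope)
    a -= lr * grad_a
    b -= lr * grad_b

print(round(a, 2), round(b, 2))  # converges near 56.71, 10.54
```

Note that nothing in the loop keeps the slope non-negative, which is the Python analog of unchecking "Make Unconstrained Variables Non-Negative."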
After solving, the solution (not displayed here) is an intercept of 56.71 and a slope of 10.54 (as shown in the text), and the minimum sum of squared errors, not shown in the text or the figure above, is 773.
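Because this is an unconstrained least-squares problem, Solver's answer can be checked against the closed-form trend-line formulas from Chapter 4. A quick check, again assuming Example 8's demand data:

```python
# Closed-form least-squares fit (the Chapter 4 trend formulas) as a check
# on Solver's GRG Nonlinear answer. Demand data assumed from Example 8.
periods = [1, 2, 3, 4, 5, 6, 7]
demand = [74, 79, 80, 90, 105, 142, 122]

n = len(periods)
mean_t = sum(periods) / n
mean_d = sum(demand) / n

# b = sum((t - t_bar)(d - d_bar)) / sum((t - t_bar)^2);  a = d_bar - b * t_bar
slope = (sum((t - mean_t) * (d - mean_d) for t, d in zip(periods, demand))
         / sum((t - mean_t) ** 2 for t in periods))
intercept = mean_d - slope * mean_t

sse = sum((d - (intercept + slope * t)) ** 2 for t, d in zip(periods, demand))
print(round(intercept, 2), round(slope, 2), round(sse))  # 56.71 10.54 773
```

The formulas reproduce the Solver solution, which is a useful sanity check to show students: the nonlinear search and the textbook regression formulas agree.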