dc.description.abstract | Although robustness is often discussed when solving stochastic problems, definitions in the literature vary. We present five quantitative definitions of robustness that are close to the intuitive qualitative meaning of robustness as used by other authors. Since many scheduling problems are NP-hard, they are sometimes solved using metaheuristic approaches. In such cases, we need a way of estimating robustness efficiently. Using Stochastic Parallel Machine Scheduling as a test problem, we analyze efficient measures for estimating these definitions.
Our results show three things. First, there are some limitations to the 'slack-based' measures proposed by other authors. Second, optimizing the deterministic variant of the problem shows decent results on some problem instances, but can be a poor predictor of the expected makespan in schedules with many inter-machine precedence arcs and no slack. In these cases, a statistical approximation approach showed good results. Third, in less extreme instances with fewer precedence arcs and whose solutions have more slack, the use of an efficient robustness estimator does not improve the robustness of the solutions produced by our metaheuristic approach.
Taken together, our results show that in some cases deterministic makespan minimization performed as well as any other measure, but that situations exist in which it is outperformed by a statistical approximation approach. We did not find any situations in which the statistical approximation was outperformed by deterministic makespan or any other robustness estimation measure. It remains an open question which aspect of the schedule or problem instance determines whether uncertainty must be considered. Our hypothesis is that as schedules approach the global optimum (for any robustness definition presented herein), the quality of the robustness estimation method used in the metaheuristic approach becomes more important. | |