Model Validation and Uncertainty Quantification, Volume 3

further defined in Sect. 14.3.1. Then, the robust-optimal paradigm searches for the worst-case performance across all designs that, while being requirement-compliant, give the best-possible performance over the uncertainty space U of calibration variables. Mathematically, it requires two embedded optimization problems:

    p^(Robust) = arg max_{compliant designs p} ( min_{θ ∈ U} g_max ),  such that g_max < g_C        (14.4)

where the inner optimization searches for the best-possible performance at a fixed design over the calibration space U, and the outer optimization explores the worst-possible case over all designs.

Now that we have stressed the difference between performance-optimal and robust-optimal designs, we return to the discussion of sensitivity analysis to conclude with a second observation. Just as a performance-optimal design carries the risk of not delivering the expected performance because of uncertain calibration variables, a sensitivity ranking of design parameters is vulnerable to the same uncertainty, especially if the calibration variables interact with the design parameters to significantly change the simulation predictions. In our conceptual example, the function g2 dominates performance at the nominal setting of the calibration variable (recall Fig. 14.3). Thus, the design parameter p2 would be found more influential and selected for design optimization. However, when the value of the calibration variable is changed, the reverse occurs: the modified function ĝ1 dominates performance (recall Fig. 14.5), leading to design parameter p1 being ranked first in sensitivity. Clearly, sensitivity analysis cannot be executed at every possible combination of calibration variables to identify these potential influence-ranking reversals. To address this challenge, we propose the concept of robust-optimal sensitivity, where a robustness criterion, much like the one conceptually illustrated in Eq. 14.4, is embedded within the sensitivity analysis.
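The two embedded optimization problems of Eq. 14.4 can be sketched as a brute-force grid search. In the sketch below, the performance function g_max, the design grid, the uncertainty set U, and the requirement g_C are all hypothetical placeholders, not the chapter's actual model:

```python
# Hypothetical scalar performance; lower values are better, and the
# requirement-compliance condition is g_max < g_C (placeholder only).
def g_max(p, theta):
    return p + theta

g_C = 1.0                               # critical performance requirement
designs = [i / 10 for i in range(11)]   # candidate designs p on a coarse grid
U = [-0.5, 0.0, 0.5]                    # uncertainty space of the calibration variable

def best_case(p):
    # Inner optimization of Eq. 14.4: best-possible performance at a
    # fixed design, over the calibration space U.
    return min(g_max(p, theta) for theta in U)

# Outer optimization of Eq. 14.4: among requirement-compliant designs,
# explore the worst case of the inner (best-case) performance.
compliant = [p for p in designs if all(g_max(p, t) < g_C for t in U)]
p_robust = max(compliant, key=best_case)
```

With this placeholder model, compliance requires p + 0.5 < 1.0, so the largest compliant design on the grid, p = 0.4, is retained.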
14.3 Sensitivity Analysis Using the Robustness Performance

The methodology proposed to screen for influential design parameters using a robust-optimal criterion is introduced next. First, Sect. 14.3.1 provides an overview of IGDT, the criterion used herein to quantify robustness. Next, Sect. 14.3.2 introduces the concepts of the Morris OAT sensitivity analysis. The Morris method is particularly appealing because the size of its DOE scales linearly with the number of design parameters, whereas other screening techniques rely on information volumes that grow exponentially, or faster. Section 14.3.3 discusses how Morris and IGDT are integrated. An application to the NASA Multidisciplinary Uncertainty Quantification Challenge Problem is given in Sect. 14.4.

14.3.1 Info-Gap Decision Theory for Robustness Analysis

This section provides a brief overview of IGDT to keep the discussion as self-contained as possible. Further details of the theory and its implementation can be found in [7]. Supporting a decision using IGDT, irrespective of the discipline or application, requires the combination of three attributes: the first attribute is a model, the second is a performance criterion, and the third is a representation of uncertainty using an info-gap model. The three attributes are introduced to explain how decision-making is formulated for our application.

The first two attributes, model and performance criterion, are evoked in the conceptual example of Sect. 14.2. The model can be an analytic formula, numerical function, or black-box software that depends on design parameters, p, and calibration variables, θ, as illustrated in Fig. 14.1. The performance criterion, such as Eq. 14.2, converts the multidimensional model output to a scalar quantity, herein denoted by the symbol R. In Sect. 14.2, R = g_max is used to define performance.
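The linear scaling of the Morris OAT method mentioned above comes from its trajectory design: each trajectory of k + 1 model runs moves one parameter at a time and yields one elementary effect per parameter. The sketch below illustrates this cost structure; the grid levels, step size, and random sampling scheme are simplified placeholders, not the chapter's DOE:

```python
import random

def morris_trajectory(g, k, delta=0.25, seed=0):
    """One Morris OAT trajectory: k + 1 evaluations of the model g
    yield one elementary effect per design parameter, so the cost
    grows linearly with the number of parameters k."""
    rng = random.Random(seed)
    # Random base point on a coarse grid of the unit hypercube.
    p = [rng.choice([0.0, 0.25, 0.5]) for _ in range(k)]
    effects = {}
    y_prev = g(p)
    for i in rng.sample(range(k), k):      # perturb one parameter at a time
        p[i] += delta
        y = g(p)
        effects[i] = (y - y_prev) / delta  # elementary effect of parameter i
        y_prev = y
    return effects
```

Averaging the absolute elementary effects over several random trajectories yields the screening statistic used to rank parameters by influence.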
As mentioned earlier, the model is said to be requirement-compliant when the inequality R < R_C is satisfied, where R_C is a user-defined, critical requirement of performance.

The third attribute, which warrants more explanation, is the info-gap representation of uncertainty. We postulate that the uncertainty we wish to be robust to originates from the calibration variables; therefore, the info-gap model applies to the calibration variables. This is not a restrictive assumption, since robustness analysis can be applied to other aspects of a simulation that introduce uncertainty or arbitrary assumptions. For example, Van Buren et al. [11] study the extent to which predictions of wind turbine dynamics are robust to model-form assumptions.
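To illustrate the kind of question an info-gap robustness analysis asks, the sketch below computes the largest horizon of uncertainty alpha for which a simple interval info-gap model around the nominal calibration value still satisfies the requirement R < R_C. The interval form of the info-gap model, the coarse grid search, and the performance function used in the test are illustrative assumptions, not the chapter's implementation:

```python
def robustness(R, p, theta_nominal, R_C, alpha_max=2.0, steps=100):
    """Largest horizon of uncertainty alpha such that the worst-case
    performance over the interval info-gap model
    U(alpha) = [theta_nominal - alpha, theta_nominal + alpha]
    still satisfies the requirement R < R_C (coarse grid search)."""
    alpha_hat = 0.0
    for i in range(1, steps + 1):
        alpha = alpha_max * i / steps
        # Worst case over U(alpha), scanned at its endpoints and center.
        worst = max(R(p, t) for t in
                    (theta_nominal - alpha, theta_nominal, theta_nominal + alpha))
        if worst < R_C:
            alpha_hat = alpha  # requirement still met at this horizon
        else:
            break
    return alpha_hat
```

A design whose robustness alpha_hat is larger tolerates more calibration-variable uncertainty before violating the requirement, which is the notion of robustness embedded in the sensitivity analysis of this chapter.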
