In many engineering applications, predictions of system responses are uncertain due to incomplete knowledge of the system environment (excitation), the system model itself, or the evaluation of its performance. A probabilistic approach provides a rational and consistent framework for quantifying these uncertainties and explicitly incorporating them into the system description. In this setting, and supported by recent advances in computer/computational science, such as distributed/parallel computing and multi-core architectures, simulation-based techniques have emerged as a powerful tool to support the predictive analysis and robust design of engineering systems. Still, because these tools require a large number of evaluations of the system-model response, they may impose a prohibitive computational demand for applications involving complex numerical models.

To alleviate such computational challenges, this seminar discusses the implementation of soft-computing techniques, in particular kernel-based approaches and surrogate modeling tools, within a stochastic simulation framework for uncertainty propagation. Though the techniques discussed have general applicability, the focus here is on implementations for natural hazard mitigation. Two separate soft-computing applications are presented.

Risk assessment through surrogate modeling. The implementation and optimization of metamodels for efficient probabilistic performance assessment are discussed, focusing on systems with very high dimensional output, a setting that is increasingly relevant as modern engineering problems grow in complexity and scale. Combination with principal component analysis (PCA) is investigated to reduce the dimensionality of the problem, exploiting the temporal/spatial correlation of the system response output. The implementation emphasized here is real-time hurricane risk assessment and the development of standalone tools that can be used efficiently by emergency response managers.
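The surrogate-plus-PCA idea above can be sketched as follows. This is an illustrative toy, not the seminar's implementation: the rank-two "expensive model", the quadratic-polynomial surrogate (standing in for a metamodel such as kriging), and the 99% energy threshold are all assumptions made for the example. The high-dimensional output is projected onto a few principal components, a cheap surrogate is fit per latent coordinate, and predictions are reconstructed in the full output space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (toy stand-in for an expensive simulator): n training
# runs, d uncertain input parameters, m output points (e.g. a response
# time history or spatial field).
n, d, m = 200, 3, 1000
theta = rng.uniform(-1.0, 1.0, size=(n, d))
grid = np.linspace(0.0, 1.0, m)
# Toy "expensive model" response: smooth, strongly correlated over the grid.
Y = np.outer(theta[:, 0] ** 2, grid) + np.outer(theta[:, 1], np.cos(2 * np.pi * grid))

# PCA via SVD of the centered outputs; keep r components capturing 99% of
# the energy, exploiting the temporal/spatial correlation of the response.
Y_mean = Y.mean(axis=0)
U, S, Vt = np.linalg.svd(Y - Y_mean, full_matrices=False)
energy = np.cumsum(S**2) / np.sum(S**2)
r = int(np.searchsorted(energy, 0.99) + 1)
Z = (Y - Y_mean) @ Vt[:r].T  # latent coordinates, shape (n, r)

# Cheap surrogate per latent coordinate: here a quadratic polynomial fit by
# least squares (a placeholder for the metamodels discussed in the seminar).
def features(t):
    cols = [np.ones(len(t))] + [t[:, i] for i in range(d)]
    cols += [t[:, i] * t[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack(cols)

coef, *_ = np.linalg.lstsq(features(theta), Z, rcond=None)

def predict(t_new):
    """Surrogate prediction mapped back to the full m-dimensional output."""
    return Y_mean + (features(t_new) @ coef) @ Vt[:r]
```

The key point is that the surrogate is trained on only r latent coordinates rather than all m outputs, so its cost is essentially independent of the output dimension.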

Importance sampling for stochastic optimization. The adaptive formulation of importance sampling (IS) densities is discussed by sharing information across the iterations of gradient-based algorithms in stochastic optimization applications. The important questions addressed are (i) for which model parameters should IS be considered (to avoid well-known challenges in selecting IS densities in high dimensions) and (ii) what should be the characteristics of these densities. Kernel density estimation (KDE) is proposed for the latter, with bandwidth selected to optimize the anticipated computational efficiency. For the former, a recently developed global sensitivity analysis approach is adopted that efficiently quantifies the importance of each parameter toward the overall probabilistic performance, allowing a rational selection of the parameters for which to establish IS. Both these tasks are seamlessly integrated within the simulation-based assessment of the objective function exploiting stochastic sampling concepts, ultimately imposing a minimal additional computational burden (over the probabilistic performance assessment). The applications examined correspond to optimization of vibration suppression devices for two separate problems: (i) design of the suspension for a quarter-car model driving on a rough road and (ii) design of a floor-isolation system for protecting a computer server against seismic hazard.
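The core KDE-based IS mechanism can be illustrated with a minimal sketch. This is a toy, not the seminar's formulation: the one-dimensional reliability problem (failure when theta exceeds 2.5 under a standard normal), the crude Monte Carlo seeding stage, and the default Silverman bandwidth (standing in for the efficiency-optimized bandwidth discussed above) are all assumptions made for the example. Samples falling in the important (failure) region from one evaluation seed a KDE that serves as the IS density for the next.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy reliability problem: failure when theta > 2.5, with theta ~ N(0, 1).
# Reference value: stats.norm.sf(2.5) ~ 6.21e-3.
def failed(theta):
    return theta > 2.5

# Stage 1: crude Monte Carlo pass; its (rare) failure samples carry the
# information that is shared with the next evaluation.
theta0 = rng.standard_normal(100_000)
seeds = theta0[failed(theta0)]

# Stage 2: KDE over the failure samples defines the IS density q.
# Bandwidth here is scipy's Silverman default, a placeholder for the
# efficiency-optimized bandwidth selection described in the abstract.
q = stats.gaussian_kde(seeds)

# Stage 3: IS estimate of the failure probability with weights f/q, where
# f is the nominal standard-normal density.
N = 20_000
theta = q.resample(N, seed=1).ravel()
w = stats.norm.pdf(theta) / q(theta)
p_hat = float(np.mean(failed(theta) * w))
```

Because nearly every sample drawn from q lands in the failure region, the IS estimator concentrates its effort where it matters, whereas crude Monte Carlo would waste most of its budget on safe samples; the same mechanism, applied per selected parameter, is what keeps the added burden minimal across optimization iterations.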