This paper constructs composite indices using partial least squares (PLS) and principal component analysis (PCA), focusing on the treatment of non-metric variables. A composite index is typically built as a linear combination of variables, and its quality depends on the weighting. PCA calculates weights from the relationships among the variables, whereas PLS calculates weights from the relationships between the variables and an outcome variable. PLS performs better in terms of prediction and is more robust against measurement errors in the variables than PCA. Applications often involve non-metric variables, which require appropriate treatment before PCA or PLS can be applied. We review the treatments of non-metric variables in PCA and PLS proposed in the literature and compare their performance by means of a simulation study. PLS with binary coding outperforms the alternatives in terms of both prediction and ease of interpretation. As applications, we build two wealth indices and a globalization index.
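As a minimal sketch of the PCA variant with binary (dummy) coding, on hypothetical data (the indicator names, sample size, and correlation structure below are illustrative, not the paper's simulation design):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300
metric = rng.normal(size=(n, 3))                  # hypothetical metric indicators
metric[:, 1] += 0.8 * metric[:, 0]                # induce some correlation
cat = rng.choice(["low", "mid", "high"], size=n)  # a non-metric indicator

# binary (dummy) coding of the non-metric variable
levels = np.unique(cat)
dummies = (cat[:, None] == levels[None, :]).astype(float)

X = np.column_stack([metric, dummies])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)         # standardize all columns

# PCA weights: leading eigenvector of the correlation matrix
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(Xs, rowvar=False))
w = eigvecs[:, -1]                                # eigenvector of the largest eigenvalue
index = Xs @ w                                    # composite index scores
```

The sign of the leading eigenvector is arbitrary, so in practice the index may need to be reflected so that larger values correspond to, e.g., greater wealth; the PLS variant would instead compute the weights from covariances with an outcome variable.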

Statistical Quality Monitoring of Profiles and Surfaces: Current Approaches and Future Challenges

More and more often, quality inspection and monitoring have to deal with the shapes (profiles and surfaces) of manufactured objects, as well as with their critical dimensions. A current trend toward the design and manufacturing of complex shapes can be observed, thanks to the spread of certain machining processes (e.g., additive manufacturing) and to the attention devoted to macro- and micro-shapes in product design as a way to enhance functional and aesthetic requirements (e.g., surface texturing).
On the other hand, metrology is facing a paradigm shift in which contact systems are replaced by, or coupled with, contactless ones, sometimes calling for multisensor data fusion for surface reconstruction.
In this scenario, the paper describes possible solutions and current challenges when (i) quality inspection requires statistical models for surface reconstruction starting from noisy point clouds, and (ii) statistical quality monitoring is applied to reconstructed surfaces via control charting.

The operator most often used to aggregate criteria in decision-making problems is the classical weighted arithmetic mean. In many problems, however, the criteria considered interact, and a substitute for the weighted arithmetic mean has to be adopted. In this talk we transfer methods from decision-making theory to the definition of quality indices.
Quality indices, too, are often based on the aggregation of indices with respect to sub-criteria; e.g., an index is obtained as the arithmetic mean of sub-indices.
In our talk we focus on quality indices in supplier management. Supplier management is clearly a central part of any quality management system. Especially in the context of outsourcing or lean production, assessments of suppliers are of great importance.
The final assessment of a supplier (say, supplier A, B, or C) is based on a weighted aggregation of criteria such as the qualification level of employees, product quality indices, financial indices, delivery-based indices (e.g., adherence to delivery dates), etc.
From a mathematical and logical point of view, simple aggregation ignores effects such as the necessity of fulfilling certain criteria and interactions among criteria (redundancy, synergy).
We therefore present in this talk a general approach for aggregating sub-criteria based on the so-called Choquet integral, which is defined with respect to a fuzzy measure modelling interaction effects. We introduce the Choquet integral as a general tool for multiple-criteria decision making. After a theoretical exposition laying out the fundamentals of fuzzy measures, practical problems arising in the assessment of suppliers are addressed. In particular, the problem of determining useful and suitable fuzzy measures is discussed.
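As a minimal illustration, the discrete Choquet integral can be computed directly from its definition (the supplier criteria, scores, and fuzzy measure values below are hypothetical, not taken from the case study):

```python
def choquet(x, mu):
    """Discrete Choquet integral of scores x with respect to fuzzy measure mu.

    x:  dict criterion -> score
    mu: dict frozenset of criteria -> measure value (mu[empty set] = 0, monotone)
    """
    order = sorted(x, key=lambda c: x[c])       # criteria by increasing score
    total, prev = 0.0, 0.0
    for i, c in enumerate(order):
        coalition = frozenset(order[i:])        # criteria scoring at least x[c]
        total += (x[c] - prev) * mu[coalition]
        prev = x[c]
    return total

# hypothetical supplier scores and a non-additive measure:
# mu is slightly super-additive here (1.0 > 0.5 + 0.4), modelling synergy
# between product quality and delivery performance.
scores = {"quality": 0.8, "delivery": 0.6}
mu = {frozenset(): 0.0,
      frozenset({"quality"}): 0.5,
      frozenset({"delivery"}): 0.4,
      frozenset({"quality", "delivery"}): 1.0}
print(choquet(scores, mu))  # 0.7
```

With an additive measure the Choquet integral reduces to the weighted arithmetic mean; the non-additive values are what allow redundancy and synergy among criteria to be modelled.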
We close our talk with a case study from supplier management.

I propose to host an open problem session (as I did at ENBIS2014 in Antwerp). I would need a 60-90 minute time slot and would issue a call for open problems in August. During the session, we could treat a maximum of three problems.

The way such a session runs is as follows:
Problems are submitted either before or during the conference. Time permitting, they can even be taken from the audience during the session (as was done in Antwerp). The person who volunteers a problem then has 10-12 minutes to explain it. After that, the session proceeds in rounds of clarification and, hopefully, useful suggestions on how to treat the problem.

Approximate Bayesian Computation Design with an Application to Spatial Extremes

Simulation-based optimal design techniques are a convenient tool for solving optimal design problems with potentially complex model structures. The goal is to find the optimal configuration of factor settings with respect to an expected utility criterion. In addition, we assume that the model likelihood function is intractable but that sampling from the probability model is possible. Therefore, we utilize approximate Bayesian computation (ABC) methods to estimate the expected utility criterion.

The methodology is applied to find the optimal design for a network of temperature measuring stations when the main interest is in inference for extreme events. ABC methods are applied because the likelihood function of the multivariate extreme value distribution (Schlather model) is not available in closed form for dimensions greater than two.
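A minimal sketch of the ABC rejection step on a toy Gaussian location model (the Schlather model, the spatial design, and the utility criterion of the talk are not reproduced here; the prior range, tolerance, and sample sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
y_obs = rng.normal(2.0, 1.0, size=50)   # "observed" data, true location 2

def simulate(theta, n=50):
    # sampling from the probability model is assumed possible,
    # even when the likelihood is intractable
    return rng.normal(theta, 1.0, size=n)

# ABC rejection: draw theta from the prior, simulate data,
# keep theta if the simulated summary statistic is close to the observed one
prior = rng.uniform(-5.0, 5.0, size=20000)
s_obs = y_obs.mean()
accepted = [th for th in prior if abs(simulate(th).mean() - s_obs) < 0.1]
post_mean = np.mean(accepted)
```

In the design context, such an accepted sample approximates the posterior under a candidate configuration of factor settings, from which the expected utility criterion is estimated and then optimized over designs.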

Practical Issues in Using Nonlinear Regression Analysis for Short-Term Prediction of High Water Levels

After the 1953 flood in the South-West Netherlands, which killed almost 2000 people, the Dutch government initiated the Delta Plan, a large-scale protection system consisting of dikes and dams. An important part of the Delta Plan is the Eastern Scheldt Storm Surge Barrier, a bridge-like construction that can be closed when water levels become too high.
When the water level is above 2.75 m, a Decision Team is physically present at the barrier to close it at the right moment. To keep the flooding risk at acceptable levels, the barrier closes automatically and takes over full control when the predicted water level reaches exactly 3 m. To avoid environmental damage, the barrier is not allowed to close at predicted water levels below 3 m. It is thus important for the Decision Team to avoid unnecessary closures of the barrier as well as to avoid the barrier taking over full control.

The Decision Team has access to several long-term water level predictions. As an addition to these existing predictions, we are developing short-term prediction models. Part of these models is a nonlinear regression model with a quadratic polynomial for the trend and various sine waves for the oscillations. Surprisingly, straightforward nonlinear regression applied to water levels measured every 10 seconds during the past 30 minutes failed because of singular gradients. In this talk we show the approaches we use to overcome these numerical problems, including scaling of the variables involved, using the variable projection algorithm of Golub and Pereyra to exploit the fact that the model has conditionally linear parameters, and the systematic use of different starting points for the nonlinear regression.
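The combination of time scaling, variable projection (solving the conditionally linear parameters by least squares for each fixed frequency), and a systematic sweep of starting values can be sketched on simulated data (the model coefficients, single frequency, and noise level below are hypothetical, not the actual barrier data):

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(180) * 10.0              # 30 minutes of 10-second water level samples
ts = (t - t.mean()) / t.std()          # scale time to avoid near-singular gradients

# simulated water levels: quadratic trend plus one oscillation
omega_true = 3.0
y = (0.5 + 0.2 * ts + 0.1 * ts**2
     + 0.4 * np.sin(omega_true * ts)
     + rng.normal(0.0, 0.01, ts.size))

def rss(omega):
    # variable projection: for a fixed frequency, the intercept, trend,
    # and sine/cosine amplitudes are conditionally linear, so they are
    # eliminated by an ordinary least squares solve
    A = np.column_stack([np.ones_like(ts), ts, ts**2,
                         np.sin(omega * ts), np.cos(omega * ts)])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return r @ r

# systematic sweep of starting values for the only nonlinear parameter
grid = np.linspace(0.5, 10.0, 400)
omega_hat = grid[np.argmin([rss(w) for w in grid])]
```

In practice the grid value would serve as the starting point for a local nonlinear least squares refinement; the point of the sketch is that only the frequency remains a nonlinear parameter after projection.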