Gloss is a critical issue in many applications in the coating industry. Gloss depends on the optical and rheological properties of complex mixtures, and estimating gloss from basic properties remains a challenge. In order to predict the gloss of an industrial thickened-to-application formulation, this work presents a gloss-rheology semi-empirical modeling approach based on a gloss excess function and previous work from other authors. A new matt (low gloss) hybrid waterborne polyurethane dispersion composed of a self-matting agent (A) and a traditional silica-based matting agent (B) has been studied, and the resulting gloss of the mixture has been correlated to pure-component gloss values and dynamic viscosity at medium shear rate. Several modeling options have been tested and their goodness of fit has been determined. The most promising options have been selected and validated against untrained data sets.

Carbon capture and storage (CCS) and carbon capture and utilisation (CCU) are acknowledged as important R&D priorities to achieve the environmental goals set for the next decades. This work studies biomass-based energy supply chains with CO2 capture and utilisation. The problem is formulated as a mixed-integer linear program. This study presents a flexible supply chain superstructure to address the economic and environmental benefits achievable by integrating biomass-coal plants with CO2 capture and utilisation plants, covering decisions such as the location of intermediate steps, the fraction of CO2 emissions captured per plant, and the size of CO2 utilisation plants, among others. Moreover, possible incentives and environmental revenues are discussed as means to make the project economically feasible. A large-size case study located in Spain is presented to highlight the proposed approach. Two key scenarios are envisaged: (i) neither biomass nor capture or utilisation of CO2 is contemplated; (ii) biomass, capture and CO2 utilisation are all considered. Finally, concluding remarks are drawn.

Sulfaquinoxaline (SQX) is an antimicrobial of the sulfonamide class. Usually employed in veterinary medicine, this contaminant of emerging concern has been found in surface water and groundwater, and its consequences for the environment and human health are not completely known. In this study, the degradation of SQX (C₀ = 500 µg L⁻¹, 1 L) by an ozonation process at pH 3, 7, and 11 was evaluated. Ozonation was effective in degrading SQX: efficiency exceeding 99% was obtained applying an ozone dose of 2.8 mg L⁻¹ at pH 3. Assays were performed according to a 2² design of experiments (DOE) with star points and three central points for statistical validity. Minimum and maximum levels were set at 3 and 11 for pH, and 0 and 11.5 mg L⁻¹ for the applied ozone dose. There was no significant interaction between these variables, and the pH value played the most important role in terms of contaminant degradation. Regarding toxicity, samples ozonated at pH 3 did not inhibit the luminescence of the bacteria, even though different intermediates were formed and identified by mass spectrometry. At pH 7, the inhibition of luminescence remained almost constant (around 30%) regardless of ozonation time or ozone dose. However, the hydroxyl radical, the major oxidant at pH 11, was responsible for the formation of toxic intermediates.

Microgrids are energy systems that can work independently from the main grid in a stable and self-sustainable way. They rely on energy management systems to optimally schedule the distributed energy resources. Conventionally, the main research in this field has focused on scheduling problems applicable to specific case studies rather than on generic architectures that can deal with the uncertainties of renewable energy sources. This paper contributes the design and experimental validation of an adaptable energy management system implemented in an online scheme, as well as an evaluation framework for quantitatively assessing the enhancement attained by different online energy management strategies. The proposed architecture allows the interaction of measurement, forecasting and optimization modules, in which a generic generation-side mathematical problem is modeled, aiming to minimize operating costs and load disconnections. The whole energy management system has been tested experimentally in a test bench under both grid-connected and islanded modes. Also, its performance has been proved under severe mismatches between forecast and actual generation and load. Several experimental results have demonstrated the effectiveness of the proposed EMS, assessed by the corresponding average gap with respect to a selected benchmark strategy and the ideal boundaries of the best and worst known solutions.

Microgrids are energy systems that aggregate distributed energy resources, loads, and power electronics devices in a stable and balanced way. They rely on energy management systems to optimally schedule the distributed energy resources. Conventionally, many scheduling problems have been solved by using complex algorithms that, even so, do not consider the operation of the distributed energy resources. This paper presents the modeling and design of a modular energy management system and its integration into a grid-connected battery-based microgrid. The scheduling model is a power generation-side strategy, defined as a general mixed-integer linear program that takes into account two stages for the proper charging of the storage units. The model is treated as a deterministic problem that aims to minimize operating costs and promote self-consumption based on 24-hour-ahead forecast data. The operation of the microgrid is complemented with a supervisory control stage that compensates for any mismatch between the offline scheduling process and the real-time microgrid operation. The proposal has been tested experimentally in a hybrid microgrid at the Microgrid Research Laboratory, Aalborg University.
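
To make the flavor of such a generation-side scheduling model concrete, the sketch below formulates a minimal day-ahead battery dispatch as a mixed-integer linear program in Python with PuLP. All parameter values, and the single-battery structure itself, are illustrative assumptions rather than the paper's model; the binary variable reproduces the common trick of forbidding simultaneous charging and discharging.

```python
# Minimal day-ahead battery scheduling MILP (illustrative, not the paper's model).
import pulp

T = 24                                              # hourly periods, 24 h ahead
price = [0.10] * 8 + [0.25] * 12 + [0.10] * 4       # assumed grid tariff (EUR/kWh)
pv    = [0]*6 + [1, 2, 3, 4, 5, 5, 5, 4, 3, 2, 1, 0] + [0]*6   # assumed PV forecast (kW)
load  = [2.0] * T                                   # assumed flat demand (kW)
cap, p_max, eta = 10.0, 3.0, 0.95                   # battery kWh, kW, efficiency

m = pulp.LpProblem("day_ahead_schedule", pulp.LpMinimize)
grid = pulp.LpVariable.dicts("grid", range(T), lowBound=0)            # import (kW)
ch   = pulp.LpVariable.dicts("ch",   range(T), lowBound=0, upBound=p_max)
dis  = pulp.LpVariable.dicts("dis",  range(T), lowBound=0, upBound=p_max)
soc  = pulp.LpVariable.dicts("soc",  range(T), lowBound=0, upBound=cap)
b    = pulp.LpVariable.dicts("b",    range(T), cat="Binary")          # 1 = charging

m += pulp.lpSum(price[t] * grid[t] for t in range(T))                 # operating cost
for t in range(T):
    m += grid[t] + pv[t] + dis[t] == load[t] + ch[t]                  # power balance
    m += ch[t] <= p_max * b[t]                 # binaries forbid charging and
    m += dis[t] <= p_max * (1 - b[t])          # discharging in the same hour
    prev = soc[t - 1] if t else 0.5 * cap      # assumed 50% initial state of charge
    m += soc[t] == prev + eta * ch[t] - dis[t] / eta                  # SOC dynamics

m.solve()
print("daily cost:", pulp.value(m.objective))
```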

Volatile methyl siloxanes (VMS) were evaluated during 2013–2015 in ten Catalan urban areas with different industrial impacts, including the petrochemical industry, electrical and mechanical equipment, metallurgical and chemical industries, a municipal solid waste treatment plant, and cement and food industries. 24-h samples were taken with LCMA-UPC pump samplers specially designed in our laboratory, with a flow rate of 70 ml min⁻¹. A sorbent-based sampling method, successfully developed to collect a wide range of VOC, was used. The analysis was performed by automatic thermal desorption coupled with capillary gas chromatography/mass spectrometry. The presented methodology allows the evaluation of VMS together with a wide range of other VOC, increasing the number of compounds that can be determined in outdoor air quality assessments of urban areas. This aspect is especially relevant since the European Chemicals Agency has restricted several VMS (D4 and D5) in consumer products and the US EPA is evaluating the inclusion of D4 in the Toxic Substances Control Act, owing to concern about the possible effects of these compounds on human health and the environment. ΣVMS concentrations (L2–L5, D3–D6 and trimethylsilanol) varied between 0.3 ± 0.2 µg m⁻³ and 18 ± 12 µg m⁻³, the latter determined in a hotspot area. Observed VMS concentrations were generally of the same order of magnitude as those previously determined in the Barcelona, Chicago and Zurich urban areas, but higher than those published for suburban sites and Arctic locations. Cyclic siloxane concentrations were up to two to three orders of magnitude higher than those of linear siloxanes, accounting for average contributions to the total concentrations of 97 ± 6% for all samples except for the hotspot area, where cyclic VMS accounted for 99.9 ± 0.1%. D5 was the most abundant siloxane in 5 sampling points; however, differing from what has generally been observed in previous studies, D3 was the most abundant compound in the other 5 sampling points.

Sustainable processes have recently attracted increasing interest in the process systems engineering literature. In industry, this kind of problem inevitably requires a multi-objective analysis to evaluate the environmental impact in addition to the economic performance. Bio-based processes have the potential to enhance the sustainability of the energy sector. Nevertheless, such processes very often show variable conditions and present an uncertain behavior. The approaches presented so far for solving multi-objective problems under uncertainty have neglected the potential effects of streams of different quality on the overall system. Here, an alternative approach is presented, based on a State Task Network formulation, capable of optimizing under uncertain conditions, considering multiple selection criteria and accounting for the material quality effect. The resulting set of Pareto solutions is then assessed using the Elimination and Choice Expressing Reality-IV (ELECTRE IV) method, which identifies the ones showing better overall performance considering the uncertain parameter space.

Optimization under uncertainty has recently attracted increasing interest in the process systems engineering literature. The inclusion of uncertainties in an optimization problem inevitably leads to the need to manage the associated risk in order to control the variability of the objective function in the uncertain parameter space. So far, risk management methods have focused on optimizing a single risk metric along with the expected performance. In this work we propose an alternative approach that can handle several risk metrics simultaneously. First, a multi-objective stochastic model containing a set of risk metrics is formulated. This model is then solved efficiently using a tailored decomposition strategy inspired by the Sample Average Approximation. After a normalization step, the resulting solutions are assessed using Pareto filters, which identify solutions showing better performance in the uncertain parameter space. The capabilities and benefits of our approach are illustrated through a design and planning supply chain case study.
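
As an aside, the Pareto-filter step mentioned above boils down to discarding dominated solutions. A minimal sketch in Python/NumPy, assuming all objectives (e.g. expected cost and the risk metrics) are to be minimized:

```python
# Minimal Pareto filter: keep only non-dominated solutions (all objectives minimized).
import numpy as np

def pareto_filter(F):
    """F: (n_solutions, n_objectives) array; returns mask of non-dominated rows."""
    keep = np.ones(F.shape[0], dtype=bool)
    for i in range(F.shape[0]):
        # row j dominates row i if j is <= everywhere and strictly < somewhere
        dominates_i = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        keep[i] = not dominates_i.any()
    return keep

# Example with three objectives, e.g. expected cost and two risk metrics
F = np.array([[1.0, 2.0, 3.0],
              [0.9, 2.5, 3.1],
              [1.1, 2.1, 3.2]])   # last row is dominated by the first
print(pareto_filter(F))           # -> [ True  True False]
```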

In this work, an integrated Game Theory (GT) approach is developed for the coordination of multi-enterprise Supply Chains (SCs) in a competitive uncertain environment. The conflicting goals of the different participants are resolved through coordination contracts using a non-cooperative non-zero-sum Stackelberg game under the leadership of the manufacturer. The Stackelberg payoff matrix is built under nominal conditions and then evaluated under different probable uncertain scenarios using a Monte Carlo simulation. The competition between the Stackelberg game players and the third parties is solved through a Nash Equilibrium game. A novel way to analyze the game outcome is proposed based on a win–win Stackelberg set of “Pareto frontiers”. The benefits of the resulting MINLP tactical models are illustrated by a case study with different vendors around a client SC. The results show that the coordinated decisions lead to higher expected payoffs compared to the standalone case, while also leading to uncertainty reduction.
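
For intuition, the sketch below solves a toy discrete Stackelberg game by enumeration: the leader picks the action whose follower best response yields the leader the highest payoff. The payoffs are invented for illustration; in the paper, each payoff cell would come from an MINLP tactical model and would be averaged over Monte Carlo scenarios.

```python
# Stackelberg equilibrium by enumeration over a discrete payoff bimatrix
# (a sketch with illustrative payoffs, not the case-study MINLP).
import numpy as np

# payoff[i, j]: leader action i (rows), follower action j (columns)
leader_pay = np.array([[5.0, 2.0],
                       [4.0, 3.0]])
follower_pay = np.array([[1.0, 4.0],
                         [3.0, 2.0]])

best = None
for i in range(leader_pay.shape[0]):
    j = int(np.argmax(follower_pay[i]))          # follower's best response to i
    if best is None or leader_pay[i, j] > best[2]:
        best = (i, j, leader_pay[i, j])          # leader anticipates the response
print("leader action %d, follower response %d, leader payoff %.1f" % best)
```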

Degradation of bisphenol A (BPA, 0.5 L, 30 mg L⁻¹) was studied by photo-Fenton treatment, with the Fenton reagent doses as variables. The efficiency of the degradation process was evaluated through the reduction of total organic carbon (TOC), the biochemical oxygen demand (BOD), and toxicity. For toxicity analysis, bacterial methods were found infeasible, but an in vitro assay with VERO cell cultures was successfully applied. Experiments according to a 2² design of experiments (DOE) with star points and three center points for statistical validity allowed selecting the process conditions (Fe(II) and H2O2 load) that maximized the process performance. The photo-Fenton process effectively eliminated BPA and partly degraded its by-products (residual TOC < 15%) under a substoichiometric H2O2 dose (100.62 mg L⁻¹) and at least 4 mg L⁻¹ Fe(II), after a 90-min treatment. All treated samples were at least partially biodegradable. The cytotoxic concentration (LD50) of BPA for VERO cells was 7 mg L⁻¹. With a small H2O2 amount (15.24 mg L⁻¹), only low BPA mineralization (residual TOC = 92%) was attained, and toxicity corresponding to 50% cell mortality was still detected even at long reaction times. However, 40.25 mg L⁻¹ of H2O2 decreased the residual TOC to 70% while cell mortality decreased to 25%. With more H2O2, the residual TOC decreased to 15% but cell mortality remained within the 20–25% level. Photo-Fenton increased the biodegradability and reduced the toxicity of the studied sample.

Standardization is essential for automation. Extensibility, scalability, and reusability are important features for automation software and rely on the efficient modelling of the addressed systems. The work presented here stems from the ongoing development of a methodology for semi-automatic ontology construction from technical documents. The main aim of this work is to systematically check the consistency of technical documents and to support the improvement of their consistency. The formalization of conceptual models and the subsequent writing of technical standards are simultaneously analyzed, and guidelines are proposed for application to future technical standards. Three paradigms are discussed for the development of domain ontologies from technical documents, starting from the current state of the art, continuing with the intermediate method presented and used in this paper, and ending with the paradigm suggested for the future. The ISA88 Standard is taken as a representative case study. Linguistic techniques from the semi-automatic ontology construction methodology are applied to the ISA88 Standard, and different modelling and standardization aspects that are worth sharing with the automation community are addressed. This study discusses different paradigms for developing and sharing conceptual models for the subsequent development of automation software, along with presenting the systematic consistency checking method.

This paper introduces an optimization-based approach for the simultaneous solution of batch process synthesis and plant allocation, involving decisions such as the selection of chemicals, process stages, task-unit assignments, operating modes, and optimal control profiles, among others. The modeling strategy is based on the representation of structural alternatives in a state-equipment network (SEN) and its formulation as a mixed-logic dynamic optimization (MLDO) problem. In particular, the disjunctive multistage modeling strategy of Oldenburg and Marquardt (2008) is extended to combine and organize single-stage and multistage models, representing the sequence of continuous and batch units in each structural alternative and synchronizing dynamic profiles in input and output operations with material transfer. Two numerical examples illustrate the application of the proposed methodology, showing the enhanced adaptability potential of batch plants and the improvement of global process performance thanks to the quantification of interactions between process synthesis and plant allocation decisions.

Production scheduling becomes a highly complex task in the process industry, especially when immediate decisions must be made in response to unexpected situations, since most industrial systems exhibit nonlinear behavior and many uncertain parameters (UP). This paper presents an industrial case study where immediate reactive production scheduling is required to control unexpected changes in the plant emissions profile associated with the existence of bounded UP. Since only a very time-consuming Simulation-Based Optimization (SBO) system of the plant is available to find the new optimal schedule that minimizes the emission peaks, the proposed solution relies on Meta-Multiparametric (M-MP) techniques, which use the SBO system offline to create a model that, given any initially unknown situation, describes the changes to be introduced in the system to limit these emission peaks. The results show that the M-MP techniques used are able to model the optimal input-output relations behind the SBO data and that, in contrast to existing similar procedures, both continuous and discrete UP (such as batch sequence information) can be considered.
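
The core mechanic, stripped of the multiparametric machinery, is fitting an offline surrogate to input-output pairs generated by the expensive SBO system, so new situations can be answered online in a fraction of a second. A minimal sketch, using Gaussian-process regression as a stand-in for the M-MP metamodel and a synthetic function in place of the SBO system:

```python
# Offline surrogate of an expensive simulation-based optimizer (illustrative).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(50, 3))        # uncertain parameters sampled offline
y = np.sin(X[:, 0]) + 2 * X[:, 1] - X[:, 2] ** 2   # stand-in for the SBO's optimal answer

surrogate = GaussianProcessRegressor().fit(X, y)   # offline training

x_new = np.array([[0.3, 0.7, 0.1]])        # an "initially unknown" situation
change, std = surrogate.predict(x_new, return_std=True)
print(change, std)                         # fast online answer, with uncertainty
```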

Emission factors of formaldehyde and VOCs were determined for two waste treatment plants (WTP) located in the metropolitan area of Barcelona. Formaldehyde emission factors were determined from the biogas engine exhausts and the process chimneys (after the biofilter process), and VOC emission factors were determined in the process chimneys. Formaldehyde and VOC were dynamically sampled using DNPH-coated adsorbent tubes with an ozone scrubber and multi-sorbent bed tubes (Carbotrap, Carbopack X and Carboxen 569), respectively, using portable pump equipment. Formaldehyde emission factors from biogas engines were found to be between 0.001 and 0.04 g s⁻¹. Additionally, formaldehyde and VOC emission factors from process chimneys were found to be between 0.0002 and 0.003 g s⁻¹ and 0.9 ± 0.3 g s⁻¹, respectively. Employing real emission factors, the expected concentrations derived from the WTPs in their nearby urban areas were calculated using The Atmospheric Pollution Model (TAPM, CSIRO), and impact maps were generated. On the other hand, ambient air formaldehyde and VOC concentrations were determined at selected locations close to the evaluated waste treatment facilities using both active and passive samplers, and were between 2.5 ± 0.4–5.9 ± 1.0 µg m⁻³ and 91 ± 48–242 ± 121 µg m⁻³, respectively. The concentrations of formaldehyde and VOC derived exclusively from the waste treatment plants were around 2% and 0.3 ± 0.9% of the total formaldehyde and VOC concentrations found in ambient air, respectively.

Sustainable processing has recently attracted increasing interest in the process systems engineering literature. From a practical perspective, addressing sustainability inevitably requires a multi-objective analysis and optimization to evaluate the environmental impact in addition to the economic performance. Bio-based processes are typically used to promote sustainability, although the quality of the biomass is usually variable and presents an uncertain behavior. However, the approaches presented so far to address multi-objective problems under uncertainty have focused on simplifying the optimization model, reducing the set of feasible choices and neglecting the potential effects of quality variations in the streams on the overall system performance. We present here an alternative approach, based on the State Task Network formulation, capable of addressing multi-objective optimization problems under uncertain conditions while including the material quality effect. The resulting set of solutions is then assessed using the ELECTRE IV method, which identifies the ones showing better overall performance within the uncertain parameter space.

Current Supply Chain (SC) decision support tools are devoted to the optimization of one objective function from a biased, centralized perspective, without considering the coexistence with other participants and their uncertain conditions.
Towards multi-enterprise-wide coordination (M-EWC), the aim of this thesis is to contribute to PSE with new decision support tools able to achieve global coordination/collaboration among all organizations participating in large-scale chemical SC systems. Different approaches are proposed in this thesis to capture the interaction between enterprises with contrasting and competitive goals, considering the role of the uncertainty of each participant in the decision-making of the other participants.
The first contribution of this thesis is to achieve global coordination with the supporting enterprises (third parties). A global coordination framework is developed to integrate the third parties' decisions as full SCs in the decision-making process of a large-scale multi-product SC with multiple echelons. A generic coordinated tactical model is developed to highlight the interaction between the SC of interest and the third-party SCs. A comparison is undertaken between the proposed coordinated approach and the traditional non-coordinated approach to highlight its potential impact on tactical decision-making and resource consumption.
The second contribution of this thesis is to highlight the role of the third parties as business partners, through pricing, in the economic performance of the globally coordinated system. The coordination framework proposed above is extended to integrate the third parties' price policies as degrees of freedom in the coordinated SC tactical model.
Third, different approaches to approximate the variations of the third parties' price policies are proposed and compared, based on average and discounting trends built on demand elasticity theory. The consequences of using the proposed pricing approximation models on the global coordination, regarding not only their effect on the tactical decisions but also the additional complexity of the mathematical formulations to be solved, are analyzed and compared.
Fourth, the inter-organizational coordination between the interacting enterprises is achieved based on cooperative and non-cooperative systems. The decisions obtained from the cooperative, non-cooperative, and standalone systems are analyzed and compared.
Fifth, a non-cooperative non-zero-sum Scenario-Based Dynamic Negotiation (SBDN) approach, with non-symmetrical roles, is proposed to set the best conditions for the coordination/collaboration contracts between enterprises with contrasting objectives participating as full SCs in a large-scale multi-enterprise multi-product SC. Under the leadership of the manufacturer, the uncertain reaction of the follower partner, resulting from the uncertain behavior of its SC third parties, is considered in the leader model as a probability of acceptance.
Sixth, an evaluation methodology is proposed to evaluate the negotiation outcome based on probability distributions, taking into consideration the variability of the follower's profits across successful scenarios and its risk behavior.
The final contribution of this thesis is the development of an integrated game theory approach for inter-organizational coordination. A non-cooperative non-zero-sum Stackelberg game is developed to capture the contrasting goals under different uncertain and competitive circumstances. The competition between the Stackelberg game players and the counterpart third parties is modeled through a Nash Equilibrium game. Later in this document, a Stackelberg set of Pareto frontiers is obtained, where each point corresponds to a possible expected win-win coordination contract.
Finally, a comparison is undertaken between the coordination contracts resulting from this integrated game theory approach and those from the SBDN approach under the same circumstances.

This paper presents an unsupervised data-driven method for Fault Detection and Diagnosis (FDD) of nonlinear dynamic processes. The proposed approach is based on the combination of automatic and non-automatic clustering techniques with a data-driven observer based on Multivariate Dynamic Kriging (MDK) metamodels. The proposed framework is studied via its application to a well-known benchmark simulation case study based on the control of a three-tank system, showing promising performance in terms of accuracy, robustness and simplicity of application.

Sulfaquinoxaline (SQX) is an antimicrobial that has been detected in environmental water samples, and its side effects on the environment and human health are still unknown. To the best of our knowledge, the degradation of SQX using Fenton and photo-Fenton processes has not been reported previously. In this study, SQX removal and mineralization were evaluated at laboratory and pilot plant scales. The residual antimicrobial activity was monitored using Escherichia coli and Staphylococcus aureus bacteria. Experimental design was used to assess the influence of the presence or absence of a radiation source, as well as of different H2O2 concentrations, during the degradation. The assays carried out at the laboratory scale (2 L) were reproduced at the larger pilot plant scale (15 L) using the same experimental conditions. In both cases, over 90% of the initial SQX concentration (25 mg L⁻¹) was removed from the solutions. The degradation rate obtained in the pilot plant was between two and four times higher than that obtained at the laboratory scale. Models are proposed for the removal and mineralization of SQX by Fenton and photo-Fenton processes. At the laboratory scale, the only significant variable affecting SQX mineralization was the presence or absence of a radiation source. For both scales, the best condition was the photo-Fenton process using 178 mg L⁻¹ H2O2, for which the residual toxicity inhibited bacterial growth by <13%. The results obtained with the pilot plant were similar to those obtained at the laboratory scale.

A novel scenario-based dynamic negotiation (SBDN) approach is proposed for the coordination of decentralized supply chains under uncertainty. The relations between the involved organizations (client, provider and third parties) and their respective conflicting objectives are captured through a non-zero-sum negotiation with non-symmetric roles. The client (leader) designs coordination agreements considering the uncertain reaction of the provider (follower) resulting from the uncertain nature of the third parties, which is modeled as a probability-of-acceptance function. Different negotiation scenarios are studied: (i) cooperative, (ii) non-cooperative, and (iii) standalone cases. The use of the resulting models is illustrated through a case study with different vendors around a "leader" (client) in a decentralized scenario. Although the usual cooperation hypothesis allows higher overall profit expectations, using the proposed approach it is possible to identify non-cooperative scenarios with high individual profit expectations which are more likely to be accepted by all individual partners.

The integration of decision-making procedures typically assigned to different hierarchical levels in a production system (strategic, tactical, and operational) requires the use of complex multi-scale mathematical models and high computational effort, in addition to an extensive management of data and knowledge within the production system. The aim of this study is to propose a comprehensive solution to this integration problem through the use of Conceptual-Constraints. The presented methodology is based on a domain-ontology model and the use of generalized concepts to develop tailor-made decision-making models, created according to the introduced data. Different decision-making formulations are reviewed and, accordingly, comprehensive Conceptual-Constraints for the different concepts (such as material balances) can be determined. This work shows how these Conceptual-Constraints can be used when the quality of the information changes, enabling multi-scale implementations.

Developing data-driven fault detection systems for chemical plants requires managing uncertain data labels and dynamic attributes due to operator-process interactions. Mislabeled data is a known problem in computer science that has received scarce attention from the process systems community. This work introduces and examines the effects of operator actions on records and labels, and the consequences for the development of detection models. Using a state-space model, this work proposes an iterative relabeling scheme for retraining classifiers that continuously refines dynamic attributes and labels. Three case studies are presented: a reactor as a motivating example, flooding in a simulated de-Butanizer column as a complex case, and foaming in an absorber as an industrial challenge. For the first case, detection accuracy is shown to increase by 14% while operating costs are reduced by 20%. Moreover, for the de-Butanizer column, the performance of the proposed strategy is shown to be 10% higher than that of the filtering strategy. Promising results are finally reported with regard to efficient strategies for dealing with the presented problem.
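
To illustrate the general idea of iterative relabeling (not the paper's exact state-space scheme), the sketch below retrains a classifier and flips the labels of samples it contradicts with high confidence, until no suspect labels remain; logistic regression and the 0.9 threshold are arbitrary stand-ins.

```python
# Iterative-relabeling sketch: retrain, flip high-confidence disagreements, repeat.
import numpy as np
from sklearn.linear_model import LogisticRegression

def iterative_relabel(X, y, n_iter=5, thresh=0.9):
    y = y.copy()
    for _ in range(n_iter):
        clf = LogisticRegression(max_iter=1000).fit(X, y)
        proba = clf.predict_proba(X)
        pred = clf.predict(X)
        confident = proba.max(axis=1) > thresh     # high-confidence predictions
        flip = confident & (pred != y)             # suspected mislabeled samples
        if not flip.any():
            break
        y[flip] = pred[flip]                       # relabel and retrain
    return clf, y

# Demo: 5% of labels flipped at random on a separable toy problem
rng = np.random.default_rng(0)
X = rng.standard_normal((400, 2))
y_clean = (X[:, 0] + X[:, 1] > 0).astype(int)
y_noisy = y_clean.copy()
y_noisy[rng.choice(400, 20, replace=False)] ^= 1
clf, y_fixed = iterative_relabel(X, y_noisy)
print("labels matching ground truth:", (y_fixed == y_clean).mean())
```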

This work addresses the optimization of supply and distribution chains considering the effect that equipment aging can have on the performance of the facilities involved in the process. The decaying performance of the facilities is modeled as an exponential equation, thus giving rise to a novel MINLP formulation. The capabilities of the proposed approach have been tested through its application to a case study considering a simplification of the Spanish natural gas network. Results demonstrate that overlooking the effect of equipment aging can lead to solutions that are infeasible in practice, and show how the proposed model overcomes such limitations, thus becoming a practical tool to support the decision-making process in the distribution sector.
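
The abstract does not spell out the exponential equation; a plausible form, written here purely for illustration, lets the efficiency of facility f decay from its nameplate value at a facility-specific rate:

```latex
% Assumed exponential-decay form for facility performance (illustrative only):
% \eta_f^{0} is the nameplate efficiency and \lambda_f an aging rate.
\eta_f(t) = \eta_f^{0}\, e^{-\lambda_f t}, \qquad t \ge 0
```

Multiplying capacities by a factor of this kind is the sort of nonlinearity that turns an otherwise linear network design model into an MINLP.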

The sulfaquinoxaline antimicrobial has been detected in environmental samples and its side effects are still unknown. The degradation of sulfaquinoxaline sodium by Fenton and photo-Fenton processes has not been reported before. Sulfaquinoxaline degradation, together with the evaluation of mineralization and toxicity reduction, was studied at laboratory scale and afterwards tested in a pilot plant. The results obtained using a small reactor were reproducible at the larger scale, and degradation of more than 99% of the initial sulfaquinoxaline concentration (25 mg L⁻¹) was reached. No antimicrobial activity against Staphylococcus aureus and Escherichia coli was observed when mineralization was higher than 50% in the pilot plant.

This paper presents a hybrid approach to enhance the performance of the data-based Pattern Classification Techniques (PCTs) used for Fault Detection and Diagnosis (FDD) of nonlinear dynamic noisy processes. The method combines kriging metamodels with PCTs (e.g. Support Vector Machines). The metamodels are used in two different ways: first, as Multivariate Dynamic Krigings (MDKs), which estimate the process dynamic behavior/outputs; second, as classical static models, which are used for smoothing noise and imputing missing values in the actual process output measurements. During process operation, the estimated and the smoothed actual outputs are compared, and residual/error signals are generated that are used by the classifier to detect and diagnose possible process faults. The method is applied to a benchmark case study, showing a significant enhancement of such PCTs due to the introduction of process dynamics information via the MDKs, and to the smoothing of noise and imputation of missing measurements using the static kriging.
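
A compact sketch of the residual-generation idea follows, with scikit-learn's GaussianProcessRegressor standing in for the kriging metamodel and a synthetic sensor-bias fault; none of the data or model choices come from the paper.

```python
# Residual-based FDD sketch: a kriging-style metamodel predicts expected outputs,
# and an SVM classifies the residuals between predictions and measurements.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.svm import SVC

rng = np.random.default_rng(1)
U = rng.uniform(0, 1, (200, 2))                    # process inputs
y_true = np.sin(3 * U[:, 0]) + U[:, 1]             # nominal process output
y_meas = y_true + 0.05 * rng.standard_normal(200)  # noisy measurements

# Nominal model fitted on fault-free data (alpha adds jitter for noisy targets)
model = GaussianProcessRegressor(alpha=1e-3).fit(U[:100], y_meas[:100])

fault = np.zeros(100, dtype=int)                   # labels for the test half
fault[50:] = 1
y_meas[150:] += 0.5                                # inject a sensor bias fault

residual = (y_meas[100:] - model.predict(U[100:])).reshape(-1, 1)
clf = SVC().fit(residual, fault)                   # classify faults from residuals
print("accuracy:", (clf.predict(residual) == fault).mean())
```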

This work investigates the application of different metamodeling techniques for enhancing the information quality of process history databases, by smoothing the noise/outliers and imputing the missing data that usually contaminate such databases. The information quality enhancements are aimed at improving the training of the data-driven classification techniques used for Fault Detection and Diagnosis (FDD) of the process. A simulation case study of a Continuous Stirred-Tank Reactor (CSTR) is used to produce training datasets containing noise, outliers and missing values. Three metamodeling techniques, namely Ordinary Kriging (OK), Artificial Neural Networks (ANN) and Polynomial Regression (PR), are used to smooth the noise and outliers and to impute the missing values. Next, the FDD performance of the Support Vector Machine (SVM) classifier when trained with the datasets recuperated by the metamodels is compared with its performance when trained with the datasets still containing noise, outliers and missing values. The results show a high enhancement in the performance of the SVM when trained with the data recuperated using the metamodels, especially when OK is exploited.

Traditional methods for Fault Detection and Diagnosis (FDD) usually consider that processes operate under a single steady condition, but for several reasons (e.g., equipment aging), the operating conditions of industrial processes change continuously in practice. Under these new circumstances, the use of the originally tuned FDD system causes false alarms and reduces the fault classification performance. In this study, the Hyperplane-Distance Support Vector Machine (HD-SVM) method is exploited to maintain FDD performance when it decays because of aging. Its effectiveness is shown through simulation studies on a CSTR, for which an aging term is simulated by progressively decreasing the heat transfer coefficient (5%). This aging progressively reduces the classification performance. Next, the performance of HD-SVM, Traditional Incremental Learning (TIL) and Non-Incremental Learning (NIL) (using all data) is compared. HD-SVM incremental learning is shown to reduce the training time of the classifier while increasing its accuracy. Therefore, HD-SVM is shown to overcome the weakness of traditional incremental learning algorithms, namely the loss of potentially useful information, and to improve classification performance in process FDD.
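
The selection rule sketched below captures the hyperplane-distance spirit with scikit-learn: when a new batch arrives, old samples far from the current SVM boundary are dropped and only boundary-relevant ones are kept for retraining. The specific margin threshold and the linear kernel are assumptions, not the published algorithm.

```python
# Hyperplane-distance-style incremental retraining sketch (binary case).
import numpy as np
from sklearn.svm import SVC

def hd_incremental_fit(X_old, y_old, X_new, y_new, clf, margin=1.5):
    # |decision_function| ~ distance to the hyperplane in margin units
    d = np.abs(clf.decision_function(X_old))
    keep = d < margin                               # near-boundary old samples
    X = np.vstack([X_old[keep], X_new])
    y = np.concatenate([y_old[keep], y_new])
    return SVC(kernel="linear").fit(X, y)           # retrain on the reduced set

# Usage: clf = SVC(kernel="linear").fit(X0, y0)
#        clf = hd_incremental_fit(X0, y0, X1, y1, clf)   # on each new batch
```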

This work proposes a Data-Based MultiParametric-Model Predictive Control (DBMP-MPC) methodology, which enables simple implementations of explicit MPC in situations when the deep mathematical knowledge required to develop traditional MP-MPC techniques is not available. Additionally, it can also assist in situations when it is difficult to apply traditional MP-MPC due to the process model complexity or high nonlinearity. The proposed method uses machine learning techniques (Ordinary Kriging (OK), Support Vector Regression (SVR) and Artificial Neural Networks (ANN)), which are trained offline using input-output information. During the online application, the optimal control is calculated through simple interpolations using these multiparametric metamodels, avoiding the need for dynamic optimization. The method is tested with benchmark problems used in the MP-MPC literature. The results show high accuracy and robustness using a simple method, bypassing complex mathematical formulations.
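
The workflow can be sketched in a few lines: metamodels are trained offline on (state, reference) to optimal-control pairs produced by a conventional optimizer, then queried online instead of solving the dynamic optimization. SVR is one of the three metamodels named above; the offline data-generation step is only stubbed with an invented control law here.

```python
# DBMP-MPC-style sketch: offline metamodel training, online lookup of the control.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)
states = rng.uniform(-1, 1, (300, 2))              # (state, reference) samples
u_opt = 0.8 * (states[:, 1] - states[:, 0])        # stand-in for offline MPC solutions

controller = SVR().fit(states, u_opt)              # offline training

x_now, ref = 0.2, 0.6                              # online: current state, setpoint
u = controller.predict([[x_now, ref]])[0]          # "interpolate" the control law
print("control action:", u)                        # no dynamic optimization online
```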

Fault diagnosis (FD) using data-driven methods is essential for monitoring complex process systems, but its performance is severely affected by the quality of the information used. Additionally, processing the huge amounts of data recorded by modern monitoring systems may be complex and time-consuming if no data mining and/or preprocessing methods are employed. Thus, feature selection for FD is advisable in order to determine the optimal subset of features/variables for conducting statistical analyses or building a machine-learning model. In this work, feature selection is formulated as an optimization problem. Several relevancy indices, such as Maximum Relevance (MR), Value Difference Metric (VDM), and Fit Criterion (FC), and redundancy indices, such as Minimum Redundancy (mR), Redundancy VDM (RVDM), and Redundancy Fit Criterion (RFC), are combined to determine the optimal subset of features. Another feature-selection approach is based on the optimal performance of the classifier, which is achieved by a classifier wrapped with a genetic algorithm. The efficiency of this strategy is explored considering different classifiers, namely Support Vector Machine (SVM), Decision Tree (DT), K-Nearest Neighbours (KNN) and Gaussian Naïve Bayes (GNB). A genetic algorithm (GA), as a Derivative-Free Optimization (DFO) technique, has been used due to its robustness in dealing with different kinds of problems. The obtained optimal subset of features has been tested with SVM, DT, KNN, and GNB on the Tennessee-Eastman process benchmark with 19 classes. Results show that, when the performance of the classifier is used as the objective function, the wrapper method obtains the best feature set.
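
A greedy relevance-minus-redundancy selector of the kind combined above can be sketched in a few lines; here mutual information plays both the MR and mR roles, which is one common instantiation rather than the paper's exact indices.

```python
# Greedy mRMR-style feature selection with mutual information (illustrative).
import numpy as np
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression

def mrmr(X, y, k):
    n = X.shape[1]
    relevance = mutual_info_classif(X, y)          # MR: feature-vs-label MI
    selected = [int(np.argmax(relevance))]
    rest = [j for j in range(n) if j != selected[0]]
    while len(selected) < k:
        scores = []
        for j in rest:
            # mR: average MI between candidate j and already-selected features
            red = np.mean([mutual_info_regression(X[:, [s]], X[:, j])[0]
                           for s in selected])
            scores.append(relevance[j] - red)      # relevance minus redundancy
        best = rest[int(np.argmax(scores))]
        selected.append(best)
        rest.remove(best)
    return selected

# Example usage: idx = mrmr(X_train, fault_labels, k=5)
```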

The integration of decision-making procedures usually assigned to different hierarchical levels of production systems requires the use of complex mathematical models and high computational effort, in addition to an extensive management of data and knowledge within the production systems. This work addresses this integration problem and proposes a comprehensive solution approach, as well as guidelines for Computer-Aided Process Engineering (CAPE) tools managing the corresponding cyberinfrastructure. This study presents a methodology based on a domain ontology, which is used as the connector between the introduced data, the different available formulations developed to solve the decision-making problem, and the information necessary to build the finally required problem instance. The methodology has demonstrated its capability to help exploit different available decision-making problem formulations in complex cases, leading to new applications and/or extensions of these formulations in a robust and flexible way.

Recently, energy systems have experienced a change of paradigm, from a large-scale centralized approach to the in-situ exploitation of renewable sources. Special attention has been paid to microgrids, a particular case of distributed generation where consumer nodes include generation and can be either grid-connected or isolated. This work aims to develop a general model to determine the optimal sizing of an energy system under fixed conditions and to analyze the effect of considering different cycle patterns on the solution. The proposed mixed-integer linear programming (MILP) formulation allows determining the best combination of available technologies that satisfies the demand of a given set of scenarios at minimum total cost. The model has been implemented using AIMMS and applied to a case study consisting of a five-member Mediterranean household. The results obtained reveal the need to select the most convenient time cycles for defining the scenarios of the sizing model.

Although design is a problem that has been addressed in the literature, maintaining, upgrading and expanding energy distribution networks along the entire life-cycle is a topic that has received scarce attention. The problem includes considering the long-term dependence of the efficiency of the investments along their life span. This work presents a novel model for the optimization of energy distribution networks considering the decaying performance caused by equipment aging. Increasing maintenance costs have been included to model the decaying performance, thus giving rise to an MINLP formulation. A simplified case study based on a real electricity distribution network has been used as a test bed for the proposed approach. Results show that unrealistic sizing and planning solutions are obtained when the decaying performance is not considered and demonstrate that the proposed MINLP overcomes such limitations.

A generic tactical model is developed considering third party price policies for the optimization of coordinated and centralized multi-product Supply Chains (SCs). To allow a more realistic assessment of these policies in each marketing situation, different price approximation models to estimate these policies are proposed, which are based on the demand elasticity theory, and result in different model implementations (LP, NLP, and MINLP). The consequences of using the proposed models on the SCs coordination, regarding not only their practical impact on the tactical decisions, but also the additional mathematical difficulties to be solved, are verified through a case study in which the coordination of a production–distribution SC and its energy generation SC is analyzed. The results show how the selection of the price approximation model affects the tactical decisions. The average price approximation leads to the worst decisions with a significant difference in the real total cost in comparison with the best piecewise approximation.
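
For reference, the constant-elasticity demand curve underlying such price-approximation models can be written as follows, with notation assumed here rather than taken from the paper, where ε < 0 is the price elasticity of demand:

```latex
% Constant-elasticity demand relation (assumed form, illustrative notation)
\varepsilon = \frac{\partial q / q}{\partial p / p}
\quad\Longrightarrow\quad
q(p) = q_0 \left(\frac{p}{p_0}\right)^{\varepsilon}
```

Plausibly, a fixed average price keeps the tactical model linear (LP), embedding q(p) directly yields the NLP variant, and a piecewise approximation of the curve introduces the binary variables that produce the MINLP implementation mentioned above.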

Nowadays, market trends have made companies modify the way they do business. The current context of globalization, economic crisis, political conditions and competition among enterprises poses a continuous challenge for achieving their Key Performance Indicators. The successful achievement of these indicators depends on the Supply Chain (SC) efficiency. Thus, companies work towards the optimization of their overall SC in response to the competition from other companies, as well as to take advantage of the flexibility in world trade restrictions. This is done by the exploration of any resource flow (including raw materials, intermediate and final products, or energy), as well as any echelon of the supply chain network (such as suppliers, production plants, distribution centres and final markets). At this point, Supply Chain Management addresses the process of planning, implementing and controlling the SC operations in an efficient way, through the management of material, information and financial flows across a network of entities within a SC. This includes the coordination and collaboration of the processes and activities involved in this network. However, considering the overall SC as well as the presence of uncertainty (demand, prices, process parameters) introduces more complexity into the coordination of all the activities and processes which take place throughout the SC.
The aim of this thesis is to enhance the decision-making process of industrial processes through the development of new mathematical models to better coordinate all available information and to improve the synchronization of production and demand, considering different time scales.
Hence, this thesis presents a general overview of production process requirements within a SC and a review of the current state of the art, which has made it possible to identify the open issues in the area in the context of Process Systems Engineering. Moreover, the first part of the thesis also presents an analysis of the existing approaches, methods and tools used throughout this thesis.
The second part of this work deals with the integrated management of production and demand constraints. This part first explores how the profitability of the SC can be improved by simultaneously considering production-side and demand-side management under deterministic conditions. Thus, discrete and hybrid time formulations have been presented to study the performance of the time representation. Furthermore, the discrete time formulation has been extended to deal with external and internal uncertainty through the implementation of a reactive approach. This part also addresses the coordinated management of production and demand, as well as the use of external and internal resources. Therefore, a new generalized mathematical formulation which integrates all resources involved in the production process within a supply chain is presented.
The third part of this thesis is focused on integrated SC optimization. In particular, this part concerns the integration of hierarchical decision levels, by exploiting mathematical models that assess the consequences of considering scheduling and planning decisions simultaneously when designing a SC network. The Synthesis State Task Network concept is introduced, extending the typical representation of a process to incorporate information associated with the synthesis problem through the implementation of synthesis blocks. Finally, an integrated information management system based on an ontological framework is presented. The aim of this information platform is to coordinate all available information for decision making. This integrated platform allows monitoring the real-time evolution of industrial processes within a supply chain. Moreover, this system may be used as an Operator Training System.
Finally, the last part of this thesis presents the overall conclusions and outlines the further work to be developed.

The increasing number of electrical and electronic appliances improving the lifestyle of residential consumers has led to a larger demand for energy. In order to supply their energy requirements, consumers have changed the paradigm by integrating renewable energy sources into their power grid. Consumers thereby become prosumers, internally generating and consuming energy in pursuit of autonomous operation. This paper proposes an energy management system for coordinating the operation of distributed household prosumers. Better performance was found to be achieved through cooperative operation with other prosumers in a neighborhood environment. Simulation and experimental results validate the proposed strategy by comparing the performance of islanded prosumers with their operation in cooperative mode.

In this paper, an energy management system is defined as a flexible architecture that can be applied to homes and residential areas that include generation units. The system has been integrated and tested in a grid-connected microgrid prototype, where optimal power generation profiles are obtained by considering economic aspects.

This paper presents a fast Super-Resolution (SR) algorithm based on selective patch processing. Motivated by the observation that some regions of images are smooth and unfocused and can be properly upscaled with fast interpolation methods, we locally estimate the probability of performing a degradation-free upscaling. Our proposed framework explores the use of supervised machine learning techniques and tackles the problem using binary boosted tree classifiers. The applied upscaler is chosen based on the obtained probabilities: (1) a fast upscaler (e.g. bicubic interpolation) for those regions which are smooth, or (2) a linear regression SR algorithm for those which are ill-posed. The proposed strategy accelerates SR by only processing the regions which benefit from it, thus not compromising quality. Furthermore, all the algorithms composing the pipeline are naturally parallelizable, and further speed-ups could be obtained.
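
The dispatch logic reads naturally as code; the sketch below routes each patch through either cheap interpolation or a costlier SR path according to a boosted-tree prediction. The two-feature patch descriptor, the toy labeling rule, and the placeholder SR step are all illustrative assumptions, not the paper's pipeline.

```python
# Selective-upscaling sketch: a classifier decides, per patch, whether cheap
# interpolation suffices or a costlier SR step is warranted.
import numpy as np
from scipy.ndimage import zoom
from sklearn.ensemble import GradientBoostingClassifier

def patch_features(patch):
    gy, gx = np.gradient(patch.astype(float))
    return [patch.var(), np.abs(gx).mean() + np.abs(gy).mean()]  # smoothness cues

# Offline: train on patches labeled 0 = smooth (bicubic is fine), 1 = ill-posed.
# Labels here are a synthetic stand-in for measured upscaling error.
rng = np.random.default_rng(0)
patches = [rng.random((8, 8)) * s for s in rng.uniform(0.01, 1.0, 200)]
labels = [int(patch_features(p)[0] > 0.02) for p in patches]     # toy labeling rule
clf = GradientBoostingClassifier().fit([patch_features(p) for p in patches], labels)

def selective_upscale(img, scale=2, ps=8):
    out = zoom(img, scale, order=3)                       # fast bicubic everywhere
    for i in range(0, img.shape[0] - ps + 1, ps):
        for j in range(0, img.shape[1] - ps + 1, ps):
            patch = img[i:i + ps, j:j + ps]
            if clf.predict([patch_features(patch)])[0]:   # predicted ill-posed
                sr = zoom(patch, scale, order=3)          # placeholder: a learned
                out[i*scale:(i+ps)*scale, j*scale:(j+ps)*scale] = sr  # SR regressor here
    return out
```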

An important problem to be addressed by diagnostic systems in industrial applications is the estimation of faults from incomplete observations. This work discusses different approaches for handling missing data and their effect on the performance of data-driven fault diagnosis schemes. Single-classifier and combined methods were assessed on the Tennessee-Eastman process, for which diverse incomplete observations were produced. The use of several indicators revealed the trade-off between the performances of the different schemes. Support vector machines (SVM) and C4.5, combined with k-nearest neighbours (kNN), produce the highest robustness and accuracy, respectively. Bayesian networks (BN) and centroid methods appear as inappropriate options in terms of accuracy, while Gaussian naive Bayes (GNB) is sensitive to imputation values. In addition, feature selection was explored for further performance enhancement, and the proposed contribution index showed promising results. Finally, an industrial case was studied to assess the informative level of incomplete data in terms of the redundancy ratio and to generalize the discussion.
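
The kind of comparison reported above, an imputer feeding a classifier, is easy to reproduce in miniature; the sketch below contrasts kNN and mean imputation ahead of an SVM on synthetic data with 15% missing entries, all of which are placeholder choices.

```python
# Comparing imputation strategies before fault classification (illustrative data).
import numpy as np
from sklearn.impute import KNNImputer, SimpleImputer
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.standard_normal((300, 6))
y = (X[:, 0] + X[:, 1] > 0).astype(int)            # stand-in fault labels
mask = rng.random(X.shape) < 0.15                  # 15% missing observations
X_missing = np.where(mask, np.nan, X)

for name, imp in [("kNN", KNNImputer(n_neighbors=5)),
                  ("mean", SimpleImputer(strategy="mean"))]:
    Xi = imp.fit_transform(X_missing)              # impute, then classify
    acc = cross_val_score(SVC(), Xi, y, cv=5).mean()
    print(name, round(acc, 3))
```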

The solution of process systems engineering problems involves their formal representation and the application of algorithms and strategies related to several scientific disciplines, such as computer science or operations research. In this work, the domain of operations research is modelled within a semantic representation in order to systematize the application of the available methods and tools to decision-making processes within organizations. As a result, an operations research ontology is created. This ontology is embedded in a wider framework that contains two additional ontologies, namely the enterprise ontology project and a mathematical representation, and it additionally communicates with optimization algorithms. The new ontology provides a means for automating the creation of mathematical models based on operations research principles.
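
As a flavor of what such a semantic representation looks like in practice, the snippet below encodes a couple of operations-research concepts and one instance as RDF triples with rdflib; the class and property names are invented for illustration and do not reproduce the published ontology.

```python
# Tiny RDF sketch of operations-research concepts (illustrative names only).
from rdflib import Graph, Namespace, RDF, RDFS, Literal

OR = Namespace("http://example.org/or-ontology#")   # assumed namespace
g = Graph()

# Classes and a property of the toy ontology
g.add((OR.OptimizationProblem, RDF.type, RDFS.Class))
g.add((OR.LinearProgram, RDFS.subClassOf, OR.OptimizationProblem))
g.add((OR.hasObjective, RDF.type, RDF.Property))

# An instance: a production-planning LP with a cost-minimization objective
g.add((OR.planningModel, RDF.type, OR.LinearProgram))
g.add((OR.planningModel, OR.hasObjective, Literal("minimize total cost")))

print(g.serialize(format="turtle"))                 # human-readable Turtle dump
```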