In the challenging downhole environment, drilling tools are normally subject to high temperature, severe vibration, and other harsh operating conditions. Drilling activities generate massive field data, termed field reliability big data (FRBD), which includes downhole operation, environment, failure, degradation, and dynamic data. FRBD is characterized by large volume, high variety, and extreme complexity, and it presents both abundant opportunities and great challenges for drilling tool reliability analytics. As one of the key factors affecting drilling tool reliability, downhole vibration plays an essential role in reliability analytics based on FRBD. This paper reviews the important parameters of downhole drilling operations; examines the modes, physical effects, and reliability impact of downhole vibration; and presents the features of reliability big data analytics. Specifically, it explores the application of the vibration factor in reliability big data analytics covering tool lifetime/failure prediction, prognostics/diagnostics, condition monitoring (CM), and maintenance planning and optimization. Finally, the authors highlight future research on how to better apply the downhole vibration factor in reliability big data analytics to further improve tool reliability and optimize maintenance planning.

The high estimated position error of current commercial off-the-shelf GPS/INS units impedes precise autonomous takeoff and landing (TOL) flight operations. To overcome this problem, this paper proposes an integrated global positioning system (GPS)/inertial navigation system (INS)/optical flow (OF) solution in which the OF provides an accurate augmentation to the GPS/INS. To ensure accurate and robust OF augmentation, a robust modeling method was used to estimate OF from a set of real-time experiments conducted under various simulated helicopter-landing scenarios. Because the accuracy of the OF measurements depends on the accuracy of the height measurements, a real-time testing environment was developed to model and validate the dynamic OF model at various heights. The obtained OF model matches the real OF sensor with 87.70% fitting accuracy, and a mean error of 0.006 m/s between the velocity of the real OF sensor and that of the model is achieved. The velocity measurements of the OF model and the position of the GPS/INS are fused in a dynamic model-based sensor fusion algorithm. In the proposed solution, the OF sensor is engaged when the vehicle approaches a landing spot equipped with a predefined landing pattern. The proposed solution performed a helicopter auto TOL with a maximum position error of 27 cm.
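The position/velocity fusion step described above can be sketched generically. The minimal Kalman filter below fuses a noisy position stream (GPS/INS-like) with an accurate velocity stream (OF-like) on a constant-velocity state; the motion model, noise values, and function name are illustrative assumptions, not the paper's dynamic model-based algorithm.

```python
import numpy as np

def fuse_position_velocity(pos_meas, vel_meas, dt=0.1,
                           pos_var=4.0, vel_var=0.01, q=0.05):
    """Fuse noisy position (GPS/INS-like) and velocity (optical-flow-like)
    measurements with a linear Kalman filter on a [position, velocity] state.
    Generic sketch only; all noise parameters are illustrative."""
    F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity motion model
    H = np.eye(2)                               # both states observed directly
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],   # process noise (white accel.)
                      [dt**2 / 2, dt]])
    R = np.diag([pos_var, vel_var])             # measurement noise covariance
    x = np.array([pos_meas[0], vel_meas[0]])    # initialize from first sample
    P = np.eye(2)
    estimates = []
    for z in zip(pos_meas, vel_meas):
        x = F @ x                               # predict state
        P = F @ P @ F.T + Q
        y = np.asarray(z) - H @ x               # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
        x = x + K @ y                           # measurement update
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)
```

Because the velocity channel is far less noisy than the position channel, the filter effectively dead-reckons on velocity while the position measurements bound the drift.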

More than four decades have passed since the introduction of safety standards for impact attenuation surfaces (IAS) used in playgrounds. Falls in children's playgrounds are a major source of injuries, and IAS are one of the best means of preventing severe head injuries. However, the ability of IAS to prevent other types of injuries, such as upper limb fractures, is unclear. Accordingly, ten synthetic playground surfaces were tested to examine their performance beyond the head injury criterion (HIC) and maximum G-force (Gmax) outputs recommended by ASTM F1292. The aim of this work was to investigate limitations of the current safety criteria and to propose additional criteria to filter out hazardous IAS that technically comply with the current thresholds of 1000 HIC and 200 Gmax. The proposed new criterion is called the impulse force criterion (If). If combines two important injury-predictor characteristics: the HIC duration, i.e., the time duration of the most severe impact, and the change in momentum, which addresses the IAS properties associated with bounce. Additionally, the maximum jerk (Jmax), the bounce, and the work absorbed by the IAS are presented. HIC, Gmax, If, and Jmax followed similar trends with respect to material thickness and drop height. Moreover, the bounce and the work done by the IAS on the falling missile at increasing drop heights were similar for all surfaces apart from one viscoelastic foam sample. The results demonstrate the limitations of current safety criteria and should therefore assist future research to reduce long-bone injuries in playgrounds.
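For reference, the HIC named above can be computed from a sampled head-acceleration trace as the maximum over time windows of the window length times the mean acceleration raised to the power 2.5. The brute-force sketch below assumes acceleration in g, a uniform sample interval, and a 36 ms maximum window; the window limit is an assumption here, so check the applicable standard for the required value.

```python
import numpy as np

def head_injury_criterion(accel_g, dt, max_window=0.036):
    """Compute HIC from a sampled acceleration trace (in g, sampled at dt s):
    HIC = max over [t1, t2] of (t2 - t1) * (mean accel over the window)**2.5,
    with the window length capped at max_window seconds (an assumption)."""
    cum = np.concatenate(([0.0], np.cumsum(accel_g) * dt))  # integral of a(t)
    n = len(accel_g)
    w_max = int(round(max_window / dt))
    hic = 0.0
    for i in range(n):
        for j in range(i + 1, min(i + w_max, n) + 1):
            T = (j - i) * dt
            mean_a = (cum[j] - cum[i]) / T
            hic = max(hic, T * mean_a ** 2.5)
    return hic
```

A square 100 g pulse lasting 10 ms gives HIC = 0.01 x 100^2.5 = 1000, i.e., exactly the threshold mentioned above.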

Offshore petroleum platforms present complex, time-sensitive situations that can make emergency evacuations difficult to manage. Virtual environments (VE) can be used to train safety-critical tasks and help prepare personnel to respond to real-world offshore emergencies. Before industry can adopt VE training, its utility must be established to ensure the technology provides effective training. This paper presents the results of two experiments that investigated the training utility of VE training, focusing particularly on determining the most appropriate method to deliver offshore emergency egress training using a virtual environment. The first experiment used lecture-based teaching (LBT). The second experiment investigated the utility of a simulation-based mastery learning (SBML) pedagogical method from the medical field for offshore emergency egress training. Both training programs (LBT and SBML) were used to train naïve participants in basic onboard familiarization and emergency evacuation procedures. This paper discusses the training efficacy of the SBML method in this context and compares the results of the SBML experimental study to those of the LBT training experiment. Efficacy of the training methods is measured by a combination of the time spent training and the performance achieved by each of the training groups. Results show that the SBML approach to VE training was more time effective and produced better performance in the emergency scenarios. SBML training can help address individual variability in competence. Limitations of the SBML training are discussed, and recommendations to improve the delivery of SBML training are presented. Overall, the results indicate that employing SBML training in industry can improve human reliability during emergencies through increased competence and compliance.

There are several ways of quantifying flood hazard. When the scale of the analysis is large, flood hazard simulation for an entire city becomes costly and complicated. The first part of this paper proposes using the experience and knowledge of local experts about flood characteristics in the area to produce first-level flood hazard and risk zoning maps by implementing overlay operations in ArcGIS. In this step, the authors use pairwise comparison to eliminate the need for a complicated simulation to quantify flood hazard and risk. The process begins with identifying the main factors that contribute to flooding in a particular area. Pairwise comparison was used to elicit knowledge from local experts and to assign weights to each factor reflecting its relative importance for flood hazard and risk. In the second part of the paper, the authors present a decision-making framework to support a flood risk response plan. Once the highest-risk zones have been identified, a city can develop a risk response plan, for which the paper presents a decision-making framework to select an effective set of alternatives. The framework integrates tools from multicriteria decision-making, a charrette design process to guide the pairwise elicitation, and a cost-effectiveness analysis to accommodate any city's limited budget. The first part of the framework is illustrated with the city of Addis Ababa; the second part uses a hypothetical case of Addis Ababa and a mock city infrastructure department to demonstrate the implementation.
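The pairwise-comparison weighting described above is commonly implemented with Saaty's analytic hierarchy process (AHP). The sketch below derives factor weights from a reciprocal comparison matrix via its principal eigenvector and reports a consistency ratio; the function name and the random-index table are illustrative, not taken from the paper.

```python
import numpy as np

def ahp_weights(pairwise):
    """Derive factor weights from a reciprocal pairwise-comparison matrix
    using the principal eigenvector (Saaty's AHP). Returns the normalized
    weights and the consistency ratio CR; CR < 0.1 is the usual threshold."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                 # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                # normalized weights
    lam = eigvals[k].real
    # Saaty's random consistency index for small matrices (tabulated values)
    RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]
    ci = (lam - n) / (n - 1) if n > 2 else 0.0  # consistency index
    cr = ci / RI if RI > 0 else 0.0
    return w, cr
```

For a perfectly consistent matrix built from true weights, the eigenvector recovers the weights exactly and CR is zero.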

In traditional reliability problems, the distribution of a basic random variable is usually unimodal; in other words, its probability density has only one peak. In real applications, however, some basic random variables may follow bimodal distributions, whose probability densities have two peaks. When bimodal variables are involved, traditional reliability methods, such as the first-order second moment (FOSM) method and the first-order reliability method (FORM), are not accurate. This study investigates the accuracy of the saddlepoint approximation (SPA) for bimodal variables and then employs SPA-based reliability methods with first-order approximation to predict reliability. The limit-state function is first approximated with a first-order Taylor expansion so that it becomes a linear combination of the basic random variables, some of which are bimodally distributed. The SPA is then applied to estimate the reliability. Examples show that the SPA-based reliability methods are more accurate than FOSM and FORM.
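A generic sketch of the SPA for a bimodal variable follows, assuming the variable is a two-component Gaussian mixture (whose cumulant generating function is available in closed form) and using the Lugannani-Rice tail formula. This is an illustration of the SPA idea, not the paper's full first-order reliability method.

```python
import math

def spa_tail_prob(y, comps, lo=-20.0, hi=20.0):
    """Lugannani-Rice saddlepoint approximation of P(Y > y) when Y follows
    a Gaussian mixture, each comp given as (weight, mean, std). The bounds
    lo/hi must bracket the saddlepoint without overflowing exp()."""
    def K(s):  # cumulant generating function of the mixture
        return math.log(sum(p * math.exp(m * s + 0.5 * (sd * s) ** 2)
                            for p, m, sd in comps))
    h = 1e-4   # finite-difference step for K' and K''
    dK = lambda s: (K(s + h) - K(s - h)) / (2.0 * h)
    d2K = lambda s: (K(s + h) - 2.0 * K(s) + K(s - h)) / h ** 2
    a, b = lo, hi
    for _ in range(100):            # bisection: K' is increasing (K convex)
        s = 0.5 * (a + b)
        if dK(s) < y:
            a = s
        else:
            b = s
    s = 0.5 * (a + b)               # saddlepoint solving K'(s) = y
    Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    if abs(s) < 1e-8:               # y at the mean: normal fallback
        return 1.0 - Phi((y - dK(0.0)) / math.sqrt(d2K(0.0)))
    w = math.copysign(math.sqrt(max(2.0 * (s * y - K(s)), 0.0)), s)
    u = s * math.sqrt(d2K(s))
    phi_w = math.exp(-0.5 * w * w) / math.sqrt(2.0 * math.pi)
    return 1.0 - Phi(w) + phi_w * (1.0 / u - 1.0 / w)
```

For a single Gaussian the formula is exact, which makes a convenient sanity check; for well-separated bimodal mixtures the tail estimate typically stays within a few percent of the exact value.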

Hierarchical Bayesian models (HBMs) have been increasingly used for various engineering applications. We classify the two types of HBM found in the literature as the hierarchical prior model (HPM) and the hierarchical stochastic model (HSM), and then focus on the theoretical implications of the HSM. Using examples of polynomial functions, we show that the HSM can separate different types of uncertainty in a system and quantify the uncertainty of reduced-order models under the Bayesian model class selection framework. To tackle the huge computational cost of analyzing the HSM, we propose an efficient approximation scheme based on importance sampling (IS) and the empirical interpolation method (EIM). We illustrate our method with two engineering examples: a molecular dynamics simulation of krypton and a pharmacokinetic/pharmacodynamic (PK/PD) model for a cancer drug.
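The importance-sampling building block mentioned above can be sketched as a self-normalized estimator: draw from a cheap proposal and reweight by density ratios. The function and its arguments are illustrative assumptions, not the paper's full IS/EIM scheme.

```python
import numpy as np

def importance_sampling_mean(f, target_logpdf, proposal_sampler,
                             proposal_logpdf, n=20000, seed=0):
    """Self-normalized importance sampling estimate of E_target[f(X)].
    Unnormalized log-densities are fine because the weights are normalized."""
    rng = np.random.default_rng(seed)
    x = proposal_sampler(rng, n)
    logw = target_logpdf(x) - proposal_logpdf(x)
    w = np.exp(logw - logw.max())    # subtract max for numerical stability
    w /= w.sum()                     # normalize the importance weights
    return float(np.sum(w * f(x)))
```

For example, the mean of an N(1, 1) target can be estimated using samples from a wider N(0, 4) proposal.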

The data collected on second-to-second operations of large-scale freight and logistics systems have increased in recent years. Data analytics can provide valuable insight, improve efficiency, and reduce waste of resources. Understanding sources of uncertainty, including emergent and future conditions, is critical to enterprise resilience, to recognizing regimes of operations, and to decision-making for capacity expansion. This paper demonstrates analyses of operations data at a marine container terminal, disaggregates layers of uncertainty, and discusses implications for operations decision-making and capacity expansion. The layers arise from various sources and perspectives, such as the level of detail in data collection and the compatibility of data sources, missing entries in databases, natural and human-induced disruptions, and competing stakeholder views of what the performance metrics should be. Among the resulting observations is that long truck turn times are correlated with high traffic volume, which is distributed across most states of operations. Furthermore, data quality and the presentation of performance metrics should be considered when interpreting results from data analyses. The potential influences of emergent and future conditions of technologies, markets, commerce, behaviors, regulations, organizations, the environment, and others on the regimes of terminal operations are examined.

Real components always deviate from their ideal dimensions. This makes every component unique, even within serial production: although parts look the same, differences can always be observed owing to scattering factors and variations in the manufacturing process. All these factors inevitably lead to parts that deviate from their ideal shape and therefore have different properties than the ideal component. Changing properties can lead to major problems or even failure during operation. It is necessary to specify the permitted deviations so that every single product nevertheless meets its technical requirements, and to estimate the consequences of those permitted deviations, which is done via tolerance analysis. During this process, components are assembled virtually and varied with the uncertainties specified by the tolerances; a variation simulation is one way to calculate these effects for geometric deviations. Since tolerance analysis enables engineers to identify weak points in an early design stage, it is important to know the contribution of every single tolerance to a given quality-relevant characteristic in order to tighten or loosen the right tolerances. In this paper, a fuzzy-based method to calculate this sensitivity is introduced and compared with the commonly used extended Fourier amplitude sensitivity test (EFAST). A special focus of this work is the differentiation between the sensitivity of the total system and the sensitivities of the subsystems defined by the α-cuts of the fuzzy numbers. The impact of the number of evaluations and of nonlinearity on the sensitivity is discussed for both EFAST and the fuzzy-based method.
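The α-cut decomposition mentioned above can be illustrated with triangular fuzzy numbers. In the sketch below, the response interval at each α-level is bounded by vertex enumeration (valid only for monotonic response functions, an assumption here), and the sensitivity measure, the interval width lost when one input is frozen at its peak value, is a simplified stand-in for the paper's fuzzy-based measure.

```python
import itertools

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (lo, peak, hi) at level alpha."""
    lo, peak, hi = tri
    return (lo + alpha * (peak - lo), hi - alpha * (hi - peak))

def cut_response_interval(f, fuzzy_inputs, alpha):
    """Bound the response over the alpha-cut box by evaluating f at every
    vertex combination (valid for monotonic f, an assumption here)."""
    cuts = [alpha_cut(t, alpha) for t in fuzzy_inputs]
    vals = [f(*c) for c in itertools.product(*cuts)]
    return min(vals), max(vals)

def cut_sensitivity(f, fuzzy_inputs, i, alpha):
    """Width of the response interval lost when input i is frozen at its
    peak value: a simple per-alpha-level sensitivity indicator (sketch)."""
    full = cut_response_interval(f, fuzzy_inputs, alpha)
    frozen = list(fuzzy_inputs)
    lo, peak, hi = frozen[i]
    frozen[i] = (peak, peak, peak)
    part = cut_response_interval(f, frozen, alpha)
    return (full[1] - full[0]) - (part[1] - part[0])
```

For the linear response f(x, y) = x + 2y with symmetric unit tolerances, y contributes twice the interval width of x at every α-level, as expected.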

Adding advanced safety features (e.g., airbags) to restraint systems in tactical vehicles could decrease the injury risk of their occupants. The impact of frontal crashes on occupants has recently been assessed through experimental data and finite element (FE) models. However, the number of such experiments is relatively small due to their high cost. In this paper, we conduct an uncertainty study to infer the advantage of including advanced safety features, if a larger number of experiments were possible. We introduce the concept of a group injury risk distribution that allows us to quantify, under uncertainty, the injury risk associated with advanced safety features while averaging out the effect of uncontrollable factors such as body size. Statistically, the group injury risk distribution is a mixture of the individual injury risk distributions of the design conditions in the group. We infer that advanced safety features have the potential to substantially reduce injury risk in frontal crashes.
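A group injury risk distribution as described, i.e., a finite mixture of individual risk distributions, can be sampled as follows; the beta-distributed component risks and mixture weights in the test are illustrative assumptions, not the paper's fitted distributions.

```python
import numpy as np

def group_risk_samples(component_samplers, weights, n, seed=0):
    """Sample a group injury-risk distribution built as a finite mixture of
    individual risk distributions (one per design condition in the group),
    mixed with the given weights (e.g., occupant-size frequencies)."""
    rng = np.random.default_rng(seed)
    ks = rng.choice(len(component_samplers), size=n, p=weights)  # pick component
    return np.array([component_samplers[k](rng) for k in ks])    # draw from it
```

The mixture mean is simply the weight-averaged mean of the component risks, which gives a quick consistency check on the samples.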

Investment castings are used in industrial sectors including automotive, aerospace, chemical, and biomedical applications, and they are required to be of high quality, i.e., free of defects and possessing mechanical properties in the desired range. In practice, this is a big challenge, since a large number of process and alloy-composition parameters are involved; their values change for every casting, and their effect on quality is not well understood. It is therefore difficult to identify the most critical parameters and the specific values that influence the quality of investment castings. This is achieved in the present work by employing foundry data analytics based on Bayesian inference to compute the posterior probability for each input parameter. Computing the posterior probability for each parameter in turn involves computing the local probability (LP), conditional probability (CP), joint probability (JP), prior odds, likelihood ratio (LR), and posterior odds. The computed posterior probability helps identify the critical parameters (a parameter is considered critical if its posterior probability is high) and their specific ranges of values affecting the quality of investment castings. This is demonstrated on real-life data collected from an industrial foundry; controlling the identified parameters within the specific ranges of values resulted in improved quality. Unlike the computer simulation, artificial neural network (ANN), and statistical methods explored by earlier researchers, the proposed approach is easy to implement in industry for controlling and optimizing the parameters to achieve castings that are defect free and have mechanical properties in the desired range. The current work also shows the way forward for building similar systems for other casting and manufacturing processes.
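The odds form of Bayes' rule underlying the chain of computations above is compact: posterior odds equal prior odds times the likelihood ratio. The helper below is an illustration of that single step, not the paper's full pipeline; in practice the likelihood ratio would itself be computed from the conditional probabilities of the observed data.

```python
def posterior_probability(prior_prob, likelihood_ratio):
    """Odds form of Bayes' rule: posterior odds = prior odds x likelihood
    ratio, converted back to a probability. Here likelihood_ratio stands for
    P(evidence | defect) / P(evidence | no defect)."""
    prior_odds = prior_prob / (1.0 - prior_prob)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)
```

For instance, a prior defect probability of 0.2 (prior odds 0.25) combined with a likelihood ratio of 4 gives posterior odds of 1, i.e., a posterior probability of 0.5.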

The far-reaching consequences of train derailments have been a major concern to industry and government despite their relatively low occurrence. These consequences include injury, loss of life and property, interruption of services, and damage to the environment. Thus, it is imperative to carefully examine train derailment severity. The majority of the extant literature has failed to consider the multivariate nature of derailment severity and has instead focused mainly on a single severity outcome, namely the number of derailed cars. However, it is also important to analyze the monetary damage incurred by railroads during derailments. In this paper, a joint mixed copula-based model for derailed cars and monetary damage is presented for the combined analysis of their relationship with a set of covariates that might affect both outcomes. Marginal generalized linear regression models are combined with a bivariate copula, which characterizes the dependence between the two variables. The copula also addresses endogeneity due to similar unobserved or omitted variables that may affect both response variables. The copula-based regression model was found to be more appropriate than the independent multivariate regression model. Incorporating the copula to characterize the dependence had a greater effect on the dispersion estimate than on the point estimates. Derailment speed was found to have the most pronounced effect on both response variables, with a greater impact on monetary damage than on the number of derailed cars.
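The copula construction can be illustrated by simulation: uniforms linked by a bivariate Gaussian copula are pushed through two marginal inverse CDFs, so the dependence structure is specified separately from the marginals. The Poisson (count) and exponential (damage) marginals below are illustrative stand-ins for the paper's generalized linear marginals.

```python
import math
import numpy as np

def poisson_ppf(u, lam):
    """Inverse CDF of a Poisson(lam) count by direct summation (small lam)."""
    k, p, cdf = 0, math.exp(-lam), math.exp(-lam)
    while cdf < u:
        k += 1
        p *= lam / k
        cdf += p
    return k

def simulate_derailment_severity(n, rho, lam_cars, mean_damage, seed=0):
    """Draw (derailed cars, monetary damage) pairs whose dependence comes
    from a bivariate Gaussian copula with correlation rho. The Poisson and
    exponential marginals are illustrative, not the paper's fitted models."""
    rng = np.random.default_rng(seed)
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
    norm_cdf = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    u1 = np.array([norm_cdf(v) for v in z[:, 0]])   # copula-linked uniforms
    u2 = np.array([norm_cdf(v) for v in z[:, 1]])
    cars = np.array([poisson_ppf(u, lam_cars) for u in u1])   # count marginal
    damage = -mean_damage * np.log(1.0 - u2)        # exponential inverse CDF
    return cars, damage
```

Because the copula only reshuffles ranks, each marginal keeps its own mean while the two outcomes become positively correlated.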

Estimating the safety effects of emerging or future technology from expert elicitation is challenging because the accumulated judgment risks being biased and imprecise. Therefore, this semiquantitative study proposes and demonstrates an upgraded bowtie analysis for safety effect assessments that can be performed without expert elicitation. While bowtie analysis is commonly used in, for example, process engineering, it is novel in road traffic safety. Four crash case studies are completed using bowtie analysis, letting the input parameters vary sequentially over the entire range of possible expert opinions. The results suggest that only proactive safety measures estimated to decrease the probability of specific crash risk factors to at least “very improbable” can perceptibly decrease crash probability. Further, the success probability of a reactive measure must be at least “moderately probable” to reduce the probability of a serious or fatal crash by half or more. This upgraded bowtie approach allows the identification of (1) the sensitivity of the probability of a crash and its consequences to the expert judgment used in the bowtie model and (2) the effectiveness a chosen safety measure must have to achieve adequate changes in the probability of a crash and its consequences.
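The preventive (left) side of a bowtie can be quantified with a simple product rule: a threat leads to the top (crash) event only if all of its barriers fail. The sketch below assumes independent threats and barriers and is a generic bowtie computation, not the study's calibrated model.

```python
def crash_probability(threat_probs, barrier_fail_probs):
    """Probability of the bowtie top event (e.g., a crash): each threat i
    occurs with threat_probs[i] and reaches the top event only if every
    barrier in barrier_fail_probs[i] fails. Threats assumed independent."""
    p_no_crash = 1.0
    for p_threat, fails in zip(threat_probs, barrier_fail_probs):
        p_line = p_threat
        for p_fail in fails:
            p_line *= p_fail          # every preventive barrier must fail
        p_no_crash *= (1.0 - p_line)  # this threat line does not fire
    return 1.0 - p_no_crash
```

Sweeping the threat and barrier probabilities over the full range of plausible expert ratings, as the study does, then shows how sensitive the top-event probability is to each judgment.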

This paper presents a comprehensive statistical study of crack and leakage defects in road tunnel linings in China using newly proposed quantitative indexes that can be easily evaluated by robotic inspection techniques. These indexes are crack density, average crack length, nominal crack width, leakage density, and leakage state index. A database covering the defects of 116 road tunnels in China was built from on-site inspections and the literature. Key factors for each evaluation index are determined by analysis of variance and partial correlation analysis. Nonstructural cracking is found to be the dominant type of cracking, with temperature-shrinkage stress the main cause of cracking in cold-region tunnels. Seepage is the most common type of leakage, and the underground water source plays a crucial role in its formation. Probabilistic modeling of the defects is carefully investigated: crack density and leakage state index approximately follow normal distributions; leakage density and average crack length are lognormally distributed; and the nominal crack width of composite-lined new tunnels follows an exponential distribution. Bayesian updating is carried out for nonparametric estimation in cases where the conventional maximum likelihood method is not applicable owing to data deficiency. The proposed probabilistic models are validated by on-site inspection and theoretical checking computations.
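The lognormal fits mentioned above reduce, in the simplest maximum-likelihood case, to estimating the mean and standard deviation of the log-data. A minimal sketch, not the paper's Bayesian-updating procedure:

```python
import math
import statistics

def fit_lognormal(samples):
    """Fit a lognormal to positive defect data (e.g., leakage density):
    mu and sigma are the mean and sample std of the log-samples. Also
    returns the implied lognormal median and mean for a quick check."""
    logs = [math.log(x) for x in samples]
    mu = statistics.fmean(logs)
    sigma = statistics.stdev(logs)
    median = math.exp(mu)                      # lognormal median
    mean = math.exp(mu + 0.5 * sigma ** 2)     # lognormal mean
    return mu, sigma, median, mean
```

Note that the lognormal mean exceeds its median whenever sigma is nonzero, which matters when summarizing skewed defect data.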

Managing geotechnical risk in design-build (DB) projects is complicated because the contract is awarded before a thorough subsurface investigation is conducted, making geotechnical uncertainty high during bid preparation. Typically, owners allocate geotechnical risk between themselves and competing design-builders in the DB project’s request for proposals (RFP), and the perceived RFP risk profile is reflected in the bid prices proposed by competing DB teams. The level of perceived geotechnical uncertainty is exacerbated when the RFP’s geotechnical content is inadequate or ambiguous, a condition that may not have been recognized by the RFP’s authors. Hence, the purpose of this paper is to understand the influence of differing risk perceptions on DB project pricing. The paper analyzes the difference in perceived DB geotechnical risk for 27 common geotechnical risk factors identified in a survey of state department of transportation (DOT) geotechnical engineers and DB industry professionals. The risk factors were rated on the basis of frequency and impact by 46 DOT and industry practitioners, and the results are synthesized using importance index theory. The study finds a statistically significant difference between public agencies and the DB industry in the perceived importance of geotechnical risk factors, with an average difference of nearly 12% in the rated factors. The paper recommends that perceptions of geotechnical risk be aligned before contract award using progressive DB procurement, or after award using a scope validation period, to provide an opportunity to share the geotechnical risks.
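Importance index theory combines frequency and impact ratings into a single normalized score per risk factor. A minimal sketch, assuming 1-5 Likert scales for both ratings (the scale and normalization are assumptions here, not taken from the paper):

```python
def importance_index(frequency_ratings, impact_ratings):
    """Relative importance index of one risk factor from survey ratings:
    mean(frequency x impact) normalized by the maximum attainable score
    (5 x 5 = 25 on the assumed 1-5 Likert scales)."""
    scores = [f * i for f, i in zip(frequency_ratings, impact_ratings)]
    return sum(scores) / (len(scores) * 25.0)
```

Computing this index separately for the DOT and industry respondent groups, then comparing the two rankings factor by factor, mirrors the kind of perception-gap analysis the paper reports.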

Prognosis aims at calculating the remaining useful life (RUL) of a system by estimating its current health state and then predicting its future behavior. In this paper, the prediction of fatigue crack growth in structural elements made of unidirectional fiber-reinforced composites is considered. Model uncertainty and measurement uncertainty are included, as well as future loading uncertainty. Both constant amplitude loading and variable amplitude loading (block loading) are examined. The analytical model that describes the fatigue crack growth is highly nonlinear and contains fixed model parameters that depend on the material, as well as loading parameters that may or may not vary depending on the applied load. A particle filter is therefore used because of its ability to handle uncertainties and high nonlinearities and to perform joint parameter-state estimation. In the first part, fatigue crack growth prognosis under constant amplitude loading is performed: the loading parameters are constant and known a priori, while the model parameters are jointly estimated along with the crack length. In the second part, fatigue crack growth prognosis under variable amplitude loading is performed: this time, the loading parameters are unknown and change abruptly at unknown time steps in accordance with the applied variable block loading. The two-sided cumulative sum (CUSUM) algorithm is implemented to detect abrupt load variations and help the particle filter adapt and learn the new loading parameter values. With the combination of these two techniques, the prognosis module can be informed of the sudden crack length increase and correct the predicted remaining useful life. In both case studies, real data from fatigue tests on unidirectional fiber-reinforced titanium matrix composites are used.
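The two-sided CUSUM detector mentioned above accumulates positive and negative deviations from a target value and flags the first crossing of a decision threshold. A minimal sketch with illustrative drift and threshold values (the actual tuning in the paper may differ):

```python
def two_sided_cusum(series, target, drift=0.5, threshold=5.0):
    """Two-sided CUSUM change detector: accumulate positive and negative
    deviations from the target (less an allowable drift) and return the
    index of the first sample where either sum crosses the threshold."""
    s_hi = s_lo = 0.0
    for i, x in enumerate(series):
        s_hi = max(0.0, s_hi + (x - target) - drift)   # upward shifts
        s_lo = max(0.0, s_lo + (target - x) - drift)   # downward shifts
        if s_hi > threshold or s_lo > threshold:
            return i          # first sample at which a change is declared
    return None               # no change detected
```

In the prognosis context, the monitored series would be a residual tied to the loading parameters; a detection triggers the particle filter to relearn those parameters.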
