Sample records for hybrid statistical analyses

In this paper, hybrid logic is used to formulate three control flow analyses for Mobile Ambients, a process calculus designed for modelling mobility. We show that hybrid logic is very well-suited to express the semantic structure of the ambient calculus and how features of hybrid logic can...

This paper presents novel extensions and applications of the UPPAAL-SMC model checker. The extensions allow for statistical model checking of stochastic hybrid systems. We show how our race-based stochastic semantics extends to networks of hybrid systems, and indicate the integration technique applied for implementing this semantics in the UPPAAL-SMC simulation engine. We report on two applications of the resulting tool-set coming from systems biology and energy-aware buildings.

The performance of disease surveillance systems is evaluated and monitored using a diverse set of statistical analyses throughout each stage of surveillance implementation. An overview of their main elements is presented, with a specific emphasis on syndromic surveillance directed to outbreak detection in resource-limited settings. Statistical analyses are proposed for three implementation stages: planning, early implementation, and consolidation. Data sources and collection procedures are described for each analysis. During the planning and pilot stages, we propose to estimate the average data collection, data entry and data distribution time. This information can be collected by surveillance systems themselves or through specially designed surveys. During the initial implementation stage, epidemiologists should study the completeness and timeliness of the reporting, and describe thoroughly the population surveyed and the epidemiology of the health events recorded. Additional data collection processes or external data streams are often necessary to assess reporting completeness and other indicators. Once data collection processes are operating in a timely and stable manner, analyses of surveillance data should expand to establish baseline rates and detect aberrations. External investigations can be used to evaluate whether an abnormally increased case frequency corresponds to a true outbreak, and thereby establish the sensitivity and specificity of aberration detection algorithms. Statistical methods for disease surveillance have focused mainly on the performance of outbreak detection algorithms without sufficient attention to data quality and representativeness, two factors that are especially important in developing countries. It is important to assess data quality at each stage of implementation using a diverse mix of data sources and analytical methods. Careful, close monitoring of selected indicators is needed to evaluate whether systems are reaching their

The evolution of a screen cylinder wake was studied by analysing its statistical properties over a streamwise range of x/d = 10-60. The screen cylinder was made of a stainless steel screen mesh of 67% porosity. The experiments were conducted in a wind tunnel at a Reynolds number of 7000 using an X-probe. The results were compared with those obtained in the wake generated by a solid cylinder. It was observed that the evolution of the statistics in the wake of the screen cylinder was different from that of a solid cylinder, reflecting the differences in the formation of the organized large-scale vortices in both wakes. The streamwise evolution of the Reynolds stresses, energy spectra and cross-correlation coefficients indicated that there exists a critical location that differentiates the screen cylinder wake into two regions over the measured streamwise range. The formation of the fully formed large-scale vortices was delayed until this critical location. Comparison with existing results for screen strips showed that although the near-wake characteristics and the vortex formation mechanism were similar between the two wake generators, variation in the Strouhal frequencies was observed and the self-preservation states were non-universal, reconfirming the dependence of a wake on its initial condition.

This handbook is a realization of a long-term goal of BMDP Statistical Software. As the software supporting statistical analysis has grown in breadth and depth to the point where it can serve many of the needs of accomplished statisticians, it can also serve as an essential support to those needing to expand their knowledge of statistical applications. Statisticians should not be handicapped by heavy computation or by the lack of needed options. When Applied Statistics: Principles and Examples by Cox and Snell appeared, we at BMDP were impressed with the scope of the applications discussed and felt that many statisticians eager to expand their capabilities in handling such problems could profit from having the solutions carried further, to get them started and guided to a more advanced level in problem solving. Who would be better to undertake that task than the authors of Applied Statistics? A year or two later, discussions with David Cox and Joyce Snell at Imperial College indicated that a wedding of the proble...

In this paper, hybrid logic is used to formulate a rational reconstruction of a previously published control flow analysis for the mobile ambients calculus and we further show how a more precise flow-sensitive analysis, that takes the ordering of action sequences into account, can be formulated in a natural way. We show that hybrid logic is very well suited to express the semantic structure of the ambient calculus and how features of hybrid logic can be exploited to reduce the "administrative overhead" of the analysis specification and thus simplify it. Finally, we use HyLoTab, a fully automated...

In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, not to mention the fact that mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing missing dynamics in the previously integrated approximation. This combination results in the precision improvement of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by the combination of three different orders of approximation of an analytical theory and a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem and first-order and second-order analytical theories, whereas the prediction technique is the same in all three cases, namely an additive Holt-Winters method.
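The additive Holt-Winters method mentioned in this record is a standard triple exponential smoothing scheme (level, linear trend and additive seasonality). As background only, not the authors' implementation, a minimal sketch with deliberately simple first-season initialisation and illustrative smoothing constants:

```python
def holt_winters_additive(y, m, alpha=0.3, beta=0.1, gamma=0.1, horizon=4):
    """Additive Holt-Winters: level + linear trend + additive seasonality.

    y: observed series (len >= 2*m), m: season length. Initialisation is
    deliberately simple; production code would use a more careful scheme.
    """
    level = sum(y[:m]) / m
    # Average per-step trend, estimated from the first two seasons.
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / (m * m)
    season = [y[i] - level for i in range(m)]

    for t in range(m, len(y)):
        last_level = level
        s = season[t % m]
        level = alpha * (y[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % m] = gamma * (y[t] - level) + (1 - gamma) * s

    # h-step-ahead forecasts reuse the last fitted seasonal pattern.
    return [level + (h + 1) * trend + season[(len(y) + h) % m]
            for h in range(horizon)]
```

On a synthetic series with a linear trend plus a fixed seasonal pattern, the forecasts track the true continuation closely; in the hybrid propagator the same machinery would instead be fitted to the residual dynamics of the analytical theory.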

In this article, we revisit the coherent gravitational wave search problem of compact binary coalescences with a multidetector network consisting of advanced interferometers like LIGO-Virgo. Based on the loss of the optimal multidetector signal-to-noise ratio (SNR), we construct a hybrid statistic as the best of the maximum-likelihood-ratio (MLR) statistics tuned for face-on and face-off binaries. The statistical properties of the hybrid statistic are studied. The performance of this hybrid statistic is compared with that of the coherent MLR statistic for generic inclination angles. Owing to the single synthetic data stream, the hybrid statistic gives fewer false alarms than the multidetector MLR statistic and a small fractional loss in the optimum SNR for a large range of binary inclinations. We demonstrate that, for a LIGO-Virgo network and binary inclination ɛ 110°, the hybrid statistic captures more than 98% of the network optimum matched filter SNR with a low false alarm rate. Monte Carlo exercises with two distributions of incoming inclination angles—namely, U[cos ɛ] and a more realistic distribution proposed by B. F. Schutz [Classical Quantum Gravity 28, 125023 (2011)]—are performed with the hybrid statistic and give approximately 5% and 7% higher detection probabilities, respectively, compared to the two-stream multidetector MLR statistic for a fixed false alarm probability of 10⁻⁵.

Jizba-Arimitsu entropy (also called hybrid entropy) combines the axiomatics of Rényi and Tsallis entropy. It shares many properties with them; on the other hand, some aspects, e.g., MaxEnt distributions, are different. In this paper, we discuss statistical properties of hybrid entropy. We define hybrid entropy for continuous distributions and its relation to discrete entropy. Additionally, the definition of hybrid divergence and its connection to the Fisher metric are presented. Interestingly, the Fisher metric connected to hybrid entropy differs from the corresponding Fisher metrics of Rényi and Tsallis entropy. This motivates us to introduce the average hybrid entropy, which can be understood as an average between Tsallis and Rényi entropy.
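For reference, the two base entropies that hybrid entropy combines have simple closed forms for a discrete distribution p and parameter q ≠ 1. A minimal sketch of the standard definitions (the hybrid entropy itself is not spelled out in this record, so it is omitted):

```python
import math

def renyi_entropy(p, q):
    # H_q(p) = ln(sum_i p_i**q) / (1 - q), q != 1 (recovers Shannon as q -> 1).
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

def tsallis_entropy(p, q):
    # S_q(p) = (1 - sum_i p_i**q) / (q - 1), q != 1.
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)
```

For the uniform distribution on n outcomes the Rényi entropy equals ln n for every q, whereas the Tsallis entropy is (1 - n**(1 - q)) / (q - 1), which already illustrates how differently the two families behave.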

In this article, we present some simple yet effective statistical techniques for analysing and comparing large DNA sequences. These techniques are based on frequency distributions of DNA words in a large sequence, and have been packaged into a software tool called SWORDS. Using sequences available in public domain databases on the Internet, we demonstrate how SWORDS can be conveniently used by molecular biologists and geneticists to unmask biologically important features hidden in large sequences and assess their statistical significance.
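The core object behind such analyses, the frequency distribution of DNA words (k-mers) in a sequence, can be computed directly. A minimal sketch of the idea, not SWORDS' actual implementation:

```python
from collections import Counter

def word_frequencies(seq, k):
    """Relative frequencies of overlapping DNA words (k-mers) in seq."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {word: c / total for word, c in counts.items()}
```

For example, word_frequencies("ATATAT", 2) gives {"AT": 0.6, "TA": 0.4}; comparing such distributions between sequences (and against what a random model predicts) is what drives the statistical significance assessment.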

In this article, we revisit the problem of coherent multi-detector search of gravitational waves from compact binary coalescences with neutron stars and black holes using advanced interferometers like LIGO-Virgo. Based on the loss of optimal multi-detector signal-to-noise ratio (SNR), we construct a hybrid statistic as the best of the maximum-likelihood-ratio (MLR) statistics tuned for face-on and face-off binaries. The statistical properties of the hybrid statistic are studied. The performance of this hybrid statistic is compared with that of the coherent MLR statistic for generic inclination angles. Owing to the single synthetic data stream, the hybrid statistic gives fewer false alarms than the multi-detector MLR statistic and a small fractional loss in the optimum SNR for a large range of binary inclinations. We have demonstrated that for a LIGO-Virgo network and binary inclination ɛ 110°, the hybrid statistic captures more than 98% of the network optimum matched filter SNR with a low false alarm rate. The...

Research into the physiology of exercise and kinanthropometry is intended to improve our understanding of how the body responds and adapts to exercise. If such studies are to be meaningful, they have to be well designed and analysed. Advances in personal computing have made available statistical analyses that were previously the preserve of elaborate mainframe systems and have increased opportunities for investigation. However, the ease with which analyses can be performed can mask underlying philosophical and epistemological shortcomings. The aim of this review is to examine the use of four techniques that are especially relevant to physiological studies: (1) bivariate correlation and linear and non-linear regression, (2) multiple regression, (3) repeated-measures analysis of variance and (4) multi-level modelling. The importance of adhering to underlying statistical assumptions is emphasized and ways to accommodate violations of these assumptions are identified.

This paper introduces hybrid random fields, which are a class of probabilistic graphical models aimed at allowing for efficient structure learning in high-dimensional domains. Hybrid random fields, along with the learning algorithm we develop for them, are especially useful as a pseudo-likelihood estimation technique (rather than a technique for estimating strict joint probability distributions). In order to assess the generality of the proposed model, we prove that the class of pseudo-likelihood distributions representable by hybrid random fields strictly includes the class of joint probability distributions representable by Bayesian networks. Once we establish this result, we develop a scalable algorithm for learning the structure of hybrid random fields, which we call 'Markov Blanket Merging'. On the one hand, we characterize some complexity properties of Markov Blanket Merging both from a theoretical and from the experimental point of view, using a series of synthetic benchmarks. On the other hand, we evaluate the accuracy of hybrid random fields (as learned via Markov Blanket Merging) by comparing them to various alternative statistical models in a number of pattern classification and link-prediction applications. As the results show, learning hybrid random fields by the Markov Blanket Merging algorithm not only reduces significantly the computational cost of structure learning with respect to several considered alternatives, but it also leads to models that are highly accurate as compared to the alternative ones.

Since Shull's original description of heterosis, breeders have made wide use of this phenomenon. However, while breeders and agronomists have been utilizing heterosis as a means of improving crop productivity, the biological basis of heterosis remains unknown. It is generally believed that our understanding of heterosis will greatly enhance our ability to form new genotypes, either to be used directly as F1 hybrids or to form the basis for the selection programs to follow. Efforts have been made to understand the phenomenon, and they have been directly related to our capabilities for genetic analyses through the years. So, while the original data came out of studies at the phenotypic (morphological) level, they were followed by physiological and later by biochemical data. With the advent of electrophoresis and the consequent ease of accumulation of data related to isozyme variability, a number of attempts have been made to relate the genetic relatedness of inbreds to the performance of their F1 hybrid. An inherent difficulty of this approach arises because of the pedigree diversities among the parental lines. To overcome this problem, the same approach is followed in lines of similar pedigree, e.g., lines coming out of the same original population (the F2 of a single F1 hybrid) after selection. The data indicate a significant positive correlation between the heterozygosity of parental inbreds and the heterosis of their respective F1 hybrid, estimated as the deviation from the mid-parental value. Some recent data from studies at the total protein level will also be discussed.

A new statistical technique, the Cox method, used for analysing functional connectivity of simultaneously recorded multiple spike trains is presented. This method is based on the theory of modulated renewal processes and estimates a vector of influence strengths from multiple spike trains (called reference trains) to a selected (target) spike train. Selecting another target spike train and repeating the calculation of the influence strengths from the reference spike trains enables researchers to find all functional connections among multiple spike trains. In order to study functional connectivity, an "influence function" is identified. This function recognises the specificity of neuronal interactions and reflects the dynamics of the postsynaptic potential. In comparison to existing techniques, the Cox method has the following advantages: it does not use bins (it is a binless method); it is applicable to cases where the sample size is small; it is sensitive enough to estimate weak influences; it supports the simultaneous analysis of multiple influences; and it is able to identify a correct connectivity scheme in difficult cases of "common source" or "indirect" connectivity. The Cox method has been thoroughly tested using multiple sets of data generated by a neural network model of leaky integrate-and-fire neurons with a prescribed architecture of connections. The results suggest that this method is highly successful for analysing functional connectivity of simultaneously recorded multiple spike trains.

G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
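As an illustration of the kind of computation such power analysis programs perform, the power of the two-sided test of a zero correlation can be approximated with the Fisher z transform. This is a standard textbook approximation, not G*Power's exact routine:

```python
import math
from statistics import NormalDist

def correlation_test_power(r, n, alpha=0.05):
    """Approximate power of the two-sided test of H0: rho = 0.

    Uses the Fisher z transform: atanh(r_hat) is roughly normal with mean
    atanh(rho) and standard deviation 1 / sqrt(n - 3).
    """
    nd = NormalDist()
    z_crit = nd.inv_cdf(1.0 - alpha / 2.0)
    delta = math.atanh(r) * math.sqrt(n - 3)
    return nd.cdf(delta - z_crit) + nd.cdf(-delta - z_crit)
```

For a true correlation of 0.3 this gives roughly 0.80 power at n = 84, in line with conventional sample-size tables; inverting the same relationship for n is how an a priori power analysis yields a required sample size.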

The size distribution of neuronal avalanches in cortical networks has been reported to follow a power law distribution with exponent close to -1.5, which is a reflection of long-range spatial correlations in spontaneous neuronal activity. However, identifying power law scaling in empirical data can be difficult and sometimes controversial. In the present study, we tested the power law hypothesis for neuronal avalanches by using more stringent statistical analyses. In particular, we performed the following steps: (i) analysis of finite-size scaling to identify scale-free dynamics in neuronal avalanches, (ii) model parameter estimation to determine the specific exponent of the power law, and (iii) comparison of the power law to alternative model distributions. Consistent with critical state dynamics, avalanche size distributions exhibited robust scaling behavior in which the maximum avalanche size was limited only by the spatial extent of sampling ("finite size" effect). This scale-free dynamics suggests the power law as a model for the distribution of avalanche sizes. Using both the Kolmogorov-Smirnov statistic and a maximum likelihood approach, we found the slope to be close to -1.5, which is in line with previous reports. Finally, the power law model for neuronal avalanches was compared to the exponential and to various heavy-tail distributions based on the Kolmogorov-Smirnov distance and by using a log-likelihood ratio test. Both the power law distribution without and with exponential cut-off provided significantly better fits to the cluster size distributions in neuronal avalanches than the exponential, the lognormal and the gamma distribution. In summary, our findings strongly support the power law scaling in neuronal avalanches, providing further evidence for critical state dynamics in superficial layers of cortex.
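The maximum likelihood step mentioned above has a simple closed form in the continuous case. A sketch of the Clauset-Shalizi-Newman estimator, checked against synthetic power-law data (an illustration only; avalanche sizes are discrete, so the paper's exact procedure may differ):

```python
import math
import random

def powerlaw_mle_exponent(sizes, xmin):
    """Continuous maximum-likelihood estimate of alpha in P(x) ~ x**(-alpha)
    for x >= xmin (the Clauset-Shalizi-Newman closed form)."""
    tail = [x for x in sizes if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Self-check on synthetic data: inverse-transform sampling from a pure
# power law with alpha = 1.5, the exponent reported for avalanches.
random.seed(42)
samples = [(1.0 - random.random()) ** (-2.0) for _ in range(20000)]
alpha_hat = powerlaw_mle_exponent(samples, 1.0)
```

With 20000 samples the estimate lands very close to the true exponent 1.5; the Kolmogorov-Smirnov and likelihood-ratio comparisons in the study then decide whether the fitted power law beats the alternative distributions.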

The article focuses on assessing the effectiveness of a non-statistical approach to bankruptcy modelling in enterprises operating in the logistics sector. In order to describe the issue more comprehensively, the aforementioned prediction of the possible negative results of business operations was carried out for companies functioning in the Polish region of Podkarpacie, and in Slovakia. The bankruptcy predictors selected for the assessment of companies operating in the logistics sector included 28 financial indicators characterizing these enterprises in terms of their financial standing and management effectiveness. The purpose of the study was to identify factors (models) describing the bankruptcy risk in enterprises in the context of their forecasting effectiveness in a one-year and two-year time horizon. In order to assess their practical applicability, the models were carefully analysed and validated. The usefulness of the models was assessed in terms of their classification properties, the capacity to accurately identify enterprises at risk of bankruptcy and healthy companies, and the proper calibration of the models to the data from training sample sets.

Because biological processes can result in different loci having different evolutionary histories, species tree estimation requires multiple loci from across multiple genomes. While many processes can result in discord between gene trees and species trees, incomplete lineage sorting (ILS), modeled by the multi-species coalescent, is considered to be a dominant cause for gene tree heterogeneity. Coalescent-based methods have been developed to estimate species trees, many of which operate by combining estimated gene trees, and so are called "summary methods". Because summary methods are generally fast (and much faster than more complicated coalescent-based methods that co-estimate gene trees and species trees), they have become very popular techniques for estimating species trees from multiple loci. However, recent studies have established that summary methods can have reduced accuracy in the presence of gene tree estimation error, and also that many biological datasets have substantial gene tree estimation error, so that summary methods may not be highly accurate in biologically realistic conditions. Mirarab et al. (Science 2014) presented the "statistical binning" technique to improve gene tree estimation in multi-locus analyses, and showed that it improved the accuracy of MP-EST, one of the most popular coalescent-based summary methods. Statistical binning, which uses a simple heuristic to evaluate "combinability" and then uses the larger sets of genes to re-calculate gene trees, has good empirical performance, but using statistical binning within a phylogenomic pipeline does not have the desirable property of being statistically consistent. We show that weighting the re-calculated gene trees by the bin sizes makes statistical binning statistically consistent under the multispecies coalescent, and maintains the good empirical performance. Thus, "weighted statistical binning" enables highly accurate genome-scale species tree estimation, and is also statistically consistent.

In pulsar timing, timing residuals are the differences between the observed times of arrival and predictions from the timing model. A comprehensive timing model will produce featureless residuals, which are presumably composed of dominating noise and weak physical effects excluded from the timing model (e.g. gravitational waves). In order to apply optimal statistical methods for detecting weak gravitational wave signals, we need to know the statistical properties of noise components in the residuals. In this paper we utilize a variety of non-parametric statistical tests to analyze the whiteness and Gaussianity of the North American Nanohertz Observatory for Gravitational Waves (NANOGrav) 5-year timing data, which are obtained from Arecibo Observatory and Green Bank Telescope from 2005 to 2010. We find that most of the data are consistent with white noise; many data deviate from Gaussianity at different levels; nevertheless, removing outliers in some pulsars will mitigate the deviations.
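A common portmanteau check of whiteness aggregates the squared sample autocorrelations of the residuals into the Ljung-Box statistic. This is shown as a generic illustration of a whiteness test, not necessarily one of the non-parametric tests used on the NANOGrav data:

```python
import random

def ljung_box(x, max_lag):
    """Ljung-Box portmanteau statistic Q; under whiteness Q is roughly
    chi-square distributed with max_lag degrees of freedom."""
    n = len(x)
    mean = sum(x) / n
    d = [v - mean for v in x]
    denom = sum(v * v for v in d)
    q = 0.0
    for k in range(1, max_lag + 1):
        rho = sum(d[i] * d[i - k] for i in range(k, n)) / denom
        q += rho * rho / (n - k)
    return n * (n + 2) * q

# White noise should give Q near max_lag; a random walk, whose samples
# are strongly autocorrelated, gives a very large Q.
random.seed(7)
white = [random.gauss(0.0, 1.0) for _ in range(500)]
walk = []
acc = 0.0
for v in white:
    acc += v
    walk.append(acc)
```

Comparing Q to the chi-square quantile with max_lag degrees of freedom then gives a formal accept/reject decision on whiteness.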

In pulsar timing, timing residuals are the differences between the observed times of arrival and the predictions from the timing model. A comprehensive timing model will produce featureless residuals, which are presumably composed of dominating noise and weak physical effects excluded from the timing model (e.g. gravitational waves). In order to apply the optimal statistical methods for detecting the weak gravitational wave signals, we need to know the statistical properties of the noise components in the residuals. In this paper we utilize a variety of non-parametric statistical tests to analyze the whiteness and Gaussianity of the North American Nanohertz Observatory for Gravitational Waves (NANOGrav) 5-year timing data, which are obtained from the Arecibo Observatory and the Green Bank Telescope from 2005 to 2010 (Demorest et al. 2013). We find that most of the data are consistent with white noise; many data deviate from Gaussianity at different levels, nevertheless, removing outliers in some pulsars will m...

Describes female math underachievement and counseling or teaching techniques being implemented on college campuses to alleviate math anxiety. Students derive unique benefits from being taught research and statistics courses by female professors. To be effective, female professors must embody distinct feminist roles and perspectives. Specific…

The Stillwater Power Plant is the first hybrid plant in the world able to bring together a medium-enthalpy geothermal unit with solar thermal and solar photovoltaic systems. Solar field and power plant models have been developed to predict the performance of the Stillwater geothermal / solar-thermal hybrid power plant. The models have been validated using operational data from the Stillwater plant. A preliminary effort to optimize performance of the Stillwater hybrid plant using optical characterization of the solar field has been completed. The Stillwater solar field optical characterization involved measurement of mirror reflectance, mirror slope error, and receiver position error. The measurements indicate that the solar field may generate 9% less energy than the design value if an appropriate tracking offset is not employed. A perfect tracking offset algorithm may be able to boost the solar field performance by about 15%. The validated Stillwater hybrid plant models were used to evaluate hybrid plant operating strategies including turbine IGV position optimization, ACC fan speed and turbine IGV position optimization, turbine inlet entropy control using optimization of multiple process variables, and mixed working fluid substitution. The hybrid plant models predict that each of these operating strategies could increase net power generation relative to the baseline Stillwater hybrid plant operations.

This review focuses on the striking recent progress in solving for hydrophobic interactions between small inert molecules. We discuss several new understandings. Firstly, the inverse temperature phenomenology of hydrophobic interactions, i.e., strengthening of hydrophobic bonds with increasing temperature, is decisively exhibited by hydrophobic interactions between atomic-scale hard sphere solutes in water. Secondly, inclusion of attractive interactions associated with atomic-size hydrophobic reference cases leads to substantial, non-trivial corrections to reference results for purely repulsive solutes. Hydrophobic bonds are weakened by adding solute dispersion forces to treatment of reference cases. The classic statistical mechanical theory for those corrections is not accurate in this application, but molecular quasi-chemical theory shows promise. Finally, because of the masking roles of excluded volume and attractive interactions, comparisons that do not discriminate the different possibilities face an...

This paper presents a statistical aspect of an experimental study on the in-plane shear behaviour of a hybrid composite sandwich panel with an intermediate layer. The study aimed to establish how significantly the intermediate layer contributes to the in-plane shear behaviour of the newly developed sandwich panel. The investigation was designed as a single-factor experiment and the results were thoroughly analysed with the statistics software Minitab 15. The panels were tested by applying a tensile force along the diagonal of the test frame, simulating pure shear, using a 100 kN MTS servo-hydraulic UTM. The results show that the incorporation of the intermediate layer significantly enhanced the in-plane shear behaviour of the hybrid composite sandwich panel. The statistical analysis shows that the value of F0 is much higher than the value of Ftable, which means that the improvement provided by the incorporation of the intermediate layer is statistically significant.

This paper reviews two motivations for conducting "what if" analyses using Excel and "R" to understand statistical significance tests in the context of sample size. "What if" analyses can be used to teach students what statistical significance tests really do, and in applied research either prospectively to estimate what sample size…

Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.

As the complexity of structural systems and the span of spatial structures increase, the interactions between parts of the structures, especially between flexible substructures, become too complex to analyse clearly. In this paper, taking an actual gymnasium with a long-span spatial steel-cable-membrane hybrid structure as the calculation model, static and dynamic analyses of the hybrid structure are performed using both a global analysis of the whole hybrid structure and substructural analyses of the truss arch substructure, the cable-membrane substructure, etc. In addition, the stresses and displacements of structural members in the global and substructural analyses are compared. The numerical results show that serious errors arise in the substructural analysis of the hybrid structure, and that a global analysis is necessary for the hybrid structure under static and seismic loads.

In this article, a hybrid algorithm of particle swarm optimization (PSO) with a statistical parameter (HSPSO) is proposed. Basic PSO has low search precision on shifted multimodal problems because it falls into local minima. The proposed approach uses statistical characteristics to update the velocity of each particle, helping particles avoid local minima and search for the global optimum with improved convergence. The performance of the newly developed algorithm is verified on various standard multimodal, multivariable, and shifted hybrid composition benchmark problems. Further, HSPSO is compared against variants of PSO on frequency control of a hybrid renewable energy system comprising a solar system, a wind system, a diesel generator, an aqua electrolyzer, and an ultracapacitor. A significant improvement in the convergence characteristic of the HSPSO algorithm over other variants of PSO is observed in solving both the benchmark optimization and the renewable hybrid system problems.
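The abstract does not give HSPSO's exact update rule, so the sketch below is only illustrative: it augments the standard PSO velocity update with a perturbation scaled by the swarm's per-dimension position statistics (mean and standard deviation), which is one hypothetical reading of "statistical characteristics". The benchmark function and all parameter values are assumptions, not the paper's.

```python
import random, math

def sphere_shifted(x, shift=2.0):
    # Shifted sphere: a simple stand-in for a shifted benchmark problem.
    return sum((xi - shift) ** 2 for xi in x)

def hspso(f, dim=5, n=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        # Swarm statistics per dimension: mean and std of current positions.
        mean = [sum(p[d] for p in pos) / n for d in range(dim)]
        std = [math.sqrt(sum((p[d] - mean[d]) ** 2 for p in pos) / n)
               for d in range(dim)]
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Standard PSO velocity plus a statistical perturbation term
                # drawn from the swarm's own position spread (assumed form).
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d])
                             + rng.gauss(0.0, 0.1) * std[d])
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f

best, best_f = hspso(sphere_shifted)
print(round(best_f, 4))  # close to 0 after 200 iterations
```

The perturbation naturally anneals: as the swarm converges, the per-dimension standard deviation shrinks, so the extra exploration term vanishes.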

The RooStatsCms (RSC) software framework allows analysis modelling and combination, and statistical studies, together with access to sophisticated graphics routines for the visualisation of results. The goal of the project is to complement existing analyses by means of their combination and accurate statistical studies.

The web-based, Java-written SOCR (Statistical Online Computational Resource) tools have been utilized in many undergraduate and graduate level statistics courses for seven years now (Dinov 2006; Dinov et al. 2008b). It has been proven that these resources can successfully improve students' learning (Dinov et al. 2008b). First published online in 2005, SOCR Analyses is a relatively new component that concentrates on data modeling for both parametric and non-parametric data analyses with graphical model diagnostics. One of the main purposes of SOCR Analyses is to facilitate statistical learning for high school and undergraduate students. As we have already implemented SOCR Distributions and Experiments, SOCR Analyses and Charts fulfill the rest of a standard statistics curriculum. Currently, there are four core components of SOCR Analyses. Linear models included in SOCR Analyses are simple linear regression, multiple linear regression, and one-way and two-way ANOVA. Tests for sample comparisons include the t-test in the parametric category. Examples in SOCR Analyses' non-parametric category are the Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, Kolmogorov-Smirnov test, and Fligner-Killeen test. Hypothesis testing models include the contingency table, Friedman's test, and Fisher's exact test. The last component of Analyses is a utility for computing sample sizes for the normal distribution. In this article, we present the design framework, computational implementation, and utilization of SOCR Analyses.

Forcefields used in biomolecular simulations are composed of energetic terms that are either physical in nature, based on parameter fitting to quantum mechanical simulations or experimental data, or statistical, drawing on high-resolution structural data to describe distributions of molecular features. Combining the two in a single forcefield is challenging, since the physical terms describe some, but not all, of the observed statistics, leading to double counting. In this manuscript, we develop a general scheme for correcting statistical potentials used in combination with physical terms. We apply these corrections to the sidechain torsional potential used in the Rosetta all-atom forcefield. We show the approach identifies instances of double-counted interactions, including electrostatic interactions between a sidechain and the nearby backbone, and steric interactions between neighboring Cβ atoms within secondary structural elements. Moreover, this scheme allows for the inclusion of intraresidue physical terms, previously turned off to avoid overlap with the statistical potential. Combined, these corrections lead to a forcefield with improved performance on several structure prediction tasks, including rotamer prediction and native structure discrimination.

Regional climate change studies usually rely on downscaling of global climate model (GCM) output in order to resolve important fine-scale features and processes that govern local climate. Previous efforts have used one of two techniques: (1) dynamical downscaling, in which a regional climate model is forced at the boundaries by GCM output, or (2) statistical downscaling, which employs historical empirical relationships to go from coarse to fine resolution. Studies using these methods have been criticized because they either dynamically downscaled only a few GCMs, or used statistical downscaling on an ensemble of GCMs but missed important dynamical effects in the climate change signal. This study describes the development and evaluation of a hybrid dynamical-statistical downscaling method that utilizes aspects of both dynamical and statistical downscaling to address these concerns. The first step of the hybrid method is to use dynamical downscaling to understand the most important physical processes that contribute to the climate change signal in the region of interest. Then a statistical model is built based on the patterns and relationships identified from dynamical downscaling. This statistical model can be used to downscale an entire ensemble of GCMs quickly and efficiently. The hybrid method is first applied to a domain covering the Los Angeles region to generate projections of temperature change between the 2041-2060 and 1981-2000 periods for 32 CMIP5 GCMs. The hybrid method is also applied to a larger region covering all of California and the adjacent ocean. The hybrid method works well in both areas, primarily because a single feature, the land-sea contrast in the warming, controls the overwhelming majority of the spatial detail. Finally, the dynamically downscaled temperature change patterns are compared to those produced by two commonly used statistical methods, BCSD and BCCA. Results show that dynamical downscaling recovers important spatial features that the…
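Since a single feature (the regional warming and its land-sea expression) carries most of the signal, the statistical step can be caricatured as one linear regression per fine-grid cell, trained on the few dynamically downscaled runs and then applied cheaply to the full GCM ensemble. This is only a minimal sketch of the idea; the predictor choice and all toy numbers below are invented.

```python
def fit_cell(coarse, fine):
    """Least-squares slope/intercept relating a coarse GCM warming signal
    to one fine-grid cell's dynamically downscaled warming."""
    n = len(coarse)
    mx = sum(coarse) / n
    my = sum(fine) / n
    sxx = sum((x - mx) ** 2 for x in coarse)
    sxy = sum((x - mx) * (y - my) for x, y in zip(coarse, fine))
    b = sxy / sxx
    a = my - b * mx
    return a, b

def hybrid_downscale(train_coarse, train_fine_by_cell, new_coarse):
    """Train per-cell regressions on a few dynamically downscaled GCMs,
    then apply them to a full ensemble of coarse-only GCM signals."""
    models = [fit_cell(train_coarse, cell) for cell in train_fine_by_cell]
    return [[a + b * x for (a, b) in models] for x in new_coarse]

# Toy example: 3 training GCMs, 2 fine cells (coastal vs inland, hypothetical).
train_coarse = [1.0, 2.0, 3.0]            # regional-mean warming per GCM (deg C)
train_fine = [[0.8, 1.6, 2.4],            # coastal cell: damped warming
              [1.3, 2.6, 3.9]]            # inland cell: amplified warming
proj = hybrid_downscale(train_coarse, train_fine, [2.5])
print([round(v, 2) for v in proj[0]])  # → [2.0, 3.25]
```

Once the per-cell models are fit, downscaling an additional GCM costs one multiply-add per grid cell, which is what makes scanning all 32 CMIP5 members feasible.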

A hybrid statistics-simulations-based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low-dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations in the existing statistics-based method for atom-counting, and in this manner accounts for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low-dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials.

According to Howard Gardner, Professor of Cognition and Education at Harvard University, human intelligence cannot be measured with a single factor such as the IQ level. Instead, he and others have suggested that humans have different types of intelligence. This paper examines whether students registered in online or mostly online courses have a different type of intelligence from students registered in traditional face-to-face courses. At the beginning of the fall semester of 2011, a group of 128 students from four different courses in Business Statistics completed a survey to determine their types of intelligence. Our findings reveal surprising results with important consequences in terms of teaching styles that better fit our students.

The analysis of hybrid fossil-geothermal power plants is extended to compound hybrid systems which combine the features of previously analyzed systems: the geothermal-preheat and the fossil-superheat systems. Compound systems of the one- and two-stage type are considered. A compilation of working formulae from earlier studies is included for completeness. Results are given for parametric analyses of compound hybrid plants. System performance was determined for wellhead conditions of 150, 200, and 250 °C, and for steam fractions of 10, 20, 30, and 40%. For two-stage systems an additional cycle variable, the hot water flash fraction, was varied from 0 to 100% in increments of 25%. From the viewpoint of thermodynamics, compound hybrid plants are superior to individual all-geothermal and all-fossil plants, and are shown to have certain advantages over basic geothermal-preheat and fossil-superheat hybrid plants. The flexibility of compound hybrid systems is illustrated by showing how such plants might be used at six geothermal sites in the western United States. The question of the optimum match between the energy resources and the power plant is addressed, and an analysis given for a hypothetical geothermal resource.

The Transplant Registry Unified Management Program (TRUMP) made it possible for members of the Japan Society for Hematopoietic Cell Transplantation (JSHCT) to analyze large sets of national registry data on autologous and allogeneic hematopoietic stem cell transplantation. However, as the processes used to collect transplantation information are complex and differed over time, the background of these processes should be understood when using TRUMP data. Previously, information on the HLA locus of patients and donors had been collected using a questionnaire-based free-description method, resulting in some input errors. To correct minor but significant errors and provide accurate HLA matching data, the use of a Stata or EZR/R script offered by the JSHCT is strongly recommended when analyzing HLA data in the TRUMP dataset. The HLA mismatch direction, mismatch counting method, and different impacts of HLA mismatches by stem cell source are other important factors in the analysis of HLA data. Additionally, researchers should understand the statistical analyses specific for hematopoietic stem cell transplantation, such as competing risk, landmark analysis, and time-dependent analysis, to correctly analyze transplant data. The data center of the JSHCT can be contacted if statistical assistance is required.

The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses, such as linear models (simple linear regression, multiple linear regression, one-way and two-way ANOVA). In addition, we implemented tests for sample comparisons: the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least-squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most…

from one municipality was sorted at "Level III", e.g. detailed, while the two others were sorted only at "Level I"). The results showed that residual household waste mainly contained food waste (42 +/- 5%, mass per wet basis) and miscellaneous combustibles (18 +/- 3%, mass per wet basis). The residual… household waste generation rate in the study areas was 3-4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three…

This brief paper reports a hybrid algorithm we developed recently to solve global optimization problems for multimodal functions by combining the advantages of two powerful population-based metaheuristics: differential evolution (DE) and particle swarm optimization (PSO). In the hybrid, denoted DEPSO, each individual in a generation chooses its evolution method, DE or PSO, in a statistical learning way. The choice depends on the relative success ratio of the two methods over a preceding learning period. The proposed DEPSO is compared with its PSO and DE parents, two advanced DE variants (one suggested by the originators of DE), two advanced PSO variants (one acknowledged as a recent standard by the PSO community), and also a previous DEPSO. Benchmark tests demonstrate that DEPSO is more competent for the global optimization of multimodal functions due to its high optimization quality.
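The paper's selection mechanism is described only at a high level; a minimal sketch of the idea, choosing DE or PSO with probability tied to each method's success ratio over the preceding learning period, might look like this (the Laplace smoothing is an added assumption to keep probabilities well-defined when a method has few trials):

```python
import random

def choose_method(de_success, de_tries, pso_success, pso_tries, rng):
    """Pick 'DE' or 'PSO' with probability proportional to each method's
    success ratio over the preceding learning period (Laplace-smoothed)."""
    r_de = (de_success + 1) / (de_tries + 2)
    r_pso = (pso_success + 1) / (pso_tries + 2)
    return "DE" if rng.random() < r_de / (r_de + r_pso) else "PSO"

rng = random.Random(0)
# Suppose DE improved 8 of 10 recent trials and PSO only 2 of 10:
picks = [choose_method(8, 10, 2, 10, rng) for _ in range(1000)]
print(picks.count("DE"))  # roughly 750 of 1000
```

In a full DEPSO loop, each individual would record whether its chosen method improved its fitness, update the two counters over a sliding learning window, and re-draw the method each generation.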

Tolerancing conducted by design engineers to meet customers' needs is a prerequisite for producing high-quality products. Engineers use handbooks to conduct tolerancing. While the use of statistical methods for tolerancing is not new, engineers often rely on known distributions, including the normal distribution. Yet, if the statistical distribution of the given variable is unknown, a new statistical method is needed to design tolerances. In this paper, we use the generalized lambda distribution to design and analyse component tolerances. We use the percentile method (PM) to estimate the distribution parameters. The findings indicate that, when the distribution of the component data is unknown, the proposed method can expedite the design of component tolerances. Moreover, in the case of assembled sets, wider tolerances can be used for each component while achieving the same target performance.
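The paper fits all four generalized lambda parameters via the percentile method, which is beyond a short sketch. The underlying idea, however, reading tolerance limits off empirical percentiles when no distribution can be assumed, can be illustrated as follows; the data, coverage level, and quantile routine are all invented for this example.

```python
import random

def empirical_quantile(data, q):
    """Linear-interpolation empirical quantile, 0 <= q <= 1."""
    s = sorted(data)
    idx = q * (len(s) - 1)
    lo = int(idx)
    hi = min(lo + 1, len(s) - 1)
    frac = idx - lo
    return s[lo] * (1 - frac) + s[hi] * frac

def percentile_tolerance(data, coverage=0.95):
    """Distribution-free tolerance band covering the central `coverage`
    fraction of the observed component measurements."""
    alpha = (1 - coverage) / 2
    return empirical_quantile(data, alpha), empirical_quantile(data, 1 - alpha)

rng = random.Random(42)
# Hypothetical component dimension data from an unknown, skewed process:
sample = [10 + rng.expovariate(1.0) for _ in range(2000)]
lo, hi = percentile_tolerance(sample, 0.95)
print(round(lo, 2), round(hi, 2))
```

The asymmetric band (the upper limit sits much farther from the bulk than the lower one) is exactly what a normal-distribution assumption would get wrong for skewed data, which is the paper's motivation for a flexible family like the generalized lambda distribution.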

As basal angiosperms, Liriodendron chinense (Hemsl.) Sarg. and Liriodendron tulipifera Linn. are two species belonging to the genus Liriodendron. Hybrid yellow-poplar was obtained by crossing Liriodendron tulipifera Linn. x L. chinense (Hemsl.) Sarg. Although hybrid yellow-poplar is strong in both growth and adaptation, its fruiting rate is as low as that of its parents. In this study, we profiled the proteome of the hybrid yellow-poplar stigma before and after pollination. Comparative analyses of two-dimensional gel electrophoresis maps from un-pollinated and pollinated stigmas showed that 30 proteins increased and 27 proteins decreased after pollination. Functional categorization showed that most of them were metabolism-related, stress-response-related, and protein biosynthesis, degradation, and destination-related proteins. There were also some redox-related and cell-signaling-related proteins. All these changed proteins might be involved in, or affect, the pollen-stigma interaction in hybrid yellow-poplar. This study will be helpful in understanding the regulation of sexual reproduction in the genus Liriodendron.

The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
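The first two procedures in the list (linear patterns via correlation coefficients, monotonic patterns via rank correlation coefficients) can be sketched in a few lines. This is a generic illustration, not the paper's code; the rank routine below ignores ties for simplicity, and the cubic example is invented.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient: detects linear relationships."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def ranks(v):
    # Rank positions 1..n; ties are not averaged (fine for distinct values).
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order):
        r[i] = float(rank + 1)
    return r

def spearman(xs, ys):
    # Rank correlation: sensitive to monotonic, not just linear, relations.
    return pearson(ranks(xs), ranks(ys))

xs = list(range(1, 21))
ys = [x ** 3 for x in xs]          # monotonic but strongly nonlinear
print(round(pearson(xs, ys), 3))   # < 1: linearity understates the relation
print(round(spearman(xs, ys), 3))  # → 1.0: monotonic pattern fully detected
```

The gap between the two coefficients on the same scatterplot is what motivates running the battery of increasingly general tests: a variable the linear test misses can still be flagged by the rank-based or variability-based procedures.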

Abstract Background The analysis of large-scale gene expression data is a fundamental approach to functional genomics and the identification of potential drug targets. Results derived from such studies cannot be trusted unless they are adequately designed and reported. The purpose of this study is to assess current practices in the reporting of experimental design and statistical analyses in gene expression-based studies. Methods We reviewed hundreds of MEDLINE-indexed papers involving gene expression data analysis, which were published between 2003 and 2005. These papers were examined on the basis of their reporting of several factors, such as sample size, statistical power and software availability. Results Among the examined papers, we concentrated on 293 papers consisting of applications and new methodologies. These papers did not report approaches to sample size and statistical power estimation. Explicit statements on data transformation and descriptions of the normalisation techniques applied prior to data analyses (e.g. classification) were not reported in 57 (37.5%) and 104 (68.4%) of the methodology papers respectively. With regard to papers presenting biomedically relevant applications, 41 (29.1%) of these papers did not report on data normalisation and 83 (58.9%) did not describe the normalisation technique applied. Clustering-based analysis, the t-test and ANOVA represent the most widely applied techniques in microarray data analysis. But remarkably, only 5 (3.5%) of the application papers included statements or references to assumptions about variance homogeneity for the application of the t-test and ANOVA. There is still a need to promote the reporting of software packages applied or their availability. Conclusion Recently published gene expression data analysis studies may lack key information required for properly assessing their design quality and potential impact. There is a need for more rigorous reporting of important experimental…

Based on quantum statistical mechanics and microscopic quantum dynamics, we prove Planck's and Kelvin's principles for macroscopic systems in a general and realistic setting. We consider a hybrid quantum system that consists of the thermodynamic system, which is initially in thermal equilibrium, and the "apparatus" which operates on the former, and assume that the whole system evolves autonomously. This provides a satisfactory derivation of the second law for macroscopic systems.

Phylogenetic relationships of rabbitfishes (the family Siganidae), ecologically important components as primary consumers in coral reef communities, were studied using mitochondrial cytochrome b gene and nuclear ITS1 (internal transcribed spacer 1) sequence analyses. The analyses of 19 out of 22 species known in the Western Pacific region revealed that siganids are genetically clustered into three major clades, which are characterized by some morphological and ecological traits. Between closely related species, such as Siganus guttatus-S. lineatus and S. virgatus-S. doliatus, and also between two morphs recognized in S. corallinus, small but discernible genetic differentiation was found, implying that the components of each pair are incipient species. On the other hand, between some species, such as S. fuscescens-S. canaliculatus and S. unimaculatus-S. vulpinus, individuals of the components of each pair were found to construct a genetic mosaic, suggesting that the components are genetic color morphs within a single biological species, respectively. Moreover, evidence from morphological characters, mtDNA, and nuclear DNA gave an inconsistent picture of identity and relationships for several individuals. They were regarded as hybrids or individuals with hybrid origin. Such instances were observed not only between closely related species, such as S. guttatus-S. lineatus, S. virgatus-S. doliatus, and two morphs (incipient species) in S. corallinus, respectively, but also between distantly related ones, such as S. corallinus-S. puellus. In fact, more than half of the species examined (11/20, when treating the two morphs in S. corallinus as independent species) were involved in hybridization. These suggest that hybridization is much more prevalent in marine fishes than previously assumed, and may have some relevance to their diversification.

A review of recent studies on monthly and daily rainfall in Catalonia is presented. Monthly rainfall is analysed along the west Mediterranean coast and in Catalonia, quantifying aspects such as the irregularity of monthly amounts and the spatial distribution of the Standard Precipitation Index. Several statistics are applied to daily rainfall series, such as their extreme-value and intra-annual spatial distributions, the variability of the average and standard-deviation rain amounts for each month, their amount and time distributions, and time trends affecting four pluviometric indices for different percentiles and class intervals. All these analyses continue the scientific study of Catalan rainfall, which started about a century ago.

The sexual mating of the pathogenic yeast Cryptococcus neoformans is important for pathogenesis studies because the fungal virulence is linked to the alpha mating type (MATα). We characterized C. neoformans mating pheromones (MFα1 and MFa1) from 122 strains to understand intervariety hybridization or mating and intervariety virulence. MFα1 in three C. neoformans varieties showed (a) specific nucleotide polymorphisms, (b) different copy numbers and chromosomal localizations, and (c) unique deduced amino acids in two geographic populations of C. neoformans var. gattii. MFα1 of different varieties cross-hybridized in Southern hybridizations. Their phylogenetic analyses showed purifying selection (neutral evolution). These observations suggested that MATα strains from any of the three C. neoformans varieties could mate or hybridize in nature with MATa strains of C. neoformans var. neoformans. A few serotype A/D diploid strains provided evidence for mating or hybridization, while a majority of A/D strains tested positive for haploid MFα1 identical to that of C. neoformans var. grubii. MFα1 sequence and copy numbers in diploids were identical to those of C. neoformans var. grubii, while their MFa1 sequences were identical to those of C. neoformans var. neoformans; thus, these strains were hybrids. The mice survival curves and histological lesions revealed A/D diploids to be highly pathogenic, with pathogenicity levels similar to that of the C. neoformans var. grubii type strain and unlike the low pathogenicity levels of C. neoformans var. neoformans strains. In contrast to MFα1 in the three varieties, MFa1 amplicons and hybridization signals could be obtained only from two C. neoformans var. neoformans reference strains and eight A/D diploids. This suggested that a yet undiscovered MFa pheromone(s) in C. neoformans var. gattii and C. neoformans var. grubii is unrelated to, highly divergent from, or rarer than that in C…

We present two main results, based on the models and the statistical analyses of 1672 U-band flares. We also discuss the behaviour of white-light flares. In addition, the parameters of the flares detected from two years of observations of CR Dra are presented. By comparing with the flare parameters obtained from other UV Ceti-type stars, we examine the behaviour of optical flare processes across spectral types. Moreover, using the large white-light flare data set, we aimed to analyse the flare time-scales with respect to some results obtained from X-ray observations. Using the SPSS V17.0 and GraphPad Prism V5.02 software, the flares detected from CR Dra were modelled with the OPEA function and analysed with the t-test method to compare similar flare events in other stars. In addition, using some regression calculations to derive the best histograms, the time-scales of the white-light flares were analysed. Firstly, CR Dra flares have revealed that white-light flares behave in a similar way as th…

Using statistical methods to analyse digital material for patterns makes it possible to detect patterns in big data that we would otherwise be unable to detect. This paper seeks to exemplify this fact by statistically analysing a large corpus of references in systematic reviews. The aim…

Rear-end crashes are among the most common types of traffic crashes in the U.S. A good understanding of their characteristics and contributing factors is of practical importance. Previously, both multinomial logit models and Bayesian network methods have been used in crash modeling and analysis, although each has its own application restrictions and limitations. In this study, a hybrid approach is developed that combines multinomial logit models and Bayesian network methods to comprehensively analyze driver injury severities in rear-end crashes, based on state-wide crash data collected in New Mexico from 2010 to 2011. A multinomial logit model is developed to investigate and identify significant contributing factors for rear-end crash driver injury severities classified into three categories: no injury, injury, and fatality. Then, the identified significant factors are used to establish a Bayesian network that explicitly formulates statistical associations between injury severity outcomes and explanatory attributes, including driver behavior, demographic features, vehicle factors, geometric and environmental characteristics, etc. The test results demonstrate that the proposed hybrid approach performs reasonably well. The Bayesian network inference analyses indicate that factors including truck involvement, inferior lighting conditions, windy weather conditions, and the number of vehicles involved could significantly increase driver injury severities in rear-end crashes. The developed methodology and estimation results provide insights for developing effective countermeasures to reduce rear-end crash injury severities and improve traffic system safety performance.

Improving water management in water-stressed regions requires reliable seasonal precipitation prediction, which remains a grand challenge. Numerous statistical and dynamical model simulations have been developed for predicting precipitation. However, both types of models offer limited seasonal predictability. This study outlines a hybrid statistical-dynamical modeling framework for predicting seasonal precipitation. The dynamical component relies on the physically based North American Multi-Model Ensemble (NMME) model simulations (99 ensemble members). The statistical component relies on a multivariate Bayesian-based model that relates precipitation to atmosphere-ocean teleconnections (also known as an analog-year statistical model). Here the Pacific Decadal Oscillation (PDO), Multivariate ENSO Index (MEI), and Atlantic Multidecadal Oscillation (AMO) are used in the statistical component. The dynamical and statistical predictions are linked using the so-called Expert Advice algorithm, which offers an ensemble response (as an alternative to the ensemble mean). The latter leads to the best precipitation prediction based on the contributing statistical and dynamical ensembles. It combines the strength of physically based dynamical simulations with the capability of an analog-year model. An application of the framework in the southwestern United States, which has suffered major droughts over the past decade, improves seasonal precipitation predictions (3-5 month lead time) by 5-60% relative to the NMME simulations. Overall, the hybrid framework performs better in predicting negative precipitation anomalies (10-60% improvement over NMME) than positive precipitation anomalies (5-25% improvement over NMME). The results indicate that the framework would likely improve our ability to predict droughts such as the 2012-2014 event in the western United States that resulted in significant socioeconomic impacts.
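The abstract names the Expert Advice algorithm without giving its update rule. A standard member of that family is the exponentially weighted average forecaster, sketched below: each ensemble member is treated as an expert whose weight decays exponentially in its cumulative loss. The learning rate eta, the squared-error loss, and the toy biases are all assumptions for illustration.

```python
import math

def expert_advice(predictions, observations, eta=0.5):
    """Exponentially weighted average forecaster.
    predictions[t][k]: forecast of expert k at time t;
    observations[t]: the realized value. Returns the combined forecasts."""
    n_experts = len(predictions[0])
    weights = [1.0] * n_experts
    combined = []
    for preds, obs in zip(predictions, observations):
        total = sum(weights)
        combined.append(sum(w * p for w, p in zip(weights, preds)) / total)
        # Exponential update: penalize each expert by its squared error.
        weights = [w * math.exp(-eta * (p - obs) ** 2)
                   for w, p in zip(weights, preds)]
    return combined

# Toy ensemble: expert 0 is biased high, expert 1 tracks truth closely,
# expert 2 is biased low (invented stand-ins for NMME members).
obs = [1.0, 1.2, 0.9, 1.1, 1.0, 1.3]
preds = [[o + 0.8, o + 0.05, o - 0.3] for o in obs]
out = expert_advice(preds, obs)
print(round(abs(out[-1] - obs[-1]), 3))  # far smaller than the initial error
```

Unlike the plain ensemble mean (the first combined forecast, which carries the average bias of all experts), the weighted response concentrates on whichever members have been accurate so far, which is the "ensemble response" idea the abstract contrasts with the ensemble mean.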

With advancements in next-generation sequencing technology, a massive amount of sequencing data is generated, which offers a great opportunity to comprehensively investigate the role of rare variants in the genetic etiology of complex diseases. Nevertheless, the high-dimensional sequencing data poses a great challenge for statistical analysis. Association analyses based on traditional statistical methods suffer substantial power loss because of the low frequency of genetic variants and the extremely high dimensionality of the data. We developed a Weighted U Sequencing test, referred to as WU-SEQ, for the high-dimensional association analysis of sequencing data. Based on a nonparametric U-statistic, WU-SEQ makes no assumption about the underlying disease model and phenotype distribution, and can be applied to a variety of phenotypes. Through simulation studies and an empirical study, we showed that WU-SEQ outperformed a commonly used sequence kernel association test (SKAT) method when the underlying assumptions were violated (e.g., the phenotype followed a heavy-tailed distribution). Even when the assumptions were satisfied, WU-SEQ still attained comparable performance to SKAT. Finally, we applied WU-SEQ to sequencing data from the Dallas Heart Study (DHS), and detected an association between ANGPTL4 and very low density lipoprotein cholesterol.
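
WU-SEQ itself is described only at a high level here, but the flavour of a degree-2 U-statistic association test with a permutation p-value can be sketched as follows. The kernels, the absence of weights, and the simulated data are all simplifying assumptions, not the published method:

```python
import numpy as np

rng = np.random.default_rng(0)

def u_association(genotypes, phenotypes):
    """Degree-2 U-statistic: average over pairs i != j of the product of
    genotype similarity and phenotype similarity (a simplified stand-in)."""
    n = len(phenotypes)
    g = genotypes - genotypes.mean(axis=0)      # centre variant counts
    p = phenotypes - phenotypes.mean()
    kg = g @ g.T                                # genotype similarity kernel
    kp = np.outer(p, p)                         # phenotype similarity kernel
    mask = ~np.eye(n, dtype=bool)               # exclude i == j terms
    return float((kg * kp)[mask].mean())

def permutation_pvalue(genotypes, phenotypes, n_perm=200):
    """Null distribution by shuffling phenotypes across subjects."""
    observed = u_association(genotypes, phenotypes)
    hits = sum(
        u_association(genotypes, rng.permutation(phenotypes)) >= observed
        for _ in range(n_perm)
    )
    return (hits + 1) / (n_perm + 1)

# Simulated rare variants (60 subjects, 10 variants); variant 0 drives a
# heavy-tailed phenotype, the setting where rank/U-type methods shine.
geno = rng.binomial(1, 0.05, size=(60, 10)).astype(float)
pheno = 2.0 * geno[:, 0] + rng.standard_t(df=3, size=60)
pval = permutation_pvalue(geno, pheno)
```

Because the statistic is built from pairwise comparisons rather than a parametric likelihood, no distributional form for the phenotype is assumed, which is the property the abstract emphasises.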

Any genome-wide analysis is hampered by reduced statistical power due to multiple comparisons. This is particularly true for interaction analyses, which have lower statistical power than analyses of associations. To assess gene-environment interactions in population settings we have recently proposed a statistical method based on a modified two-step approach, where first genetic loci are selected by their associations with disease and environment, respectively, and subsequently tested for interactions. We have simulated various data sets resembling real world scenarios and compared single-step and two-step approaches with respect to true positive rate (TPR) in 486 scenarios and (study-wide) false positive rate (FPR) in 252 scenarios. Our simulations confirmed that in all two-step methods the two steps are not correlated. In terms of TPR, two-step approaches combining information on gene-disease association and gene-environment association in the first step were superior to all other methods, while preserving a low FPR in over 250 million simulations under the null hypothesis. Our weighted modification yielded the highest power across various degrees of gene-environment association in the controls. An optimal threshold for step 1 depended on the interacting allele frequency and the disease prevalence. In all scenarios, the least powerful method was to proceed directly to an unbiased full interaction model, applying conventional genome-wide significance thresholds. This simulation study confirms the practical advantage of two-step approaches to interaction testing over more conventional one-step designs, at least in the context of dichotomous disease outcomes and other parameters that might apply in real-world settings.
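
The two-step idea can be illustrated with a toy screen: step 1 filters markers on marginal gene-disease association at a lenient threshold, and step 2 tests interactions only among the survivors, paying a much smaller multiple-testing penalty. The correlation z-scores and fixed thresholds below are crude stand-ins for the proper regression-based tests used in practice:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 2000, 50                                  # subjects, genetic markers
G = rng.binomial(2, 0.3, size=(n, m)).astype(float)
E = rng.binomial(1, 0.5, size=n).astype(float)
# Disease risk depends on marker 0 only through a G x E interaction.
logit = -2.0 + 0.8 * G[:, 0] * E
D = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

def corr_z(x, y):
    """Approximate z-score of the Pearson correlation."""
    r = np.corrcoef(x, y)[0, 1]
    return abs(r) * np.sqrt(len(x))

# Step 1: lenient marginal gene-disease screen (z > 1 keeps ~1/3 of nulls).
step1 = [j for j in range(m) if corr_z(G[:, j], D) > 1.0]

# Step 2: interaction-style test only for survivors; z > 3.29 corresponds
# to two-sided p ~ 0.001, a Bonferroni-like cutoff for the reduced set.
hits = [j for j in step1 if corr_z(G[:, j] * E, D) > 3.29]
```

A single-step scan would instead apply the full genome-wide threshold to every marker's interaction test, which is where the power loss described in the abstract comes from.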

Terrain aspect alters the spatial distribution of insolation across topography, driving eco-pedo-hydro-geomorphic feedbacks that can alter landform evolution and result in valley asymmetries for a suite of land surface characteristics (e.g. slope length and steepness, vegetation, soil properties, and drainage development). Asymmetric valleys serve as natural laboratories for studying how landscapes respond to climate perturbation. In the semi-arid montane granodioritic terrain of the Idaho batholith, Northern Rocky Mountains, USA, prior works indicate that reduced insolation on northern (pole-facing) aspects prolongs snow pack persistence and is associated with thicker, finer-grained soils that retain more water, prolong the growing season, support coniferous forest rather than sagebrush steppe ecosystems, stabilize slopes at steeper angles, and produce sparser drainage networks. We hypothesize that the primary drivers of valley asymmetry development are changes in the pedon-scale water balance that coalesce to alter catchment-scale runoff and drainage development, and ultimately cause the divide between north- and south-facing land surfaces to migrate northward. We explore this conceptual framework by coupling land surface analyses with statistical modeling to assess relationships and the relative importance of land surface characteristics. Throughout the Idaho batholith, we systematically mapped and tabulated various statistical measures of landforms, land cover, and hydroclimate within discrete valley segments (n=~10,000). We developed a random forest based statistical model to predict valley slope asymmetry based upon numerous measures (n>300) of landscape asymmetries. Preliminary results suggest that drainages are tightly coupled with hillslopes throughout the region, with drainage-network slope being one of the strongest predictors of land-surface-averaged slope asymmetry. When slope-related statistics are excluded, due to possible autocorrelation, valley
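
Random forest results of this kind are usually read through variable importance. A model-agnostic version (permutation importance) can be sketched with a linear model standing in for the forest; all variable names and data below are simulated stand-ins for the valley-segment table:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated valley-segment table: three asymmetry predictors, where
# "drainage slope" (column 0) is the real driver; names are illustrative.
n = 500
X = rng.normal(size=(n, 3))          # drainage_slope, vegetation, snow
y = 2.0 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(scale=0.5, size=n)

# A linear model stands in for the random forest to keep the sketch small.
coef, *_ = np.linalg.lstsq(np.c_[np.ones(n), X], y, rcond=None)

def predict(M):
    return np.c_[np.ones(len(M)), M] @ coef

baseline = float(np.mean((predict(X) - y) ** 2))

def permutation_importance(j, n_rounds=20):
    """Mean MSE increase when feature j is shuffled (model-agnostic)."""
    deltas = []
    for _ in range(n_rounds):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])
        deltas.append(np.mean((predict(Xp) - y) ** 2) - baseline)
    return float(np.mean(deltas))

importances = [permutation_importance(j) for j in range(3)]
```

Shuffling a predictor destroys its relationship with the response while leaving its marginal distribution intact, so the MSE increase isolates that predictor's contribution, the same logic a forest's importance ranking exploits.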

Sandstones are a popular natural stone due to their wide occurrence and availability. The different applications for these stones have led to an increase in demand. From the viewpoint of conservation and the natural stone industry, an understanding of the material behaviour of this construction material is very important. Sandstones are a highly heterogeneous material. Based on statistical analyses with a sufficiently large dataset, a systematic approach to predicting the material behaviour should be possible. Since the literature already contains a large volume of data concerning the petrographical and petrophysical properties of sandstones, a large dataset could be compiled for the statistical analyses. The aim of this study is to develop constraints on the material behaviour and especially on the weathering behaviour of sandstones. Approximately 300 samples from historical and presently mined natural sandstones in Germany and ones described worldwide were included in the statistical approach. The mineralogical composition and fabric characteristics were determined from detailed thin section analyses and descriptions in the literature. Particular attention was paid to evaluating the compositional and textural maturity, grain contacts and contact thickness, type of cement, degree of alteration and the intergranular volume. Statistical methods were used to test for normal distributions and to calculate linear regressions of the basic petrophysical properties of density, porosity and water uptake, as well as strength. The sandstones were classified into three different pore size distributions and evaluated against the other petrophysical properties. Weathering behaviour, such as hygric swelling and salt loading tests, was also included. To identify similarities between individual sandstones or to define groups of specific sandstone types, principal component analysis, cluster analysis and factor analysis were applied. Our results show that composition and porosity
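
The regression/PCA part of such a pipeline can be sketched on a simulated petrophysical table; all values below are synthetic, whereas in the study they come from roughly 300 thin-section and laboratory measurements:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical petrophysical table (rows: sandstone samples; columns:
# density, porosity, water uptake, strength); all values are simulated.
n = 300
porosity = rng.uniform(2, 25, n)                          # vol.%
density = 2.65 * (1 - porosity / 100) + rng.normal(0, 0.02, n)
water_uptake = 0.4 * porosity + rng.normal(0, 0.5, n)     # wt.%
strength = 120 - 3.5 * porosity + rng.normal(0, 5, n)     # MPa
X = np.column_stack([density, porosity, water_uptake, strength])

# Principal component analysis on standardised variables via SVD.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)       # variance share per component
scores = Z @ Vt.T                         # sample coordinates on the PCs
```

The dominant first component here simply reflects that density, water uptake and strength are all driven by porosity, mirroring the kind of structure PCA extracts from real petrophysical datasets.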

distribution to describe counts of a given species in a particular region and season. 4. Using a large database of historical at-sea seabird survey data, we applied this technique to identify appropriate statistical distributions for modeling a variety of species, allowing the distribution to vary by season. For each species and season, we used the selected distribution to calculate and map retrospective statistical power to detect hotspots and coldspots, and map p-values from Monte Carlo significance tests of hotspots and coldspots, in discrete lease blocks designated by the U.S. Department of the Interior, Bureau of Ocean Energy Management (BOEM). 5. Because our definition of hotspots and coldspots does not explicitly include variability over time, we examine the relationship between the temporal scale of sampling and the proportion of variance captured in time series of key environmental correlates of marine bird abundance, as well as available marine bird abundance time series, and use these analyses to develop recommendations for the temporal distribution of sampling to adequately represent both short-term and long-term variability. We conclude by presenting a schematic "decision tree" showing how this power analysis approach would fit in a general framework for avian survey design, and discuss implications of model assumptions and results. We discuss avenues for future development of this work, and recommendations for practical implementation in the context of siting and wildlife assessment for offshore renewable energy development projects.
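
Retrospective power for hotspot detection with overdispersed counts can be estimated by straightforward Monte Carlo, as sketched below with a negative binomial count model. The baseline mean, fold change, and dispersion are illustrative values, not fitted seabird parameters:

```python
import numpy as np

rng = np.random.default_rng(4)

def hotspot_power(base_mean, fold, n_surveys, k=1.5, n_sim=2000, alpha=0.05):
    """Monte Carlo power to flag a block whose true mean count is `fold`
    times the regional baseline, for negative binomial (overdispersed)
    counts with dispersion k; all parameter values here are illustrative."""
    def draw(mean, shape):
        p = k / (k + mean)                       # numpy's (n, p) form
        return rng.negative_binomial(k, p, shape)
    null_means = draw(base_mean, (n_sim, n_surveys)).mean(axis=1)
    crit = np.quantile(null_means, 1 - alpha)    # null rejection cutoff
    hot_means = draw(base_mean * fold, (n_sim, n_surveys)).mean(axis=1)
    return float(np.mean(hot_means > crit))

power_few = hotspot_power(2.0, 3.0, n_surveys=5)
power_many = hotspot_power(2.0, 3.0, n_surveys=25)
```

Mapping this power value per lease block, given each block's sampling effort and the species' fitted distribution, is what the retrospective power surfaces described above amount to.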

Background Data requirements by governments, donors and the international community to measure health and development achievements have increased in the last decade. Datasets produced in surveys conducted in several countries and years are often combined to analyse time trends and geographical patterns of demographic and health related indicators. However, since not all datasets have the same structure, variable definitions and codes, they have to be harmonised prior to submitting them to the statistical analyses. Manually searching, renaming and recoding variables are extremely tedious and error-prone tasks, especially when the number of datasets and variables is large. This article presents an automated approach to harmonise variable names across several datasets, which optimises the search of variables, minimises manual inputs and reduces the risk of error. Results Three consecutive algorithms are applied iteratively to search for each variable of interest for the analyses in all datasets. The first search (A) captures particular cases that could not be solved in an automated way in the search iterations; the second search (B) is run if search A produced no hits and identifies variables whose labels contain certain key terms defined by the user. If this search produces no hits, a third one (C) is run to retrieve variables which have been identified in other surveys. For each variable of interest, the outputs of these engines can be: (O1) a single best matching variable is found, (O2) more than one matching variable is found, or (O3) no matching variables are found. Output O2 is solved by user judgement. Examples using four variables are presented showing that the searches have a 100% sensitivity and specificity after a second iteration. Conclusion Efficient and tested automated algorithms should be used to support the harmonisation process needed to analyse multiple datasets. This is especially relevant when
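
The A/B/C cascade can be sketched directly; the variable names, key terms, and aliases below are invented for illustration, not taken from the article's survey metadata:

```python
import re

def harmonise(variables, exact_names, key_terms, known_aliases):
    """Three-stage search (A, B, C) for one variable of interest within a
    dataset's variable list; returns the stage that fired and its hits."""
    # Search A: hand-curated special cases automation cannot resolve.
    hits = [v for v in variables if v in exact_names]
    if hits:
        return "A", hits
    # Search B: names/labels containing user-defined key terms.
    pattern = re.compile("|".join(key_terms), re.IGNORECASE)
    hits = [v for v in variables if pattern.search(v)]
    if hits:
        return "B", hits
    # Search C: names previously identified in other surveys.
    return "C", [v for v in variables if v in known_aliases]

# Illustrative inputs, not the article's survey metadata.
variables = ["hv025_urban_rural", "v012", "child_age_months"]
stage, hits = harmonise(
    variables,
    exact_names={"special_case_var"},
    key_terms=["urban", "residence"],
    known_aliases={"v012"},
)
# Outcomes: O1 = one hit, O2 = several (user judgement), O3 = none.
outcome = {0: "O3", 1: "O1"}.get(len(hits), "O2")
```

Running this per variable of interest over every dataset, and routing O2 outputs to a human reviewer, reproduces the workflow the abstract describes.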

Medical image segmentation is the most essential and crucial process to facilitate the characterization and visualization of the structure of interest in medical images. A relevant application in neuroradiology is the segmentation of MRI data sets of the human brain into the structure classes gray matter, white matter, cerebrospinal fluid (CSF) and tumor. In this paper, brain image segmentation algorithms such as Fuzzy C-means (FCM) segmentation and Kohonen means (K-means) segmentation were implemented. In addition, a new hybrid segmentation technique, namely Fuzzy Kohonen means image segmentation based on statistical feature clustering, is proposed and implemented along with the standard pixel-value clustering method. The clustered segmented tissue images are compared with the ground truth and their performance metrics are computed. It is found that the feature-based hybrid segmentation gives improved performance metrics and improved classification accuracy compared with pixel-based segmentation.
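
The pixel-value clustering baseline that the hybrid method builds on can be sketched as plain k-means over intensity features; the fuzzy and Kohonen variants replace hard assignments with memberships or a trained map. The three intensity bands below are synthetic stand-ins for CSF, gray matter, and white matter:

```python
import numpy as np

rng = np.random.default_rng(5)

def kmeans(features, k, n_iter=50):
    """Plain k-means on per-pixel feature vectors, with quantile-based
    initialisation; the paper's fuzzy/Kohonen hybrids refine this idea."""
    centers = np.quantile(features, np.linspace(0, 1, k), axis=0)
    for _ in range(n_iter):
        d = np.linalg.norm(features[:, None, :] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = features[labels == j].mean(axis=0)
    return labels, centers

# Synthetic intensity "image": three tissue-like bands standing in for
# CSF-, gray-matter- and white-matter-like intensities.
intensities = np.concatenate([
    rng.normal(0.2, 0.02, 300),
    rng.normal(0.5, 0.02, 300),
    rng.normal(0.8, 0.02, 300),
])[:, None]
labels, centers = kmeans(intensities, k=3)
```

Feature-based variants of this scheme cluster local statistical features (e.g. neighbourhood mean and variance) instead of raw intensities, which is what gives the hybrid method its reported edge.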

The diversity of immunoglobulin (IG) and T cell receptor (TR) chains depends on several mechanisms: combinatorial diversity, which is a consequence of the number of V, D and J genes and the N-REGION diversity, which creates an extensive and clonal somatic diversity at the V-J and V-D-J junctions. For the IG, the diversity is further increased by somatic hypermutations. The number of different junctions per chain and per individual is estimated to be 10^12. We have chosen the human TRAV-TRAJ junctions as an example in order to characterize the required criteria for a standardized analysis of the IG and TR V-J and V-D-J junctions, based on the IMGT-ONTOLOGY concepts, and to serve as a first IMGT junction reference set (IMGT, http://imgt.cines.fr). We performed a thorough statistical analysis of 212 human rearranged TRAV-TRAJ sequences, which were aligned and analysed by the integrated IMGT/V-QUEST software, which includes IMGT/JunctionAnalysis, then manually expert-verified. Furthermore, we compared these 212 sequences with 37 other human TRAV-TRAJ junction sequences for which some particularities (potential sequence polymorphisms, sequencing errors, etc.) did not allow IMGT/JunctionAnalysis to provide the correct biological results, according to expert verification. Using statistical learning, we constructed an automatic warning system to predict if new, automatically analysed TRAV-TRAJ sequences should be manually re-checked. We estimated the robustness of this automatic warning system.

Sessile drop experiments on horizontal surfaces are commonly used to characterise surface properties in science and in industry. The advancing angle and the receding angle are measurable on every solid. Especially on horizontal surfaces, even the notions themselves are critically questioned by some authors. Building a standard, reproducible and valid method of measuring and defining specific (advancing/receding) contact angles is an important challenge of surface science. Recently we have developed three approaches, by sigmoid fitting and by independent and dependent statistical analyses, which are practicable for the determination of specific angles/slopes when inclining the sample surface. These approaches lead to contact angle data which are independent of the "user skills" and subjectivity of the operator, which is also urgently needed to evaluate dynamic measurements of contact angles. We show in this contribution that the slightly modified procedures are also applicable to find specific angles for experiments on horizontal surfaces. As an example, droplets on a flat, freshly cleaned silicon-oxide surface (wafer) are dynamically measured by the sessile drop technique while the volume of the liquid is increased/decreased. The triple points, the time, and the contact angles during the advancing and the receding of the drop obtained by high-precision drop shape analysis are statistically analysed. As stated in the previous contribution, the procedure is called "slow movement" analysis due to the small covered distance and the dominance of data points with low velocity. Even the smallest variations in velocity, such as the minimal advancing motion during the withdrawing of the liquid, are identifiable, which confirms the flatness and the chemical homogeneity of the sample surface and the high sensitivity of the presented approaches.

I present here a review of past and present multi-disciplinary research of the Pittsburgh Computational AstroStatistics (PiCA) group. This group is dedicated to developing fast and efficient statistical algorithms for analysing huge astronomical data sources. I begin with a short review of multi-resolutional kd-trees, which are the building blocks for many of our algorithms, for example quick range queries and fast n-point correlation functions. I will present new results from the use of Mixture Models (Connolly et al. 2000) in density estimation of multi-color data from the Sloan Digital Sky Survey (SDSS), specifically the selection of quasars and the automated identification of X-ray sources. I will also present a brief overview of the False Discovery Rate (FDR) procedure (Miller et al. 2001a) and show how it has been used in the detection of "Baryon Wiggles" in the local galaxy power spectrum and source identification in radio data. Finally, I will look forward to new research on an automated Bayes Netw...

Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in literature. This limits both comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10-50 waste fractions, organised according to a three-level (tiered) approach facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at "Level III", i.e. detailed, while the two others were sorted only at "Level I"). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3-4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for the two housing types (single-family and multi-family house areas), the individual percentage composition of food waste, paper, and glass was significantly different between the housing types. This indicates that housing type is a critical stratification parameter. Separating food leftovers from food packaging during manual sorting of the sampled waste did not have significant influence on the proportions of food waste

This article describes the utilisation of an unsupervised machine learning technique and statistical approaches (e.g., the Kolmogorov-Smirnov test) that assist cycling experts in the crucial decision-making processes for athlete selection, training, and strategic planning in the track cycling Omnium. The Omnium is a multi-event competition that will be included in the summer Olympic Games for the first time in 2012. Presently, selectors and cycling coaches make decisions based on experience and intuition. They rarely have access to objective data. We analysed both the old five-event (first raced internationally in 2007) and new six-event (first raced internationally in 2011) Omniums and found that the addition of the elimination race component to the Omnium has, contrary to expectations, not favoured track endurance riders. We analysed the Omnium data and also determined the inter-relationships between different individual events as well as between those events and the final standings of riders. In further analysis, we found that there is no maximum ranking (poorest performance) in each individual event that riders can afford whilst still winning a medal. We also found the required times for riders to finish the timed components that are necessary for medal winning. The results of this study consider the scoring system of the Omnium and inform decision-making toward successful participation in future major Omnium competitions.
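
A two-sample Kolmogorov-Smirnov test of the kind mentioned above can be sketched from scratch; the two samples below are invented scores, not the Omnium results:

```python
import numpy as np

def ks_two_sample(a, b):
    """Two-sample Kolmogorov-Smirnov statistic with asymptotic p-value."""
    a, b = np.sort(a), np.sort(b)
    grid = np.concatenate([a, b])
    # Empirical CDFs of both samples evaluated on the pooled grid.
    cdf_a = np.searchsorted(a, grid, side="right") / len(a)
    cdf_b = np.searchsorted(b, grid, side="right") / len(b)
    d = float(np.max(np.abs(cdf_a - cdf_b)))
    # Asymptotic Kolmogorov distribution (two-sided) via its series.
    en = np.sqrt(len(a) * len(b) / (len(a) + len(b)))
    lam = (en + 0.12 + 0.11 / en) * d
    if lam < 1e-8:                      # identical samples: no evidence
        return d, 1.0
    k = np.arange(1, 101)
    p = 2 * np.sum((-1.0) ** (k - 1) * np.exp(-2 * (k * lam) ** 2))
    return d, float(min(max(p, 0.0), 1.0))

rng = np.random.default_rng(6)
scores_medal = rng.normal(4.0, 2.0, 80)    # hypothetical medallist scores
scores_field = rng.normal(9.0, 4.0, 200)   # hypothetical full-field scores
d, p = ks_two_sample(scores_medal, scores_field)
```

Being fully nonparametric, the KS test compares whole score distributions rather than means, which suits event results whose distributions are skewed or bounded.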

Many global and regional forest cover products have recently become available. The most advanced and comprehensive of these include the global land cover datasets (GLC2000, MODIS, GLOBCOVER), MODIS Vegetation Continuous Fields (VCF), LANDSAT based (e.g. Sexton et al., 2013) and radar based (e.g. Saatchi et al., 2010; Baccini et al., 2012; Santoro et al., 2012) products. However, they often contradict each other and are typically inconsistent with forest statistics. In particular, global land cover datasets contradict each other in many areas, have limited information about forest density and are not consistent with forest statistics. VCF most likely provides the most comprehensive information about forest density with a spatial resolution of 230m during 2000-2010. However when observing VCF dynamics for individual pixels, one can see variation that cannot be explained by forest cover dynamics, but instead by unstable pixel geometry and clouds. Landsat based products also suffer from cloud cover and cannot recognize sparse forest with canopy closure of 30% or less. Space-based radar is free from cloud, but still cannot reliably delineate areas as forest/non forest (Santoro, 2012). We compare all of the above mentioned remote sensing products with a sample of high resolution imagery provided by Google Earth. We have applied the crowd sourcing platform Geo-Wiki (Fritz et al., 2010, 2012) to collect 22K training points where the percentage of forest cover was estimated for a 1km pixel size. We applied the method of geographically weighted regression to calculate the map of probability of forest cover and the map of forest share. This involved the use of the Geo-Wiki training points in combination with the land cover products, MODIS VCF and LANDSAT. The synergy of remote sensing, statistics and crowd sourcing approaches was investigated to better understand the spatial distribution of forests. Both calibrated (using FAO FRA statistics) and non-calibrated ('best guess

Contact angle determination by the sessile drop technique is essential to characterise surface properties in science and in industry. Different specific angles can be observed on every solid, which are correlated with the advancing or the receding of the triple line. Different procedures and definitions for the determination of specific angles exist, which are often not comprehensible or reproducible. Therefore one of the most important tasks in this area is to establish standard, reproducible and valid methods for determining advancing/receding contact angles. This contribution introduces novel techniques to analyse dynamic contact angle measurements (sessile drop) in detail which are applicable for axisymmetric and non-axisymmetric drops. Not only the recently presented fit solution by sigmoid function and the independent analysis of the different parameters (inclination, contact angle, velocity of the triple point) but also the dependent analysis is explained in detail for the first time. These approaches lead to contact angle data and different access to specific contact angles which are independent of the "user skills" and subjectivity of the operator. As an example, the motion behaviour of droplets on flat silicon-oxide surfaces after different surface treatments is dynamically measured by the sessile drop technique when inclining the sample plate. The triple points, the inclination angles, the downhill (advancing motion) and the uphill angles (receding motion) obtained by high-precision drop shape analysis are independently and dependently statistically analysed. Due to the small covered distance, the dependent analysis links the static to the "slow moving" dynamic contact angle determination; the results are characterised by small deviations of the computed values. In addition to the detailed introduction of these novel analytical approaches and the fit solution, special motion relations for the drop on inclined surfaces and detailed relations about the reactivity of the freshly cleaned silicon wafer

Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in literature. This limits both comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level (tiered) approach facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, i.e. detailed, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single

Background Current tools for co-phylogenetic analyses are not able to cope with the continuous accumulation of phylogenetic data. The sophisticated statistical test for host-parasite co-phylogenetic analyses implemented in Parafit does not allow it to handle large datasets in reasonable times. The Parafit and DistPCoA programs are by far the most compute-intensive components of the Parafit analysis pipeline. We present AxParafit and AxPcoords (Ax stands for Accelerated), which are highly optimized versions of Parafit and DistPCoA, respectively. Results Both programs have been entirely re-written in C. Via optimization of the algorithm and the C code, as well as integration of highly tuned BLAS and LAPACK methods, AxParafit runs 5–61 times faster than Parafit with a lower memory footprint (up to 35% reduction), while the performance benefit increases with growing dataset size. The MPI-based parallel implementation of AxParafit shows good scalability on up to 128 processors, even on medium-sized datasets. The parallel analysis with AxParafit on 128 CPUs for a medium-sized dataset with a 512 by 512 association matrix is more than 1,200/128 times faster per processor than the sequential Parafit run. AxPcoords is 8–26 times faster than DistPCoA and numerically stable on large datasets. We outline the substantial benefits of using parallel AxParafit by example of a large-scale empirical study on smut fungi and their host plants. To the best of our knowledge, this study represents the largest co-phylogenetic analysis to date. Conclusion The highly efficient AxPcoords and AxParafit programs allow for large-scale co-phylogenetic analyses on several thousands of taxa for the first time. In addition, AxParafit and AxPcoords have been integrated into the easy-to-use CopyCat tool.

An S-matrix formalism of the statistical theory of nuclear reactions has been developed by Weidenmuller et al., based upon the Engelbrecht-Weidenmuller transformation and extended to cases where direct reactions are present, as a means of deriving expressions for the fluctuation cross section going beyond the framework of conventional Hauser-Feshbach theory. This unified approach, from which a coherent sum of fluctuation and direct-interaction cross sections is combined to yield a net reaction cross section, provides a means of deriving a comprehensive and accurate theoretical description of the scattering process. Although a framework for the formal theory has been constructed, it had not previously been applied to the quantitative analyses of scattering data. As described in this thesis, a computer program "NANCY" has been compiled by modifying Tamura's coupled-channels code "JUPITOR-1" (through modifications suggested by Moldauer) and incorporating Smith's optical model routine "SCAT", as a means of generating the entire symmetric S-matrix. Using this program, computations were undertaken to determine numerically the energy-averaged cross sections for inelastic neutron scattering on ²³²Th and ²³⁸U from threshold to several MeV. With appropriate variation of coupling strengths between the ground state rotational band and vibrational levels, good fits to the experimental data were attained, which compared favorably with theoretical results generated from conventional approaches.

It has been well established that the architecture of chromatin in cell nuclei is not random but functionally correlated. Chromatin damage caused by ionizing radiation raises complex repair machineries. This is accompanied by local chromatin rearrangements and structural changes which may, for instance, improve the accessibility of damaged sites for repair protein complexes. Using stably transfected HeLa cells expressing either green fluorescent protein (GFP) labelled histone H2B or yellow fluorescent protein (YFP) labelled histone H2A, we investigated the positioning of individual histone proteins in cell nuclei by means of high resolution localization microscopy (Spectral Position Determination Microscopy = SPDM). The cells were exposed to ionizing radiation of different doses and aliquots were fixed after different repair times for SPDM imaging. In addition to the repair dependent histone protein pattern, the positioning of antibodies specific for heterochromatin and euchromatin was separately recorded by SPDM. The present paper aims to provide a quantitative description of structural changes of chromatin after irradiation and during repair. It introduces a novel approach to analyse SPDM images by means of statistical physics and graph theory. The method is based on the calculation of the radial distribution functions as well as edge length distributions for graphs defined by a triangulation of the marker positions. The obtained results show that, throughout the cell nucleus, the different chromatin re-arrangements detected by the fluorescent nucleosomal pattern average out. In contrast, heterochromatic regions alone indicate a relaxation after radiation exposure and re-condensation during repair, whereas euchromatin seemed to be unaffected or to behave contrarily. SPDM in combination with the analysis techniques applied allows the systematic elucidation of chromatin re-arrangements after irradiation and during repair, if selected sub-regions of nuclei are
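
The radial distribution function at the centre of this analysis can be sketched for 2-D marker positions: a homogeneous point pattern gives g(r) ≈ 1, while clustering (condensation) pushes small-r values above 1. Box size and point counts below are illustrative, and edge effects are left uncorrected:

```python
import numpy as np

rng = np.random.default_rng(7)

def radial_distribution(points, r_max, n_bins, box):
    """Radial distribution function g(r) for 2-D positions in a square box;
    a homogeneous (Poisson) pattern gives g(r) ~ 1 (edge effects ignored)."""
    n = len(points)
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=2))[np.triu_indices(n, k=1)]
    counts, edges = np.histogram(dists, bins=n_bins, range=(0.0, r_max))
    shell_area = np.pi * (edges[1:] ** 2 - edges[:-1] ** 2)
    expected = shell_area * (n / box ** 2) * n / 2   # pairs per shell
    return counts / expected, edges

# Uniform positions mimic fully relaxed, homogeneous marker placement.
pts = rng.uniform(0.0, 10.0, size=(1500, 2))
g, edges = radial_distribution(pts, r_max=1.0, n_bins=10, box=10.0)
```

Comparing g(r) curves from localisation data before and after irradiation is how relaxation (flattening toward 1) and re-condensation (a growing short-range peak) would show up in this framework.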

Due to a boom in the dairy industry in Northeast China, the hay industry has been developing rapidly, so it is very important to evaluate hay quality with a rapid and accurate method. In this research, a novel technique combining near-infrared spectroscopy (NIRs) with three different statistical analyses (MLR, PCR and PLS) was used to predict the chemical quality of sheepgrass (Leymus chinensis) in Heilongjiang Province, China, including the concentrations of crude protein (CP), acid detergent fiber (ADF), and neutral detergent fiber (NDF). Firstly, linear partial least squares regression (PLS) was performed on the spectra and the predictions were compared to those from laboratory-based recorded spectra. The MLR evaluation method for CP also has the potential to meet industry requirements, as it needs less sophisticated and cheaper instrumentation using only a few wavelengths. Results show that, in terms of CP, ADF and NDF: (i) the prediction accuracy using PLS was clearly improved compared to the PCR algorithm, and comparable to or even better than results generated using the MLR algorithm; (ii) predictions with the MLR algorithm were worse than those from laboratory-based spectra, and poor predictions were obtained (R² = 0.62, RPD = 0.9) using MLR for NDF; (iii) satisfactory accuracy was obtained with the PLS method, with R² and RPD of 0.91 and 3.2 for CP, 0.89 and 3.1 for ADF, and 0.88 and 3.0 for NDF, respectively. Our results highlight that the combined NIRs-PLS method could be applied as a valuable technique to rapidly and accurately evaluate the quality of sheepgrass hay.
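
The MLR route described in this record (predicting CP from absorbances at a few wavelengths) can be sketched with ordinary least squares. The absorbance values and CP figures below are synthetic, noiseless illustrations, not data from the study; a minimal sketch in pure Python:

```python
def solve_normal_equations(X, y):
    """Least-squares fit of y = X b by Gaussian elimination on X'X b = X'y."""
    n, p = len(X), len(X[0])
    XtX = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(p)] for i in range(p)]
    Xty = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
    A = [row[:] + [rhs] for row, rhs in zip(XtX, Xty)]  # augmented matrix
    for col in range(p):                                # elimination with partial pivoting
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p + 1):
                A[r][c] -= f * A[col][c]
    b = [0.0] * p
    for i in range(p - 1, -1, -1):                      # back substitution
        b[i] = (A[i][p] - sum(A[i][j] * b[j] for j in range(i + 1, p))) / A[i][i]
    return b

# Synthetic calibration set: intercept plus absorbances at three assumed wavelengths.
samples = [
    (1.0, 0.42, 0.31, 0.18), (1.0, 0.55, 0.29, 0.35), (1.0, 0.38, 0.40, 0.12),
    (1.0, 0.61, 0.35, 0.28), (1.0, 0.47, 0.33, 0.41), (1.0, 0.52, 0.28, 0.09),
]
# CP (% dry matter) generated from an exact linear model; real data would add noise.
cp = [2 + 15 * s[1] + 8 * s[2] + 5 * s[3] for s in samples]

coefs = solve_normal_equations([list(s) for s in samples], cp)
pred = [sum(c * v for c, v in zip(coefs, s)) for s in samples]
mean_cp = sum(cp) / len(cp)
ss_res = sum((p - t) ** 2 for p, t in zip(pred, cp))
ss_tot = sum((t - mean_cp) ** 2 for t in cp)
r2 = 1 - ss_res / ss_tot  # calibration R²; near 1 here because the data are noiseless
```

On noiseless synthetic data the fit recovers the generating coefficients exactly, which is a useful sanity check before applying the same pipeline to measured spectra.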

Ideally, meta-analyses (MAs) should consolidate the characteristics of orthodontic research in order to produce an evidence-based answer. However, severe flaws are frequently observed in most of them. The aim of this study was to evaluate the statistical methods, the methodology, and the quality characteristics of orthodontic MAs and to assess their reporting quality in recent years. Electronic databases were searched for MAs (with or without a proper systematic review) in the field of orthodontics, indexed up to 2011. The AMSTAR tool was used for quality assessment of the included articles. Data were analyzed with Student's t-test, one-way ANOVA, and generalized linear modelling. Risk ratios with 95% confidence intervals were calculated to represent changes over the years in the reporting of key items associated with quality. A total of 80 MAs with 1086 primary studies were included in this evaluation. Using the AMSTAR tool, 25 (31.3%) of the MAs were found to be of low quality, 37 (46.3%) of medium quality, and 18 (22.5%) of high quality. Specific characteristics like explicit protocol definition, extensive searches, and quality assessment of included trials were associated with a higher AMSTAR score. Model selection and dealing with heterogeneity or publication bias were often problematic in the identified reviews. The number of published orthodontic MAs is constantly increasing, while their overall quality ranges from low to medium. Although the number of MAs of medium and high quality has lately seemed to rise, several other aspects need improvement to increase their overall quality.

number of zeros. QQ plots of these data characteristics show a lack of normality in the data after contamination. Normality is improved when looking at log(CFU/cm²). Variance component analysis (VCA) and analysis of variance (ANOVA) were used to estimate the amount of variance due to each source and to determine which sources of variability were statistically significant. In general, the sampling methods interacted with the across-event variability and with the across-room variability. For this reason, it was decided to run analyses for each sampling method individually. The between-event variability and between-room variability were significant for each method, except the between-event variability for the swabs. For both the wipes and vacuums, the within-room standard deviation was much larger (26.9 for wipes and 7.086 for vacuums) than the between-event standard deviation (6.552 for wipes and 1.348 for vacuums) and the between-room standard deviation (6.783 for wipes and 1.040 for vacuums). The swabs' between-room standard deviation was 0.151, while both the within-room and between-event standard deviations were less than 0.10 (all measurements in CFU/cm²).
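
The variance-component decomposition used here can be sketched with the one-way ANOVA method of moments. The room data below are made-up numbers chosen so that between-room spread dominates within-room spread; this is an illustrative sketch, not the study's data:

```python
import statistics

def variance_components(groups):
    """One-way random-effects variance components by the ANOVA method of moments:
    within-group variance = MS_within; between-group variance =
    (MS_between - MS_within) / n0, with n0 the effective group size."""
    k = len(groups)
    sizes = [len(g) for g in groups]
    N = sum(sizes)
    grand = sum(sum(g) for g in groups) / N
    ss_within = sum(sum((x - statistics.fmean(g)) ** 2 for x in g) for g in groups)
    ss_between = sum(len(g) * (statistics.fmean(g) - grand) ** 2 for g in groups)
    ms_within = ss_within / (N - k)
    ms_between = ss_between / (k - 1)
    n0 = (N - sum(n ** 2 for n in sizes) / N) / (k - 1)  # handles unbalanced groups
    sigma2_between = max(0.0, (ms_between - ms_within) / n0)
    return ms_within, sigma2_between

# Hypothetical measurements from three rooms (between-room spread dominates):
rooms = [[10.2, 11.1, 9.4, 10.0],
         [20.3, 21.0, 19.2, 20.1],
         [30.5, 29.1, 31.2, 30.3]]
ms_within, sigma2_between = variance_components(rooms)
```

With these numbers the between-room component is roughly two orders of magnitude larger than the within-room component, the same qualitative pattern the abstract reports for wipes and vacuums (in reverse).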

Many common modalities of medical images acquire high-resolution and multispectral images, which are subsequently processed, visualized, and transmitted by subsampling. These subsampled images compromise resolution for processing ability, thus risking loss of significant diagnostic information. A hybrid multiresolution vector quantizer (HMVQ) has been developed exploiting the statistical characteristics of the features in a multiresolution wavelet-transformed domain. The global codebook generated by HMVQ, using a combination of multiresolution vector quantization and residual scalar encoding, retains edge information better and avoids significant blurring observed in reconstructed medical images by other well-known encoding schemes at low bit rates. Two specific image modalities, namely, X-ray radiographic and magnetic resonance imaging (MRI), have been considered as examples. The ability of HMVQ in reconstructing high-fidelity images at low bit rates makes it particularly desirable for medical image encoding and fast transmission of 3D medical images generated from multiview stereo pairs for visual communications.

We present the hybrid opacity code SCO-RCG which combines statistical approaches with fine-structure calculations. Radial integrals needed for the computation of detailed transition arrays are calculated by the code SCO (Super-configuration Code for Opacity), which calculates atomic structure at finite temperature and density, taking into account plasma effects on the wave-functions. Levels and spectral lines are then computed by an adapted RCG routine of R. D. Cowan. SCO-RCG now includes the Partially Resolved Transition Array model, which allows one to replace a complex transition array by a small-scale detailed calculation preserving energy and variance of the genuine transition array and yielding improved high-order moments. An approximate method for studying the impact of strong magnetic field on opacity and emissivity was also recently implemented.

As laws tighten to limit commercial ivory trading and protect threatened species like whales and elephants, sales of fake ivory products have become widespread. This study describes a method, handheld X-ray fluorescence (XRF), a noninvasive technique for elemental analysis, to differentiate quickly between ivory (Asian and African elephant, mammoth) and non-ivory (bones, teeth, antler, horn, wood, synthetic resin, rock) materials. An equation consisting of 20 elements and light elements from a stepwise discriminant analysis was used to classify samples, followed by Bayesian binary regression to determine the probability of a sample being 'ivory', with complementary log-log analysis to identify the best-fit model for this purpose. This Bayesian hybrid classification model was 93% accurate with 92% precision in discriminating ivory from non-ivory materials. The method was then validated by scanning additional ivory and non-ivory samples, correctly identifying bone as non-ivory with >95% accuracy, except elephant bone, which was 72%. It was less accurate for wood and rock (25-85%); however, preliminary screening to determine whether samples are not Ca-dominant could eliminate inorganic materials. In conclusion, elemental analysis by XRF can be used to identify several forms of fake ivory, which could have forensic application.

This introductory textbook for business statistics teaches statistical analysis and research methods via business case studies and financial data using Excel, MINITAB, and SAS. Every chapter in this textbook engages the reader with data of individual stock, stock indices, options, and futures. One studies and uses statistics to learn how to study, analyze, and understand a data set of particular interest. Some of the more popular statistical programs that have been developed to use statistical and computational methods to analyze data sets are SAS, SPSS, and MINITAB. Of those, we look at MINITAB and SAS in this textbook. One of the main reasons to use MINITAB is that it is the easiest to use among the popular statistical programs. We look at SAS because it is the leading statistical package used in industry. We also utilize the much less costly and ubiquitous Microsoft Excel to do statistical analysis, as the benefits of Excel have become widely recognized in the academic world and its analytical capabilities...

This report is an assessment of the current model and presentation form of bioenergy statistics. It proposes revisions and enhancements to both data collection and data presentation. In the context of market developments, both for energy in general and for bioenergy in particular, and of government targets, good bioenergy statistics form the basis for following up on objectives and policy measures.

Methods of lossless compression of medical image data are considered. A selected class of efficient algorithms has been constructed, examined, and optimized to identify the most useful tools for medical image archiving and transmission. Image data scanning, 2D context-based prediction and interpolation, and the statistical models of the entropy coder have been optimized to compress ultrasound (US), magnetic resonance (MR), and computed tomography (CT) images effectively. The SSM technique, a suitable data-decomposition scanning method followed by probabilistic context modeling in arithmetic encoding, proved the most useful in our experiments. Context order, shape, and alphabet were fitted to local data characteristics to decrease image data correlation and avoid dilution of the statistical model. The average bit rate over the test images is 2.53 bpp for the SSM coder, significantly improving on the 2.92 bpp of CALIC. Moreover, optimization of a lossless wavelet coder through efficient subband decomposition schemes and integer-to-integer transforms is reported. An efficient hybrid coding method (SHEC) is proposed as a complete tool for medical image archiving and transmission. SHEC extends SSM by including a CALIC-like coder to compress the highest-quality images and a JPEG2000 wavelet coder for progressive delivery of high- and middle-quality images in telemedicine systems.

In this study, a Hybrid Statistical Narrow Band (HSNB) model is implemented to make fast and accurate predictions of radiative transfer effects on hypersonic entry flows. The HSNB model combines a Statistical Narrow Band (SNB) model for optically thick molecular systems, a box model for optically thin molecular systems and continua, and a Line-By-Line (LBL) description of atomic radiation. Radiative transfer calculations are coupled to a 1D stagnation-line flow model under thermal and chemical nonequilibrium. Earth entry conditions corresponding to the FIRE 2 experiment, as well as Titan entry conditions corresponding to the Huygens probe, are considered in this work. Thermal nonequilibrium is described by a two-temperature model, although non-Boltzmann distributions of electronic levels provided by a Quasi-Steady State model are also considered for radiative transfer. For all the studied configurations, radiative transfer effects on the flow, the plasma chemistry and the total heat flux at the wall are analyzed in detail. The HSNB model is shown to reproduce LBL results with an accuracy better than 5% and a computational speed-up of around two orders of magnitude. Concerning molecular radiation, the HSNB model provides a significant improvement in accuracy over the Smeared-Rotational-Band model, especially for Titan entries dominated by optically thick CN radiation.

The hybrid between olive flounder Paralichthys olivaceus and stone flounder Kareius bicoloratus was produced by artificial insemination of olive flounder eggs with stone flounder sperm. After metamorphosis, the hybrid progeny were of two types, sinistral and dextral. Karyotypes of both hybrid flounders are the same as those of the two parental species. Of the 22 loci examined from 12 allozymes, 12 confirmed hybridization of the paternal and maternal loci in the hybrids, and no difference was found between the allozyme patterns of sinistral and dextral hybrid fishes. RAPD patterns of these specimens were also studied with 38 primers selected from 104 tested. Among them, the PCR products of 30 primers showed hybridization of the paternal and maternal bands. Genetic variation between the hybrids and their parental stocks was analyzed by RAPD using 10 of the above 38 primers, and the average heterozygosity and genetic distance were calculated. The results suggested that the filial generation inherits slightly more genetic material from the paternal fish than from the maternal fish.

In this paper, we extend the analysis of hybrid fossil-geothermal power plants to compound systems which combine the features of the two previously analyzed hybrid plants, the geothermal preheat and the fossil superheat systems. Compound systems of the one- and two-stage type are considered. A complete summary of formulae to assess the performance of the plants is included for completeness. From the viewpoint of thermodynamics, compound hybrid plants are superior to individual all-geothermal and all-fossil plants, and have certain advantages over basic geothermal-preheat and fossil-superheat hybrid plants. The flexibility of compound hybrid systems is illustrated by showing how such plants might be used at several geothermal sites in the western United States.

Part II presents a statistical model devised by the authors for evaluating the results of toxicological analyses. The model includes: 1. Establishment of a reference value, based on our own measurements taken by two independent analytical methods. 2. Selection of laboratories, based on the deviation of the obtained values from the reference ones. 3. Evaluation of subsequent quality controls and of the particular laboratories, using analysis of variance, Student's t-test, and a test of differences.

The Department of Energy's (DOE) Office of FreedomCAR (Cooperative Automotive Research) and Vehicle Technologies office has a strong interest in making rapid progress in permanent magnet (PM) machine development. The DOE FreedomCAR program is directing various technology development projects that will advance the technology and hopefully lead to a near-term request for proposals (RFP) for a to-be-determined level of initial production. This aggressive approach is possible because the technology is clearly within reach and the approach is deemed essential, based on strong market demand, escalating fuel prices, and competitive considerations. In response, this study began parallel development paths that included a literature search/review, development and utilization of multiple parametric models, verification of the modeling methodology, development of an interior PM (IPM) machine baseline design, development of alternative machine baseline designs, and cost analyses for several candidate machines. This report summarizes the results of these activities as of September 2004. This report provides background and summary information for recent machine parametric studies and testing programs that demonstrate both the potential capabilities and technical limitations of brushless PM machines (axial gap and radial gap), the IPM machine, the surface-mount PM machines (interior or exterior rotor), induction machines, and switched-reluctance machines. The FreedomCAR program, while acknowledging the progress made by Oak Ridge National Laboratory (ORNL), Delphi, Delco-Remy International, and others in these programs, has redirected efforts toward a "short path" to a marketable and competitive PM motor for hybrid electric vehicle (HEV) traction applications. The program has developed a set of performance targets for the type of traction machine desired. The short-path approach entails a comprehensive design effort focusing on the IPM machine and meeting

The Department of Energy's (DOE) Office of FreedomCAR (Cooperative Automotive Research) and Vehicle Technologies has a strong interest in making rapid progress in permanent magnet (PM) machine development. The program is directing various technology development projects that will advance the technology and lead to a request for proposals (RFP) for manufacturer prototypes. This aggressive approach is possible because the technology is clearly within reach and the approach is deemed essential, based on strong market demand, escalating fuel prices, and competitive considerations. In response, this study began parallel development paths that included a literature search/review, development and utilization of multiple parametric models to determine the effects of design parameters, verification of the modeling methodology, development of an interior PM (IPM) machine baseline design, development of alternative machine baseline designs, and cost analyses for several candidate machines. This interim progress report summarizes the results of these activities as of June 2004. This report provides background and summary information for recent machine parametric studies and testing programs that demonstrate both the potential capabilities and technical limitations of brushless PM machines (axial gap and radial gap), the IPM machine, the surface-mount PM machines (interior or exterior rotor), induction machines, and switched reluctance machines. The FreedomCAR program, while acknowledging the progress made by Oak Ridge National Laboratory, Delphi, Delco-Remy International, and others in these programs, has redirected efforts toward a "short path" to a marketable and competitive PM motor for hybrid electric vehicle traction applications. The program has developed a set of performance targets for the type of traction machine desired. The short-path approach entails a comprehensive design effort focusing on the IPM machine and meeting the performance

This Ph.D. thesis deals with mathematical and statistical modeling of synaptic vesicle distribution, shape, orientation and interactions. The first major part of this thesis treats the problem of determining the effect of stress on synaptic vesicle distribution and interactions. Serial section...... on differences of statistical measures in section and the same measures in between sections. Three-dimensional (3D) datasets are reconstructed by using image registration techniques and estimated thicknesses. We distinguish the effect of stress by estimating the synaptic vesicle densities and modeling......, which leads to more accurate results. Finally, we present a thorough statistical investigation of the shape, orientation and interactions of the synaptic vesicles during active time of the synapse. Focused ion beam-scanning electron microscopy images of a male mammalian brain are used for this study...

Heterogeneity in diagnostic meta-analyses is common because of the observational nature of diagnostic studies and the lack of standardization in the positivity criterion (cut-off value) for some tests. So far the unexplained heterogeneity across studies has been quantified either by using the I² statistic for a single parameter (i.e. either the sensitivity or the specificity) or by visually examining the data in a receiver-operating characteristic space. In this paper, we derive improved I² statistics measuring heterogeneity for dichotomous outcomes, with a focus on diagnostic tests. We show that the currently used estimate of the 'typical' within-study variance proposed by Higgins and Thompson is not able to properly account for the variability of the within-study variance across studies for dichotomous variables. Therefore, when the between-study variance is large, the 'typical' within-study variance underestimates the expected within-study variance, and the corresponding I² is overestimated. We propose to use the expected value of the within-study variation in the construction of I² in cases of univariate and bivariate diagnostic meta-analyses. For bivariate diagnostic meta-analyses, we derive a bivariate version of I² that is able to account for the correlation between sensitivity and specificity. We illustrate the performance of these new estimators using simulated data as well as two real data sets.
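
The classical Higgins-Thompson I² that this paper sets out to improve can be computed from Cochran's Q. The study effects and variances below are hypothetical, and the sketch shows only the base statistic, not the proposed expected-within-study-variance refinement:

```python
def i_squared(effects, variances):
    """Cochran's Q and the classical Higgins-Thompson I2 = max(0, (Q - df) / Q).
    (The paper's refinement replaces the 'typical' within-study variance by its
    expectation for dichotomous outcomes; this sketch shows only the base form.)"""
    w = [1.0 / v for v in variances]                      # inverse-variance weights
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0
    return q, i2

# Hypothetical logit-sensitivities from three diagnostic studies:
q_hom, i2_hom = i_squared([0.5, 0.5, 0.5], [0.04, 0.04, 0.04])  # identical studies
q_het, i2_het = i_squared([0.1, 0.9, 1.7], [0.04, 0.04, 0.04])  # widely scattered
```

Identical study effects give I² = 0, while the scattered set gives I² close to 1, the situation in which the paper argues the classical estimator overstates heterogeneity.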

Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked.
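
The first two steps of the sequence above (linear relationships via correlation coefficients, monotonic relationships via rank correlations) can be illustrated in a few lines. The data are a synthetic monotone but nonlinear input-output relation; tie handling in the ranking is omitted for brevity:

```python
import math

def pearson(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def ranks(v):
    """Rank positions 1..n (no tie correction in this sketch)."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    return pearson(ranks(x), ranks(y))

# A strictly monotone but nonlinear relation between a model input and output:
x = [0.5 * i for i in range(1, 13)]
y = [math.exp(v) for v in x]
pearson_r = pearson(x, y)
spearman_r = spearman(x, y)
```

Here the rank correlation is exactly 1 while the linear correlation is noticeably lower, which is precisely why the procedure escalates from (i) to (ii): a variable with a strong monotonic effect can look weak under a purely linear test.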

Sampling error refers to variability that is unique to the sample. If the sample is the entire population, then there is no sampling error. A related point is that sampling error is a function of sample size, as a hypothetical example illustrates. As the sample statistics more and more closely approximate the population parameters, the sampling…
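
The two claims above (no sampling error when the sample is the entire population, and error shrinking with sample size) can be demonstrated with a small simulation; the population parameters and sample sizes are arbitrary choices:

```python
import random
import statistics

random.seed(7)
population = [random.gauss(50, 10) for _ in range(100_000)]
mu = statistics.fmean(population)  # the population parameter

def sample_mean(n):
    """Mean of a simple random sample of size n."""
    return statistics.fmean(random.sample(population, n))

# Absolute error of the sample mean for small vs. large samples, 200 replicates each:
errors_small = [abs(sample_mean(25) - mu) for _ in range(200)]
errors_large = [abs(sample_mean(400) - mu) for _ in range(200)]

# A census (sampling the whole population) reproduces the parameter exactly:
census_error = abs(statistics.fmean(random.sample(population, len(population))) - mu)
```

The average error at n = 400 is roughly a quarter of that at n = 25, matching the 1/sqrt(n) behaviour of the standard error, and the census error is zero.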

Gossypium anomalum represents an inestimable source of genes that could potentially be transferred into the gene pool of cultivated cotton. To resolve interspecific hybrid sterility problems, we previously treated triploid hybrids derived from a cross between Gossypium hirsutum and G. anomalum with 0.15% colchicine and obtained a putative fertile hexaploid. In this study, we performed morphological, molecular and cytological analyses to assess the hybridity and doubled status of the putative interspecific hybrid plants. Most of the morphological characteristics of the putative hexaploid plants were intermediate between G. hirsutum and G. anomalum. Analysis of mitotic metaphase plates revealed 78 chromosomes, confirming the doubled hybrid status of the hexaploid. Genome-wide molecular analysis with SSR markers derived from different genomes revealed a high level of polymorphism (96.6%) between G. hirsutum and G. anomalum. The marker transferability rate from other species to G. anomalum was as high as 98.0%. The high percentage of polymorphic markers with additive banding profiles in the hexaploid indicates the hybridity of the hexaploid on a genome-wide level. A-genome-derived markers were more powerful than D-genome-derived markers for distinguishing the genomic differences between G. hirsutum and G. anomalum. This study demonstrates the hybridity and chromosomally doubled status of the (G. anomalum × G. hirsutum)2 hexaploid using morphological, cytological and molecular marker methods. The informative SSR markers screened in this study will be a useful resource for tracking the flow of G. anomalum genetic material among progenies that may be produced by future backcrosses to G. hirsutum.

Using the standard linear model as a base, a unified theory of Bayesian Analyses of Cointegration Models is constructed. This is achieved by defining (natural conjugate) priors in the linear model and using the implied priors for the cointegration model. Using these priors, posterior res

Investigates two aspects of research analyses in quantitative research studies reported in the 1996 issues of "Journal of Counseling & Development" (JCD). Acceptable methodological practice regarding significance testing and evaluation of score reliability has evolved considerably. Contemporary thinking on these issues is described; practice as…

We comment on Sijtsma's (2014) thought-provoking essay on how to minimize questionable research practices (QRPs) in psychology. We agree with Sijtsma that proactive measures to decrease the risk of QRPs will ultimately be more productive than efforts to target individual researchers and their work. In particular, we concur that encouraging researchers to make their data and research materials public is the best institutional antidote against QRPs, although we are concerned that Sijtsma's proposal to delegate more responsibility to statistical and methodological consultants could inadvertently reinforce the dichotomy between the substantive and statistical aspects of research. We also discuss sources of false-positive findings and replication failures in psychological research, and outline potential remedies for these problems. We conclude that replicability is the best metric of the minimization of QRPs and their adverse effects on psychological research.

RNA-seq is becoming the de facto standard approach for transcriptome analysis, with ever-decreasing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time-series RNA-seq datasets have been collected to study the dynamic regulation of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model time-course RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis.
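
As a sketch of the AR(1) (lag-1 autoregressive) approach mentioned above, the snippet below fits x_t = a + b * x_{t-1} by ordinary least squares to a synthetic expression trajectory; the coefficients and series are illustrative inventions, not taken from the discussed datasets:

```python
def fit_ar1(series):
    """Ordinary least squares fit of the lag-1 autoregression x_t = a + b * x_{t-1}."""
    x_prev, x_curr = series[:-1], series[1:]
    n = len(x_prev)
    mx, my = sum(x_prev) / n, sum(x_curr) / n
    b = (sum((p - mx) * (c - my) for p, c in zip(x_prev, x_curr))
         / sum((p - mx) ** 2 for p in x_prev))
    a = my - b * mx
    return a, b

# Synthetic expression trajectory relaxing toward a steady state:
# x_t = 2 + 0.6 * x_{t-1} exactly, so the fit should recover a = 2, b = 0.6.
series = [10.0]
for _ in range(20):
    series.append(2 + 0.6 * series[-1])
a, b = fit_ar1(series)
```

In a real analysis the series would be noisy normalized counts per gene, and the autoregressive coefficient b captures how strongly expression at one time point depends on the previous one.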

The foundation of post-modern portfolio theory is creating a portfolio based on a desired target return. This specifically applies to the performance of investment and pension funds that must provide a rate of return meeting payment requirements. A desired target return is the goal of an investment or pension fund and the primary benchmark used to measure performance and for dynamic monitoring and evaluation of the risk-return ratio of investment funds. The analysis in this paper is based on monthly returns of Macedonian investment and pension funds (June 2011 - June 2014). It utilizes basic but highly informative statistical characteristic moments such as skewness, kurtosis, and the Jarque-Bera statistic, together with Chebyshev's inequality. The objective of this study is to perform a thorough analysis, utilizing the above-mentioned and other statistical techniques (Sharpe, Sortino, omega, upside potential, Calmar, Sterling), to draw relevant conclusions regarding the risks and characteristic moments of Macedonian investment and pension funds. Pension funds are the second-largest segment of the financial system and have great potential for further growth due to constant inflows from pension insurance. The importance of investment funds for the financial system in the Republic of Macedonia is still small, although open-end investment funds have been the fastest-growing segment of the financial system. Statistical analysis has shown that, in the analyzed period, pension funds delivered a significantly positive volatility-adjusted risk premium, more so than investment funds.
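
The characteristic moments named above can be computed directly. The sketch below returns skewness, excess kurtosis and the Jarque-Bera statistic JB = (n/6)(S² + K²/4); the return series is a synthetic symmetric example, not fund data:

```python
def jarque_bera(returns):
    """Skewness S, excess kurtosis K, and the Jarque-Bera normality statistic
    JB = (n / 6) * (S^2 + K^2 / 4) for a series of returns."""
    n = len(returns)
    m = sum(returns) / n
    m2 = sum((r - m) ** 2 for r in returns) / n   # second central moment
    m3 = sum((r - m) ** 3 for r in returns) / n   # third central moment
    m4 = sum((r - m) ** 4 for r in returns) / n   # fourth central moment
    s = m3 / m2 ** 1.5
    k = m4 / m2 ** 2 - 3.0                        # excess over the normal value 3
    jb = n / 6.0 * (s ** 2 + k ** 2 / 4.0)
    return s, k, jb

# A perfectly symmetric (hence zero-skew) synthetic return series, in percent:
s, k, jb = jarque_bera([-2.0, -1.0, 0.0, 1.0, 2.0])
```

Under normality JB is asymptotically chi-squared with 2 degrees of freedom, so small values (as here) are consistent with normal returns, while heavy-tailed fund returns inflate K and hence JB.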

Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
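
The test about a binomial proportion described above, using the normal approximation to the binomial, can be sketched as follows; the observed counts are an invented example:

```python
import math

def binomial_test_normal(successes, n, p0):
    """Two-sided test of H0: p = p0 via the normal approximation to the binomial:
    z = (x - n p0) / sqrt(n p0 (1 - p0)), valid when n p0 and n (1 - p0) are large."""
    mean = n * p0
    sd = math.sqrt(n * p0 * (1 - p0))
    z = (successes - mean) / sd
    # Two-sided p-value from the standard normal CDF, written with math.erf:
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Invented example: 60 successes in 100 trials against a theoretical proportion of 0.5.
z, p_value = binomial_test_normal(60, 100, 0.5)
```

Here z = 2.0 and the p-value is about 0.046, so at the conventional 5% level the theoretical proportion of 0.5 would be rejected.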

High-resolution laser altimetry can depict the topography of large landslides with unprecedented accuracy and allow better management of the hazards posed by such slides. The surface of most landslides is rougher, on a local scale of a few meters, than adjacent unfailed slopes. This characteristic can be exploited to automatically detect and map landslides in landscapes represented by high-resolution DTMs. We have used laser altimetry measurements of local topographic roughness to identify and map the perimeter and internal features of a large earthflow in the South Island, New Zealand. Surface roughness was first quantified by statistically characterizing the local variability of ground surface orientations using both circular and spherical statistics. These measures included the circular resultant, standard deviation and dispersion, and the three-dimensional spherical resultant and ratios of the normalized eigenvalues of the direction cosines. The circular measures evaluate the amount of change in topographic aspect from pixel to pixel in the gridded data matrix. The spherical statistics assess both the aspect and the steepness of each pixel. The standard deviation of the third direction cosine was also used alone to characterize the variability in just the steepness of each pixel. All of the statistical measures detect and clearly map the earthflow. Circular statistics also emphasize small folds transverse to the movement in the most active zone of the slide. The spherical measures are more sensitive to the larger-scale roughness in a portion of the slide that includes large intact limestone blocks. Power spectra of surface roughness were also calculated from two-dimensional Fourier transformations in local test areas. A small earthflow had a broad spectral peak at wavelengths between 10 and 30 meters. Shallower soil failures and surface erosion produced surfaces with a very sharp spectral peak at 12 meters wavelength. Unfailed slopes had an order of magnitude
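
The circular resultant used above to quantify roughness can be sketched as follows. Aspect angles that agree give a mean resultant length R near 1 (smooth, ordered slope), while scattered aspects give R near 0 (rough, landslide-like terrain); the angle lists are invented illustrations, not DTM data:

```python
import math

def resultant_length(angles_deg):
    """Mean resultant length R of a set of aspect angles (circular statistics).
    R near 1: orientations agree (smooth slope); R near 0: disordered (rough)."""
    n = len(angles_deg)
    c = sum(math.cos(math.radians(a)) for a in angles_deg) / n
    s = sum(math.sin(math.radians(a)) for a in angles_deg) / n
    return math.hypot(c, s)

# Invented aspect samples: a consistent hillslope vs. landslide-like scatter.
smooth_aspects = [118, 121, 119, 122, 120, 117]
rough_aspects = [10, 95, 200, 310, 170, 260]
r_smooth = resultant_length(smooth_aspects)
r_rough = resultant_length(rough_aspects)
```

Applying such a statistic in a moving window over the gridded aspect data, and thresholding low R, is one way the described roughness mapping can be automated.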

We have developed an innovative hybrid problem-based learning (PBL) methodology. The methodology has the following distinctive features: (i) Each complex question was decomposed into a set of coherent finer subquestions by following carefully designed criteria to maintain a delicate balance between guiding the students and inspiring them to think independently. This learning methodology enabled the students to solve the complex questions progressively in an inductive context. (ii) Facilitated by the utilization of our web-based learning systems, the teacher was able to interact with the students intensively and could allocate more teaching time to provide tailor-made feedback for individual students. The students were actively engaged in the learning activities, stimulated by the intensive interaction. (iii) The answers submitted by the students could be automatically consolidated in the report of the Moodle system in real time. The teacher could adjust the teaching schedule and focus of the class to adapt to the learning progress of the students by analysing the automatically generated report and log files of the web-based learning system. As a result, the attendance rate of the students increased from about 50% to more than 90%, and the students' learning motivation has been significantly enhanced.

This article presents the proposed application of one type of modified Shewhart control chart to monitoring changes in the aggregated level of financial ratios. The x̅ control chart has been used as the basis of the analysis. In that chart, the examined sample statistic is the arithmetic mean. The author proposes to substitute it with a synthetic measure determined from the selected ratios. As the ratios mentioned above are expressed in different units and are of different character, the author applies standardisation. The results of selected comparative analyses are presented for both bankrupts and non-bankrupts. They indicate the possibility of using control charts as an auxiliary tool in financial analyses.
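A minimal sketch of the idea, assuming illustrative ratio values and equal weighting of the standardised ratios in the synthetic measure (the paper's actual ratio selection and weighting are not reproduced here):

```python
import numpy as np

# Hypothetical reference sample: rows = firms, columns = financial ratios
ratios = np.array([
    [1.8, 0.12, 0.45],
    [2.1, 0.15, 0.50],
    [1.6, 0.10, 0.40],
    [2.0, 0.14, 0.48],
    [1.9, 0.13, 0.47],
])

# Standardise each ratio (they have different units), then aggregate
# into a synthetic measure (here: simple mean of standardised ratios)
mu, sigma = ratios.mean(axis=0), ratios.std(axis=0, ddof=1)
synthetic = ((ratios - mu) / sigma).mean(axis=1)

# Shewhart-style control limits from the reference sample: centre +/- 3 sigma
cl = synthetic.mean()
ucl = cl + 3 * synthetic.std(ddof=1)
lcl = cl - 3 * synthetic.std(ddof=1)

# Monitor a new observation (a deteriorating firm), standardised the same way
new_firm = np.array([0.4, -0.05, 0.05])
z_new = ((new_firm - mu) / sigma).mean()
print(z_new < lcl)  # signal: synthetic measure below the lower control limit
```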

Back-analyses are useful for characterizing the geomorphological and mechanical processes and parameters involved in the initiation and propagation of landslides. These processes and parameters can in turn be used for improving forecasts of scenarios and hazard assessments in areas or sites which have similar settings to the back-analysed cases. The selection of the modeled landslide that produces the best agreement with the actual observations requires running a number of simulations by varying the type of model and the sets of input parameters. The comparison of the simulated and observed parameters is normally performed by visual comparison of geomorphological or dynamic variables (e.g., geometry of scarp and final deposit, maximum velocities and depths). Over the past six years, a method developed by NGI has been used by some researchers for a more objective selection of back-analysed input model parameters. That method includes an adaptation of the equations for the calculation of classifiers, and a comparative evaluation of classifiers of the selected parameter sets in the Receiver Operating Characteristic (ROC) space. This contribution presents an update of the methodology. The proposed procedure allows comparisons between two or more "clouds" of classifiers. Each cloud represents the performance of a model over a range of input parameters (e.g., samples of probability distributions). Considering the fact that each cloud does not necessarily produce a full ROC curve, two new normalised ROC-space parameters are introduced for characterizing the performance of each cloud. The first parameter is representative of the cloud position relative to the point of perfect classification. The second parameter characterizes the position of the cloud relative to the theoretically perfect ROC curve and the no-discrimination line. The methodology is illustrated with back-analyses of slope stability and landslide runout of selected case studies. This research activity has been
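The abstract does not give the exact formulas for the two normalised ROC-space parameters, so the following is only one plausible formalisation: a mean distance from the cloud to the perfect-classification point (0, 1), and a mean signed distance above the no-discrimination line. The classifier clouds are hypothetical:

```python
import numpy as np

def cloud_summary(fpr, tpr):
    """Two normalised ROC-space summaries for a 'cloud' of classifiers.
    d_perfect: mean distance to the perfect point (0, 1), divided by
    sqrt(2) so 0 is perfect and 1 is the opposite corner (1, 0).
    d_diag: mean vertical distance above the no-discrimination line
    tpr = fpr, so 1 is perfect and 0 is no better than chance."""
    fpr, tpr = np.asarray(fpr, float), np.asarray(tpr, float)
    d_perfect = np.hypot(fpr, 1.0 - tpr).mean() / np.sqrt(2.0)
    d_diag = (tpr - fpr).mean()
    return d_perfect, d_diag

# Two hypothetical clouds: a well-performing parameter set vs a weak one
good = cloud_summary(fpr=[0.05, 0.10, 0.15], tpr=[0.85, 0.90, 0.92])
weak = cloud_summary(fpr=[0.35, 0.40, 0.45], tpr=[0.45, 0.55, 0.60])
print(good, weak)
```

Under this formalisation, the parameter set whose cloud sits closer to (0, 1) and further above the diagonal would be preferred in the back-analysis.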

Determining whether a conflict between gene trees and species trees represents incomplete lineage sorting (ILS) or hybridization involving native and/or invasive species has implications for reconstructing evolutionary relationships and guiding conservation decisions. Among vertebrates, turtles represent an exceptional case for exploring these issues because of the propensity for even distantly related lineages to hybridize. In this study we investigate a group of freshwater turtles (Trachemys) from a part of its range (the Greater Antilles) where it is purported to have undergone reticulation events from both natural and anthropogenic processes. We sequenced mtDNA for 83 samples, sequenced three nuDNA markers for 45 samples, and cloned 29 polymorphic sequences, to identify species boundaries, hybridization, and intergrade zones for Antillean Trachemys and nearby mainland populations. Initial coalescent analyses of phased nuclear alleles (using (*)BEAST) recovered a Bayesian species tree that strongly conflicted with the mtDNA phylogeny and traditional taxonomy, and appeared to be confounded by hybridization. Therefore, we undertook exploratory phylogenetic analyses of mismatched alleles from the "coestimated" gene trees (Heled and Drummond, 2010) in order to identify potential hybrid origins. The geography, morphology, and sampling context of most samples with potential introgressed alleles suggest hybridization over ILS. We identify contact zones between different species on Jamaica (T. decussata × T. terrapen), on Hispaniola (T. decorata × T. stejnegeri), and in Central America (T. emolli × T. venusta). We are unable to determine whether the distribution of T. decussata on Jamaica is natural or the result of prehistoric introduction by Native Americans. This uncertainty means that the conservation status of the Jamaican T. decussata populations and contact zone with T. terrapen are unresolved. Human-mediated dispersal events were more conclusively implicated

The coupling of molecular dynamics (MD) simulations with finite element methods (FEM) yields computationally efficient models that link fundamental material processes at the atomistic level with continuum field responses at higher length scales. The theoretical challenge involves developing a seamless connection along an interface between two inherently different simulation frameworks. Various specialized methods have been developed to solve particular classes of problems. Many of these methods link the kinematics of individual MD atoms with FEM nodes at their common interface, necessarily requiring that the finite element mesh be refined to atomic resolution. Some of these coupling approaches also require simulations to be carried out at 0 K and restrict modeling to two-dimensional material domains due to difficulties in simulating full three-dimensional material processes. In the present work, a new approach to MD-FEM coupling is developed based on a restatement of the standard boundary value problem used to define a coupled domain. The method replaces a direct linkage of individual MD atoms and finite element (FE) nodes with a statistical averaging of atomistic displacements in local atomic volumes associated with each FE node in an interface region. The FEM and MD computational systems are effectively independent and communicate only through an iterative update of their boundary conditions. With the use of statistical averages of the atomistic quantities to couple the two computational schemes, the developed approach is referred to as an embedded statistical coupling method (ESCM). ESCM provides an enhanced coupling methodology that is inherently applicable to three-dimensional domains, avoids discretization of the continuum model to atomic scale resolution, and permits finite temperature states to be applied.

Statistical experiments, more commonly referred to as Monte Carlo or simulation studies, are used to study the behavior of statistical methods and measures under controlled situations. Whereas recent computing and methodological advances have permitted increased efficiency in the simulation process, known as variance reduction, such experiments remain limited by their finite nature and hence are subject to uncertainty; when a simulation is run more than once, different results are obtained. However, virtually no emphasis has been placed on reporting the uncertainty, referred to here as Monte Carlo error, associated with simulation results in the published literature, or on justifying the number of replications used. These deserve broader consideration. Here we present a series of simple and practical methods for estimating Monte Carlo error as well as determining the number of replications required to achieve a desired level of accuracy. The issues and methods are demonstrated with two simple examples, one evaluating operating characteristics of the maximum likelihood estimator for the parameters in logistic regression and the other in the context of using the bootstrap to obtain 95% confidence intervals. The results suggest that in many settings, Monte Carlo error may be more substantial than traditionally thought.
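A simple sketch of the kind of calculation advocated above, using coverage of a nominal 95% confidence interval as the illustrative operating characteristic (the simulation setup and target error are assumptions for the sketch):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulation study: coverage of a nominal 95% z-interval for a normal mean, n = 20
R, n = 2000, 20
covered = np.empty(R, dtype=bool)
for r in range(R):
    x = rng.normal(loc=0.0, scale=1.0, size=n)
    half = 1.96 * x.std(ddof=1) / np.sqrt(n)
    covered[r] = abs(x.mean()) <= half

p_hat = covered.mean()
# Monte Carlo standard error of the estimated coverage proportion
mc_se = np.sqrt(p_hat * (1 - p_hat) / R)

# Replications needed so the Monte Carlo error falls below a target (e.g. 0.002)
target = 0.002
R_needed = int(np.ceil(p_hat * (1 - p_hat) / target**2))
print(round(p_hat, 3), round(mc_se, 4), R_needed)
```

Reporting `mc_se` alongside `p_hat`, and justifying `R` against a target error, is exactly the practice the paper argues is usually missing.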

Nickel-Hydrogen (Ni/H2) secondary batteries will be implemented as a power source for the Space Station Freedom as well as for other NASA missions. Consequently, characterization tests of Ni/H2 cells from Eagle-Picher, Whittaker-Yardney, and Hughes were completed at the NASA Lewis Research Center. Watt-hour efficiencies of each Ni/H2 cell were measured for regulated charge and discharge cycles as a function of temperature, charge rate, discharge rate, and state of charge. Temperatures ranged from -5 C to 30 C, charge rates ranged from C/10 to 1C, discharge rates ranged from C/10 to 2C, and states of charge ranged from 20 percent to 100 percent. Results from regression analyses and analyses of mean watt-hour efficiencies demonstrated that overall performance was best at temperatures between 10 C and 20 C while the discharge rate correlated most strongly with watt-hour efficiency. In general, the cell with back-to-back electrode arrangement, single stack, 26 percent KOH, and serrated zircar separator and the cell with a recirculating electrode arrangement, unit stack, 31 percent KOH, zircar separators performed best.
Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in literature. This limits both comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10-50 waste fractions, organised according to a three-level (tiered) approach, facilitating comparison of the waste data between individual sub-areas with different fractionation (waste...

This paper provides a review and introduction on agile manufacturing. Tactics of agile manufacturing are mapped into different production areas (eight latent constructs: manufacturing equipment and technology, processes technology and know-how, quality and productivity improvement, production planning and control, shop floor management, product design and development, supplier relationship management, and customer relationship management). The implementation level of agile manufacturing tactics is investigated in each area. A structural equation model is proposed. Hypotheses are formulated. Feedback from 456 firms is collected using a five-point Likert-scale questionnaire. Statistical analysis is carried out using IBM SPSS and AMOS. Multicollinearity, content validity, consistency, construct validity, ANOVA analysis, and relationships between agile components are tested. The results of this study prove that agile manufacturing tactics have a positive effect on the overall agility level. This conclusion can be used by manufacturing firms to manage challenges when trying to be agile.

This Ph.D. thesis deals with mathematical and statistical modeling of synaptic vesicle distribution, shape, orientation and interactions. The first major part of this thesis treats the problem of determining the effect of stress on synaptic vesicle distribution and interactions. Serial section transmission electron microscopy is used to acquire images from two experimental groups of rats: 1) rats subjected to a behavioral model of stress and 2) rats subjected to sham stress as the control group. The synaptic vesicle distribution and interactions are modeled by employing a point process approach. The model is able to correctly separate the two experimental groups. Two different approaches to estimate the thickness of each section of specimen being imaged are introduced. The first approach uses the Darboux frame and Cartan matrix to measure the isophote curvature and the second approach is based...

In this review, we compare different descriptions of photon-number statistics in harmonic generation processes within quantum, classical and semiclassical approaches. First, we study the exact quantum evolution of the harmonic generation by applying numerical methods including those of Hamiltonian diagonalization and global characteristics. We show explicitly that the harmonic generations can indeed serve as a source of nonclassical light. Then, we demonstrate that the quasi-stationary sub-Poissonian light can be generated in these quantum processes under conditions corresponding to the so-called no-energy-transfer regime known in classical nonlinear optics. By applying the method of classical trajectories, we demonstrate that the analytical predictions of the Fano factors are in good agreement with the quantum results. On comparing second and higher harmonic generations in the no-energy-transfer regime, we show that the highest noise reduction is achieved in third-harmonic generation with the Fano-factor of the ...
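The Fano factor itself is straightforward to estimate from photon-count records. A sketch with simulated counts, using binomial statistics merely as a stand-in for sub-Poissonian light (the parameters are illustrative, not the paper's):

```python
import numpy as np

def fano(counts):
    """Fano factor F = Var(n) / <n>; F = 1 for Poissonian light,
    F < 1 signals sub-Poissonian (nonclassical) photon statistics."""
    counts = np.asarray(counts, dtype=float)
    return counts.var(ddof=1) / counts.mean()

rng = np.random.default_rng(1)

poissonian = rng.poisson(lam=50, size=100_000)             # coherent light
sub_poissonian = rng.binomial(n=100, p=0.5, size=100_000)  # variance < mean

print(round(fano(poissonian), 2), round(fano(sub_poissonian), 2))
```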

Characterizing and classifying molecular variation within biological samples is critical for determining fundamental mechanisms of biological processes that will lead to new insights including improved disease understanding. Towards these ends, time-of-flight secondary ion mass spectrometry (ToF-SIMS) was used to examine increasingly complex samples of biological relevance, including monosaccharide isomers, pure proteins, complex protein mixtures, and mouse embryo tissues. The complex mass spectral data sets produced were analyzed using five common statistical and chemometric multivariate analysis techniques: principal component analysis (PCA), linear discriminant analysis (LDA), partial least squares discriminant analysis (PLSDA), soft independent modeling of class analogy (SIMCA), and decision tree analysis by recursive partitioning. PCA was found to be a valuable first step in multivariate analysis, providing insight both into the relative groupings of samples and into the molecular basis for those groupings. For the monosaccharides, pure proteins and protein mixture samples, all of LDA, PLSDA, and SIMCA were found to produce excellent classification given a sufficient number of compound variables calculated. For the mouse embryo tissues, however, SIMCA did not produce as accurate a classification. The decision tree analysis was found to be the least successful for all the data sets, providing neither as accurate a classification nor chemical insight for any of the tested samples. Based on these results we conclude that as the complexity of the sample increases, so must the sophistication of the multivariate technique used to classify the samples. PCA is a preferred first step for understanding ToF-SIMS data that can be followed by either LDA or PLSDA for effective classification analysis. This study demonstrates the strength of ToF-SIMS combined with multivariate statistical and chemometric techniques to classify increasingly complex biological samples
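The PCA-then-LDA pipeline recommended above can be sketched on synthetic two-class "spectra". This is a minimal stand-in, not the study's workflow: PCA is done via SVD and a two-class Fisher LDA is implemented directly, with illustrative peak positions and class sizes:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "spectra": two sample classes, 40 peak intensities each;
# class B differs from class A on the first five peaks
A = rng.normal(0.0, 1.0, size=(30, 40))
B = rng.normal(0.0, 1.0, size=(30, 40))
B[:, :5] += 2.0
X = np.vstack([A, B])
y = np.array([0] * 30 + [1] * 30)

# Step 1: PCA via SVD of the mean-centred data, keep 5 components
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:5].T

# Step 2: two-class Fisher LDA in the reduced PCA space
m0, m1 = scores[y == 0].mean(axis=0), scores[y == 1].mean(axis=0)
Sw = np.cov(scores[y == 0].T) + np.cov(scores[y == 1].T)  # within-class scatter
w = np.linalg.solve(Sw, m1 - m0)                          # discriminant direction
threshold = w @ (m0 + m1) / 2
pred = (scores @ w > threshold).astype(int)

accuracy = (pred == y).mean()
print(accuracy)
```

This mirrors the paper's conclusion in miniature: PCA reduces the spectra to a few interpretable components, and the discriminant step then performs the actual classification.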

Pharmacodynamic (PD) clinical studies are characterised by a high degree of multiplicity. This multiplicity is the result of the design of these studies, which typically investigate the effects of a number of biomarkers at various doses and multiple time points. Measurements are taken at many or all points of a "hyper-grid" that can be understood as the cross-product of a number of dimensions, each of which has typically 3-30 discrete values. This exploratory design helps in understanding the phenomena under investigation, but has made a confirmatory statistical analysis of these studies difficult, so that such an analysis is often missing from studies of this type. In this contribution we show that the cross-product structure of PD studies makes it possible to combine several well-known techniques to address multiplicity in an effective way, so that a confirmatory analysis of these studies becomes feasible without unrealistic loss of power. We demonstrate the application of this technique in two studies that use the quantitative EEG (qEEG) as a biomarker for drug activity at the GABA-A receptor. QEEG studies suffer particularly from the curse of multiplicity since, in addition to the common dimensions like dose and time, the qEEG is measured at many locations over the scalp and in a number of frequency bands, which inflates the multiplicity by a factor of about 250.

The aim of cost-utility analysis is to support decision making in healthcare by providing a standardised mechanism for comparing resource use and health outcomes across programmes of work. The focus of this paper is the denominator of the cost-utility analysis, specifically the methodology and statistical challenges associated with calculating QALYs from patient-level data collected as part of a trial. We provide a brief description of the most common questionnaire used to calculate patient-level utility scores, the EQ-5D, followed by a discussion of other ways to calculate patient-level utility scores alongside a trial, including other generic measures of health-related quality of life and condition- and population-specific questionnaires. Detail is provided on how to calculate the mean QALYs per patient, including discounting, adjusting for baseline differences in utility scores and a discussion of the implications of different methods for handling missing data. The methods are demonstrated using data from a trial. As the methods chosen can systematically change the results of the analysis, it is important that standardised methods such as patient-level analysis are adhered to as closely as possible. Regardless, researchers need to ensure that they are sufficiently transparent about the methods they use so as to provide the best possible information to aid in healthcare decision making.
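A minimal sketch of the QALY calculation described above, assuming hypothetical EQ-5D utility scores, trapezoidal area under the utility curve, and a 3.5% annual discount rate applied at interval midpoints (one of several reasonable discounting conventions; baseline adjustment and missing-data handling are not shown):

```python
import numpy as np

def qalys(times_years, utilities, rate=0.035):
    """QALYs as the area under the utility curve (trapezoidal rule),
    with each interval's area discounted at the given annual rate."""
    t = np.asarray(times_years, dtype=float)
    u = np.asarray(utilities, dtype=float)
    mid = (t[:-1] + t[1:]) / 2                # midpoint of each interval (years)
    area = (u[:-1] + u[1:]) / 2 * np.diff(t)  # undiscounted trapezoids
    return float(np.sum(area / (1 + rate) ** mid))

# Hypothetical EQ-5D utility scores at baseline, 6, 12, 18 and 24 months
t = [0.0, 0.5, 1.0, 1.5, 2.0]
u = [0.60, 0.70, 0.75, 0.72, 0.74]
print(round(qalys(t, u), 3))
```

Discounting lowers the total slightly relative to the undiscounted area, which matters when trial follow-up exceeds one year.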

Purpose. Definition and analysis of the probabilistic and spectral characteristics of the random current in the regenerative braking mode of DC electric rolling stock are presented in this paper. Methodology. The elements and methods of probability theory (particularly the theory of stationary and non-stationary processes) and methods of sampling theory are used for processing the regenerated-current data arrays by PC. Findings. The regenerated-current records were obtained from locomotives and trains on Ukrainian railways and from trams in Poland. It was established that the current has both continuous and jumping variations in time (especially in trams). For the random current in the regenerative braking mode, the functions of mathematical expectation, dispersion and standard deviation are calculated. Histograms, probabilistic characteristics and correlation functions are also calculated and plotted for this current. It was established that the current in the regenerative braking mode can be considered a stationary and non-ergodic process. The spectral analysis of these records and of the "tail part" of the correlation function revealed weak periodic (low-frequency) components, known as interharmonics. Originality. Firstly, the theory of non-stationary random processes was adapted for the analysis of the recuperated current, which has continuous and jumping variations in time. Secondly, the presence of interharmonics in the stochastic process of regenerated current was identified for the first time. Finally, the patterns of temporal changes of the correlation function of the current are defined. This makes it possible to reasonably apply the correlation-function method in the identification of electric traction system devices. Practical value. The results of the probabilistic and statistical analysis of the recuperated current allow estimation of the quality of recovered energy and the energy quality indices of electric rolling stock in the

Endogenous retroviruses (ERVs) are a class of transposable elements found in all vertebrate genomes that contribute substantially to genomic functional and structural diversity. A host species acquires an ERV when an exogenous retrovirus infects a germ cell of an individual and becomes part of the genome inherited by viable progeny. ERVs that colonized ancestral lineages are fixed in contemporary species. However, in some extant species, ERV colonization is ongoing, which results in variation in ERV frequency in the population. To study the consequences of ERV colonization of a host genome, methods are needed to assign each ERV to a location in a species' genome and determine which individuals have acquired each ERV by descent. Because well annotated reference genomes are not widely available for all species, de novo clustering approaches provide an alternative to reference mapping that are insensitive to differences between query and reference and that are amenable to mobile element studies in both model and non-model organisms. However, there is substantial uncertainty in both identifying ERV genomic position and assigning each unique ERV integration site to individuals in a population. We present an analysis suitable for detecting ERV integration sites in species without the need for a reference genome. Our approach is based on improved de novo clustering methods and statistical models that take the uncertainty of assignment into account and yield a probability matrix of shared ERV integration sites among individuals. We demonstrate that polymorphic integrations of a recently identified endogenous retrovirus in deer reflect contemporary relationships among individuals and populations.

To study the stochastic response of a beam-soil structure under a moving random load, a hybrid approach based on the pseudo-excitation method and the wavelet method is proposed. Using the pseudo-excitation method, the non-stationary random vibration analysis is transformed into a conventional moving harmonic load problem. Analytical solutions of the power spectral density and standard deviation of vertical displacement are derived in an integral form. However, the integrand is singular and highly oscillatory, and the computational time is an important consideration because a large number of frequency points must be computed. To calculate the response accurately and efficiently, a wavelet approach is introduced. Numerical results show that the frequency band which brings the most significant response is dependent on the load velocity. The hybrid method provides a useful tool to estimate the ground vibration caused by traffic loads.
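The core idea of the pseudo-excitation method can be sketched on a single-degree-of-freedom stand-in for the beam-soil structure (the oscillator parameters, white-noise input and frequency grid are illustrative assumptions; the paper's moving-load kinematics and wavelet integration are not reproduced):

```python
import numpy as np

def sdof_response_psd(omega, S_in, m=1.0, c=0.2, k=10.0):
    """Pseudo-excitation method for a single-degree-of-freedom oscillator:
    feed the pseudo harmonic load sqrt(S_in(w)) * exp(i*w*t) through the
    frequency response; the response PSD is |pseudo response|^2."""
    H = 1.0 / (k - m * omega**2 + 1j * c * omega)  # receptance FRF
    y_tilde = np.sqrt(S_in) * H                    # pseudo response amplitude
    return np.abs(y_tilde) ** 2                    # equals |H|^2 * S_in

omega = np.linspace(0.1, 10.0, 2000)
S_in = np.ones_like(omega)           # white-noise load PSD (illustrative)
S_y = sdof_response_psd(omega, S_in)

# Standard deviation of displacement from integrating the response PSD
domega = omega[1] - omega[0]
sigma = float(np.sqrt(np.sum(S_y) * domega))
print(round(omega[int(np.argmax(S_y))], 2), round(sigma, 3))
```

The stationary-random-vibration problem thus reduces to a sweep of deterministic harmonic analyses, which is exactly the transformation the paper exploits before applying the wavelet quadrature to the oscillatory integrand.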

The commonly accepted view of synapsis is that only 2 homologues can synapse at any one site and that this restriction applies to polyploids as well. However, triple synapsis has been observed in some triploid plants and in triploid chicken. In humans, triple synapsis of the long arm of chromosome 21 was detected in sperm of a trisomic 21 individual. More recently, studies of oocytes from trisomic 21 and 18 fetuses also indicated extensive triple synapsis along the entire length of the chromosomes. To further investigate this question, we undertook an evaluation of trivalent synapsis in fetal oocytes from 2 trisomic 21 and 2 trisomic 18 fetuses using fluorescent in situ hybridization (FISH) with whole chromosome probes. Oocytes were hybridized with whole chromosome probes obtained from ONCOR, Inc. after fixation with methanol and acetic acid. Slides were scored for the distribution of prophase stages, hybridization efficiency, and hybridization characteristics of chromosomes 18 and 21 in the trisomic 18 and 21 fetuses respectively. Fifty-eight per cent (379/650) of pachytenes analyzed for chromosome 18 contained a conspicuous trivalent and 319 (48%) of these nuclei contained a single, thick, continuous fluorescent signal consistent with complete triple synapsis along the entire length of all 3 chromosomes. Sixteen per cent (104/650) of pachytenes contained 2 signals consistent with a bivalent and a univalent, and 9 cells contained 3 thin signals consistent with asynapsis of all 3 chromosomes. The remaining 158 pachytenes had unusual pairing configurations that we could not classify, but they most likely represent trivalents with partial pairing between different homologues. In the 2 trisomic 21 fetuses, the majority (143/232) of pachytenes also contained one signal while only 52 cells contained a bivalent and univalent. Five cells contained 3 separate signals. These results confirm the existence of triple synapsis in human meiosis.

Traffic crashes in Riyadh city cause losses in the form of deaths, injuries and property damage, in addition to the pain and social tragedy affecting families of the victims. In 2005, there were a total of 47,341 injury traffic crashes in Riyadh city (19% of the total KSA crashes), and 9% of those crashes were severe. Road safety in Riyadh city may have been adversely affected by: high car ownership, migration of people to Riyadh city, high daily trips that reached about 6 million, a high rate of income, low-cost petrol, drivers of different nationalities, young drivers and tremendous growth in population, which creates a high level of mobility and transport activities in the city. The primary objective of this paper is therefore to explore factors affecting the severity and frequency of road crashes in Riyadh city using appropriate statistical models, aiming to establish effective safety policies ready to be implemented to reduce the severity and frequency of road crashes in Riyadh city. Crash data for Riyadh city were collected from the Higher Commission for the Development of Riyadh (HCDR) for a period of five years from 1425H to 1429H (roughly corresponding to 2004-2008). Crash data were classified into three categories: fatal, serious-injury and slight-injury. Two nominal response models have been developed: a standard multinomial logit model (MNL) and a mixed logit model applied to injury-related crash data. Due to a severe underreporting problem for slight-injury crashes, binary and mixed binary logistic regression models were also estimated for two categories of severity: fatal and serious crashes. For frequency, two count models such as Negative Binomial (NB) models were employed and the unit of analysis was 168 HAIs (wards) in Riyadh city. Ward-level crash data are disaggregated by severity of the crash (such as fatal and serious injury crashes). The results from both multinomial and binary response models are found to be fairly consistent but

Both statistical and rule-based approaches to part-of-speech (POS) disambiguation have their own advantages and limitations. Especially for Korean, the narrow windows provided by hidden Markov models (HMMs) cannot cover the necessary lexical and long-distance dependencies for POS disambiguation. On the other hand, the rule-based approaches are not accurate and flexible with respect to new tag-sets and languages. In this regard, a statistical/rule-based hybrid method that can take advantage of both approaches is called for to achieve robust and flexible POS disambiguation. We present one such method, namely a two-phase learning architecture for hybrid statistical/rule-based POS disambiguation, especially for Korean. In this method, the statistical learning of morphological tagging is error-corrected by the rule-based learning of a Brill [1992] style tagger. We also design a hierarchical and flexible Korean tag-set to cope with multiple tagging applications, each of which requires a different tag-set. Our experiments s...

In recent years, the utilization of hybrid polymer matrix composite materials in many engineering fields has increased tremendously. The present investigation is devoted to fabric-reinforced hybrid composite laminates with different volume fractions of the constituent materials: epoxy resin, plain-woven glass fabric, and textile satin fabric. Fracture toughness of a material has immense importance in the determination of the resistance of the material to crack propagation. Hence this article explores the findings of the experimentation on the compressive strength and fracture toughness of fabric-reinforced laminates with 0/90° and ±45° orientation with five notch configurations. The fracture toughness has been found to increase continuously with increased volumes of glass fabric and it is less dependent on notch size up to a certain limit. Data collected during experimentation are validated using the analysis of variance (ANOVA) technique. The percentage contribution of each parameter was evaluated using the ANOVA technique with fiber content, orientation and notch size as input parameters, while the output parameter is the OHC strength of the laminate.

Both reservoir engineers and petrophysicists have been concerned about dividing a reservoir into zones for engineering and petrophysics purposes. Through decades, several techniques and approaches were introduced. Out of them, statistical reservoir zonation, stratigraphic modified Lorenz (SML) plot and the principal component and clustering analyses techniques were chosen to apply on the Nubian sandstone reservoir of Palaeozoic - Lower Cretaceous age, Gulf of Suez, Egypt, by using five adjacent wells. The studied reservoir consists mainly of sandstone with some intercalation of shale layers with varying thickness from one well to another. The permeability ranged from less than 1 md to more than 1000 md. The statistical reservoir zonation technique, depending on core permeability, indicated that the cored interval of the studied reservoir can be divided into two zones. Using reservoir properties such as porosity, bulk density, acoustic impedance and interval transit time indicated also two zones with an obvious variation in separation depth and zones continuity. The stratigraphic modified Lorenz (SML) plot indicated the presence of more than 9 flow units in the cored interval as well as a high degree of microscopic heterogeneity. On the other hand, principal component and cluster analyses, depending on well logging data (gamma ray, sonic, density and neutron), indicated that the whole reservoir can be divided at least into four electrofacies having a noticeable variation in reservoir quality, as correlated with the measured permeability. Furthermore, continuity or discontinuity of the reservoir zones can be determined using this analysis.
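The stratigraphic modified Lorenz construction mentioned above can be sketched as cumulative flow capacity (k·h) against cumulative storage capacity (φ·h), taken in stratigraphic order; segment slope then separates flow units from baffles. The core values below are hypothetical, not from the Nubian sandstone data:

```python
import numpy as np

# Hypothetical core data in stratigraphic order (top to bottom)
h = np.array([2.0, 1.5, 3.0, 2.5, 1.0])         # layer thickness, m
k = np.array([500., 20., 900., 5., 150.])       # permeability, md
phi = np.array([0.22, 0.12, 0.25, 0.08, 0.18])  # porosity, fraction

# SML plot: cumulative flow capacity (k*h) vs cumulative storage
# capacity (phi*h), both normalised to end at 1
flow = np.cumsum(k * h) / np.sum(k * h)
storage = np.cumsum(phi * h) / np.sum(phi * h)

# Slope of each segment indicates flow-unit quality:
# steep = speed zone / flow unit, flat = baffle or barrier
slopes = np.diff(np.insert(flow, 0, 0.0)) / np.diff(np.insert(storage, 0, 0.0))
print(np.round(slopes, 2))
```

Inflection points in the (storage, flow) curve are where flow-unit boundaries would be drawn; with real core data the curve typically shows many more segments, consistent with the 9+ flow units reported above.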

The results of an extensive set of parametric studies are presented which provide analytical data of the effects of various tokamak parameters on the performance and cost of the DTHR (Demonstration Tokamak Hybrid Reactor). The studies were centered on a point design which is described in detail. Variations in the device size, neutron wall loading, and plasma aspect ratio are presented, and the effects on direct hardware costs, fissile fuel production (breeding), fusion power production, electrical power consumption, and thermal power production are shown graphically. The studies considered both ignition and beam-driven operations of DTHR and yielded results based on two empirical scaling laws presently used in reactor studies. Sensitivity studies were also made for variations in the following key parameters: the plasma elongation, the minor radius, the TF coil peak field, the neutral beam injection power, and the Z_eff of the plasma.

A 3-dimensional hybrid stress element with a traction-free cylindrical surface, based on a modified complementary energy principle, has been derived for efficient and accurate analysis of stress concentration around circular cutouts in thin to thick laminated composites. New expressions of six stress components are developed by using three stress functions in cylindrical co-ordinates, so that the homogeneous equilibrium equations, the interlayer surface transverse stresses and the traction-free boundary condition on the cylindrical surface are satisfied exactly, while the interelement traction continuity has been relaxed via the Lagrange multiplier method. Transverse-shear deformation effects are incorporated in each layer, with displacement continuity enforced along the interlayer surface. Selected examples are used to demonstrate the efficiency and accuracy of the present special element.

The two tests employed in the hybrid testing scheme are Page’s cumulative sums for all streams within a Balance Period (maximum of the maximums and average of the maximums) and Crosier’s multivariate cumulative sum applied to incremental cumulative sums across Balance Periods. The role of residuals for both kinds of data is discussed.
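For a single stream, Page's cumulative sum can be sketched as follows (a minimal one-sided illustration, not the hybrid scheme's exact parameterization; the reference value `k` and decision threshold `h` are hypothetical choices):

```python
def page_cusum(xs, k=0.5, h=4.0):
    """One-sided Page CUSUM: accumulate excursions of (x - k) above
    zero and signal when the running sum exceeds the threshold h."""
    s = 0.0
    for i, x in enumerate(xs):
        s = max(0.0, s + x - k)   # resets to zero under in-control drift
        if s > h:
            return i              # index of the first alarm
    return None                   # no alarm raised
```

The maximum-of-maximums and average-of-maximums statistics mentioned above would then be computed over such per-stream CUSUM values within a Balance Period.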

Objective: develop a classificatory tool to identify different populations of patients with traumatic brain injury based on the characteristics of their deficits and their response to treatment. We used a KDD framework in which descriptive statistics were first computed for every variable, followed by data cleaning and selection of relevant variables. The data were then mined using a generalization of clustering based on rules (CIBR), a hybrid AI and statistics technique which combines inductive learning (AI) and clustering (statistics). A prior knowledge base (KB) is used to properly bias the clustering; the semantic constraints implied by the KB hold in the final clusters, guaranteeing interpretability of the results. A generalization (exogenous clustering based on rules, ECIBR) is presented that allows the KB to be defined in terms of variables which are not themselves considered in the clustering process, for greater flexibility. Several tools, such as the class panel graph, are introduced into the methodology to assist final interpretation. A set of 5 classes was recommended by the system, and their interpretation permitted profile labeling. From the medical point of view, the composition of the classes corresponds well with different patterns of increasing response to rehabilitation treatment. All the patients who were initially assessable form a single group, while severely impaired patients are subdivided into four profiles with clearly distinct response patterns. Particularly interesting is the partial-response profile, in which patients could not improve executive functions. Meaningful classes were obtained and, from a semantic point of view, the results were appreciably improved relative to classical clustering, supporting our view that hybrid AI and statistics techniques are more powerful for KDD than pure ones.

In this study, the range-extender engine designed should be able to meet the power needs of the generator of a hybrid electric vehicle, a minimum of 18 kW. Using this baseline model, the following range extenders are compared: a conventional SI piston engine (baseline, BsL) with a capacity of 1998 cm3, and efficiency-oriented SI piston gasoline engines with capacities of 999 cm3 and 499 cm3 and a square 86 mm bore and stroke. The comparison covers engine performance, predicted emissions, and state of charge, using engine and vehicle simulation software tools. In the AVL Boost simulation software, the range-extender engine was simulated at engine loads from 1000 to 6000 rpm. The peak brake power reached 38 kW at 4500 rpm, while the highest torque achieved was 100 Nm at 3500 rpm. Then, using the AVL Cruise simulation software, a model of the range-extended electric vehicle in series configuration, with main components such as the internal combustion engine, generator, electric motor and battery, was simulated over the Artemis rural road cycle. The simulation results for the 999 cm3 engine report the economic performance of the engine, its emissions, and the control of engine cycle parameters.

BACKGROUND: The genetic diversity of Trypanosoma cruzi, the etiological agent of Chagas disease, has been traditionally divided in two major groups, T. cruzi I and II, corresponding to discrete typing units TcI and TcII-VI under a recently proposed nomenclature. The two major groups of T. cruzi seem to differ in important biological characteristics, and are thus thought to represent a natural division relevant for epidemiological studies and development of prophylaxis. To understand the potential connection between the different manifestations of Chagas disease and variability of T. cruzi strains, it is essential to have a correct reconstruction of the evolutionary history of T. cruzi. METHODOLOGY/PRINCIPAL FINDINGS: Nucleotide sequences from 32 unlinked loci (>26 kilobases of aligned sequence) were used to reconstruct the evolutionary history of strains representing the known genetic variability of T. cruzi. Thorough phylogenetic analyses show that the original classification of T. cruzi in two major lineages does not reflect its evolutionary history and that there is only strong evidence for one major and recent hybridization event in the history of this species. Furthermore, estimates of divergence times using Bayesian methods show that current extant lineages of T. cruzi diverged very recently, within the last 3 million years, and that the major hybridization event leading to hybrid lineages TcV and TcVI occurred less than 1 million years ago, well before the contact of T. cruzi with humans in South America. CONCLUSIONS/SIGNIFICANCE: The described phylogenetic relationships among the six major genetic subdivisions of T. cruzi should serve as guidelines for targeted epidemiological and prophylaxis studies. We suggest that it is important to reconsider conclusions from previous studies that have attempted to uncover important biological differences between the two originally defined major lineages of T. cruzi especially if those conclusions

We have developed software and statistical tools for linkage analysis of polygenic diseases. We use type I diabetes mellitus (insulin-dependent diabetes mellitus, IDDM) as our model system. Two susceptibility loci (IDDM1 on 6p21 and IDDM2 on 11p15) are well established, and recent genome searches suggest the existence of other susceptibility loci. We have implemented CASPAR, a software tool that makes it possible to test for linkage quickly and efficiently using multiple polymorphic DNA markers simultaneously in nuclear families consisting of two unaffected parents and a pair of affected siblings (ASP). We use a simulation-based method to determine whether lod scores from a collection of ASP tests are significant. We test our new software and statistical tools to assess linkage of IDDM5 and IDDM7 conditioned on analyses with 1 or 2 other unlinked type I diabetes susceptibility loci. The results from the CASPAR analysis suggest that conditioning of IDDM5 on IDDM1 and IDDM4, and of IDDM7 on IDDM1 and IDDM2 provides significant benefits for the genetic analysis of polygenic loci.

Folding and unfolding simulations of three-dimensional lattice proteins were analyzed using a simplified statistical mechanical model in which their amino acid sequences and native conformations were incorporated explicitly. Using this statistical mechanical model, under the assumption that only interactions between amino acid residues within a local structure in a native state are considered, the partition function of the system can be calculated for a given native conformation without any adjustable parameter. The simulations were carried out for two different native conformations, for each of which two foldable amino acid sequences were considered. The native and non-native contacts between amino acid residues occurring in the simulations were examined in detail and compared with the results derived from the theoretical model. The equilibrium thermodynamic quantities (free energy, enthalpy, entropy, and the probability of each amino acid residue being in the native state) at various temperatures obtained from the simulations and the theoretical model were also examined in order to characterize the folding processes that depend on the native conformations and the amino acid sequences. Finally, the free energy landscapes were discussed based on these analyses.

The objectives of this technical paper were to propose the optimum partial replacement of cement by fly ash, based on the complete consumption of calcium hydroxide from the hydration reactions of cement and on the long-term strength activity index based on equivalent calcium silicate hydrate, and to quantify the propagation of uncertainty due to the randomness inherent in the main chemical compositions of cement and fly ash. First, the hydration and pozzolanic reactions and their stoichiometry are reviewed. Then the optimum partial replacement of cement by fly ash is formulated. After that, the propagation of uncertainty due to the main chemical compositions of cement and fly ash is discussed, and reliability analyses for applying a suitable replacement are reviewed. Finally, the applicability of these concepts is demonstrated using statistical data on the available materials. The results of the analyses were consistent with test results by other researchers. The results of this study provide guidelines for the suitable use of fly ash as a partial replacement for cement. It is interesting to note that these concepts can be extended to optimize the partial replacement of cement by other types of pozzolan, as described in the authors' other papers.

Phylogenetic comparisons of the different mammalian genetic transmission elements (mtDNA, X-, Y-, and autosomal DNA) are a powerful approach for understanding the process of speciation in nature. Through such comparisons, the unique inheritance pathways of each genetic element and gender-biased processes can link genomic structure to the evolutionary process, especially among lineages which have recently diversified, in which genetic isolation may be incomplete. Bulldog bats of the genus Noctilio are an exemplar lineage, being a young clade, widely distributed, and exhibiting unique feeding ecologies. In addition, currently recognized species are paraphyletic with respect to the mtDNA gene tree and contain morphologically identifiable clades that exhibit mtDNA divergences as great as among many species. To test taxonomic hypotheses and understand the contribution of hybridization to the extant distribution of genetic diversity in Noctilio, we used phylogenetic, coalescent stochastic modeling, and divergence time estimates using sequence data from cytochrome-b, cytochrome c oxidase-I, zinc finger Y, and zinc finger X, as well as evolutionary reconstructions based on amplified fragment length polymorphism (AFLP) data. No evidence of ongoing hybridization between the two currently recognized species was identified. However, signatures of an ancient mtDNA capture were recovered in which an mtDNA lineage of one species was captured early in the noctilionid radiation. Among subspecific mtDNA clades, which were generally coincident with morphology and statistically definable as species, signatures of ongoing hybridization were observed in sex chromosome sequences and AFLP. Divergence dating of genetic elements corroborates the diversification of extant Noctilio beginning about 3 Ma, with ongoing hybridization between mitochondrial lineages separated by 2.5 myr. The timeframe of species' divergence within Noctilio supports the hypothesis that shifts in the dietary

BACKGROUND: Individual participant data (IPD) meta-analyses that obtain "raw" data from studies rather than summary data typically adopt a "two-stage" approach to analysis whereby IPD within trials generate summary measures, which are combined using standard meta-analytical methods. Recently, a range of "one-stage" approaches which combine all individual participant data in a single meta-analysis have been suggested as providing a more powerful and flexible approach. However, they are more complex to implement and require statistical support. This study uses a dataset to compare "two-stage" and "one-stage" models of varying complexity, to ascertain whether results obtained from the approaches differ in a clinically meaningful way. METHODS AND FINDINGS: We included data from 24 randomised controlled trials, evaluating antiplatelet agents, for the prevention of pre-eclampsia in pregnancy. We performed two-stage and one-stage IPD meta-analyses to estimate overall treatment effect and to explore potential treatment interactions whereby particular types of women and their babies might benefit differentially from receiving antiplatelets. Two-stage and one-stage approaches gave similar results, showing a benefit of using anti-platelets (relative risk 0.90, 95% CI 0.84 to 0.97). Neither approach suggested that any particular type of women benefited more or less from antiplatelets. There were no material differences in results between different types of one-stage model. CONCLUSIONS: For these data, two-stage and one-stage approaches to analysis produce similar results. Although one-stage models offer a flexible environment for exploring model structure and are useful where across study patterns relating to types of participant, intervention and outcome mask similar relationships within trials, the additional insights provided by their usage may not outweigh the costs of statistical support for routine application in syntheses of randomised controlled

In random-effects model meta-analysis, the conventional DerSimonian-Laird (DL) estimator typically underestimates the between-trial variance. Alternative variance estimators have been proposed to address this bias. This study aims to empirically compare statistical inferences from random-effects model meta-analyses on the basis of the DL estimator and four alternative estimators, as well as distributional assumptions (normal distribution and t-distribution) about the pooled intervention effect. We evaluated the discrepancies of p-values, 95% confidence intervals (CIs) in statistically significant meta-analyses, and the degree (percentage) of statistical heterogeneity (e.g. I(2)) across 920 Cochrane primary outcome meta-analyses. In total, 414 of the 920 meta-analyses were statistically significant with the DL meta-analysis, and 506 were not. Compared with the DL estimator, the four...
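The DL moment estimator discussed above can be sketched as follows (a textbook formulation, assuming per-trial effect estimates `y` with within-trial variances `v`; not the code used in the study itself):

```python
import numpy as np

def dersimonian_laird(y, v):
    """DerSimonian-Laird moment estimator of the between-trial
    variance tau^2, truncated at zero."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    w = 1.0 / v                        # fixed-effect (inverse-variance) weights
    ybar = (w * y).sum() / w.sum()     # fixed-effect pooled estimate
    q = (w * (y - ybar) ** 2).sum()    # Cochran's Q statistic
    c = w.sum() - (w ** 2).sum() / w.sum()
    return max(0.0, (q - (len(y) - 1)) / c)
```

The truncation at zero is the source of the underestimation bias noted above: whenever Q falls below its degrees of freedom, the estimate collapses to zero even if true between-trial variance exists.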

The validity of weighted mean results estimated in meta-analysis has been criticized. This paper presents a set of simple statistical and graphical techniques that can be used in meta-analysis to evaluate common points of criticism. The graphical techniques are based on funnel graph diagrams. Problems and techniques for dealing with them that are discussed include: (1) the so-called 'apples and oranges' problem, stating that mean results in meta-analysis tend to gloss over important differences that should be highlighted. A test of the homogeneity of results is described for testing the presence of this problem. If results are highly heterogeneous, a random effects model of meta-analysis is more appropriate than the fixed effects model of analysis. (2) The possible presence of skewness in a sample of results. This can be tested by comparing the mode, median and mean of the results in the sample. (3) The possible presence of more than one mode in a sample of results. This can be tested by forming a frequency distribution of the results and examining the shape of this distribution. (4) The sensitivity of the mean to the possible presence of atypical results (outliers) can be tested by comparing the overall mean to the mean of all results except the one suspected of being atypical. (5) The possible presence of publication bias can be tested by visual inspection of funnel graph diagrams in which data points have been sorted according to statistical significance and direction of effect. (6) The possibility of underestimating the standard error of the mean in meta-analyses by using multiple, correlated results from the same study as the unit of analysis can be addressed by using the jack-knife technique for estimating the uncertainty of the mean. Brief examples, taken from road safety research, are given of all these techniques.
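Point (6) above, jack-knife estimation of the uncertainty of the mean, can be sketched as follows (a minimal leave-one-out illustration; for the plain mean it reproduces the usual standard error, and the same resampling idea extends to weighted means over correlated results):

```python
import math

def jackknife_se(xs):
    """Leave-one-out jack-knife standard error of the sample mean."""
    n = len(xs)
    total = sum(xs)
    loo_means = [(total - x) / (n - 1) for x in xs]   # drop one result at a time
    grand = sum(loo_means) / n
    var = (n - 1) / n * sum((m - grand) ** 2 for m in loo_means)
    return math.sqrt(var)
```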

Even though statistical thinking and critical thinking appear to have strong links from a theoretical point of view, empirical research into the intersections and potential interrelatedness of these aspects of competence is scarce. Our research suggests that thinking skills in both areas may be interdependent. Given this interconnection, it should…

Mnemonics (memory aids) are often viewed as useful in helping students recall information, and thereby possibly reducing stress and freeing up more cognitive resources for higher-order thinking. However, there has been little research on statistics mnemonics, especially for large classes. This article reports on the results of a study conducted…

Surface science, which includes the preparation, development and analysis of surfaces and coatings, is essential in fundamental and applied research as well as in engineering and industrial research. Contact angle measurements using sessile drop techniques are commonly used to characterize coated surfaces or surface modifications. Well-defined surface structures at both the nanoscopic and microscopic level can be achieved, but their reliable characterization by means of contact angle measurements, and the interpretation of those measurements, often remains an open question. Thus, we focused our research effort on one main problem of the surface science community: the determination of correct and valid definitions and measurements of contact angles. In this regard, we developed the high-precision drop shape analysis (HPDSA), which involves a complex transformation of images from sessile drop experiments to Cartesian coordinates and opens up the possibility of a physically meaningful contact angle calculation. To fulfill the dire need for a reproducible contact angle determination/definition, we developed three easily adaptable statistical analysis procedures. In the following, the basic principles of HPDSA will be explained and applications of HPDSA will be illustrated. Thereby, the unique potential of this analysis approach will be illustrated by means of selected examples.

This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…

Regional deposition of inhaled aerosols in the respiratory tract is a significant factor in assessing the biological effects from exposure to a variety of environmental particles. Understanding the deposition efficiency of inhaled aerosol particles in the nasal and oral airways can help evaluate doses to the extrathoracic region as well as to the lung. Dose extrapolation from laboratory animals to humans has been questioned due to significant physiological and anatomical variations. Although human studies are considered ideal for obtaining in vivo toxicity information important in risk assessment, the number of subjects in such studies is often small compared to epidemiological and animal studies. This study measured in vivo the nasal airway dimensions and the extrathoracic deposition of ultrafine aerosols in 10 normal adult males. Variability among individuals was significant. The nasal geometry of each individual was characterized at a resolution of 3 mm using magnetic resonance imaging (MRI) and acoustic rhinometry (AR). The turbulent diffusion theory was used to describe the nonlinear nature of extrathoracic aerosol deposition. To determine what dimensional features of the nasal airway were responsible for the marked differences in particle deposition, the MIXed-effects NonLINear Regression (MIXNLIN) procedure was used to account for the random effect of repeated measurements on the same subject. Using both turbulent diffusion theory and MIXNLIN, the ultrafine particle deposition is correlated with nasal dimensions measured by the surface area, minimum cross-sectional area, and complexity of the airway shape. The combination of MRI and AR is useful for characterizing both detailed nasal dimensions and temporal changes in nasal patency. We conclude that a suitable statistical procedure incorporated with existing physical theories must be used in data analyses for experimental studies of aerosol deposition that involve a relatively small number of human subjects.

The complex process of allopolyploid speciation includes various mechanisms ranging from species crosses and hybrid genome doubling to genome alterations and the establishment of new allopolyploids as persisting natural entities. Currently, little is known about the genetic mechanisms that underlie hybrid genome doubling, despite the fact that natural allopolyploid formation is highly dependent on this phenomenon. We examined the genetic basis for the spontaneous genome doubling of triploid F1 hybrids between the direct ancestors of allohexaploid common wheat (Triticum aestivum L., AABBDD genome), namely Triticum turgidum L. (AABB genome) and Aegilops tauschii Coss. (DD genome). An Ae. tauschii intraspecific lineage that is closely related to the D genome of common wheat was identified by population-based analysis. Two representative accessions, one that produces a high-genome-doubling-frequency hybrid when crossed with a T. turgidum cultivar and the other that produces a low-genome-doubling-frequency hybrid with the same cultivar, were chosen from that lineage for further analyses. A series of investigations including fertility analysis, immunostaining, and quantitative trait locus (QTL) analysis showed that (1) production of functional unreduced gametes through nonreductional meiosis is an early step key to successful hybrid genome doubling, (2) first division restitution is one of the cytological mechanisms that cause meiotic nonreduction during the production of functional male unreduced gametes, and (3) six QTLs in the Ae. tauschii genome, most of which likely regulate nonreductional meiosis and its subsequent gamete production processes, are involved in hybrid genome doubling. Interlineage comparisons of Ae. tauschii's ability to cause hybrid genome doubling suggested an evolutionary model for the natural variation pattern of the trait in which non-deleterious mutations in six QTLs may have important roles. The findings of this study demonstrated that the


Variance between studies in a meta-analysis will exist. This heterogeneity may be of clinical, methodological or statistical origin. The last of these is quantified by the I(2)-statistic. We investigated, using simulated studies, the accuracy of I(2) in the assessment of heterogeneity and the effec
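The I(2) statistic referred to above is computed from Cochran's Q and its degrees of freedom (the Higgins-Thompson definition; a minimal sketch):

```python
def i_squared(q, df):
    """I^2: percentage of total variation across studies that is due to
    heterogeneity rather than chance, truncated at 0%."""
    if q <= 0.0 or df <= 0:
        return 0.0
    return max(0.0, (q - df) / q) * 100.0
```

Because Q is compared against its expected value under homogeneity (its degrees of freedom), I(2) is exactly 0% whenever Q does not exceed df, which is one reason its accuracy in small meta-analyses is worth the simulation study described above.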

Considering the epistemic uncertainties within the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model when it is used for the response analysis of built-up systems in the mid-frequency range, the hybrid Evidence Theory-based Finite Element/Statistical Energy Analysis (ETFE/SEA) model is established by introducing evidence theory. Based on the hybrid ETFE/SEA model and the sub-interval perturbation technique, the hybrid Sub-interval Perturbation and Evidence Theory-based Finite Element/Statistical Energy Analysis (SIP-ETFE/SEA) approach is proposed. In the hybrid ETFE/SEA model, the uncertainty in the SEA subsystem is modeled by a non-parametric ensemble, while the uncertainty in the FE subsystem is described by the focal element and basic probability assignment (BPA) and dealt with using evidence theory. Within the hybrid SIP-ETFE/SEA approach, the mid-frequency response of interest, such as the ensemble average of the energy response and the cross-spectrum response, is calculated analytically by using the conventional hybrid FE/SEA method. Inspired by probability theory, the intervals of the mean value, variance and cumulative distribution are used to describe the distribution characteristics of mid-frequency responses of built-up systems with epistemic uncertainties. In order to alleviate the computational burden of the extreme value analysis, the sub-interval perturbation technique based on the first-order Taylor series expansion is used in the ETFE/SEA model to acquire the lower and upper bounds of the mid-frequency responses over each focal element. Three numerical examples are given to illustrate the feasibility and effectiveness of the proposed method.
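The first-order Taylor bounding step inside the sub-interval perturbation technique can be sketched generically as follows (a simplified scalar-response illustration with a user-supplied gradient, not the FE/SEA implementation itself; `f` and `grad` are hypothetical stand-ins for the mid-frequency response and its sensitivities over one focal element):

```python
def taylor_interval_bounds(f, grad, x0, half_widths):
    """First-order Taylor bounds of a response f over the box
    x0 +/- half_widths:  f(x0) -/+ sum_i |df/dx_i(x0)| * hw_i."""
    g = grad(x0)                        # sensitivities at the interval midpoint
    spread = sum(abs(gi) * hw for gi, hw in zip(g, half_widths))
    centre = f(x0)
    return centre - spread, centre + spread
```

For a linear response the bounds are exact; splitting a wide focal element into sub-intervals, bounding each, and taking the overall min/max is what keeps the first-order expansion accurate.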

For the effective utilization of solar energy in houses, a heating system using an air hybrid collector (capable of simultaneously performing heat collection and photovoltaic power generation) was studied. As the specimen house, a wooden house with a total floor area of 120 m² was simulated. Collected air is fanned into a crushed-stone heat accumulator (capable of storing one day's collection) or into a living room. The output of the solar cell arrays is put into a heat pump (capable of handling a maximum hourly load of 36,327 kJ/h) via an inverter so as to drive the fan (corresponding to an average insolation on the heat collecting plate of 10.7 MJ/h·m² and a heat collecting efficiency of 40%), and any shortage in power is supplied from the system interconnection. A hybrid collector, compared with the conventional air collector, is lower in thermal efficiency, but the merit it exhibits with respect to power generation is far greater than what is needed to counterbalance that demerit. When the hybrid system is in heating operation, there is an ideal heat cycle of collection, accumulation, and radiation when the load is light, but the balance between accumulation and radiation is disturbed when the load is heavy. 4 refs., 8 figs., 3 tabs.

There is a critical need for accurate land cover information for resource assessment, biophysical modeling, greenhouse gas studies, and for estimating possible terrestrial responses and feedbacks to climate change. However, practically all existing land cover datasets have quite a high level of uncertainty and suffer from a lack of important details that does not allow for relevant parameterization, e.g., data derived from different forest inventories. The objective of this study is to develop a methodology in order to create a hybrid land cover dataset at the level which would satisfy requirements of the verified terrestrial biota full greenhouse gas account (Shvidenko et al., 2008) for large regions i.e. Russia. Such requirements necessitate a detailed quantification of land classes (e.g., for forests - dominant species, age, growing stock, net primary production, etc.) with additional information on uncertainties of the major biometric and ecological parameters in the range of 10-20% and a confidence interval of around 0.9. The approach taken here allows the integration of different datasets to explore synergies and in particular the merging and harmonization of land and forest inventories, ecological monitoring, remote sensing data and in-situ information. The following datasets have been integrated: Remote sensing: Global Land Cover 2000 (Fritz et al., 2003), Vegetation Continuous Fields (Hansen et al., 2002), Vegetation Fire (Sukhinin, 2007), Regional land cover (Schmullius et al., 2005); GIS: Soil 1:2.5 Mio (Dokuchaev Soil Science Institute, 1996), Administrative Regions 1:2.5 Mio, Vegetation 1:4 Mio, Bioclimatic Zones 1:4 Mio (Stolbovoi & McCallum, 2002), Forest Enterprises 1:2.5 Mio, Rivers/Lakes and Roads/Railways 1:1 Mio (IIASA's data base); Inventories and statistics: State Land Account (FARSC RF, 2006), State Forest Account - SFA (FFS RF, 2003), Disturbances in forests (FFS RF, 2006). The resulting hybrid land cover dataset at 1-km resolution comprises

Background: Giant Galápagos tortoises on the island of Española have been the focus of an intensive captive breeding and repatriation programme for over 35 years that saved the taxon from extinction. However, analysis of 118 samples from released individuals indicated that the biased sex ratio and large variance in reproductive success among the 15 breeders have severely reduced the effective population size (Ne). Results: We report here that an analysis of an additional 473 captive-bred tortoises released back to the island reveals an individual (E1465) that exhibits nuclear microsatellite alleles not found in any of the 15 breeders. Statistical analyses incorporating the genotypes of 304 field-sampled individuals from all populations on the major islands indicate that E1465 is most probably a hybrid between an Española female tortoise and a male from the island of Pinzón, likely present on Española due to human transport. Conclusion: Removal of E1465, as well as its father and possible (half-)siblings, is warranted to prevent further contamination within this taxon of particular conservation significance. Despite this single detected contamination, it is highly noteworthy to emphasize the success of this repatriation programme, conducted over nearly 40 years and involving the release of over 2000 captive-bred tortoises that now reproduce in situ. The incorporation of molecular genetic analysis into the programme is providing guidance that will aid in monitoring the genetic integrity of this ambitious effort to restore a unique lineage of a spectacular animal.

In the present study, we examined whether there was an association between dopamine-β hydroxylase (DBH) promoter polymorphisms (a 5'-ins/del and a GTn repeat) and a history of suicide attempt in 223 individuals with chronic schizophrenia, using statistical and molecular analyses. Within the genetic association study design, we compared the statistical haplotype phase with the molecular phase produced by the amplicon size analysis. The two DBH polymorphisms were analysed using the Applied Biosystems 3130, and the statistical analyses were carried out using UNPHASED v.3.1.5 and PHASE v.2.1.1 to determine the haplotype frequencies and infer the phase in each patient. Then, DBH polymorphisms were incorporated into the Haploscore analysis to test the association with a history of suicide attempt. In our sample, 62 individuals had a history of suicide attempt. There was no association between DBH polymorphisms and a history of suicide attempt across the different analytical strategies applied. There was no significant difference between the haplotype frequencies produced by the amplicon size analysis and the statistical analytical strategies. However, some of the haplotype pairs inferred in the PHASE analysis were inconsistent with the molecular haplotype size measured by the ABI 3130. The amplicon size analysis proved to be the most accurate method using the haplotype as a possible genetic marker for future testing. Although the results were not significant, further molecular analyses of the DBH gene and other candidate genes can clarify the utility of the molecular phase in psychiatric genetics and personalized medicine.

In recent years, many protocols aimed at reproducibly sequencing reduced-genome subsets in non-model organisms have been published. Among them, RAD-sequencing is one of the most widely used. It relies on digesting DNA with specific restriction enzymes and performing size selection on the resulting fragments. Despite its acknowledged utility, this method is of limited use with degraded DNA samples, such as those isolated from museum specimens, as these samples are less likely to harbor fragments long enough to comprise two restriction sites, making possible ligation of the adapter sequences (in the case of double-digest RAD) or size selection of the resulting fragments (in the case of single-digest RAD). Here, we address these limitations by presenting a novel method called hybridization RAD (hyRAD). In this approach, biotinylated RAD fragments, covering a random fraction of the genome, are used as baits for capturing homologous fragments from genomic shotgun sequencing libraries. This simple and cost-effective approach allows sequencing of orthologous loci even from highly degraded DNA samples, opening new avenues of research in the field of museum genomics. Not relying on the presence of restriction sites, it improves among-sample loci coverage. In a trial study, hyRAD allowed us to obtain a large set of orthologous loci from fresh and museum samples from a non-model butterfly species, with a high proportion of single nucleotide polymorphisms present in all eight analyzed specimens, including 58-year-old museum samples. The utility of the method was further validated using 49 museum and fresh samples of a Palearctic grasshopper species for which the spatial genetic structure was previously assessed using mtDNA amplicons. The application of the method is eventually discussed in a wider context. As it does not rely on the presence of restriction sites, it is not sensitive to among-sample polymorphisms in the restriction sites

The current work presents an innovative statistical approach to model ordinal variables in environmental monitoring studies. An ordinal variable has values that can only be compared as "less", "equal" or "greater", and it is not possible to have information about the size of the difference between two particular values. The ordinal variable under study here is the vas deferens sequence (VDS) used in imposex (superimposition of male sexual characters onto prosobranch females) field assessment programmes for monitoring tributyltin (TBT) pollution. The statistical methodology presented here is the ordered logit regression model. It assumes that the VDS is an ordinal variable whose values correspond to a process of imposex development that can be considered continuous in both biological and statistical senses and can be described by a latent non-observable continuous variable. This model was applied to the case study of Nucella lapillus imposex monitoring surveys conducted on the Portuguese coast between 2003 and 2008 to evaluate the temporal evolution of TBT pollution in this country. In order to produce more reliable conclusions, the proposed model includes covariates that may influence the imposex response besides TBT (e.g. the shell size). The model also provides an analysis of the environmental risk associated with TBT pollution by estimating the probability of the occurrence of females with VDS ≥ 2 in each year, according to OSPAR criteria. We consider that the proposed application of this statistical methodology has great potential in environmental monitoring whenever there is a need to model variables that can only be assessed through an ordinal scale of values.
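
The ordered logit (cumulative logit) model described above can be sketched in a few lines. This is an illustrative implementation; the linear predictor value and cutpoints below are hypothetical, not taken from the study:

```python
import math

def ordered_logit_probs(x_beta, cutpoints):
    """Category probabilities for an ordered logit model.

    P(Y <= j | x) = 1 / (1 + exp(-(theta_j - x_beta))), with ordered
    cutpoints theta_1 < ... < theta_{K-1}; category probabilities are the
    differences of adjacent cumulative probabilities.
    """
    cdf = [1.0 / (1.0 + math.exp(-(theta - x_beta))) for theta in cutpoints]
    cum = [0.0] + cdf + [1.0]
    return [cum[j + 1] - cum[j] for j in range(len(cum) - 1)]

# Hypothetical linear predictor and cutpoints for a 5-stage VDS-like scale
probs = ordered_logit_probs(x_beta=0.8, cutpoints=[-1.0, 0.0, 1.0, 2.0])
# Estimated probability of VDS >= 2 (categories indexed from 0), as in the
# OSPAR-style risk assessment mentioned above
p_ge_2 = sum(probs[2:])
```

Fitting the cutpoints and coefficients would normally be done by maximum likelihood (e.g. with an ordinal regression routine); the sketch only shows how fitted parameters map to category probabilities.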

Two statistical models for evaluating the results of toxicological studies are presented. Model I, after R. Hoschek and H. J. Schittke (2), involves: 1. elimination of values deviating from most results, by Grubbs' method (2); 2. analysis of the differences between the results obtained by the participants of the action and the tentatively assumed value; 3. evaluation of significant differences between the reference value and the average value for a given series of measurements; 4. thorough evaluation of laboratories based on the evaluation coefficient fx. Model II, after Keppler et al., assumes the median as the criterion for evaluating the results. Individual evaluation of laboratories was performed on the basis of: 1. an adjusted t-test; 2. a linear regression test.
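
Grubbs' method, used in Model I to eliminate deviating values, flags a single outlier by comparing a test statistic against a tabulated critical value. A minimal sketch with made-up measurement values (the critical value depends on sample size and significance level and is not computed here):

```python
import statistics

def grubbs_statistic(data):
    """Grubbs' G = max |x_i - mean| / s, used to flag a single outlier.

    The observed G is compared against a tabulated critical value for the
    chosen significance level and sample size.
    """
    mean = statistics.fmean(data)
    s = statistics.stdev(data)  # sample standard deviation
    return max(abs(x - mean) for x in data) / s

# A clear outlier (50.0) pushes G well above the tabulated critical
# value for n = 10 at alpha = 0.05 (about 2.18)
g = grubbs_statistic([9.8, 10.1, 10.0, 9.9, 10.2, 10.05, 9.95, 10.1, 9.9, 50.0])
```

After removing a flagged value, the test can be repeated on the remaining data until no further outlier is detected.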

Birches (Betula spp.) hybridize readily, confounding genetic signatures of refugial isolation and postglacial migration. We aimed to distinguish hybridization from range-shift processes in the two widespread and cold-adapted species B. nana and B. pubescens, previously shown to share a similarly east-west-structured variation in plastid DNA (pDNA). We sampled the two species throughout their ranges and included reference samples of five other Betula species and putative hybrids. We analysed 901 individual plants using mainly nuclear high-resolution markers (amplified fragment length polymorphisms; AFLPs); a subset of 64 plants was also sequenced for two pDNA regions. Whereas the pDNA variation as expected was largely shared between B. nana and B. pubescens, the two species were distinctly differentiated at AFLP loci. In B. nana, both the AFLP and pDNA results corroborated the former pDNA-based hypothesis that it expanded from at least two major refugia in Eurasia, one south of and one east of the North European ice sheets. In contrast, B. pubescens showed a striking lack of geographic structuring of its AFLP variation. We identified a weak but significant increase in nuclear (AFLP) gene flow from B. nana into B. pubescens with increasing latitude, suggesting hybridization has been most frequent at the postglacial expansion front of B. pubescens and that hybrids mainly backcrossed to B. pubescens. Incongruence between pDNA and AFLP variation in B. pubescens can be explained by efficient expansion from a single large refugium combined with leading-edge hybridization and plastid capture from B. nana during colonization of new territory already occupied by this more cold-tolerant species.

The study had two primary aims. The first aim was to combine a human resources costing and accounting approach (HRCA) with a quantitative statistical approach in order to get an integrated model. The second aim was to apply this integrated model in a quasi-experimental study in order to investigate whether preventive intervention affected sickness absence costs at the company level. The intervention studied contained occupational organizational measures, competence development, physical and psychosocial working environmental measures, and individual and rehabilitation measures on both an individual and a group basis. The study is a quasi-experimental design with a non-randomized control group. Both groups involved cleaning jobs at predominantly female workplaces. The study plan involved carrying out before and after studies on both groups. The study included only those who were at the same workplace during the whole of the study period. In the HRCA model used here, the cost of sickness absence is the net difference between the costs, in the form of the value of the loss of production and the administrative cost, and the benefits in the form of lower labour costs. According to the HRCA model, the intervention used counteracted a rise in sickness absence costs at the company level, giving an average net effect of 266.5 Euros per person (full-time working) during an 8-month period. Using an analogous statistical analysis on the whole of the material, the contribution of the intervention counteracted a rise in sickness absence costs at the company level, giving an average net effect of 283.2 Euros. Using a statistical method it was possible to study the regression coefficients in sub-groups and calculate the p-values for these coefficients; in the younger group the intervention gave a calculated net contribution of 605.6 Euros with a p-value of 0.073, while the intervention net contribution in the older group had a very high p-value. Using the statistical model it was

The paper examines the life quality of helical coil springs in the railway industry, as these springs impact the safety of the transportation of goods and people. The types of spring considered are the external spring, the internal spring and the stabiliser spring. Statistical process control was utilised as the fundamental instrument in the investigation. Measurements were performed using a measuring tape, dynamic actuators and a vernier caliper. The purpose of this research was to examine the usability of old helical springs found in a railway environment. The goal of the experiment was to obtain factual statistical information to determine the life quality of the helical springs used in the railroad transportation environment. Six Sigma principles were also applied in this paper. According to the Six Sigma analysis, only the stabiliser and inner springs met the Six Sigma requirements for coil bar diameter. It is concluded that the coil springs should be replaced, as they do not meet the Six Sigma requirements.

We combined genometric (DNA walks) and statistical (detrended fluctuation analysis) methods on 456 prokaryotic chromosomes from 309 different bacterial and archaeal species to look for specific patterns and long-range correlations along the genome and relate them to ecological lifestyles. The position of each nucleotide along the complete genome sequence was plotted on an orthogonal plane (DNA landscape), and fluctuation analysis applied to the DNA walk series showed a long-range correlation in contrast to the lack of correlation for artificially generated genomes. Different features in the DNA landscapes among genomes from different ecological and metabolic groups of prokaryotes appeared with the combined analysis. Transition from hyperthermophilic to psychrophilic environments could have been related to more complex structural adaptations in microbial genomes, whereas for other environmental factors such as pH and salinity this effect would have been smaller. Prokaryotes with domain-specific metabolisms, such as photoautotrophy in Bacteria and methanogenesis in Archaea, showed consistent differences in genome correlation structure. Overall, we show that, beyond the relative proportion of nucleotides, correlation properties derived from their sequential position within the genome hide relevant phylogenetic and ecological information. This can be studied by combining genometric and statistical physics methods, leading to a reduction of genome complexity to a few useful descriptors.
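
The genometric part of the analysis above rests on the DNA walk: each nucleotide moves a cumulative sum up or down, and fluctuations of the resulting series are then examined across window sizes. A minimal sketch using a purine/pyrimidine mapping and a simplified (non-detrended) fluctuation measure; the short sequence is invented for illustration and the full study would apply proper detrended fluctuation analysis to complete chromosomes:

```python
def dna_walk(sequence):
    """Purine/pyrimidine DNA walk: +1 for A/G, -1 for C/T, cumulatively summed."""
    step = {"A": 1, "G": 1, "C": -1, "T": -1}
    y, walk = 0, []
    for base in sequence:
        y += step[base]
        walk.append(y)
    return walk

def rms_fluctuation(walk, window):
    """Root-mean-square fluctuation of walk displacements over non-overlapping
    windows: a simplified precursor of DFA's F(n), without detrending."""
    diffs = [walk[i + window] - walk[i] for i in range(0, len(walk) - window, window)]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

walk = dna_walk("ATGCGGATCCTTAGGCATGCATCCGGAATT")
f4 = rms_fluctuation(walk, 4)
```

In DFA proper, a local linear trend is subtracted within each window before computing the fluctuation, and the scaling of F(n) with n reveals long-range correlations.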

The recycled Kevlar®/polyester/low-melting-point polyester (recycled Kevlar®/PET/LPET) nonwoven geotextiles are immersed in neutral, strong acid, and strong alkali solutions, respectively, at different temperatures for four months. Their tensile strength is then tested according to various immersion periods at various temperatures, in order to determine their durability to chemicals. For the purpose of analyzing the possible factors that influence mechanical properties of geotextiles under diverse environmental conditions, the experimental results and statistical analyses are incorporated in this study. Therefore, influences of the content of recycled Kevlar® fibers, implementation of thermal treatment, and immersion periods on the tensile strength of recycled Kevlar®/PET/LPET nonwoven geotextiles are examined, after which their influential levels are statistically determined by performing multiple regression analyses. According to the results, the tensile strength of nonwoven geotextiles can be enhanced by adding recycled Kevlar® fibers and thermal treatment.

Aim of study: Understanding inter- and intra-specific competition for water is crucial in drought-prone environments. However, little is known about the spatial interdependencies for water uptake among individuals in mixed stands. The aim of this work was to compare water uptake patterns during a drought episode in two common Mediterranean tree species, Quercus ilex L. and Pinus halepensis Mill., using the isotope composition of xylem water (δ18O, δ2H) as a hydrological marker. Area of study: The study was performed in a mixed stand, sampling a total of 33 oaks and 78 pines (plot area = 888 m2). We tested the hypothesis that both species take up water differentially along the soil profile, thus showing different levels of tree-to-tree interdependency, depending on whether neighbouring trees belong to one species or the other. Material and Methods: We used pair-correlation functions to study intra-specific point-tree configurations and the bivariate pair correlation function to analyse the inter-specific spatial configuration. Moreover, the isotopic composition of xylem water was analysed as a mark point pattern. Main results: Values for Q. ilex (δ18O = –5.3 ± 0.2‰, δ2H = –54.3 ± 0.7‰) were significantly lower than for P. halepensis (δ18O = –1.2 ± 0.2‰, δ2H = –25.1 ± 0.8‰), pointing to a greater contribution of deeper soil layers for water uptake by Q. ilex. Research highlights: Point-process analyses revealed spatial intra-specific dependencies among neighbouring pines, showing neither oak-oak nor oak-pine interactions. This supports niche segregation for water uptake between the two species.

This study investigates the suitability of multivariate techniques, including principal component analysis and discriminant function analysis, for analysing polycyclic aromatic hydrocarbon and heavy metal-contaminated aquatic sediment data. We show that multivariate "fingerprint" analysis of relative abundances of contaminants can characterize a contamination source and distinguish contaminated sediments of interest from background contamination. Thereafter, analysis of the unstandardized concentrations among samples contaminated from the same source can identify migration pathways within a study area that is hydraulically complex and has a long contamination history, without reliance on complex hydrodynamic data and modelling techniques. Together, these methods provide an effective tool for drinking water source monitoring and protection.

In the present study, an attempt was made to compare the statistical tools used for analysing the data of repeated dose toxicity studies with rodents conducted in 45 countries with those used in Japan. The study revealed that there was no congruence among the countries in the use of statistical tools for analysing the data obtained from the above studies. For example, to analyse the data obtained from repeated dose toxicity studies with rodents, Scheffé's multiple range and Dunnett type (joint type Dunnett) tests are commonly used in Japan, but in other countries the use of these statistical tools is not so common. However, statistical techniques used for testing the above data for homogeneity of variance and for inter-group comparisons do not differ much between Japan and other countries. In Japan, the data are generally not tested for normality, and the same is true in most of the countries investigated. In the present investigation, out of 127 studies examined, the data of only 6 studies were analysed for both homogeneity of variance and normal distribution. For examining homogeneity of variance, we propose Levene's test, since the commonly used Bartlett's test may show heterogeneity in variance in all the groups if a slight heterogeneity in variance is seen in any one of the groups. We suggest the data be examined for both homogeneity of variance and normal distribution. For the data of groups that do not show heterogeneity of variance, to find significant differences among the groups we recommend Dunnett's test, and for those that show heterogeneity of variance, we recommend Steel's test.
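
Levene's test, proposed above as the more robust check for homogeneity of variance, is a one-way ANOVA on absolute deviations from a group centre. A minimal sketch of the median-centred variant (often called the Brown-Forsythe version), with invented dose-group data:

```python
import statistics

def levene_statistic(*groups):
    """Levene's test statistic W (median-centred variant): a one-way ANOVA
    F statistic computed on absolute deviations from group medians.
    Large W suggests heterogeneity of variance across groups."""
    z = [[abs(x - statistics.median(g)) for x in g] for g in groups]
    k = len(z)
    n = sum(len(g) for g in z)
    zbar_i = [statistics.fmean(g) for g in z]          # group means of deviations
    zbar = sum(x for g in z for x in g) / n            # grand mean of deviations
    between = sum(len(g) * (zi - zbar) ** 2 for g, zi in zip(z, zbar_i)) / (k - 1)
    within = sum((x - zi) ** 2 for g, zi in zip(z, zbar_i) for x in g) / (n - k)
    return between / within

# Hypothetical dose groups with similar spread: W should be modest
w = levene_statistic([5.1, 5.3, 4.9, 5.0],
                     [6.2, 6.0, 6.4, 6.1],
                     [7.0, 7.3, 6.9, 7.2])
```

The observed W would be referred to an F distribution with (k - 1, n - k) degrees of freedom; unlike Bartlett's test, W is not inflated by mild non-normality.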

This study presents results of the effects of bisphenol A (BPA) on adult egg production, egg hatchability, egg development rates and juvenile growth rates in the freshwater gastropod, Marisa cornuarietis. We observed no adult mortality, substantial inter-snail variability in reproductive output, and no effects of BPA on reproduction during 12 weeks of exposure to 0, 0.1, 1.0, 16, 160 or 640 microg/L BPA. We observed no effects of BPA on egg hatchability or timing of egg hatching. Juveniles showed good growth in the control and all treatments, and there were no significant effects of BPA on this endpoint. Our results do not support previous claims of enhanced reproduction in Marisa cornuarietis in response to exposure to BPA. Statistical power analysis indicated high levels of inter-snail variability in the measured endpoints and highlighted the need for sufficient replication when testing treatment effects on reproduction in M. cornuarietis with adequate power.
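
The power analysis mentioned above asks how much replication is needed to detect a treatment effect of a given size with adequate power. A common normal-approximation sample-size sketch for comparing two group means (the effect size and defaults below are illustrative, not the study's values):

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.8):
    """Approximate sample size per group for a two-sample comparison of
    means, via the normal approximation:
        n = 2 * ((z_{1-alpha/2} + z_{1-power}) / d)^2
    where d is the standardized effect size (Cohen's d)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)
    z_beta = z.inv_cdf(power)
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# Medium effect (d = 0.5) at alpha = 0.05 and 80% power
n = n_per_group(effect_size=0.5)
```

High inter-individual variability shrinks the standardized effect size d, which enters the formula quadratically, so the required replication grows quickly: exactly the situation the snail study highlights.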

The depletion of ozone in the stratosphere is examined, and causes for the depletion are cited. Ground station and satellite measurements of ozone, which are taken on a worldwide basis, are discussed. Instruments used in ozone measurement are discussed, such as the Dobson spectrophotometer, which is credited with providing the longest and most extensive series of observations for ground-based observation of stratospheric ozone. Other ground-based instruments used to measure ozone are also discussed. The statistical differences of ground-based measurements of ozone from these different instruments are compared to each other, and to satellite measurements. Mathematical methods (i.e., trend analysis or linear regression analysis) of analyzing the variability of ozone concentration with respect to time and latitude are described. Various time series models which can be employed in accounting for ozone concentration variability are examined.
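
The trend analysis described above reduces, in its simplest form, to fitting an ordinary least-squares line to a time series of ozone values. A minimal sketch; the annual means below are invented to illustrate a slight downward drift, not real measurements:

```python
def linear_trend(t, y):
    """Ordinary least-squares slope and intercept for a simple trend
    analysis: slope = cov(t, y) / var(t)."""
    n = len(t)
    tbar = sum(t) / n
    ybar = sum(y) / n
    slope = (sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y))
             / sum((ti - tbar) ** 2 for ti in t))
    return slope, ybar - slope * tbar

# Hypothetical annual mean ozone values (Dobson units) over six years
years = [0, 1, 2, 3, 4, 5]
ozone = [300.0, 299.2, 298.5, 297.9, 297.1, 296.4]
slope, intercept = linear_trend(years, ozone)
```

Real trend studies add seasonal terms and autocorrelated-error models on top of this basic fit, which is why the abstract also discusses time series models.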

When analyzing the uncertainty of the cost and schedule of a spaceflight project, the value of the schedule-cost correlation coefficient must be known. This paper deduces the schedule distribution, considering the effect of the cost, and proposes an estimation formula for the correlation coefficient between ln(schedule) and the cost. On the basis of this result and a Taylor expansion, an expression relating the schedule-cost correlation coefficient and the ln-schedule-cost correlation coefficient is put forward. By analyzing the behaviour of the estimation formula for the ln-schedule-cost correlation coefficient, general rules are proposed for ascertaining the value of the schedule-cost correlation coefficient. An example demonstrates how to approximately amend the schedule-cost correlation coefficient based on historical statistics, revealing that the traditionally assigned value is inaccurate. The universality of this estimation method is analyzed.
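
The paper's own estimation formula is not reproduced in the abstract, but the flavour of the relation can be illustrated with the standard result for a lognormal schedule: if the cost X and ln(schedule) are bivariate normal, Stein's lemma gives corr(X, schedule) = rho_log * sigma / sqrt(exp(sigma^2) - 1), where sigma is the standard deviation of ln(schedule). A sketch under that assumption, with invented parameter values:

```python
import math

def corr_from_log_corr(rho_log, sigma_log):
    """Convert the (cost, ln schedule) correlation to the (cost, schedule)
    correlation, assuming cost and ln(schedule) are bivariate normal so
    that the schedule is lognormal:
        corr(X, Y) = rho_log * sigma / sqrt(exp(sigma^2) - 1)
    """
    return rho_log * sigma_log / math.sqrt(math.exp(sigma_log ** 2) - 1)

# Hypothetical: log-scale correlation 0.6, ln-schedule std. dev. 0.3
r = corr_from_log_corr(rho_log=0.6, sigma_log=0.3)
```

Note that |corr(X, Y)| is always smaller than |rho_log|, which is one reason a traditionally assigned raw-scale value can be inaccurate.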

Three boundary-layer cloud object types, stratus, stratocumulus and cumulus, that occurred over the Pacific Ocean during January-August 1998, are identified from the CERES (Clouds and the Earth's Radiant Energy System) single scanner footprint (SSF) data from the TRMM (Tropical Rainfall Measuring Mission) satellite. This study emphasizes the differences and similarities in the characteristics of each cloud-object type between the tropical and subtropical regions, among different size categories, and among small geographic areas. Both the frequencies of occurrence and statistical distributions of cloud physical properties are analyzed. In terms of frequencies of occurrence, stratocumulus clouds dominate the entire boundary layer cloud population in all regions and among all size categories. Stratus clouds are more prevalent in the subtropics and near the coastal regions, while cumulus clouds are relatively prevalent over open ocean and the equatorial regions, particularly within the small size categories. The largest size category of stratus cloud objects occurs more frequently in the subtropics than in the tropics and has a much larger average size than its cumulus and stratocumulus counterparts. Each of the three cloud object types exhibits small differences in statistical distributions of cloud optical depth, liquid water path, TOA albedo and perhaps cloud-top height, but large differences in those of cloud-top temperature and OLR between the tropics and subtropics. Differences in the sea surface temperature (SST) distributions between the tropics and subtropics influence some of the cloud macrophysical properties, but cloud microphysical properties and albedo for each cloud object type are likely determined by (local) boundary-layer dynamics and structures. Systematic variations of cloud optical depth, TOA albedo, cloud-top height, OLR and SST with cloud object sizes are pronounced for the stratocumulus and stratus types, which are related to systematic

Red-green color vision is strongly suspected to enhance the survival of its possessors. Despite being red-green color blind, however, many species have successfully competed in nature, which brings into question the evolutionary advantage of achieving red-green color vision. Here, we propose a new method of identifying positive selection at individual amino acid sites with the premise that if positive Darwinian selection has driven the evolution of the protein under consideration, then it should be found mostly at the branches in the phylogenetic tree where its function had changed. The statistical and molecular methods have been applied to 29 visual pigments with the wavelengths of maximal absorption at approximately 510-540 nm (green- or middle wavelength-sensitive [MWS] pigments) and at approximately 560 nm (red- or long wavelength-sensitive [LWS] pigments), which are sampled from a diverse range of vertebrate species. The results show that the MWS pigments are positively selected through amino acid replacements S180A, Y277F, and T285A and that the LWS pigments have been subjected to strong evolutionary conservation. The fact that these positively selected M/LWS pigments are found not only in animals with red-green color vision but also in those with red-green color blindness strongly suggests that both red-green color vision and color blindness have undergone adaptive evolution independently in different species.

Data analysis is essential to translational medicine, epidemiology, and the scientific process. Although recent advances in promoting reproducibility and reporting standards have made some improvements, the data analysis process remains insufficiently documented and susceptible to avoidable errors, bias, and even fraud. Comprehensively accounting for the full analytical process requires not only records of the statistical methodology used, but also records of communications among the research team. In this regard, the data analysis process can benefit from the principle of accountability that is inherent in other disciplines such as clinical practice. We propose a novel framework for capturing the analytical narrative called the Accountable Data Analysis Process (ADAP), which allows the entire research team to participate in the analysis in a supervised and transparent way. The framework is analogous to an electronic health record in which the dataset is the "patient" and actions related to the dataset are recorded in a project management system. We discuss the design, advantages, and challenges in implementing this type of system in the context of academic health centers, where team based science increasingly demands accountability.

Our aim was to analyze, monitor, and predict the outcomes of processes in a full-scale seawater reverse osmosis (SWRO) desalination plant using multivariate statistical techniques. Multivariate analysis of variance (MANOVA) was used to investigate the performance and efficiencies of two SWRO processes, namely, pore controllable fiber filter-reverse osmosis (PCF-SWRO) and sand filtration-ultra filtration-reverse osmosis (SF-UF-SWRO). Principal component analysis (PCA) was applied to monitor the two SWRO processes. PCA monitoring revealed that the SF-UF-SWRO process could be analyzed reliably with a low number of outliers and disturbances. Partial least squares (PLS) analysis was then conducted to predict which of the seven input parameters of feed flow rate, PCF/SF-UF filtrate flow rate, temperature of feed water, feed turbidity, pH, reverse osmosis (RO) flow rate, and pressure had a significant effect on the outcome variables of permeate flow rate and concentration. Root mean squared errors (RMSEs) of the PLS models for permeate flow rates were 31.5 and 28.6 for the PCF-SWRO process and SF-UF-SWRO process, respectively, while RMSEs of permeate concentrations were 350.44 and 289.4, respectively. These results indicate that the SF-UF-SWRO process can be modeled more accurately than the PCF-SWRO process, because the RMSE values of permeate flow rate and concentration obtained using a PLS regression model of the SF-UF-SWRO process were lower than those obtained for the PCF-SWRO process.
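
The RMSE criterion used above to compare the two PLS models is simply the square root of the mean squared prediction error. A minimal sketch (the values are placeholders, not plant data):

```python
def rmse(observed, predicted):
    """Root mean squared error between observed and model-predicted values;
    lower RMSE indicates the more accurately modelled process."""
    return (sum((o - p) ** 2 for o, p in zip(observed, predicted))
            / len(observed)) ** 0.5

# Hypothetical permeate flow rates: observed vs. PLS-predicted
error = rmse([100.0, 105.0, 98.0], [102.0, 103.0, 99.0])
```

Because RMSE is on the scale of the outcome variable, the flow-rate RMSEs (31.5 vs 28.6) and concentration RMSEs (350.44 vs 289.4) are comparable within each outcome but not across outcomes.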

Objectives We compared calculations of relative risks of cancer death in Swedish mammography trials and in other cancer screening trials. Participants Men and women from 30 to 74 years of age. Setting Randomised trials on cancer screening. Design For each trial, we identified the intervention period, when screening was offered to screening groups and not to control groups, and the post-intervention period, when screening (or absence of screening) was the same in screening and control groups. We then examined which cancer deaths had been used for the computation of relative risk of cancer death. Main outcome measures Relative risk of cancer death. Results In 17 non-breast screening trials, deaths due to cancers diagnosed during the intervention and post-intervention periods were used for relative risk calculations. In the five Swedish trials, relative risk calculations used deaths due to breast cancers found during intervention periods, but deaths due to breast cancer found at first screening of control groups were added to these groups. After reallocation of the added breast cancer deaths to post-intervention periods of control groups, relative risks of 0.86 (0.76; 0.97) were obtained for cancers found during intervention periods and 0.83 (0.71; 0.97) for cancers found during post-intervention periods, indicating constant reduction in the risk of breast cancer death during follow-up, irrespective of screening. Conclusions The use of unconventional statistical methods in Swedish trials has led to overestimation of risk reduction in breast cancer death attributable to mammography screening. The constant risk reduction observed in screening groups was probably due to the trial design that optimised awareness and medical management of women allocated to screening groups.
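
The relative risks with confidence intervals reported above follow the standard two-group calculation: RR is the ratio of the death proportions, and the interval comes from a normal approximation on the log scale. A sketch with hypothetical counts chosen only to reproduce an RR of 0.86 (they are not the trial data):

```python
import math
from statistics import NormalDist

def relative_risk(a, n1, b, n2, conf=0.95):
    """Relative risk (a/n1) / (b/n2) with a Wald confidence interval on the
    log scale: se(ln RR) = sqrt(1/a - 1/n1 + 1/b - 1/n2)."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    z = NormalDist().inv_cdf(1 - (1 - conf) / 2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical: 86 breast cancer deaths among 50000 screened vs
# 100 among 50000 controls
rr, lo, hi = relative_risk(86, 50000, 100, 50000)
```

The paper's point is not the formula itself but which deaths enter a and b: moving deaths between intervention and post-intervention periods changes the numerators and hence the apparent risk reduction.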

Cultivation-independent analyses of soil microbial community structures are frequently used to describe microbiological soil characteristics. This approach is based on direct extraction of total soil DNA followed by PCR amplification of selected marker genes and subsequent genetic fingerprint analyses. Semi-automated genetic fingerprinting techniques such as terminal restriction fragment length polymorphism (T-RFLP) and ribosomal intergenic spacer analysis (RISA) yield high-resolution patterns of highly diverse soil microbial communities and hold great potential for use in routine soil quality monitoring, when rapid high-throughput screening for differences or changes is more important than phylogenetic identification of the organisms affected. Our objective was to perform a thorough statistical analysis to evaluate the cultivation-independent approach and the consistency of results from T-RFLP and RISA. As a model system, we used two different heavy metal-treated soils from an open top chamber experiment. Bacterial T-RFLP and RISA profiles of 16S rDNA were converted into numeric data matrices in order to allow for detailed statistical analyses with cluster analysis, Mantel test statistics, Monte Carlo permutation tests and ANOVA. Analyses revealed that soil DNA contents were significantly correlated with soil microbial biomass in our system. T-RFLP and RISA yielded highly consistent and correlating results, and both allowed the four treatments to be distinguished with equal significance. While RISA represents a fast and general fingerprinting method of moderate cost and labor intensity, T-RFLP is technically more demanding but offers the advantage of phylogenetic identification of detected soil microorganisms. Therefore, selection of either of these methods should be based on the specific research question under investigation.
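
Once fingerprint profiles are converted into numeric matrices, a Mantel test asks whether two distance matrices (e.g., community dissimilarities from T-RFLP vs RISA) are correlated, with significance from permuting sample labels. A stdlib-only sketch on toy matrices (the data are invented; real analyses would use dedicated ecology packages):

```python
import random

def pearson(u, v):
    """Pearson correlation between two equal-length vectors."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = (sum((a - mu) ** 2 for a in u) * sum((b - mv) ** 2 for b in v)) ** 0.5
    return num / den

def mantel(d1, d2, permutations=999, seed=0):
    """Simple Mantel test: correlation between the upper triangles of two
    symmetric distance matrices, with a permutation p-value obtained by
    shuffling the rows/columns of the second matrix."""
    n = len(d1)
    def upper(d, order):
        return [d[order[i]][order[j]] for i in range(n) for j in range(i + 1, n)]
    ident = list(range(n))
    r_obs = pearson(upper(d1, ident), upper(d2, ident))
    rng = random.Random(seed)
    hits = 0
    for _ in range(permutations):
        perm = ident[:]
        rng.shuffle(perm)
        if pearson(upper(d1, ident), upper(d2, perm)) >= r_obs:
            hits += 1
    return r_obs, (hits + 1) / (permutations + 1)

# Toy example: two perfectly proportional distance matrices
pos = [0.0, 1.0, 2.0, 5.0]
d1 = [[abs(a - b) for b in pos] for a in pos]
d2 = [[2.0 * abs(a - b) for b in pos] for a in pos]
r_obs, p = mantel(d1, d2)
```

With realistically many samples, a high r_obs combined with a small permutation p-value supports the kind of T-RFLP/RISA consistency reported above.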

Linking model and data, we detail the LANL diving reduced gradient bubble model (RGBM), dynamical principles, and correlation with data in the LANL Data Bank. Table, profile, and meter risks are obtained from likelihood analysis and quoted for air, nitrox, helitrox no-decompression time limits, repetitive dive tables, and selected mixed gas and repetitive profiles. Application analyses include the EXPLORER decompression meter algorithm, NAUI tables, University of Wisconsin Seafood Diver tables, comparative NAUI, PADI, Oceanic NDLs and repetitive dives, comparative nitrogen and helium mixed gas risks, USS Perry deep rebreather (RB) exploration dive, world record open circuit (OC) dive, and Woodville Karst Plain Project (WKPP) extreme cave exploration profiles. The algorithm has seen extensive and utilitarian application in mixed gas diving, both in recreational and technical sectors, and forms the basis for released tables and decompression meters used by scientific, commercial, and research divers. The LANL Data Bank is described, and the methods used to deduce risk are detailed. Risk functions for dissolved gas and bubbles are summarized. Parameters that can be used to estimate profile risk are tallied. To fit data, a modified Levenberg-Marquardt routine is employed with an L2 error norm. Appendices sketch the numerical methods, and list reports from field testing for (real) mixed gas diving. A Monte Carlo-like sampling scheme for fast numerical analysis of the data is also detailed, as a coupled variance reduction technique and additional check on the canonical approach to estimating diving risk. The method suggests alternatives to the canonical approach. This work represents a first-time correlation effort linking a dynamical bubble model with deep stop data. Supercomputing resources are requisite to connect model and data in application.

Background: Quantitative trait loci (QTL) detection on a huge number of phenotypes, such as eQTL detection on transcriptomic data, can be dramatically impaired by the statistical properties of interval mapping methods. One major consequence is the high number of QTL detected at marker locations. The present study aims at identifying and specifying the sources of this bias, in particular for the analysis of data from outbred populations. Analytical developments were carried out for a backcross situation in order to specify the bias and to propose an algorithm to control it. The outbred population context was studied through simulated data sets in a wide range of situations. The likelihood ratio test was first analyzed under the "one QTL" hypothesis in a backcross population. Designs of sib families were then simulated and analyzed using the QTL Map software. On the basis of the theoretical results in backcross, parameters such as the population size, the density of the genetic map, the QTL effect and the true location of the QTL were considered under the "no QTL" and the "one QTL" hypotheses. A combination of two non-parametric tests - the Kolmogorov-Smirnov test and the Mann-Whitney-Wilcoxon test - was used in order to identify the parameters that affected the bias and to specify how much they influenced the estimation of QTL location. Results: A theoretical expression of the bias of the estimated QTL location was obtained for a backcross-type population. We demonstrated a common source of bias under the "no QTL" and the "one QTL" hypotheses and qualified the possible influence of several parameters. Simulation studies confirmed that the bias exists in outbred populations under both the "no QTL" and "one QTL" hypotheses on a linkage group. The QTL location was systematically closer to marker locations than expected, particularly in the case of low QTL effect, small population size or low marker density.
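The two non-parametric tests named above are available in SciPy; a minimal sketch with hypothetical data follows. The two samples here are synthetic stand-ins for estimated QTL locations from two simulated designs, one biased toward a marker at 0 cM and one roughly uniform on the linkage group.

```python
import numpy as np
from scipy.stats import ks_2samp, mannwhitneyu

rng = np.random.default_rng(2)
# hypothetical estimated QTL locations (cM) on a 20 cM interval:
# "biased" piles up near the marker at 0 cM, "uniform" does not
biased = np.abs(rng.normal(0.0, 3.0, 500)).clip(0, 20)
uniform = rng.uniform(0, 20, 500)

# Kolmogorov-Smirnov: compares the full distributions of locations
ks_stat, ks_p = ks_2samp(biased, uniform)
# Mann-Whitney-Wilcoxon: compares the central tendency via ranks
mw_stat, mw_p = mannwhitneyu(biased, uniform, alternative="two-sided")
```

Both tests reject equality of the two location distributions decisively for these synthetic samples, which is the kind of evidence the study uses to conclude that estimates cluster at marker positions.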

Titanium alloys are characterized by high mechanical properties and elevated corrosion resistance. The combination of laser welding with MIG/GMAW has proven to enhance the beneficial effects of both processes (keyhole, gap-bridging ability) while limiting their drawbacks (high thermal gradient, low mechanical resistance). In this paper, the hybrid Laser-GMAW welding of Ti-6Al-4V 3-mm thick sheets is investigated using a specifically designed trailing shield. The joint geometry was the double fillet welded T-joint. Bead morphologies, microstructures and mechanical properties (micro-hardness) of the welds were evaluated and compared to those of the base metals.

As the largest carbon emission source in China, the power sector is growing rapidly owing to the country's unprecedented urbanization and industrialization. In order to explore a low-carbon urbanization pathway by reducing carbon emissions from the power sector, the Chinese government launched an international low carbon city (ILCC) project in Shenzhen. This paper presents a feasibility analysis of a potential hybrid energy system based on local renewable energy resources and electricity demand estimates over the three planning stages of the ILCC project. Wind power, solar power, natural gas and the existing power grid are the components considered in the hybrid energy system. The simulation results indicate that the costs of energy in the three planning stages are 0.122, 0.105 and 0.141 $/kWh, respectively, if external wind farms and pumped storage hydro stations (PSHSs) are included. The optimization results reveal that the carbon reduction rates are 46.81%, 62.99% and 75.76% compared with the Business as Usual scenarios. The widely distributed water reservoirs in Shenzhen provide ideal conditions for constructing PSHSs, which are crucial in enhancing renewable energy utilization.

In the power drive system of Electric Vehicles (EVs) and Hybrid Electric Vehicles (HEVs), High Voltage (HV) cables play a major role in determining the EMI of the whole system. Transfer impedance (ZT) is the most commonly used performance parameter for an HV cable. To analyse and design HV cables and connectors with better shielding effectiveness (SE), appropriate measurement and simulation methods are required. In this paper, an improved Ground Plate Method (GPM) is proposed to measure ZT. The use of low-frequency ferrites to avoid ground-loop effects has also been investigated. Additionally, a combination of an analytical model with a circuit model has been implemented to simulate the limitations (frequency response) of the test setup. Parametric studies using the analytical model have also been performed to analyse the shielding behaviour of HV cables.
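The transfer impedance extracted from a GPM-style measurement reduces to a simple ratio; the sketch below uses the standard per-unit-length definition ZT = V / (I · L). The frequency points, voltages and coupling length are hypothetical, not values from the paper.

```python
import numpy as np

def transfer_impedance(v_coupled, i_shield, length_m):
    """Z_T = V / (I * L): coupled voltage per unit shield current and
    coupling length (ohm/m); the standard figure of merit for cable SE.
    Lower |Z_T| means better shielding."""
    return v_coupled / (i_shield * length_m)

# hypothetical measurement sweep on a 0.5 m sample with 1 A shield current
f = np.logspace(4, 8, 5)                           # 10 kHz .. 100 MHz
v = np.array([1e-4, 1.2e-4, 3e-4, 2e-3, 1.5e-2])   # coupled voltage (V)
zt = transfer_impedance(v, i_shield=1.0, length_m=0.5)  # ohm/m
```

The characteristic rise of ZT with frequency (inductive coupling through the braid apertures dominating at high frequency) is what the paper's analytical and circuit models aim to reproduce and bound.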

The generation of diploid spermatozoa is essential for the continuity of tetraploid lineages. The DNA content of diploid spermatozoa from allotetraploid hybrids of red crucian carp and common carp was nearly twice as great as that of haploid spermatozoa from common carp, and the durations of rapid and slow progressive motility were longer. We performed comparative proteomic analyses to measure variations in protein composition between diploid and haploid spermatozoa. Using two-dimensional electrophoresis followed by liquid chromatography tandem mass spectrometry, 21 protein spots that changed in abundance were analyzed. As the common carp and the allotetraploid hybrids are not fully sequenced organisms, we identified proteins by Mascot searching against the National Center for Biotechnology Information non-redundant (NR) protein database for the zebrafish (Danio rerio), and verified them against predicted homologous proteins derived from transcriptomes of the testis. Twenty protein spots were identified successfully, belonging to four gene ontology categories: cytoskeleton, energy metabolism, the ubiquitin-proteasome system, and other functions, indicating that these might be associated with the variation in diploid spermatozoa. This categorization of variations in protein composition in diploid spermatozoa will provide new perspectives on male polyploidy. Moreover, our approach indicates that transcriptome data are useful for proteomic analyses in organisms lacking full protein sequences.

The ligand exchange process of cis-platin in aqueous solution was studied using RISM-SCF-SEDD (reference interaction site model-self-consistent field with spatial electron density distribution) method, a hybrid approach of quantum chemistry and statistical mechanics. The analytical nature of RISM theory enables us to compute accurate reaction free energy in aqueous solution based on CCSD(T), together with the microscopic solvation structure around the complex. We found that the solvation effect is indispensable to promote the dissociation of the chloride anion from the complex.

Background: Stoichiometric models constitute the basic framework for fluxome quantification in the realm of metabolic engineering. A recurrent bottleneck, however, is the establishment of consistent stoichiometric models for the synthesis of recombinant proteins or viruses. Although optimization algorithms for in silico metabolic redesign have been developed in the context of genome-scale stoichiometric models for small molecule production, still rudimentary knowledge of how different cellular levels are regulated and phenotypically expressed prevents their full applicability for complex product optimization. Results: A hybrid framework is presented combining classical metabolic flux analysis with projection to latent structures to further link estimated metabolic fluxes with measured productivities. We first explore the functional metabolic decomposition of a baculovirus-producing insect cell line from experimental data, highlighting the TCA cycle and mitochondrial respiration as pathways strongly associated with viral replication. To reduce uncertainty in metabolic target identification, a Monte Carlo sampling method was used to select meaningful associations with the target, from which 66% of the estimated fluxome had to be screened out due to weak correlations and/or high estimation errors. The proposed hybrid model was then validated using a subset of preliminary experiments to pinpoint the same determinant pathways, while predicting the productivity of independent cultures. Conclusions: Overall, the results indicate our hybrid metabolic flux analysis framework is an advantageous tool for metabolic identification and quantification in incomplete or ill-defined metabolic networks. As experimental and computational solutions for constructing comprehensive global cellular models are in development, the contribution of hybrid metabolic flux analysis should constitute a valuable complement to current computational platforms in bridging the

In medical science, modern IT concepts are increasingly important for gathering new findings about complex diseases. Data Warehouses (DWHs) as central data repository systems play a key role by providing standardized, high-quality and secure medical data for effective analyses. However, DWHs in medicine must fulfil various requirements concerning data privacy and the ability to describe the complexity of (rare) disease phenomena. Here, i2b2 and tranSMART are free alternatives representing DWH solutions developed especially for medical informatics purposes. But different functionalities are not yet provided in a sufficient way. In fact, data import and export is still a major problem because of the diversity of schemas, parameter definitions and data quality, which are described differently in each clinic. Further, statistical analyses inside i2b2 and tranSMART are possible, but restricted to the implemented functions. Thus, data export is needed to provide a data basis which can be directly included within statistics software like SPSS and SAS or data mining tools like Weka and RapidMiner. The standard export tools of i2b2 and tranSMART essentially create a database dump of key-value pairs which cannot be used immediately by the mentioned tools; they need an instance-based or case-based representation of each patient. To overcome this limitation, we developed a concept called Generic Case Extractor (GCE) which pivots the key-value pairs of each clinical fact into a row-oriented format for each patient, sufficient to enable analyses in a broader context. Complex pivotisation routines were necessary to ensure temporal consistency, especially in terms of different data sets and the occurrence of identical but repeated parameters like follow-up data. GCE is embedded inside a comprehensive software platform for systems medicine.

There has been a dramatic increase in the amount of quantitative data derived from the measurement of changes at different levels of biological complexity during the post-genomic era. However, there are a number of issues associated with the use of computational tools employed for the analysis of such data. For example, computational tools such as R and MATLAB require prior knowledge of their programming languages in order to implement statistical analyses on data. Combining two or more tools in an analysis may also be problematic since data may have to be manually copied and pasted between separate user interfaces for each tool. Furthermore, this transfer of data may require a reconciliation step in order for there to be interoperability between computational tools. Developments in the Taverna workflow system have enabled pipelines to be constructed and enacted for generic and ad hoc analyses of quantitative data. Here, we present an example of such a workflow involving the statistical identification of differentially-expressed genes from microarray data followed by the annotation of their relationships to cellular processes. This workflow makes use of customised maxdBrowse web services, a system that allows Taverna to query and retrieve gene expression data from the maxdLoad2 microarray database. These data are then analysed by R to identify differentially-expressed genes using the Taverna RShell processor which has been developed for invoking this tool when it has been deployed as a service using the RServe library. In addition, the workflow uses Beanshell scripts to reconcile mismatches of data between services as well as to implement a form of user interaction for selecting subsets of microarray data for analysis as part of the workflow execution. A new plugin system in the Taverna software architecture is demonstrated by the use of renderers for displaying PDF files and CSV formatted data within the Taverna workbench. Taverna can be used by data analysis

The limited resource model states that self-control is governed by a relatively finite set of inner resources on which people draw when exerting willpower. Once self-control resources have been used up or depleted, they are less available for other self-control tasks, leading to a decrement in subsequent self-control success. The depletion effect has been studied for over 20 years, tested or extended in more than 600 studies, and supported in an independent meta-analysis (Hagger et al., 2010). Meta-analyses are supposed to reduce bias in literature reviews. Carter et al.'s (2015) meta-analysis, by contrast, included a series of questionable decisions involving sampling, methods, and data analysis. We provide quantitative analyses of key sampling issues: exclusion of many of the best depletion studies based on idiosyncratic criteria and the emphasis on mini meta-analyses with low statistical power as opposed to the overall depletion effect. We discuss two key methodological issues: failure to code for research quality, and the quantitative impact of weak studies by novice researchers. We discuss two key data analysis issues: questionable interpretation of the results of trim and fill and Funnel Plot Asymmetry test procedures, and the use and misinterpretation of the untested Precision Effect Test and Precision Effect Estimate with Standard Error (PEESE) procedures. Despite these serious problems, the Carter et al. (2015) meta-analysis results actually indicate that there is a real depletion effect - contrary to their title.

We compared a local clustering and a cluster morphology statistic using anthrax outbreaks in large (cattle) and small (sheep and goats) domestic ruminants across Kazakhstan. The Getis-Ord (Gi*) statistic and a multidirectional optimal ecotope algorithm (AMOEBA) were compared using 1st, 2nd and 3rd order Rook contiguity matrices. Multivariate statistical tests were used to evaluate the environmental signatures between clusters and non-clusters from the AMOEBA and Gi* tests. A logistic regression was used to define a risk surface for anthrax outbreaks and to compare agreement between clustering methodologies. Tests revealed differences in the spatial distribution of clusters as well as the total number of clusters in large ruminants for AMOEBA (n = 149) and for small ruminants (n = 9). In contrast, Gi* revealed fewer large ruminant clusters (n = 122) and more small ruminant clusters (n = 61). Significant environmental differences were found between groups using the Kruskall-Wallis and Mann-Whitney U tests. Logistic regression was used to model the presence/absence of anthrax outbreaks and define a risk surface for large ruminants to compare with cluster analyses. The model predicted 32.2% of the landscape as high risk. Approximately 75% of AMOEBA clusters corresponded to predicted high risk, compared with ~64% of Gi* clusters. In general, AMOEBA predicted more irregularly shaped clusters of outbreaks in both livestock groups, while Gi* tended to predict larger, circular clusters. Here we provide an evaluation of both tests and a discussion of the use of each to detect environmental conditions associated with anthrax outbreak clusters in domestic livestock. These findings illustrate important differences in spatial statistical methods for defining local clusters and highlight the importance of selecting appropriate levels of data aggregation.
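The Getis-Ord Gi* statistic used above can be computed directly from a contiguity matrix. The sketch below implements the standard Ord & Getis (1995) formula on a hypothetical one-dimensional chain of sites with rook (left/right) contiguity; it is an illustration of the statistic, not the study's spatial data or code.

```python
import numpy as np

def getis_ord_gstar(x, w):
    """Getis-Ord Gi* statistic for each site, following Ord & Getis (1995);
    w is a binary contiguity matrix that includes the site itself (the *
    variant). Returns z-scores: large positive values flag hot spots."""
    n = x.size
    xbar = x.mean()
    s = np.sqrt((x**2).mean() - xbar**2)   # population std dev over all sites
    wi = w.sum(axis=1)                     # W_i*: number of neighbours (incl. self)
    s1 = (w**2).sum(axis=1)                # sum of squared weights per site
    num = w @ x - xbar * wi
    den = s * np.sqrt((n * s1 - wi**2) / (n - 1))
    return num / den

# hypothetical chain of 10 sites: rook contiguity (left/right) plus self
n = 10
w = np.eye(n)
for i in range(n - 1):
    w[i, i + 1] = w[i + 1, i] = 1
counts = np.array([0, 0, 1, 8, 9, 7, 1, 0, 0, 0], dtype=float)  # outbreak counts
z = getis_ord_gstar(counts, w)
hot = int(np.argmax(z))
```

The site at the centre of the high-count run comes out as the strongest hot spot; in the study, the same computation runs over 1st-3rd order Rook matrices on the Kazakh outbreak data, and AMOEBA provides the contrasting, shape-adaptive clustering.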

The paper "Evaluation of Colorado Learning Attitudes about Science Survey" [1] proposes a new, much shorter version of the CLASS based on standard factor analysis. In this comment we explain why we believe the analysis used is inappropriate, and why the proposed modified CLASS will measure something quite different from, and less useful than, the original. The CLASS was based on extensive interviews with students and is intended as a formative measurement of instruction, probing a much more complex construct and with different goals than what is handled by classic psychometrics. We are writing this comment to reiterate the value of combining techniques of cognitive science with statistical analyses, as described in detail in Adams & Wieman, 2011 [2], when developing a test of expert-like thinking for use in formative assessment. This type of approach is also called for by the National Research Council in a recent report [3].

Statistical and trend analyses of selected water-quality data collected at three streamflow stations in the lower Neches River basin, Texas, are summarized in order to document baseline water-quality conditions in stream segments that flow through the Big Thicket National Preserve in southeast Texas. Dissolved-solids concentrations in the streams are small, less than 132 milligrams per liter in 50 percent of the samples analyzed from each of the sites. Dissolved-oxygen concentrations in the Neches River at Evadale (08041000) generally are large, exceeding 8.0 milligrams per liter in more than 50 percent of the samples analyzed. Total nitrogen and total phosphorus concentrations in samples from this site have not exceeded 1.8 milligrams per liter and 0.20 milligram per liter, respectively.

This study presents a new method to analyse the properties of the sea-level signal recorded by coastal tide gauges in the long wave range that is in a window between wind/storm waves and tides and is typical of several phenomena like local seiches, coastal shelf resonances and tsunamis. The method consists of computing four specific functions based on the time gradient (slope) of the recorded sea level oscillations, namely the instantaneous slope (IS) as well as three more functions based on IS, namely the reconstructed sea level (RSL), the background slope (BS) and the control function (CF). These functions are examined through a traditional spectral fast Fourier transform (FFT) analysis and also through a statistical analysis, showing that they can be characterised by probability distribution functions PDFs such as the Student's t distribution (IS and RSL) and the beta distribution (CF). As an example, the method has been applied to data from the tide-gauge station of Siracusa, Italy.

To fully understand the thermodynamic nature of polymer blends and accurately predict their miscibility on a microscopic level, a hybrid model employing both statistical mechanics and molecular dynamics techniques was developed to effectively predict the total free energy of mixing. The statistical mechanics principles were used to derive an expression for the deformational entropy of the chains in the polymeric blends that could be evaluated from molecular dynamics trajectories. Evaluation of the entropy loss due to the deformation of the polymer chains in the case of coiling as a result of the repulsive interactions between the blend components or in the case of swelling due to the attractive interactions between the polymeric segments predicted a negative value for the deformational entropy resulting in a decrease in the overall entropy change upon mixing. Molecular dynamics methods were then used to evaluate the enthalpy of mixing, entropy of mixing, the loss in entropy due to the deformation of the polymeric chains upon mixing and the total free energy change for a series of polar and non-polar, poly(glycolic acid), PGA, polymer blends.
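The bookkeeping behind the hybrid model above is the free-energy balance ΔG_mix = ΔH_mix − T(ΔS_mix + ΔS_def), where the deformational entropy term is negative and therefore penalises mixing. The sketch below shows only this balance with hypothetical numbers; the actual enthalpy and entropy values come from the molecular dynamics trajectories, not from this code.

```python
def free_energy_of_mixing(dH_mix, dS_mix, dS_def, T):
    """Total mixing free energy, dG = dH_mix - T * (dS_mix + dS_def).
    dS_def is the (negative) deformational entropy of the chains due to
    coiling or swelling, so it reduces the overall entropy gain on mixing
    and raises dG, making the blend less miscible."""
    return dH_mix - T * (dS_mix + dS_def)

# hypothetical values: enthalpy in kJ/mol, entropies in kJ/(mol*K)
dG = free_energy_of_mixing(dH_mix=-1.0, dS_mix=0.010, dS_def=-0.004, T=300.0)
```

Comparing against the same call with `dS_def=0.0` makes the point of the paper quantitative: including chain deformation yields a less negative ΔG, i.e. a weaker driving force for mixing.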

Heavy metals are considered toxic to humans and ecosystems. In the present study, heavy metal concentrations in soil were investigated using the single pollution index (PIi), the integrated Nemerow pollution index (PIN), and the geoaccumulation index (Igeo) to determine metal accumulation and its pollution status at the abandoned site of the Capital Iron and Steel Factory in Beijing and its surrounding area. Multivariate statistical analyses (principal component analysis and correlation analysis) and geostatistical analysis (ArcGIS), combined with stable Pb isotopic ratios, were applied to explore the characteristics of heavy metal pollution and the possible sources of the pollutants. The results indicated that the heavy metal elements show different degrees of accumulation in the study area; the observed trend of the enrichment factors and the geoaccumulation index was Hg > Cd > Zn > Cr > Pb > Cu ≈ As > Ni. Hg, Cd, Zn, and Cr were the dominant elements that influenced soil quality in the study area. The Nemerow index method indicated that all of the heavy metals caused serious pollution except Ni. Multivariate statistical analysis indicated that Cd, Zn, Cu, and Pb show obvious correlation and have high loads on the same principal component, suggesting that they had the same sources, related to industrial activities and vehicle emissions. The spatial distribution maps based on ordinary kriging showed that high concentrations of heavy metals were located in the local factory area and in the southeast-northwest part of the study region, corresponding with the predominant wind directions. Analyses of lead isotopes confirmed that Pb in the study soils is predominantly derived from three sources: dust generated during steel production, coal combustion, and the natural background. Moreover, the ternary mixture model based on lead isotope analysis indicates that lead in the study soils originates mainly from anthropogenic sources, which contribute much more
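The three indices named above have standard closed forms, sketched below with hypothetical concentrations and background values (not the study's data): PIi is the concentration-to-background ratio, Igeo = log2(Cn / (1.5 · Bn)), and the integrated Nemerow index combines the mean and the maximum of the single indices.

```python
import numpy as np

def single_pollution_index(c, background):
    """PI_i: measured concentration over geochemical background."""
    return c / background

def nemerow_index(pi):
    """Integrated Nemerow index: blends the mean and the maximum single
    index, so one severely polluting element dominates the result."""
    pi = np.asarray(pi, dtype=float)
    return np.sqrt((pi.mean() ** 2 + pi.max() ** 2) / 2.0)

def igeo(c, background):
    """Geoaccumulation index, Igeo = log2(Cn / (1.5 * Bn)); the factor
    1.5 absorbs natural background fluctuation."""
    return np.log2(c / (1.5 * background))

# hypothetical soil concentrations (mg/kg) and backgrounds for three metals
c = np.array([1.2, 150.0, 45.0])
bkg = np.array([0.2, 70.0, 25.0])
pi = single_pollution_index(c, bkg)
pin = nemerow_index(pi)
```

Because PIN weights the maximum single index, one heavily enriched element (here the first, with PIi = 6) pulls the integrated index well above the mean, which is why the study flags serious pollution even when most elements are only moderately enriched.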

Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

Microbial communities (microbiomes) are associated with almost all metazoans, including the honey bee Apis mellifera. Honey bees are social insects, maintaining complex hive systems composed of a variety of integral components including bees, comb, propolis, honey, and stored pollen. Given that the different components within hives can be physically separated and are nutritionally variable, we hypothesize that unique microbial communities may occur within the different microenvironments of honey bee colonies. To explore this hypothesis and to provide further insights into the microbiome of honey bees, we use a hybrid of fatty acid methyl ester (FAME) and phospholipid-derived fatty acid (PLFA) analysis to produce broad, lipid-based microbial community profiles of stored pollen, adults, pupae, honey, empty comb, and propolis for 11 honey bee hives. Averaging component lipid profiles by hive, we show that, in decreasing order, lipid markers representing fungi, Gram-negative bacteria, and Gram-positive bacteria have the highest relative abundances within honey bee colonies. Our lipid profiles reveal the presence of viable microbial communities in each of the six hive components sampled, with overall microbial community richness varying from lowest to highest in honey, comb, pupae, pollen, adults and propolis, respectively. Finally, microbial community lipid profiles were more similar when compared by component than by hive, location, or sampling year. Specifically, we found that individual hive components typically exhibited several dominant lipids and that these dominant lipids differ between components. Principal component and two-way clustering analyses both support significant grouping of lipids by hive component. Our findings indicate that in addition to the microbial communities present in individual workers, honey bee hives have resident microbial communities associated with different colony components.

Male sterility induced by a chemical hybridization agent (CHA) is an important tool for utilizing crop heterosis. Monosulphuron ester sodium (MES), a new acetolactate synthase-inhibiting herbicide belonging to the sulphonylurea family, has been developed as an effective CHA to induce male sterility in rapeseed (Brassica napus L.). To better understand MES-induced male sterility in rapeseed, comparative cytological and proteomic analyses were conducted in this study. Cytological analysis indicated that defective tapetal cells and abnormal microspores were gradually generated in the developing anthers of MES-treated plants at various development stages, resulting in unviable microspores and male sterility. A total of 141 differentially expressed proteins between the MES-treated and control plants were revealed, and 131 of them were further identified by MALDI-TOF/TOF MS. Most of these proteins decreased in abundance in tissues of MES-treated rapeseed plants, and only a few increased. Notably, some proteins were absent or induced in developing anthers after MES treatment. These proteins are involved in several processes that may be crucial for tapetum and microspore development. Down-regulation of these proteins may disrupt the coordination of developmental and metabolic processes, resulting in defective tapetum and abnormal microspores that lead to male sterility in MES-treated plants. Accordingly, a simple model of CHA-MES-induced male sterility in rapeseed was established. This study is the first cytological and dynamic proteomic investigation of CHA-MES-induced male sterility in rapeseed, and the results provide new insights into the molecular events of male sterility.

The heritability, the number of segregating genes and the type of gene interaction of nine agronomic traits were analysed based on F2 populations of synthetic oilseed Brassica napus produced from interspecific hybridization of B. campestris and B. oleracea through ovary culture. The nine traits—plant height, stem width, number of branches, length of main raceme, number of pods per plant, number of seeds per pod, length of pod, seed weight per plant and 1000-seed weight—had heritabilities of 0.927, 0.215, 0.172, 0.381, 0.360, 0.972, 0.952, 0.516 and 0.987 respectively, while the mean numbers of controlling genes for these characters were 7.4, 10.4, 9.9, 12.9, 11.5, 21.7, 20.5, 19.8 and 6.4 respectively. According to estimated coefficients of skewness and kurtosis of the traits tested, no significant gene interaction was found for plant height, stem width, number of branches, length of main raceme, number of seeds per pod and 1000-seed weight. Seed yield per plant is an important target for oilseed production. In partial correlation analysis, number of pods per plant, number of seeds per pod and 1000-seed weight were positively correlated with seed yield per plant. On the other hand, length of pod was negatively correlated ($r = -0.69$) with seed yield per plant. Other agronomic characters had no significant correlation to seed yield per plant. In this experiment, the linear regressions of seed yield per plant and other agronomic traits were also analysed. The linear regression equation was $y = 0.074x_{8} + 1.819x_{9} + 6.72x_{12} - 60.78 (R^{2} = 0.993)$, where $x_{8}$, $x_{9}$ and $x_{12}$ represent number of pods per plant, number of seeds per pod and 1000-seed weight respectively. The experiment also showed that erucic acid and oil contents of seeds from F2 plants were lower than those of their maternal parents. However, glucosinolate content was higher than that of the maternal plants. As for protein content, similar results were found in the F2 plants and
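The reported regression can be evaluated directly. The sketch below wraps the published equation $y = 0.074x_{8} + 1.819x_{9} + 6.72x_{12} - 60.78$ in a function and applies it to a hypothetical F2 plant; the trait values in the example are invented for illustration, only the coefficients come from the abstract.

```python
def predict_seed_yield(pods_per_plant, seeds_per_pod, thousand_seed_weight_g):
    """Seed yield per plant from the regression reported above
    (y = 0.074*x8 + 1.819*x9 + 6.72*x12 - 60.78, R^2 = 0.993), where
    x8 = pods per plant, x9 = seeds per pod, x12 = 1000-seed weight (g)."""
    return (0.074 * pods_per_plant
            + 1.819 * seeds_per_pod
            + 6.72 * thousand_seed_weight_g
            - 60.78)

# hypothetical F2 plant: 200 pods, 20 seeds per pod, 3.5 g per 1000 seeds
y = predict_seed_yield(200, 20, 3.5)
```

The high R² (0.993) means this three-trait equation captures nearly all of the variation in seed yield per plant, consistent with the positive partial correlations reported for the same three traits.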

Dogroses are characterized by a unique meiosis system, the so-called canina meiosis, which facilitates sexual reproduction at odd-number ploidy. The mostly pentaploid somatic level of dogroses is restored by a merger of haploid sperm cells and tetraploid egg cells. We analyzed experimental hybrids between different dogrose species using microsatellites to determine pollen-transmitted alleles. This information was used to reconstruct the putative hybridogenic origin of Rosa micrantha and R. dumalis and to estimate the frequency of spontaneous hybridization in a natural population. We found no evidence for the hybrid origin of R. dumalis, but our data suggest that R. micrantha presumably arose by hybridization between R. rubiginosa and R. canina or R. corymbifera. We observed only hexaploid individuals of R. micrantha, thus the establishment of this hybridogenic species was favored when unreduced gametes contributed to their origin. We demonstrate that spontaneous hybrids originated infrequently from the parental species in a natural population, but hybridization was often associated with the formation of unreduced gametes. We postulate that unreduced gametes play a major role in the evolutionary success of dogrose hybrids because they provide highly homologous chromosomes crucial for bivalent formation during canina meiosis and thus ensuring this unique form of sexual reproduction.

Brown trout Salmo trutta and Atlantic salmon Salmo salar can interbreed and produce viable hybrid offspring. The literature indicates that the maternal species can be either brown trout (North America) or Atlantic salmon (Southern Europe and Ireland), and bidirectional hybridization has also been reported (England, Northern Europe and subantarctic French Territory). In coastal rivers where both species are sympatric, brown trout populations often split into two morphs, stream residents ...

A series of tris(hydroxymethyl)aminomethane (TRIS)-based linear (bis(TRIS)) and triangular (tris(TRIS)) ligands was synthesised and covalently attached to the Wells-Dawson type cluster [P(2)V(3)W(15)O(62)](9-) to generate a series of nanometer-sized inorganic-organic hybrid polyoxometalate clusters. These huge hybrids, with a molecular mass similar to that of small proteins in the range of ≈10-16 kDa, were unambiguously characterised by using high-resolution ESI-MS. The ESI-MS spectra of these compounds revealed, in negative ion mode, a characteristic pattern showing distinct groups of peaks corresponding to different anionic charge states ranging from 3(-) to 8(-) for the hybrids. Each peak in these individual groups could be unambiguously assigned to the corresponding hybrid cluster anion with varying combinations of tetrabutylammonium (TBA) and other cations. This study therefore highlights the power of high-resolution ESI-MS for the unambiguous characterisation of large, nanoscale, inorganic-organic hybrid clusters of the order of 10-16 kDa. The designed synthesis of these compounds also shows that a great deal of structural pre-design can be achieved in inorganic-organic hybrid polyoxometalates (POMs) by means of a ligand design route, which is often not possible in traditional "one-pot" POM synthesis.

Electron spin echo envelope modulation (ESEEM) is a technique of pulsed electron paramagnetic resonance (EPR) spectroscopy. The analysis of ESEEM data to extract information about the nuclear and electronic structure of a disordered (powder) paramagnetic system requires accurate and efficient numerical simulations. A single coupled nucleus of known nuclear g value (g(N)) and spin I=1 can have up to eight adjustable parameters in the nuclear part of the spin Hamiltonian. We have developed OPTESIM, an ESEEM simulation toolbox, for automated numerical simulation of powder two- and three-pulse one-dimensional ESEEM for an arbitrary number (N) and type (I, g(N)) of coupled nuclei, and arbitrary mutual orientations of the hyperfine tensor principal axis systems for N>1. OPTESIM is based in the Matlab environment, and includes the following features: (1) a fast algorithm for translation of the spin Hamiltonian into simulated ESEEM, (2) different optimization methods that can be hybridized to achieve an efficient coarse-to-fine grained search of the parameter space and convergence to a global minimum, (3) statistical analysis of the simulation parameters, which allows the identification of simultaneous confidence regions at specific confidence levels. OPTESIM also includes a geometry-preserving spherical averaging algorithm as default for N>1, and global optimization over multiple experimental conditions, such as the dephasing time (tau) for three-pulse ESEEM, and external magnetic field values. Application examples for simulation of (14)N coupling (N=1, N=2) in biological and chemical model paramagnets are included. Automated, optimized simulations using OPTESIM converge on dramatically shorter time scales relative to manual simulations.
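The coarse-to-fine hybridized search described above can be illustrated in outline. The sketch below (in Python rather than OPTESIM's Matlab, with a toy quadratic standing in for the ESEEM misfit between simulation and experiment) shows the general idea: a coarse grid scan over the parameter bounds, followed by repeated rescans over a window that shrinks around the current best point:

```python
import itertools

def misfit(params):
    # Placeholder for |simulated ESEEM - experimental ESEEM|^2;
    # the true minimum of this toy surface is at (1.3, -0.7).
    a, b = params
    return (a - 1.3) ** 2 + (b + 0.7) ** 2

def grid_search(bounds, steps):
    # Evaluate the misfit on a regular grid and return the best point.
    best, best_p = float("inf"), None
    axes = [[lo + i * (hi - lo) / (steps - 1) for i in range(steps)]
            for lo, hi in bounds]
    for p in itertools.product(*axes):
        f = misfit(p)
        if f < best:
            best, best_p = f, p
    return best_p

def coarse_to_fine(bounds, levels=4, steps=11, shrink=0.25):
    p = grid_search(bounds, steps)
    for _ in range(levels):
        # Shrink the search window around the current best point.
        bounds = [(c - shrink * (hi - lo), c + shrink * (hi - lo))
                  for c, (lo, hi) in zip(p, bounds)]
        p = grid_search(bounds, steps)
    return p

p = coarse_to_fine([(-5.0, 5.0), (-5.0, 5.0)])
```

In practice the coarse stage would be a global method (e.g. a genetic algorithm or simulated annealing) and the fine stage a local gradient-free minimizer, but the shrinking-window grid conveys the coarse-to-fine structure.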

To alleviate greenhouse gas emissions and dependence on fossil fuel, Plug-in Hybrid Electrical Vehicles (PHEVs) have gained increasing popularity in recent decades. Because electricity prices fluctuate in the power market, the charging schedule strongly influences driving cost. Although next-day electricity prices can be obtained in a day-ahead power market, a driving plan is not easily made in advance. PHEV owners can input a next-day plan into a charging system (e.g., an aggregator) a day ahead, but doing so every day is a tedious task, and the driving plan may not be very accurate. To address this problem, in this paper, we analyze energy demands from a PHEV owner's historical driving records and build a personalized statistical driving model. Based on the model and the electricity spot prices, a rolling optimization strategy is proposed to help make a charging decision in the current time slot. A heuristic algorithm makes the schedule according to the conditions expected in the following time slots; after the current time slot, the schedule is remade over the next tens of time slots. Hence, the schedule is produced by a dynamic rolling optimization, but it only fixes the charging decision in the current time slot. In this way, both the fluctuation of electricity prices and the driving routine are involved in the scheduling, and it is not necessary for PHEV owners to input a day-ahead driving plan. Simulation results demonstrate that the proposed method helps owners save charging costs while meeting their driving requirements.
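A rolling charging decision of this kind can be sketched as follows. This is an illustrative simplification, not the paper's heuristic: at each time slot the plan is remade over the remaining price horizon, and the vehicle charges in the current slot only if its price ranks among the cheapest slots still needed to cover the forecast energy demand:

```python
def charge_now(prices, demand_kwh, rate_kwh_per_slot):
    """prices[0] is the current slot; decide only this slot's action."""
    slots_needed = int(-(-demand_kwh // rate_kwh_per_slot))  # ceil division
    cheapest = sorted(prices)[:slots_needed]
    return bool(cheapest) and prices[0] <= max(cheapest)

def rolling_schedule(prices, demand_kwh, rate=1.0):
    """Return the indices of the slots in which charging occurs."""
    plan, remaining = [], demand_kwh
    for t in range(len(prices)):
        horizon = prices[t:]  # re-plan over the remaining horizon
        if remaining > 0 and charge_now(horizon, remaining, rate):
            plan.append(t)
            remaining -= rate
    return plan

# Five slots with spot prices; 2 kWh of demand at 1 kWh per slot
slots = rolling_schedule([5, 1, 4, 2, 3], 2)
```

Only the current slot's decision is ever committed, mirroring the paper's rolling structure; in the full method the demand forecast itself would come from the personalized statistical driving model rather than a fixed number.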

An interspecific hybrid F1 of Cucumis hystrix Chakr. × Cucumis sativus L. (NC4406) was used to establish the developmental sequence and to characterize the male and female gametophytes at the cytological level, for further understanding of the phylogenetic relationship and the mechanism of fertility or sterility in the interspecific hybrid F1. The development of male and female gametophytes was studied through meiotic analysis and paraffin section observation, respectively. Meanwhile, the fertility level was assessed by backcrossing the hybrid F1 to cultivated cucumber 4406. Variable chromosome configurations were observed in the pollen mother cells (PMCs) of the hybrid F1 at metaphase I, e.g., univalents, bivalents, trivalents, quadrivalents, etc. At anaphase I and II, chromosome lagging and bridges were frequently observed as well, which led to the formation of polyads, and only some microspores could develop into fertile pollen grains (about 23.3%). Observations of the paraffin sections showed numerous degenerated and abnormal embryo sacs during the development of female gametophytes, and only 40% of the female gametophytes could develop into normal eight-nucleate megaspores. On average, 22.8 and 6.3 seeds per fruit could be obtained from the reciprocal backcrosses. The interspecific hybrid F1 of C. hystrix × NC4406 was thus partially fertile; moreover, the meiotic behavior of the hybrid F1 showed a high level of intergenomic recombination between C. hystrix and C. sativus chromosomes, indicating that the hybrid plays an important role in introgression of useful traits from C. hystrix into C. sativus.

reconstructed sea level (RSL), the background slope (BS) and the control function (CF). These functions are examined through a traditional spectral fast Fourier transform (FFT) analysis and also through a statistical analysis, showing that they can be characterised by probability distribution functions (PDFs) such as the Student's t distribution (IS and RSL) and the beta distribution (CF). As an example, the method has been applied to data from the tide-gauge station of Siracusa, Italy.

Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

Parameter optimization for a novel 3-DOF hybrid mechanical arm was presented by using a statistical method called the statistics parameters optimization method based on index atlases. Several kinematics and mechanics performance evaluation indices were proposed and discussed, according to the kinematics and mechanics analyses of the mechanical arm. Considering the assembly technique, a prototype of the 3-DOF hybrid mechanical arm was developed, which provided a basis for applications of the 3-DOF hybrid mechanical arm. The novel 3-DOF hybrid mechanical arm can be applied to modern industrial fields requiring high stiffness, low inertia and good technological efficiency. A novel 6-DOF hybrid humanoid mechanical arm was also built, in which the present mechanical arm was connected with a spherical 3-DOF parallel manipulator.

Cooling towers are used, also in climate-control engineering, to dispose of process heat. Water consumption, the added chemicals and the cleaning costs are among the largest expense items in operation. Open wet cooling towers consume water throughout the year, whereas the closed hybrid medium blast cooler only uses water in the summer. (mk)

In the political and war crisis which engulfed Bosnia and Herzegovina in the spring of 1992, ending with the cessation of war hostilities in the autumn of 1995 when the "Dayton Peace Agreement" emerged (November 1995), a media war also occurred. From the very beginning, this war had an international character. The question of the number of war victims (killed and missing) "exploded" in June of 1993 when Haris Silajdžić stated that there had been 200,000 dead among the Muslims. This figure uncritically became the basis for all later media and local "empirical truths" on the number of victims. All statistical and demographic disciplines were exploited to support, if not prove, the propaganda standpoints. Objectivity was oppressed by the ugly "face of the war". Bearing in mind the experience of the Second World War in Yugoslavia, the question of the number of victims does not cease to be topical for decades after the end of a war; Bosnia and Herzegovina is more than a confirmation. This question seems to intertwine with (and in a way "feed off") the most difficult political and international questions and court trials (the "International Court of Justice" case of Bosnia and Herzegovina against the Federal Republic of Yugoslavia, namely Serbia). The methodological analysis of the most important works dealing with the question of the number of victims in the Bosnian war (above all, those done by Bosnian institutes and authors) indicates the "mistakes" made and the propagandistic character of these works. Manipulation with statistical methods and numbers is not new; methodological and numerical traps can ensnare even the best informed. The use of statistics and social science in court trials seems to show the Janus face of science: on one side the authentic "moral passion" of researchers finds great sense, while on the other side special interests strive to impose themselves through the (most refined) instrumentation of science and knowledge (the example of Mr. Patrick Ball).

Extending the crop survey application of remote sensing from small experimental regions to state and national levels requires that a sample of agricultural fields be chosen for remote sensing of crop acreage, and that a statistical estimate be formulated with measurable characteristics. The critical requirements for the success of the application are reviewed in this report. The problem of sampling in the presence of cloud cover is discussed. Integration of remotely sensed information about crops into current agricultural crop forecasting systems is treated on the basis of the USDA multiple frame survey concepts, with an assumed addition of a new frame derived from remote sensing. Evolution of a crop forecasting system which utilizes LANDSAT and future remote sensing systems is projected for the 1975-1990 time frame.

Because the AC/DC hybrid microgrid can reduce the losses and harmonic currents produced by operating multiple converters, while improving the economy and reliability of the system, it has become the main development direction of today's microgrids. This paper discusses the planning and design of AC/DC hybrid microgrids from the perspectives of voltage level, grounding method, bus structure and network topology, as a reference for experts and scholars intending to conduct in-depth research on AC/DC hybrid microgrids.

Acoustic emission (AE) techniques can be used to monitor the formation and development of fatigue cracks in aircraft structures dynamically and continuously. In this paper, a statistical model for data processing is established by analyzing the characteristic parameters of AE data from aircraft structures. Using this model, the authors study AE data from a full-scale aircraft fatigue test and from airborne AE monitoring of an in-service fighter. On this basis, a practical damage criterion is advanced, which can be applied to judge, at a given probability, whether cracks are forming or developing in an aircraft. Research directions toward a more general criterion are given at the end of the paper.

A toolkit has been developed for calculating the 3-dimensional biological effective dose (BED) distributions in multi-phase, external beam radiotherapy treatments such as those applied in liver stereotactic body radiation therapy (SBRT) and in multi-prescription treatments. This toolkit also provides a wide range of statistical results related to dose and BED distributions. MATLAB 2010a, version 7.10 was used to create this GUI toolkit. The input data consist of the dose distribution matrices, organ contour coordinates, and treatment planning parameters from the treatment planning system (TPS). The toolkit has the capability of calculating the multi-phase BED distributions using different formulas (denoted as true and approximate). Following the calculations of the BED distributions, the dose and BED distributions can be viewed in different projections (e.g. coronal, sagittal and transverse). The different elements of this toolkit are presented and the important steps for the execution of its calculations are illustrated. The toolkit is applied on brain, head & neck and prostate cancer patients, who received primary and boost phases in order to demonstrate its capability in calculating BED distributions, as well as measuring the inaccuracy and imprecision of the approximate BED distributions. Finally, the clinical situations in which the use of the present toolkit would have a significant clinical impact are indicated.
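The per-phase BED follows the standard linear-quadratic form BED = D(1 + d/(α/β)), where D is the phase's total dose, d the dose per fraction and α/β the tissue-specific ratio; the "true" multi-phase BED sums the per-phase values voxel by voxel. A minimal scalar sketch (the α/β value and the prescriptions below are illustrative, not taken from the toolkit's test cases):

```python
# Linear-quadratic biological effective dose for one phase:
# BED = D * (1 + d / (alpha/beta)), with d = D / n_fractions.

def phase_bed(total_dose, n_fractions, alpha_beta):
    d = total_dose / n_fractions  # dose per fraction (Gy)
    return total_dose * (1 + d / alpha_beta)

def multiphase_bed(phases, alpha_beta):
    """phases: list of (total_dose_Gy, n_fractions) tuples; the 'true'
    multi-phase BED sums the per-phase BED values."""
    return sum(phase_bed(D, n, alpha_beta) for D, n in phases)

# Illustrative: primary 50 Gy in 25 fractions plus boost 20 Gy in 10
# fractions, alpha/beta = 10 Gy (a common early-responding-tissue value)
bed = multiphase_bed([(50, 25), (20, 10)], 10.0)
```

In the toolkit this calculation is carried out per voxel over the dose matrices from the TPS, with the approximate formulas differing in how the phases are combined before conversion.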

Late Embryogenesis Abundant Proteins (LEAPs) are ubiquitous proteins expected to play major roles in desiccation tolerance. Little is known about their structure-function relationships because of the scarcity of 3-D structures for LEAPs. The previous building of LEAPdb, a database dedicated to LEAPs from plants and other organisms, led to the classification of 710 LEAPs into 12 non-overlapping classes with distinct properties. Using this resource, numerous physico-chemical properties of LEAPs and amino acid usage by LEAPs have been computed and statistically analyzed, revealing distinctive features for each class. This unprecedented analysis allowed a rigorous characterization of the 12 LEAP classes, which also differed in multiple structural and physico-chemical features. Although most LEAPs can be predicted as intrinsically disordered proteins, the analysis indicates that LEAP class 7 (PF03168) and probably LEAP class 11 (PF04927) are natively folded proteins. This study thus provides a detailed description of the structural properties of this protein family, opening the path toward further LEAP structure-function analysis. Finally, since each LEAP class can be clearly characterized by a unique set of physico-chemical properties, this will allow development of software to predict proteins as LEAPs.

In the United States, the production of the Klebsiella pneumoniae carbapenemase (KPC) is an important mechanism of carbapenem resistance in Gram-negative pathogens. Infections with KPC-producing organisms are associated with increased morbidity and mortality; therefore, the rapid detection of KPC-producing pathogens is critical in patient care and infection control. We developed a real-time PCR assay complemented with traditional high-resolution melting (HRM) analysis, as well as statistically based genotyping, using the Rotor-Gene ScreenClust HRM software to both detect the presence of bla(KPC) and differentiate between KPC-2-like and KPC-3-like alleles. A total of 166 clinical isolates of Enterobacteriaceae, Pseudomonas aeruginosa, and Acinetobacter baumannii with various β-lactamase susceptibility patterns were tested in the validation of this assay; 66 of these organisms were known to produce the KPC β-lactamase. The real-time PCR assay was able to detect the presence of bla(KPC) in all 66 of these clinical isolates (100% sensitivity and specificity). HRM analysis demonstrated that 26 had KPC-2-like melting peak temperatures, while 40 had KPC-3-like melting peak temperatures. Sequencing of 21 amplified products confirmed the melting peak results, with 9 isolates carrying bla(KPC-2) and 12 isolates carrying bla(KPC-3). This PCR/HRM assay can identify KPC-producing Gram-negative pathogens in as little as 3 h after isolation of pure colonies and does not require post-PCR sample manipulation for HRM analysis, and ScreenClust analysis easily distinguishes bla(KPC-2-like) and bla(KPC-3-like) alleles. Therefore, this assay is a rapid method to identify the presence of bla(KPC) enzymes in Gram-negative pathogens that can be easily integrated into busy clinical microbiology laboratories.

Residents of eastern Washington, northeastern Oregon, and western Idaho were exposed to 131I released into the atmosphere from operations at the Hanford Nuclear Site from 1944 through 1972, especially in the late 1940's and early 1950's. This paper describes the estimated doses to the thyroid glands of the 3,440 evaluable participants in the Hanford Thyroid Disease Study, which investigated whether thyroid morbidity was increased in people exposed to radioactive iodine from Hanford during 1944-1957. The participants were born during 1940-1946 to mothers living in Benton, Franklin, Walla Walla, Adams, Okanogan, Ferry, or Stevens Counties in Washington State. Whenever possible, someone with direct knowledge of the participant's early life (preferably the participant's mother) was interviewed about the participant's individual dose-determining characteristics (residence history; sources and quantities of food, milk, and milk products consumed; production and processing techniques for home-grown food and milk products). Default information was used if no interview respondent was available. Thyroid doses were estimated using the computer program Calculation of Individual Doses from Environmental Radionuclides (CIDER) developed by the Hanford Environmental Dose Reconstruction Project. CIDER provided 100 sets of doses to represent the uncertainty of the estimates. These sets were not generated independently for each participant, but reflected the effects of uncertainties in characteristics shared by participants. Estimated doses (medians of each participant's 100 realizations) ranged from 0.0029 mGy to 2823 mGy, with mean and median of 174 and 97 mGy, respectively. The distribution of estimated doses provided the Hanford Thyroid Disease Study with sufficient statistical power to test for dose-response relationships between thyroid outcomes and exposure to Hanford's 131I.

Lake El'gygytgyn, located in the Far East Russian Arctic, was formed by a meteorite impact about 3.58 Ma ago. In 2009, the ICDP Lake El'gygytgyn Drilling Project obtained a continuous sediment sequence of the lacustrine deposits and the upper part of the impact breccia. Here, we present grain-size data of the past 2.6 Ma. General downcore grain-size variations yield coarser sediments during warm periods and finer ones during cold periods. According to Principal Component Analyses (PCA), the climate-dependent variations in grain-size distributions mainly occur in the coarse silt and very fine silt fractions. During interglacial periods, accumulation of coarser grain sizes in the lake center is supposed to be caused by redistribution of clastic material by a wind-induced current pattern during the ice-free period. Sediment supply to the lake is triggered by the thickness of the active layer in the catchment, and the availability of water as a transport medium. During glacial periods, sedimentation at Lake El'gygytgyn is hampered by the occurrence of a perennial ice cover, with sedimentation being restricted to seasonal moats and vertical conduits through the ice. Thus, the summer temperature predominantly triggers transport of coarse material into the lake center. Time series analysis, carried out to gain insight into the frequency content of the grain-size data, showed grain-size variations predominantly on Milankovitch's eccentricity, obliquity and precession bands. Variations in the relative power of these three oscillation bands during the Quaternary imply that climate conditions at Lake El'gygytgyn are mainly triggered by global glacial/interglacial variations (eccentricity, obliquity) and local insolation forcing (precession), respectively.

In 2003 the U.S. Geological Survey, in cooperation with the San Antonio Water System, did a study using historical data to statistically analyze hydrologic system components in the San Antonio region of Texas and to develop transfer-function models to simulate water levels at selected sites (wells) in the Edwards aquifer on the basis of rainfall. Water levels for two wells in the confined zone in Medina County and one well in the confined zone in Bexar County were highly correlated and showed little or no lag time between water-level responses. Water levels in these wells also were highly correlated with springflow at Comal Springs. Water-level hydrographs for 35 storms showed that an individual well can respond differently to similar amounts of rainfall. Fourteen water-level-recession hydrographs for a Medina County well showed that recession rates were variable. Transfer-function models were developed to simulate water levels at one confined-zone well and two recharge-zone wells in response to rainfall. For the confined-zone well, 50 percent of the simulated water levels are within 10 feet of the measured water levels, and 80 percent of the simulated water levels are within 15 feet of the measured water levels. For one recharge-zone well, 50 percent of the simulated water levels are within 5 feet of the measured water levels, and 90 percent of the simulated water levels are within 14 feet of the measured water levels. For the other recharge-zone well, 50 percent of the simulated water levels are within 14 feet of the measured water levels, and 90 percent of the simulated water levels are within 27 feet of the measured water levels. The transfer-function models showed that (1) the Edwards aquifer in the San Antonio region responds differently to recharge (effective rainfall) at different wells; and (2) multiple flow components are present in the aquifer. If simulated long-term system response results from a change in the hydrologic budget, then water levels would
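A transfer-function model of this kind treats the simulated water level as the convolution of effective rainfall with a unit impulse response. The sketch below shows only the structure; the two-component exponential response (fast and slow flow components), base level and gain are illustrative placeholders, not the study's fitted parameters:

```python
# Transfer-function (convolution) water-level model: level(t) is the
# base level plus a gain times the convolution of rainfall with a
# unit impulse response h.

def impulse_response(n, k_fast=0.5, k_slow=0.05, w_fast=0.6):
    # Mixture of a fast and a slow exponentially decaying component,
    # reflecting the multiple flow components noted in the study.
    return [w_fast * k_fast * (1 - k_fast) ** i
            + (1 - w_fast) * k_slow * (1 - k_slow) ** i
            for i in range(n)]

def simulate_levels(rainfall, base_level=600.0, gain=20.0):
    h = impulse_response(len(rainfall))
    levels = []
    for t in range(len(rainfall)):
        # Discrete convolution of past rainfall with the response
        response = sum(rainfall[t - i] * h[i] for i in range(t + 1))
        levels.append(base_level + gain * response)
    return levels

# A single unit rainfall pulse followed by dry slots produces a peak
# that recedes along the impulse response
levels = simulate_levels([1.0, 0.0, 0.0, 0.0, 0.0])
```

In the actual study the impulse response and the rainfall-to-effective-rainfall transformation are estimated from the historical record at each well, which is why the same storm can produce different responses at different wells.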

The purpose of this work is to gain insights into the 2011-2012 eruption of El Hierro (Canary Islands) by mapping the evolution of the seismic b-value. The El Hierro seismic sequence offers a rather unique opportunity to investigate the process of reawakening of an oceanic intraplate volcano after a long period of repose. The 2011-2012 eruption is a submarine volcanic event that took place about 2 km off the southern coast of El Hierro. The eruption was accompanied by an intense seismic swarm and surface manifestations of activity. The earthquake catalogue during the period of unrest includes over 12 000 events, the largest with magnitude 4.6. The seismic sequence can be grouped into three distinct phases, which correspond to well-separated spatial clusters and distinct earthquake regimes. For the entire catalogue, the estimated b-value is 1.18 ± 0.03, with a magnitude of completeness of 1.3. This b-value is very close to 1.0, which indicates completeness of the earthquake catalogue with only minor departures from the linearity of the Gutenberg-Richter frequency-magnitude distribution. The most straightforward interpretation of this result is that the seismic swarm reached its final stages, and no additional large magnitude events should be anticipated, similarly to what one would expect for non-volcanic earthquake sequences. The results, dividing the activity into different phases, illustrate remarkable differences in the estimate of the b-value during the early and late stages of the eruption. The early pre-eruptive activity was characterized by a b-value of 2.25. In contrast, the b-value was 1.25 during the eruptive phase. Based on our analyses, and the results of other studies, we propose a scenario that may account for the observations reported in this work. We infer that the earthquakes that occurred in the first phase reflect magma migration from the upper mantle to crustal depths. The area where magma initially intruded into the crust, because of its transitional nature
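The b-value of the Gutenberg-Richter frequency-magnitude distribution is commonly estimated with Aki's maximum-likelihood formula from events above the completeness magnitude Mc; a minimal sketch (the abstract does not state which estimator was used, and the magnitude sample and bin width below are illustrative):

```python
import math

# Aki's maximum-likelihood b-value estimate:
#   b = log10(e) / (mean(M) - (Mc - dM/2))
# where Mc is the completeness magnitude and dM the magnitude bin width.

def b_value(mags, mc, dm=0.1):
    above = [m for m in mags if m >= mc]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - (mc - dm / 2))

# Illustrative magnitude sample above Mc = 1.3
b = b_value([1.3, 1.5, 1.7, 1.9], 1.3)
```

A mean magnitude only slightly above Mc yields a high b (an excess of small events, as in the pre-eruptive swarm with b = 2.25), while a larger mean pulls b down toward the tectonic value of about 1.0.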

The gene expression profiles of hybrid poplar (Populus alba × Populus tremula var. glandulosa) cells in suspension culture after exposure to salinity (NaCl) induced stress were examined by constructing two suppression subtractive hybridization (SSH) libraries. cDNA from non-treated cells was used as a driver and cDNA samples from cell suspension cultures exposed to 150 mM NaCl for 2 or 10 h were used as testers. Randomly selected clones from each SSH library were sequenced and 727 high-quality expressed sequence tags (ESTs) were obtained and analyzed. Four novel ESTs were identified. Between the two libraries, 542 unique SSH clones were selected for placement on a cDNA microarray. In total, 18 differentially expressed genes were identified with 4 and 12 genes being significantly differentially expressed 2 and 10 h after the treatment, respectively. Genes related to metabolism and protein synthesis and several genes whose protein products are implicated in salt or other abiotic stress-related responses were expressed in the salt-stressed cells.

Light-induced photocarrier generation is an essential process in all solar cells, including organic-inorganic hybrid (CH3NH3PbI3) solar cells, which exhibit a high short-circuit current density (Jsc) of approximately 20 mA/cm2. Although the high Jsc observed in the hybrid solar cells relies on strong electron-photon interaction, the optical transitions in the perovskite material remain unclear. Here, we report artifact-free CH3NH3PbI3 optical constants extracted from ultrasmooth perovskite layers without air exposure and assign all of the optical transitions in the visible and ultraviolet region unambiguously, based on density-functional theory (DFT) analysis that assumes a simple pseudocubic crystal structure. From the self-consistent spectroscopic ellipsometry analysis of the ultrasmooth CH3NH3PbI3 layers, we find that the absorption coefficients of CH3NH3PbI3 (α = 3.8 × 10^4 cm^-1 at 2.0 eV) are comparable to those of CuInGaSe2 and CdTe, and that the high α values reported in earlier studies are seriously overestimated because of the extensive surface roughness of the CH3NH3PbI3 layers. The polarization-dependent DFT calculations show that CH3NH3+ interacts strongly with the PbI3- cage, modifying the CH3NH3PbI3 dielectric function in the visible region rather significantly. In particular, the transition matrix element of CH3NH3PbI3 varies depending on the position of CH3NH3+ within the Pb-I network. When the effect of CH3NH3+ on the optical transition is eliminated in the DFT calculation, the CH3NH3PbI3 dielectric function deduced from DFT shows excellent agreement with the experimental result. As a result, distinct optical transitions observed at E0(Eg) = 1.61 eV, E1 = 2.53 eV, and E2 = 3.24 eV in CH3NH3PbI3 are attributed to direct semiconductor-type transitions at the R, M, and X points in the pseudocubic Brillouin zone, respectively. We further perform the quantum efficiency (QE) analysis for a standard hybrid-perovskite solar cell incorporating a mesoporous TiO2

Prp19 is the founding member of the NineTeen Complex, or NTC, which is a spliceosomal subcomplex essential for spliceosome activation. To define Prp19 connectivity and dynamic protein interactions within the spliceosome, we systematically queried the Saccharomyces cerevisiae proteome for Prp19 WD40 domain interaction partners by two-hybrid analysis. We report that in addition to S. cerevisiae Cwc2, the splicing factor Prp17 binds directly to the Prp19 WD40 domain in a 1:1 ratio. Prp17 binds simultaneously with Cwc2, indicating that it is part of the core NTC complex. We also find that the previously uncharacterized protein Urn1 (Dre4 in Schizosaccharomyces pombe) directly interacts with Prp19, and that Dre4 is conditionally required for pre-mRNA splicing in S. pombe. S. pombe Dre4 and S. cerevisiae Urn1 co-purify U2, U5, and U6 snRNAs and multiple splicing factors, and dre4Δ and urn1Δ strains display numerous negative genetic interactions with known splicing mutants. The S. pombe Prp19-containing Dre4 complex co-purifies three previously uncharacterized proteins that participate in pre-mRNA splicing, likely before spliceosome activation. Our multi-faceted approach has revealed new low-abundance splicing factors connected to NTC function, provides evidence for distinct Prp19-containing complexes, and underscores the role of the Prp19 WD40 domain as a splicing scaffold.

In Romania, oral and facial cancers represent approximately 5% of all cancers. Deactivation and unregulated expression of oncogenes and tumor suppressor genes may be involved in the pathogenesis of oral squamous cell carcinoma. The genomic change results in numerical and structural chromosomal alterations, particularly in chromosomes 3, 9, 11 and 17. The aim of our study was to identify numerical aberrations of chromosome 17, deletion or amplification of the p53 gene, and to reveal correlations of abnormalities of chromosome 17 and of the p53 gene with TNM status and grading in 15 subjects with oral squamous cell carcinoma. 80% of cases presented chromosome 17 polysomy and only 20% of cases had chromosome 17 monosomy. 46.6% of samples revealed p53 gene amplification and 33.3% p53 deletion. Polysomy of chromosome 17 was also detected in tumor-adjacent epithelia. The degree of cytogenetic abnormality did not correlate with the stage of the disease, the histological differentiation of oral squamous cell carcinoma, or lymph node metastasis. Molecular cytogenetic techniques, using fluorescence in situ hybridization with chromosome-specific DNA probes, facilitate the confirmation of presumed chromosomal aberrations with high sensitivity and specificity.

Complex traits are caused by multiple genetic and environmental factors and are therefore difficult to study compared with simple Mendelian diseases. The modes of inheritance of Mendelian diseases are often known, and methods to dissect such diseases are well described in the literature. For complex geneti

The uplink capacity and the interference statistics of the sectors of a long groove-shaped road W-CDMA microcell are studied. A model of 9 microcells in a groove-shaped road is used to analyze the uplink. A hybrid model for the propagation is used in the analysis. The capacity and the interference statistics of the cell are studied for different sector ranges, different specific attenuation factors, different antenna side lobe levels and different bend losses.

The agricultural pest Ceratitis capitata, also known as the Mediterranean fruit fly or Medfly, is a fruit crop pest of very high economic relevance on several continents. The strategy of separating Ceratitis males from females (sexing) in mass-rearing facilities is a useful step before the sterilization and release of male-only flies in Sterile Insect Technique (SIT) control programs. The identification of genes with early embryonic male-specific expression, including Y-linked genes such as the Maleness factor, could help in designing novel and improved sexing methods in combination with transgenesis, aiming to confer conditional female-specific lethality or female-to-male sexual reversal. We used a combination of Suppression Subtractive Hybridization (SSH), Mirror Orientation Selection (MOS) and differential screening hybridization (DSH) techniques to isolate mRNAs expressed in XX/XY embryos versus XX-only embryos during a narrow developmental window (8-10 hours after egg laying, AEL). Here we describe a novel strategy we conceived to obtain relatively large amounts of XX-only embryos staged at 8-10 h AEL and thereby extract the few micrograms of poly(A)+ RNA required for this complex technical procedure. The combination of these three techniques led to the identification of a Y-linked putative gene, CcGm2, sharing high sequence identity with a paralogous gene, CcGm1, localized either on an autosome or on the X chromosome. We propose that CcGm2 is a first interesting putative Y-linked gene that could play a role in sex determination. The function exerted by this gene should be investigated with novel genetic tools, such as CRISPR-Cas9, which permit targeting only the Y-linked paralogue while avoiding interference with the function of the autosomal or X-linked paralogue.

Triple quadrupole (QqQ), time-of-flight (TOF) and quadrupole-time-of-flight (QTOF) analysers have been compared for the detection of anabolic steroids in human urine. Ten anabolic steroids were selected as model compounds based on their ionization and the presence of endogenous interferences. Both qualitative and quantitative analyses were evaluated. QqQ allowed for the detection of all analytes at the minimum required performance limit (MRPL) established by the World Anti-Doping Agency (between 2 and 10 ng mL(-1) in urine). TOF and QTOF approaches were not sensitive enough to detect some of the analytes (3'-hydroxy-stanozolol or the metabolites of boldenone and formebolone) at the established MRPL. Although suitable accuracy was obtained, the precision was unsatisfactory (RSD typically higher than 20%) for quantitative purposes irrespective of the analyser used. The methods were applied to 30 real samples declared positive for the misuse of boldenone, stanozolol and/or methandienone. Most of the compounds were detected by every technique; however, QqQ was necessary for the detection of some metabolites in a few samples. Finally, the possibility of detecting non-target steroids was explored using TOF and QTOF. This approach revealed that the presence of boldenone and its metabolite in one sample was due to the intake of androsta-1,4,6-triene-3,17-dione. Additionally, the intake of methandienone was confirmed by the post-target detection of a long-term metabolite.

Porous silica and hybrid silica chromatographic support particles with diameters ranging from approximately 1 μm to 15 μm have been characterized by flow/hyperlayer field-flow fractionation (FFF). The particle size accuracy has been improved significantly in this work by a second-order polynomial calibration. Very good agreement between the FFF data and scanning electron microscopy (SEM) results has been achieved. The effects of particle porosity, pore size, and particle size on the size accuracy of electrical sensing zone (ESZ) analyses are discussed. It has been demonstrated by computer simulation and experimental measurements that false peaks can be generated in certain particle size regions when the static light scattering (SLS) technique is applied to tightly distributed spherical chromatographic support particles.
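The second-order polynomial calibration step can be sketched as follows. The retention values and reference diameters below are invented for illustration (they are not the paper's FFF or SEM data); the idea is simply to fit a quadratic mapping from the FFF retention measure to known particle diameters and then apply it to unknowns:

```python
import numpy as np

# Hypothetical calibration standards: FFF retention values (arbitrary units)
# paired with SEM-certified particle diameters in micrometres.
retention = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 10.0])
diameter_um = np.array([1.1, 2.3, 4.8, 7.0, 9.5, 12.8])

# Second-order polynomial calibration, in place of a conventional
# linear calibration curve.
coeffs = np.polyfit(retention, diameter_um, deg=2)
calib = np.poly1d(coeffs)

# Predict the diameter of an unknown sample from its retention value.
unknown_retention = 5.0
predicted = float(calib(unknown_retention))
print(round(predicted, 2))
```

With real data one would also inspect the fit residuals to decide whether the quadratic term actually improves on a straight line.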

We have studied the titanium K-shell emission spectra from multi-keV x-ray source experiments with hybrid targets on the OMEGA laser facility. Using the collisional-radiative TRANSPEC code, dedicated to K-shell spectroscopy, we reproduced the main features of the detailed spectra measured with the time-resolved MSPEC spectrometer. We have developed a general method to infer the Ne, Te and Ti characteristics of the target plasma from the spectral analysis (ratio of integrated Lyman-α to Helium-α in-band emission and the peak amplitude of individual line ratios) of the multi-keV x-ray emission. These thermodynamic conditions are compared to those calculated independently by the radiation-hydrodynamics transport code FCI2.

Central Thrust is located ˜5 km structurally below the previously mapped locations. Deformation temperature increases up structural section from ˜450°C to ˜650°C and overlaps with peak metamorphic temperature, indicating that the penetrative shearing responsible for the exhumation of the GHS occurred "close" to peak metamorphic conditions. I interpret the telescoping and the inversion of the paleo-isotherms at the base of the GHS as produced mainly by sub-simple shearing (Wm = 0.88-1) pervasively distributed through the lower portion of the GHS. The results are consistent with hybrid channel flow-type models in which the boundary between the lower and upper portions of the GHS, broadly corresponding to the tectono-metamorphic discontinuity recently documented in west Nepal, represents the limit between buried material, affected by dominant simple shearing, and exhumed material, affected by a general flow dominated by pure shearing. This interpretation is consistent with recent models suggesting the simultaneous operation of channel flow- and critical wedge-type processes at different structural depths.

In this work we propose a new hybrid model, a combination of the manifold-learning Principal Components (PC) technique and traditional multiple regression (PC-regression), for short- and medium-term forecasting of daily, aggregated, day-ahead, electricity system-wide load in the Greek Electricity Market for the period 2004–2014. PC-regression is shown to effectively capture the intraday, intraweek and annual patterns of load. We compare our model with a number of classical statistical approaches (Holt-Winters exponential smoothing and its generalizations, the Error-Trend-Seasonal (ETS) models, and the Seasonal Autoregressive Integrated Moving Average with eXogenous variables (SARIMAX) model) as well as with the more sophisticated artificial intelligence models, Artificial Neural Networks (ANN) and Support Vector Machines (SVM). Using a number of criteria for measuring the quality of the generated in- and out-of-sample forecasts, we conclude that the forecasts of our hybrid model outperform those generated by the other models, with the SARIMAX model being the next best performing approach, giving comparable results. Our approach contributes to studies aimed at providing more accurate and reliable load forecasting, a prerequisite for the efficient management of modern power systems.
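A minimal sketch of the PC-regression idea (principal components extracted from lag features, then ordinary least squares on the component scores). The load series here is synthetic, with an invented weekly cycle, standing in for the Greek system-wide load data the paper actually uses:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily load with a weekly pattern plus noise (illustrative only).
t = np.arange(400)
load = 100 + 10 * np.sin(2 * np.pi * t / 7) + rng.normal(0, 1, t.size)

# Design matrix of the previous 14 daily values (lag features).
lags = 14
X = np.column_stack([load[i:i + t.size - lags] for i in range(lags)])
y = load[lags:]

# PC step: project centred lag features onto their leading principal
# components via the SVD, keeping a few components.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3
scores = Xc @ Vt[:k].T

# Regression step: ordinary least squares on the PC scores.
A = np.column_stack([np.ones(len(y)), scores])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ beta

rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
print(round(rmse, 3))
```

Because the weekly cycle lives in a low-dimensional subspace of the lag matrix, a handful of components already forecasts close to the noise floor; real load data would of course need more lags and components.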

The errors on statistics measured in finite galaxy catalogs are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi (1996) is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly nonlinear to weakly nonlinear scales. The final analytic formu...
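The factorial moments at the heart of this error theory are easy to estimate from a counts-in-cells sample. The sketch below uses simulated Poisson counts (not galaxy data): for a Poisson field the k-th factorial moment equals λ^k, so the connected moments vanish, which makes it a convenient sanity check for error estimators of this kind:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative counts-in-cells sample: Poisson-distributed cell counts.
lam = 3.0
counts = rng.poisson(lam, size=200000)

def factorial_moment(sample, k):
    """Mean of the falling factorial N(N-1)...(N-k+1) over the cells."""
    n = sample.astype(float)
    prod = np.ones_like(n)
    for i in range(k):
        prod *= n - i
    return float(prod.mean())

F1 = factorial_moment(counts, 1)   # ~ lam
F2 = factorial_moment(counts, 2)   # ~ lam**2
print(round(F1, 2), round(F2, 2))
```

For clustered (non-Poisson) counts, F2 would exceed F1², and the excess is the second connected moment whose sampling error the cited theory quantifies.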

Global stability analysis cannot be ignored for large-span spatial steel structures. The Beijing Jiangtai Winter Garden adopts a hybrid structure of latticed shell and cables, on which the glass curtain wall is fixed. The reticulated shell is composed of 9 spatial steel arches and ordinary shell members, and can therefore be regarded as an arch-shell hybrid structure. This paper analyzes the stability of the single arch with the longest span in the whole structure, including eigenvalue buckling analysis and nonlinear full-range analysis. The results show that the stability capacity of the single arch meets the requirements, and that nonlinearity has a great influence on the stability of the single-arch structure. In addition, although initial geometric imperfections have a certain effect on the stability of the elastic structure, they have little effect on the elasto-plastic one.

Micellar electrokinetic chromatography fingerprinting combined with quantification was successfully developed and applied to monitor the quality consistency of Weibizhi tablets, a classical compound preparation used to treat gastric ulcers. A background electrolyte composed of 57 mmol/L sodium borate, 21 mmol/L sodium dodecylsulfate and 100 mmol/L sodium hydroxide was used to separate compounds. To optimize the capillary electrophoresis conditions, multivariate statistical analyses were applied. First, the most important factors influencing sample electrophoretic behavior were identified as the background electrolyte concentrations. Then, a Box-Behnken response surface design using the resolution index RF as an integrated response was set up to correlate factors with response. RF reflects the effective signal amount, resolution, and signal homogenization in an electropherogram and was thus regarded as an excellent indicator. In the fingerprint assessments, a simple quantified ratio fingerprint method was established for comprehensive quality discrimination of traditional Chinese medicines/herbal medicines from qualitative and quantitative perspectives, by which the quality of 27 samples from the same manufacturer was well differentiated. In addition, the fingerprint-efficacy relationship between fingerprints and antioxidant activities was established using partial least squares regression, which provided important medicinal efficacy information for quality control. The present study offers an efficient means for monitoring Weibizhi tablet quality consistency.
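A Box-Behnken design for three factors (such as the three electrolyte concentrations) is small enough to generate by hand. The sketch below builds the design in coded units only; the actual factor ranges used in the paper are not reproduced here:

```python
from itertools import combinations

def box_behnken(n_factors, n_center=3):
    """Box-Behnken design in coded units (-1, 0, +1): for each pair of
    factors, run the four +/-1 combinations with every other factor held
    at its centre level, then append centre-point replicates."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * n_factors
                row[i], row[j] = a, b
                runs.append(row)
    runs.extend([[0] * n_factors for _ in range(n_center)])
    return runs

# Three coded factors, e.g. borate, SDS and NaOH concentrations.
design = box_behnken(3)
print(len(design))   # 12 edge runs + 3 centre points = 15
```

Fitting a quadratic response surface of RF to these 15 runs then identifies the factor settings that maximize the integrated response.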

The performance limits were explored for an X-ray diffraction based explosives detection system for baggage scanning. This XDi system offers 4D imaging comprising three spatial dimensions, with voxel sizes on the order of ~(0.5 cm)^3, and one spectral dimension for material discrimination. Because only a very small number of photons are observed for an individual voxel, material discrimination cannot work reliably at the voxel level. Therefore, an initial 3D reconstruction is performed, which allows the identification of objects of interest. By combining all the measured photons that scattered within an object, more reliable spectra are determined at the object level. As a case study we looked at two liquid materials, one threat and one innocuous, with very similar spectral characteristics but with a 15% difference in electron density. Simulations showed that Poisson statistics alone reduce the material discrimination performance to undesirable levels when the photon counts drop to 250. When additional, uncontrolled variation sources are considered, the photon count plays a less dominant role in detection performance, but limits the performance also for photon counts of 500 and higher. Experimental data confirmed the presence of such non-Poisson variation sources in the XDi prototype system as well, which suggests that the present system can still be improved without necessarily increasing the photon flux, but by better controlling and accounting for these variation sources. When the classification algorithm was allowed to use spectral differences in the experimental data, the discrimination between the two materials improved significantly, proving the potential of X-ray diffraction also for liquid materials.
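The Poisson-limited part of this argument can be reproduced with a toy simulation. Assume (as a simplification, not the paper's model) that the 15% electron-density contrast translates into a 15% difference in expected scattered-photon count, and classify objects with a simple midpoint threshold at 250 expected photons:

```python
import math
import random

random.seed(1)

def poisson(lam):
    """Knuth's Poisson sampler; large means are split to avoid exp underflow."""
    if lam > 30:
        half = lam / 2
        return poisson(half) + poisson(lam - half)
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

n_trials = 5000
mu_a, mu_b = 250.0, 250.0 * 1.15      # 15% intensity contrast (assumed proxy)
threshold = (mu_a + mu_b) / 2

errors = 0
for _ in range(n_trials):
    if poisson(mu_a) > threshold:     # innocuous flagged as threat
        errors += 1
    if poisson(mu_b) <= threshold:    # threat passed as innocuous
        errors += 1
misclass = errors / (2 * n_trials)
print(round(misclass * 100, 1), "% misclassified")
```

Even with counting noise alone, roughly one object in eight is misclassified at this photon budget, illustrating why per-voxel discrimination is hopeless and object-level spectra are needed.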

Hybridization has become a central element in theories of animal evolution during the last decade. New methods in population genomics and statistical model testing now allow the disentangling of the complexity that hybridization brings into key evolutionary processes such as local adaptation, colonization of new environments, species diversification and extinction. We evaluated the consequences of hybridization in a complex of three alpine butterflies in the genus Coenonympha, by combining morphological, genetic and ecological analyses. A series of approximate Bayesian computation procedures based on a large SNP data set strongly suggest that the Darwin's Heath (Coenonympha darwiniana) originated through hybridization between the Pearly Heath (Coenonympha arcania) and the Alpine Heath (Coenonympha gardetta) with different parental contributions. As a result of hybridization, the Darwin's Heath presents an intermediate morphology between the parental species, while its climatic niche seems more similar to the Alpine Heath. Our results also reveal a substantial genetic and morphologic differentiation between the two geographically disjoint Darwin's Heath lineages leading us to propose the splitting of this taxon into two different species.
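The approximate Bayesian computation (ABC) step can be illustrated with a toy rejection sampler: infer the parental contribution p of one species to a hybrid lineage from a single summary statistic, the mean derived-allele frequency across unlinked SNPs. All frequencies and counts below are invented for illustration; this is not the Coenonympha analysis itself:

```python
import numpy as np

rng = np.random.default_rng(7)

N_SNPS = 500
freq_a, freq_b = 0.8, 0.2          # derived-allele frequencies in the parents
true_p = 0.65                      # "unknown" contribution of parent A

# Pseudo-observed summary statistic under the true parameter.
observed = rng.binomial(N_SNPS, true_p * freq_a + (1 - true_p) * freq_b) / N_SNPS

# Rejection ABC: draw p from a uniform prior, simulate the summary under
# each draw, keep draws whose summary lands close to the observed one.
draws = rng.uniform(0, 1, 50000)
f = draws * freq_a + (1 - draws) * freq_b
sims = rng.binomial(N_SNPS, f) / N_SNPS
accepted = draws[np.abs(sims - observed) < 0.01]

posterior_mean = float(accepted.mean())
print(round(posterior_mean, 2), accepted.size)
```

Real ABC analyses of SNP data use many summary statistics and far richer simulators, but the accept/reject logic is the same.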

In the field of tissue engineering, adult stem cells are increasingly recognized as an important tool for in vitro reconstructed tissue-engineered grafts. In the world of cell therapies, undoubtedly, mesenchymal stem cells from bone marrow or adipose tissue are the most promising progenitors for tissue engineering applications. In this setting, adipose-derived stem cells (ASCs) are generally similar to those derived from bone marrow and are most conveniently extracted from tissue removed by elective cosmetic liposuction procedures; they also show a great potential for endothelization. The aim of the present work was to investigate how the co-commitment of ASCs into a vascular and bone phenotype could be a useful tool for improving the in vitro and in vivo reconstruction of a vascularized bone graft. Human ASCs obtained from abdominoplasty procedures were loaded in a hydroxyapatite clinical-grade scaffold, codifferentiated, and tested for proliferation, cell distribution, and osteogenic and vasculogenic gene expression. The chromosomal stability of the cultures was investigated using the comparative genomic hybridization array for 3D cultures. ASC adhesion, distribution, proliferation, and gene expression not only demonstrated a full osteogenic and vasculogenic commitment in vitro and in vivo, but also showed that endothelization strongly improves their osteogenic commitment. Finally, genetic analyses confirmed that no genomic alterations occur during long-term in vitro culture of ASCs in 3D scaffolds.

This paper describes three statistical sampling approaches for regional soil monitoring, a design-based, a model-based and a hybrid approach. In the model-based approach a space-time model is exploited to predict global statistical parameters of interest such as the space-time mean. In the hybrid

Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.

Technology was identified which will enable application of hybrid propulsion to manned and unmanned space launch vehicles. Two design concepts are proposed. The first is a hybrid propulsion system using the classical method of regression (classical hybrid) resulting from the flow of oxidizer across a fuel grain surface. The second system uses a self-sustaining gas generator (gas generator hybrid) to produce a fuel-rich exhaust that is mixed with oxidizer in a separate combustor. Both systems offer cost and reliability improvements over the existing solid rocket booster and proposed liquid boosters. The designs were evaluated using life cycle cost and reliability. The program consisted of: (1) identification and evaluation of candidate oxidizers and fuels; (2) preliminary evaluation of booster design concepts; (3) preparation of a detailed point design including life cycle cost and reliability analyses; (4) identification of those hybrid-specific technologies needing improvement; and (5) preparation of a technology acquisition plan and large-scale demonstration plan.

Algebraic statistics brings together ideas from algebraic geometry, commutative algebra, and combinatorics to address problems in statistics and its applications. Computer algebra provides powerful tools for the study of algorithms and software. However, these tools are rarely prepared to address statistical challenges, and therefore new algebraic results often need to be developed. This interplay between algebra and statistics enriches both disciplines. Algebraic statistics is a relativ...

Hybrid magnet theory as applied to the error analyses used in the design of Advanced Light Source (ALS) insertion devices is reviewed. Sources of field errors in hybrid insertion devices are discussed.

This short commentary discusses Biomarkers' requirements for the reporting of statistical analyses in submitted papers. It is expected that submitters will follow the general instructions of the journal, the more detailed guidance given by the International Committee of Medical Journal Editors, the specific guidelines developed by the EQUATOR network, and those of various specialist groups. Biomarkers expects that the study design and subsequent statistical analyses are clearly reported and that the data reported can be made available for independent assessment. The journal recognizes that there is continuing debate about different approaches to statistical science. Biomarkers appreciates that the field continues to develop rapidly and encourages the use of new methodologies.

We review the status of hybrid baryons. The only known way to study hybrids rigorously is via excited adiabatic potentials. Hybrids can be modelled by both the bag and flux-tube models. The low-lying hybrid baryon is N 1/2^+ with a mass of 1.5-1.8 GeV. Hybrid baryons can be produced in the glue-rich processes of diffractive gamma N and pi N production, Psi decays and p pbar annihilation.

This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their 'public relations' for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford's law, and 1/f noise.
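A harmonic Poisson process is straightforward to simulate: with intensity c/x on the positive half-line, the number of points falling in [a, b] is Poisson with mean c·ln(b/a), and each point can be drawn by inverse transform as x = a·(b/a)^U. The sketch below (with an arbitrary c) also demonstrates the scale invariance this implies, since the mean count depends only on the ratio b/a:

```python
import math
import random

random.seed(3)

def poisson(lam):
    """Knuth's Poisson sampler (adequate for small means)."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

def harmonic_points(c, a, b):
    """Sample a harmonic Poisson process with intensity c/x on [a, b]:
    Poisson(c * ln(b/a)) points, each drawn from density proportional
    to 1/x via x = a * (b/a)**U."""
    n = poisson(c * math.log(b / a))
    return sorted(a * (b / a) ** random.random() for _ in range(n))

c = 2.0
m1 = sum(len(harmonic_points(c, 1, 10)) for _ in range(5000)) / 5000
m2 = sum(len(harmonic_points(c, 100, 1000)) for _ in range(5000)) / 5000
print(round(m1, 2), round(m2, 2))   # both near 2 * ln(10)
```

The intervals [1, 10] and [100, 1000] span the same ratio, so their empirical mean counts agree, which is the scale invariance that connects harmonic statistics to Pareto laws, Benford's law and 1/f noise.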

An important factor in evaluating an agricultural system's sustainability is the monitoring of soil quality via its physical attributes. Physical attributes of soil, such as soil penetration resistance, can be used to monitor and evaluate soil quality. Artificial Neural Networks (ANN) have been employed to solve many problems in agriculture, and this technique can be considered an alternative approach for predicting penetration resistance from the soil's basic properties, such as bulk density and water content. The aim of this work is to analyze the behavior of soil penetration resistance, measured from the cone index, under different levels of bulk density and water content using statistical analyses, specifically regression analysis and ANN modeling. Both techniques show that soil penetration resistance is associated with soil bulk density and water content. The regression analysis presented a determination coefficient of 0.92 and an RMSE of 0.951, while the ANN model presented a determination coefficient of 0.98 and an RMSE of 0.084. The results show that the ANN modeling produced better results than the mathematical model obtained from regression analysis.
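The regression step can be sketched as a multiple linear fit of penetration resistance on bulk density and water content. The data and coefficients below are synthetic stand-ins (invented for the sketch, not the paper's measurements), chosen only so that resistance rises with density and falls with water content:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic soil data (illustrative units: Mg/m^3, kg/kg, MPa).
n = 200
bulk_density = rng.uniform(1.1, 1.7, n)
water_content = rng.uniform(0.05, 0.35, n)
resistance = (2.0 * bulk_density - 4.0 * water_content
              + rng.normal(0, 0.1, n))

# Multiple linear regression via least squares.
X = np.column_stack([np.ones(n), bulk_density, water_content])
beta, *_ = np.linalg.lstsq(X, resistance, rcond=None)
pred = X @ beta

ss_res = float(np.sum((resistance - pred) ** 2))
ss_tot = float(np.sum((resistance - resistance.mean()) ** 2))
r2 = 1 - ss_res / ss_tot
print(round(r2, 3))
```

An ANN with one hidden layer, fitted to the same inputs, would be the nonlinear counterpart the paper finds more accurate; the linear fit above is the baseline it is compared against.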

This study assessed the feasibility of Concentrated Solar Power (CSP) plants in Northeast Brazil. It focused on parabolic trough solar power plants, the most mature CSP technology, and evaluated plants rated at 100 MWe, with dry cooling systems (due to the low water availability in the Northeast), with and without hybridization based on natural gas (degree of hybridization varying from 25 to 75%). The capacity factor of the simulated plants hovered between 23 and 98%, according to the degree of hybridization and the choice of thermodynamic cycle for the natural-gas-fueled thermal system: Rankine or combined cycle. The CSP plants were simulated at Bom Jesus da Lapa, in the semi-arid region of Bahia. Given the prospects for natural gas resources in the Sao Francisco Basin, different scenarios for gas prices were tested. Moreover, two scenarios were tested for the cost of the CSP plants, one based on the current financial environment and the other based on incentive policies, such as fiscal incentives and loans. Findings show that while the levelized cost of electricity (LCOE) of solar-only plants hovered around 520 R$/MWh, hybrid plants may reach an LCOE of 140 to 190 R$/MWh. This study therefore proposes incentive policies to promote increasing investment in hybrid CSP plants. (author)
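The LCOE comparison rests on a standard levelized-cost calculation: annualize the capital cost with a capital recovery factor, add yearly O&M and fuel, and divide by yearly energy. All input figures below are invented for illustration; the study's actual CAPEX, O&M, fuel and financing assumptions are not reproduced here:

```python
def crf(rate, years):
    """Capital recovery factor: annualizes an upfront investment."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

# Hypothetical inputs for a 100 MWe solar-only parabolic trough plant.
capex = 900e6          # R$, overnight capital cost (invented)
om = 9e6               # R$/year, fixed O&M (invented)
fuel = 0.0             # R$/year; > 0 for natural-gas hybridized plants
discount = 0.10        # discount rate
lifetime = 25          # years

capacity_mw = 100.0
capacity_factor = 0.25                      # low end of the study's 23-98% range
energy_mwh = capacity_mw * capacity_factor * 8760

lcoe = (capex * crf(discount, lifetime) + om + fuel) / energy_mwh
print(round(lcoe))     # R$/MWh -> 494 with these invented inputs
```

Hybridization raises both the fuel term and the capacity factor; because the denominator grows faster than the fuel cost under the study's scenarios, the hybrid LCOE drops sharply, which is the 520 versus 140-190 R$/MWh contrast reported above.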

This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i

Histoplasmosis Statistics: How common is histoplasmosis? In the United States, an estimated 60% to ...

A new edition of the trusted guide on commonly used statistical distributions Fully updated to reflect the latest developments on the topic, Statistical Distributions, Fourth Edition continues to serve as an authoritative guide on the application of statistical methods to research across various disciplines. The book provides a concise presentation of popular statistical distributions along with the necessary knowledge for their successful use in data modeling and analysis. Following a basic introduction, forty popular distributions are outlined in individual chapters that are complete with re

It is well established that related species hybridize and that this can have varied but significant effects on speciation and environmental adaptation. It should therefore come as no surprise that hybridization is not limited to species that are alive today. In the last several decades, advances in technologies for recovering and sequencing DNA from fossil remains have enabled the assembly of high-coverage genome sequences for a growing diversity of organisms, including many that are extinct. Thanks to the development of new statistical approaches for detecting and quantifying admixture from genomic data, genomes from extinct populations have proven useful both in revealing previously unknown hybridization events and informing the study of hybridization between living organisms. Here, we review some of the key recent statistical innovations for detecting ancient hybridization using genomewide sequence data and discuss how these innovations have revised our understanding of human evolutionary history.
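One of the key statistics for detecting ancient admixture from genome-wide data is Patterson's D (the ABBA-BABA test): for populations (((P1, P2), P3), outgroup), incomplete lineage sorting produces ABBA and BABA site patterns equally often, while gene flow between P2 and P3 skews the balance. The sketch below simulates site patterns directly under that assumption rather than deriving them from real genomes:

```python
import random

random.seed(5)

def simulate_sites(n_sites, admixture=0.0):
    """Count ABBA and BABA biallelic site patterns (0 = ancestral,
    1 = derived; the outgroup is implicitly ancestral)."""
    abba = baba = 0
    for _ in range(n_sites):
        if random.random() < admixture:
            # Introgressed site: P2 shares the derived allele with P3 -> ABBA.
            p1, p2, p3 = 0, 1, 1
        else:
            # Incomplete lineage sorting: ABBA and BABA equally likely.
            if random.random() < 0.5:
                p1, p2, p3 = 0, 1, 1
            else:
                p1, p2, p3 = 1, 0, 1
        if (p1, p2, p3) == (0, 1, 1):
            abba += 1
        else:
            baba += 1
    return abba, baba

abba, baba = simulate_sites(100000, admixture=0.1)
D = (abba - baba) / (abba + baba)
print(round(D, 3))    # positive D signals P2-P3 gene flow
```

With 10% introgressed sites, D lands near 0.1; a block-jackknife over the genome would then supply the standard error used to test D = 0 in real analyses.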

Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.

In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
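A modern, assumption-light cousin of the Fisher/Neyman-Pearson machinery described above is the permutation test: shuffle the group labels many times and ask how often a difference as large as the observed one arises by chance. The two samples below are invented for illustration:

```python
import random
import statistics

random.seed(11)

# Hypothetical measurements for two groups.
group_a = [5.1, 4.8, 5.6, 5.0, 5.3, 4.9, 5.4, 5.2]
group_b = [4.5, 4.7, 4.4, 4.9, 4.6, 4.3, 4.8, 4.5]

observed = statistics.mean(group_a) - statistics.mean(group_b)

pooled = group_a + group_b
n_a = len(group_a)
n_perm = 10000
extreme = 0
for _ in range(n_perm):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[:n_a]) - statistics.mean(pooled[n_a:])
    if abs(diff) >= abs(observed):   # two-sided test
        extreme += 1

p_value = extreme / n_perm
print(p_value)
```

The p-value is simply the fraction of relabelings at least as extreme as the data, so the logic of "decide by comparing against a null distribution" is laid bare without any normality assumption.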

The purpose of this study was to analyze the available statistics concerning teachers in schools of general education in the Federal Republic of Germany. An analysis of the demographic structure of the pool of full-time teachers showed that in 1971 30 percent of the teachers were under age 30, and 50 percent were under age 35. It was expected that…

The reasons for adopting hybrid vehicles stem mainly from the lack of adequate range in electric vehicles at an acceptable cost. Hybrids can offer significant improvements in emissions and fuel economy. Series and parallel hybrids are compared; a combination of series and parallel operation would be ideal. This can be obtained using a planetary gearbox as a power-split device, allowing a small generator to transfer power to the propulsion motor and giving the effect of a CVT. It allows the engine to run at semi-constant speed, giving better fuel economy and reduced emissions. Hybrid car developments are described that show the wide range of possible hybrid systems. (author)

Improper uses of elementary statistics often observed in beginners' manuscripts and papers were collected, and better alternatives are suggested. The paper consists of three parts: descriptive statistics, multivariate analyses, and statistical tests.

This paper formulates a new approach to complex fluid dynamics, which accounts for microscopic statistical effects in the micromotion. While the ordinary fluid variables (mass density and momentum) undergo usual dynamics, the order parameter field is replaced by a statistical distribution on the order parameter space. This distribution depends also on the point in physical space and its dynamics retains the usual fluid transport features while containing the statistical information on the order parameter space. This approach is based on a hybrid moment closure for Yang-Mills Vlasov plasmas, which replaces the usual cold-plasma assumption. After presenting the basic properties of the hybrid closure, such as momentum map features, singular solutions and Casimir invariants, the effect of Yang-Mills fields is considered and a direct application to ferromagnetic fluids is presented. Hybrid models are also formulated for complex fluids with symmetry breaking. For the special case of liquid crystals, a hybrid formul...

Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions. It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,

Back by popular demand, the JSC Biostatistics Laboratory and LSAH statisticians are offering an opportunity to discuss your statistical challenges and needs. Take the opportunity to meet the individuals offering expert statistical support to the JSC community. Join us for an informal conversation about any questions you may have encountered with issues of experimental design, analysis, or data visualization. Get answers to common questions about sample size, repeated measures, statistical assumptions, missing data, multiple testing, time-to-event data, and when to trust the results of your analyses.

Distinguishing between hybrid introgression and incomplete lineage sorting as causes of topological incongruence among gene trees requires statistical approaches based on biologically relevant models. Such study is especially challenging in hybrid systems where the usual vectors mediating interspecific gene transfer, hybrids with Mendelian heredity, are absent or unknown. Here we study a complex of hybridizing species known to produce clonal hybrids, to discover how one of the species, Cobitis tanaitica, has achieved a mito-nuclear mosaic genome over its whole geographic range. We applied three distinct methods, including one using solely the information on gene tree topologies, and found that the contrasting mito-nuclear signal might not have resulted from the retention of ancestral polymorphism. Instead, we found two signs of hybridization events related to C. tanaitica: one concerning nuclear gene flow and the other suggesting mitochondrial capture. Interestingly, the clonal inheritance (gynogenesis) of contemporary hybrids prevents genomic introgressions, and non-clonal hybrids are either absent or too rare to be detected among European Cobitis. Our analyses therefore suggest that introgressive hybridizations were rather old episodes, mediated by previously existing hybrids whose inheritance was not entirely clonal. The Cobitis complex thus supports the view that the type of resulting hybrids depends on the level of genomic divergence between sexual species.

In this revised text, master expositor Sheldon Ross has produced a unique work in introductory statistics. The text's main merits are the clarity of presentation, contemporary examples and applications from diverse areas, and an explanation of intuition and ideas behind the statistical methods. To quote from the preface, "It is only when a student develops a feel or intuition for statistics that she or he is really on the path toward making sense of data." Ross achieves this goal through a coherent mix of mathematical analysis, intuitive discussions and examples. Ross's clear writin

In this 3rd edition revised text, master expositor Sheldon Ross has produced a unique work in introductory statistics. The text's main merits are the clarity of presentation, contemporary examples and applications from diverse areas, and an explanation of intuition and ideas behind the statistical methods. Concepts are motivated, illustrated and explained in a way that attempts to increase one's intuition. To quote from the preface, "It is only when a student develops a feel or intuition for statistics that she or he is really on the path toward making sense of data." Ross achieves this

Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics. Designed for

Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co

The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.

This study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work within constructivism and the field of Science and Technology Studies (STS). The result of this approach, here termed reversible statistics, reconstructs the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader, and I argue that such a relation can be achieved...

In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
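One simple statistic of this kind is the Shannon entropy of an image's intensity histogram; the sketch below illustrates the idea on made-up images (the function name, the synthetic images, and the ranking step are assumptions for illustration, not the record's actual method).

```python
# Sketch: rank images by a simple information statistic (histogram entropy).
import numpy as np

def image_entropy(img, bins=256):
    """Shannon entropy (in bits) of an image's grey-level histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins before taking logs
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
flat = np.full((64, 64), 128)                 # uniform grey image: ~0 bits
noisy = rng.integers(0, 256, size=(64, 64))   # uniform noise: close to 8 bits
images = {"flat": flat, "noisy": noisy}

# keep the most "informative" images first
ranked = sorted(images, key=lambda k: image_entropy(images[k]), reverse=True)
print(ranked)
```

Near-uniform frames rank last, so a pipeline could discard them before any human review.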

This monograph presents a mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics that until now have remained unsolved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...

Hybrid automata have been proposed as a language for modelling and analysing the interaction of digital and analogue dynamics in embedded computer systems. In the paper, hybrid automata are studied from a dynamical systems perspective. Extending earlier work on conditions for existence and uniqueness of executions of hybrid automata, we characterise a class of hybrid automata whose executions depend continuously on the initial state. The continuity conditions are subsequently used to derive a...

This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing…

Microsponges drug delivery system (MDDC) was prepared by double emulsion-solvent-diffusion technique using rotor-stator homogenization. Quality by design (QbD) concept was implemented for the development of MDDC with potential to be incorporated into semisolid dosage form (gel). Quality target product profile (QTPP) and critical quality attributes (CQA) were defined and identified, accordingly. Critical material attributes (CMA) and Critical process parameters (CPP) were identified using quality risk management (QRM) tool, failure mode, effects and criticality analysis (FMECA). CMA and CPP were identified based on results obtained from principal component analysis (PCA-X&Y) and partial least squares (PLS) statistical analysis along with literature data, product and process knowledge and understanding. FMECA identified amount of ethylcellulose, chitosan, acetone, dichloromethane, span 80, tween 80 and water ratio in primary/multiple emulsions as CMA and rotation speed and stirrer type used for organic solvent removal as CPP. The relationship between identified CPP and particle size as CQA was described in the design space using design of experiments - one-factor response surface method. Obtained results from statistically designed experiments enabled establishment of mathematical models and equations that were used for detailed characterization of influence of identified CPP upon MDDC particle size and particle size distribution and their subsequent optimization.

We derive and implement a general method to characterize the nonclassicality in compound discrete- and continuous-variable systems. For this purpose, we introduce the operational notion of conditional hybrid nonclassicality, which relates to the ability to produce a nonclassical continuous-variable state by projecting onto a general superposition of the discrete-variable subsystem. We discuss the importance of this form of quantumness in connection with interfaces for quantum communication. To verify the conditional hybrid nonclassicality, a matrix version of a nonclassicality quasiprobability is derived and its sampling approach is formulated. We experimentally generate an entangled, hybrid Schrödinger cat state, using a coherent photon-addition process acting on two temporal modes, and we directly sample its nonclassicality quasiprobability matrix. The introduced conditional quantum effects are certified with high statistical significance.

This book on statistical mechanics is self-sufficient and written in a lucid manner, keeping in mind the examination system of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. Non-interacting ideal Bose and Fermi gases are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, and transport phenomena such as thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity and diffusion are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with a detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...

The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

In this paper, we asymptotically analyse a new class of LDPC codes, called non-binary hybrid LDPC codes, which has recently been introduced. We use density evolution techniques to derive a stability condition for hybrid LDPC codes and prove their threshold behavior. We study this stability condition to establish the asymptotic advantages of hybrid LDPC codes over their non-hybrid counterparts.

All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep

Statistical Methods, 3e provides students with a working introduction to statistical methods offering a wide range of applications that emphasize the quantitative skills useful across many academic disciplines. This text takes a classic approach emphasizing concepts and techniques for working out problems and interpreting results. The book includes research projects, real-world case studies, numerous examples and data exercises organized by level of difficulty. This text requires that a student be familiar with algebra. New to this edition: NEW expansion of exercises a

Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody

The combined effect of cytoplasmic male sterility and xenia on maize hybrid traits is referred to as the plus-hybrid effect. The two ZP hybrids studied responded differently to this effect for grain yield. All plus-hybrid combinations of the first hybrid had a higher yield than their fertile counterparts, though not significantly, while only one combination of the second hybrid responded positively, also without statistical significance. It seems that the observed effect mostly depended on the genotype of the female component.

For the years 1997 and 1998, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1997, Statistics Finland, Helsinki 1998, ISSN 0784-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: changes in the volume of GNP and energy consumption; changes in the volume of GNP and electricity; coal consumption; natural gas consumption; peat consumption; domestic oil deliveries; import prices of oil; consumer prices of principal oil products; fuel prices for heat production; fuel prices for electricity production; carbon dioxide emissions; total energy consumption by source and CO2 emissions; electricity supply; energy imports by country of origin in January-September 1998; energy exports by recipient country in January-September 1998; consumer prices of liquid fuels; consumer prices of hard coal, natural gas and indigenous fuels; average electricity price by type of consumer; price of district heating by type of consumer; excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and energy taxes and precautionary stock fees, pollution fees on oil products

For the years 1997 and 1998, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1996, Statistics Finland, Helsinki 1997, ISSN 0784-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: changes in the volume of GNP and energy consumption; changes in the volume of GNP and electricity; coal consumption; natural gas consumption; peat consumption; domestic oil deliveries; import prices of oil; consumer prices of principal oil products; fuel prices for heat production; fuel prices for electricity production; carbon dioxide emissions; total energy consumption by source and CO2 emissions; electricity supply; energy imports by country of origin in January-June 1998; energy exports by recipient country in January-June 1998; consumer prices of liquid fuels; consumer prices of hard coal, natural gas and indigenous fuels; average electricity price by type of consumer; price of district heating by type of consumer; excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and energy taxes and precautionary stock fees, pollution fees on oil products

Distant hybridization between a new type of cytoplasmic male sterile line (Eru CMS) of Brassica napus (2n = 4x = 38) and Purple-Caitai (B. campestris L. ssp. chinensis L. var. utilis Tsen et Lee, 2n = 2x = 20) was carried out, with Purple-Caitai used as the recurrent parent for successive backcrossing. Agronomic and quality traits of the hybrids and backcross progenies of Brassica napus × Purple-Caitai were analyzed. The results showed that the donor and recipient parents exhibited clear nucleus replacement during the backcross transfer; the agronomic and quality traits of the hybrid and backcross progenies fluctuated to some extent, and the higher the generation of the material, the more similar it was to the recurrent parent, while lower generations showed a wider range of genetic variation.

This paper focuses on an analysis of pedestrian and motorists' actions at sites with pedestrian hybrid beacons and assesses their effectiveness in improving pedestrian safety. Descriptive and statistical analyses (one-tail two-sample T-test and two-proportion Z-test) were conducted using field data collected during morning and evening peak hours at three study sites in the city of Charlotte, NC, before and after the installation of pedestrian hybrid beacons. Further, an analysis was conducted to assess the change in pedestrian and motorists' actions over time (before the installation; 1 month, 3 months, 6 months, and 12 months after the installation). Results showed an increase in average traffic speed at one of the pedestrian hybrid beacon sites, while no specific trends were observed at the other two sites. Decreases in the number of motorists not yielding to pedestrians, pedestrians trapped in the middle of the street, and pedestrian-vehicle conflicts were observed at all three sites. The installation of pedestrian hybrid beacons did not have a negative effect on pedestrian actions at two of the three sites. Improvements seem to be relatively more consistent 3 months after the installation of the pedestrian hybrid beacon.
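The two tests named in this record can be sketched as follows. The speed and yielding numbers below are synthetic stand-ins, not the study's field data, and SciPy is assumed for the t-test and the normal CDF.

```python
# Sketch of a one-tail two-sample T-test and a two-proportion Z-test
# on hypothetical before/after installation data.
import math
import numpy as np
from scipy import stats

# One-tail two-sample T-test on average traffic speeds (mph), made-up values.
speeds_before = np.array([31.2, 29.8, 30.5, 32.1, 30.9, 31.7])
speeds_after = np.array([32.4, 31.9, 33.0, 32.7, 31.5, 33.2])
t_stat, p_two_sided = stats.ttest_ind(speeds_after, speeds_before)
p_one_tail = p_two_sided / 2 if t_stat > 0 else 1 - p_two_sided / 2
print(f"t = {t_stat:.2f}, one-tail p = {p_one_tail:.4f}")

# Two-proportion Z-test on motorists not yielding, made-up counts.
x1, n1 = 48, 200   # before: 48 of 200 motorists failed to yield
x2, n2 = 29, 210   # after:  29 of 210 motorists failed to yield
p_pool = (x1 + x2) / (n1 + n2)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (x1 / n1 - x2 / n2) / se
p_val = 2 * (1 - stats.norm.cdf(abs(z)))
print(f"z = {z:.2f}, p = {p_val:.4f}")
```

With these illustrative numbers both tests reject at the 5% level; the study's conclusions of course rest on its own field counts.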

Hybridization is an important evolutionary force, because interspecific gene transfer can introduce more new genetic material than is directly generated by mutations. Pinus engelmannii Carr. is one of the nine most common pine species in the pine-oak forest ecoregion of the state of Durango, Mexico. This species is widely harvested for lumber and is also used in reforestation programmes. Interspecific hybrids between P. engelmannii and Pinus arizonica Engelm. have been detected by morphological analysis. The presence of hybrids in P. engelmannii seed stands may affect seed quality and reforestation success. Therefore, the goals of this research were to identify introgressive hybridization between P. engelmannii and other pine species in eight seed stands of this species in Durango, Mexico, and to examine how the hybrid proportion is related to the mean genetic dissimilarity between trees in these stands, using Amplified Fragment Length Polymorphism (AFLP) markers and morphological traits. Differences in the average current annual increment of putative hybrids and pure trees were also tested for statistical significance. Morphological and genetic analyses of 280 adult trees were carried out. Putative hybrids were found in all the seed stands studied. The hybrids did not differ from the pure trees in vigour or robustness. All stands with putative P. engelmannii hybrids detected by both AFLPs and morphological traits showed the highest average values of the Tanimoto distance, which indicates: (i) more heterogeneous genetic material, (ii) higher genetic variation and therefore (iii) higher evolutionary potential of these stands, and (iv) that the morphological differentiation (hybrid/non-hybrid) is strongly associated with the Tanimoto distance per stand. We conclude that natural pairwise hybrids are very common in the studied stands. Both morphological and molecular approaches are necessary to confirm the genetic identity of forest reproductive material.

This book, designed for students taking a basic introductory course in statistical analysis, is far more than just a book of tables. Each table is accompanied by a careful but concise explanation and useful worked examples. Requiring little mathematical background, Elementary Statistics Tables is thus not just a reference book but a positive and user-friendly teaching and learning aid. The new edition contains a new and comprehensive "teach-yourself" section on a simple but powerful approach, now well-known in parts of industry but less so in academia, to analysing and interpreting process dat

Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations

The existence of an arc statistics problem has been at the center of a strong debate over the last fifteen years. With the aim of clarifying whether the optical depth for giant gravitational arcs produced by galaxy clusters in the so-called concordance model is compatible with observations, several studies were carried out which helped to significantly improve our knowledge of strong-lensing clusters, unveiling their extremely complex internal structure. In particular, the abundance and frequency of strong-lensing events such as gravitational arcs turned out to be a potentially very powerful tool for tracing structure formation. However, given the limited size of observational and theoretical data-sets, the power of arc statistics as a cosmological tool has so far been only minimally exploited. On the other hand, the last years were characterized by significant advancements in the field, and several cluster surveys that are ongoing or planned for the near future seem to have the potential to make arc statistics a competitive cosmo...

This book offers a comprehensive approach to multivariate statistical analyses. It provides theoretical knowledge of the concepts underlying the most important multivariate techniques and an overview of actual applications.

The main goal of this book is to provide a state of the art of hybrid metaheuristics. The book provides a complete background that enables readers to design and implement hybrid metaheuristics to solve complex optimization problems (continuous/discrete, mono-objective/multi-objective, optimization under uncertainty) in a diverse range of application domains. Readers learn to solve large scale problems quickly and efficiently combining metaheuristics with complementary metaheuristics, mathematical programming, constraint programming and machine learning. Numerous real-world examples of problems and solutions demonstrate how hybrid metaheuristics are applied in such fields as networks, logistics and transportation, bio-medical, engineering design, scheduling.

This paper aims to introduce the most commonly applied seasonal adjustment programs for time series, developed by official statistical agencies. These programs fall into two main groups. One is the CENSUS II X-11 family, first developed by the NBER, which uses moving-average filters; this family includes the X-11 ARIMA and X-12 ARIMA techniques. The other is the TRAMO/SEATS program, a model-based approach developed by the Central Bank of Spain. The paper discusses the seasonal decomposition procedures of these techniques, the special effects they handle, such as trading-day and calendar effects, their advantages and disadvantages, and their forecasting performance.
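As a rough illustration of the moving-average idea behind the CENSUS II X-11 family (a toy additive decomposition on synthetic monthly data, not X-11 itself), one can estimate the trend with the classical centered 2x12 filter and read off seasonal factors from the detrended series:

```python
# Toy seasonal decomposition: centered 2x12 moving average for the trend,
# monthly averages of the detrended series for the seasonal factors.
import numpy as np

rng = np.random.default_rng(1)
n = 72                                   # six years of synthetic monthly data
t = np.arange(n)
series = 100 + 0.3 * t + 5 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, n)

# 2x12 centered moving average: half weight on the end months, so the
# 13-term window spans exactly one seasonal period.
w = np.ones(13)
w[0] = w[-1] = 0.5
w /= 12
trend = np.convolve(series, w, mode="valid")   # length n - 12
detrended = series[6:-6] - trend

# average the detrended values per calendar month -> additive seasonal factors
months = t[6:-6] % 12
factors = np.array([detrended[months == m].mean() for m in range(12)])
factors -= factors.mean()                      # normalise factors to sum to ~0
print(np.round(factors, 1))
```

The recovered factors approximate the injected sinusoidal seasonal pattern; X-11 itself iterates this idea with more refined Henderson and seasonal filters plus outlier handling.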

Finite element analyses of the hysteretic behavior of steel coupling beam-column connections with welded steel boundary elements in a hybrid coupled shear wall (HCW) system were performed using the FEM software ABAQUS, in order to obtain the seismic performance of the hybrid coupled shear wall system under low cyclic loading. The finite element results agree well with the experimental results. Both the finite element analysis and the experiments indicate that the hysteretic curves of the connections are full, and that the ductility factor and ultimate bearing capacity are high, showing that this kind of connection has good seismic performance and is suitable for use in high-rise buildings in high-intensity earthquake areas.

In 1975 John Tukey proposed a multivariate median: the 'deepest' point in a given data cloud in R^d. Later, in measuring the depth of an arbitrary point z with respect to the data, David Donoho and Miriam Gasko considered hyperplanes through z and determined its 'depth' by the smallest portion of data separated by such a hyperplane. Since then, these ideas have proved extremely fruitful. A rich statistical methodology has developed that is based on data depth and, more general...
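The Donoho-Gasko definition can be sketched directly in the planar case: sweep lines through z and take the smaller closed half-plane's share of the data. The brute-force direction scan below is an illustrative approximation, not a production depth algorithm.

```python
# Brute-force approximation of Tukey's halfspace depth in R^2:
# depth(z) = min over halfplanes through z of the fraction of data inside.
import numpy as np

def tukey_depth(z, data, n_directions=360):
    """Approximate halfspace depth of point z w.r.t. a 2-D data cloud."""
    z = np.asarray(z, dtype=float)
    data = np.asarray(data, dtype=float)
    angles = np.linspace(0.0, np.pi, n_directions, endpoint=False)
    min_count = len(data)
    for a in angles:
        u = np.array([np.cos(a), np.sin(a)])   # unit normal of the line
        proj = (data - z) @ u
        # count points in each closed halfplane bounded by the line through z
        min_count = min(min_count, int(np.sum(proj >= 0)), int(np.sum(proj <= 0)))
    return min_count / len(data)

pts = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [0.5, 0.5]])
print(tukey_depth([0.5, 0.5], pts))   # central point: high depth
print(tukey_depth([0, 0], pts))       # corner point: low depth
```

The 'deepest' point, i.e. the Tukey median, is then the maximizer of this depth over candidate points.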

In recent years, statistical mechanics has been increasingly recognized as a central domain of mathematics. Major developments include the Schramm-Loewner evolution, which describes two-dimensional phase transitions, random matrix theory, renormalization group theory and the fluctuations of random surfaces described by dimers. The lectures contained in this volume present an introduction to recent mathematical progress in these fields. They are designed for graduate students in mathematics with a strong background in analysis and probability. This book will be of particular interest to graduate students and researchers interested in modern aspects of probability, conformal field theory, percolation, random matrices and stochastic differential equations.

Different types of fibers have been added to acrylic resin materials to improve their mechanical properties. The purpose of this study was to determine the transverse strength of hybrid acrylic resins after glass fiber reinforcement by different methods. The study used rectangular specimens 65 mm in length, 10 mm in width and 2.5 mm in thickness. There were three groups of 6 specimens each: hybrid acrylic resin without glass fiber (control); glass fibers dipped in methyl methacrylate monomer for 15 minutes before being reinforced into the hybrid acrylic resin (first method); and glass fibers reinforced into a mixture of polymer powder and monomer liquid directly after the hybrid acrylic resin was mixed (second method). All specimens were cured for 20 minutes at 100 °C. Transverse strength was measured using an Autograph. Statistical analyses using one-way ANOVA and an LSD test showed significant differences in transverse strength (p < 0.05) among the groups. The mean transverse strengths were 94.94, 118.27 and 116.34 MPa, meaning that glass fiber reinforcement of the hybrid acrylic resin enhanced its transverse strength compared with the control. The two reinforcement methods, however, did not differ in the transverse strength they produced.
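The ANOVA step in this record can be sketched as follows; the strength values are made up to sit near the reported group means (94.94, 118.27 and 116.34 MPa) and are not the study's data (SciPy's `f_oneway` is assumed).

```python
# Sketch: one-way ANOVA over three groups of transverse-strength values (MPa).
from scipy import stats

control = [93.1, 96.0, 94.4, 95.8, 94.2, 96.1]        # no glass fibre
method1 = [117.5, 119.0, 118.8, 117.1, 118.9, 118.3]  # monomer-dipped fibres
method2 = [115.6, 117.2, 116.0, 116.9, 115.4, 116.9]  # fibres mixed in directly

f_stat, p_value = stats.f_oneway(control, method1, method2)
print(f"F = {f_stat:.1f}, p = {p_value:.2e}")
# A significant F would be followed by pairwise comparisons, e.g. an LSD test.
```

With group means this far apart relative to the within-group spread, the overall F-test is significant, matching the record's reported p < 0.05.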

I introduce the concept of hybrid intermediaries: financial conglomerates that control a multiplicity of entity types active in the "assembly line" process of modern financial intermediation, a system that has become known as shadow banking. The complex bank holding companies of today are the best example of hybrid intermediaries, but I argue that financial firms from the "nonbank" space can just as easily evolve into conglomerates with similar organizational structure, thus acquiring the cap...

effect was observed for the elongation at break of the hybrid composites. The impact strength of the hybrid composites increased with the addition of glass fibres. The tensile and impact properties of thermoplastic natural rubber reinforced short... panels made from conventional structural materials. Figure 3 illustrates the performance of cellular biocomposite panels against conventional systems used for building and residential construction, namely a pre-cast pre-stressed hollow core concrete...

The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better

Context: Several hundred candidate hybrid pulsators of type A-F have been identified from space-based observations. Their large number allows both statistical analyses and detailed investigations of individual stars. This offers the opportunity to study the full interior of the genuine hybrids, in which both low-radial-order p- and high-order g-modes are self-excited at the same time. However, a few other physical processes can also be responsible for the observed hybrid nature, related to binarity or to surface inhomogeneities. The finding that most delta Scuti stars also show long-period light variations represents a real challenge for theory. Methods: Fourier analysis of all the available Kepler light curves. Investigation of the frequency and period spacings. Determination of the stellar physical parameters from spectroscopic observations. Modelling of the transit events. Results: The Fourier analysis of the Kepler light curves revealed 55 significant frequencies clustered into two groups, which are separ...

Parametric analyses using a hybrid vehicle synthesis and economics program (HYVELD) are described, investigating the sensitivity of hybrid vehicle cost, fuel usage, utility, and marketability to changes in travel statistics, energy costs, vehicle lifetime and maintenance, owner use patterns, internal combustion engine (ICE) reference vehicle fuel economy, and drive-line component costs and type. The lowest initial cost of the hybrid vehicle would be $1200 to $1500 higher than that of the conventional vehicle. For nominal energy costs ($1.00/gal for gasoline and 4.2 cents/kWh for electricity), the ownership cost of the hybrid vehicle is projected to be 0.5 to 1.0 cents/mi less than that of the conventional ICE vehicle. To attain this ownership cost differential, the lifetime of the hybrid vehicle must be extended to 12 years and its maintenance cost reduced by 25 percent compared with the conventional vehicle. The ownership cost advantage of the hybrid vehicle increases rapidly as the price of fuel increases from $1 to $2/gal.

Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability-techniques (like fault trees...... decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability....

We present a hybrid model for content extraction from HTML documents. The model operates on Document Object Model (DOM) tree of the corresponding HTML document. It evaluates each tree node and associated statistical features like link density and text distribution across the node to predict signi...
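One of the statistical features named above, link density (the fraction of a node's text that sits inside anchor tags), can be computed with the standard library alone. The sketch below is a generic illustration of the feature, not the paper's actual model; the HTML snippets and the "high density means navigation" reading are assumptions.

```python
from html.parser import HTMLParser

class LinkDensity(HTMLParser):
    """Accumulate total text length and the text length inside <a> tags."""
    def __init__(self):
        super().__init__()
        self.in_link = 0      # depth of currently open <a> tags
        self.text_chars = 0   # all visible text characters
        self.link_chars = 0   # text characters inside links

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link += 1

    def handle_endtag(self, tag):
        if tag == "a" and self.in_link:
            self.in_link -= 1

    def handle_data(self, data):
        n = len(data.strip())
        self.text_chars += n
        if self.in_link:
            self.link_chars += n

def link_density(html):
    """Ratio of link text to all text in an HTML fragment (0.0 if empty)."""
    p = LinkDensity()
    p.feed(html)
    return p.link_chars / p.text_chars if p.text_chars else 0.0

nav = '<div><a href="/">Home</a> <a href="/b">Blog</a></div>'
body = '<div>Long article text with one <a href="/x">link</a> here.</div>'
print(link_density(nav))   # close to 1.0: mostly links, likely navigation
print(link_density(body))  # low: mostly prose, likely content
```

A classifier would combine such per-node scores with other features (text distribution, tag statistics) rather than thresholding link density alone.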

During the last two decades the mid-western states of the United States of America have been largely afflicted by heavy flood-producing rainfall. Several of these storms seem to have similar hydrometeorological properties in terms of pattern, track, evolution, life cycle, clustering, etc., which rais...

A method is proposed to classify the regions in the close neighborhood of selected measurements according to the ratio of two radionuclides measured from either a radioactive plume or a deposited radionuclide mixture. The associated locations are then assigned, within the area of interest, a representative ratio class. This method allows a more comprehensive and meaningful understanding of the data sampled following a radiological incident.

Injuries to the neck, or cervical region, are very important since there is a potential risk of damage to the spinal cord. Any neck injury can have devastating if not life-threatening consequences. High-speed transportation as well as leisure-time adventures have increased the number of serious neck injuries and made us increasingly aware of their consequences. Surveillance systems and epidemiological studies are important prerequisites in defining the scope of the problem. The development of mechanica...

This will be a 4-day series of 2-hour sessions as part of CERN's Academic Training Course. Each session will consist of a 1-hour lecture followed by one hour of practical computing, with exercises based on that day's lecture. While it is possible to follow just the lectures or just the computing exercises, we highly recommend that, because of the way this course is designed, participants come to both parts. In order to follow the hands-on exercise sessions, students need to bring their own laptops. The exercises will be run on a dedicated CERN Web notebook service, SWAN (swan.cern.ch), which is open to everybody holding a CERN computing account. Using the SWAN service requires a CERN account and access to CERNBox, the shared storage service at CERN. New users are invited to activate CERNBox beforehand by simply connecting to https://cernbox.cern.ch. A basic prior knowledge of ROOT and C++ is also recommended for participation in the practical session....

A hybrid similarity measure is established to measure hybrid similarity. In a cluster tree, the hybrid similarity measure can be calculated even for random data that do not co-occur, generating different views. Different views of the tree can be combined, choosing the one that is significant in cost. A method is proposed to combine the multiple views, which are represented by different distance measures, into a single cluster. Compared with traditional statistical methods, cluster-tree-based hybrid similarity gives better feasibility for intelligence-based search. It helps in improving dimensionality reduction and semantic analysis.

the vast amounts of raw data. This task is tackled by computational tools implementing algorithms that match the experimental data to databases, providing the user with lists for downstream analysis. Several platforms for such automated interpretation of mass spectrometric data have been developed, each...... having strengths and weaknesses that must be considered for the individual needs. These are reviewed in this chapter. Equally critical for generating highly confident output datasets is the application of sound statistical criteria to limit the inclusion of incorrect peptide identifications from database...... searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affects the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here....

Bayesian methods for statistical analysis is a book on statistical methods for analysing a wide variety of data. The book consists of 12 chapters, starting with basic concepts and covering numerous topics, including Bayesian estimation, decision theory, prediction, hypothesis testing, hierarchical models, Markov chain Monte Carlo methods, finite population inference, biased sampling and nonignorable nonresponse. The book contains many exercises, all with worked solutions, including complete c...

This paper summarizes a presentation for a panel discussion on "The Future of Astrostatistics" held at the Statistical Challenges in Modern Astronomy V conference at Pennsylvania State University in June 2011. I argue that the emerging needs of astrostatistics may both motivate and benefit from fundamental developments in statistics. I highlight some recent work within statistics on fundamental topics relevant to astrostatistical practice, including the Bayesian/frequentist debate (and ideas for a synthesis), multilevel models, and multiple testing. As an important direction for future work in statistics, I emphasize that astronomers need a statistical framework that explicitly supports unfolding chains of discovery, with acquisition, cataloging, and modeling of data not seen as isolated tasks, but rather as parts of an ongoing, integrated sequence of analyses, with information and uncertainty propagating forward and backward through the chain. A prototypical example is surveying of astronomical populations, ...

Various areas of hybrid microelectronic technology are discussed. The topics addressed include: basic thick film processing, thick film pastes and substrates, add-on components and attachment methods, thin film processing, and design of thick film hybrid circuits. Also considered are: packaging hybrid circuits, automating the production of hybrid circuits, application of hybrid techniques, customer's view of hybrid technology, and quality control and assurance in hybrid circuit production.

A practical guide to implementing nonparametric and rank-based procedures. Nonparametric Statistical Methods Using R covers traditional nonparametric methods and rank-based analyses, including estimation and inference for models ranging from simple location models to general linear and nonlinear models for uncorrelated and correlated responses. The authors emphasize applications and statistical computation. They illustrate the methods with many real and simulated data examples using R, including the packages Rfit and npsm. The book first gives an overview of the R language and basic statistical c

Written in a friendly, conversational style, this book offers a hands-on approach to statistical mediation and moderation for both beginning researchers and those familiar with modeling. Starting with a gentle review of regression-based analysis, Paul Jose covers basic mediation and moderation techniques before moving on to advanced topics in multilevel modeling, structural equation modeling, and hybrid combinations, such as moderated mediation. User-friendly features include numerous graphs and carefully worked-through examples; "Helpful Suggestions" about procedures and pitfalls; "Knowled

A hybrid gear consisting of metallic outer rim with gear teeth and metallic hub in combination with a composite lay up between the shaft interface (hub) and gear tooth rim is described. The composite lay-up lightens the gear member while having similar torque carrying capability and it attenuates the impact loading driven noise/vibration that is typical in gear systems. The gear has the same operational capability with respect to shaft speed, torque, and temperature as an all-metallic gear as used in aerospace gear design.

has turned out as a major focus of European education and training policies and certainly is a crucial principle underlying the European Qualifications Framework (EQF). In this context, «hybrid qualifications» (HQ) may be seen as an interesting approach to tackle these challenges as they serve «two...... masters», i.e. by producing skills for the labour market and enabling individuals to progress more or less directly to higher education. The specific focus of this book is placed on conditions, structures and processes which help to combine VET with qualifications leading into higher education...

Applying Statistics in Behavioural Research is written for undergraduate students in the behavioural sciences, such as Psychology, Pedagogy, Sociology and Ethology. The topics range from basic techniques, like correlation and t-tests, to moderately advanced analyses, like multiple regression and MAN

Brain-imaging research has predominantly generated insight by means of classical statistics, including regression-type analyses and null-hypothesis testing using t-test and ANOVA. Throughout recent years, statistical learning methods enjoy increasing popularity especially for applications in rich and complex data, including cross-validated out-of-sample prediction using pattern classification and sparsity-inducing regression. This concept paper discusses the implications of inferential justifications and algorithmic methodologies in common data analysis scenarios in neuroimaging. It is retraced how classical statistics and statistical learning originated from different historical contexts, build on different theoretical foundations, make different assumptions, and evaluate different outcome metrics to permit differently nuanced conclusions. The present considerations should help reduce current confusion between model-driven classical hypothesis testing and data-driven learning algorithms for investigating the brain with imaging techniques.

We have carried out a statistical survey of pulsating variable stars with multiple identities. These stars were identified in the literature as exhibiting two types of pulsation or multiple light-variability types, and are usually called hybrid pulsators. We extracted the hybrid information from the Simbad database; in practice, all variables with multiple identities were retrieved. The survey covers various pulsating stars across the Hertzsprung-Russell diagram. We aim to give a clue for selecting interesting targets for further observation. Hybrid pulsators are excellent targets for asteroseismology. An important implication of such stars is their potential for advancing the theories of both stellar evolution and pulsation. By presenting the statistics, we address the open questions and prospects regarding the current status of hybrid pulsation studies.

Intuitionistic hybrid logic is hybrid modal logic over an intuitionistic logic basis instead of a classical logical basis. In this short paper we introduce intuitionistic hybrid logic and we give a survey of work in the area.

We investigate the connections between the process algebra for hybrid systems of Bergstra and Middelburg and the formalism of hybrid automata of Henzinger et al. We give interpretations of hybrid automata in the process algebra for hybrid systems and compare them with the standard interpretation of hybrid automata as timed transition systems. We also relate the synchronized product operator on hybrid automata to the parallel composition operator of the process algebra. It turns out that the f...

Epidemic spreading phenomena are ubiquitous in nature and society. Examples include the spreading of diseases, information, and computer viruses. Epidemics can spread by local spreading, where infected nodes can only infect a limited set of direct target nodes, and by global spreading, where an infected node can infect every other node. In reality, many epidemics spread using a hybrid mixture of both types of spreading. In this study we develop a theoretical framework for studying hybrid epidemics, and examine the optimum balance between spreading mechanisms in terms of achieving the maximum outbreak size. In a metapopulation, made up of many weakly connected subpopulations, we show that one can calculate an optimal tradeoff between local and global spreading which will maximise the extent of the epidemic. As an example we analyse the 2008 outbreak of the Internet worm Conficker, which uses hybrid spreading to propagate through the internet. Our results suggest that the worm would have been eve...

Percolation has been one of the most applied statistical models. The percolation transition is one of the most robust continuous transitions known thus far. However, recent extensive research reveals that it exhibits diverse types of phase transitions, such as discontinuous and hybrid phase transitions. Here a hybrid phase transition means a phase transition exhibiting the natures of both continuous and discontinuous phase transitions simultaneously. Examples include k-core percolation, cascading failures in interdependent networks, synchronization, etc. Thus far, it is not clear whether the critical behavior of hybrid percolation transitions conforms to the conventional scaling laws of second-order phase transitions. Here, we investigate the critical behaviors of hybrid percolation transitions in the cascading failure model in interdependent networks and in the restricted Erdos-Renyi model. We find that the critical behaviors of the hybrid percolation transitions contain some features that cannot be described by the conventional theory of second-order percolation transitions.

Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…

A rigorous full-wave analysis is employed to analyze discontinuities in shielded microstrip (open end, uniform bend). An accurate and efficient method-of-moments solution combined with the source method (SM) formulation is proposed in order to achieve a full-wave characterization of the analyzed structures. A wavelet matrix transform (WMT), operated by a wavelet-like transform (WLT), allows a significant reduction of the central processing unit time and the memory storage.

Statistical analyses are a key part of biomedical research. Traditionally surgical research has relied upon a few statistical methods for evaluation and interpretation of data to improve clinical practice. As research methods have increased in both rigor and complexity, statistical analyses and interpretation have fallen behind. Some evidence suggests that surgical research studies are being designed and analyzed improperly given the specific study question. The goal of this article is to discuss the complexities of surgical research analyses and interpretation, and provide some resources to aid in these processes.

The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.


We propose a new interpretation of the neutral and charged X, Z exotic hadron resonances. Hybridized tetraquarks are neither purely compact tetraquark states nor bound or loosely bound molecules, but rather a manifestation of the interplay between the two. While meson molecules need a negative or zero binding energy, its counterpart for h-tetraquarks is required to be positive. The formation mechanism of this new class of hadrons is inspired by that of Feshbach metastable states in atomic physics. The recent claim of an exotic resonance in the Bs0 π± channel by the D0 Collaboration and the negative result presented subsequently by the LHCb Collaboration are understood in this scheme, together with a considerable portion of available data on X, Z particles. Considerations on a state with the same quantum numbers as the X(5568) are also made.

We propose a new interpretation of the neutral and charged X, Z exotic hadron resonances. Hybridized tetraquarks are neither purely compact tetraquark states nor bound or loosely bound molecules. The latter would require a negative or zero binding energy whose counterpart in h-tetraquarks is a positive quantity. The formation mechanism of this new class of hadrons is inspired by that of Feshbach metastable states in atomic physics. The recent claim of an exotic resonance in the Bs0 π± channel by the D0 collaboration and the negative result presented subsequently by the LHCb collaboration are understood in this scheme, together with a considerable portion of available data on X, Z particles. Considerations on a state with the same quantum numbers as the X(5568) are also made.

We describe a simple way to construct new statistical models for spatial point pattern data. Taking two or more existing models (finite Gibbs spatial point processes), we multiply the probability densities together and renormalise to obtain a new probability density. We call the resulting model a hybrid. We discuss stochastic properties of hybrids, their statistical implications, statistical inference, computational strategies and software implementation in the R package spatstat. Hybrids are particularly useful for constructing models which exhibit interaction at different spatial scales. The methods are demonstrated on a real data set on human social interaction. Software and data are provided.
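The multiply-and-renormalise construction above can be written compactly. For component densities f_1, ..., f_m of finite Gibbs point processes (generic notation, assuming the product is integrable so the normalising constant exists):

```latex
f(\mathbf{x}) \;=\; c \prod_{i=1}^{m} f_i(\mathbf{x}),
\qquad
c^{-1} \;=\; \int \prod_{i=1}^{m} f_i(\mathbf{x}) \,\mathrm{d}\mathbf{x},
```

where x is a point configuration and the integral runs over the space of configurations (with respect to the reference Poisson process measure). Because Gibbs interaction terms multiply, a hybrid of a short-range and a long-range component yields interaction at both spatial scales.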

Hybrid vigour, or heterosis, refers to the increased yield and biomass of hybrid offspring relative to the parents. Although this has been exploited in plants for agriculture and horticulture, the molecular and cellular mechanisms underlying hybrid vigour are largely unknown. Genetic analyses show that there are a large number of quantitative trait loci (QTLs) that contribute to the heterotic phenotype, indicating that it is a complex phenomenon. Gene expression in hybrids is regulated by the interactions of the two parental epigenetic systems and the underlying genomes. Increasing understanding of the interplay of small RNA (sRNA) molecules, DNA methylation, and histone marks provides new opportunities to define the basis of hybrid vigour and to understand why F1 heterosis is not passed on to subsequent generations. We discuss recent findings that suggest the existence of several pathways that alter DNA methylation patterns, which may lead to transcriptional changes resulting in the heterotic phenotype.

Hybrid sterility is the most common form of postzygotic reproductive isolation in plants. The best-known example is perhaps the hybrid sterility between indica and japonica subspecies of Asian cultivated rice (Oryza sativa L.). Major progress has been reported recently in rice in identifying and cloning hybrid sterility genes at two loci regulating female and male fertility, respectively. Genetic analyses and molecular characterization of these genes, together with the results from other model organisms especially Drosophila, have advanced the understanding of the processes underlying reproductive isolation and speciation. These findings also have significant implications for crop genetic improvement, by providing the feasibility and strategies for overcoming intersubspecific hybrid sterility thus allowing the development of intersubspecific hybrids.

The aim of this work is the determination of regional-scale rainfall thresholds for the triggering of landslides in the Tuscany Region (Italy). The critical rainfall events related to the occurrence of 593 past landslides were characterized in terms of duration (D) and intensity (I). I and D values were plotted in a log-log diagram and a lower boundary was clearly noticeable: it was interpreted as a threshold representing the rainfall conditions associated with landsliding. That was also confirmed by a comparison with many literature thresholds, but at the same time it was clear that such a threshold would be affected by too large an approximation to be effectively used for a regional warning system. Therefore, further analyses were performed differentiating the events on the basis of seasonality, magnitude, location, land use and lithology. None of these criteria led to discriminating, among all the events, different groups that could be characterized by a specific and more effective threshold. This outcome could be interpreted as a demonstration that at the regional scale the best results are obtained by the simplest approach, in our case an empirical black-box model which accounts for only two rainfall parameters (I and D). So a set of thresholds could be conveniently defined using a statistical approach: four thresholds corresponding to four severity levels were defined by means of the prediction interval technique, and we developed a prototype warning system based on rainfall recordings or weather forecasts.

Compared to organic coatings, organic-inorganic hybrid coatings can potentially improve anticorrosion performance. The organic phase provides the excellent mechanical and barrier properties while the inorganic phase acts as an adhesion promoter and corrosion inhibitor. Although many alkoxysilane-based hybrid coatings have been developed and studied, their weatherability and anticorrosion performance have rarely been evaluated. On the other hand, organic-inorganic hybrid coatings based on mixed sol-gel precursors have received much less attention than alkoxysilane-based hybrid coatings. In the first part, polyurethane hybrid coatings with a unique hybrid crosslinked structure, serving as an improved unicoat, were successfully prepared. The effect of polyesters on the physical properties of the hybrid coatings was studied. Polyurethane coatings derived from a cycloaliphatic polyester show properties comparable to those derived from the commercially viable aromatic polyester. Introducing the polysiloxane part into the polyurethane coatings enhanced the crosslinking density, Tg, mechanical properties, and general coating properties. The increased adhesion between the hybrid coating and the substrate makes the hybrid coating a good candidate for anticorrosion applications, as shown by electrochemical impedance spectroscopy (EIS). The degradation mechanism of the polyurethane/polysiloxane hybrid coatings under various weathering conditions was shown to be the scission of the urethane and ester groups in the organic phase along with reorganizing and rearranging of the inorganic phase. The anticorrosion performance of the cycloaliphatic hybrid was much better than that of the aromatic-based hybrid under outdoor weathering, based on visual observation and EIS analysis. Acid undercutting is an issue for the TEOS-based hybrid coating. In the second part, design of experiments (DOEs) was used to statistically investigate the effect of sol-gel precursors. The

We investigate the connections between the process algebra for hybrid systems of Bergstra and Middelburg and the formalism of hybrid automata of Henzinger et al. We give interpretations of hybrid automata in the process algebra for hybrid systems and compare them with the standard interpretation of


Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.

The joint analysis of spatial and genetic data is rapidly becoming the norm in population genetics. More and more studies explicitly describe and quantify the spatial organization of genetic variation and try to relate it to underlying ecological processes. As it has become increasingly difficult...... to keep abreast of the latest methodological developments, we review the statistical toolbox available to analyse population genetic data in a spatially explicit framework. We mostly focus on statistical concepts but also discuss practical aspects of the analytical methods, highlighting not only...

Written by pioneers in this exciting new field, Algebraic Statistics introduces the application of polynomial algebra to experimental design, discrete probability, and statistics. It begins with an introduction to Gröbner bases and a thorough description of their applications to experimental design. A special chapter covers the binary case with new application to coherent systems in reliability and two level factorial designs. The work paves the way, in the last two chapters, for the application of computer algebra to discrete probability and statistical modelling through the important concept of an algebraic statistical model. As the first book on the subject, Algebraic Statistics presents many opportunities for spin-off research and applications and should become a landmark work welcomed by both the statistical community and its relatives in mathematics and computer science.

Hybrid computational phantoms combine voxel-based and simplified equation-based modelling approaches to provide unique advantages and more realism for the construction of anthropomorphic models. In this work, a methodology and C++ code are developed to generate hybrid computational phantoms covering statistical distributions of body morphometry in the paediatric population. The paediatric phantoms of the Virtual Population Series (IT’IS Foundation, Switzerland) were modified to match target anthropometric parameters, including body mass, body length, standing height and sitting height/stature ratio, determined from reference databases of the National Centre for Health Statistics and the National Health and Nutrition Examination Survey. The phantoms were selected as representative anchor phantoms for the newborn and the 1-, 2-, 5-, 10- and 15-year-old child, and were subsequently remodelled to create 1100 female and male phantoms with 10th, 25th, 50th, 75th and 90th percentile body morphometries. Evaluation was performed qualitatively using 3D visualization and quantitatively by analysing internal organ masses. Overall, the newly generated phantoms appear very reasonable and representative of the main characteristics of the paediatric population at various ages and for different genders, body sizes and sitting stature ratios. The mass of internal organs increases with height and body mass. The comparison of organ masses of the heart, kidney, liver, lung and spleen with published autopsy and ICRP reference data for children demonstrated that they follow the same trend when correlated with age. The constructed hybrid computational phantom library opens up the prospect of comprehensive radiation dosimetry calculations and risk assessment for the paediatric population of different age groups and diverse anthropometric parameters.

We present an overview of a series of results obtained from the analysis of human behavior in a virtual environment. We focus on the massive multiplayer online game (MMOG) Pardus which has a worldwide participant base of more than 400,000 registered players. We provide evidence for striking statistical similarities between social structures and human-action dynamics in the real and virtual worlds. In this sense MMOGs provide an extraordinary way for accurate and falsifiable studies of social phenomena. We further discuss possibilities to apply methods and concepts developed in the course of these studies to analyse oral and written narratives.

All students and researchers in environmental and biological sciences require statistical methods at some stage of their work. Many have a preconception that statistics are difficult and unpleasant and find that the textbooks available are difficult to understand. Practical Statistics for Environmental and Biological Scientists provides a concise, user-friendly, non-technical introduction to statistics. The book covers planning and designing an experiment, how to analyse and present data, and the limitations and assumptions of each statistical method. The text does not refer to a specific comp

The large amount of data on galaxies, up to higher and higher redshifts, calls for sophisticated statistical approaches to build adequate classifications. Multivariate cluster analyses, which compare objects by their global similarities, are still rarely used in astrophysics, probably because their results are somewhat difficult to interpret. We believe that the missing key is an unavoidable characteristic of our Universe: evolution. Our approach, known as Astrocladistics, is based on the evolutionary nature of both galaxies and their properties. It gathers objects according to their "histories" and establishes an evolutionary scenario among groups of objects. In this presentation, I show two recent results on globular clusters and early-type galaxies to illustrate how the evolutionary concepts of Astrocladistics can also be useful for multivariate analyses such as K-means Cluster Analysis.

Stable and stationary states with hollow current density profiles have been achieved with Lower Hybrid Current Drive (LHCD) during Lower Hybrid (LH) wave accessibility experiments. By analysing the bounded propagation domain in phase space which naturally limits the central penetration and absorption of the waves, off-axis LH power deposition has been realized in a reproducible manner. The resulting current density profile modifications have led to a global confinement enhancement attributed to the formation of an internal `transport barrier` in the central reversed shear region where the electron thermal diffusivity is reduced to its neoclassical collisional level. The multiple-pass LH wave propagation in the weak Landau damping and reversed magnetic shear regime is also investigated in the framework of a statistical theory and the experimental validation of this theory is discussed. (author). 37 refs.

This research work aims at developing a Tamil-to-English cross-language text retrieval system using a hybrid machine translation approach. The hybrid machine translation system is a combination of rule-based and statistical approaches. In an existing word-by-word translation system there are a lot of issues, among them ambiguity, out-of-vocabulary words, word inflections, and improper sentence structure. To handle these issues, the proposed architecture is designed so that it contains an improved part-of-speech tagger, a machine-learning-based morphological analyser, a collocation-based word sense disambiguation procedure, a semantic dictionary, tense markers with gerund-ending rules, and a two-pass transliteration algorithm. From the experimental results it is clear that the proposed Tamil-query-based translation system achieves significantly better translation quality than the existing system, reaching 95.88% of monolingual performance.

A fundamental concern of a theory of statistical inference is how one should measure statistical evidence. Certainly the words “statistical evidence,” or perhaps just “evidence,” are much used in statistical contexts. It is fair to say, however, that the precise characterization of this concept is somewhat elusive. Our goal here is to provide a definition of how to measure statistical evidence for any particular statistical problem. Since evidence is what causes beliefs to change, it is proposed to measure evidence by the amount beliefs change from a priori to a posteriori. As such, our definition involves prior beliefs, and this raises issues of subjectivity versus objectivity in statistical analyses. This is dealt with through a principle requiring the falsifiability of any ingredients to a statistical analysis. These concerns lead to checking for prior-data conflict and measuring the a priori bias in a prior.
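The proposal above — evidence measured by how much beliefs change from a priori to a posteriori — can be illustrated with a relative belief ratio, the posterior density divided by the prior density at a parameter value. A minimal sketch, assuming a conjugate beta-binomial model with invented data (the prior and the observed counts are chosen purely for illustration):

```python
import math

def beta_pdf(x, a, b):
    """Density of the Beta(a, b) distribution at x."""
    const = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return const * x ** (a - 1) * (1 - x) ** (b - 1)

# Hypothetical example: Beta(1, 1) (uniform) prior on a success
# probability theta, and observed data of 7 successes in 10 trials.
a0, b0 = 1.0, 1.0
successes, failures = 7, 3

# Conjugate update: the posterior is Beta(a0 + successes, b0 + failures).
a1, b1 = a0 + successes, b0 + failures

def relative_belief(theta):
    """Evidence for theta, measured as the change from prior to posterior."""
    return beta_pdf(theta, a1, b1) / beta_pdf(theta, a0, b0)

# A ratio above 1 means the data increased belief in that value
# (evidence in favour); below 1 means evidence against.
print(relative_belief(0.7))  # > 1: data support theta near 0.7
print(relative_belief(0.2))  # < 1: data count against theta near 0.2
```

Checking for prior-data conflict, as the abstract mentions, would additionally compare the observed data against the prior predictive distribution; that step is not shown here.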

S-Plus is a powerful environment for statistical and graphical analysis of data. It provides the tools to implement many statistical ideas which have been made possible by the widespread availability of workstations having good graphics and computational capabilities. This book is a guide to using S-Plus to perform statistical analyses and provides both an introduction to the use of S-Plus and a course in modern statistical methods. The aim of the book is to show how to use S-Plus as a powerful system for statistical and graphical analysis. Readers are assumed to have a basic grounding in statistics, so the book is intended for would-be users of S-Plus and for students and researchers using statistics. Throughout, the emphasis is on presenting practical problems and full analyses of real data sets.

This article describes a fun activity that can be used to help students make links between statistical analyses and their real-world implications. Although an illustrative example is provided using analysis of variance, the activity may be adapted for use with other statistical techniques.

Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.

To evaluate new winter rapeseed hybrids and cultivars, investigations were conducted at the experimental field of the Faculty of Agriculture, University of Zagreb, in the period 2009/10 - 2011/12. The trial involved 11 hybrids and 5 cultivars of rapeseed from 5 seed producers selling seed in Croatia. The studied rapeseed hybrids and cultivars differed significantly in seed and oil yields, oil content and yield components (seed number per silique and 1000-seed weight). However, a number of hybrids rendered identical results, since the differences in the investigated properties were within the statistically allowable deviation. The hybrids Traviata and CWH 119 can be singled out based on the achieved seed and oil yields, and the cultivar Ricco and hybrids CWH 119 and PR46W15 for their high oil content in seed. Hybrids with a larger silique number per plant also achieved a higher seed yield.

STATISTICS USING R will be useful at different levels, from an undergraduate course in statistics, through graduate courses in biological sciences, engineering, management and so on. The book introduces statistical terminology and defines it for the benefit of a novice. For a practicing statistician, it will serve as a guide to R language for statistical analysis. For a researcher, it is a dual guide, simultaneously explaining appropriate statistical methods for the problems at hand and indicating how these methods can be implemented using the R language. For a software developer, it is a guide in a variety of statistical methods for development of a suite of statistical procedures.

Statistics comes in two main flavors: frequentist and Bayesian. For historical and technical reasons, frequentist statistics has dominated data analysis in the past; but Bayesian statistics is making a comeback at the forefront of science. In this paper, we give a practical overview of Bayesian statistics and illustrate its main advantages over frequentist statistics for the kinds of analyses that are common in empirical software engineering, where frequentist statistics still is standard. We...
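The contrast between the two flavors can be made concrete with a small sketch on invented benchmark data: the frequentist summary is a confidence interval, while a Bayesian posterior (here a flat-prior normal model with the variance treated as known, chosen only to keep the example short) supports a direct probability statement about the quantity of interest.

```python
import math

# Invented example data: measured speedups (in percent) of a code change
# across 10 benchmark runs.
data = [2.1, 3.4, 1.8, 2.9, 3.1, 2.5, 2.2, 3.0, 2.7, 2.6]
n = len(data)
mean = sum(data) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
se = sd / math.sqrt(n)

# Frequentist summary: an approximate 95% confidence interval for the
# mean speedup (normal approximation, for brevity).
ci = (mean - 1.96 * se, mean + 1.96 * se)

def std_normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Bayesian summary: with a flat prior and a normal likelihood, the
# posterior for the mean is N(mean, se^2), so we can state directly
# how probable a positive effect is given the data.
p_positive = 1.0 - std_normal_cdf((0.0 - mean) / se)

print(f"95% CI: ({ci[0]:.2f}, {ci[1]:.2f})")
print(f"P(mean speedup > 0 | data) = {p_positive:.4f}")
```

The probability statement in the last line is exactly the kind of quantity a frequentist interval cannot provide, which is one of the practical advantages the paper argues for.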

If the waterfall field of hybrid inflation couples to a U(1) gauge field, the waterfall can generate a statistically anisotropic contribution to the curvature perturbation. We investigate this possibility, generalising in several directions the seminal work of Yokoyama and Soda. The statistical anisotropy of the bispectrum could be detectable by PLANCK even if the statistical anisotropy of the spectrum is too small to detect.

According to Howard Gardner, Professor of Cognition and Education at Harvard University, intelligence of humans cannot be measured with a single factor such as the IQ level. Instead, he and others have suggested that humans have different types of intelligence. This paper examines whether students registered in online or mostly online courses have…

that 3D methodologies can accurately detect the Region Of Interest (ROI). Automatic segmentation has been achieved using HMMs, where the ROI is detected accurately but suffers from a long computation time for its calculations.

With the aim of clarifying the effect of seasonal variation on the reproductive performance of hybrid rabbits, a six-year investigation was carried out. The traits analysed were pregnancy rate of does and numerical productivity at weaning. The data set included 33588 matings and subsequent pregnancy diagnoses, and 245743 young rabbits at weaning. From the statistical analysis, pregnancy rate and numerical productivity at weaning appeared to be significantly (P<0.001) affected by seasonal variation. Furthermore, a statistically significant (P<0.001) influence of month was also found. Nevertheless, a correlation analysis between the two parameters needs to be performed to supplement our analysis.

The Adaptive Resolution Scheme (AdResS) is a hybrid scheme that allows one to treat a molecular system with different levels of resolution depending on the location of the molecules. The construction of a Hamiltonian based on this idea (H-AdResS) allows one to formulate the usual tools of ensembles and statistical mechanics. We present a number of exact and approximate results that provide a statistical mechanics foundation for this simulation method. We also present simulation results that illustrate the theory.

Volatile compounds were extracted by a pentane/ether (1:1) mixture from the leaves of seven citrus somatic tetraploid hybrids sharing mandarin as their common parent and having lime, Eurêka lemon, lac lemon, sweet orange, grapefruit, kumquat, or poncirus as the other parent. Extracts were examined by GC-MS and compared with those of their respective parents. All hybrids were like their mandarin parent, and unlike their nonmandarin parents, in being unable to synthesize monoterpene aldehydes and alcohols. The hybrids did retain the ability, although strongly reduced, of their nonmandarin parents to synthesize sesquiterpene hydrocarbons, alcohols, and aldehydes. These results suggest that complex forms of dominance in the mandarin genome determine the biosynthesis pathways of volatile compounds in tetraploid hybrids. A down-regulation of the biosynthesis of methyl N-methylanthranilate, a mandarin-specific compound, originates from the genomes of the nonmandarin parents. Statistical analyses showed that all of the hybrids were similar to their common mandarin parent in the relative composition of their volatile compounds.

A guide to the essential statistical skills needed for success in assignments, projects or dissertations. It explains why it is impossible to avoid using statistics in analysing data. It also describes the language of statistics to make it easier to understand the various terms used for statistical techniques.

This paper reports on spatial-statistical analyses for simulated random packings of spheres with random diameters. The simulation methods are the force-biased algorithm and the Jodrey-Tory sedimentation algorithm. The sphere diameters are taken as constant or following a bimodal or lognormal distribution. Standard characteristics of spatial statistics are used to describe these packings statistically, namely volume fraction, pair correlation function of the system of sphere centres and spherical contact distribution function of the set-theoretic union of all spheres. Furthermore, the coordination numbers are analysed.
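Of the standard characteristics listed, the volume fraction is the simplest to estimate. A minimal Monte Carlo sketch in Python (the packing below is a hand-made toy configuration, not the output of the force-biased or Jodrey-Tory algorithms):

```python
import random

random.seed(0)

# Toy packing (hand-made for illustration): four non-overlapping spheres
# of constant radius 0.2 inside the unit box, given as (centre, radius).
spheres = [((0.25, 0.25, 0.25), 0.2),
           ((0.75, 0.25, 0.25), 0.2),
           ((0.25, 0.75, 0.75), 0.2),
           ((0.75, 0.75, 0.50), 0.2)]

def volume_fraction(spheres, samples=100_000):
    """Monte Carlo estimate of the volume fraction of a union of spheres
    in the unit box: the share of uniform random test points that land
    inside at least one sphere."""
    hits = 0
    for _ in range(samples):
        p = (random.random(), random.random(), random.random())
        if any(sum((a - b) ** 2 for a, b in zip(p, c)) <= r * r
               for c, r in spheres):
            hits += 1
    return hits / samples

# Four disjoint spheres of radius 0.2 occupy 4 * (4/3) * pi * 0.2**3,
# i.e. about 13.4% of the unit box, so the estimate should be close.
print(volume_fraction(spheres))
```

The pair correlation function and spherical contact distribution mentioned in the abstract require distance-based estimators over the centre process and are not sketched here.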

Hybrid breakdown, or outbreeding depression, is the loss of fitness observed in crosses between genetically divergent populations. The role of maternally inherited mitochondrial genomes in hybrid breakdown has not been widely examined. Using laboratory crosses of the marine copepod Tigriopus californicus, we report that the low fitness of F(3) hybrids is completely restored in the offspring of maternal backcrosses, where parental mitochondrial and nuclear genomic combinations are reassembled. Paternal backcrosses, which result in mismatched mitochondrial and nuclear genomes, fail to restore hybrid fitness. These results suggest that fitness loss in T. californicus hybrids is completely attributable to nuclear-mitochondrial genomic interactions. Analyses of ATP synthetic capacity in isolated mitochondria from hybrid and backcross animals found that reduced ATP synthesis in hybrids was also largely restored in backcrosses, again with maternal backcrosses outperforming paternal backcrosses. The strong fitness consequences of nuclear-mitochondrial interactions have important, and often overlooked, implications for evolutionary and conservation biology.

Trials of the production value of 43 experimental and recognized sugar beet hybrids were conducted at the Zagreb location in the period 2003-2005. The trials included hybrids from six breeding institutions that sell sugar beet seed in the Republic of Croatia. Research results revealed significant differences in yields and root quality among the investigated sugar beet hybrids. However, the results of a large number of hybrids were equal in value; namely, the difference between them was within the statistically allowable deviation. The hybrids KW 0148 HR and Buda were distinguished by high sugar yields in 2003, Sofarizo and Takt in 2004, and Merak, Impact and Europa in 2005. The highest root yields were recorded for the hybrids Dioneta, Buda and KW 0148 HR in 2003, Sofarizo, Takt, HI 0191 and Dorotea in 2004, and Impact and SES 2371 in 2005. The highest root sugar contents were determined in the hybrids Zita and Evelina in 2003, Cyntia, Diamant and Belinda in 2004, and Merak, Belinda and Cyntia in 2005.

Statistical analyses are performed for material strength parameters from approximately 6700 specimens of structural timber. Non-parametric statistical analyses and fits to the following distribution types have been investigated: Normal, Lognormal, 2-parameter Weibull and 3-parameter Weibull. The statistical fits have generally been made using all data (100%) and the lower tail (30%) of the data. The Maximum Likelihood Method and the Least Squares Technique have been used to estimate the statistical parameters of the selected distributions. 8 different databases are analysed. The results show that 2-parameter Weibull (and Normal) distributions give the best fits to the data available, especially if tail fits are used, whereas the Lognormal distribution generally gives a poor fit and larger coefficients of variation, especially if tail fits are used.
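As a sketch of this kind of analysis, the example below generates synthetic 2-parameter Weibull "strength" data (the shape and scale values are invented, not taken from the timber databases) and estimates the parameters by least squares on a Weibull probability plot, both from all data and from the lower 30% tail; the study's Maximum Likelihood fits are not reproduced here.

```python
import math
import random

random.seed(1)

# Synthetic "strength" data: a 2-parameter Weibull with invented shape
# k = 5 and scale 40 (roughly MPa-like values), drawn by inverse
# transform sampling and sorted ascending.
k_true, scale_true = 5.0, 40.0
sample = sorted(scale_true * (-math.log(1.0 - random.random())) ** (1.0 / k_true)
                for _ in range(2000))

def weibull_lsq(data, n_total):
    """Least-squares fit on the Weibull probability plot: regress
    log(-log(1 - F_i)) on log(x_i) with median-rank plotting positions.
    The slope estimates the shape; the intercept gives the scale."""
    xs, ys = [], []
    for i, x in enumerate(data, start=1):
        f = (i - 0.3) / (n_total + 0.4)          # median-rank CDF estimate
        xs.append(math.log(x))
        ys.append(math.log(-math.log(1.0 - f)))
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    slope = (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
             / sum((a - mx) ** 2 for a in xs))
    shape = slope
    scale = math.exp(-(my - slope * mx) / slope)  # intercept = -shape*ln(scale)
    return shape, scale

# Fit to all data (100%), then to the lower 30% tail, keeping ranks
# relative to the full sample so the plotting positions stay valid.
shape_all, scale_all = weibull_lsq(sample, len(sample))
tail = sample[: int(0.3 * len(sample))]
shape_tail, scale_tail = weibull_lsq(tail, len(sample))

print(shape_all, scale_all)
print(shape_tail, scale_tail)
```

The tail fit weights the lower strengths that drive characteristic (5th-percentile) design values, which is why the study compares 100% and 30% fits.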

Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: provides comprehensive, user-friendly, practical guidance on the essential statistical methods applied in industry. Explores

The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou

The introgression of modern humans (Homo sapiens) with Neanderthals 40,000 YBP after a half-million years of separation, may have led to the best example of a hybrid swarm on earth. Modern trade and transportation in support of the human hybrids has continued to introduce additional species, genotyp...

This article shows that there are two different museum mindsets, where the second mindset leans towards participatory practices. It is shown how a museum can support a hybrid economy of meaning that builds on both a user-generated economy of meaning and an institutional economy of meaning, and adds value to both. Such a museum is referred to as a hybrid museum.

EPA and the United Parcel Service (UPS) have developed a hydraulic hybrid delivery vehicle to explore and demonstrate the environmental benefits of the hydraulic hybrid for urban pick-up and delivery fleets.

This paper describes a system for making reproducible statistical analyses. It differs from other systems for reproducible analysis in several ways. The two main differences are: (1) several statistics programs can be used in the same document; (2) documents can be prepared using OpenOffice or LaTeX. The main part of this paper is an example showing how to use the system in an OpenOffice text document. The paper also contains some practical considerations on the use of literate programming in statistics.

Introduction: What are resin catalyst hybrids? There are typically two types of resin catalyst. One is an acidic resin, of which polystyrene sulfonic acid is representative. The other is a basic resin, which is used as a metal complex support. The objective items of this study on resin catalysts consist of pellet hybrids, equilibrium hybrids and function hybrids of acid and base, as shown in Fig. 1 [1-5].

A mesoscale calibration artifact, also called a hybrid artifact, suitable for hybrid dimensional measurement, and a method for making the artifact. The hybrid artifact has structural characteristics that make it suitable for dimensional measurement in both vision-based systems and touch-probe-based systems. The hybrid artifact employs the intersection of bulk-micromachined planes to fabricate edges that are sharp to the nanometer level and intersecting planes with crystal-lattice-defined angles.

U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...

Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...

The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.

Neuroendocrine Tumor: Statistics. Approved by the Cancer.Net Editorial Board, 11/ ... the United States are diagnosed with Merkel cell skin cancer each year. Almost all people diagnosed with the ...

Presents an undergraduate laboratory exercise in elementary statistics in which students verify empirically the various aspects of the Gaussian distribution. Sampling techniques and other commonly used statistical procedures are introduced. (CP)

Overweight and Obesity Statistics: prevalence of ... adults age 20 and older. Physical Activity Statistics: research suggests that staying active ...

The history of the economic analyses is summarized for short rotation intensively cultured hybrid poplar at the North Central Forest Experiment Station. Early break-even analyses with limited data indicated that at a price of $25-30 per dry ton for fiber and low to medium production costs, several systems looked profitable. Later cash flow analyses indicated that two...

We report on preliminary results of a hybrid non-LTE analysis of high-resolution, high-S/N spectra of the helium-rich subdwarf B star Feige49 and the helium-poor sdB HD205805. Non-LTE effects are found to have a notable impact on the stellar parameter and abundance determination. In particular the HeI lines show significant deviations from detailed balance, with the computed equivalent widths strengthened by up to ~35%. Non-LTE abundance corrections for the metals (C, N, O, Mg, S) are of the order ~0.05-0.25 dex on the mean, while corrections of up to ~0.7 dex are derived for individual transitions. The non-LTE approach reduces systematic trends and the statistical uncertainties in the abundance determination. Consequently, non-LTE analyses of a larger sample of objects have the potential to put much tighter constraints on the formation history of the different sdB populations than currently discussed.

Multivariate statistics refer to an assortment of statistical methods that have been developed to handle situations in which multiple variables or measures are involved. Any analysis of more than two variables or measures can loosely be considered a multivariate statistical analysis. An introductory text for students learning multivariate statistical methods for the first time, this book keeps mathematical details to a minimum while conveying the basic principles. One of the principal strategies used throughout the book--in addition to the presentation of actual data analyses--is poin

We give an overview of the papers published in this special issue on spatial statistics, of the Journal of Statistical Software. 21 papers address issues covering visualization (micromaps, links to Google Maps or Google Earth), point pattern analysis, geostatistics, analysis of areal aggregated or lattice data, spatio-temporal statistics, Bayesian spatial statistics, and Laplace approximations. We also point to earlier publications in this journal on the same topic.

Outlines five projects currently funded by the United Kingdom's Electronic Libraries Program (eLib): HyLiFe (Hybrid Library of the Future), MALIBU (MAnaging the hybrid Library for the Benefit of Users), HeadLine (Hybrid Electronic Access and Delivery in the Library Networked Environment), ATHENS (authentication scheme), and BUILDER (Birmingham…

Homoploid hybrid speciation occurs when a stable, fertile, and reproductively isolated lineage results from hybridization between two distinct species without a change in ploidy level. Reproductive isolation between a homoploid hybrid species and its parents is generally attained via chromosomal re...

A projectile for a railgun that uses a hybrid armature and provides a seed block around part of the outer surface of the projectile to seed the hybrid plasma brush. In addition, the hybrid armature is continuously vaporized to replenish plasma in a plasma armature to provide a tandem armature and provides a unique ridge and groove to reduce plasma blowby.

Several theoretical approaches combined in program. Intraply hybrid composites investigated theoretically and experimentally at Lewis Research Center. Theories developed during investigations and corroborated by attendant experiments used to develop computer program identified as INHYD (Intraply Hybrid Composite Design). INHYD includes several composites micromechanics theories, intraply hybrid composite theories, and integrated hygrothermomechanical theory. Equations from theories used by program as appropriate for user's specific applications.

Furusawa, Akira [Department of Applied Physics, School of Engineering, The University of Tokyo (Japan)

2014-12-04

I will briefly explain the definition and advantage of hybrid quantum information processing, which is hybridization of qubit and continuous-variable technologies. The final goal would be realization of universal gate sets both for qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

Genomic in situ hybridization (GISH) is a powerful tool to characterize parental chromosomes in interspecific hybrids, including the behaviour of autosynapsis and chromosome pairing. In this study it was used to distinguish the chromosomes of Oryza sativa from those of a wild species in a spontaneous interspecific hybrid and to investigate chromosome pairing at metaphase I of meiosis in the hybrid. The hybrid was a triploid with 36 chromosomes according to the chromosome number observed in mitosis of root tips. During metaphase I of meiosis in the hybrid, little chromosome pairing was observed and most of the chromosomes existed as univalents. Based on GISH and FISH (fluorescent in situ hybridization) analyses, the chromosomes of the hybrid were composed of genomes A, B and C. Thus, it was believed that the hybrid was the result of natural hybridization between cultivated rice and the wild species O. minuta, which was planted in experimental fields.

There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…

We present a hybrid model for content extraction from HTML documents. The model operates on the Document Object Model (DOM) tree of the corresponding HTML document. It evaluates each tree node and associated statistical features, like link density and text distribution across the node, to predict ... The model outperformed other existing content extraction models. We present a browser-based implementation of the proposed model as proof of concept and compare the implementation strategy with various state-of-the-art implementations. We also discuss various applications of the proposed model, with special...
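Link density, one of the statistical features mentioned, is the fraction of a node's text that lies inside anchor tags; navigation boilerplate tends to score high, article text low. A self-contained sketch (not the authors' implementation) using Python's standard-library HTML parser:

```python
from html.parser import HTMLParser

class LinkDensityParser(HTMLParser):
    """Accumulate how much text falls inside <a> tags versus overall."""
    def __init__(self):
        super().__init__()
        self.depth_in_a = 0
        self.link_chars = 0
        self.total_chars = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.depth_in_a += 1

    def handle_endtag(self, tag):
        if tag == "a" and self.depth_in_a > 0:
            self.depth_in_a -= 1

    def handle_data(self, data):
        n = len(data.strip())
        self.total_chars += n
        if self.depth_in_a > 0:
            self.link_chars += n

def link_density(html):
    """Fraction of visible text that sits inside anchor tags."""
    p = LinkDensityParser()
    p.feed(html)
    return p.link_chars / p.total_chars if p.total_chars else 0.0

nav = '<ul><li><a href="/">Home</a></li><li><a href="/news">News</a></li></ul>'
article = '<p>The model evaluates each DOM node with simple statistics.</p>'
print(link_density(nav))      # navigation: all text is link text
print(link_density(article))  # body text: no links
```

A real extractor would compute this per DOM subtree and combine it with text-distribution features, as the abstract describes.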

From Cosmos to Chaos, Peter Coles, 2006, Oxford University Press, 224pp. To confirm or refute a scientific theory you have to make a measurement. Unfortunately, however, measurements are never perfect: the rest is statistics. Indeed, statistics is at the very heart of scientific progress, but it is often poorly taught and badly received; for many, the very word conjures up half-remembered nightmares of 'null hypotheses' and 'Student's t-tests'. From Cosmos to Chaos by Peter Coles, a cosmologist at Nottingham University, is an approachable antidote that places statistics in a range of catchy contexts. Using this book you will be able to calculate the probabilities in a game of bridge or in a legal trial based on DNA fingerprinting, impress friends by talking confidently about entropy, and stretch your mind thinking about quantum mechanics. (U.K.)

tested. Damage scores for all the tested hybrids were significantly different from the susceptible check (Atwalira). ... diseases, soil infertility among others and ... biological and host plant resistance. ... Data analyses were carried out using SAS.

This article reports the findings of a 3-year study of a hybrid professional development program designed to prepare science and mathematics teachers to implement GIS in their classrooms. The study was conducted as part of the CoastLines Innovative Technology Experiences for Students and Teachers project funded by the National Science Foundation. Three cohorts of teachers participated in the program, with each participant receiving 40 h of synchronous online instruction and 80 h of in-person instruction and support over an 8-month period. Data from surveys of participants both before and after the program were analyzed using correlation, ordinary least squares, and ordered logit regression analyses. The analyses revealed increases in the self-reported frequency of GIS use and enhanced feelings of preparation, competence, community, and comfort with respect to using GIS for instruction. A composite index of all impact variables was positively influenced as well. The statistical analyses found a strong relationship between self-reported feelings of preparation and use of GIS. Some support was found for the idea that feelings of competence, community, and comfort were related to the teachers' sense of preparation. The findings suggest that a robust hybrid model of teacher professional development can prepare teachers to use GIS in their classrooms. More research is needed to understand how hybrid models influence the sociopsychological and other dimensions that support teachers' feelings of preparation to implement GIS.
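As a rough illustration of the correlation and OLS analyses described (with entirely hypothetical survey scores, not the CoastLines data):

```python
import numpy as np

# Hypothetical 1-5 survey scores for 8 teachers (illustrative only):
# feelings of preparation and self-reported GIS use frequency.
preparation = np.array([2, 3, 4, 3, 5, 4, 2, 4], dtype=float)
gis_use     = np.array([1, 2, 4, 3, 5, 4, 1, 3], dtype=float)

# Pearson correlation between preparation and GIS use.
r = np.corrcoef(preparation, gis_use)[0, 1]

# Ordinary least squares: gis_use ~ intercept + preparation.
X = np.column_stack([np.ones_like(preparation), preparation])
beta, *_ = np.linalg.lstsq(X, gis_use, rcond=None)

print(f"r = {r:.3f}, slope = {beta[1]:.3f}")
```

An ordered logit for the ordinal survey responses would follow the same pattern with a dedicated statistics library.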

The application of statistical testing in psychological research over the period of 1940-1960 is examined in order to address psychologists' reconciliation of the extant controversy between the Fisher and Neyman-Pearson approaches. Textbooks of psychological statistics and the psychological journal literature are reviewed to examine the presence of what Gigerenzer (1993) called a hybrid model of statistical testing. Such a model is present in the textbooks, although the mathematically incomplete character of this model precludes the appearance of a similarly hybridized approach to statistical testing in the research literature. The implications of this hybrid model for psychological research and the statistical testing controversy are discussed.

This article builds on earlier studies that have examined "the economic cost of homosexuality," by using data from the 2000 U.S. Census and by employing multilevel analyses. Our findings indicate that partnered gay men experience a 12.5 percent earnings penalty compared to married heterosexual men, and a statistically insignificant earnings…

Statistics Essentials For Dummies not only provides students enrolled in Statistics I with an excellent high-level overview of key concepts, but it also serves as a reference or refresher for students in upper-level statistics courses. Free of review and ramp-up material, Statistics Essentials For Dummies sticks to the point, with content focused on key course topics only. It provides discrete explanations of essential concepts taught in a typical first semester college-level statistics course, from odds and error margins to confidence intervals and conclusions. This guide is also a perfect re

Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra

Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics

Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w

Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente

The energy efficiencies of various piston engine options for series hybrid automobiles are compared with those of conventional, battery-powered electric, and proton exchange membrane (PEM) fuel cell hybrid automobiles. Gasoline, compressed natural gas (CNG), and hydrogen are considered for these hybrids. The engine and fuel comparisons are made on the basis of equal vehicle weight, drag, and rolling resistance. The relative emissions of these various fueled vehicle options are also presented. It is concluded that a highly optimized, hydrogen-fueled, piston engine, series electric hybrid automobile will have efficiency comparable to a similar fuel cell hybrid automobile and will have fewer total emissions than the battery-powered vehicle, even without a catalyst.

How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.

Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...

This book focuses on the meaning of statistical inference and estimation. Statistical inference is concerned with the problems of estimation of population parameters and testing hypotheses. Primarily aimed at undergraduate and postgraduate students of statistics, the book is also useful to professionals and researchers in statistical, medical, social and other disciplines. It discusses current methodological techniques used in statistics and related interdisciplinary areas. Every concept is supported with relevant research examples to help readers to find the most suitable application. Statistical tools have been presented by using real-life examples, removing the “fear factor” usually associated with this complex subject. The book will help readers to discover diverse perspectives of statistical theory followed by relevant worked-out examples. Keeping in mind the needs of readers, as well as constantly changing scenarios, the material is presented in an easy-to-understand form.

In this article, the American Cancer Society provides an overview of female breast cancer statistics in the United States, including trends in incidence, mortality, survival, and screening. Approximately 230,480 new cases of invasive breast cancer and 39,520 breast cancer deaths are expected to occur among US women in 2011. Breast cancer incidence rates were stable among all racial/ethnic groups from 2004 to 2008. Breast cancer death rates have been declining since the early 1990s for all women except American Indians/Alaska Natives, among whom rates have remained stable. Disparities in breast cancer death rates are evident by state, socioeconomic status, and race/ethnicity. While significant declines in mortality rates were observed for 36 states and the District of Columbia over the past 10 years, rates for 14 states remained level. Analyses by county-level poverty rates showed that the decrease in mortality rates began later and was slower among women residing in poor areas. As a result, the highest breast cancer death rates shifted from the affluent areas to the poor areas in the early 1990s. Screening rates continue to be lower in poor women compared with non-poor women, despite much progress in increasing mammography utilization. In 2008, 51.4% of poor women had undergone a screening mammogram in the past 2 years compared with 72.8% of non-poor women. Encouraging patients aged 40 years and older to have annual mammography and a clinical breast examination is the single most important step that clinicians can take to reduce suffering and death from breast cancer. Clinicians should also ensure that patients at high risk of breast cancer are identified and offered appropriate screening and follow-up. Continued progress in the control of breast cancer will require sustained and increased efforts to provide high-quality screening, diagnosis, and treatment to all segments of the population.

The PV/TEG hybrid system, consisting of photovoltaic cells and a thermoelectric element, is presented in the paper. The dependence of the PV/TEG hybrid system parameters on illumination level and temperature is analysed. The maximum power values of the photovoltaic cell, of the thermoelectric element and of the PV/TEG system are calculated, and a comparison between them is presented and analysed. An economic analysis is also presented.

The practice of statistics involves analyzing data and planning data collection schemes to answer scientific questions. Issues often arise with the data that must be dealt with and can lead to new procedures. In analyzing data, these issues can sometimes be addressed through the statistical models that are developed. Simulation can also be helpful in evaluating a new procedure. Moreover, simulation coupled with optimization can be used to plan a data collection scheme. The practice of statistics as just described is much more than just using a statistical package. In analyzing the data, it involves understanding the scientific problem and incorporating the scientist's knowledge. In modeling the data, it involves understanding how the data were collected and accounting for limitations of the data where possible. Moreover, the modeling is likely to be iterative, considering a series of models and evaluating the fit of these models. Designing a data collection scheme involves understanding the scientist's goal and staying within his/her budget in terms of time and the available resources. Consequently, a practicing statistician is faced with such tasks and requires skills and tools to do them quickly. We have written this article for students to provide a glimpse of the practice of statistics. To illustrate the practice of statistics, we consider a problem motivated by some precipitation data that our relative, Masaru Hamada, collected some years ago. We describe his rain gauge observational study in Section 2. We describe modeling and an initial analysis of the precipitation data in Section 3. In Section 4, we consider alternative analyses that address potential issues with the precipitation data. In Section 5, we consider the impact of incorporating additional information. We design a data collection scheme to illustrate the use of simulation and optimization in Section 6. We conclude this article in Section 7 with a discussion.
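Simulation coupled with optimization, as described above, can be sketched as follows (illustrative normal data and a simple precision-targeted sample-size search, not the rain gauge study itself):

```python
import numpy as np

rng = np.random.default_rng(0)

def ci_coverage(n, n_sims=2000, mu=10.0, sigma=3.0):
    """Simulate experiments and report how often a 95% normal-theory
    confidence interval for the mean actually covers the true mean."""
    hits = 0
    for _ in range(n_sims):
        x = rng.normal(mu, sigma, size=n)
        half = 1.96 * x.std(ddof=1) / np.sqrt(n)
        hits += (x.mean() - half <= mu <= x.mean() + half)
    return hits / n_sims

# Planning step: smallest n whose CI half-width (with sigma assumed known
# at the planning stage) falls below a target precision of 1.0 unit.
n_needed = int(np.ceil((1.96 * 3.0 / 1.0) ** 2))
print(n_needed, round(ci_coverage(50), 3))
```

Simulation evaluates the procedure; the closed-form search over n plays the role of the optimization in a planning exercise.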

Hybridization may drive rare taxa to extinction through genetic swamping, where the rare form is replaced by hybrids, or by demographic swamping, where population growth rates are reduced due to the wasteful production of maladaptive hybrids. Conversely, hybridization may rescue the viability of small, inbred populations. Understanding the factors that contribute to destructive versus constructive outcomes of hybridization is key to managing conservation concerns. Here, we survey the literature for studies of hybridization and extinction to identify the ecological, evolutionary, and genetic factors that critically affect extinction risk through hybridization. We find that while extinction risk is highly situation dependent, genetic swamping is much more frequent than demographic swamping. In addition, human involvement is associated with increased risk and high reproductive isolation with reduced risk. Although climate change is predicted to increase the risk of hybridization-induced extinction, we find little empirical support for this prediction. Similarly, theoretical and experimental studies imply that genetic rescue through hybridization may be equally or more probable than demographic swamping, but our literature survey failed to support this claim. We conclude that halting the introduction of hybridization-prone exotics and restoring mature and diverse habitats that are resistant to hybrid establishment should be management priorities.

We are surrounded by an ever-increasing ocean of information; everybody will agree to that. We build sophisticated strategies to govern this information: design data models, develop infrastructures for data sharing, build tools for data analysis. Statistical datasets curated by National Statistica

Plasmon hybridization between closely spaced nanoparticles yields new hybrid modes not found in the individual constituents, allowing for the engineering of the resonance properties and field-enhancement capabilities of metallic nanostructures. Experimental verifications of plasmon hybridization have thus far been mostly limited to optical frequencies, as metals cannot support surface plasmons at longer wavelengths. Here, we introduce the concept of 'spoof plasmon hybridization' in highly conductive metal structures and investigate experimentally the interaction of localized surface plasmon resonances (LSPR) in adjacent metal disks corrugated with subwavelength spiral patterns. We show that the hybridization results in the splitting of spoof plasmon modes into bonding and antibonding resonances, analogous to molecular orbital theory and plasmonic hybridization in the optical spectrum. These hybrid modes can be manipulated to produce enormous field enhancements (larger than 5000) by tuning the separation between disks or alte...
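The bonding/antibonding splitting can be illustrated with a toy two-resonator coupled-mode model (illustrative values only, not the paper's full electromagnetic simulations):

```python
import numpy as np

# Toy coupled-mode picture: two identical resonators at frequency w0 with
# near-field coupling g hybridize into modes at w0 - g (bonding) and
# w0 + g (antibonding), mirroring the molecular-orbital analogy.
w0 = 1.0   # bare spoof-plasmon resonance (arbitrary units)
g  = 0.1   # coupling strength, set by the disk separation

H = np.array([[w0, g],
              [g, w0]])
freqs = np.linalg.eigvalsh(H)   # hybridized eigenfrequencies
print(freqs)                    # → [0.9, 1.1]
```

Shrinking the separation increases g, which widens the mode splitting, consistent with the tuning described above.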

The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits the obtainable features in direct laser machining as well as heat-affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing the beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...

Ontologies are used in the annotation and analysis of biological data. As knowledge accumulates, ontologies and annotation undergo constant modifications to reflect this new knowledge. These modifications may influence the results of statistical applications such as functional enrichment analyses that describe experimental data in terms of ontological groupings. Here, we investigate to what degree modifications of the Gene Ontology (GO) impact these statistical analyses for both experimental and simulated data. The analysis is based on new measures for the stability of result sets and considers different ontology and annotation changes. Our results show that past changes in the GO are non-uniformly distributed over different branches of the ontology. Considering the semantic relatedness of significant categories in analysis results allows a more realistic stability assessment for functional enrichment studies. We observe that the results of term-enrichment analyses tend to be surprisingly stable despite changes in ontology and annotation.
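A typical functional enrichment test of the kind whose stability is studied above is a hypergeometric test on a GO category (illustrative counts, not the study's data):

```python
from scipy.stats import hypergeom

# Hypothetical enrichment test: M annotated genes in total, n of them in
# one GO category, N genes in the experimental hit list, k of those hits
# falling in the category. Is k surprisingly large?
M, n, N, k = 10000, 200, 300, 18

# One-sided p-value: probability of observing k or more category hits
# by chance. Expected hits under the null: N * n / M = 6.
p = hypergeom.sf(k - 1, M, n, N)
print(f"p = {p:.3g}")
```

Ontology or annotation updates change M, n, or the category structure itself, which is why such p-values (and the resulting significant-term sets) can shift between GO releases.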

Natural hybridization is reproduction (without artificial influence) between two or more species/populations which are distinguishable from each other by heritable characters. Natural hybridization among marine fishes has been highly underappreciated due to limited research effort; it seems that this phenomenon occurs more often than is commonly recognized. As hybridization plays an important role in biodiversity processes in the marine environment, detecting and investigating hybridization events is important to understand and protect biodiversity. The first chapter sets the framework for this dissertation study. The Cohesion Species Concept was selected as the working definition of a species for this study, as it can accommodate marine fish hybridization events; the concept does not require restrictive species boundaries. A general history and background of natural hybridization in marine fishes is also reviewed in this chapter. Four marine fish hybridization cases were examined and documented in Chapters 2 to 5. In each case study, at least one diagnostic nuclear marker, screened from among ~14 candidate markers, was found to discriminate the putative hybridizing parent species. To further investigate the genetic evidence supporting the hybrid status of each hybrid offspring, haploweb analysis of diagnostic markers (nuclear and/or mitochondrial) and DAPC/PCA analysis of microsatellite data were used. By combining the genetic evidence, morphological traits, and ecological observations, the potential reasons that triggered each hybridization event and the potential genetic/ecological effects could be discussed. In the last chapter, sequences from 82 pairs of hybridizing parent species (for which COI barcoding sequences were available either on GenBank or in our lab) were collected. By comparing the COI fragment p-distance between each pair of hybridizing parent species, some general questions about marine fish hybridization were discussed: Is

Glass fiber reinforcement of hybrid acrylic resin by different methods can increase the residual monomer content of the material, which can cause a cytotoxic effect on fibroblast cells. The purpose of this study was to determine the cytotoxicity of hybrid acrylic resins on cultured fibroblasts after glass fiber reinforcement by different methods. Square specimens 10 mm in length, 10 mm in width and 1.5 mm in thickness were cured for 20 minutes at 100° C. The fibroblast cells were grown in Eagle's Minimum Essential Medium to 2 × 10^5 cells/ml, then the cells were added to the samples in the plates and incubated at 37° C. After 48 hours, the cytotoxic effect was determined by direct cell number count using a microscope and a hemocytometer. Statistical analyses using one-way ANOVA and the LSD test showed significant differences in cell viability (p < 0.05) among the groups. The mean percentages of cell viability were 90.00%, 99.11%, and 98.66%; it could be concluded that glass fiber reinforcement of hybrid acrylic resin by either the first or the second method was not toxic.
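The one-way ANOVA step can be sketched as follows (with made-up viability percentages, not the study's measurements):

```python
from scipy.stats import f_oneway

# Hypothetical cell-viability percentages for three groups (a control resin
# and two glass-fiber reinforcement methods); illustrative values only.
control  = [89.5, 90.2, 90.3, 90.0]
method_1 = [99.0, 99.3, 99.1, 99.0]
method_2 = [98.5, 98.8, 98.6, 98.7]

# One-way ANOVA: do the group means differ?
F, p = f_oneway(control, method_1, method_2)
print(f"F = {F:.1f}, p = {p:.2e}")   # p < 0.05 → reject equal means
```

A significant ANOVA would then be followed by pairwise post-hoc comparisons such as the LSD test mentioned above.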

Agronomic and environmental research experiments result in data that are analyzed using statistical methods. These data are unavoidably accompanied by uncertainty. Decisions about hypotheses, based on statistical analyses of these data, are therefore subject to error. This error is of three types,...

The Global Modeling and Assimilation Office is preparing to upgrade its three-dimensional variational system to a hybrid approach in which the ensemble is generated using a square-root ensemble Kalman filter (EnKF) and the variational problem is solved using the Grid-point Statistical Interpolation system. As in most EnKF applications, we found it necessary to employ a combination of multiplicative and additive inflations, to compensate for sampling and modeling errors, respectively, and, to maintain the small-member ensemble solution close to the variational solution, we also found it necessary to re-center the members of the ensemble about the variational analysis. During tuning of the filter we found re-centering and additive inflation to play a considerably larger role than expected, particularly in a dual-resolution context when the variational analysis is run at larger resolution than the ensemble. This led us to consider a hybrid strategy in which the members of the ensemble are generated by simply converting the variational analysis to the resolution of the ensemble and applying additive inflation, thus bypassing the EnKF. Comparisons of this so-called filter-free hybrid procedure with an EnKF-based hybrid procedure and a control non-hybrid, traditional scheme show both hybrid strategies to provide equally significant improvement over the control; more interestingly, the filter-free procedure was found to give qualitatively similar results to the EnKF-based procedure.
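The re-centering and inflation steps can be sketched in a few lines (a schematic toy with assumed amplitudes, not the GSI/EnKF implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dimensions: a tiny ensemble over a tiny state vector.
n_members, n_state = 8, 5
ensemble = rng.normal(0.0, 1.0, size=(n_members, n_state))  # member states
x_var = rng.normal(0.0, 1.0, size=n_state)                  # variational analysis

# Re-centering: keep only the member perturbations and shift them so the
# ensemble mean coincides with the variational analysis.
perts = ensemble - ensemble.mean(axis=0)
recentered = x_var + perts

# Multiplicative inflation (compensates sampling error) plus additive
# inflation (compensates model error); 1.1 and 0.05 are illustrative.
inflated = x_var + 1.1 * perts + rng.normal(0.0, 0.05, size=perts.shape)

print(inflated.shape)
```

The filter-free variant described above amounts to generating `perts` directly as additive noise about the (resolution-converted) variational analysis instead of running the EnKF.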

We analyze statistical properties of the largest cryptocurrencies (determined by market capitalization), of which Bitcoin is the most prominent example. We characterize their exchange rates versus the U.S. Dollar by fitting parametric distributions to them. It is shown that returns are clearly non-normal; however, no single distribution fits well jointly to all the cryptocurrencies analysed. We find that for the most popular currencies, such as Bitcoin and Litecoin, the generalized hyperbolic distribution gives the best fit, while for the smaller cryptocurrencies the normal inverse Gaussian distribution, generalized t distribution, and Laplace distribution give good fits. The results are important for investment and risk management purposes.
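The fit-and-compare procedure can be sketched with maximum likelihood, using simulated heavy-tailed returns as a stand-in for real exchange-rate data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Simulated heavy-tailed "returns" (Student t with 3 degrees of freedom),
# standing in for actual cryptocurrency return series.
returns = stats.t.rvs(df=3, size=5000, random_state=rng)

# Fit candidate distributions by maximum likelihood and compare total
# log-likelihoods (with unequal parameter counts one would use AIC/BIC).
fits = {}
for name, dist in [("normal", stats.norm), ("laplace", stats.laplace)]:
    params = dist.fit(returns)
    fits[name] = dist.logpdf(returns, *params).sum()

best = max(fits, key=fits.get)
print(best)
```

On heavy-tailed data the Laplace fit is expected to beat the normal, echoing the paper's finding that returns are clearly non-normal.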

In this chapter results of a research synthesis and quantitative meta-analyses of three facets of time effects in education are presented, namely time at school during regular lesson hours, homework, and extended learning time. The number of studies for these three facets of time that could be used

Comparative citizenship analyses need to be considered in relation to multiple inequalities and their intersections, and to multiple governance and trans-national organising. This, in turn, suggests that comparative citizenship analysis needs to consider new spaces in which struggles for equal citizenship occur...

It is shown how a modern extension of Fourier analysis known as wavelet analysis is applied to signals containing multiscale information. First, a continuous wavelet transform is used to analyse the spectrum of a nonstationary signal (one whose form changes in time). The spectral analysis of such a signal gives the strength of the signal in each…
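A minimal continuous wavelet transform of a nonstationary signal can be sketched as follows (a numpy-only toy; a real analysis would use a dedicated wavelet library such as PyWavelets):

```python
import numpy as np

def ricker(points, a):
    """Ricker ("Mexican hat") wavelet of width a, sampled at `points` points."""
    t = np.arange(points) - (points - 1) / 2
    return (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def cwt(signal, widths):
    """Continuous wavelet transform: convolve the signal with the wavelet
    at each scale; rows index scale, columns index time."""
    out = np.empty((len(widths), len(signal)))
    for i, a in enumerate(widths):
        w = ricker(min(10 * int(a), len(signal)), a)
        out[i] = np.convolve(signal, w, mode="same")
    return out

# Nonstationary test signal: the frequency changes halfway through,
# which a single global Fourier spectrum would blur together.
t = np.linspace(0, 1, 400)
sig = np.where(t < 0.5, np.sin(2 * np.pi * 10 * t), np.sin(2 * np.pi * 40 * t))
coeffs = cwt(sig, widths=np.arange(1, 16))
print(coeffs.shape)   # (15, 400): one row per scale
```

Inspecting |coeffs| over time localizes when each frequency component is present, which is exactly the multiscale capability described above.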

On behalf of a client of the Animal Sciences Group, different varieties of veal were analyzed by both instrumental and sensory analyses. The sensory evaluation was performed with a sensory analytical panel between the 13th and the 31st of May, 2005. The three varieties of veal were: young bull,

This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...

Anyone dissatisfied with the almost ritual dullness of many 'standard' texts in statistical mechanics will be grateful for the lucid explanation and generally reassuring tone. Aimed at securing firm foundations for equilibrium statistical mechanics, topics of great subtlety are presented transparently and enthusiastically. Very little mathematical preparation is required beyond elementary calculus and prerequisites in physics are limited to some elementary classical thermodynamics. Suitable as a basis for a first course in statistical mechanics, the book is an ideal supplement to more convent

The new edition of this international bestseller continues to throw light on the world of statistics for health care professionals and medical students. Revised throughout, the 11th edition features new material in the areas of relative risk, absolute risk and numbers needed to treat diagnostic tests, sensitivity, specificity, ROC curves free statistical software The popular self-testing exercises at the end of every chapter are strengthened by the addition of new sections on reading and reporting statistics and formula appreciation.

This thesis is about statistics' contributions to industry. It is an article compendium comprising four articles divided into two blocks: (i) two contributions for a water supply company, and (ii) significance of the effects in Design of Experiments. In the first block, great emphasis is placed on how research design and statistics can be applied to various real problems that a water company raises, and it aims to convince water management companies that statistics can be very useful to impr...

Statistics help guide us to optimal decisions under uncertainty. A large variety of statistical problems are essentially solutions to optimization problems. The mathematical techniques of optimization are fundamental to statistical theory and practice. In this book, Jagdish Rustagi provides full-spectrum coverage of these methods, ranging from classical optimization and Lagrange multipliers, to numerical techniques using gradients or direct search, to linear, nonlinear, and dynamic programming using the Kuhn-Tucker conditions or the Pontryagin maximal principle. Variational methods and optimiza
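The idea that many estimators are solutions to optimization problems can be illustrated by computing a normal maximum-likelihood estimate numerically (a sketch with simulated data):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
data = rng.normal(loc=4.0, scale=2.0, size=1000)

# The MLE is the minimizer of the negative log-likelihood; constants are
# dropped since they do not affect the location of the minimum.
def neg_log_lik(theta):
    mu, log_sigma = theta            # optimize log-sigma to keep sigma > 0
    sigma = np.exp(log_sigma)
    return 0.5 * np.sum(((data - mu) / sigma) ** 2) + data.size * log_sigma

res = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(round(mu_hat, 2), round(sigma_hat, 2))
```

Here direct search (Nelder-Mead) recovers the familiar closed-form answers (sample mean and standard deviation); the same machinery handles models with no closed form.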

Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t

This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.

The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanics is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t

In their bestselling MATHEMATICAL STATISTICS WITH APPLICATIONS, premier authors Dennis Wackerly, William Mendenhall, and Richard L. Scheaffer present a solid foundation in statistical theory while conveying the relevance and importance of the theory in solving practical problems in the real world. The authors' use of practical applications and excellent exercises helps you discover the nature of statistics and understand its essential role in scientific research.

Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt

The paper analyses the relations between GDP and foreign direct investment in the manufacturing industry (FM), service industry (FS), real estate (FRE) and education (FE), using statistical data from 1990 to 2010 and a VAR model. The conclusions are as follows: in the long run, increases in FM and FS raise GDP, while FRE and FE move in the opposite direction to GDP. In the short term, ranked by their influence on national economic development from high to low, the order is FS, FM, FE and FRE. We suggest that decision-makers, guided by the Catalogue of Industries for Guiding Foreign Investment, should take a long-term view when utilizing foreign direct investment and optimizing its structure, so that FDI's positive role in driving economic growth receives full attention while its negative influence is reduced.
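
A VAR model of the kind used in the paper regresses each series on lagged values of all series. A minimal VAR(1) sketch on simulated data (the coefficients and series here are my own illustration, not the paper's GDP/FDI estimates):

```python
import numpy as np

# Simulate a bivariate VAR(1): y_t = A_true @ y_{t-1} + noise,
# then recover the coefficient matrix by ordinary least squares.
rng = np.random.default_rng(0)
A_true = np.array([[0.5, 0.1],
                   [0.0, 0.4]])
T = 2000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + 0.1 * rng.standard_normal(2)

X, Y = y[:-1], y[1:]                          # lagged regressors, targets
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T  # OLS estimate of A_true

print(np.round(A_hat, 2))
```

Impulse responses and the short-run influence rankings reported in the paper are then read off from powers of the estimated coefficient matrix.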

Following our earlier work, we construct statistical discrete geometry by applying statistical mechanics to discrete (Regge) gravity. We propose a coarse-graining method for discrete geometry under the assumptions of atomism and background independence. To maintain these assumptions, restrictions are placed on the theory by introducing cut-offs in both the ultraviolet and infrared regimes. Having a well-defined statistical picture of discrete Regge geometry, we take the infinite-degrees-of-freedom (large n) limit. We argue that the correct limit consistent with the restrictions and the background independence concept is not the continuum limit of statistical mechanics, but the thermodynamical limit.

Ericsson is a global provider of telecommunications systems equipment and related services for mobile and fixed network operators. 3Gsim is a tool used by Ericsson in tests of the 3G RNC node. In order to validate the tests, statistics are constantly gathered within 3Gsim, and users can access them over telnet using some system-specific 3Gsim commands. The statistics can be retrieved but are unstructured for the human eye and need parsing and arranging to be readable. The statist...

This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.
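
Each of Kanji's entries pairs a test objective with a worked numerical calculation. In that spirit, a hand-computed one-sample t-test on a small dataset of my own (not one of the book's worked examples), testing H0: population mean = 10:

```python
import math

# One-sample t-test: is the sample mean consistent with mu0 = 10?
data = [9.8, 10.4, 10.1, 9.6, 10.3, 10.0, 9.9, 10.5]
mu0 = 10.0
n = len(data)
mean = sum(data) / n
var = sum((x - mean) ** 2 for x in data) / (n - 1)  # sample variance
t = (mean - mu0) / math.sqrt(var / n)               # statistic, df = n - 1

print(round(t, 3))  # 0.683
```

The statistic is then compared against the t distribution with n - 1 degrees of freedom; here |t| is far below the usual two-sided 5% critical value (about 2.36 for 7 degrees of freedom), so H0 is not rejected.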