Sample records for optical parametric

We first revisit the importance, naturalness, and limitations of the so-called optical metrics for describing the propagation of light rays in the geometric-optics limit. We then exemplify their flexibility and nontriviality in some nonlinear material media and in the context of nonlinear theories of electromagnetism, both underlain by curved backgrounds, where optical metrics could be flat and where membranes impermeable only to photons could be conceived, respectively. Finally, we underline and discuss the relevance and potential applications of our analyses in a broad sense, ranging from material media to compact astrophysical systems.

The light trajectory in an inhomogeneous medium is studied through the variation of two Lagrangians, which correspond to Fermat's principle in geometrical optics and to the null geodesic in the optical metric, respectively. The relation between the metric coefficients of the three-dimensional space and of the four-dimensional space-time is established. The physical meaning of the equivalence and the difference between the two descriptions is revealed. It is shown that Fermat's principle is a direct result of the null geodesic.
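
For a static line element (a special case sufficient to see the equivalence claimed above), the standard reduction reads:

```latex
ds^2 = g_{00}\,c^2\,dt^2 + g_{ij}\,dx^i\,dx^j = 0
\;\Longrightarrow\;
c\,dt = \sqrt{-\,\frac{g_{ij}\,dx^i\,dx^j}{g_{00}}}\,,
\qquad
\delta\!\int dt = 0
\;\Longleftrightarrow\;
\delta\!\int n\,dl = 0 .
```

The effective index n is thus built from the ratio of the spatial to the temporal metric coefficients, which is the relation between the three-dimensional and four-dimensional descriptions referred to above; this sketch assumes staticity and is not the paper's general derivation.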

In non-magnetic anisotropic media, the behavior of electromagnetic waves depends on the polarization and direction of the incident light. Therefore, to tame unwanted wave responses such as polarization-dependent reflections, artificial impedance-matched media have been suggested for use in optical devices like invisibility cloaks or superlenses. Nevertheless, developing impedance-matched media is far from trivial in practice. In this paper, we compare samples of both impedance-matched and non-impedance-matched (non-magnetic) media regarding their electromagnetic response in constructing a well-defined optical metric. In the case of similar anisotropic patterns, we show that the optical metric in an impedance-matched medium for unpolarized light is the same as the optical metric of an electrically birefringent medium when the extraordinary mode is concerned. By comparing the eikonal equation in an empty curved space-time with its counterparts in the medium, we show that a non-impedance-matched medium can resemble an optical metric for a particular polarization. As an example of non-impedance-matched materials, we study a medium with a varying optical-axis profile. We show that such a medium can be an alternative to impedance-matched materials in various optical devices.
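
The comparison rests on the two eikonal equations sharing the same structure; schematically (the standard isotropic statement, not the paper's anisotropic version):

```latex
g^{\mu\nu}\,\partial_\mu S\,\partial_\nu S = 0
\quad\text{(empty curved space-time)},
\qquad
(\nabla S)^2 = n^2\,\frac{\omega^2}{c^2}
\quad\text{(medium of index } n\text{)} .
```

A medium then mimics a given optical metric for a particular polarization whenever that mode's effective index reproduces the left-hand structure.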

Nano-artifact metrics exploit unique physical attributes of nanostructured matter for authentication and clone resistance, which is vitally important in the age of the Internet-of-Things, where securing identities is critical. However, expensive and huge experimental apparatuses, such as scanning electron microscopy, have been required in former studies. Herein, we demonstrate an optical approach to characterise the nanoscale-precision signatures of silicon random structures towards realising low-cost and high-value information security technology. Unique and versatile silicon nanostructures are generated via resist collapse phenomena and contain dimensions well below the diffraction limit of light. We exploit the nanoscale precision of confocal laser microscopy in the height dimension; our experimental results demonstrate that the vertical precision of measurement is essential to satisfying the performance required for artifact metrics. Furthermore, by using state-of-the-art nanostructuring technology, we experimentally fabricate clones from the genuine devices. We demonstrate that the statistical properties of the genuine and clone devices are successfully exploited, showing that the liveness-detection-type approach, which is widely deployed in biometrics, is valid in artificially constructed solid-state nanostructures. These findings pave the way for reasonable and yet sufficiently secure novel principles for information security based on silicon random nanostructures and optical technologies.

It has long been known that curved space in the presence of gravitation can be described as a non-homogeneous anisotropic medium in flat geometry with different constitutive equations. In this article, we show that the eigenpolarizations of such a medium can be exactly solved, leading to a pseudo-isotropic description of curved vacuum with two refractive-index eigenvalues having opposite signs, which correspond to forward and backward travel in time. We conclude that for a rotating universe, time-reversal symmetry is broken. We also demonstrate the applicability of this method to the Schwarzschild metric and derive exact forms of the refractive index. We derive the subtle optical anisotropy of space around a spherically symmetric, non-rotating, and uncharged black hole in the form of an elegant closed-form expression, and show that the refractive index in such a pseudo-isotropic system is a function of the coordinates as well as the direction of propagation. Corrections arising from such anisotropy in the bending of light are shown, and a simplified system of equations for ray tracing in the equivalent medium of the Schwarzschild metric is found.
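
As a point of reference for the exact anisotropic expressions derived in the paper, the textbook isotropic-coordinate form of the Schwarzschild index can be sketched in a few lines of Python (variable names are illustrative):

```python
import numpy as np

def schwarzschild_index(r, rs=1.0):
    """Effective refractive index of the Schwarzschild vacuum in
    isotropic coordinates: n(r) = (1 + rs/4r)^3 / (1 - rs/4r).
    r is the isotropic radial coordinate, rs the Schwarzschild radius."""
    x = rs / (4.0 * r)
    return (1.0 + x) ** 3 / (1.0 - x)

r = np.array([10.0, 100.0, 1000.0])   # radii in units of rs
print(schwarzschild_index(r))          # exact isotropic index
print(1.0 + 1.0 / r)                   # weak-field limit n ~ 1 + rs/r
```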

Grain shape is commonly understood as a morphological characteristic of snow that is independent of the optical diameter (or specific surface area) influencing its physical properties. In this study we use tomography images to investigate two objectively defined metrics of grain shape that naturally extend the characterization of snow in terms of the optical diameter. One is the curvature length λ2, related to the third-order term in the expansion of the two-point correlation function, and the other is the second moment μ2 of the chord length distributions. We show that the exponential correlation length, widely used for microwave modeling, can be related to the optical diameter and λ2. Likewise, we show that the absorption enhancement parameter B and the asymmetry factor gG, required for optical modeling, can be related to the optical diameter and μ2. We establish various statistical relations between all size metrics obtained from the two-point correlation function and the chord length distribution. Overall our results suggest that the characterization of grain shape via λ2 or μ2 is virtually equivalent since both capture similar aspects of size dispersity. Our results provide a common ground for the different grain metrics required for optical and microwave modeling of snow.

The application of weight coefficients of the bilateral filter to determine weighted similarity metrics of image regions in an optical-flow computation algorithm that employs three-dimensional recursive search (3DRS) was investigated. By testing the algorithm on images taken from the public Middlebury benchmark database, the effectiveness of this weighted similarity metric for solving the image-processing problem was demonstrated. The necessity of matching the equation parameter values when calculating the weight coefficients, aimed at taking image texture features into account, was proved for reaching higher noise resistance in vector-field construction. An adaptation technique that eliminates manual determination of parameter values was proposed and its efficiency demonstrated.
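
A minimal sketch of the bilateral weighting idea, in Python; the function names and default parameter values are assumptions for illustration, not the algorithm's actual parameterisation:

```python
import numpy as np

def bilateral_weights(patch, sigma_s=3.0, sigma_r=25.0):
    """Bilateral weights over an odd-sized square patch: a spatial
    Gaussian times a range (intensity-difference) Gaussian measured
    against the centre pixel."""
    p = patch.astype(float)
    k = p.shape[0] // 2
    yy, xx = np.mgrid[-k:k + 1, -k:k + 1]
    w_spatial = np.exp(-(xx**2 + yy**2) / (2.0 * sigma_s**2))
    w_range = np.exp(-(p - p[k, k])**2 / (2.0 * sigma_r**2))
    w = w_spatial * w_range
    return w / w.sum()

def weighted_similarity(block_a, block_b, **kw):
    """Weighted sum of absolute differences between two candidate
    blocks, with bilateral weights taken from the reference block."""
    w = bilateral_weights(block_a, **kw)
    return float(np.sum(w * np.abs(block_a.astype(float) - block_b)))
```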

In this paper we present recent developments in optical microscope image analysis using both the best-focus optical image and those images conventionally considered out of focus for metrology applications. Depending on the type of analysis, considerable information can be deduced with the additional use of the out-of-focus optical images. One method for analyzing the complete set of images is to calculate the total "edge slope" from an image as the target is moved through focus. A plot of the sum of the mean square slope is defined as the through-focus focus metric. We present a unique method for evaluating the angular illumination homogeneity in an optical microscope (with Koehler illumination configuration), based on the through-focus focus metric approach. Both theoretical simulations and experimental results are presented to demonstrate this approach. We present a second application based on the through-focus focus metric method for evaluating critical dimensions (CD) with demonstrated nanometer sensitivity for both experiments and optical simulations. An additional approach to analyzing the complete set of images is to assemble or align the through-focus image intensity profiles such that the x-axis represents the position on the target, the y-axis represents the focus (or defocus) position of the target with respect to the lens, and the z-axis represents the image intensity. This two-dimensional image is referred to as the through-focus image map. Using recent simulation results we apply the through-focus image map to CD and overlay analysis and demonstrate nanometer sensitivity in the theoretical results.
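
A hedged Python sketch of a through-focus focus metric of this kind, using the mean squared gradient magnitude per slice as a stand-in for the paper's summed edge slope:

```python
import numpy as np

def through_focus_metric(stack):
    """Focus metric per slice of a through-focus image stack.

    stack: array of shape (n_focus, H, W). Uses the mean squared
    gradient magnitude per slice as a generic 'edge slope' proxy;
    the paper's exact definition may differ."""
    gy, gx = np.gradient(stack.astype(float), axis=(1, 2))
    return (gx**2 + gy**2).mean(axis=(1, 2))

# The slice maximising this curve is the best-focus image; the shape
# of the entire curve is what the through-focus analyses exploit.
```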

Adaptive optics (AO), in conjunction with subsequent postprocessing techniques, has markedly improved the resolution of turbulence-degraded images in ground-based astronomical observations and in the detection and identification of artificial space objects. However, important tasks involved in AO image postprocessing, such as frame selection, stopping iterative deconvolution, and algorithm comparison, commonly need manual intervention and cannot be performed automatically due to a lack of widely agreed-on image quality metrics. In this work, based on the Laplacian of Gaussian (LoG) local contrast feature detection operator, we propose a LoG-domain matching operation to perceive effective and universal image quality statistics. Further, we extract two no-reference quality assessment indices in the matched LoG domain that can be used for a variety of postprocessing tasks. Three typical space object images with distinct structural features are tested to verify the consistency of the proposed metric with perceptual image quality through subjective evaluation.
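
A minimal Python sketch of the LoG feature domain the metric is built on; the `log_activity` statistic is an assumed illustration, not the paper's two no-reference indices:

```python
import numpy as np
from scipy import ndimage

def log_map(image, sigma=2.0):
    """Laplacian-of-Gaussian (LoG) response: the local-contrast
    feature domain in which the quality statistics are computed."""
    return ndimage.gaussian_laplace(image.astype(float), sigma=sigma)

def log_activity(image, sigma=2.0):
    """A simple no-reference statistic of the LoG domain (its standard
    deviation); an assumed illustration of the general idea."""
    return float(log_map(image, sigma).std())
```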

We establish the relationships between the metric of charge transfer excitation (Δr) for the bright ππ* state and the two-photon absorption probability as well as the first hyperpolarizability for two families of push-pull π-conjugated systems. As previously demonstrated by Guido et al. (J. Chem. Theory Comput. 2013, 9, 3118-3126), Δr is a measure of the average hole-electron distance upon excitation and can be used to discriminate between short- and long-range electronic excitations. We indicate two new benefits of using this metric for the analysis of nonlinear optical properties of push-pull systems. First, the two-photon absorption probability and the first hyperpolarizability are found to be interrelated through Δr; if β ∼ (Δr)^k, then roughly, δ_TPA ∼ (Δr)^(k+1). Second, a simple power relation between Δr and the molecular hyperpolarizabilities of push-pull systems offers the possibility of estimating properties for longer molecular chains without performing calculations of high-order response functions explicitly. We further demonstrate how to link the hyperpolarizabilities with the chain length of the push-pull π-conjugated systems through the metric of charge transfer.
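
The claimed interrelation can be stated compactly (notation as in the abstract):

```latex
\beta \sim (\Delta r)^{\,k}
\;\Longrightarrow\;
\delta_{\mathrm{TPA}} \sim (\Delta r)^{\,k+1},
```

so a power-law fit of one nonlinear property against Δr fixes, up to the quality of the fit, the scaling of the other.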

A method for assigning a confidence metric for automated determination of optic disc location that includes analyzing a retinal image and determining at least two sets of coordinates locating an optic disc in the retinal image. The sets of coordinates can be determined using first and second image analysis techniques that are different from one another. An accuracy parameter can be calculated and compared to a primary risk cut-off value. A high confidence level can be assigned to the retinal image if the accuracy parameter is less than the primary risk cut-off value, and a low confidence level can be assigned if the accuracy parameter is greater than the primary risk cut-off value. The primary risk cut-off value is selected to represent an acceptable risk of misdiagnosis, by the automated technique, of a disease having retinal manifestations.
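
A minimal Python sketch of the decision logic; treating the Euclidean distance between the two coordinate estimates as the accuracy parameter is an assumption for illustration (the method only requires some accuracy parameter compared against a risk cut-off):

```python
import math

def disc_confidence(coords_a, coords_b, cutoff_px):
    """Assign a confidence level to automated optic-disc localisation.

    coords_a, coords_b: (x, y) disc locations from two different image
    analysis techniques; cutoff_px: primary risk cut-off, chosen to
    represent an acceptable risk of misdiagnosis."""
    accuracy = math.dist(coords_a, coords_b)  # assumed accuracy parameter
    return "high" if accuracy < cutoff_px else "low"

print(disc_confidence((412, 256), (418, 259), cutoff_px=15))  # -> high
```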

Maximization of a projected laser beam's power density at a remotely located extended object (speckle target) can be achieved by using an adaptive optics (AO) technique based on sensing and optimization of the target-return speckle field's statistical characteristics, referred to here as speckle metrics (SM). SM AO was demonstrated in a target-in-the-loop coherent beam combining experiment using a bistatic laser beam projection system composed of a coherent fiber-array transmitter and a power-in-the-bucket receiver. SM sensing utilized a 50 MHz rate dithering of the projected beam that provided a stair-mode approximation of the outgoing combined beam's wavefront tip and tilt with subaperture piston phases. Fiber-integrated phase shifters were used for both the dithering and SM optimization with stochastic parallel gradient descent control.
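
A textbook sketch of one stochastic parallel gradient descent update, in Python; names and gains are illustrative, and this is not the experiment's 50 MHz controller:

```python
import numpy as np

def spgd_step(u, metric, gain=0.5, amplitude=0.1, rng=None):
    """One stochastic parallel gradient descent (SPGD) update of the
    subaperture piston-phase vector u, maximising the speckle metric."""
    rng = np.random.default_rng() if rng is None else rng
    du = amplitude * rng.choice([-1.0, 1.0], size=u.shape)  # parallel dither
    dJ = metric(u + du) - metric(u - du)                    # two-sided metric probe
    return u + gain * dJ * du                               # stochastic gradient ascent
```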

We present a detailed discussion of how to obtain precise stellar photometry in crowded fields using images obtained with multi-conjugate adaptive optics (MCAO), with the intent of informing the scientific development of this key technology for the Extremely Large Telescopes. We use deep J and K_s exposures of NGC 1851 obtained using the Gemini Multi-Conjugate Adaptive Optics System (GeMS) on Gemini South to quantify the performance of the system and to develop an optimal strategy for extracting precise stellar photometry from the images using well-known PSF-fitting techniques. We judge the success of the various techniques we employ by using science-based metrics, particularly the width of the main sequence turn-off region. We also compare the GeMS photometry with the exquisite HST data of the same target in the visible. We show that the PSF produced by GeMS possesses significant spatial and temporal variability that must be accounted for during the photometric analysis by allowing the PSF model a...

As with other imaging modalities, motion induces artifacts that can significantly corrupt optical neuroimaging data. While multiple methods have been developed for motion detection in individual NIRS measurement channels, the large numbers of measurements present in multichannel fNIRS or high-density diffuse optical tomography (HD-DOT) systems create an opportunity for detection methods that integrate over the entire field of view. Here, we leverage the inherent covariance among multiple NIRS measurements after pre-processing to quantify motion artifacts by calculating the global variance in the temporal derivative (GV-TD) across all measurements (i.e., from the temporal derivative of each time-course, the method calculates the root mean square across all measurements for each time point). This calculation is fast, automated, and identifies motion by incorporating global aspects of the data instead of individual channels. To test the performance, we designed an experimental paradigm that intermixed controlled epochs of motion artifact with relatively motion-free epochs during a block-design hearing-words language paradigm using a previously described HD-DOT system. We categorized 348 blocks by sorting them on the maximum of their GV-TD time-courses. Our results show that with a modest thresholding of the data, wherein we keep blocks at 0.66 of the full-data-set average GV-TD, we obtain a 50% increase in the signal-to-noise ratio. With noisier data, we expect the performance gains to increase. Further, the impact on resting-state functional connectivity may also be more significant. In summary, a censoring threshold based on the GV-TD metric provides a fast and direct way of identifying motion artifacts.
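
The GV-TD calculation described parenthetically above is short enough to state directly in Python:

```python
import numpy as np

def gvtd(data):
    """Global variance of the temporal derivative (GV-TD).

    data: array of shape (n_measurements, n_timepoints). As described
    in the abstract: take the temporal derivative of each time-course,
    then the root mean square across all measurements per time point."""
    dy = np.diff(data.astype(float), axis=1)   # temporal derivative
    return np.sqrt(np.mean(dy**2, axis=0))     # RMS across measurements
```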

Images from nuclear medicine have quickly become a standard in the planning stages of radiotherapy. These metabolic and molecular images lack sharpness, which makes contouring difficult and the tissue limits hard to define. Threshold-based segmentation methods use parameters such as tumor size and the signal-to-noise ratio, setting by iterative methods the activity level, relative to the maximum, at which to apply the threshold. Currently there is no consensus on establishing the appropriate threshold. This problem is exacerbated when we consider moving volumes, or volumes whose edge is not a step function, as in the case of real tissues, where the variable clonogenic density at the tissue edge produces a variable profile in the uptake values. The hypothesis of this study is that all these effects (object size, signal-to-background ratio, variable-uptake edges, and motion) act at the same level, producing a blurring. For this reason, we intend to establish a figure of merit that serves as a metric of the blurring and determines unambiguously the threshold value to choose. (Author)

Purpose: Studies in the field of cataract and refractive surgery often report only summary wave-front analysis data that are too condensed to allow for a retrospective calculation of metrics relevant to visual perception. The aim of this study was to develop a tool that can be used to estimate t

Created for a Metric Day activity, Metric Madness is a board game for two to four players. Students review and practice metric vocabulary, measurement, and calculations by playing the game. Playing time is approximately twenty to thirty minutes.

Understanding the organization and mechanical function of the extracellular matrix (ECM) is critical for the development of therapeutic strategies that regulate wound healing following disease or injury. However, these relationships are challenging to elucidate during remodeling following myocardial infarction (MI) due to rapid changes in cellularity and an inability to characterize both ECM microstructure and function non-destructively. In this study, we overcome those challenges through whole organ decellularization and non-linear optical microscopy to directly relate the microstructure and mechanical properties of myocardial ECM. We non-destructively quantify collagen organization, content, and cross-linking within decellularized healthy and infarcted myocardium using second harmonic generation (SHG) and two photon excited autofluorescence. Tensile mechanical testing and compositional analysis reveal that the cumulative SHG intensity within each image volume and the average collagen autofluorescence are significantly correlated with collagen content and elastic modulus of the ECM, respectively. Compared to healthy ECM, infarcted tissues demonstrate a significant increase in collagen content and fiber alignment, and a decrease in cross-linking and elastic modulus. These findings indicate that cross-linking plays a key role in stiffness at the collagen fiber level following infarction, and highlight how this non-destructive approach to assessing remodeling can be used to understand ECM structure-function relationships.

Canonical quantization may be approached from several different starting points. The usual approaches involve promotion of c-numbers to q-numbers, or path integral constructs, each of which generally succeeds only in Cartesian coordinates. All quantization schemes that lead to Hilbert space vectors and Weyl operators---even those that eschew Cartesian coordinates---implicitly contain a metric on a flat phase space. This feature is demonstrated by studying the classical and quantum "aggregations", namely, the set of all facts and properties resident in all classical and quantum theories, respectively. Metrical quantization is an approach that elevates the flat phase space metric inherent in any canonical quantization to the level of a postulate. Far from being an unwanted structure, the flat phase space metric carries essential physical information. It is shown how the metric, when employed within a continuous-time regularization scheme, gives rise to an unambiguous quantization procedure that automatically ...

As the old 'publish or perish' adage is brought into question, additional research-impact indices, known as altmetrics, are offering new evaluation alternatives. But such metrics may need to adjust to the evolution of science publishing.

Similarity between objects plays an important role in both human cognitive processes and artificial systems for recognition and categorization. How to appropriately measure such similarities for a given task is crucial to the performance of many machine learning, pattern recognition, and data mining methods. This book is devoted to metric learning, a set of techniques for automatically learning similarity and distance functions from data, which has attracted a lot of interest in machine learning and related fields in the past ten years. In this book, we provide a thorough review of metric learning...

We review the problem of describing the gravitational field of compact stars in general relativity. We focus on the deviations from spherical symmetry which are expected to be due to rotation and to the natural deformations of mass distributions. We assume that the relativistic quadrupole moment takes into account these deviations, and consider the class of axisymmetric static and stationary quadrupolar metrics which satisfy Einstein's equations in empty space and in the presence of matter represented by a perfect fluid. We formulate the physical conditions that must be satisfied for a particular spacetime metric to describe the gravitational field of compact stars. We present a brief review of the main static and axisymmetric exact solutions of Einstein's vacuum equations, satisfying all the physical conditions. We discuss how to derive particular stationary and axisymmetric solutions with quadrupolar properties by using the solution generating techniques which correspond either to Lie symmetries and Bäcku...

Diffractive 3D phase gratings of spherical scatterers densely packed in hexagonal geometry represent adaptively tunable 4D spatiotemporal filters with trichromatic resonance in the visible spectrum. They are described in the λ-chromatic and the reciprocal ν-aspects by reciprocal geometric translations of the lightlike Pythagoras theorem, and by the direction cosine for double cones. The most elementary resonance condition in the lightlike Pythagoras theorem is given by the transformation of the grating constants g_x, g_y, g_z of the hexagonal 3D grating to λ_h1h2h3 = λ_111 with cos α = 0.5. Through normalization of the chromaticity in the von Laue interferences to λ_111, the phase-velocity factor ν_λ = λ_h1h2h3/λ_111 becomes the crucial resonance factor, the 'regulating device' of the spatiotemporal interaction between the 3D grating and light, space and time. In the reciprocal space, equal/unequal weights and times in spectral metrics result at positions of interference maxima defined by hyperbolas and circles. A database is built up by optical interference for trichromatic image preprocessing, motion detection in vector space, multiple range data analysis, patchwide multiple correlations in the spatial frequency spectrum, etc.

Lee et al. (Percept Psychophys 70:1032-1046, 2008a) investigated whether visual perception of metric shape could be calibrated when used to guide feedforward reaches-to-grasp. It could not. Seated participants viewed target objects (elliptical cylinders) in normal lighting using stereo vision and free head movements that allowed small (approximately 10°) perspective changes. The authors concluded that poor perception of metric shape was the reason reaches-to-grasp should be visually guided online. However, Bingham and Lind (Percept Psychophys 70:524-540, 2008) showed that large perspective changes (≥45°) yield good perception of metric shape. So, we repeated the Lee et al. study with the addition of information from large perspective changes. The results were accurate feedforward reaches-to-grasp reflecting accurate perception of both metric shape and metric size. Large perspective changes occur when one locomotes into a workspace in which reaches-to-grasp are subsequently performed. Does the resulting perception of metric shape persist after the large perspective changes have ceased? Experiments 2 and 3 tested reaches-to-grasp with delays (Exp. 2, 5-s delay; Exp. 3, approximately 16-s delay) and multiple objects to be grasped after a single viewing. Perception of metric shape and metric size persisted, yielding accurate reaches-to-grasp. We advocate the study of nested actions using a dynamic approach to perception/action.

The present work proposes metrics to decrease the negative impact of risks in productive projects of the University of Informatics Sciences, based on the CMMI model. The metrics can measure the progress of the project, as well as contribute to risk prevention, and should a risk occur, they allow leaders to make the right decisions about the problems presented. Multiple factors influence the development of quality software: human and material resources, correct use of techniques and methodologies, appropriate selection of technology, and planning of time and costs. However, it is important to make a detailed study of risks and to apply the rules and quality standards proposed worldwide, which treat risk management as an important area. Through this analysis, a set of metrics is obtained that allows determining the impact of risks in the productive projects of the University.

The purpose of this article is to propose a new method to define and calculate path integrals over metrics on a Kähler manifold. The main idea is to use finite dimensional spaces of Bergman metrics as an approximation to the full space of Kähler metrics. We use the theory of large deviations to decide when a sequence of probability measures on the spaces of Bergman metrics tends to a limit measure on the space of all Kähler metrics. Several examples are considered.

NASA science publications have used the metric system of measurement since 1970. Although NASA has maintained a metric use policy since 1979, practical constraints have restricted actual use of metric units. In 1988, an amendment to the Metric Conversion Act of 1975 required the Federal Government to adopt the metric system except where impractical. In response to Public Law 100-418 and Executive Order 12770, NASA revised its metric use policy and developed this Metric Transition Plan. NASA's goal is to use the metric system for program development and functional support activities to the greatest practical extent by the end of 1995. The introduction of the metric system into new flight programs will determine the pace of the metric transition. Transition of institutional capabilities and support functions will be phased to enable use of the metric system in flight program development and operations. Externally oriented elements of this plan will introduce and actively support use of the metric system in education, public information, and small business programs. The plan also establishes a procedure for evaluating and approving waivers and exceptions to the required use of the metric system for new programs. Coordination with other Federal agencies and departments (through the Interagency Council on Metric Policy) and industry (directly and through professional societies and interest groups) will identify sources of external support and minimize duplication of effort.

Weight and height measurements are important data for the evaluation of nutritional status, but some situations prevent taking these measurements in the standard manner, requiring special equipment or an estimate by predictive equations. Predictive equations for height and weight requiring only a measuring tape as an instrument have recently been developed. Objective: To validate three predictive equations for weight and two for height by Rabito, evaluating their agreement with the equations proposed by Chumlea. Methods: The following data were collected: sex, age, and anthropometric measurements, i.e., weight (kg), height (m), subscapular skinfold (SSSF, mm), calf (CC, cm), arm (AC, cm) and abdominal (AbC, cm) circumferences, arm length (cm), and half-span (HS, cm). Data were analyzed statistically using the Lin coefficient to test the agreement between the equations and the St. Laurent coefficient to compare the estimated weight and height values with real values. Results: 100 adults (age 48 ± 18 years) admitted to the University Hospital (HCFMRP/USP) were evaluated. Equations I: W(kg) = 0.5030 AC + 0.5634 AbC + 1.3180 CC + 0.0339 SSSF − 43.1560, and II: W(kg) = 0.4808 AC + 0.5646 AbC + 1.3160 CC − 42.2450, showed the highest coefficients of agreement for weight, and equations IV and V showed the highest coefficients of agreement for height. The St. Laurent coefficient indicated that equations III and V were valid for weight and height, respectively. Conclusion: Among the validated equations, number III, W(kg) = 0.5759 AC + 0.5263 AbC + 1.2452 CC − 4.8689 S − 32.9241, and number V, H(m) = 63.525 − 3.237 S − 0.06904 A + 1.293 HS, are recommended for weight and height, respectively, because of their ease of use with hospitalized patients; the equations should be validated in other situations.

To develop a proposal for metrics for patents to be applied in assessing the postgraduate programs of Medicine III - Capes. From the reading and analysis of the 2013 area documents of all 48 areas of Capes, a proposal of metrics for patents was developed to be applied in Medicine III programs. Except for the areas of Biotechnology, Food Science, Biological Sciences III, Physical Education, Engineering I, III and IV, and Interdisciplinary, most areas do not adopt a scoring system for patents. The proposal developed was based on the criteria of Biotechnology, with adaptations. In general, deposit, granting, and licensing/production are valued, in ascending order. Higher scores are also assigned to patents registered abroad and whenever students participate. This proposal can be applied to the Intellectual Production item of the evaluation form, in the subsection Technical Production/Patents. The percentages of 10% for academic programs and 40% for Professional Masters programs should be maintained. A program will be scored Very Good when it reaches 400 points or more; Good, between 200 and 399 points; Regular, between 71 and 199 points; Weak, up to 70 points; Insufficient, no points.

We introduce and develop the theory of metric sheaves. A metric sheaf $\\A$ is defined on a topological space $X$ such that each fiber is a metric model. We describe the construction of the generic model as the quotient space of the sheaf through an appropriate filter. Semantics in this model is completely controlled and understood by the forcing rules in the sheaf.

Bis(cyclohexylammonium) terephthalate (BCT) and cyclohexylammonium 4-methoxybenzoate (C4MB) single crystals were successfully grown by the slow evaporation solution growth technique. The harvested crystals were subjected to single-crystal X-ray diffraction, spectral, optical, thermal, and mechanical studies in order to evaluate their physicochemical properties. The Kurtz and Perry technique for second harmonic generation (SHG) revealed that the powdered materials of BCT and C4MB exhibit SHG efficiencies 0.2 times less and 1.3 times greater, respectively, than that of the standard reference material potassium dihydrogen phosphate. The C4MB crystal exhibits higher efficiency than BCT because of the methoxy group substituted at the para position of the phenyl ring. With its high SHG efficiency and thermal stability, the para-substituted C4MB crystal will be a potential candidate for optical device fabrication.

The propagation of light in area metric spacetimes, which naturally emerge as refined backgrounds in quantum electrodynamics and quantum gravity, is studied from first principles. In the geometric-optical limit, light rays are found to follow geodesics in a Finslerian geometry, with the Finsler norm being determined by the area metric tensor. Based on this result, and an understanding of the nonlinear relation between ray vectors and wave covectors in such refined backgrounds, we study light deflection in spherically symmetric situations and obtain experimental bounds on the non-metricity of spacetime in the solar system.

Aimed toward researchers and graduate students familiar with elements of functional analysis, linear algebra, and general topology, this book contains a general study of modulars, modular spaces, and metric modular spaces. Modulars may be thought of as generalized velocity fields and serve two important purposes: they generate metric spaces in a unified manner and they provide a weaker convergence, the modular convergence, whose topology is non-metrizable in general. Metric modular spaces are extensions of metric spaces, metric linear spaces, and classical modular linear spaces. The topics covered include the classification of modulars, metrizability of modular spaces, modular transforms and duality between modular spaces, and metric and modular topologies. Applications illustrated in this book include: the description of superposition operators acting in modular spaces, the existence of regular selections of set-valued mappings, new interpretations of spaces of Lipschitzian and absolutely continuous mappings, the existe...

Up-to-date research in metric diffusion along compact foliations is presented in this book. Beginning with fundamentals from optimal transportation theory and the theory of foliations, this book moves on to cover the Wasserstein distance, the Kantorovich Duality Theorem, and the metrization of the weak topology by the Wasserstein distance. Metric diffusion is defined, the topology of the metric space is studied, and the limits of diffused metrics along compact foliations are discussed. Essentials on foliations, holonomy, heat diffusion, and compact foliations are detailed, and vital technical lemmas are proved to aid understanding. Graduate students and researchers in geometry, topology, and dynamics of foliations and laminations will find this supplement useful, as it presents facts about metric diffusion along non-compact foliations and provides a full description of the limit for metrics diffused along foliations with at least one compact leaf in dimension two.

The Air Force sustainment enterprise does not have metrics that... adequately measure key sustainment parameters, according to the 2011 National... standardized and do not contribute to the overall assessment of the sustainment enterprise. This paper explores the development of a single metric... is not feasible. To answer the question of whether the sustainment enterprise provides cost-effective readiness for a weapon system, a suite of metrics is

We introduce the notion of a -metric as a generalization of a metric, replacing the triangle inequality with a more general inequality. We investigate the topology of the spaces induced by a -metric and present some essential properties of it. Further, we give characterizations of well-known fixed point theorems, such as the Banach and Caristi types, in the context of such spaces.

National Aeronautics and Space Administration — This chapter presents several performance metrics for offline evaluation of prognostics algorithms. A brief overview of different methods employed for performance...

This thesis develops effective approximations of certain metrics that occur frequently in pure and applied mathematics. We show that distances that often arise in applications, such as the Earth Mover's Distance between two probability measures, can be approximated by easily computed formulas for a wide variety of ground distances. We develop simple and easily computed characterizations both of norms measuring a function's regularity -- such as the Lipschitz norm -- and of their duals. We are particularly concerned with the tensor product of metric spaces, where the natural notion of regularity is not the Lipschitz condition but the mixed Lipschitz condition. A theme that runs throughout this thesis is that snowflake metrics (metrics raised to a power less than 1) are often better-behaved than ordinary metrics. For example, we show that snowflake metrics on finite spaces can be approximated by the average of tree metrics with a distortion bounded by intrinsic geometric characteristics of the space and not the number of points. Many of the metrics for which we characterize the Lipschitz space and its dual are snowflake metrics. We also present applications of the characterization of certain regularity norms to the problem of recovering a matrix that has been corrupted by noise. We are able to achieve an optimal rate of recovery for certain families of matrices by exploiting the relationship between mixed-variable regularity conditions and the decay of a function's coefficients in a certain orthonormal basis.
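
One concrete instance of the snowflake claim above is the elementary fact that raising a metric to a power below 1 preserves the triangle inequality:

```latex
d_\alpha(x,y) := d(x,y)^{\alpha},\quad 0<\alpha<1;
\qquad
(a+b)^{\alpha} \le a^{\alpha} + b^{\alpha}\ \ (a,b\ge 0)
\;\Longrightarrow\;
d_\alpha(x,z)\le d_\alpha(x,y)+d_\alpha(y,z).
```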

In this article, we mainly formalize in Mizar [2] the equivalence among a few compactness definitions of metric spaces, norm spaces, and the real line. In the first section, we formalize general topological properties of metric spaces. We discuss openness and closedness of subsets in metric spaces in terms of convergence of element sequences. In the second section, we first formalize the definition of sequential compactness, and then discuss the equivalence of compactness, countable compactness, sequential compactness, and total boundedness with completeness in metric spaces.

Performance measurement has predominantly consisted of near-term outputs measured through bibliometrics, but the recent focus is on accountability for investment based on long-term outcomes. Our objective is to build a logic model and associated metrics through which to measure the contribution of environmental health research programs to improvements in human health, the environment, and the economy. We developed a logic model that defines the components and linkages between extramural environmental health research grant programs and the outputs and outcomes related to health and social welfare, environmental quality and sustainability, economics, and quality of life, focusing on the environmental health research portfolio of the National Institute of Environmental Health Sciences (NIEHS) Division of Extramural Research and Training, and that delineates pathways for contributions by five types of institutional partners in the research process. The model is being applied to specific NIEHS research applications and the broader research community. We briefly discuss two examples, and we discuss the strengths and limits of outcome-based evaluation of research programs.

In September of 2009, a Tri-Lab team was formed to develop a set of metrics relating to the NNSA nuclear weapon surveillance program. The purpose of the metrics was to develop a more quantitative and/or qualitative metric(s) describing the results of realized or non-realized surveillance activities on our confidence in reporting reliability and assessing the stockpile. As a part of this effort, a statistical sub-team investigated various techniques and developed a complementary set of statistical metrics that could serve as a foundation for characterizing aspects of meeting the surveillance program objectives. The metrics are a combination of tolerance limit calculations and power calculations, intending to answer level-of-confidence type questions with respect to the ability to detect certain undesirable behaviors (catastrophic defects, margin insufficiency defects, and deviations from a model). Note that the metrics are not intended to gauge product performance but instead the adequacy of surveillance. This report gives a short description of four metrics types that were explored and the results of a sensitivity study conducted to investigate their behavior for various inputs. The results of the sensitivity study can be used to set the risk parameters that specify the level of stockpile problem that the surveillance program should be addressing.
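
As an illustration of the kind of tolerance-limit calculation such metrics can build on (one standard noncentral-t construction for normal data, not necessarily the report's), consider:

```python
import numpy as np
from scipy import stats

def one_sided_tolerance_factor(n, coverage=0.95, confidence=0.95):
    """Factor k such that xbar + k*s bounds at least `coverage` of a
    normal population with probability `confidence`. Shown only to
    illustrate a tolerance-limit calculation of the kind referenced."""
    zp = stats.norm.ppf(coverage)
    return stats.nct.ppf(confidence, df=n - 1, nc=zp * np.sqrt(n)) / np.sqrt(n)

print(round(one_sided_tolerance_factor(20), 3))  # ~2.396 for a 95/95 limit
```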

Designed to meet the job-related metric measurement needs of students interested in transportation, this instructional package is one of five for the marketing and distribution cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational terminology,…

Designed to meet the job-related metric measurement needs of students interested in food distribution, this instructional package is one of five for the marketing and distribution cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational…

This document was developed out of a need for a complete, carefully designed set of evaluation instruments and procedures that might be applied in metric inservice programs across the nation. Components of this package were prepared in such a way as to permit local adaptation to the evaluation of a broad spectrum of metric education activities.…

A new computational visual distinctness metric based on principles of the early human visual system is presented. The metric is applied to quantify (1) the visual distinctness of targets in complex natural scenes and (2) the perceptual differences between compressed and uncompressed images. The new

We have constructed positive definite metric matrices for bounded domains of R^n and proved an inequality relating the Jacobi matrix of a harmonic mapping on a bounded domain of R^n to the metric matrix of the same bounded domain.

This paper aims at defining a set of privacy metrics (quantitative and qualitative) for the relation between a privacy protector and an information gatherer. The aims of such metrics are: to allow assessing and comparing different user scenarios and their differences; for ex

Curvature properties are studied for the Sasaki metric on the (1,1) tensor bundle of a Riemannian manifold. As an application, examples of almost para-Nordenian and para-Kähler-Nordenian B-metrics are constructed on the (1,1) tensor bundle by looking at the Sasaki metric. Also, with respect to the para-Nordenian B-structure, paraholomorphic conditions for the complete lifts of vector fields are analyzed.

The holographic principle (HP) conjectures that the maximum number of degrees of freedom of any realistic physical system is proportional to the system's boundary area. The HP has its roots in the study of black holes. It has recently been applied to cosmological solutions. In this article we apply the HP to spherically symmetric static space-times. We find that any regular spherically symmetric object saturating the HP is subject to tight constraints on the (interior) metric, energy density, temperature, and entropy density. Whenever gravity can be described by a metric theory, is macroscopically scale invariant, and the laws of thermodynamics hold locally and globally, the (interior) metric of a regular holographic object is uniquely determined up to a constant factor, and the interior matter state must follow well-defined scaling relations. When the metric theory of gravity is general relativity, the interior matter has an overall string equation of state (EOS) and a unique total energy density. Thus the holographic metric derived in this article can serve as a simple interior 4D realization of Mathur's string fuzzball proposal. Some properties of the holographic metric and its possible experimental verification are discussed. The geodesics of the holographic metric describe an isotropically expanding (or contracting) universe with a nearly homogeneous matter distribution within the local Hubble volume. Due to the overall string EOS, the active gravitational mass density is zero, resulting in a coasting expansion with Ht = 1, which is compatible with the recent GRB data.

The present paper is a sequel to our paper "Metric characterization of isometries and of unital operator spaces and systems". We characterize certain common objects in the theory of operator spaces (unitaries, unital operator spaces, operator systems, operator algebras, and so on), in terms which are purely linear-metric, by which we mean that they only use the vector space structure of the space and its matrix norms. In the last part we give some characterizations of operator algebras (which are not linear-metric in our strict sense described in the paper).

We establish fixed point theorems in multiplicative metric spaces. The obtained results generalize the Banach contraction principle in multiplicative metric spaces and also characterize completeness of the underlying multiplicative metric space.
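
For context, the standard axioms of a multiplicative metric, in which the additive triangle inequality is replaced by a multiplicative one:

```latex
d(x,y)\ge 1,\qquad d(x,y)=1 \iff x=y,\qquad d(x,y)=d(y,x),\qquad
d(x,z)\le d(x,y)\cdot d(y,z).
```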

This paper describes the design, construction, and use of a tomograph with micrometric resolution in soil science. It describes the problems involved in the study of soil science, and it describes the system and methodology.

Describes General Motors Corporation's program to convert all of its products to the metric system. Steps include establishing policy regarding employee-owned tools, setting up training plans, and making arrangements with suppliers.

Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

The federal agencies are working with industry to ease adoption of the metric system. The goal is to help U.S. industry compete more successfully in the global marketplace, increase exports, and create new jobs. The strategy is to use federal procurement, financial assistance, and other business-related activities to encourage voluntary conversion. Based upon the positive experiences of firms and industries that have converted, federal agencies have concluded that metric use will yield long-term benefits that are beyond any one-time costs or inconveniences. It may be time for additional steps to move the Nation out of its dual-system comfort zone and continue to progress toward metrication. This report includes 'Metric Highlights in U.S. History'.

A recent survey has indicated that 17% of companies have ceased mass customizing less than one year after initiating the effort. This paper presents measurements for a company's mass customization performance, utilizing metrics within the three fundamental capabilities: robust process design, choice navigation, and solution space development. A mass customizer assessing performance with these metrics can identify within which areas improvement would increase competitiveness the most and enable a more efficient transition to mass customization.

With the growing amount of data produced by users on social media, the need to extract relevant data for marketing, research, and other uses grows as well. The bachelor thesis "Social media metrics" presents the issues of monitoring, measurement, and metrics of social media. In its research part, it also maps and captures current Czech practice in the measurement and monitoring of social media. I also evaluate the use of social media monitoring tools and the usual methods of social media measurem...

We consider compact complex surfaces with Hermitian metrics which are Einstein but not Kähler. It is shown that the manifold must be CP2 blown up at 1, 2, or 3 points, and the isometry group of the metric must contain a 2-torus. Thus the Page metric on CP2#(-CP2) is almost the only metric of this type.

The uncertainties in current estimates of anthropogenic radiative forcing are dominated by the effects of aerosols, both in relation to the direct absorption and scattering of radiation by aerosols and also with respect to aerosol-related changes in cloud formation, longevity, and microphysics (See Figure 1; Intergovernmental Panel on Climate Change, Assessment Report 4, 2008). Moreover, the Arctic region in particular is especially sensitive to changes in climate with the magnitude of temperature changes (both observed and predicted) being several times larger than global averages (Kaufman et al. 2009). Recent studies confirm that aerosol-cloud interactions in the arctic generate climatologically significant radiative effects equivalent in magnitude to that of green house gases (Lubin and Vogelmann 2006, 2007). The aerosol optical depth is the most immediate representation of the aerosol direct effect and is also important for consideration of aerosol-cloud interactions, and thus this quantity is essential for studies of aerosol radiative forcing.

Threats are generally much easier to list than to describe, and much easier to describe than to measure. As a result, many organizations list threats. Fewer describe them in useful terms, and still fewer measure them in meaningful ways. This is particularly true in the dynamic and nebulous domain of cyber threats - a domain that tends to resist easy measurement and, in some cases, appears to defy any measurement. We believe the problem is tractable. In this report we describe threat metrics and models for characterizing threats consistently and unambiguously. The purpose of this report is to support the Operational Threat Assessment (OTA) phase of risk and vulnerability assessment. To this end, we focus on the task of characterizing cyber threats using consistent threat metrics and models. In particular, we address threat metrics and models for describing malicious cyber threats to US FCEB agencies and systems.

We construct isospectral, non-isometric metrics on real and complex projective space. We recall the construction using isometric torus actions by Carolyn Gordon in Chapter 2. In Chapter 3 we recall some facts about complex projective space. In Chapter 4 we build the isospectral metrics. Chapter 5 is devoted to the proof that the metrics built in Chapter 4 are not isometric. In Chapter 6, isospectral metrics on real projective space are derived from metrics on the sphere.

We note that the Bogomolny equation for abelian vortices is precisely the condition for invariance of the Hermitian-Einstein equation under a degenerate conformal transformation. This leads to a natural interpretation of vortices as degenerate hermitian metrics that satisfy a certain curvature equation. Using this viewpoint, we rephrase standard results about vortices and make some new observations. We note the existence of a conceptually simple, non-linear rule for superposing vortex solutions, and we describe the natural behaviour of the L^2-metric on the moduli space upon certain restrictions.

In this paper the theory of uniformly convex metric spaces is developed. These spaces exhibit a generalized convexity of the metric from a fixed point. Using a (nearly) uniform convexity property, a simple proof of reflexivity is presented and a weak topology of such spaces is analyzed. This topology, called the co-convex topology, agrees with the usual weak topology in Banach spaces. An example of a $CAT(0)$-space with a weak topology which is not Hausdorff is given. This answers questions raised b...

The role of Finsler-like metrics in situations involving Lorentz symmetry breaking and CPT violation is discussed. Various physical instances of such metrics, both in quantum gravity and in analogue systems, are discussed. Both differences and similarities between the cases will be emphasised. In particular, the medium of D-particles that arises in string theory will be examined. In this case the breaking of Lorentz invariance at the level of quantum fluctuations, together with concomitant CPT violation in certain situations, will be analysed. In particular, it will be shown that correlations for neutral meson pairs will be modified and that a new contribution to baryogenesis will appear.

Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English language with the intention that it may be used in second language instruction. Stress is defined by its physical and acoustical correlates, and the principles of…

regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms...

by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...

The metric of quantum states plays an important role in quantum information theory. In this letter, we find a deep connection between quantum logic theory and quantum information theory. Using the method of quantum logic, we derive a famous inequality in quantum information theory, and we answer a question raised by S. Gudder.

Implementation of a Total Quality Management (TQM) approach to engineering work required the development of a system of metrics which would serve as a meaningful management tool for evaluating effectiveness in accomplishing project objectives and in achieving improved customer satisfaction. A team effort was chartered with the goal of developing a system of engineering performance metrics which would measure customer satisfaction, quality, cost effectiveness, and timeliness. The approach to developing this system involved the normal systems design phases, including conceptual design, detailed design, implementation, and integration. The lessons learned from this effort will be explored in this paper. These lessons learned may provide a starting point for other large engineering organizations seeking to institute a performance measurement system for accomplishing project objectives and achieving improved customer satisfaction. To facilitate this effort, a team was chartered to assist in the development of the metrics system. This team, consisting of customers and Engineering staff members, was utilized to ensure that the needs and views of the customers were considered in the development of performance measurements. The development of a system of metrics is no different than the development of any type of system. It includes the steps of defining performance measurement requirements, measurement process conceptual design, performance measurement and reporting system detailed design, and system implementation and integration.

National Park Service, Department of the Interior — NPScape housing metrics are calculated using outputs from the Spatially Explicit Regional Growth Model. Metric GIS datasets are produced seamlessly for the United...

This review describes the events leading up to the discovery of the Kerr metric in 1963 and the enormous impact the discovery has had in the subsequent 50 years. The review discusses the Penrose process, the four laws of black hole mechanics, uniqueness of the solution, and the no-hair theorems. It also includes Kerr perturbation theory and its application to black hole stability and quasi-normal modes. The Kerr metric's importance in the astrophysics of quasars and accreting stellar-mass black hole systems is detailed. A theme of the review is the "miraculous" nature of the solution, both in describing in a simple analytic formula the most general rotating black hole, and in having unexpected mathematical properties that make many calculations tractable. Also included is a pedagogical derivation of the solution suitable for a first course in general relativity.
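For reference, the Kerr solution in Boyer-Lindquist coordinates (geometrized units $G = c = 1$, mass $M$, spin parameter $a = J/M$) is

$$ds^2 = -\left(1 - \frac{2Mr}{\Sigma}\right)dt^2 - \frac{4Mar\sin^2\theta}{\Sigma}\,dt\,d\phi + \frac{\Sigma}{\Delta}\,dr^2 + \Sigma\,d\theta^2 + \left(r^2 + a^2 + \frac{2Ma^2 r\sin^2\theta}{\Sigma}\right)\sin^2\theta\,d\phi^2,$$

with $\Sigma = r^2 + a^2\cos^2\theta$ and $\Delta = r^2 - 2Mr + a^2$; the horizons lie at the roots of $\Delta = 0$. This standard form is quoted here for orientation only; the review's pedagogical derivation is not reproduced.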

We extend the concept of Wigner-Yanase-Dyson skew information to something we call "metric adjusted skew information" (of a state with respect to a conserved observable). This "skew information" is intended to be a non-negative quantity bounded by the variance (of an observable in a state). We establish a connection between the geometrical formulation of quantum statistics as proposed by Chentsov and Morozova and measures of quantum information as introduced by Wigner and Yanase and extended in this article. We show that the set of normalized Morozova-Chentsov functions describing the possible quantum statistics is a Bauer simplex and determine its extreme points. We determine a particularly simple skew information, the "λ-skew information," parametrized by λ ∈ (0, 1], and show that the convex cone this family generates coincides with the set of all metric adjusted skew informations.

The continued purchase of AHSS resources is threatened more by library budget squeezes than is that of STM resources. Librarians must justify all expenditure, but quantitative metrical analysis to assess the value to the institution of journals and specialized research databases for AHSS subjects can be inconclusive; often the number of recorded transactions is lower than for STM, as the resource may be relevant to a smaller number of users. This paper draws on a literature review and extensive primary research, including a survey of 570 librarians and academics across the Anglophone countries, findings from focus group meetings and the analysis of user behaviour at a UK university before and after the installation of the Summon discovery system. It concludes that providing a new approach to metrics can help to develop resource strategies that meet changing user needs; and that usage statistics can be complemented with supplementary ROI measures to make them more meaningful.

Research and publications about luminescent polymers have developed in recent years as an academic innovation; however, industrial application in this area has been very limited. Processed optical markers are little explored owing to the difficulty of processing luminescent polymeric materials with stable luminescence. The material used to produce luminescent polypropylene (PP) was polyamide 6 (PA6) doped with the europium complex [Eu(tta)₃(H₂O)₂], obtained through a dilution and casting process. Because polyolefins are inert, they do not fit the common doping procedure; in consequence, in this work luminescent polypropylene was prepared indirectly from polyamide 6 doped with the europium complex through an extrusion process. Product characterization was done using thermogravimetric analysis (TG), differential scanning calorimetry (DSC), X-ray diffraction (XRD), infrared spectroscopy (FTIR) and emission and excitation spectrofluorescence. The blend PP/PA6:Eu(tta)₃ presented luminescent properties after semi-industrial processing, as observed in the narrow bands of the intraconfigurational 4f⁶ transitions between the energy levels ⁷F₀ → ⁵L₆ (394 nm), ⁷F₀ → ⁵D₃ (415 nm), ⁷F₀ → ⁵D₂ (464 nm), ⁷F₀ → ⁵D₁ (525 nm) and ⁷F₀ → ⁵D₀ (578 nm) of the emission spectrum. Red light is emitted by the pellets or film when excited under a UV lamp (365 nm). TG results under an O₂ atmosphere showed that PP doped with PA6:Eu(tta)₃ was more stable than pure PP. In this work, luminescent PP/PA6:Eu(tta)₃ with thermal and photostability was processed, which can be used as an optical marker in polymer processing. (author)

Recurrent neural networks (RNNs) in combination with a pooling operator and the neighbourhood components analysis (NCA) objective function are able to detect the characterizing dynamics of sequences and embed them into a fixed-length vector space of arbitrary dimensionality. Subsequently, the resulting features are meaningful and can be used for visualization or nearest neighbour classification in linear time. This kind of metric learning for sequential data enables the use of algorithms tailored towards fixed length vector spaces such as R^n.
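As a concrete illustration of the metric-learning objective involved, here is a minimal NumPy sketch of the NCA criterion evaluated on fixed-length embeddings; the RNN encoder and pooling operator of the record above are abstracted away, and all names and toy data are illustrative, not taken from the paper.

    import numpy as np

    def nca_objective(embeddings, labels):
        """NCA objective: expected leave-one-out accuracy under a softmax
        over negative squared distances in the embedding space."""
        n = len(embeddings)
        diff = embeddings[:, None, :] - embeddings[None, :, :]
        d2 = (diff ** 2).sum(-1)              # pairwise squared distances
        np.fill_diagonal(d2, np.inf)          # a point never selects itself
        p = np.exp(-d2)
        p /= p.sum(axis=1, keepdims=True)     # neighbour-selection probabilities
        same = labels[:, None] == labels[None, :]
        return (p * same).sum() / n

    # Toy usage: two well-separated classes give an objective near 1.
    rng = np.random.default_rng(0)
    x = np.vstack([rng.normal(0, 0.1, (5, 2)), rng.normal(3, 0.1, (5, 2))])
    y = np.array([0] * 5 + [1] * 5)
    print(nca_objective(x, y))

Maximising this quantity over the parameters of the embedding function is what pulls same-class sequences together in the learned vector space.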

An important and timely plenary session at the 2015 UKSG Conference and Exhibition focused on the role of metrics in research assessment. The two excellent speakers had slightly divergent views. Todd Carpenter from NISO (National Information Standards Organization) argued that altmetrics aren’t alt anymore and that downloads and other forms of digital interaction, including social media reference, reference tracking, personal library saving, and secondary linking activity, now provide mainstream approaches to the assessment of scholarly impact. James Wilsdon is professor of science and democracy in the Science Policy Research Unit at the University of Sussex and is chair of the Independent Review of the Role of Metrics in Research Assessment commissioned by the Higher Education Funding Council for England (HEFCE). The outcome of this review will inform the work of HEFCE and the other UK higher education funding bodies as they prepare for the future of the Research Excellence Framework. He was more circumspect, arguing that metrics cannot and should not be used as a substitute for informed judgement. This article provides a summary of both presentations.

A marked metric measure space (mmm-space) is a triple (X,r,mu), where (X,r) is a complete and separable metric space and mu is a probability measure on X×I for some Polish space I of possible marks. We study the space of all (equivalence classes of) marked metric measure spaces for some fixed I. It arises as state space in the construction of Markov processes which take values in random graphs, e.g. tree-valued dynamics describing randomly evolving genealogical structures in population models. We derive here the topological properties of the space of mmm-spaces needed to study convergence in distribution of random mmm-spaces. Extending the notion of the Gromov-weak topology introduced in (Greven, Pfaffelhuber and Winter, 2009), we define the marked Gromov-weak topology, which turns the set of mmm-spaces into a Polish space. We give a characterization of tightness for families of distributions of random mmm-spaces and identify a convergence determining algebra of functions, called polynomials.

We construct the differential geometry of smooth manifolds equipped with an algebraic curvature map acting as an area measure. Area metric geometry provides a spacetime structure suitable for the discussion of gauge theories and strings, and is considerably more general than Lorentzian geometry. Our construction of geometrically relevant objects, such as an area metric compatible connection and derived tensors, makes essential use of a decomposition theorem due to Gilkey, whereby we generate the area metric from a finite collection of metrics. Employing curvature invariants for multi-metric backgrounds we devise a class of gravity theories with inherently stringy character, and discuss gauge matter actions.

This resource work lists metric information published by the U.S. Government and the American National Standards Institute. Also organizations marketing metric materials for education are given. A short table of conversions is included as is a listing of basic metric facts for everyday living. (LS)

In this paper we introduce and study projectively related complex Finsler metrics. We prove the complex versions of Rapcsák's theorem and characterize the weakly Kähler and generalized Berwald projectively related complex Finsler metrics. The complex version of Hilbert's Fourth Problem is also pointed out. As an application, the projectiveness of a complex Randers metric is described.

Using the concepts of G-metric, partial metric, and b-metric spaces, we define a new concept of generalized partial b-metric space. Topological and structural properties of the new space are investigated and certain fixed point theorems for contractive mappings in such spaces are obtained. Some examples are provided here to illustrate the usability of the obtained results.

In this paper we study pseudo-Riemannian spaces with a degenerate curvature structure, i.e. there exists a continuous family of metrics having identical polynomial curvature invariants. We approach this problem by utilising an idea coming from invariant theory. This involves the existence of a boost, which is assumed to extend to a neighbourhood. This approach proves to be very fruitful: it produces a class of metrics containing all known examples of degenerate metrics. To date, only Kundt and Walker metrics have been given; however, our study gives a plethora of examples showing that degenerate metrics extend beyond the Kundt and Walker examples. The approach also gives a useful criterion for a metric to be degenerate. Specifically, we use this to study the subclass of VSI and CSI metrics (i.e., spaces where the polynomial curvature invariants are all vanishing or all constant, respectively).

A Multiagent System (MAS) is a software paradigm for building large scale intelligent distributed systems. Increasingly these systems are being deployed on handheld computing devices that rely on non-traditional communications mediums such as mobile ad hoc networks and satellite links. These systems present new challenges for computer scientists in describing system performance and analyzing competing systems. This chapter surveys existing metrics that can be used to describe MASs and related components. A framework for analyzing MASs is provided and an example of how this framework might be employed is given for the domain of distributed constraint reasoning.

Green chemistry has developed mathematical parameters to describe the sustainability of chemical reactions and processes, in order to quantify their environmental impact. These parameters are related to mass and energy magnitudes, and enable analyses and numerical diagnoses of chemical reactions. The environmental impact factor (E factor), atom economy, and reaction mass efficiency have been the most influential metrics, and they are interconnected by mathematical equations. The ecodesign concept must also be considered for complex industrial syntheses, as a part of the sustainability of manufacturing processes. The aim of this Concept article is to identify the main parameters for evaluating undesirable environmental consequences.
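Since the record above names the key formulas only in passing, here is a minimal sketch of the two most widely used of these metrics under their standard textbook definitions (the function names and the numbers are illustrative):

    def e_factor(total_waste_kg, product_kg):
        """Environmental impact factor: mass of waste per mass of product."""
        return total_waste_kg / product_kg

    def atom_economy(product_mw, reactant_mws):
        """Atom economy (%): molecular weight of the desired product over
        the summed molecular weights of all reactants."""
        return 100.0 * product_mw / sum(reactant_mws)

    # Illustrative numbers: 5 kg of waste per kg of product, and a product
    # retaining 60 of the 100 mass units entering the reaction.
    print(e_factor(5.0, 1.0))                 # 5.0
    print(atom_economy(60.0, [40.0, 60.0]))   # 60.0

Reaction mass efficiency and the other interconnected parameters mentioned above can be expressed in terms of these same mass quantities.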

Optics: Ninth Edition covers the work necessary for the specialization in such subjects as ophthalmic optics, optical instruments and lens design. The text includes topics such as the propagation and behavior of light; reflection and refraction - their laws and how different media affect them; lenses - thick and thin, cylindrical and subcylindrical; photometry; dispersion and color; interference; and polarization. Also included are topics such as diffraction and holography; the limitation of beams in optical systems and its effects; and lens systems. The book is recommen...

We give a truly elementary proof of the convexity of metric-adjusted skew information following an idea of Effros. We extend earlier results of weak forms of superadditivity to general metric-adjusted skew information. Recently, Luo and Zhang introduced the notion of semi-quantum states on a bipa...

Complex manifolds are topological spaces that are covered by coordinate charts where the coordinate changes are given by holomorphic transformations. For example, Riemann surfaces are one dimensional complex manifolds. In order to understand complex manifolds, it is useful to introduce metrics that are compatible with the complex structure. In general, we should have a pair (M, $ds^2_M$) where $ds^2_M$ is the metric. The metric is said to be canonical if any biholomorphisms of the complex manifolds are automatically isometries. Such metrics can naturally be used to describe invariants of the complex structures of the manifold.

On 13 Jul. 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on 25 Jul. 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first we examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

The class of metric spaces (X,d) known as small-determined spaces, introduced by Garrido and Jaramillo, are properly defined by means of some type of real-valued Lipschitz functions on X. On the other hand, B-simple metric spaces introduced by Hejcman are defined in terms of some kind of bornologies of bounded subsets of X. In this note we present a common framework where both classes of metric spaces can be studied, which allows us not only to see the relationships between them but also to obtain new internal characterizations of these metric properties.

Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

Optics: Eighth Edition covers the work necessary for the specialization in such subjects as ophthalmic optics, optical instruments and lens design. The text includes topics such as the propagation and behavior of light; reflection and refraction - their laws and how different media affect them; lenses - thick and thin, cylindrical and subcylindrical; photometry; dispersion and color; interference; and polarization. Also included are topics such as diffraction and holography; the limitation of beams in optical systems and its effects; and lens systems. The book is recommended for engineering st...

Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English and German languages. The objective is to promote use of metrical phonology as a tool for enhancing instruction in stress patterns in words and sentences, particularly in…

Designed to meet the job-related metric measurement needs of students interested in hard goods merchandising, this instructional package is one of five for the marketing and distribution cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational…

Designed to meet the job-related metric measurement needs of students interested in soft goods merchandising, this instructional package is one of five for the marketing and distribution cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational…

The authors discuss background information about the metric system and explore the effect of metrication of agriculture in areas such as equipment calibration, chemical measurement, and marketing of agricultural products. Suggestions are given for possible leadership roles and approaches that agricultural education might take in converting to the…

We develop numerical methods for approximating Ricci flat metrics on Calabi-Yau hypersurfaces in projective spaces. Our approach is based on finding balanced metrics, and builds on recent theoretical work by Donaldson. We illustrate our methods in detail for a one parameter family of quintics. We also suggest several ways to extend our results.

This manual is intended for use in training persons whose vocations involve technical drawing to use the metric system of measurement. It could be used in a short course designed for that purpose or for individual study. The manual begins with a brief discussion of the rationale for conversion to the metric system. It then provides a…

Using isometric embedding of metric trees into Banach spaces, this paper will investigate barycenters, type and cotype, and various measures of compactness of metric trees. A metric tree ($T$, $d$) is a metric space such that between any two of its points there is a unique arc that is isometric to an interval in $\mathbb{R}$. We begin our investigation by examining isometric embeddings of metric trees into Banach spaces. We then investigate the possible images $x_0=\pi ((x_1+\ldots+x_n)/n)$, where $\pi$ is a contractive retraction from the ambient Banach space $X$ onto $T$ (such a $\pi$ always exists), in order to understand the "metric" barycenter of a family of points $x_1, \ldots, x_n$ in a tree $T$. Further, we consider the metric properties of trees such as their type and cotype. We identify various measures of compactness of metric trees (their covering numbers, $\epsilon$-entropy and Kolmogorov widths) and the connections between them. Additionally, we prove that the limit of the sequence of Kolmogorov...

The idea of mutual classification of spaces and mappings is one of the main research directions of point set topology. In a systematical way, this book discusses the basic theory of generalized metric spaces by using the mapping method, and summarizes the most important research achievements, particularly those from Chinese scholars, in the theory of spaces and mappings since the 1960s. This book has three chapters, two appendices and a list of more than 400 references. The chapters are "The origin of generalized metric spaces", "Mappings on metric spaces" and "Classes of generalized metric spaces". Graduate students or senior undergraduates majoring in mathematics can use this book as a text to study the theory of generalized metric spaces. Researchers in this field can also use this book as a valuable reference.

As Global Positioning Satellite (GPS) applications become more prevalent for land- and air-based vehicles, GPS applications for space vehicles will also increase. The Applied Technology Directorate of Kennedy Space Center (KSC) has developed a lightweight, low-cost GPS Metric Tracking Unit (GMTU), the first of two steps in developing a lightweight, low-cost Space-Based Tracking and Command Subsystem (STACS) designed to meet Range Safety's link margin and latency requirements for vehicle command and telemetry data. The goals of STACS are to improve Range Safety operations and expand tracking capabilities for space vehicles. STACS will track the vehicle, receive commands, and send telemetry data through the space-based asset, which will dramatically reduce dependence on ground-based assets. The other step was the Low-Cost Tracking and Data Relay Satellite System (TDRSS) Transceiver (LCT2), developed by the Wallops Flight Facility (WFF), which allows the vehicle to communicate with a geosynchronous relay satellite. Although the GMTU and LCT2 were independently implemented and tested, the design collaboration of KSC and WFF engineers allowed GMTU and LCT2 to be integrated into one enclosure, leading to the final STACS. In operation, GMTU needs only a radio frequency (RF) input from a GPS antenna and outputs position and velocity data to the vehicle through a serial or pulse code modulation (PCM) interface. GMTU includes one commercial GPS receiver board and a custom board, the Command and Telemetry Processor (CTP) developed by KSC. The CTP design is based on a field-programmable gate array (FPGA) with embedded processors to support GPS functions.

Using the tractor calculus to study conformally warped manifolds, we adapt results of Gover and Nurowski to give sharp metric obstructions to the existence of quasi-Einstein metrics on suitably generic manifolds. We do this by introducing an analogue of the curvature tractor, itself the tractor analogue of the curvature of the Fefferman-Graham ambient metric. We then use these obstructions to produce a tensorial invariant which is polynomial in the Riemann curvature and its divergence, and which gives the desired obstruction. In particular, this leads to a generalization to arbitrary dimensions of an algorithm due to Bartnik and Tod for finding static metrics. We also explore the consequences of this work for gradient Ricci solitons, finding an obstruction to their existence on suitably generic manifolds, and observing an interesting similarity between the nonnegativity of the curvature tractor and Hamilton's matrix Harnack inequality.

In this work we study different classes of effective composite metrics proposed in the context of one-loop quantum corrections in bimetric gravity. For this purpose we consider contributions of the matter loops in the form of cosmological constants and potential terms yielding two types of effective composite metrics. This guarantees a nice behavior at the quantum level. However, the theoretical consistency at the classical level needs to be ensured additionally. It turns out that among all these possible couplings, only one unique effective metric survives these criteria at the classical level.

In this work we study different classes of effective composite metrics proposed in the context of one-loop quantum corrections in bimetric gravity. For this purpose we consider contributions of the matter loops in the form of cosmological constants and potential terms yielding two types of effective composite metrics. This guarantees a nice behaviour at the quantum level. However, the theoretical consistency at the classical level needs to be ensured additionally. It turns out that among all these possible couplings only one unique effective metric survives these criteria at the classical level.

An obstruction to the implementation of spatially flat Painleve-Gullstrand (PG) slicings is demonstrated, and explicitly discussed for Reissner-Nordstroem and Schwarzschild-anti-deSitter spacetimes. Generalizations of PG slicings which are not spatially flat but which remain regular at the horizons are introduced. These metrics can be obtained from standard spherically symmetric metrics by physical Lorentz boosts. With these generalized PG metrics, problematic contributions to the imaginary part of the action in the Parikh-Wilczek derivation of Hawking radiation due to the obstruction can be avoided.
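For orientation, the standard spatially flat PG slicing of the Schwarzschild solution, the case these generalizations start from, reads (in units $G = c = 1$)

$$ds^2 = -dt^2 + \left(dr + \sqrt{\frac{2M}{r}}\,dt\right)^2 + r^2\,d\Omega^2,$$

whose constant-$t$ slices are manifestly flat. The obstruction discussed above concerns the analogous spatially flat construction for Reissner-Nordstroem and Schwarzschild-anti-deSitter, where the generalized (non-flat but horizon-regular) slicings come in.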

The drive towards sustainable, low-energy buildings has increased the need for simple, yet accurate methods to evaluate whether a daylit building meets minimum standards for energy and human comfort performance. Current metrics account neither for the temporal and spatial aspects of daylight, nor for occupants' comfort or interventions. This paper reviews the historical basis of current compliance methods for achieving daylit buildings, proposes a technical basis for development of better metrics, and provides two case study examples to stimulate dialogue on how metrics can be applied in a practical, real-world context.

The classical Patterson-Walker construction of a split-signature (pseudo-)Riemannian structure from a given torsion-free affine connection is generalized to a construction of a split-signature conformal structure from a given projective class of connections. A characterization of the induced structures is obtained. We achieve a complete description of Einstein metrics in the conformal class formed by the Patterson-Walker metric. Finally, we describe all symmetries of the conformal Patterson-Walker metric. In both cases we obtain descriptions in terms of geometric data on the original structure.

Describes a classroom activity which involved sixth grade students in a learning situation including making ice cream, safety procedures in a science laboratory, calibrating a thermometer, using metric units of volume and mass. (EB)

A new economic approach to process capability assessment is presented, which differs from the commonly used engineering metrics. The proposed metric consists of two economic capability measures – the expected profit and the variation in profit of the process. This dual economic metric offers a number of significant advantages over other engineering or economic metrics used in process capability analysis. First, it is easy to understand and communicate. Second, it is based on a measure of total system performance. Third, it unifies the fraction nonconforming approach and the expected loss approach. Fourth, it reflects the underlying interest of management in knowing the expected financial performance of a process and its potential variation.
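The record does not give the paper's exact profit model, but the flavour of such a dual metric can be sketched as follows; this Python fragment simply prices conforming units and charges scrap for nonconforming ones, with all names and numbers being illustrative assumptions:

    import numpy as np

    def economic_capability(samples, lsl, usl, price, unit_cost, scrap_cost):
        """Expected profit per unit and its standard deviation for a process
        with specification limits [lsl, usl]. Conforming units earn
        price - unit_cost; nonconforming units lose unit_cost + scrap_cost."""
        conforming = (samples >= lsl) & (samples <= usl)
        profit = np.where(conforming, price - unit_cost, -(unit_cost + scrap_cost))
        return profit.mean(), profit.std()

    rng = np.random.default_rng(1)
    x = rng.normal(10.0, 0.2, 10_000)   # measured quality characteristic
    print(economic_capability(x, 9.5, 10.5, price=5.0, unit_cost=3.0, scrap_cost=0.5))

The pair (mean, standard deviation) of profit is exactly the kind of dual measure the abstract describes: one number for expected financial performance, one for its potential variation.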

We study metric solutions of Einstein–anti-Maxwell theory admitting Killing spinors. The analogue of the IWP metric which admits a space-like Killing vector is found and is expressed in terms of a complex function satisfying the wave equation in flat (2+1)-dimensional space–time. As examples, electric and magnetic Kasner spaces are constructed by allowing the solution to depend only on the time coordinate. Euclidean solutions are also presented.

When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.

Heterotic vacua of string theory are realised, at large radius, by a compact threefold with vanishing first Chern class together with a choice of stable holomorphic vector bundle. These form a wide class of potentially realistic four-dimensional vacua of string theory. Despite all their phenomenological promise, there is little understanding of the metric on the moduli space of these vacua. What is sought is the analogue of special geometry for these vacua. The metric on the moduli space is important in phenomenology as it normalises D-terms and Yukawa couplings. It is also of interest in mathematics, since it generalises the metric, first found by Kobayashi, on the space of gauge field connections, to a more general context. Here we construct this metric, correct to first order in alpha', in two ways: first by postulating a metric that is invariant under background gauge transformations of the gauge field, and also by dimensionally reducing heterotic supergravity. These methods agree and the resulting metric is Ka...

Digital holographic microscopy is an opto-electronic technique that enables the numerical reconstruction of the complex wave-field reflected from, or transmitted through, a target. Together with phase unwrapping, this method permits a height profile, a thickness profile, and/or a refractive index profile to be extracted, in addition to the reconstruction of the image intensity. Digital holographic microscopy is unlike classical imaging systems in that one can obtain the focused image without situating the camera in the focal plane; indeed, it is possible to recover the complex wave-field at any distance from the camera plane. In order to reconstruct the image, the captured interference pattern is first processed to remove the virtual image and DC component, and then back-propagated using a numerical implementation of the Fresnel transform. A necessary input parameter to this algorithm is the distance from the camera to the image plane, which may be measured independently, estimated by eye following reconstruction at multiple distances, or estimated automatically using a focus metric. Autofocus algorithms are commonly used in microscopy in order to estimate the depth at which the image comes into focus by manually adjusting the microscope stage; in digital holographic microscopy the hologram can be reconstructed at multiple depths, and the autofocus metric can be evaluated for each reconstructed image intensity. In this paper, fifteen sparsity metrics are investigated as potential focus metrics for digital holographic microscopy, whereby the metrics are applied to a series of reconstructed intensities. These metrics are tested on the hologram of a biological cell. The results demonstrate that many of the metrics produce similar profiles, and groupings of the metrics are proposed.
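A depth scan of this kind is straightforward to sketch. The fragment below propagates a field with the angular spectrum method (one common numerical realisation of the Fresnel back-propagation) and scores each depth with one plausible sparsity-type focus score, total gradient energy; the paper's fifteen metrics are not reproduced here, and the wavelength and pixel pitch are illustrative assumptions. Whether best focus maximises or minimises such a score depends on the object type.

    import numpy as np

    def angular_spectrum(field, wavelength, dx, z):
        """Propagate a square complex wave-field a distance z
        using the angular spectrum method."""
        n = field.shape[0]
        fx = np.fft.fftfreq(n, dx)
        FX, FY = np.meshgrid(fx, fx)
        arg = 1.0 / wavelength**2 - FX**2 - FY**2
        prop = arg > 0                          # drop evanescent components
        kz = 2 * np.pi * np.sqrt(np.where(prop, arg, 0.0))
        H = np.exp(1j * kz * z) * prop
        return np.fft.ifft2(np.fft.fft2(field) * H)

    def gradient_energy(intensity):
        """One simple sharpness/sparsity score: total gradient energy."""
        gy, gx = np.gradient(intensity)
        return float((gx**2 + gy**2).sum())

    # Illustrative scan (hologram: a filtered off-axis hologram with the
    # DC term and twin image already removed):
    # depths = np.linspace(10e-3, 30e-3, 21)
    # scores = [gradient_energy(np.abs(angular_spectrum(hologram, 633e-9, 3.45e-6, z))**2)
    #           for z in depths]
    # z_focus = depths[int(np.argmax(scores))]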

A metric projective structure is a manifold equipped with the unparametrised geodesics of some pseudo-Riemannian metric. We give a comprehensive treatment of such structures in the case that there is a projective Weyl curvature nullity condition. The analysis is simplified by a fundamental and canonical 2-tensor invariant that we discover. It leads to a new canonical tractor connection for these geometries which is defined on a rank $(n+1)$-bundle. We show this connection is linked to the metrisability equations that govern the existence of metrics compatible with the structure. The fundamental 2-tensor also leads to a new class of invariant linear differential operators that are canonically associated to these geometries; included is a third equation studied by Gallot et al. We apply the results to study the metrisability equation, in the nullity setting described. We obtain strong local and global results on the nature of solutions and also on the nature of the geometries admitting such solutions, obtaining ...

The assessment of phylogenetic network reconstruction methods requires the ability to compare phylogenetic networks. This is the second in a series of papers devoted to the analysis and comparison of metrics for tree-child time consistent phylogenetic networks on the same set of taxa. In this paper, we generalize to phylogenetic networks two metrics that have already been introduced in the literature for phylogenetic trees: the nodal distance and the triplets distance. We prove that they are metrics on any class of tree-child time consistent phylogenetic networks on the same set of taxa, as well as some basic properties for them. To prove these results, we introduce a reduction/expansion procedure that can be used not only to establish properties of tree-child time consistent phylogenetic networks by induction, but also to generate all tree-child time consistent phylogenetic networks with a given number of leaves.
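For the tree special case, the nodal distance is easy to state concretely: compare the matrices of pairwise path lengths between taxa. Below is a minimal sketch using the Manhattan norm over unordered pairs; the paper's generalization to tree-child time consistent networks requires a suitable notion of path length in a network, which is not reproduced here.

    import numpy as np

    def nodal_distance(D1, D2):
        """Nodal distance between two phylogenies on the same taxa: the
        Manhattan norm of the difference of their taxon-to-taxon
        path-length matrices (other norms are also used)."""
        D1, D2 = np.asarray(D1), np.asarray(D2)
        iu = np.triu_indices_from(D1, k=1)    # unordered pairs of taxa
        return float(np.abs(D1[iu] - D2[iu]).sum())

    # Toy example: path lengths (edge counts) between taxa a, b, c
    # in two different tree topologies.
    D_tree1 = [[0, 2, 3], [2, 0, 3], [3, 3, 0]]
    D_tree2 = [[0, 3, 2], [3, 0, 3], [2, 3, 0]]
    print(nodal_distance(D_tree1, D_tree2))   # 2.0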

We consider inflation within the context of what is arguably the simplest non-metric extension of Einstein gravity. There non-metricity is described by a single graviscalar field with a non-minimal kinetic coupling to the inflaton field Ψ, parameterized by a single parameter γ. There is a simple equivalent description in terms of a massless field and an inflaton with a modified potential. We discuss the implications of non-metricity for chaotic inflation and find that it significantly alters the inflaton dynamics for field values Ψ ≳ M_P/γ, dramatically changing the qualitative behaviour in this regime. In the equivalent single-field description this is described by a cuspy potential that forms a barrier beyond which the inflaton becomes a ghost field. This imposes an upper bound on the possible number of e-folds. For the simplest chaotic inflation models, the spectral index and the tensor-to-scalar ratio receive small corrections dependent on the non-metricity parameter. We also argue that significant post-inflationary non-metricity may be generated.

We study Lagrange spaces with (γ,β)-metric, where γ is a cubic metric and β is a 1-form. We obtain the fundamental metric tensor, its inverse, the Euler-Lagrange equations, semispray coefficients, and the canonical nonlinear connection for a Lagrange space endowed with a (γ,β)-metric. Several other properties of such a space are also discussed.

Magnetic Eternally Collapsing Objects (MECO) have been proposed as the central engines of galactic black hole candidates (GBHC) and supermassive active galactic nuclei (AGN). Previous work has shown that their luminosities and spectral and timing characteristics are in good agreement with observations. These features and the formation of jets are generated primarily by the interactions of accretion disks with an intrinsically magnetic central MECO. The interaction of accretion disks with the anchored magnetic fields of the central objects permits a unified description of properties for GBHC, AGN, neutron stars in low mass x-ray binaries and dwarf novae systems. The previously published MECO models have been based on a quasistatic Schwarzschild metric of General Relativity; however, the only essential feature of this metric is its ability to produce extreme gravitational redshifts. For reasons discussed in this article, an alternative development based on a quasistatic exponential metric is considered here.
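The quasistatic exponential metric referred to above is presumably of the familiar Yilmaz-type form, quoted here as a plausible example rather than verbatim from the MECO literature; in isotropic-type coordinates with $G = c = 1$,

$$ds^2 = -e^{-2M/r}\,dt^2 + e^{2M/r}\left(dr^2 + r^2\,d\Omega^2\right),$$

for which the redshift factor $e^{M/r}$ grows without bound as $r \to 0$, so extreme gravitational redshifts are produced without an event horizon at any finite $r$, which is the one essential feature the article says it needs.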

Several complexity metrics are described which are related to logic structure, data structure and size of spreadsheet models. They primarily concentrate on the dispersion of cell references and cell paths. Most metrics are newly defined, while some are adapted from traditional software engineering. Their purpose is the identification of cells which are liable to errors. In addition, they can be used to estimate the values of dependent process metrics, such as the development duration and effort, and especially to adjust the cell error rate in accordance with the contents of each individual cell, in order to accurately assess the reliability of a model. Finally, two conceptual constructs - the reference branching condition cell and the condition block - are discussed, aiming at improving the reliability, modifiability, auditability and comprehensibility of logical tests.

analysts have difficulties grasping the dynamics implied by a process model. Recent empirical studies show that people make numerous errors when modeling complex business processes, e.g., about 20 percent of the EPCs in the SAP reference model have design flaws resulting in potential deadlocks, livelocks, etc. It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined for a subclass of Petri nets named Workflow nets, but the results can easily be applied to other languages. To demonstrate the applicability of these metrics, we have applied our approach and tool to 262 relatively complex Protos models made in the context of various student projects. This allows us to validate...

This book studies certain spaces of Riemannian metrics on both compact and non-compact manifolds. These spaces are defined by various sign-based curvature conditions, with special attention paid to positive scalar curvature and non-negative sectional curvature, though we also consider positive Ricci and non-positive sectional curvature. If we form the quotient of such a space of metrics under the action of the diffeomorphism group (or possibly a subgroup) we obtain a moduli space. Understanding the topology of both the original space of metrics and the corresponding moduli space forms the central theme of this book. For example, what can be said about the connectedness or the various homotopy groups of such spaces? We explore the major results in the area, but provide sufficient background so that a non-expert with a grounding in Riemannian geometry can access this growing area of research.

In this letter, we describe a general mechanism for emergence of a rainbow metric from a quantum cosmological model. This idea is based on QFT on a quantum space-time. Under general assumptions, we discover that the quantum space-time on which the field propagates can be replaced by a classical space-time, whose metric depends explicitly on the energy of the field: as shown by an analysis of dispersion relations, quanta of different energy propagate on different metrics, similar to photons in a refractive material (hence the name "rainbow" used in the literature). In deriving this result, we do not consider any specific theory of quantum gravity: the qualitative behavior of high-energy particles on quantum space-time relies only on the assumption that the quantum space-time is described by a wave-function $\\Psi_o$ in a Hilbert space $\\mathcal{H}_G$.

In 2005, Mustafa and Sims (2006) introduced and studied a new class of generalized metric spaces, which are called G-metric spaces, as a generalization of metric spaces. We establish some useful propositions to show that many fixed point theorems on (nonsymmetric) G-metric spaces given recently by many authors follow directly from well-known theorems on metric spaces. Our technique can be easily extended to other results as shown in application.

Isotropic Berwald metrics are a generalization of Berwald metrics. Shen proved that every Berwald metric is of vanishing S-curvature. In this paper, we generalize this fact and prove that every isotropic Berwald metric is of isotropic S-curvature. Let F = α + β be a Randers metric of isotropic Berwald curvature. Then it corresponds to a conformal vector field through the navigation representation.
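For reference, a Randers metric is the simplest non-Riemannian Finsler metric: on tangent vectors $y$ it takes the form

$$F(x,y) = \alpha(x,y) + \beta(x,y), \qquad \alpha = \sqrt{a_{ij}(x)\,y^i y^j}, \qquad \beta = b_i(x)\,y^i,$$

where $a_{ij}$ is a Riemannian metric and $b$ is a 1-form with $\|b\|_\alpha < 1$. The navigation representation mentioned above re-encodes such a metric as a Riemannian metric together with a vector field; this standard background is stated for orientation only.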

A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.
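In the linear-response formalism this record builds on (developed, e.g., by Sivak and Crooks), the friction tensor is the time integral of equilibrium force-fluctuation correlations, and it controls the mean excess work of a slow protocol λ(t); the notation below is assumed for illustration rather than taken from the paper:

$$\zeta_{ij}(\lambda) = \beta \int_0^\infty \left\langle \delta X_i(t)\,\delta X_j(0) \right\rangle_\lambda\, dt, \qquad \langle W_{\mathrm{ex}} \rangle \approx \int dt\; \dot{\lambda}^i\, \zeta_{ij}(\lambda)\, \dot{\lambda}^j,$$

where the $X_i$ are the forces conjugate to the control parameters $\lambda^i$, $\delta X_i$ their equilibrium fluctuations, and $\beta = 1/k_B T$. Treating $\zeta_{ij}$ as a Riemannian metric makes optimal protocols geodesics of the induced thermodynamic length.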

We study the junction condition relating the pressure to heat flux at the boundary of an accelerating and expanding spherically symmetric radiating star. We transform the junction condition to an ordinary differential equation by making a separability assumption on the metric functions in the space–time variables. The condition of separability on the metric functions yields several new exact solutions. A class of shear-free models is found which contains a linear equation of state and generalizes a previously obtained model. Four new shearing models are obtained; all the gravitational potentials can be written explicitly. A brief physical analysis indicates that the matter variables are well behaved.

It is well known that pseudo-Riemannian metrics in the projective class of a given torsion free affine connection can be obtained from (and are equivalent to) the solutions of a certain overdetermined projectively invariant differential equation. This equation is a special case of a so-called first BGG equation. The general theory of such equations singles out a subclass of so-called normal solutions. We prove that non-degenerate normal solutions are equivalent to pseudo-Riemannian Einstein metrics in the projective class and observe that this connects to natural projective extensions of the Einstein condition.

Process modeling languages such as EPCs, BPMN, flow charts, UML activity diagrams, Petri nets, etc. are used to model business processes and to configure process-aware information systems. It is known that users have problems understanding these diagrams. In fact, even process engineers and system analysts have difficulties grasping the dynamics implied by a process model. Recent empirical studies show that people make numerous errors when modeling complex business processes, e.g., about 20 percent of the EPCs in the SAP reference model have design flaws resulting in potential deadlocks, livelocks, etc. It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined...

Similarity search is an important problem in information retrieval. This similarity is based on a distance. Symbolic representation of time series has attracted many researchers recently, since it reduces the dimensionality of these high dimensional data objects. We propose a new distance metric that is applied to symbolic data objects and we test it on time series databases in a classification task. We compare it to other distances that are well known in the literature for symbolic data objects. We also prove, mathematically, that our distance is a metric.

Haptics technology is being used more and more in different applications, such as in computer games for increased immersion, in surgical simulators to create a realistic environment for training of surgeons, in surgical robotics due to safety issues and in mobile phones to provide feedback from user action. The existence of these applications highlights a clear need to understand performance metrics for haptic interfaces and their implications on device design, use and application. Performance Metrics for Haptic Interfaces aims at meeting this need by establishing standard practices for the ev...

Electronic excitations in dilute solutions of poly para phenylene ethynylene (poly-PPE) are studied using a QM/MM approach combining many-body Green's functions theory within the $GW$ approximation and the Bethe-Salpeter equation with polarizable force field models. Oligomers up to a length of 7.5 nm (10 repeat units) functionalized with nonyl side chains are solvated in toluene and water, respectively. After equilibration using atomistic molecular dynamics (MD), the system is partitioned into a quantum region (backbone) embedded into a classical (side chains and solvent) environment. Optical absorption properties are calculated solving the coupled QM/MM system self-consistently and special attention is paid to the effects of solvents. The model allows one to differentiate the influence of oligomer conformation induced by the solvation from electronic effects related to local electric fields and polarization. It is found that the electronic environment contributions are negligible compared to the conformational ...

We consider the internal structure and the physical properties of specific classes of neutron, quark and Bose-Einstein Condensate stars in the hybrid metric-Palatini gravity theory, which is a combination of the metric and Palatini $f(R)$ formalisms. The theory is very successful in accounting for the observed phenomenology, since it unifies local constraints at the Solar System level and the late-time cosmic acceleration, even if the scalar field is very light. We derive the equilibrium equations for a spherically symmetric configuration (mass continuity and Tolman-Oppenheimer-Volkoff) in the framework of hybrid metric-Palatini theory, and we investigate their solutions numerically for different equations of state of neutron and quark matter, by adopting for the scalar field potential a Higgs-type form. Stellar models, described by the stiff fluid, radiation-like, the bag model and the Bose-Einstein Condensate equations of state are explicitly constructed in both General Relativity and hybrid metric-Palatini...
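For comparison, in General Relativity the equilibrium equations referred to above reduce to the familiar mass-continuity and Tolman-Oppenheimer-Volkoff equations (units $G = c = 1$):

$$\frac{dm}{dr} = 4\pi r^2 \rho, \qquad \frac{dP}{dr} = -\frac{(\rho + P)\,(m + 4\pi r^3 P)}{r\,(r - 2m)}.$$

In the hybrid metric-Palatini theory these acquire additional scalar-field contributions; the modified equations are not reproduced here.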

Blogs represent an important new arena for knowledge discovery in open source intelligence gathering. Bloggers are a vast network of human (and sometimes non-human) information sources monitoring important local and global events, and other blogs, for items of interest upon which they comment. Increasingly, issues erupt from the blog world and into the real world. In order to monitor blogging about important events, we must develop models and metrics that represent blogs correctly. The structure of blogs requires new techniques for evaluating such metrics as the relevance, specificity, credibility and timeliness of blog entries. Techniques that have been developed for standard information retrieval purposes (e.g. Google's PageRank) are suboptimal when applied to blogs because of their high degree of exophoricity, quotation, brevity, and rapidity of update. In this paper, we offer new metrics for blog entry relevance, specificity, timeliness and credibility that we are implementing in a blog search and analysis tool for international blogs. This tool utilizes new blog-specific metrics and techniques for extracting the necessary information from blog entries automatically, using some shallow natural language processing techniques supported by background knowledge captured in domain-specific ontologies.

Report from Dagstuhl seminar 14491. This report documents the program and the outcomes of Dagstuhl Seminar 14491 “Socio-Technical Security Metrics”. In the domain of safety, metrics inform many decisions, from the height of new dikes to the design of nuclear plants. We can state, for example, that t...

This chapter will address the importance of intercampus involvement in reporting of gainful employment student-level data that will be used in the calculation of gainful employment metrics by the U.S. Department of Education. The authors will discuss why building relationships within the institution is critical for effective gainful employment…

The strong metric dimension has been the subject of a considerable amount of research in recent years. This survey describes the related development by bringing together theoretical results and computational approaches, and places the recent results within their historical and scientific framework. [Projects of the Ministry of Science of the Republic of Serbia, nos. 174010 and 174033]

In this short Note we would like to bring to the attention of people working in General Relativity a Schwarzschild-like metric found by Professor Cleopatra Mociuțchi in the sixties. It was obtained by following A. Sommerfeld's reasoning from his treatise "Elektrodynamik", but using the relativistic energy conservation law instead of the energy conservation law of classical physics.

Area metric manifolds emerge as effective classical backgrounds in quantum string theory and quantum gauge theory, and present a true generalization of metric geometry. Here, we consider area metric manifolds in their own right, and develop in detail the foundations of area metric differential geometry. Based on the construction of an area metric curvature scalar, which reduces in the metric-induced case to the Ricci scalar, we re-interpret the Einstein-Hilbert action as dynamics for an area metric spacetime. In contrast to modifications of general relativity based on metric geometry, no continuous deformation scale needs to be introduced; the extension to area geometry is purely structural and thus rigid. We present an intriguing prediction of area metric gravity: without dark energy or fine-tuning, the late universe exhibits a small acceleration.

In this paper we consider flat metrics (semi-translation structures) on surfaces of finite type. There are two main results. The first is a complete description of when a set of simple closed curves is spectrally rigid, that is, when the length vector determines a metric among the class of flat metrics. Secondly, we give an embedding into the space of geodesic currents and use this to get a boundary for the space of flat metrics. The geometric interpretation is that flat metrics degenerate to "mixed structures" on the surface: part flat metric and part measured foliation.

The escape velocity in general relativity is obtained by means of the Schwarzschild metric, in order to match the equation of motion with the Friedmann cosmological model behaviour built in terms of the Robertson-Walker metric. (author)
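A short supporting fact (standard, and not specific to this record): for the Schwarzschild metric the escape velocity measured by a static observer at radius $r$ coincides with the Newtonian expression,

$$v_{\mathrm{esc}} = \sqrt{\frac{2GM}{r}} = c\,\sqrt{\frac{r_s}{r}}, \qquad r_s = \frac{2GM}{c^2},$$

which is one reason heuristic Newtonian energy arguments reproduce the Friedmann equation for dust in Robertson-Walker form.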

Risk is the best known and perhaps the best studied example within a much broader class of cyber security metrics. However, risk is not the only possible cyber security metric. Other metrics such as resilience can exist and could be potentially very valuable to defenders of ICS systems. Often, metrics are defined as measurable properties of a system that quantify the degree to which objectives of the system are achieved. Metrics can provide cyber defenders of an ICS with critical insights regarding the system. Metrics are generally acquired by analyzing relevant attributes of that system. In terms of cyber security metrics, ICSs tend to have unique features: in many cases, these systems are older technologies that were designed for functionality rather than security. They are also extremely diverse systems that have different requirements and objectives. Therefore, metrics for ICSs must be tailored to a diverse group of systems with many features and perform many different functions. In this chapter, we first...

On domains $\Omega\subset\mathbb{R}^n$, we consider metrics induced by continuous densities $\rho\colon\Omega\rightarrow(0,\infty)$ and study the Hausdorff and packing dimensions of the boundary of $\Omega$ with respect to these metrics.
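For concreteness, the metric induced by such a density is the standard conformal length metric (stated here for orientation):

$$d_\rho(x,y) = \inf_{\gamma} \int_\gamma \rho\, ds,$$

where the infimum is taken over rectifiable curves $\gamma$ in $\Omega$ joining $x$ to $y$.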

Recently, the phenomenology of f(R) gravity has been scrutinized motivated by the possibility to account for the self-accelerated cosmic expansion without invoking dark energy sources. Besides, this kind of modified gravity is capable of addressing the dynamics of several self-gravitating systems alternatively to the presence of dark matter. It has been established that both metric and Palatini versions of these theories have interesting features but also manifest severe and different downsides. A hybrid combination of theories, containing elements from both these two formalisms, turns out to be also very successful accounting for the observed phenomenology and is able to avoid some drawbacks of the original approaches. This article reviews the formulation of this hybrid metric-Palatini approach and its main achievements in passing the local tests and in applications to astrophysical and cosmological scenarios, where it provides a unified approach to the problems of dark energy and dark matter.

We show that if $(M,\omega)$ is a closed symplectic manifold which admits a nontrivial Hamiltonian vector field all of whose contractible closed orbits are constant, then Hofer's metric on the group of Hamiltonian diffeomorphisms of $(M,\omega)$ has infinite diameter, and indeed admits infinite-dimensional quasi-isometrically embedded normed vector spaces. A similar conclusion applies to Hofer's metric on various spaces of Lagrangian submanifolds, including those Hamiltonian-isotopic to the diagonal in $M \times M$ when M satisfies the above dynamical condition. To prove this, we use the properties of a Floer-theoretic quantity called the boundary depth, which measures the nontriviality of the boundary operator on the Floer complex in a way that encodes robust symplectic-topological information.

For complete affine manifolds we introduce a definition of compactification based on the projective differential geometry (i.e., geodesic path data) of the given connection. The definition of projective compactness involves a real parameter $\alpha$ called the order of projective compactness. For volume preserving connections, this order is captured by a notion of volume asymptotics that we define. These ideas apply to complete pseudo-Riemannian spaces, via the Levi-Civita connection, and thus provide a notion of compactification alternative to conformal compactification. For each order $\alpha$, we provide an asymptotic form of a metric which is sufficient for projective compactness of the given order, thus also providing many local examples. Distinguished classes of projectively compactified geometries of orders one and two are associated with Ricci-flat connections and non-Ricci-flat Einstein metrics, respectively. Conversely, these geometric conditions are shown to force the indicated order of projectiv...

Quality of care in the context of inpatient neurology is the standard of performance by neurologists and the hospital system as measured against ideal models of care. There are growing regulatory pressures to define health care value through concrete quantifiable metrics linked to reimbursement. Theoretical models of quality acknowledge its multimodal character with quantitative and qualitative dimensions. For example, the Donabedian model distils quality as a phenomenon of three interconnected domains, structure-process-outcome, with each domain mutually influential. The actual measurement of quality may be implicit, as in peer review in morbidity and mortality rounds, or explicit, in which criteria are prespecified and systemized before assessment. As a practical contribution, in this article a set of candidate quality indicators for inpatient neurology based on an updated review of treatment guidelines is proposed. These quality indicators may serve as an initial blueprint for explicit quality metrics long overdue for inpatient neurology.

There is a saying attributed to John Wanamaker: "Half the money I spend on advertising is wasted; the trouble is, I don't know which half". Today you have opportunities to determine which parts of your marketing efforts are effective and which are wasted, but you have to measure your marketing results. This article discusses marketing metrics and how to use them to get the best bang for your marketing buck.

We present a stationary generalization of the static $q-$metric, the simplest generalization of the Schwarzschild solution that contains a quadrupole parameter. It possesses three independent parameters that are related to the mass, quadrupole moment and angular momentum. We investigate the geometric and physical properties of this exact solution of Einstein's vacuum equations, and show that it can be used to describe the exterior gravitational field of rotating, axially symmetric, compact objects.

An $n$-dimensional strictly pseudoconvex Hartogs domain $D_F$ can be equipped with a natural Kaehler metric $g_F$. In this paper we prove that if $m_0 g_F$ is balanced for a given positive integer $m_0$, then $m_0 > n$ and $(D_F, g_F)$ is holomorphically isometric to an open subset of the $n$-dimensional complex hyperbolic space.

We generalize the notion of the Futaki invariant and extremal vector field to the general almost-Kahler case, and we prove the periodicity of the extremal vector field when the symplectic form represents an integral cohomology class modulo torsion. We also give an explicit formula for the hermitian scalar curvature, which allows us to obtain examples of non-integrable extremal almost-Kahler metrics saturating LeBrun's estimates.

Today digital sources supply a historically unprecedented component of human sensorimotor data, the consumption of which is correlated with poorly understood maladies such as Internet addiction disorder and Internet gaming disorder. Because both natural and digital sensorimotor data share common mathematical descriptions, one can quantify our informational sensorimotor needs using the signal processing metrics of entropy, noise, dimensionality, continuity, latency, and bandwidth. Such metrics describe in neutral terms the informational diet human brains require to self-calibrate, allowing individuals to maintain trusting relationships. With these metrics, we define the trust humans experience using the mathematical language of computational models, that is, as a primitive statistical algorithm processing finely grained sensorimotor data from neuromechanical interaction. This definition of neuromechanical trust implies that artificial sensorimotor inputs and interactions that attract low-level attention through frequent discontinuities and enhanced coherence will decalibrate a brain's representation of its world over the long term by violating the implicit statistical contract for which self-calibration evolved. Our hypersimplified mathematical understanding of human sensorimotor processing as multiscale, continuous-time vibratory interaction allows equally broad-brush descriptions of failure modes and solutions. For example, we model addiction in general as the result of homeostatic regulation gone awry in novel environments (sign reversal) and digital dependency as a sub-case in which the decalibration caused by digital sensorimotor data spurs yet more consumption of them. We predict that institutions can use these sensorimotor metrics to quantify media richness to improve employee well-being; that dyads and family-size groups will bond and heal best through low-latency, high-resolution multisensory interaction such as shared meals and reciprocated touch; and

It is shown that the metric of clusters of galaxies should be universal, depending only on the fundamental constants and compatible with the metric of the universe. There are examples of universal metrics obtained in Einstein's theory of gravitation. On the basis of axisymmetric solutions of Einstein's equation, a universal metric describing the properties of galaxies, groups and clusters of galaxies is proposed.

We present metrics for measuring the similarity of states in a finite Markov decision process (MDP). The formulation of our metrics is based on the notion of bisimulation for MDPs, with an aim towards solving discounted infinite horizon reinforcement learning tasks. Such metrics can be used to aggregate states, as well as to better structure other value function approximators (e.g., memory-based or nearest-neighbor approximators). We provide bounds that relate our metric distances to the opti...
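
For orientation, a standard way to formulate such a bisimulation metric (in the style of Ferns et al.; this formula is our addition, not quoted from the record) is as the least fixed point of

\[
d(s,t) \;=\; \max_{a \in A} \Big( c_R \,\big|R(s,a) - R(t,a)\big| \;+\; c_T \,\mathcal{W}_{d}\big(P(\cdot\mid s,a),\, P(\cdot\mid t,a)\big) \Big),
\]

where $\mathcal{W}_d$ is the Kantorovich (1-Wasserstein) distance with ground metric $d$, and constants $c_R, c_T \ge 0$ with $c_T < 1$ trade off reward differences against transition differences; states at distance zero are exactly the bisimilar ones.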

An almost contact metric 3-submersion is a Riemannian submersion $\pi$ from an almost contact metric manifold $(M^{4m+3}, (\varphi_i, \xi_i, \eta_i)_{i=1}^{3}, g)$ onto an almost quaternionic manifold $(N^{4n}, (J_i)_{i=1}^{3}, h)$ which commutes with the structure tensors of type (1,1); i.e., $\pi_*\varphi_i = J_i\pi_*$ for $i = 1, 2, 3$. For various restrictions on $\nabla\varphi_i$ (e.g., M is 3-Sasakian), we show corresponding limitations on the second fundamental form of the fibres and on the complete integrability of the horizontal distribution. Concomitantly, relations are derived between the Betti numbers of a compact total space and the base space. For instance, if M is 3-quasi-Sasakian ($d\Phi = 0$), then $b_1(N) \le b_1(M)$. The respective $\varphi_i$-holomorphic sectional and bisectional curvature tensors are studied and several unexpected results are obtained. As an example, if X and Y are orthogonal horizontal vector fields on the 3-contact (a relatively weak structure) total space of such a submersion, then the respective holomorphic bisectional curvatures satisfy $B_{\varphi_i}(X,Y) = B'_{J_i}(X_*,Y_*) - 2$. Applications to the real differential geometry of Yang-Mills field equations are indicated, based on the fact that a principal SU(2)-bundle over a compactified realized space-time can be given the structure of an almost contact metric 3-submersion.

The evaluation of a Graphical User Interface (GUI) plays a significant role in improving its quality, yet very few metrics exist for GUI evaluation. The purpose of metrics is to obtain better measurements in terms of risk management, reliability forecasting, project scheduling, and cost repression. In this paper a structural complexity metric is proposed for the evaluation of Graphical User Interfaces, with structural complexity considered as an indicator of complexity. The goal of identifying structural complexity is to measure GUI testability. In this testability evaluation, a process for measuring the complexity of the user interface from a testing perspective is proposed. For GUI evaluation and the calculation of structural complexity, an assessment process is designed based on types of events. A fuzzy model is developed to evaluate the structural complexity of a GUI; this model takes five types of events as input and returns the structural complexity of the GUI as output. Further, a relationship is established between structural complexity and the testability of event-driven software. The proposed model is evaluated with four different applications. It is evident from the results that the higher the complexity, the lower the testability of the application.

We consider two-player games played over finite state spaces for an infinite number of rounds. At each state, the players simultaneously choose moves; the moves determine a successor state. It is often advantageous for players to choose probability distributions over moves, rather than single moves. Given a goal, for example, reach a target state, the question of winning is thus a probabilistic one: what is the maximal probability of winning from a given state? On these game structures, two fundamental notions are those of equivalences and metrics. Given a set of winning conditions, two states are equivalent if the players can win the same games with the same probability from both states. Metrics provide a bound on the difference in the probabilities of winning across states, capturing a quantitative notion of state similarity. We introduce equivalences and metrics for two-player game structures, and we show that they characterize the difference in probability of winning games whose goals are expressed in the...

This report reviews the findings of two projects funded by the National Institute of Education (NIE) and conducted by the American Institutes for Research (AIR). The project reports, "Going Metric" and "Metric Inservice Teacher Training," document the impact of metric conversion on the educational systems of Great Britain, New Zealand, Australia,…

Researchers use many different metrics for evaluation of performance of student models. The aim of this paper is to provide an overview of commonly used metrics, to discuss properties, advantages, and disadvantages of different metrics, to summarize current practice in educational data mining, and to provide guidance for evaluation of student…

This classroom guide for metric education includes a brief rationale and history of metrics, a preliminary metric quiz, a symbol summary, and a list of recommended instructional materials. The guide consists primarily of four sections covering the topics of weight, length, volume, and temperature. Each of these sections contains goals and…

We consider the internal structure and the physical properties of specific classes of neutron, quark and Bose-Einstein condensate stars in the recently proposed hybrid metric-Palatini gravity theory, which is a combination of the metric and Palatini $f(R)$ formalisms. It turns out that the theory is very successful in accounting for the observed phenomenology, since it unifies local constraints at the Solar System level and the late-time cosmic acceleration, even if the scalar field is very light. In this paper, we derive the equilibrium equations for a spherically symmetric configuration (mass continuity and Tolman-Oppenheimer-Volkoff) in the framework of the scalar-tensor representation of the hybrid metric-Palatini theory, and we investigate their solutions numerically for different equations of state of neutron and quark matter, by adopting for the scalar field potential a Higgs-type form. It turns out that the scalar-tensor definition of the potential can be represented as a Clairaut differential equation, and provides an explicit form for $f(R)$ given by $f(R) \sim R + \Lambda_{\rm eff}$, where $\Lambda_{\rm eff}$ is an effective cosmological constant. Furthermore, stellar models, described by the stiff fluid, radiation-like, bag model and Bose-Einstein condensate equations of state, are explicitly constructed in both general relativity and hybrid metric-Palatini gravity, thus allowing an in-depth comparison between the predictions of these two gravitational theories. As a general result it turns out that for all the considered equations of state, hybrid gravity stars are more massive than their general relativistic counterparts. Furthermore, two classes of stellar models corresponding to two particular choices of the functional form of the scalar field (constant value, and logarithmic form, respectively) are also investigated. Interestingly enough, in the case of a constant scalar field the equation of state of the matter takes the form of the bag model equation of state describing

In the National Library of Finland (NLF) there are millions of digitized newspaper and journal pages, which are openly available via the public website http://digi.kansalliskirjasto.fi. To serve users better, last year the front end was completely overhauled with its main aim in crowdsourcing features, e.g., by giving end-users the opportunity to create digital clippings and a personal scrapbook from the digital collections. But how can you know whether crowdsourcing has had an impact? How much have the crowdsourcing functionalities been used so far? Did crowdsourcing work? In this paper the statistics and metrics of a recent crowdsourcing effort are analysed across the different digitized material types (newspapers, journals, ephemera). The subjects, categories and keywords given by the users are analysed to see which topics are the most appealing. Some notable public uses of the crowdsourced article clippings are highlighted. These metrics give us indications on how the end-users, based on their own interests, are investigating and using the digital collections. Therefore, the suggested metrics illustrate the versatility of the information needs of the users, varying from citizen science to research purposes. By analysing the user patterns, we can respond to the new needs of the users by making minor changes to accommodate the most active participants, while still making the service more approachable for those who are trying out the functionalities for the first time. Participation in the clippings and annotations can enrich the materials in unexpected ways and can possibly pave the way for opportunities of using crowdsourcing more also in research contexts. This creates more opportunities for the goals of open science since source data becomes available, making it possible for researchers to reach out to the general public for help. In the long term, utilizing, for example, text mining methods can allow these different end-user segments to

The spectral metric, defined by Schwarz and Oh using Floer-theoretic methods, is a bi-invariant metric on the Hamiltonian diffeomorphism group. We show in this note that for certain symplectic manifolds, this metric cannot be extended to a bi-invariant metric on the full group of symplectomorphisms. We also study the bounded isometry conjecture of Lalonde and Polterovich in the context of the spectral metric. In particular, we show that the conjecture holds for the torus with all linear symplectic forms.

The goal of the paper is to study the angle between two curves in the framework of metric (and metric measure) spaces. More precisely, we give a new notion of angle between two curves in a metric space. Such a notion has a natural interplay with optimal transportation and is particularly well suited for metric measure spaces satisfying the curvature-dimension condition. Indeed, one of the main results is the validity of the cosine formula on RCD*(K, N) metric measure spaces. As a consequence, the newly introduced notions are compatible with the corresponding classical ones for Riemannian manifolds, Ricci limit spaces and Alexandrov spaces.
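
For context, one standard way to define the angle between two unit-speed curves $\gamma_1, \gamma_2$ issuing from the same point of a metric space $(X, d)$ uses Euclidean comparison (our notation; the paper's definition is adapted to curves that need not be geodesics):

\[
\cos \angle(\gamma_1, \gamma_2) \;=\; \lim_{s,t \to 0^{+}} \frac{s^{2} + t^{2} - d\big(\gamma_1(s), \gamma_2(t)\big)^{2}}{2\,s\,t},
\]

whenever the limit exists; the cosine formula mentioned above is a statement of this type, validated on RCD*(K, N) spaces.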

The authors extend the notion of statistical structure from Riemannian geometry to the general framework of path spaces endowed with a nonlinear connection and a generalized metric. Two particular cases of statistical data are defined. The existence and uniqueness of a nonlinear connection corresponding to these classes is proved. Two Koszul tensors are introduced in accordance with the Riemannian approach. As applications, the authors treat the Finslerian (α,β)-metrics and the Beil metrics used in relativity and field theories, while the support Riemannian metric is the Fisher-Rao metric of a statistical model.

This is a practical guide to using web metrics to measure impact and demonstrate value. The web provides an opportunity to collect a host of different metrics, from those associated with social media accounts and websites to more traditional research outputs. This book is a clear guide for library and information professionals as to what web metrics are available and how to assess and use them to make informed decisions and demonstrate value. As individuals and organizations increasingly use the web in addition to traditional publishing avenues and formats, this book provides the tools to unlock web metrics and evaluate the impact of this content. The key topics covered include: bibliometrics, webometrics and web metrics; data collection tools; evaluating impact on the web; evaluating social media impact; investigating relationships between actors; exploring traditional publications in a new environment; web metrics and the web of data; the future of web metrics and the library and information professional.Th...

In four dimensions, the most general metric admitting two Killing vectors and a rank-two Killing tensor can be parameterized by ten arbitrary functions of a single variable. We show that picking a special vierbein, reducing the system to eight functions, implies the existence of two geodesic and shear-free null congruences, generated by two principal null directions of the Weyl tensor. Thus, if the spacetime is an Einstein manifold, the Goldberg-Sachs theorem implies it is Petrov type D and, by explicit construction, is in the Carter class. Hence, our analysis provides a straightforward connection between the most general integrable structure and the Carter family of spacetimes.

As the penetration of variable generation (wind and solar) increases around the world, there is an accompanying growing interest and importance in accurately assessing the contribution that these resources can make toward planning reserve. This contribution, also known as the capacity credit or capacity value of the resource, is best quantified by using a probabilistic measure of overall resource adequacy. In recognizing the variable nature of these renewable resources, there has been interest in exploring the use of reliability metrics other than loss of load expectation. In this paper, we undertake some comparisons using data from the Western Electricity Coordinating Council in the western United States.

Electronic excitations in dilute solutions of poly para phenylene ethynylene (poly-PPE) are studied using a QM/MM approach combining many-body Green's functions theory within the GW approximation and the Bethe-Salpeter equation with polarizable force field models. Oligomers up to a length of 7.5 nm (10 repeat units) functionalized with nonyl side chains are solvated in toluene and water, respectively. After equilibration using atomistic molecular dynamics (MD), the system is partitioned into a quantum region (backbone) embedded into a classical (side chains and solvent) environment. Optical absorption properties are calculated by solving the coupled QM/MM system self-consistently, and special attention is paid to the effects of solvents. The model allows one to differentiate the influence of oligomer conformation induced by the solvation from electronic effects related to local electric fields and polarization. It is found that the electronic environment contributions are negligible compared to the conformational dynamics of the conjugated PPE. An analysis of the electron-hole wave function reveals a sensitivity of the energy and localization characteristics of the excited states to bends in the global conformation of the oligomer rather than to the relative orientation of phenyl rings along the backbone.

The scale quality of indirect and direct scalings of the intensity of emotional experiences was investigated from the perspective of representational measurement theory. Study 1 focused on sensory pleasantness and disgust, Study 2 on surprise and amusement, and Study 3 on relief and disappointment. In each study, the emotion intensities elicited by a set of stimuli were estimated using Ordinal Difference Scaling, an indirect probabilistic scaling method based on graded pair comparisons. The obtained scale values were used to select test cases for the quadruple axiom, a central axiom of difference measurement. A parametric bootstrap test was used to decide whether the participants' difference judgments systematically violated the axiom. Most participants passed this test. The indirect scalings of these participants were then linearly correlated with their direct emotion intensity ratings to determine whether they agreed with them up to measurement error, and hence might be metric as well. The majority of the participants did not pass this test. The findings suggest that Ordinal Difference Scaling allows emotion intensity to be measured on a metric scale level for most participants. As a consequence, quantitative emotion theories become amenable to empirical tests at the individual level using indirect measurements of emotional experience.

This is a corrected and essentially extended version of the unpublished manuscript by Y Nutku and M Sheftel which contains new results. It is proposed to be published in honour of Y Nutku’s memory. All corrections and new results in sections 1, 2 and 4 are due to M Sheftel. We present new anti-self-dual exact solutions of the Einstein field equations with Euclidean and neutral (ultra-hyperbolic) signatures that admit only one rotational Killing vector. Such solutions of the Einstein field equations are determined by non-invariant solutions of Boyer-Finley (BF) equation. For the case of Euclidean signature such a solution of the BF equation was first constructed by Calderbank and Tod. Two years later, Martina, Sheftel and Winternitz applied the method of group foliation to the BF equation and reproduced the Calderbank-Tod solution together with new solutions for the neutral signature. In the case of Euclidean signature we obtain new metrics which asymptotically locally look like a flat space and have a non-removable singular point at the origin. In the case of ultra-hyperbolic signature there exist three inequivalent forms of metric. Only one of these can be obtained by analytic continuation from the Calderbank-Tod solution whereas the other two are new.

This report documents part of the work performed in phase I of a Laboratory Directed Research and Development (LDRD) funded project entitled Building Performance Assurances (BPA). The focus of the BPA effort is to transform the way buildings are built and operated in order to improve building performance by facilitating or providing tools, infrastructure, and information. The efforts described herein focus on the development of metrics with which to evaluate building performance and for which information and optimization tools need to be developed. The classes of building performance metrics reviewed are (1) Building Services, (2) First Costs, (3) Operating Costs, (4) Maintenance Costs, and (5) Energy and Environmental Factors. The first category defines the direct benefits associated with buildings; the next three are different kinds of costs associated with providing those benefits; the last category includes concerns that are broader than direct costs and benefits to the building owner and building occupants. The level of detail of the various issues reflects the current state of knowledge in those scientific areas and the ability to determine that state of knowledge, rather than directly reflecting the importance of these issues; the report intentionally does not specifically focus on energy issues. It describes work in progress, is intended as a resource, and can be used to indicate the areas needing more investigation. Other reports on BPA activities are also available.

Matching the visual appearance of the target over consecutive image frames is the most critical issue in video-based object tracking. Choosing an appropriate distance metric for matching determines its accuracy and robustness, and thus significantly influences the tracking performance. Most existing tracking methods employ fixed, pre-specified distance metrics. However, this simple treatment is problematic and limited in practice, because a pre-specified metric is not likely to guarantee that the closest match is the true target of interest. This paper presents a new tracking approach that incorporates adaptive metric learning into the framework of visual object tracking. Collecting a set of supervised training samples on-the-fly in the observed video, this new approach automatically learns the optimal distance metric for more accurate matching. The design of the learned metric ensures that the closest match is very likely to be the true target of interest, based on the supervised training; such a learned metric is discriminative and adaptive. This paper instantiates this new approach in a solid case study of adaptive-metric differential tracking, and obtains a closed-form analytical solution to motion estimation and visual tracking. Moreover, this paper extends the basic linear distance metric learning method to a more powerful nonlinear kernel metric learning method. Extensive experiments validate the effectiveness of the proposed approach and demonstrate the improved performance of the proposed new tracking method.
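
A minimal sketch of the general idea (not the paper's actual algorithm; all names and the weighting scheme are ours): learn a diagonal Mahalanobis-style metric from target and background samples gathered during tracking, so that discriminative feature dimensions receive larger weights.

    import numpy as np

    def learn_diag_metric(target_feats, distractor_feats, eps=1e-8):
        """Weight each feature dimension by between/within variance,
        a crude stand-in for full Mahalanobis metric learning."""
        mu_t = target_feats.mean(axis=0)
        mu_d = distractor_feats.mean(axis=0)
        within = target_feats.var(axis=0) + distractor_feats.var(axis=0) + eps
        between = (mu_t - mu_d) ** 2
        return between / within  # per-dimension weights w

    def metric_dist(w, x, y):
        return np.sqrt(np.sum(w * (x - y) ** 2))

    # toy usage: match candidate patches to a target template
    rng = np.random.default_rng(0)
    target = rng.normal(0.0, 1.0, (20, 8))     # hypothetical target patches
    distract = rng.normal(1.5, 1.0, (20, 8))   # hypothetical background patches
    w = learn_diag_metric(target, distract)
    template = target.mean(axis=0)
    candidates = np.vstack([rng.normal(0.0, 1.0, 8), rng.normal(1.5, 1.0, 8)])
    best = min(range(len(candidates)),
               key=lambda i: metric_dist(w, template, candidates[i]))
    print("best candidate:", best)  # expected: 0, the target-like patch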

Several object-oriented metrics have been developed and used in conjunction with quality models to predict the overall quality of software. However, it may not be enough to propose metrics; the fundamental question is that of their validity, utility and reliability. It is much more significant to be sure that these metrics are really useful, and for that their construct validity must be assured. Thereby, good quality metrics must be developed using a foolproof and sound framework/model. A critical review of the literature on attempts in this regard reveals that there is no standard framework or model available for such an important activity. This study presents a framework for quality metric development called the Metric Development Framework (qMDF), which is prescriptive in nature. qMDF is a general framework, but it has been established especially with ideas of object-oriented metrics. qMDF has been implemented to develop a good quality design metric, as a validation of the proposed framework. Finally, it is argued that adoption of qMDF by metric developers would yield good quality metrics, while ensuring their construct validity, utility, reliability and reduced developmental effort.

We present a study of a new metric for characterizing spectral line bisectors from integrated Sun-as-a-star measurements of photospheric spectral lines. This metric, which we call bisector area, differs from previous analysis methods in that it characterizes the entire bisector (up to a maximum intensity limit) with a single scalar quantity. In this preliminary study, we analyzed data from the Synoptic Optical Long-term Investigations of the Sun Integrated Sunlight Spectrometer (SOLIS/ISS) during the decline and rise of solar cycles 23 and 24, respectively, to diagnose any potential correlations between bisector area and other known changes in spectral line properties as a function of time. This work was carried out through the National Solar Observatory Research Experiences for Undergraduates (REU) Program, which is funded by the National Science Foundation (NSF). The National Solar Observatory is operated by the Association of Universities for Research in Astronomy, Inc. (AURA) under cooperative agreement with the NSF.
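
To make the notion concrete, here is a rough sketch of computing a line bisector and collapsing it to a single area scalar (our construction with made-up parameters; the actual SOLIS/ISS processing surely differs):

    import numpy as np

    def bisector_area(wavelength, intensity, n_levels=50, max_frac=0.9):
        """Line bisector of an absorption profile, collapsed to one scalar.

        For each intensity level between the line core and max_frac of the
        peak intensity, find the blue- and red-wing wavelengths by linear
        interpolation and take their midpoint; the 'bisector area' is the
        trapezoidal integral of the bisector's displacement from the core
        wavelength over those levels.
        """
        i_core = int(np.argmin(intensity))
        core_wl = wavelength[i_core]
        levels = np.linspace(intensity[i_core], max_frac * intensity.max(),
                             n_levels)
        # each wing is fed to np.interp with increasing intensities
        blue = np.interp(levels, intensity[:i_core + 1][::-1],
                         wavelength[:i_core + 1][::-1])
        red = np.interp(levels, intensity[i_core:], wavelength[i_core:])
        disp = 0.5 * (blue + red) - core_wl
        return float(np.sum(0.5 * (disp[1:] + disp[:-1]) * np.diff(levels)))

    # toy usage: a slightly asymmetric absorption line
    wl = np.linspace(-1.0, 1.0, 401)
    profile = 1.0 - 0.8 * np.exp(-0.5 * ((wl - 0.05 * wl ** 2) / 0.2) ** 2)
    print(bisector_area(wl, profile))  # nonzero for an asymmetric line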

The tree metric theorem provides a combinatorial four point condition that characterizes dissimilarity maps derived from pairwise compatible split systems. A similar (but weaker) four point condition characterizes dissimilarity maps derived from circular split systems (Kalmanson metrics). The tree metric theorem was first discovered in the context of phylogenetics and forms the basis of many tree reconstruction algorithms, whereas Kalmanson metrics were first considered by computer scientists, and are notable in that they are a non-trivial class of metrics for which the traveling salesman problem is tractable. We present a unifying framework for these theorems based on combinatorial structures that are used for graph planarity testing. These are (projective) PC-trees, and their affine analogs, PQ-trees. In the projective case, we generalize a number of concepts from clustering theory, including hierarchies, pyramids, ultrametrics and Robinsonian matrices, and the theorems that relate them. As with tree metric...
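
To make the condition concrete, here is a small generic checker (our own illustration; the function names and tolerance parameter are ours). The four point condition states that for every quadruple {i, j, k, l}, the two largest of the three pairwise sums must be equal:

    from itertools import combinations

    def is_tree_metric(d, points, tol=1e-9):
        """Check the four point condition on a dissimilarity map d.

        d[(a, b)] holds the dissimilarity of a and b (symmetric, zero on
        the diagonal)."""
        def dd(a, b):
            if a == b:
                return 0.0
            return d[(a, b)] if (a, b) in d else d[(b, a)]

        for i, j, k, l in combinations(points, 4):
            sums = sorted([dd(i, j) + dd(k, l),
                           dd(i, k) + dd(j, l),
                           dd(i, l) + dd(j, k)])
            if sums[2] - sums[1] > tol:  # two largest sums must coincide
                return False
        return True

    # toy usage: distances on the path tree a-b-c-d with unit edges
    pts = "abcd"
    dist = {("a", "b"): 1, ("a", "c"): 2, ("a", "d"): 3,
            ("b", "c"): 1, ("b", "d"): 2, ("c", "d"): 1}
    print(is_tree_metric(dist, pts))  # True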

A special class of metrics, called universal metrics, solves all gravity theories defined by covariant field equations purely based on the metric tensor. Since we currently lack the knowledge of what the full quantum-corrected field equations of gravity are at a given microscopic length scale, these metrics are particularly important in understanding quantum fields in curved backgrounds in a consistent way. However, finding explicit universal metrics has been a difficult problem as there does not seem to be a procedure for it. In this work, we overcome this difficulty and give a construction of universal metrics of d -dimensional spacetime from curves constrained to live in a (d -1 )-dimensional Minkowski spacetime or a Euclidean space.

An optimal transport path may be viewed as a geodesic in the space of probability measures under a suitable family of metrics. This geodesic may exhibit a tree-shaped branching structure in many applications such as trees, blood vessels, draining and irrigation systems. Here, we extend the study of ramified optimal transportation between probability measures from Euclidean spaces to a geodesic metric space. We investigate the existence as well as the behavior of optimal transport paths under various properties of the metric such as completeness, doubling, or curvature upper boundedness. We also introduce the transport dimension of a probability measure on a complete geodesic metric space, and show that the transport dimension of a probability measure is bounded above by the Minkowski dimension and below by the Hausdorff dimension of the measure. Moreover, we introduce a metric, called "the dimensional distance", on the space of probability measures. This metric gives a geometric meaning to the transport dimen...

The Kerr-Newman metric describes a very special rotating, charged mass and is the most general of the asymptotically flat stationary 'black hole' solutions to the Einstein-Maxwell equations of general relativity. We review the derivation of this metric from the Reissner-Nordstrom solution by means of a complex transformation algorithm and provide a brief overview of its basic geometric properties. We also include some discussion of interpretive issues, related metrics, and higher-dimensional analogues.

This paper describes an effort to identify common metrics for task-oriented human-robot interaction (HRI). We begin by discussing the need for a toolkit of HRI metrics. We then describe the framework of our work and identify important biasing factors that must be taken into consideration. Finally, we present suggested common metrics for standardization and a case study. Preparation of a larger, more detailed toolkit is in progress.

In this short note, we prove that the space of all admissible piecewise linear metrics, parameterized by squared edge lengths on a triangulated manifold, is a convex cone. We further study Regge's Einstein-Hilbert action and give a much more reasonable definition of discrete Einstein metric than our former version in \cite{G}. Finally, we introduce a discrete Ricci flow for three-dimensional triangulated manifolds, which is closely related to the existence of discrete Einstein metrics.

Used just as they are, the metrics in this book will bring many benefits to both the IT department and the business as a whole. Details of the attributes of each metric are given, enabling you to make the right choices for your business. You may prefer, and are encouraged, to design and create your own metrics to bring even more value to your business - this book will show you how to do this, too.

This paper deals with the application of metric observers to induction motors. Firstly, assuming that stator currents and speed are measured, a metric observer is designed to estimate the rotor fluxes. Secondly, assuming that only stator currents are measured, another metric observer is derived to estimate rotor fluxes and speed. The validity of the proposed observers is checked through simulations on a 4 kW induction motor drive.

We prove that Nakhleh's metric for reduced phylogenetic networks is also a metric on the classes of tree-child phylogenetic networks, semibinary tree-sibling time consistent phylogenetic networks, and multilabeled phylogenetic trees. We also prove that it separates distinguishable phylogenetic networks. In this way, it becomes the strongest dissimilarity measure for phylogenetic networks available so far. Furthermore, we propose a generalization of that metric that separates arbitrary phylogenetic networks.

A wide variety of full-size monoclonal antibodies (mAbs) and therapeutics derived from alternative antibody formats can be produced through genetic and biological engineering techniques. These molecules are now filling the preclinical and clinical pipelines of every major pharmaceutical company and many biotechnology firms. Metrics for the development of antibody therapeutics, including averages for the number of candidates entering clinical study and development phase lengths for mAbs approved in the United States, were derived from analysis of a dataset of over 600 therapeutic mAbs that entered clinical study sponsored, at least in part, by commercial firms. The results presented provide an overview of the field and context for the evaluation of on-going and prospective mAb development programs. The expansion of therapeutic antibody use through supplemental marketing approvals and the increase in the study of therapeutics derived from alternative antibody formats are discussed.

The modeling of concepts from a cognitive perspective is important for designing spatial information systems that interoperate with human users. Concept representations that are built using geometric and topological conceptual space structures are well suited for semantic similarity and concept combination operations. In addition, concepts that are more closely grounded in the physical world, such as many spatial concepts, have a natural fit with the geometric structure of conceptual spaces. Despite these apparent advantages, conceptual spaces are underutilized because existing formalizations of conceptual space theory have focused on individual aspects of the theory rather than the creation of a comprehensive algebra. In this paper we present a metric conceptual space algebra that is designed to facilitate the creation of conceptual space knowledge bases and inferencing systems. Conceptual regions are represented as convex polytopes and context is built in as a fundamental element. We demonstrate the applicability of the algebra to spatial information systems with a proof-of-concept application.

An information system is a special kind of product that depends on a great number of variables related to its nature, the conditions during implementation, and the organizational climate and culture. Quality metrics of information systems (QMIS) therefore have to reflect all of these aspects of information systems. In this paper the basic elements of QMIS are presented, together with characteristics of implementation and operation metrics for IS, team-management quality metrics for IS, and organizational aspects of quality metrics. In the second part of the paper, results of a study of QMIS in the area of MIS (management information systems) are presented.

In joint work with Chen and Weber, the author has elsewhere shown that CP2#2(-CP2) admits an Einstein metric. The present paper presents a new and rather different proof of the existence of such an Einstein metric, using a variational approach which simultaneously casts new light on the related uniqueness problem. Our results include new existence theorems for extremal Kahler metrics, and these allow one to prove the above existence statement by deforming the Kahler-Einstein metric on CP2#3(-CP2) until bubbling-off occurs.

Node similarity is a significant property driving the growth of real networks. In this paper, based on the observed spreading results we apply the node similarity metrics to reconstruct propagation networks. We find that the reconstruction accuracy of the similarity metrics is strongly influenced by the infection rate of the spreading process. Moreover, there is a range of infection rate in which the reconstruction accuracy of some similarity metrics drops to nearly zero. In order to improve the similarity-based reconstruction method, we finally propose a temporal similarity metric to take into account the time information of the spreading. The reconstruction results are remarkably improved with the new method.
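
As a toy illustration of the final point (our own construction, not the paper's metric): score a candidate edge (u, v) higher when u and v appear in the same cascades with nearby infection times, then rank node pairs by score.

    import numpy as np
    from itertools import combinations

    def temporal_similarity(cascades, u, v, tau=1.0):
        """Similarity of nodes u, v from observed infection times.

        cascades: list of dicts, one per cascade, mapping node -> infection
        time (nodes absent from a dict were never infected in that cascade).
        Co-infections count more when the infection times are close."""
        score = 0.0
        for c in cascades:
            if u in c and v in c:
                score += np.exp(-abs(c[u] - c[v]) / tau)
        return score

    def reconstruct(cascades, nodes, n_edges):
        """Return the n_edges node pairs with highest temporal similarity."""
        pairs = list(combinations(nodes, 2))
        scores = [temporal_similarity(cascades, u, v) for u, v in pairs]
        order = np.argsort(scores)[::-1]
        return [pairs[i] for i in order[:n_edges]]

    # toy usage: two cascades over the line graph 0-1-2
    cascades = [{0: 0.0, 1: 1.1, 2: 2.0}, {2: 0.0, 1: 0.9}]
    print(reconstruct(cascades, [0, 1, 2], n_edges=2))  # [(1, 2), (0, 1)]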

We provide further crucial support for a refined, area metric structure of spacetime. Based on the solution of conceptual issues, such as the consistent coupling of fermions and the covariant identification of radiation fields on area metric backgrounds, we show that the radiation-dominated epoch of area metric cosmology is equivalent to that epoch in standard Einstein cosmology. This ensures, in particular, successful nucleosynthesis. This surprising result complements the previously derived prediction of a small late-time acceleration of an area metric universe.

To propose a metric for qualifying the production conveyed through articles published in professional master's programs and, from there, to establish guidance for the evaluation of the postgraduate programs of Medicine III. We analyzed the 2013 area documents of stricto sensu graduate programs concerning the application and measurement of scores for published articles, and created a proposal for a metric on this theme in view of the quadrennial review of Medicine III. The area documents of Medicine I, II and III, Biological Sciences (I) and Interdisciplinary were evaluated, as well as the 2013 reports of CAPES. All programs establish metrics for the "classification of published articles" within their bibliographic production, although with different percentages respecting their specificities. With these data collected, and correlating their relevance with the surgical areas, a proposal was drafted for quantifying the quality of articles published by professional postgraduate programs in the surgical area, which have specific characteristics according to their guidelines, directing their scientific production preferably to technical journals. The metric suggested for published articles, which should be included in the intellectual production of the area document, is: A1 = 100 points; A2 = 85 points; B1 = 80 points; B2 = 70 points; B3 = 60 points; B4 = 40 points; and B5 = 20 points.

Sixth-grade students and teachers were tested to determine students' metric achievement and their teachers' attitudes toward metric instruction after seven years of regular classroom instruction. Results were somewhat disappointing. (MNS)

"Bibliometrics", "scientometrics", "informetrics", and "webometrics" can all be considered as manifestations of a single research area with similar objectives and methods, which we call "information metrics" or iMetrics. This study explores the cognitive and social distinctness of iMetrics with resp

A metric representing a slow rotating object with quadrupole moment is obtained using the Newman-Janis formalism to include rotation into the weak limit of the Erez-Rosen metric. This metric is intended to tackle relativistic astrometry and gravitational lensing problems in which a quadrupole moment has to be taken into account.

The notion of a fuzzy set field is introduced. A fuzzy metric is redefined on the fuzzy set field and on an arbitrary fuzzy set in a field. The redefined metric is between fuzzy points and captures both the fuzziness and the crisp property of a vector. In addition, a fuzzy magnitude of a fuzzy point in a field is defined.

Designed to meet the job-related metric measurement needs of students in automotive merchandising and petroleum marketing classes, this instructional package is one of five for the marketing and distribution cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know…

In this paper, we construct an invariant metric in the space of homogeneous polynomials of a given degree (≥ 3). The homogeneous polynomials specify a nonlinear symplectic map which in turn represents a Hamiltonian system. By minimizing the norm constructed out of this metric as a function of system parameters, we demonstrate that the performance of a nonlinear Hamiltonian system is enhanced.

In this paper we will prove that the only compact 4-manifold M with an Einstein metric of positive sectional curvature which is also hermitian with respect to some complex structure on M, is the complex projective plane CP^2, with its Fubini-Study metric.

We prove that, if a finite metric space is of strictly negative type, then its transfinite diameter is uniquely realized by the infinite extender (load vector). Finite metric spaces that have this property include all spaces on two, three, or four points, all trees, and all finite subspaces of Eu...

A new rotating version of the Curzon-Chazy metric is found. This new metric is obtained by means of a perturbation method, in order to include slow rotation. The solution is then proved to fulfill the Einstein field equations using a REDUCE program. Furthermore, applications of this new solution are discussed.

Designed to meet the job-related metric measurement needs of offset printing press operation students, this instructional package is one of six for the communication media occupations cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational…

Written by a team of leading experts in the field, this volume presents a self-contained account of the theory, techniques and results in metric type spaces (in particular in G-metric spaces); that is, the text approaches this important area of fixed point analysis beginning from the basic ideas of metric space topology. The text is structured so that it leads the reader from preliminaries and historical notes on metric spaces (in particular G-metric spaces) and on mappings, to Banach type contraction theorems in metric type spaces, fixed point theory in partially ordered G-metric spaces, fixed point theory for expansive mappings in metric type spaces, generalizations, present results and techniques in a very general abstract setting and framework. Fixed point theory is one of the major research areas in nonlinear analysis. This is partly due to the fact that in many real world problems fixed point theory is the basic mathematical tool used to establish the existence of solutions to problems which arise natur...

Introducing adaptive metric has been shown to improve the results of distance-based classification algorithms. Existing methods are often computationally intensive, either in the training or in the classification phase. We present a novel algorithm that we call Cluster-Based Adaptive Metric (CLAM)

We define and study a notion of discrete homology theory for metric spaces. Instead of working with simplicial homology, our chain complexes are given by Lipschitz maps from an $n$-dimensional cube to a fixed metric space. We prove that the resulting homology theory satisfies a

This report describes a program by which the Veterans Benefit Administration (VBA) can implement metrics to measure the performance of automated data systems and demonstrate that they are improving over time. It provides a definition of quality, particularly with regard to software. Requirements for management and staff to achieve a successful metrics program are discussed. It lists the attributes of high-quality software, then describes the metrics or calculations that can be used to measure these attributes in a particular system. Case studies of some successful metrics programs used by business are presented. The report ends with suggestions on which metrics the VBA should use and the order in which they should be implemented.

We examine three-dimensional metric deformations based on a tetrad transformation through the action of matrices of scalar fields. With this approach to deformation, we recover the results obtained by Coll et al. (Gen. Relativ. Gravit. 34:269, 2002), where it is stated that any three-dimensional metric can locally be obtained as a deformation of a constant curvature metric parameterized by a 2-form. To this aim, we construct the corresponding deforming matrices and provide their classification according to the properties of the scalar and of the vector used in Coll et al. (Gen. Relativ. Gravit. 34:269, 2002) to deform the initial metric. The resulting causal structure of the deformed geometries is examined, too. Finally, we apply our results to a spherically symmetric three-geometry and to a spatial sector of the Kerr metric.

Software companies face serious problems about how to measure the progress of test activities and the quality of software products in order to estimate test completion criteria and whether the shipment milestone will be reached on time. Measurement is a key activity in the testing life cycle and requires an established, managed and well documented test process, defined software quality attributes, quantitative measures, and use of test management and bug tracking tools. Test metrics are a subset of software metrics (product metrics, process metrics) and enable the measurement and quality improvement of the test process and/or software product. The goal of this paper is to briefly present Fabasoft best practices and lessons learned during functional and system testing of big complex software products, and to describe a simple test metrics model applied to the software test process with the purpose of better controlling software projects and measuring and increasing software quality.

A smart grid uses digital power control and communication technology to improve the reliability, security, flexibility, and efficiency of the electric system, from large generation through the delivery systems to electricity consumers and a growing number of distributed generation and storage resources. To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. The Smart Grid Status and Metrics Report defines and examines 21 metrics that collectively provide insight into the grid’s capacity to embody these characteristics. This appendix presents papers covering each of the 21 metrics identified in Section 2.1 of the Smart Grid Status and Metrics Report. These metric papers were prepared in advance of the main body of the report and collectively form its informational backbone.

The necessity of a theory of General Topology and, most of all, of Algebraic Topology on locally finite metric spaces comes from many areas of research in both Applied and Pure Mathematics: Molecular Biology, Mathematical Chemistry, Computer Science, Topological Graph Theory and Metric Geometry. In this paper we propose the basic notions of such a theory and some applications: we replace the classical notions of continuous function, homeomorphism and homotopic equivalence with the notions of NPP-function, NPP-local-isomorphism and NPP-homotopy (NPP stands for Nearest Point Preserving); we also introduce the notion of NPP-isomorphism. We construct three invariants under NPP-isomorphisms and, in particular, we define the fundamental group of a locally finite metric space. As first applications, we propose the following: motivated by the longstanding question whether there is a purely metric condition which extends the notion of amenability of a group to any metric space, we propose the property SN (Small Neighb...

This paper presents and discusses various metrics proposed for evaluation of polyphonic sound event detection systems used in realistic situations where there are typically multiple sound sources active simultaneously. The system output in this case contains overlapping events, marked as multiple sounds detected as being active at the same time. The polyphonic system output requires a suitable procedure for evaluation against a reference. Metrics from neighboring fields such as speech recognition and speaker diarization can be used, but they need to be partially redefined to deal with the overlapping events. We present a review of the most common metrics in the field and the way they are adapted and interpreted in the polyphonic case. We discuss segment-based and event-based definitions of each metric and explain the consequences of instance-based and class-based averaging using a case study. In parallel, we provide a toolbox containing implementations of presented metrics.
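
As a minimal illustration of a segment-based metric (generic code, not the toolbox mentioned above), one can compare reference and system-output activity matrices segment by segment:

    import numpy as np

    def segment_based_f1(reference, estimated):
        """Segment-based F-score for polyphonic sound event detection.

        reference, estimated: boolean arrays of shape
        (n_segments, n_classes), True where an event class is active
        in a time segment."""
        tp = np.logical_and(reference, estimated).sum()
        fp = np.logical_and(~reference, estimated).sum()
        fn = np.logical_and(reference, ~estimated).sum()
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        if precision + recall == 0:
            return 0.0
        return 2 * precision * recall / (precision + recall)

    # toy usage: 3 segments, 2 event classes, one missed detection
    ref = np.array([[1, 0], [1, 1], [0, 1]], dtype=bool)
    est = np.array([[1, 0], [0, 1], [0, 1]], dtype=bool)
    print(round(segment_based_f1(ref, est), 3))  # 0.857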

Dr. Geisler's far-reaching, unique book provides an encyclopedic compilation of the key metrics to measure and evaluate the impact of science and technology on academia, industry, and government. Focusing on such items as economic measures, patents, peer review, and other criteria, and supported by an extensive review of the literature, Dr. Geisler gives a thorough analysis of the strengths and weaknesses inherent in metric design, and in the use of the specific metrics he cites. His book has already received prepublication attention, and will prove especially valuable for academics in technology management, engineering, and science policy; industrial R&D executives and policymakers; government science and technology policymakers; and scientists and managers in government research and technology institutions. Geisler maintains that the application of metrics to evaluate science and technology at all levels illustrates the variety of tools we currently possess. Each metric has its own unique strengths and...

Simulation and bisimulation metrics for stochastic systems provide a quantitative generalization of the classical simulation and bisimulation relations. These metrics capture the similarity of states with respect to quantitative specifications written in the quantitative mu-calculus and related probabilistic logics. We show that game metrics, besides being logically characterized by the quantitative mu-calculus, also provide a bound for discounted and long-run average values of games. We then present algorithms for computing the metrics on Markov decision processes (MDPs), turn-based stochastic games, and concurrent games. For turn-based games and MDPs, we provide a polynomial-time algorithm for the computation of the one-step metric distance between states. The algorithm is based on linear programming. For concurrent games, we show that computing the exact distance between states is at least as hard as computing the value of concurrent reachability games and the square-root-sum problem in computational geome...
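
The one-step distance indeed reduces to a small optimal-transport linear program; the sketch below (our formulation of the standard Kantorovich LP with scipy, not the paper's implementation) computes it for two distributions over a finite state space:

    import numpy as np
    from scipy.optimize import linprog

    def kantorovich(p, q, ground):
        """1-Wasserstein distance between distributions p and q over n
        states, given an n x n ground metric, via the transport-plan LP."""
        n = len(p)
        cost = ground.reshape(-1)  # minimize sum_ij c_ij x_ij
        A_eq, b_eq = [], []
        for i in range(n):  # row marginals of the plan must equal p
            row = np.zeros((n, n))
            row[i, :] = 1
            A_eq.append(row.reshape(-1))
            b_eq.append(p[i])
        for j in range(n):  # column marginals of the plan must equal q
            col = np.zeros((n, n))
            col[:, j] = 1
            A_eq.append(col.reshape(-1))
            b_eq.append(q[j])
        res = linprog(cost, A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                      bounds=(0, None))
        return res.fun

    # toy usage: move all mass from state 0 to state 2 on a path graph
    d = np.array([[0., 1., 2.], [1., 0., 1.], [2., 1., 0.]])
    print(kantorovich(np.array([1., 0., 0.]), np.array([0., 0., 1.]), d))  # 2.0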

We construct the differential geometry of smooth manifolds equipped with an algebraic curvature map acting as an area measure. Area metric geometry provides a spacetime structure suitable for the discussion of gauge theories and strings, and is considerably more general than Lorentzian geometry. Our construction of geometrically relevant objects, such as an area metric compatible connection and derived tensors, makes essential use of a decomposition theorem due to Gilkey, showing that a general area metric is generated by a finite collection of metrics rather than by a single one. Employing curvature invariants for area metric manifolds we devise an entirely new class of gravity theories with inherently stringy character, and discuss gauge matter actions.

A basic assumption in traditional pattern matching is that the order of the elements in the given input strings is correct, while the description of the content, i.e. the description of the elements, may be erroneous. Motivated by questions that arise in Text Editing, Computational Biology, Bit Torrent and Video on Demand, and Computer Architecture, a new pattern matching paradigm was recently proposed by [2]. In this model, the pattern content remains intact, but the relative positions may change. Several papers followed the initial definition of the new paradigm. Each paper revealed new aspects in the world of string rearrangement metrics. This new unified view has already proven itself by enabling the solution of an open problem of the mathematician Cayley from 1849. It also gave better insight to problems that were already studied in different and limited situations, such as the behavior of different cost functions, and enabled deriving results for cost functions that were not yet sufficiently analyzed by previous research. At this stage, a general understanding of this new model is beginning to coalesce. The aim of this survey is to present an overview of this recent new direction of research, the problems, the methodologies, and the state-of-the-art.

There are as many unique and disparate manifestations of border systems as there are borders to protect. Border Security is a highly complex system analysis problem with global, regional, national, sector, and border element dimensions for land, water, and air domains. The complexity increases with the multiple, and sometimes conflicting, missions for regulating the flow of people and goods across borders, while securing them for national security. These systems include frontier border surveillance, immigration management and customs functions that must operate in a variety of weather, terrain, operational conditions, cultural constraints, and geopolitical contexts. As part of a Laboratory Directed Research and Development Project 08-684 (Year 1), the team developed a reference framework to decompose this complex system into international/regional, national, and border elements levels covering customs, immigration, and border policing functions. This generalized architecture is relevant to both domestic and international borders. As part of year two of this project (09-1204), the team determined relevant relative measures to better understand border management performance. This paper describes those relative metrics and how they can be used to improve border management systems.

The category of metric spaces is a subcategory of quasi-metric spaces. In this paper the notion of entropy for the continuous maps of a quasi-metric space is extended via spanning and separated sets. Moreover, two metric spaces that are associated to a given quasi-metric space are introduced and the

Hierarchical clustering is a popular method of performing unsupervised learning. Some metric must be used to determine the similarity between pairs of clusters in hierarchical clustering. Traditional similarity metrics either can deal only with simple (i.e. spherical) shapes or are very sensitive to outliers (the chaining effect). The main contribution of this paper is to propose potential-based similarity metrics (APES and AMAPES) between clusters in hierarchical clustering, inspired by the concepts of the electric potential and the gravitational potential in electromagnetics and astronomy. The main features of these metrics are: first, they have strong anti-jamming capability; second, they are capable of finding clusters of different shapes such as spherical, spiral, chain, circle, sigmoid, U shape or other complex irregular shapes; third, existing algorithms and research results for classical metrics can be adopted to deal with these new potential-based metrics with little or no modification. Experiments showed that the new metrics are superior to traditional ones. Different potential functions are compared, and the sensitivity to parameters is also analyzed in this paper.
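
As a hedged illustration of the idea (not the paper's actual APES/AMAPES definitions), a potential-style similarity can be sketched by letting every cross-cluster point pair contribute an inverse-distance "potential" term:

```python
# Hypothetical sketch: inverse-distance potential similarity between clusters.
# The exact APES/AMAPES formulas from the paper are not reproduced here.
import numpy as np

def potential_similarity(A: np.ndarray, B: np.ndarray, eps: float = 1e-9) -> float:
    # Pairwise distances between all points of cluster A and cluster B.
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
    return float(np.mean(1.0 / (d + eps)))  # higher value = more similar

A = np.array([[0.0, 0.0], [0.1, 0.0]])
B = np.array([[0.0, 0.2], [0.1, 0.2]])
C = np.array([[5.0, 5.0], [5.1, 5.0]])
print(potential_similarity(A, B) > potential_similarity(A, C))  # True
```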

Metric learning has attracted increasing attention due to its critical role in image analysis and classification. Conventional metric learning always assumes that the training and test data are sampled from the same or similar distributions. However, to build an effective distance metric, we need abundant supervised knowledge (i.e., side/label information), which is generally inaccessible in practice because of the expensive labeling cost. In this paper, we develop a robust transfer metric learning (RTML) framework to effectively assist the unlabeled target learning by transferring knowledge from the well-labeled source domain. Specifically, RTML exploits knowledge transfer to mitigate the domain shift in two directions, i.e., sample space and feature space. In the sample space, domain-wise and class-wise adaptation schemes are adopted to bridge the gap of marginal and conditional distribution disparities across the two domains. In the feature space, our metric is built in a marginalized denoising fashion with a low-rank constraint, which makes it more robust to noisy data in practice. Furthermore, we design an explicit rank constraint regularizer to replace the NP-hard rank minimization problem and guide the low-rank metric learning. Experimental results on several standard benchmarks demonstrate the effectiveness of our proposed RTML by comparing it with state-of-the-art transfer learning and metric learning algorithms.

Emerging article-level metrics do not exclude traditional metrics based on citations to the journal, but complement them. Both can be employed in conjunction to offer a richer picture of an article's use, from the immediate to the long term. Article-level metrics (ALM) result from the aggregation of different data sources and the collection of content from multiple social network services. Sources used for the aggregation can be broken down into five categories: usage, captures, mentions, social media and citations. Data sources depend on the tool, but they include classic citation-based indicators, academic social networks (Mendeley, CiteULike, Delicious) and social media (Facebook, Twitter, blogs, or Youtube, among others). Altmetrics is not synonymous with alternative metrics. Altmetrics are normally available early and allow the social impact of scholarly outputs to be assessed almost in real time. This paper briefly reviews the meaning of altmetrics and describes some of the existing tools used to apply these new metrics: Public Library of Science--Article-Level Metrics, Altmetric, Impactstory and Plum.

We introduce the notion of metric entropy for a nonautonomous dynamical system given by a sequence (Xn, μn) of probability spaces and a sequence of measurable maps fn : Xn → Xn+1 with fnμn = μn+1. This notion generalizes the classical concept of metric entropy established by Kolmogorov and Sinai, and is related via a variational inequality to the topological entropy of nonautonomous systems as defined by Kolyada, Misiurewicz, and Snoha. Moreover, it shares several properties with the classical notion of metric entropy. In particular, invariance with respect to appropriately defined isomorphisms, a power rule, and a Rokhlin-type inequality are proved.

If a finite metric space is of strictly negative type then its transfinite diameter is uniquely realized by an infinite extent (“load vector”). Finite metric spaces that have this property include all trees, and all finite subspaces of Euclidean and hyperbolic spaces. We prove that if the distance matrix of a finite metric space is both hypermetric and regular, then it is of strictly negative type. We show that the strictly negative type finite subspaces of spheres are precisely those which do not contain two pairs of antipodal points.

A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and Processes. Reflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems. New to the third edition: this edition contains new material relevant

In this note we show how the Quantum Information Metric can be computed holographically using a perturbative approach. In particular, when the deformation of the conformal field theory state is induced by a scalar operator, the corresponding bulk configuration reduces to a scalar field perturbatively probing the unperturbed background. We study two concrete examples: a CFT ground state deformed by a primary operator, and a thermofield double state in $d=2$ deformed by a marginal operator. Finally, we generalize the bulk construction to the case of a multidimensional parameter space and show that the Quantum Information Metric coincides with the metric of the non-linear sigma model for the corresponding scalar fields.

An obstruction to the implementation of spatially flat Painlevé-Gullstrand (PG) slicings is demonstrated, and explicitly discussed for Reissner-Nordström and Schwarzschild-anti-de Sitter spacetimes. Generalizations of PG slicings which are not spatially flat but which remain regular at the horizons are introduced. These metrics can be obtained from standard spherically symmetric metrics by physical Lorentz boosts. With these generalized PG metrics, problematic contributions to the imaginary part of the action in the Parikh-Wilczek derivation of Hawking radiation due to the obstruction can be avoided.

Sigma metrics can be used to predict assay quality, allowing easy comparison of instrument quality and predicting which tests will require minimal quality control (QC) rules to monitor the performance of the method. A Six Sigma QC program can result in fewer controls and fewer QC failures for methods with a sigma metric of 5 or better. The higher the number of methods with a sigma metric of 5 or better, the lower the costs for reagents, supplies, and control material required to monitor the performance of the methods.
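
As a minimal sketch (assuming the conventional laboratory definition, with allowable total error TEa, bias and CV all in percent), the sigma metric can be computed as follows:

```python
# Conventional sigma-metric calculation: sigma = (TEa - |bias|) / CV,
# with all three quantities expressed in percent.
def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    return (tea_pct - abs(bias_pct)) / cv_pct

# Example: TEa = 10%, bias = 1%, CV = 1.5%  ->  sigma = 6.0 (Six Sigma quality)
print(sigma_metric(10.0, 1.0, 1.5))
```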

According to the properties of the Firey combination, we first introduce the p-Hausdorff metric, which coincides with the well-known Hausdorff metric in the case p = 1. We then use the theory of convex geometric analysis to prove two important results on the p-Hausdorff metric.
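
For background, the standard convex-geometry ingredients (hedged here; these are textbook definitions, not the paper's proofs) can be written via support functions $h_K$: the classical Hausdorff metric and the Firey $L_p$ combination on which the p-Hausdorff construction builds.

```latex
% Classical Hausdorff metric on convex bodies K, L via support functions,
% and the Firey L_p combination (p >= 1); standard definitions.
d_H(K, L) = \max_{u \in S^{n-1}} \bigl| h_K(u) - h_L(u) \bigr|,
\qquad
h^{\,p}_{(1-\lambda)\cdot K +_p \lambda\cdot L}
  = (1-\lambda)\, h^{\,p}_K + \lambda\, h^{\,p}_L .
```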

Classical optics has been grounded in the design of imaging systems, in which it has reached a high level of development. However, classical optics solutions to the problems of light energy transfer are only appropriate when the light rays are paraxial. The paraxial condition is not met in most applications for concentration and illumination. Nonimaging optics eliminates the constraints of image formation and solves problems of efficient light transfer. Moreover, in general, these new o...

Software defects rediscovered by a large number of customers affect various stakeholders and may: 1) hint at gaps in a software manufacturer's Quality Assurance (QA) processes, 2) lead to an overload of a software manufacturer's support and maintenance teams, and 3) consume customers' resources, leading to a loss of reputation and a decrease in sales. Quantifying the risk associated with the rediscovery of defects can help all of these stakeholders. In this chapter we present a set of metrics needed to quantify the risks. The metrics are designed to help: 1) the QA team assess their processes; 2) the support and maintenance teams allocate their resources; and 3) the customers assess the risk associated with using the software product. The paper includes a validation case study which applies the risk metrics to industrial data. To calculate the metrics we use mathematical instruments such as the heavy-tailed Kappa distribution and the G/M/k queuing model.

A promising way to introduce general relativity in the classroom is to study the physical predictions that follow from certain given metrics, such as the Schwarzschild one. This involves lower mathematical expenditure than an approach focusing on differential geometry in its full glory, and permits emphasizing physical aspects before attacking the field equations. Even so, in terms of motivation, a lack of justification for the metric employed may pose an obstacle. The paper discusses how to establish the weak-field limit of the Schwarzschild metric with a minimum of relatively simple physical assumptions. Since this does not appear sufficient to arrive at a form of the metric useful for more than the most basic predictions (gravitational redshift), the determination of a single additional parameter from experiment is admitted. An attractive experimental candidate is the measurement of the perihelion precession of Mercury, because the result was already known before the completion of general relativity. It is sh...

National Aeronautics and Space Administration — eSky will develop specific crew state metrics based on the timeliness, tempo and accuracy of pilot inputs required by the H-mode Flight Control System (HFCS)....

This study investigates how Brazilian advertisers are adapting to new media and its attention metrics. In-depth interviews were conducted with advertisers in 2009 and 2011. In 2009, new media and its metrics were celebrated as innovations that would increase advertising campaigns' overall efficiency. By 2011, this perception had changed: new media's profusion of metrics, once seen as an advantage, had started to compromise its ease of use and adoption. Among its findings, this study argues that there is an opportunity for media groups willing to shift from a product-focused strategy towards a customer-centric one, through the creation of new, simple and integrative metrics.

Software engineering activities in industry have come a long way, with various improvements brought to all stages of the software development life cycle. The complexity of modern software, commercial constraints and the expectation of high-quality products demand accurate fault prediction based on OO design metrics at the class level in the early stages of software development. Object-oriented class metrics are used as quality predictors throughout the entire OO software development life cycle, even when a highly iterative, incremental model or agile software process is employed. Recent research has shown that some OO design metrics are useful for predicting the fault-proneness of classes. In this paper the empirical validation of a set of metrics proposed by Chidamber and Kemerer is performed to assess their ability to predict software quality in terms of fault proneness and degradation. We have also proposed the design complexity of object-oriented software with Weighted Methods per Class m...

U.S. Department of Health & Human Services — The Centers for Medicare and Medicaid Services has compiled aggregate national benchmark cost and workload metrics using data submitted to CMS by the AB MACs and the...

National Park Service, Department of the Interior — NPScape conservation status metrics are calculated using data from the USGS Gap Analysis Program (PAD-US), World Protected Areas Database (WDPA), and National Marine...

Traffic engineering must be concerned with a broad definition of service that includes network availability, reliability and stability, as well as traditional traffic data on loss, throughput, delay and jitter. MPLS and Virtual Private Networks (VPNs) significantly contribute to security and Quality of Service (QoS) within communication networks, but there remains a need for metric measurement and evaluation. The purpose of this paper is to propose a methodology which gives a measure for LSP (Label Switching Path) metrics in VPN MPLS networks. We propose a statistical method for the evaluation of those metrics. Statistical methodology is very important in this type of study since there is a large amount of data to consider. We use the notions of sample surveys, self-similar processes, linear regression, additive models and bootstrapping. The results obtained allow us to estimate the different metrics for such SLAs.

We describe how several metrics are possible in thermodynamic state space but that only one, Weinhold's, has achieved widespread use. Lengths calculated based on this metric have been used to bound dissipation in finite-time (irreversible) processes, be they continuous or discrete, and described in the energy picture or the entropy picture. Examples are provided from the thermodynamics of heat conversion processes as well as chemical reactions. Even losses in economics can be bounded using a thermodynamic-type metric. An essential foundation for the metric is a complete equation of state including all extensive variables of the system; examples are given. Finally, the second law of thermodynamics imposes convexity on any equation of state, be it analytical or empirical.
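
A sketch of the standard definitions behind this (assuming the usual conventions; $U$ is the internal energy, $X^i$ the extensive variables, $\epsilon$ a mean relaxation time; these forms are not taken from the paper itself):

```latex
% Weinhold metric, thermodynamic length, and the dissipation bound
% (horse-carrot inequality); standard textbook forms.
g_{ij} = \frac{\partial^2 U}{\partial X^i \, \partial X^j},
\qquad
L = \int_0^\tau \sqrt{ g_{ij} \, \dot X^i \dot X^j } \; dt,
\qquad
\Delta A_{\mathrm{diss}} \;\ge\; \frac{\epsilon L^2}{\tau}.
```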

We prove that there are asymptotically anti-de Sitter Einstein metrics with prescribed conformal infinity. More precisely we show that, given any suitably small perturbation $\hat g$ of the conformal metric of the $(n+1)$-dimensional anti-de Sitter space at timelike infinity, which is given by the canonical Lorentzian metric on the $n$-dimensional cylinder, there is a Lorentzian Einstein metric on $(-T,T)\times \mathbb{B}^n$ whose conformal geometry is given by $\hat g$. This is a Lorentzian counterpart of the Graham-Lee theorem in Riemannian geometry and is motivated by the holographic prescription problem in the context of the AdS/CFT correspondence in string theory.

Bearing the thermodynamic arguments together with the two definitions of mass in mind, we try to find metrics with spherical symmetry. We consider the adiabatic condition along with the Gong-Wang mass, and evaluate the $g_{rr}$ element, which points to a null hypersurface. In addition, we generalize the laws of thermodynamics to this hypersurface to find its temperature and thus the corresponding surface gravity, which enables us to obtain a relation for the $g_{tt}$ element. Finally, we investigate the mathematical and physical properties of the discovered metric in the framework of Einstein relativity, which show that the aforementioned null hypersurface is an event horizon. We also show that if one considers the Misner-Sharp mass in the calculations, the Schwarzschild metric is recovered. The relationship between the two mass definitions in each metric is studied. The results of considering the geometrical surface gravity are also addressed.

Development of successful business models has become a necessity in turbulent business environments, but compared to research on business modeling tools, the literature pays limited attention to the role of metrics in designing business models. Building on existing approaches to business models and the performance measurement literature, we develop a generic open repository of metrics related to core business model concepts. We validate and assess the practical value of the repository based on four ...

Above Planck energies, spacetime might become non-Riemannian, as is known from string theory and inflation. Geometries then arise in which nonmetricity and torsion appear as field strengths, side by side with curvature. By gauging the affine group, a metric-affine gauge theory emerges as the dynamical framework. Here, by using the harmonic map ansatz, a new class of multipole-like solutions in metric-affine gravity (MAG) is obtained.

We develop the metric theory of Diophantine approximation on homogeneous varieties of semisimple algebraic groups and prove results analogous to the classical Khinchin and Jarnik theorems. In full generality our results establish simultaneous Diophantine approximation with respect to several completions, and Diophantine approximation over general number fields using S-algebraic integers. In several important examples, the metric results we obtain are optimal. The proof uses quantitative equidistribution properties of suitable averaging operators, which are derived from spectral bounds in automorphic representations.

In this paper we address two problems concerning a family of domains $M_{\Omega}(\mu) \subset \mathbb{C}^n$, called Cartan-Hartogs domains, endowed with a natural Kaehler metric $g(\mu)$. The first one is determining when the metric $g(\mu)$ is extremal (in the sense of Calabi), while the second one studies when the coefficient $a_2$ in the Engliš expansion of the Rawnsley $\epsilon$-function associated to $g(\mu)$ is constant.

Continual improvement of business processes requires, among other efforts, the development of effective metrics by which managers and/or process engineers will be able to manage the organization's growth. Obviously, there are plenty of measures that can be taken to optimize processes. Once effective metrics are identified, the assessment team should do what works best for them. In this paper, organizational “centralization” or “decentralization” is the matter of interest. The dichotomous term “ce...

Any traditional engineering field has metrics to rigorously assess the quality of their products. Engineers know that the output must satisfy the requirements, must comply with the production and market rules, and must be competitive. Professionals in the new field of software engineering started a few years ago to define metrics to appraise their product: individual programs and software systems. This concern motivates the need to assess not only the outcome but also the process and tools em...

The report presents software metrics and porting metrics for the GGT Waveform. The porting was from a ground-based COTS SDR, the SDR-3000, to the CoNNeCT JPL SDR. The report does not address any of the Operating Environment (OE) software development, nor the original TDRSS waveform development at GSFC for the COTS SDR. With regard to STRS, the report presents compliance data and lessons learned.

The choice of metric critically affects the performance of classification and clustering algorithms. Metric learning algorithms attempt to improve performance, by learning a more appropriate metric. Unfortunately, most of the current algorithms learn a distance function which is not invariant to rigid transformations of images. Therefore, the distances between two images and their rigidly transformed pair may differ, leading to inconsistent classification or clustering results. We propose to constrain the learned metric to be invariant to the geometry preserving transformations of images that induce permutations in the feature space. The constraint that these transformations are isometries of the metric ensures consistent results and improves accuracy. Our second contribution is a dimension reduction technique that is consistent with the isometry constraints. Our third contribution is the formulation of the isometry constrained logistic discriminant metric learning (IC-LDML) algorithm, by incorporating the isometry constraints within the objective function of the LDML algorithm. The proposed algorithm is compared with the existing techniques on the publicly available labeled faces in the wild, viewpoint-invariant pedestrian recognition, and Toy Cars data sets. The IC-LDML algorithm has outperformed existing techniques for the tasks of face recognition, person identification, and object classification by a significant margin.

Let (A,H,D) be a spectral triple, namely: A is a C*-algebra, H is a Hilbert space on which A acts and D is a selfadjoint operator with compact resolvent such that the set of elements of A having a bounded commutator with D is dense. A spectral metric space, the noncommutative analog of a complete metric space, is a spectral triple (A,H,D) with additional properties which guarantee that the Connes metric induces the weak*-topology on the state space of A. A *-automorphism respecting the metric defines a dynamical system. This article gives various answers to the question: is there a canonical spectral triple based upon the crossed product algebra AxZ, characterizing the metric properties of the dynamical system? If $\alpha$ is the noncommutative analog of an isometry the answer is yes. Otherwise, the metric bundle construction of Connes and Moscovici is used to replace (A,$\alpha$) by an equivalent dynamical system acting isometrically. The difficulties relating to the non compactness of this new system are di...

The learning of appropriate distance metrics is a critical problem in image classification and retrieval. In this work, we propose a boosting-based technique, termed BoostMetric, for learning a Mahalanobis distance metric. One of the primary difficulties in learning such a metric is to ensure that the Mahalanobis matrix remains positive semidefinite. Semidefinite programming is sometimes used to enforce this constraint, but does not scale well. BoostMetric is instead based on the key observation that any positive semidefinite matrix can be decomposed into a linear positive combination of trace-one rank-one matrices. BoostMetric thus uses rank-one positive semidefinite matrices as weak learners within an efficient and scalable boosting-based learning process. The resulting method is easy to implement, does not require tuning, and can accommodate various types of constraints. Experiments on various datasets show that the proposed algorithm compares favorably to state-of-the-art methods in terms of classi...
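
The key decomposition can be illustrated with a short sketch (an eigendecomposition stand-in; the boosting procedure itself is not shown here):

```python
# Any PSD matrix M decomposes as a nonnegative combination of trace-one
# rank-one matrices; here the atoms come from an eigendecomposition.
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(4, 4))
M = A @ A.T  # a positive semidefinite "metric" matrix

w, U = np.linalg.eigh(M)  # eigenvalues w >= 0, orthonormal eigenvectors U
atoms = [np.outer(U[:, i], U[:, i]) for i in range(len(w))]  # trace-one, rank-one
M_rebuilt = sum(wi * Z for wi, Z in zip(w, atoms))
print(np.allclose(M, M_rebuilt))  # True
```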

Given a locally compact Polish space X, a necessary and sufficient condition for a group G of homeomorphisms of X to be the full isometry group of (X,d) for some proper metric d on X is given. It is shown that every locally compact Polish group G acts freely on GxY as the full isometry group of GxY with respect to a certain proper metric on GxY, where Y is an arbitrary locally compact Polish space with (card(G),card(Y)) different from (1,2). Locally compact Polish groups which act effectively and almost transitively on complete metric spaces as full isometry groups are characterized. Locally compact Polish non-Abelian groups on which every left invariant metric is automatically right invariant are characterized and fully classified. It is demonstrated that for every locally compact Polish space X having more than two points the set of proper metrics d such that Iso(X,d) = {id} is dense in the space of all proper metrics on X.

Finsler spacetimes have become increasingly popular within the theoretical physics community over the last two decades. However, because physicists need to use pseudo-Finsler structures to describe propagation of signals, there will be nonzero null vectors in both the tangent and cotangent spaces — this causes significant problems in that many of the mathematical results normally obtained for "usual" (Euclidean signature) Finsler structures either do not apply, or require significant modifications to their formulation and/or proof. We shall first provide a few basic definitions, explicitly demonstrating the interpretation of bi-metric theories in terms of pseudo-Finsler norms. We shall then discuss the tricky issues that arise when trying to construct an appropriate pseudo-Finsler metric appropriate to bi-metric spacetimes. Whereas in Euclidean signature the construction of the Finsler metric typically fails only at the zero vector, in Lorentzian signature the Finsler metric is typically ill-defined on the entire null cone. Consequently it is not a good idea to try to encode bi-metricity into pseudo-Finsler geometry. One has to be very careful when applying the concept of pseudo-Finsler geometry in physics.

Although many studies have demonstrated the utility of airborne lidar for forest inventory, the acquisition and processing of the data can be cost prohibitive for small areas. In such cases, it may be possible to emulate lidar metrics using more affordable optical data. This study explored processing methods for predicting lidar metrics using SPOT-5 textural data. Multiple-linear regression (MLR) was compared with non-linear machine learning techniques including multi-layer perceptron (MLP) artificial neural networks (ANN), radial basis function (RBF) ANN and regression tree (RT). For this purpose, 11 grey level co-occurrence matrix (GLCM) indices were calculated for bands, band ratios and principal components (PCs) of a SPOT-5 multispectral image. SPOT-5 metrics were correlated with 25 lidar metrics collected over a Pinus radiata plantation. After dimensionality reduction, random forest feature selection was applied to select the most relevant SPOT-5 textural attributes for inferring each lidar metric. The results showed that the non-linear methods, including the MLP and RBF methods, are more promising for modelling lidar metrics using SPOT-5 data than MLR and RT.
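
A hedged sketch of the modelling comparison (the feature and target arrays below are synthetic stand-ins for the SPOT-5 GLCM indices and lidar metrics, not the study's data):

```python
# Compare multiple linear regression with an MLP for predicting one
# lidar-style metric from 11 texture-style features.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(300, 11))  # stand-in for 11 GLCM indices
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
mlr = LinearRegression().fit(X_tr, y_tr)
mlp = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                   random_state=0).fit(X_tr, y_tr)
print("MLR R2:", round(mlr.score(X_te, y_te), 3))
print("MLP R2:", round(mlp.score(X_te, y_te), 3))
```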

This study examines the impacts of emissions from aviation in six source regions on global and regional temperatures. We consider the NOx-induced impacts on ozone and methane, aerosols and contrail-cirrus formation, and calculate the global and regional emission metrics global warming potential (GWP), global temperature change potential (GTP) and absolute regional temperature change potential (ARTP). The GWPs and GTPs vary by a factor of 2-4 between source regions. We find the highest aviation aerosol metric values for South Asian emissions, while contrail-cirrus metrics are higher for Europe and North America, where contrail formation is prevalent, and South America plus Africa, where the optical depth is large once contrails form. The ARTPs illustrate important differences in the latitudinal patterns of radiative forcing (RF) and temperature response: the temperature response in a given latitude band can be considerably stronger than suggested by the RF in that band, also emphasizing the importance of large-scale circulation impacts. To place our metrics in context, we quantify temperature change in four broad latitude bands following 1 year of emissions from present-day aviation, including CO2. Aviation over North America and Europe causes the largest net warming impact in all latitude bands, reflecting the higher air traffic activity in these regions. Contrail cirrus gives the largest warming contribution in the short term, but remains important at about 15% of the CO2 impact in several regions even after 100 years. Our results also illustrate both the short- and long-term impacts of CO2: while CO2 becomes dominant on longer timescales, it also gives a notable warming contribution already 20 years after the emission. Our emission metrics can be further used to estimate regional temperature change under alternative aviation emission scenarios. A first evaluation of the ARTP in the context of aviation suggests that further work to account for vertical sensitivities

An analytical expression for the nonlinear refractive index of graphene has been derived and used to obtain the performance metrics of third order nonlinear devices using graphene as a nonlinear medium. None of the metrics is found to be superior to the existing nonlinear optical materials.

This article suggests the use of thermometers based on optical fiber technology in overhead power transmission lines. This will support ongoing studies of electric power system ampacity. It also presents the advantages of using optical fibers instead of conventional thermometers, for example the fact that optical fibers are dielectric and make remote monitoring possible. A possibility of introducing such a measuring system in electric junctions where OPGW cables are used is also shown.

Recently, the implementation of optical systems has been made possible through the use of the existing transmission line structure, shared between electric power and telecommunications companies, using OPGW cables instead of conventional lightning-arrester cables. To make such optical sharing feasible, the company ALCOA Aluminio S.A. is developing an energized lightning-arrester system with optical fiber inside it. This work presents that system and shows its great advantages, especially when implemented in low population density regions to supply electric power demand at lower cost.

We show how to simulate the equatorial section of the Schwarzschild metric with a flowing liquid crystal in its nematic phase. Inside a liquid crystal in the nematic phase, a traveling light ray feels an effective metric whose properties are linked to the perpendicular and parallel refractive indexes, $n_o$ and $n_e$ respectively, of the rod-like molecules of the liquid crystal. As these indexes depend on the scalar order parameter of the liquid crystal, the Beris-Edwards hydrodynamic theory is used to connect the order parameter with the velocity of the liquid crystal flow at each point. In this way we calculate a radial velocity profile that simulates the equatorial section of the Schwarzschild metric in the nematic phase of the liquid crystal. The work is presented in the following way. First, we show the effective metric that describes light propagation around a (k = 1, c = 0) disclination defect of the nematic phase of a liquid crystalline sample, and how this light propagation can be described by the order parameter q of the liquid crystalline material. Afterwards, we consider the liquid crystal flowing radially and use the Beris-Edwards theory to analyze the dependence of the order parameter of the material on the magnitude of the flow velocity. In these two cases we consider the more general situation of three space dimensions. Finally, we use the result from the second part in the first and compare with the Schwarzschild metric written in isotropic coordinates.

Between 2013 and 2014, the Centers for Medicare and Medicaid Services and the National Cardiovascular Data Registry publicly reported risk-adjusted 30-day readmission rates after percutaneous coronary intervention (PCI) as a pilot project. A key strength of this public reporting effort was risk adjustment with clinical rather than administrative data. Furthermore, because readmission after PCI is common, expensive, and preventable, this metric has substantial potential to improve quality and value in American cardiology care. Despite this, concerns about the metric exist. For example, few PCI readmissions are caused by procedural complications, limiting the extent to which improved procedural technique can reduce readmissions. Also, similar to other readmission measures, PCI readmission is associated with socioeconomic status and race. Accordingly, the metric may unfairly penalize hospitals that care for underserved patients. Perhaps in the context of these limitations, the Centers for Medicare and Medicaid Services has not yet included PCI readmission among the metrics that determine Medicare financial penalties. Nevertheless, provider organizations may still wish to focus on this metric to improve value for cardiology patients. PCI readmission is associated with low-risk chest discomfort and patient anxiety. Therefore, patient education, improved triage mechanisms, and improved care coordination offer opportunities to minimize PCI readmissions. Because PCI readmission is common and costly, reducing PCI readmission offers provider organizations a compelling target to improve the quality of care, and also performance in contracts involving shared financial risk.

As data centers proliferate in both size and number, their energy efficiency is becoming increasingly important. We discuss the properties of a number of the proposed metrics of energy efficiency and productivity. In particular, we focus on the Data Center Energy Productivity (DCeP) metric, which is the ratio of useful work produced by the data center to the energy consumed performing that work. We describe our approach for using DCeP as the principal outcome of a designed experiment using a highly instrumented, high performance computing data center. We found that DCeP was successful in clearly distinguishing between different operational states in the data center, thereby validating its utility as a metric for identifying configurations of hardware and software that would improve (or even maximize) energy productivity. We also discuss some of the challenges and benefits associated with implementing the DCeP metric, and we examine the efficacy of the metric in making comparisons within a data center and among data centers.
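
In its simplest form, the calculation is just the stated ratio; a minimal sketch (the task-unit weighting that defines "useful work" here is ours, not the paper's):

```python
# Minimal DCeP-style calculation: useful work produced per unit of energy
# consumed over a measurement window. The work units are hypothetical.
def dcep(useful_work_units: float, energy_consumed_kwh: float) -> float:
    return useful_work_units / energy_consumed_kwh

# Example: 1.2e6 completed task units on 4.0e4 kWh -> 30.0 units per kWh
print(dcep(1.2e6, 4.0e4))
```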

Cleanrooms are among the most energy-intensive types of facilities. This is primarily due to the cleanliness requirements that result in high airflow rates and system static pressures, as well as process requirements that result in high cooling loads. Various studies have shown that there is a wide range of cleanroom energy efficiencies and that facility managers may not be aware of how energy efficient their cleanroom facility can be relative to other cleanroom facilities with the same cleanliness requirements. Metrics and benchmarks are an effective way to compare one facility to another and to track the performance of a given facility over time. This article presents the key metrics and benchmarks that facility managers can use to assess, track, and manage their cleanroom energy efficiency or to set energy efficiency targets for new construction. These include system-level metrics such as air change rates, air handling W/cfm, and filter pressure drops. Operational data are presented from over 20 different cleanrooms that were benchmarked with these metrics and that are part of the cleanroom benchmark dataset maintained by Lawrence Berkeley National Laboratory (LBNL). Overall production efficiency metrics for cleanrooms in 28 semiconductor manufacturing facilities in the United States and recorded in the Fabs21 database are also presented.

In software measurement validation, assessing the validity of software metrics in software engineering is a very difficult task due to the lack of both theoretical and empirical methodology [41, 44, 45]. During recent years, a number of researchers have addressed the issue of validating software metrics. At present, software metrics are validated theoretically using properties of measures. Software measurement plays an important role in understanding and controlling software development practices and products. The major requirement in software measurement is that the measures must accurately represent the attributes they purport to quantify, and validation is critical to the success of software measurement. Validation is a collection of analysis and testing activities across the full life cycle that complements the efforts of other quality engineering functions, and it is a critical task in any engineering project. Its objective is to discover defects in a system and to assess whether or not the system is useful and usable in an operational situation. In software engineering, validation is one of the disciplines that help build quality into software. The major objective of the software validation process is to determine that the software performs its intended functions correctly and to provide information about its quality and reliability. This paper discusses the validation methodology, techniques and different properties of measures that are used for software metrics validation. In most cases, theoretical and empirical validations are conducted for software metrics validation in software engineering [1-50].

An enhanced quantum-based image fidelity metric, the QIFM metric, is proposed as a tool to assess the “congruity” between two or more quantum images. The often confounding contrariety that distinguishes classical from quantum information processing makes the widely accepted peak-signal-to-noise-ratio (PSNR) ill-suited for use in the quantum computing framework, whereas the prohibitive cost of the probability-based similarity score makes it imprudent for use as an effective image quality metric. Unlike the aforementioned image quality measures, the proposed QIFM metric is calibrated as a pixel-difference-based image quality measure that is sensitive to the intricacies inherent in quantum image processing (QIP). As proposed, the QIFM is configured with built-in non-destructive measurement units that preserve the coherence necessary for quantum computation. This design moderates the cost of executing the QIFM when estimating congruity between two or more quantum images. A statistical analysis also shows that our proposed QIFM metric correlates better with the digital expectation of likeness between images than other available quantum image quality measures. Therefore, the QIFM offers a competent substitute for the PSNR as an image quality measure in the quantum computing framework, thereby providing a tool to effectively assess fidelity between images in quantum watermarking, quantum movie aggregation and other applications in QIP.

A mainstream procedure to analyze the wealth of genomic data available nowadays is the detection of homologous regions shared across genomes, followed by the extraction of biological information from the patterns of conservation and variation observed in such regions. Although of pivotal importance, comparative genomic procedures that rely on homology inference are obviously not applicable if no homologous regions are detectable. This fact excludes a considerable portion of "genomic dark matter" with no significant similarity - and, consequently, no inferred homology to any other known sequence - from several downstream comparative genomic methods. In this review we compile several sequence metrics that do not rely on homology inference and can be used to compare nucleotide sequences and extract biologically meaningful information from them. These metrics comprise several compositional parameters calculated from sequence data alone, such as GC content, dinucleotide odds ratio, and several codon bias metrics. They also share other interesting properties, such as pervasiveness (patterns persist on smaller scales) and phylogenetic signal. We also cite examples where these homology-independent metrics have been successfully applied to support several bioinformatics challenges, such as the taxonomic classification of biological sequences without homology inference. They were also used to detect higher-order patterns of interactions in biological systems, ranging from detecting coevolutionary trends between the genomes of viruses and their hosts to characterizing the gene pools of entire microbial communities. We argue that, if correctly understood and applied, homology-independent metrics can add important layers of biological information in comparative genomic studies without prior homology inference.
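
Two of the compositional metrics named above are easy to sketch (simplified: ambiguous bases are ignored and overlapping dinucleotides are counted):

```python
# GC content and a dinucleotide odds ratio, e.g. rho(CG) = f(CG)/(f(C)*f(G));
# simple homology-independent sequence metrics.
def gc_content(seq: str) -> float:
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def dinucleotide_odds_ratio(seq: str, dinuc: str = "CG") -> float:
    seq = seq.upper()
    n = len(seq)
    f_xy = sum(seq[i:i + 2] == dinuc for i in range(n - 1)) / (n - 1)
    f_x = seq.count(dinuc[0]) / n
    f_y = seq.count(dinuc[1]) / n
    return f_xy / (f_x * f_y)

seq = "ATGCGCGTATGCCGTA"
print(gc_content(seq), dinucleotide_odds_ratio(seq))
```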

We introduce a "W*"-metric space, which is a particular approach to non-commutative metric spaces where a "quantum metric" is defined on a von Neumann algebra. We generalize the notion of a quantum code and quantum error correction to the setting of finite dimensional "W*"-metric spaces, which includes codes and error correction for classical…

We prove some common fixed-point theorems for ordered g-weak contractions in cone rectangular metric spaces without assuming the normality of the cone. Our results generalize some recent results from cone metric and cone rectangular metric spaces to ordered cone rectangular metric spaces. Examples are provided which illustrate the results.

We calculate some Wilson loop functionals in a static, spherically symmetric diagonal metric field and in a gravitational metric field established by a cosmic string. Using the change of direction of a vector when it is parallel transported in the metric field of the cosmic string, the cone symmetry of the metric field is shown.

34 CFR Section 74.15, Metric system of measurement. The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205), declares that the metric system is the preferred measurement system for...

This article compares the various alternatives for deploying an optical system on existing transmission lines. The work considers the replacement of lightning conductors by OPGW cables, the installation of self-supporting cables, the use of wrapped optical cables, and the installation of a new transmission line, together with a case study for the 500 kV section between the Tucurui and Presidente Dutra substations.

The majority of measures proposed to date for direct curve comparison in bioequivalence studies were investigated. These measures have often been called metrics, but in most cases this is incorrect in the mathematical sense. It is demonstrated, with a set of counter-examples, that the axioms of a metric are fulfilled only by the integral p-metric and some of its transforms. The Rescigno index and two other measures devised by Polli and McLean are semi-metrics, lacking the triangle inequality, while others also lack symmetry. The use of the p-metric is therefore recommended, and statistical analysis is suggested as the point at which the scaling of differences might be carried out.
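
As an illustration of the recommended measure, the integral p-metric $d_p(f,g) = (\int |f(t)-g(t)|^p \, dt)^{1/p}$ can be approximated on a common sampling grid (the concentration-time curves below are hypothetical):

```python
# Integral p-metric between two concentration-time curves, approximated by
# the trapezoidal rule on a shared time grid. Satisfies the metric axioms
# (including the triangle inequality) for p >= 1.
import numpy as np

def p_metric(t, f, g, p=2.0):
    return float(np.trapz(np.abs(f - g) ** p, t) ** (1.0 / p))

t = np.linspace(0.0, 24.0, 97)                    # hours
f = 10.0 * (np.exp(-0.2 * t) - np.exp(-1.0 * t))  # reference curve (made up)
g = 9.0 * (np.exp(-0.25 * t) - np.exp(-1.2 * t))  # test curve (made up)
print(p_metric(t, f, g, p=2.0))
```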

Real-vacuum single Kerr-Schild (ISKS) metrics are discussed and new results proved. It is shown that if the Weyl tensor of such a metric has a twist-free expanding principal null direction, then it belongs to the Schwarzschild family of metrics; there are no Petrov type-II Robinson-Trautman metrics of Kerr-Schild type. If such a metric has twist then it belongs either to the Kerr family or else its Weyl tensor is of Petrov type II. The main part of the paper is concerned with complexified versions of Kerr-Schild metrics. The general real ISKS metric is written in double Kerr-Schild (IDKS) form. The H and l potentials which generate IDKS metrics are determined for the general vacuum ISKS metric and given explicitly for the Schwarzschild and Kerr families of metrics.

Static analysis tools that are used for worst-case execution time (WCET) analysis of real-time software provide only partial information on an analyzed program. Only the longest-executing path, which currently determines the WCET bound, is indicated to the programmer. This limited view can prevent a programmer (or compiler) from targeting optimizations the right way. A possible resort is to use a metric that targets WCET and which can be efficiently computed for all code parts of a program. Similar to dynamic profiling techniques, which execute code with input that is typically expected, … we propose to estimate the Criticality metric by relaxing the precision of WCET analysis. Through this, we can reduce analysis time by orders of magnitude, while only introducing minor error. To evaluate our estimation approach and share our garnered experience using the metric, we evaluate real-time programs, which…

We present a strategy to make productive use of semantically-related social data, from a user-centered semantic network, in order to help users (tourists and citizens in general) to discover cultural heritage, points of interest and available services in a smart city. This data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness and eccentricity. These metrics are useful to discover relevant nodes within the network yielding nodes that can be interpreted as suggestions (venues or services) to users. We describe the semantic network built on graph model, as well as social metrics algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.
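
A small sketch of the ranking step (using networkx's current-flow centralities as stand-ins for the flow betweenness/closeness described above; the toy graph is invented):

```python
# Rank venue nodes of a toy semantic network by flow-style centralities.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("museum", "plaza"), ("plaza", "cathedral"), ("plaza", "cafe"),
    ("cathedral", "cafe"), ("cafe", "hotel"), ("hotel", "museum"),
])

betweenness = nx.current_flow_betweenness_centrality(G)  # needs scipy
closeness = nx.current_flow_closeness_centrality(G)
eccentricity = nx.eccentricity(G)

# Venues that best mediate flow through the network come first.
for node in sorted(G, key=betweenness.get, reverse=True):
    print(node, round(betweenness[node], 3), round(closeness[node], 3),
          eccentricity[node])
```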

We introduce in this paper a metric learning approach for automatic sleep stage classification based on single-channel EEG data. We show that by learning a global metric from training data instead of using the default Euclidean metric, the k-nearest neighbor classification rule outperforms state-of-the-art methods on the Sleep-EDF dataset under various classification settings. The overall accuracies for the Awake/Sleep and 4-class classification settings are 98.32% and 94.49%, respectively. Furthermore, this superior accuracy is achieved by performing classification on a low-dimensional feature space derived from the time and frequency domains, and without the need for artifact removal as a preprocessing step.
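
The core trick can be sketched as follows: a global Mahalanobis metric $M = L^\top L$ is equivalent to Euclidean k-NN on linearly transformed features $Lx$. Here a whitening transform stands in for the learned metric; the paper's actual training procedure and EEG features are not reproduced.

```python
# k-NN under a global (Mahalanobis) metric via feature transformation.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                  # stand-in feature vectors
y = (X[:, 0] + 0.1 * X[:, 1] > 0).astype(int)  # stand-in labels

# "Learned" transform: whitening, i.e. M = inv(cov) = C C^T, transform = C^T.
C = np.linalg.cholesky(np.linalg.inv(np.cov(X, rowvar=False)))
Xt = X @ C  # Euclidean distance on Xt equals Mahalanobis distance on X

knn = KNeighborsClassifier(n_neighbors=5).fit(Xt, y)
print(knn.score(Xt, y))  # training accuracy under the transformed metric
```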

Most existing distance metric learning methods assume perfect side information that is usually given in pairwise or triplet constraints. Instead, in many real-world applications, the constraints are derived from side information, such as users' implicit feedback and citations among articles. As a result, these constraints are usually noisy and contain many mistakes. In this work, we aim to learn a distance metric from noisy constraints by robust optimization in a worst-case scenario, to which we refer as robust metric learning. We formulate the learning task initially as a combinatorial optimization problem, and show that it can be elegantly transformed into a convex programming problem. We present an efficient learning algorithm based on smooth optimization [7]. It has a worst-case convergence rate of $O(1/\sqrt{\varepsilon})$ for smooth optimization problems, where $\varepsilon$ is the desired error of the approximate solution. Finally, our empirical study with UCI data sets demonstrates the effectiveness of ...

The energy of a particle moving on a spacetime can, in principle, affect the background metric. The modifications to it depend on the ratio of the energy of the particle to the Planck energy, a scenario known as rainbow gravity. Here we find the explicit expressions for the coordinate transformations from rainbow Minkowski spacetime to an accelerated frame. The corresponding metric is also obtained, which we call the rainbow-Rindler metric. As far as we are aware, nobody has done this in a concrete manner before. Here it is derived from first principles, and hence all the parameters are properly identified. The advantage of this is that the calculated Unruh temperature is compatible with the Hawking temperature of the rainbow black hole horizon obtained earlier. Since the accelerated frame is important in revealing various properties of gravity, we believe that the present result will not only fill that gap, but also help to explore different aspects of the rainbow gravity paradigm.

The Department of Energy (DOE) Fuel Cycle Research and Development (FCRD) Advanced Fuels Campaign (AFC) is conducting research and development on enhanced Accident Tolerant Fuels (ATF) for light water reactors (LWRs). This mission emphasizes the development of novel fuel and cladding concepts to replace the current zirconium alloy-uranium dioxide (UO2) fuel system. The overall mission of the ATF research is to develop advanced fuels/cladding with improved performance, reliability and safety characteristics during normal operations and accident conditions, while minimizing waste generation. The initial effort will focus on implementation in operating reactors or reactors with design certifications. To initiate the development of quantitative metrics for ATF, an LWR Enhanced Accident Tolerant Fuels Metrics Development Workshop was held in October 2012 in Germantown, MD. This paper summarizes the outcome of that workshop and the current status of metrics development for LWR ATF.

The use of robotics or unmanned systems offers significant benefits to the military user by enhancing mobility, logistics, material handling, command and control, reconnaissance, and protection. The evaluation and selection process for the procurement of an unmanned robotic system involves comparison of performance and physical characteristics such as operating environment, application, payloads and performance criteria. Testing an unmanned system for operation in an unstructured environment using emerging technologies, which have not yet been fully tested, presents unique challenges for the testing community. Standard metrics, test procedures, terminologies, and methodologies simplify comparison of different systems. A procedure was developed to standardize the test and evaluation process for UGVs. This procedure breaks the UGV into three components: the platform, the payload, and the command and control link. Standardized metrics were developed for these components which permit unbiased comparison of different systems. The development of these metrics and their application will be presented.

We introduce Riemannian First-Passage Percolation (Riemannian FPP) as a new model of random differential geometry, by considering a random, smooth Riemannian metric on $\mathbb{R}^d$. We are motivated in our study by the random geometry of first-passage percolation (FPP), a lattice model which was developed to model fluid flow through porous media. By adapting techniques from standard FPP, we prove a shape theorem for our model, which says that large balls under this metric converge to a deterministic shape under rescaling. As a consequence, we show that smooth random Riemannian metrics are geodesically complete with probability one. In differential geometry, geodesics are curves which locally minimize length. They need not do so globally: consider great circles on a sphere. For lattice models of FPP, there are many open questions related to minimizing geodesics; similarly, it is interesting from a geometric perspective when geodesics are globally minimizing. In the present study, we show that for any fixed st...

Metric perturbations and the stability of solutions of Einstein-Cartan cosmology (ECC) are studied. The first example addresses the stability of solutions of the Einstein-Cartan (EC) cosmological model against an Einstein static universe background. For this solution we show that the metric is stable against first-order perturbations, which correspond to acoustic oscillations. The second example deals with the stability of the de Sitter metric, also against first-order perturbations. Torsion and shear are also computed in these cases. The resulting perturbed anisotropic spacetime with torsion is either de Sitter along one direction, or is unperturbed along one direction and perturbed along the other two. Cartan torsion contributes to the frequency of oscillations in the model. Therefore gravitational waves could be triggered by the spin-torsion scalar density.

We consider the problem of constructing Steiner minimum trees for a metric defined by a polygonal unit circle (corresponding to s = 2 weighted legal orientations in the plane). A linear-time algorithm to enumerate all angle configurations for degree three Steiner points is given. We provide a simple proof that the angle configuration for a Steiner point extends to all Steiner points in a full Steiner minimum tree, such that at most six orientations suffice for edges in a full Steiner minimum tree. We show that the concept of canonical forms originally introduced for the uniform orientation metric generalises to the fixed orientation metric. Finally, we give an O(sn) time algorithm to compute a Steiner minimum tree for a given full Steiner topology with n terminal leaves.

To develop a proposal for metrics for protocols and other technical products to be applied in assessing the Postgraduate Programs of Medicine III - Capes. The 2013 area documents of all 48 Capes areas were read. From the analysis of the criteria used by the areas in the 2013 Triennial Assessment, a proposal for metrics for protocols and other technical products was developed to be applied in assessing the Postgraduate Programs of Medicine III. This proposal was based on the criteria of the Biological Sciences I and Interdisciplinary areas. Only seven areas had described a scoring system for technical products, and the products considered and the scoring varied widely. Because a wide range of different technical products could be considered relevant, yet would not be scored if not previously specified, a proposal for metrics was developed for Medicine III in which five specific criteria are analyzed: Demand, Relevance/Impact, Scope, Complexity and Adherence to the Program. Based on these criteria, each product can receive 10 to 100 points. This proposal can be applied to the Intellectual Production item of the evaluation form, in the subsection "Technical production, patents and other relevant production". The program will be scored as Very Good when it reaches a mean ≥ 150 points/permanent professor/quadrennium; Good, a mean between 100 and 149 points; Regular, a mean between 60 and 99 points; Weak, a mean between 30 and 59 points; and Insufficient, up to 29 points/permanent professor/quadrennium.

We show how to write any Kaehler metric of complex dimension 2 admitting a holomorphic isometry as a simple 1-real-function deformation of a Gibbons-Hawking metric. Hyper-Kaehler metrics with a tri-holomorphic isometry (Gibbons-Hawking metrics) or with a mono-holomorphic isometry are recovered for particular values of the additional function. The new general metric can be used as an Ansatz in several interesting physical problems.

To recommend metrics to qualify software production and to propose guidelines for the CAPES quadrennial evaluation of the Post-Graduation Programs of Medicine III on this issue. Quality features of the development process, attributes of the product and of software use, as determined by the Brazilian Association of Technical Standards (ABNT), the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC), and important from the perspective of users in the CAPES Medicine III area, were identified to ground a proposed metric for the four-year evaluation of Medicine III. The user's perception of in-use software quality results from the effectiveness, productivity, security and satisfaction the software provides, which in turn originate from its functionality, reliability, usability, efficiency, maintainability and portability (in-use quality metrics). This perception depends on the specific use scenario. Software metrics should be included in the intellectual production of the program, with system behavior measurements obtained from users' performance evaluations as the sum of favorable-response scores for the six in-use quality metrics (27 sub-items, 0 to 2 points each) and for the quality perception proof (four items, 0 to 10 points each). It will be considered very good (VG), 85 to 94 points; good (G), 75 to 84 points; regular (R), 65 to 74 points; weak (W), 55 to 64 points; poor (P), 54 points or fewer.

The concepts of Boolean metric space and convex combination are used to characterize polynomial maps in a class of commutative Von Neumann regular rings including Boolean rings and p-rings, that we have called CFG-rings. In those rings, the study of the category of algebraic varieties (i.e. sets of solutions to a finite number of polynomial equations with polynomial maps as morphisms) is equivalent to the study of a class of Boolean metric spaces, that we call here CFG-spaces.

We investigate the proposed duality between a quantum information metric in a CFTd+1 and the volume of a maximal time slice in the dual AdSd+2 for topological geons. Examining the specific cases of Bañados-Teitelboim-Zanelli (BTZ) black holes and planar Schwarzschild-anti-de Sitter black holes, along with their geon counterparts, we find that the proposed duality relation for geons is the same apart from a factor of 4. The information metric therefore provides a probe of the topology of the bulk spacetime.

They consider a metric-affine theory of gravity in which the dynamical fields are the Lorentzian metrics and the non-symmetric linear connections on the world manifold X. Working with a Lagrangian density which is invariant under general covariant transformations and using standard tools of the calculus of variations, they study the corresponding currents. They find that the superpotential takes a nice form, involving the torsion of the linear connection in a simple way and generalizing the well-known Komar superpotential. A feature of their approach is the use of the Poincaré-Cartan form in relation to the first variational formula of the calculus of variations.

Recently, domain-specific ontology development has been driven by research on the Semantic Web. Ontologies have been suggested for use in many application areas targeted by the Semantic Web, such as dynamic web service composition and general web service matching. Fundamental characteristics of these ontologies must be determined in order to make effective use of them: for example, Sirin, Hendler and Parsia have suggested that determining fundamental characteristics of ontologies is important for dynamic web service composition. Our research examines cohesion metrics for ontologies. The cohesion metrics examine the fundamental quality of cohesion as it relates to ontologies.

A 100-ton (metric) holographic table has been built as a testing facility for the mechanical and civil engineering industry. The table is made of a rectangular concrete slab of 5 by 20 m and is prestressed in both directions in order to achieve maximum stiffness and stability. Due to the large size and loading capacity of this table, full-scale examination of large specimens or heavy concrete elements can be performed by means of optical methods including holographic interferometry, speckle interferometry or high-sensitivity moire. Rheological behavior, i.e. creep and shrinkage, induces redistribution of internal stresses, and scale effects blur the actual phenomena if observed on a reduced-size model. In order to support a desirable 'design by testing' approach, such a facility allowing full-scale testing certainly offers a very efficient tool.

The National Optical Astronomy Observatory (NOAO) is the U.S. national research and development center for ground-based nighttime astronomy. The NOAO librarian manages the organization’s publications tracking and metrics program, which consists of three components: identifying publications, organizing citation data, and disseminating publications information. We are developing methods to streamline these tasks, better organize our data, provide greater accessibility to publications data, and add value to our services.Our publications tracking process is complex, as we track refereed publications citing data from several sources: NOAO telescopes at two observatory sites, telescopes of consortia in which NOAO participates, the NOAO Science Archive, and NOAO-granted community-access time on non-NOAO telescopes. We also identify and document our scientific staff publications. In addition, several individuals contribute publications data.In the past year, we made several changes in our publications tracking and metrics program. To better organize our data and streamline the creation of reports and metrics, we created a MySQL publications database. When designing this relational database, we considered ease of use, the ability to incorporate data from various sources, efficiency in data inputting and sorting, and potential for growth. We also considered the types of metrics we wished to generate from our publications data based on our target audiences and the messages we wanted to convey. To increase accessibility and dissemination of publications information, we developed a publications section on the library’s website, with citation lists, acknowledgements guidelines, and metrics. We are now developing a searchable online database for our website using PHP.The publications tracking and metrics program has provided many opportunities for the library to market its services and contribute to the organization’s mission. As we make decisions on collecting, organizing

Development of successful business models has become a necessity in turbulent business environments, but compared to research on business modeling tools, attention to the role of metrics in designing business models in literature is limited. Building on existing approaches to business models and

This document outlines a set of metrics for evaluating the diagnostic and prognostic schemes developed for the Vehicle Integrated Prognostic Reasoner (VIPR), a system-level reasoner that encompasses the multiple levels of large, complex systems such as those for aircraft and spacecraft. VIPR health managers are organized hierarchically and operate together to derive diagnostic and prognostic inferences from symptoms and conditions reported by a set of diagnostic and prognostic monitors. For layered reasoners such as VIPR, the overall performance cannot be evaluated by metrics solely directed toward timely detection and accuracy of estimation of the faults in individual components. Among other factors, overall vehicle reasoner performance is governed by the effectiveness of the communication schemes between monitors and reasoners in the architecture, and the ability to propagate and fuse relevant information to make accurate, consistent, and timely predictions at different levels of the reasoner hierarchy. We outline an extended set of diagnostic and prognostics metrics that can be broadly categorized as evaluation measures for diagnostic coverage, prognostic coverage, accuracy of inferences, latency in making inferences, computational cost, and sensitivity to different fault and degradation conditions. We report metrics from Monte Carlo experiments using two variations of an aircraft reference model that supported both flat and hierarchical reasoning.

This proceeding is based on a talk prepared for the XIII Marcel Grossmann meeting. We summarise some results of work in progress, in collaboration with Giovanni Amelino-Camelia, on momentum-dependent (rainbow) metrics in a Relative Locality framework, and we show that this formalism is equivalent to the Hamiltonian formalization of Relative Locality obtained in arXiv:1102.4637.

We generalize the formulation of the colliding gravitational waves to metric-affine theories and present an example of such kind of exact solutions. The plane waves are equipped with five symmetries and the resulting geometry after the collision possesses two spacelike Killing vectors.

I study a thermodynamical approach to scalar metric perturbations during the inflationary stage. In the power-law expanding universe studied here, I find a negative heat capacity as a manifestation of superexponential growth of the number of states on super-Hubble scales. The power spectrum depends on the Gibbons-Hawking and Hagedorn temperatures.

Developing accurate models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. In modern computer science and engineering, formal verification and validation processes are used to assess model accuracy and establish confidence in the predictive capabilities of a given model. This paper provides an overview of the key guiding principles and best practices for the development of validation metrics, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. Particular emphasis is given to the importance of uncertainty quantification and its inclusion within the metrics, and the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment. As a starting point, the structure of commonly used global transport model metrics and their limitations is reviewed. An alternate approach is then presented, which focuses upon comparisons of predicted local fluxes, fluctuations, and equilibrium gradients against observation. The utility of metrics based upon these comparisons is demonstrated by applying them to gyrokinetic predictions of turbulent transport in a variety of discharges performed on the DIII-D tokamak [J. L. Luxon, Nucl. Fusion 42, 614 (2002)], as part of a multi-year transport model validation activity.
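
Such flux-level comparisons with uncertainty quantification are often implemented as a reduced chi-squared style distance. The sketch below shows one such construction; it is a generic illustration, not necessarily the exact metric definition used in the study.

```python
import numpy as np

def validation_metric(sim, exp, sigma_sim, sigma_exp):
    """Reduced chi-squared style validation metric: the normalized distance
    between predicted and observed local quantities (fluxes, fluctuation
    levels, gradients), folding both simulation and experimental
    uncertainties into the normalization. A generic construction, not
    necessarily the paper's exact definition."""
    sim, exp = np.asarray(sim, float), np.asarray(exp, float)
    var = np.asarray(sigma_sim, float) ** 2 + np.asarray(sigma_exp, float) ** 2
    return np.mean((sim - exp) ** 2 / var)

# Values near or below 1 indicate agreement within the stated uncertainties.
print(validation_metric(sim=[1.0, 2.1], exp=[1.2, 1.9],
                        sigma_sim=[0.2, 0.2], sigma_exp=[0.2, 0.3]))
```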

The cost of metric conversion in school shops is examined. The author categorizes all the shops in the school and gives useful information on which shops are the easiest to convert, which are the most complicated, where resistance is most likely to be met, and where conversion is most urgent. The math department is seen as a catalyst. (Editor/HD)

This paper considers a cloud computing setting in which similarity querying of metric data is outsourced to a service provider. The data is to be revealed only to trusted users, not to the service provider or anyone else. Users query the server for the most similar data objects to a query example...

Yau proved an existence theorem for Ricci-flat Kähler metrics in the 1970s, but we still have no closed form expressions for them. Nevertheless there are several ways to get approximate expressions, both numerical and analytical. We survey some of this work and explain how it can be used to obtain physical predictions from superstring theory.

We prove a metrical result on a family of conjectures related to the Littlewood conjecture, namely the original Littlewood conjecture, the mixed Littlewood conjecture of de Mathan and Teulié and a hybrid between a conjecture of Cassels and the Littlewood conjecture. It is shown that the set of nu...

Intrinsic perceptual biases for simple duration ratios are thought to constrain the organization of rhythmic patterns in music. We tested that hypothesis by exposing listeners to folk melodies differing in metrical structure (simple or complex duration ratios), then testing them on alterations that preserved or violated the original metrical structure. Simple meters predominate in North American music, but complex meters are common in many other musical cultures. In Experiment 1, North American adults rated structure-violating alterations as less similar to the original version than structure-preserving alterations for simple-meter patterns but not for complex-meter patterns. In Experiment 2, adults of Bulgarian or Macedonian origin provided differential ratings to structure-violating and structure-preserving alterations in complex- as well as simple-meter contexts. In Experiment 3, 6-month-old infants responded differentially to structure-violating and structure-preserving alterations in both metrical contexts. These findings imply that the metrical biases of North American adults reflect enculturation processes rather than processing predispositions for simple meters.

We study a quantum information metric (or fidelity susceptibility) in conformal field theories with respect to a small perturbation by a primary operator. We argue that its gravity dual is approximately given by a volume of maximal time slice in an AdS spacetime when the perturbation is exactly marginal. We confirm our claim in several examples.

In the present paper we introduce the concepts of strongly ideal convergent sequence and strong ideal Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong ideal limit points and the strong ideal cluster points of a sequence in this space and investigate some properties of these concepts.

NREL began work for DOE on this project to standardize the measurement and characterization of building energy performance. NREL's primary research objectives were to determine which performance metrics have greatest value for determining energy performance and to develop standard definitions and methods of measuring and reporting that performance.

We prove that a given normalized compact metric space induces a σ-max-superdecomposable measure, by constructing a Hausdorff pseudometric on its power set. We also prove that the restriction of this set function to the algebra of all measurable sets is a σ-max-decomposable measure. Finally we conclude this paper with two open problems.

Development of successful business models has become a necessity in turbulent business environments, but compared to research on business modeling tools, attention to the role of metrics in designing business models in literature is limited. Building on existing approaches to business models and per

We describe the Metrics Analysis Framework (MAF), an open-source python framework developed to provide a user-friendly, customizable, easily-extensible set of tools for analyzing data sets. MAF is part of the Large Synoptic Survey Telescope (LSST) Simulations effort. Its initial goal is to provide a tool to evaluate LSST Operations Simulation (OpSim) simulated surveys to help understand the effects of telescope scheduling on survey performance, however MAF can be applied to a much wider range of datasets. The building blocks of the framework are Metrics (algorithms to analyze a given quantity of data), Slicers (subdividing the overall data set into smaller data slices as relevant for each Metric), and Database classes (to access the dataset and read data into memory). We describe how these building blocks work together, and provide an example of using MAF to evaluate different dithering strategies. We also outline how users can write their own custom Metrics and use these within the framework.
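
The Metric/Slicer/Database decomposition can be made concrete with a toy example. The sketch below is a hypothetical mini-framework in the same spirit; the class names, method names and data layout are illustrative assumptions, not the real MAF API.

```python
import numpy as np

class MeanMetric:
    """Metric: an algorithm applied to one slice of data."""
    def __init__(self, column):
        self.column = column
    def run(self, data_slice):
        return np.mean(data_slice[self.column])

class PatchSlicer:
    """Slicer: subdivides the data set into smaller slices, here by an
    integer sky-patch id (standing in for, e.g., a Healpix grid)."""
    def __init__(self, patch_column):
        self.patch_column = patch_column
    def slices(self, data):
        for patch in np.unique(data[self.patch_column]):
            yield patch, data[data[self.patch_column] == patch]

def run_metric(metric, slicer, data):
    """Evaluate a metric over every slice produced by the slicer."""
    return {key: metric.run(s) for key, s in slicer.slices(data)}

# A structured array stands in for a table read by a Database class:
data = np.array([(0, 21.3), (0, 21.9), (1, 22.4)],
                dtype=[('patch', int), ('depth', float)])
print(run_metric(MeanMetric('depth'), PatchSlicer('patch'), data))
```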

We extend a recently introduced universal grayscale image quality index to a newly developed perceptually decorrelated colour space. The resulting colour image fidelity metric quantifies the distortion of a processed colour image relative to its original version. We evaluated the new colour image fidelity metric...
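
Assuming the grayscale index being extended is the Wang-Bovik universal quality index, a per-channel colour version might look like the sketch below; the colour-space transform is left as a placeholder rather than the paper's decorrelated space, and the index is computed globally instead of over sliding windows.

```python
import numpy as np

def universal_quality_index(x, y):
    """Wang-Bovik universal quality index between two grayscale images:
    Q = 4*cov(x,y)*mean(x)*mean(y) / ((var(x)+var(y)) * (mean(x)^2+mean(y)^2)),
    computed here globally rather than over sliding windows."""
    x, y = x.astype(float).ravel(), y.astype(float).ravel()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return 4 * cov * mx * my / ((vx + vy) * (mx ** 2 + my ** 2))

def colour_fidelity(orig, proc, transform=lambda im: im):
    """Colour fidelity in the spirit of the abstract: map both images into a
    decorrelated colour space (the identity transform here is a placeholder),
    apply the index per channel, and average."""
    o, p = transform(orig), transform(proc)
    return np.mean([universal_quality_index(o[..., c], p[..., c])
                    for c in range(o.shape[-1])])
```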

Cohesion is one of the most important factors for software quality, as well as maintainability, reliability and reusability. Module cohesion is defined as a quality attribute that seeks to measure the singleness of purpose of a module. A module of poor quality can be a serious obstacle to system quality. In order to design software of good quality, software managers and engineers need to introduce cohesion metrics to measure and produce desirable software; highly cohesive software is considered desirable. In this paper, we propose a function-oriented cohesion metric based on the analysis of live variables, live span and the visualization of the processing-element dependency graph. We give six typical cohesion examples to be measured as our experiments and justification. The result is a well-defined, well-normalized, well-visualized and well-experimented cohesion metric that indicates, and thus helps enhance, software cohesion strength. Furthermore, this cohesion metric can easily be incorporated into software CASE tools to help software engineers improve software quality.

In this short Note we would like to bring to the attention of people working in General Relativity a Schwarzschild-like metric found by Professor Cleopatra Mociuţchi in the sixties. It was obtained by following A. Sommerfeld's reasoning from his treatise "Elektrodynamik", but using the relativistic energy conservation law instead of the one from classical physics.

Abstract – Business marketing is one of the areas that has been tremendously affected by the digital world in the last few years. Digital marketing refers to advertising through digital channels. This paper provides a detailed study of metrics to measure the success of digital marketing platforms and a glimpse of the future of these technologies by 2020.

This report details the petroleum savings and vehicle emissions reductions achieved by the U.S. Department of Energy's Clean Cities program in 2011. The report also details other performance metrics, including the number of stakeholders in Clean Cities coalitions, outreach activities by coalitions and national laboratories, and alternative fuel vehicles deployed.

This report details the petroleum savings and vehicle emissions reductions achieved by the U.S. Department of Energy's Clean Cities program in 2010. The report also details other performance metrics, including the number of stakeholders in Clean Cities coalitions, outreach activities by coalitions and national laboratories, and alternative fuel vehicles deployed.

We show that one can define (p,∞)-summable spectral triples using degenerate metrics on smooth manifolds. Furthermore, these triples satisfy Connes-Moscovici's discrete and finite dimension spectrum hypothesis, allowing one to use the Local Index Theorem [1] to compute the pairing with K-theory. We demonstrate this with a concrete example.

Problem statement: Typically, the accuracy metric is often applied for optimizing heuristic or stochastic classification models. However, the use of the accuracy metric might lead the searching process to sub-optimal solutions due to its less discriminating values, and it is also not robust to changes in class distribution. Approach: To counter these detrimental effects, we propose a novel performance metric which combines the beneficial properties of the accuracy metric with the extended recall and precision metrics. We call this new performance metric Optimized Accuracy with Recall-Precision (OARP). Results: In this study, we demonstrate that the OARP metric is theoretically better than the accuracy metric using four generated examples. We also demonstrate empirically that a naïve stochastic classification algorithm, Monte Carlo Sampling (MCS), trained with the OARP metric is able to obtain better predictive results than the one trained with the conventional accuracy metric. Additionally, t-test analysis shows a clear advantage of the MCS model trained with the OARP metric over the accuracy metric alone for all binary data sets. Conclusion: The experiments prove that the OARP metric leads stochastic classifiers such as MCS towards a better training model, which in turn improves the predictive results of any heuristic or stochastic classification model.
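
The abstract does not spell out the OARP formula, so the sketch below only illustrates the general idea of combining accuracy with recall and precision into a single, more discriminating score; the weighting scheme is a hypothetical stand-in, not the published metric.

```python
def combined_score(tp, fp, fn, tn, w=(0.5, 0.25, 0.25)):
    """Hypothetical stand-in for an OARP-like metric: a weighted mean of
    accuracy, recall and precision computed from a binary confusion matrix.
    The real OARP combination is defined in the paper, not reproduced here."""
    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total
    recall = tp / (tp + fn) if tp + fn else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    wa, wr, wp = w
    return wa * accuracy + wr * recall + wp * precision

# The combined score is more discriminating than accuracy alone:
print(combined_score(tp=40, fp=10, fn=10, tn=40))  # balanced classifier: 0.8
print(combined_score(tp=0,  fp=0,  fn=20, tn=80))  # majority guesser: 0.4
```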

In this work a versatile system for X-ray excited optical luminescence (XEOL) measurements is presented. The apparatus was assembled from a sample holder connected to an optical fiber responsible for the acquisition of the scintillation signal. The spectrum is registered with a CCD coupled to a spectrograph equipped with diffraction gratings. The system performance was analyzed by exciting GdAlO3:Eu³⁺ 3.0 at.% with X-rays from a diffractometer and measuring the emission spectra. The system can be used to obtain precise and reliable spectroscopic properties of samples in various conformations without loss of the required safety when dealing with ionizing radiation. (author)

This paper presents a range of products for ceramic tile decoration, characterized by a microcrack structure after application on a ceramic substrate and subsequent firing. This structure gives rise to a multicoloured iridescent effect that confers differential aesthetic characteristics on the ceramic tile, with a final aspect similar to rainbow quartz or iris quartz. An analysis of the state of the art is made, together with a deeper study and characterization of the physical-optical phenomena responsible for this kind of effect in these minerals, with the aim of determining and modelling the multicoloured or iridescent effect, its cause, the physical-optical phenomenon that produces it, and the phenomenological mechanisms behind it, so that the effect can be extrapolated to our ceramic materials. (Author)

This work describes preliminary experiments that will provide subsidies to analyze the feasibility of implementing a system able to capture radiological images with a new sensor system, comprised of an FO scanning process and an I-CCD camera. These experiments have the main objective of analyzing the optical response of the FO bundle, with several types of scintillators associated with it, when submitted to medical X-ray exposure. (author)

The typical procedure for evaluating the performance of different objective quality metrics and indices involves comparisons between subjective quality ratings and the quality indices obtained using the objective metrics in question on the known video sequences. Several correlation indicators can...

In this work we construct a family of spherically symmetric, static, charged regular black hole metrics in the context of Einstein-nonlinear electrodynamics theory. The construction of the charged regular black hole metrics is based on three requirements: (a) the weak energy condition should be satisfied, (b) the energy–momentum tensor should have the symmetry $T_0^0 = T_1^1$, and (c) these metrics have to asymptotically behave as the Reissner–Nordström black hole metric. In addition, these charged regular black hole metrics depend on two parameters which for specific values yield regular black hole metrics that already exist in the literature. Furthermore, by relaxing the third requirement, we construct more general regular black hole metrics which do not behave asymptotically as a Reissner–Nordström black hole metric.

The metric traveling salesman problem is one of the most prominent APX-complete optimization problems. An important particularity of this problem is that there is a large gap between the known upper bound and lower bound on the approximability (assuming P ≠ NP). In fact, despite more than 30 years of research, no one could find a better approximation algorithm than the 1.5-approximation provided by Christofides. The situation is similar for a related problem, the metric Hamiltonian path problem, where the start and the end of the path are prespecified: the best approximation ratio up to date is 5/3 by an algorithm presented by Hoogeveen almost 20 years ago.
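
For reference, Christofides' algorithm itself is compact. The sketch below uses NetworkX helpers and assumes a complete graph whose edge weights satisfy the triangle inequality; newer NetworkX releases also bundle their own implementation.

```python
import networkx as nx

def christofides_tour(G):
    """Christofides' 1.5-approximation for the metric TSP on a complete
    weighted graph G (edge weights must satisfy the triangle inequality)."""
    T = nx.minimum_spanning_tree(G)                 # 1. minimum spanning tree
    odd = [v for v, d in T.degree() if d % 2 == 1]  # 2. odd-degree vertices
    M = nx.min_weight_matching(G.subgraph(odd))     #    min-weight perfect matching
    H = nx.MultiGraph(T)                            # 3. tree + matching is Eulerian
    H.add_edges_from(M)
    tour, seen = [], set()                          # 4. shortcut the Euler circuit
    for u, _ in nx.eulerian_circuit(H):
        if u not in seen:
            seen.add(u)
            tour.append(u)
    return tour + tour[:1]

# Four cities with weights |u - v|, which satisfy the triangle inequality:
G = nx.complete_graph(4)
for u, v in G.edges:
    G[u][v]['weight'] = abs(u - v)
print(christofides_tour(G))  # e.g. [0, 1, 2, 3, 0]
```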

Secure orchestration is an important concern in the Internet of Services. Next to providing the required functionality, composite services must also provide a reasonable level of security in order to protect sensitive data. Thus, the orchestrator needs to check whether a complex service is able to satisfy certain properties. Some properties are expressed with metrics for precise definition of requirements; the problem is thus to analyse the values of metrics for a complex business process. In this paper we extend our previous work on the analysis of secure orchestration with quantifiable properties. We show how to define, verify and enforce quantitative security requirements in one framework together with other security properties. The proposed approach should help to select the most suitable service architecture and guarantee fulfilment of the declared security requirements.

The increasing volume of physics data is posing a critical challenge to the ATLAS experiment. In anticipation of high-luminosity physics, automation of everyday data management tasks has become necessary; previously, many of these tasks required human decision-making and operation. Recent advances in hardware and software have made it possible to entrust more complicated duties to automated systems using models trained by machine learning algorithms. In this contribution we show results from our ongoing automation efforts. First, we describe our framework for distributed data management and network metrics, which automatically extracts and aggregates data, trains models with various machine learning algorithms, and eventually scores the resulting models and parameters. Second, we use these models to forecast metrics relevant for network-aware job scheduling and data brokering. We show the characteristics of the data and evaluate the forecasting accuracy of our models.

Software quality depends on several factors, such as on-time delivery, staying within budget and fulfilling users' needs. Complexity is one of the most important factors that may affect quality; therefore, measuring and controlling complexity results in improved quality. So far, most research has tried to identify and measure complexity in the design and code phases. However, when we have the code or design for software, it is too late to control complexity. In this article, with emphasis on the Requirement Engineering process, we analyze the causes of software complexity, particularly in the first phase of software development, and propose a requirement-based metric. This metric enables a software engineer to measure the complexity before actual design and implementation and to choose strategies appropriate to the degree of software complexity, thus saving cost and human resources and, more importantly, leading to lower maintenance costs.

The past three years have seen exponential growth in the number of organizations who have elected to entrust core information technology functions to application service providers. Of particular interest is the outsourcing of critical systems such as corporate databases. Major banks and financial service firms are contracting with third-party organizations, sometimes overseas, for their database needs. These sophisticated contracts require careful supervision by both parties. Due to the complexities of web-based applications and the complicated nature of databases, an entire class of software suites has been developed to measure the quality of service the database is providing. This article investigates the performance metrics which have evolved to satisfy this need and describes a taxonomy of performance metrics for hosted databases.

After almost 20 years of intensive research on magnetocaloric effects near room temperature, magnetic refrigeration with first-order magnetocaloric materials has come close to real-life applications. Many materials have been discussed as potential candidates to be used in multicaloric devices. However, phase transitions in ferroic materials are often hysteretic and a metric is needed to estimate the detrimental effects of this hysteresis. We propose the coefficient of refrigerant performance, which compares the net work in a reversible cycle with the positive work on the refrigerant, as a universal metric for ferroic materials. Here, we concentrate on examples from magnetocaloric materials and only consider one barocaloric experiment. This is mainly due to lack of data on electrocaloric materials. It appears that adjusting the field-induced transitions and the hysteresis effects can minimize the losses in first-order materials.This article is part of the themed issue 'Taking the temperature of phase transitions in cool materials'.

For a metric space $X$, we study the space $D^\infty(X)$ of bounded functions on $X$ whose infinitesimal Lipschitz constant is uniformly bounded. $D^\infty(X)$ is compared with the space $\mathrm{LIP}^\infty(X)$ of bounded Lipschitz functions on $X$, in terms of different properties regarding the geometry of $X$. We also obtain a Banach-Stone theorem in this context. In the case of a metric measure space, we also compare $D^\infty(X)$ with the Newtonian-Sobolev space $N^{1,\infty}(X)$. In particular, if $X$ supports a doubling measure and satisfies a local Poincaré inequality, we obtain that $D^\infty(X) = N^{1,\infty}(X)$.

The relative growth of field and metric perturbations during preheating is sensitive to initial conditions set in the preceding inflationary phase. Recent work suggests this may protect super-Hubble metric perturbations from resonant amplification during preheating. We show that this possibility is fragile and extremely sensitive to the specific form of the interactions between the inflaton and other fields. The suppression is naturally absent in two classes of preheating in which either (1) the critical points (hence the vacua) of the effective potential during inflation are deformed away from the origin, or (2) the effective masses of fields during inflation are small but during preheating are large. Unlike the simple toy model of a $g^2\phi^2\chi^2$ coupling, most realistic particle physics models contain these other features. Moreover, they generically lead to both adiabatic and isocurvature modes and non-Gaussian scars on super-Hubble scales. Large-scale coherent magnetic fields may also appear naturally...

Integral flow surfaces constitute a widely used flow visualization tool due to their capability to convey important flow information such as fluid transport, mixing, and domain segmentation. Current flow surface rendering techniques limit their expressiveness, however, by focusing virtually exclusively on displacement visualization, visually neglecting the more complex notion of deformation such as shearing and stretching that is central to the field of continuum mechanics. To incorporate this information into the flow surface visualization and analysis process, we derive a metric tensor field that encodes local surface deformations as induced by the velocity gradient of the underlying flow field. We demonstrate how properties of the resulting metric tensor field are capable of enhancing present surface visualization and generation methods and develop novel surface querying, sampling, and visualization techniques. The provided results show how this step towards unifying classic flow visualization and more advanced concepts from continuum mechanics enables more detailed and improved flow analysis.
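
One generic way to build such a deformation-encoding metric tensor, not necessarily the paper's exact construction, is the right Cauchy-Green tensor C = F^T F, where F is the Jacobian of the flow map that advects surface points:

```python
import numpy as np

def deformation_metric(flow_map, x, h=1e-4):
    """Metric tensor C = F^T F at point x, where F is the finite-difference
    Jacobian of the flow map. Eigenvalues of C give squared stretching
    factors; off-diagonal terms capture shear."""
    x = np.asarray(x, float)
    n = x.size
    F = np.empty((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        F[:, j] = (flow_map(x + e) - flow_map(x - e)) / (2 * h)
    return F.T @ F

# A simple shear flow (x, y) -> (x + 0.5*y, y) deforms without displacement
# at the origin; eigenvalues of C different from 1 indicate deformation.
C = deformation_metric(lambda p: np.array([p[0] + 0.5 * p[1], p[1]]),
                       x=[0.0, 0.0])
print(np.linalg.eigvalsh(C))
```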

Metrics structures stemming from the Connes distance promote Moyal planes to the status of quantum metric spaces. We discuss this aspect in the light of recent developments, emphasizing the role of Moyal planes as representative examples of a recently introduced notion of quantum (noncommutative) locally compact space. We move then to the framework of Lorentzian noncommutative geometry and we examine the possibility of defining a notion of causality on Moyal plane, which is somewhat controversial in the area of mathematical physics. We show the actual existence of causal relations between the elements of a particular class of pure (coherent) states on Moyal plane with related causal structure similar to the one of the usual Minkowski space, up to the notion of locality.

The Planck vacuum (PV) is assumed to be the source of the visible universe. Under conditions of sufficient stress, there must then exist a pathway through which energy from the PV can travel into this universe. Conversely, the passage of energy from the visible universe to the PV must also exist under the same stressful conditions. The following examines two versions of the Schwarzschild metric equation for compatibility with this open-pathway idea.

Poetry is a special combination of musical and linguistic qualities, of sounds regarded both as pure sound and as meaningful speech. Part of the pleasure of poetry lies in its relationship with music. Metrics, including rhythm and meter, is an important method for poetry to express poetic sentiment. Through the introduction of poetic language and typical examples, the writer of this paper tries to discuss the relationship between sound and meaning.

"Autonomous Exploration Using an Information Gain Metric", by Nicholas C Fung, Jason M Gregory, and John G Rogers: exploration progress is quantified by computing the entropy of the robot's a posteriori pose estimate, and the robot's pose history along its trajectory is captured by the mapping system. The man-portable robot was equipped with additional computing hardware to increase the capabilities of the platform.

The aim of this article is to analyze the asymptotic properties of the C-metric, using a general method specified in the work of Tafel and coworkers [1], [2], [3]. By finding an appropriate conformal factor $\Omega$, it allows the investigation of the asymptotic properties of a given asymptotically flat spacetime. The news function and Bondi mass aspect are computed, and their general properties are analyzed, as well as the small mass, small acceleration, and small and large Bondi time limits.

"Ocean Model Assessment With Lagrangian Metrics" (Pearn P. Niiler, Scripps Institution of Oceanography, 9500 Gilman Drive MC 0213, La Jolla, CA): the objectives of this project are to aid in the development of accurate modeling of upper ocean circulation by using circulation observations to test models. These tests, or metrics, will be statistical measures of model and data comparisons. It is believed that having accurate models of upper ocean currents will...

The Møller energy distribution (due to matter and fields, including gravity) of the gamma metric is studied in teleparallel gravity. The result is the same as those obtained in general relativity by Virbhadra using the Weinberg complex and by Yang and Radinschi using the Møller definition. Our result is also independent of the three teleparallel dimensionless coupling constants, which means that it is valid not only in the teleparallel equivalent of general relativity but in any teleparallel model.

We show that for any metric space $X$ the condition \[ \int_X\int_X\int_X c(z_1,z_2,z_3)^2 \, d\mathcal{H}^1 z_1 \, d\mathcal{H}^1 z_2 \, d\mathcal{H}^1 z_3 < \infty, \] where $c(z_1,z_2,z_3)$ is the Menger curvature of the triple $(z_1,z_2,z_3)$ and $\mathcal{H}^1$ the one-dimensional Hausdorff measure, guarantees that $X$ is rectifiable.

This article presents different types of collaborative systems, their structure and classification. It defines the concept of a virtual campus as a collaborative system and builds an architecture for a virtual campus oriented toward collaborative training processes. It analyses the quality characteristics of collaborative systems and proposes techniques for metrics construction and validation in order to evaluate them. The article also analyzes different ways to increase the efficiency and performance level of collaborative banking systems.

We propose a new method for anisotropic polygonal surface remeshing. Our algorithm takes as input a surface triangle mesh. An anisotropic rectangular metric, defined at each triangle facet of the input mesh, is derived from both a user-specified normal-based tolerance error and the requirement to favor rectangle-shaped polygons. Our algorithm uses a greedy optimization procedure that adds, deletes and relocates generators so as to match two criteria related to partitioning and conformity.

To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. It measures 21 metrics to provide insight into the grid’s capacity to embody these characteristics. This report looks across a spectrum of smart grid concerns to measure the status of smart grid deployment and impacts.

...improve the bottom line of an organization. The first step of this process is to solicit the key performance indicators (KPIs) that best reflect the organization's mission. The second step is to use and/or develop metrics based on those KPIs to measure the organization's mission performance today. The third step is to capture the trends of those KPIs over time to see if the organization is getting better or worse. The final step is to...

Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. In this essay, we describe why article-level metrics are an important extension of traditional citation-based journal metrics and provide a number of examples from ALM data collected for PLOS Biology.

Distance metric learning (DML) is a critical factor for image analysis and pattern recognition. To learn a robust distance metric for a target task, we need abundant side information (i.e., the similarity/dissimilarity pairwise constraints over the labeled data), which is usually unavailable in practice due to the high labeling cost. This paper considers the transfer learning setting by exploiting the large quantity of side information from certain related, but different source tasks to help with target metric learning (with only a little side information). The state-of-the-art metric learning algorithms usually fail in this setting because the data distributions of the source task and target task are often quite different. We address this problem by assuming that the target distance metric lies in the space spanned by the eigenvectors of the source metrics (or other randomly generated bases). The target metric is represented as a combination of the base metrics, which are computed using the decomposed components of the source metrics (or simply a set of random bases); we call the proposed method, decomposition-based transfer DML (DTDML). In particular, DTDML learns a sparse combination of the base metrics to construct the target metric by forcing the target metric to be close to an integration of the source metrics. The main advantage of the proposed method compared with existing transfer metric learning approaches is that we directly learn the base metric coefficients instead of the target metric. To this end, far fewer variables need to be learned. We therefore obtain more reliable solutions given the limited side information and the optimization tends to be faster. Experiments on the popular handwritten image (digit, letter) classification and challenge natural image annotation tasks demonstrate the effectiveness of the proposed method.
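
A loose sketch of the decomposition idea follows; the target task's side-information term is omitted for brevity, and all names and the simple projected-gradient solver are illustrative assumptions rather than the paper's algorithm.

```python
import numpy as np

def dtdml_sketch(source_metrics, lam=0.1, steps=500, lr=0.01):
    """Build rank-one base metrics u u^T from eigenvectors of the source
    metrics, then learn a sparse nonnegative combination that stays close
    to the average ('integration') of the sources via projected
    (sub)gradient descent on a least-squares + L1 objective."""
    bases = []
    for M in source_metrics:
        _, V = np.linalg.eigh(M)
        for k in range(V.shape[1]):
            u = V[:, k:k + 1]
            bases.append(u @ u.T)            # rank-one base metric
    integration = sum(source_metrics) / len(source_metrics)
    a = np.zeros(len(bases))                  # combination coefficients
    for _ in range(steps):
        R = sum(ai * B for ai, B in zip(a, bases)) - integration
        grad = np.array([2 * np.sum(R * B) for B in bases]) + lam
        a = np.maximum(a - lr * grad, 0.0)    # L1 + nonnegativity => sparsity
    target_metric = sum(ai * B for ai, B in zip(a, bases))
    return target_metric, a
```

Learning the few coefficients a, instead of the full target metric, is what makes the approach workable with limited side information.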

An auxiliary metric (reference metric) is inevitable in massive gravity theory. In the scenario of the gauge/gravity duality, massive gravity with a singular reference metric is used to study momentum dissipation, which describes the electric and heat conductivity for normal conductors. We demonstrate in detail that the de Rham-Gabadadze-Tolley (dRGT) massive gravity with a singular reference metric is ghost free.

...motivate continuous improvement and likewise quality. Attributes of Meaningful Metrics, Section Overview: the importance of metrics cannot be overstated... some of the attributes of meaningful measures discussed earlier in this chapter. The Metrics Handbook: this guide is utilized by a variety of Air... Metric Assessment Tool... Metric Selection: the metric assessment tool was designed to apply to any type of metric. Two criteria were established for...

By using the world's linguistic diversity, the study of meaning can be transformed from an introspective inquiry into a subject of empirical investigation. For this to be possible, the notion of meaning has to be operationalized by defining the meaning of an expression as the collection of all contexts in which the expression can be used. Under this definition, meaning can be empirically investigated by sampling contexts. A semantic map is a technique to show the relations between such sampled contextual occurrences. Or, formulated more technically, a semantic map is a visualization of a metric on contexts sampled to represent a domain of meaning. Or, put more succinctly, a semantic map is a metric on meaning. To establish such a metric, a notion of (dis)similarity is needed. The similarity between two meanings can be empirically investigated by looking at their encoding in many different languages. The more similar these encodings, in language after language, the more similar the contexts. So, to investigate the similarity between two contextualized meanings, only judgments about the similarity between expressions within the structure of individual languages are needed. As an example of this approach, data on cross-linguistic variation in inchoative/causative alternations from Haspelmath (1993) is reanalyzed.

Organizations often need to release microdata without revealing sensitive information. To this end, data are anonymized and, to assess the quality of the process, various privacy metrics have been proposed, such as k-anonymity, l-diversity, and t-closeness. These metrics are able to capture different aspects of the disclosure risk, imposing minimal requirements on the association of an individual with the sensitive attributes. If we want to combine them in an optimization problem, we need a common framework able to express all these privacy conditions. Previous studies proposed the notion of mutual information to measure the different kinds of disclosure risks and the utility but, since mutual information is an average quantity, it is not able to completely express these conditions on single records. We introduce here the notion of one-symbol information (i.e., the contribution to mutual information by a single record) that allows us to express and compare the disclosure risk metrics. In addition, we obtain a relation between the risk values t and l, which can be used for parameter setting. We also show, by numerical experiments, how l-diversity and t-closeness can be represented in terms of two different, but equally acceptable, conditions on the information gain.
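
A plausible reading of one-symbol information is the pointwise (per-cell) contribution to mutual information; under that assumption, a minimal sketch is:

```python
import numpy as np

def one_symbol_information(p_joint):
    """Per-cell contribution i(x;y) = log2 p(x,y) / (p(x) p(y)); averaging
    i over the joint distribution recovers the mutual information."""
    px = p_joint.sum(axis=1, keepdims=True)
    py = p_joint.sum(axis=0, keepdims=True)
    with np.errstate(divide='ignore'):
        return np.log2(p_joint / (px * py))

# Mutual information as the expectation of the one-symbol values:
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])
i = one_symbol_information(p)
print(np.sum(p * i))  # about 0.278 bits
```

Unlike the averaged mutual information, the per-record values i can be bounded cell by cell, which is what allows record-level disclosure conditions to be expressed.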

Due to the large quantity of low-cost, high-speed computational processing available today, computational imaging (CI) systems are expected to have a major role in next-generation multifunctional cameras. The purpose of this work is to quantify the performance of these CI systems in a standardized manner. The diversity of CI system designs available today, or proposed for the near future, poses significant challenges to modeling and calculating a standardized detection signal-to-noise ratio (SNR) for measuring the performance of these systems. In this paper, we develop a path forward for a standardized detectivity metric for CI systems. The detectivity metric is designed to evaluate the performance of a CI system searching for a specific known target or signal of interest, and is defined as the optimal linear matched filter SNR, similar to the Hotelling SNR, calculated in computational space with special considerations for standardization. The detectivity metric is thus designed to be flexible, handling various types of CI systems and specific targets while keeping the complexity of, and assumptions about, the systems to a minimum.

Data from recent experiments at North Carolina State University and other locations provide a unique opportunity to study the effect of ambient ozone on the growth of clover. The data consist of hourly ozone measurements over a 140 day growing season at eight sites in the US, coupled with clover growth response data measured every 28 days. The objective is to model an indicator of clover growth as a function of ozone exposure. A common strategy for dealing with the numerous hourly ozone measurements is to reduce these to a single summary measurement, a so-called exposure metric, for the growth period of interest. However, the mean ozone value is not necessarily the best summarization, as it is widely believed that low levels of ozone have a negligible effect on growth, whereas peak ozone values are deleterious to plant growth. There are also suspected interactions with available sunlight, temperature and humidity. A number of exposure metrics have been proposed that reflect these beliefs by assigning different weights to ozone values according to magnitude, time of day, temperature and humidity. These weighting schemes generally depend on parameters that have, to date, been subjectively determined. We propose a statistical approach based on profile likelihoods to estimate the parameters in these exposure metrics.
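
Magnitude-dependent weighting of this kind is typified by sigmoid schemes such as the W126 ozone index. The sketch below computes an index of that form, leaving the weighting parameters free, since the paper's point is to estimate such parameters by profile likelihood rather than fix them subjectively (the defaults shown are the published W126 constants).

```python
import numpy as np

def sigmoid_weighted_exposure(hourly_ppm, M=4403.0, A=126.0):
    """Sigmoid-weighted cumulative exposure, sum_i w(C_i) * C_i with
    w(C) = 1 / (1 + M * exp(-A * C)) for C in ppm. The defaults reproduce
    the W126 form; in the spirit of the abstract, M and A would instead be
    treated as parameters to estimate."""
    c = np.asarray(hourly_ppm, float)
    weights = 1.0 / (1.0 + M * np.exp(-A * c))
    return float(np.sum(weights * c))

# Low concentrations get weight near 0, peak values weight near 1:
print(sigmoid_weighted_exposure([0.02] * 12))  # mild hours contribute little
print(sigmoid_weighted_exposure([0.10] * 12))  # peak hours dominate
```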

Let $(\mathcal{X},\Omega)$ be a closed polarized complex manifold, $g$ an extremal metric on $\mathcal{X}$ that represents the Kähler class $\Omega$, and $G$ a compact connected subgroup of the isometry group $\mathrm{Isom}(\mathcal{X},g)$. Assume that the Futaki invariant relative to $G$ is nondegenerate at $g$. Consider a smooth family $(\mathcal{M}\to B)$ of polarized complex deformations of $(\mathcal{X},\Omega)\simeq(\mathcal{M}_0,\Theta_0)$ provided with a holomorphic action of $G$. Then for every $t\in B$ sufficiently small, there exists an $h^{1,1}(\mathcal{X})$-dimensional family of extremal Kähler metrics on $\mathcal{M}_t$ whose Kähler classes are arbitrarily close to $\Theta_t$. We apply this deformation theory to analyze the Mukai-Umemura 3-fold and its complex deformations. In particular, we prove that there are certain complex deformations of the Mukai-Umemura 3-fold which have extremal metrics of non-constant scalar curvature with Kähler class $c_1$.

Earlier, Bassett et al [Phys Rev D 63 (2001) 023506] investigated the amplification of large-scale magnetic fields during preheating and inflation in several different models. They argued that in the presence of conductivity the resonance effect is weakened. From a dynamo equation in spacetimes endowed with torsion, recently derived by Garcia de Andrade [Phys Lett B 711: 143 (2012)], it is shown that in a universe with pure torsion in Minkowski spacetime the cosmological magnetic field is enhanced by ohmic or non-conductivity effects, which shows that metric-torsion effects are worth studying. In this paper we investigated the metric-torsion preheating perturbation, which leads to a seed cosmological magnetic field of the order of $B_{\mathrm{seed}} \sim 10^{-37}$ Gauss in the universe with torsion, several orders of magnitude weaker than the decoupling value of $10^{-15}$ Gauss obtained from pure metric preheating. Despite the weakness of this magnetic field, it may seed the galact...

In a multi-armed bandit problem, an online algorithm chooses from a set of strategies in a sequence of trials so as to maximize the total payoff of the chosen strategies. While the performance of bandit algorithms with a small finite strategy set is quite well understood, bandit problems with large strategy sets are still a topic of very active investigation, motivated by practical applications such as online auctions and web advertisement. The goal of such research is to identify broad and natural classes of strategy sets and payoff functions which enable the design of efficient solutions. In this work we study a very general setting for the multi-armed bandit problem in which the strategies form a metric space, and the payoff function satisfies a Lipschitz condition with respect to the metric. We refer to this problem as the "Lipschitz MAB problem". We present a complete solution for the multi-armed bandit problem in this setting. That is, for every metric space (L,X) we define an isometry invariant which bounds f...
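
The simplest baseline for this setting is uniform discretization of the metric space followed by a standard finite-armed algorithm such as UCB1. The sketch below implements that baseline on [0,1]; the paper's own solution is adaptive, so this fixed grid is only the usual point of comparison, and the tuning of K is a stated assumption.

```python
import math, random

def uniform_grid_ucb(payoff, horizon, lipschitz=1.0):
    """Discretize [0,1] into K evenly spaced arms (K tuned to the horizon
    and Lipschitz constant) and run UCB1 over them; returns the arm with
    the best empirical mean."""
    K = max(2, int((lipschitz * horizon) ** (1 / 3)))
    arms = [i / (K - 1) for i in range(K)]
    counts, sums = [0] * K, [0.0] * K
    for t in range(1, horizon + 1):
        if t <= K:
            i = t - 1                    # play each arm once to initialize
        else:
            i = max(range(K), key=lambda j: sums[j] / counts[j]
                    + math.sqrt(2 * math.log(t) / counts[j]))
        counts[i] += 1
        sums[i] += payoff(arms[i])
    return arms[max(range(K), key=lambda j: sums[j] / counts[j])]

# Noisy single-peaked payoff: the grid arm nearest 0.7 should win.
random.seed(0)
print(uniform_grid_ucb(lambda x: 1 - abs(x - 0.7) + random.gauss(0, 0.1), 5000))
```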

Computer security is one of the most complicated and challenging fields in technology today. A security metrics program provides a major benefit: looking at the metrics on a regular basis offers early clues to changes in attack patterns or environmental factors that may require changes in security strategy. The term "security metrics" loosely…

In this paper we obtain fixed point and common fixed point theorems for self-mappings defined on a metric-type space, an ordered metric-type space or a normal cone metric space. Moreover, some examples and an application to integral equations are given to illustrate the usability of the obtained results.

In this paper, we prove the existence and uniqueness of the weak cone geodesics in the space of Kähler cone metrics by solving the singular, homogeneous complex Monge-Ampère equation. As an application, we prove the metric space structure of the appropriate subspace of the space of Kähler cone metrics.

In this paper we present PRIMA: a new model tailored to symbolic music that detects the meter and the first downbeat position of a piece. Given onset data, the metrical structure of a piece is interpreted using the Inner Metric Analysis (IMA) model. IMA identifies the strong and weak metrical

We give a construction of two-sided invariant metrics on free products (possibly with amalgamation) of groups with two-sided invariant metrics and, under certain conditions, on HNN extensions of such groups. Our approach is similar to Graev's construction of metrics on free groups over pointed...

Australian universities are increasingly resorting to the use of journal metrics such as impact factors and ranking lists in appraisal and promotion processes, and are starting to set quantitative "performance expectations" which make use of such journal-based metrics. The widespread use and misuse of research metrics is leading to…

22 CFR § 226.15 (Foreign Relations, 2010), pre-award requirements for U.S. non-governmental organizations, "Metric system of measurement": ... declares that the metric system is the preferred measurement system for U.S. trade and commerce...

45 CFR § 74.15 (Public Welfare, 2010), pre-award requirements for ... organizations and commercial organizations, "Metric system of measurement": The Metric Conversion Act ... declares that the metric system is the preferred measurement system for U.S. trade and commerce. The...

22 CFR § 145.15 (Foreign Relations, 2010), "Metric system of measurement": The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205), declares that the metric system is the preferred measurement system for U.S. trade...

40 CFR § 30.15 (Protection of Environment, 2010), "Metric system of measurement": The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205), declares that the metric system is the preferred measurement system for U.S. trade and commerce. The...

15 CFR § 14.15 (Commerce and Foreign Trade, 2010), pre-award requirements for commercial organizations, "Metric system of measurement": The Metric Conversion Act ... declares that the metric system is the preferred measurement system for U.S. trade and commerce. The Act requires each...

36 CFR (Parks, Forests, and Public Property, 2010), "Metric system of measurement": The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205), declares that the metric system is the preferred measurement system for U.S. trade...

A number of routing metrics exist in wireless networks. These routing metrics were originally designed for mobile ad hoc networks (MANETs). When Wireless Mesh Networks (WMNs) came into being, the idea of introducing and using these routing metrics...

In 1994, Matthews [7] introduced the notion of partial metric spaces as part of his study of the denotational semantics of dataflow networks and obtained a generalization of the Banach contraction principle in partial metric spaces. In this paper, we prove stability results in partial metric spaces.

Introduction: We define a collection of metrics for describing and comparing sets of terms in controlled and uncontrolled indexing languages and then show how these metrics can be used to characterize a set of languages spanning folksonomies, ontologies and thesauri. Method: Metrics for term set characterization and comparison were identified and…

Optical space differs from physical space. The structure of optical space has generally been assumed to be metrical. In contradistinction, we do not assume any metric, but only incidence relations (i.e., we assume that optical points and lines exist and that two points define a unique line, and two lines a unique point). (The incidence relations have generally been assumed implicitly by earlier authors.) The condition that makes such an incidence structure into a projective space is the Pappus condition. The Pappus condition describes a projective relation between three collinear triples of points, whose validity can--in principle--be verified empirically. The Pappus condition is a necessary condition for optical space to be a homogeneous space (Lobatchevski hyperbolic or Riemann elliptic space) as assumed by, for example, the well-known Luneburg theory. We test the Pappus condition in a full-cue situation (open field, broad daylight, distances of up to 20 m, visual fields of up to 160 degrees diameter). We found that although optical space is definitely not veridical, even under full-cue conditions, violations of the Pappus condition are the exception. Apparently optical space is not totally different from a homogeneous space, although it is in no way close to Euclidean.

The conventional techniques employed to monitor gasoline quality are expensive, time-consuming, and demand specialized workers for their execution. This work presents a study of the applicability of a long-period grating, a fiber-optic device, as an auxiliary tool for the analysis of Brazilian gasoline conformity. The long-period grating spectral response was measured with the device immersed in samples of gasoline A with different proportions of hydrated ethyl alcohol fuel. A resolution of 0.23% was obtained over the concentration range of commercial interest, between 20% and 40%. The device performance was also tested with a set of conforming and non-conforming gasoline C samples. The device spectral response for these samples, as well as the sample densities and conformity status, were employed to train and validate an artificial neural network with radial basis functions. The results show that fiber-optic sensors supervised by artificial neural networks can constitute smart measurement systems with high applicability in the analysis of gasoline conformity, reducing the costs and time associated with conventional tests.

We prove an analogue of Farb-Masur's theorem that the length-spectra metric on moduli space is "almost isometric" to a simple model $\mathcal{V}(S)$ which is induced by the cone metric over the complex of curves. As an application, we know that the Teichmüller metric and the length-spectra metric are "almost isometric" on moduli space, while they are not even quasi-isometric on Teichmüller space.

Other books on information security metrics discuss number theory and statistics in academic terms. Light on mathematics and heavy on utility, PRAGMATIC Security Metrics: Applying Metametrics to Information Security breaks the mold. This is the ultimate how-to-do-it guide for security metrics. Packed with time-saving tips, the book offers easy-to-follow guidance for those struggling with security metrics. Step by step, it clearly explains how to specify, develop, use, and maintain an information security measurement system (a comprehensive suite of metrics) to…

The purpose of this paper is to critically review the 'levelised cost of energy' metric used in electricity project development. This metric is widely used because it gives a simple per-unit cost of electricity for a given technology connected to the electricity network. However, it neglects certain key terms such as inflation, integration costs, and system costs. Incorporating these additional costs would provide a more comprehensive metric for evaluating electricity generation projects, and for the system as a whole. It is therefore recommended to refine the metric for the South African context.
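To make the critique concrete, below is a minimal sketch of the basic discounted LCOE calculation; the function name and the illustrative numbers (a hypothetical plant, costs in rand) are ours, and the neglected terms the paper lists (inflation, integration and system costs) would enter as further cost streams in the same numerator.

```python
# Minimal levelised-cost-of-energy (LCOE) sketch: discounted lifetime costs
# divided by discounted lifetime energy. Illustrative numbers only.

def lcoe(capex, annual_opex, annual_energy_kwh, discount_rate, years):
    costs = capex + sum(annual_opex / (1 + discount_rate) ** t
                        for t in range(1, years + 1))
    energy = sum(annual_energy_kwh / (1 + discount_rate) ** t
                 for t in range(1, years + 1))
    return costs / energy

# Hypothetical 1 MW plant: R14m capex, R200k/yr O&M, 2 GWh/yr, 8%, 20 years.
print(f"LCOE: R{lcoe(14e6, 200e3, 2.0e6, 0.08, 20):.2f}/kWh")
```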

Spaces equipped with two complementary (distinct) congruences of self-dual null strings and at least one congruence of anti-self-dual null strings are considered. It is shown that if such spaces are Einsteinian then the vacuum Einstein field equations can be reduced to a single nonlinear partial differential equation of the second order. Different forms of these equations are analyzed. Finally, several new explicit metrics of para-Hermite and para-Kähler Einstein spaces with Λ ≠ 0 are presented. The relation of these metrics to a modern approach to problems in mechanics is discussed.

In [20]-[22] a method was developed for constructing a class of Calabi-Yau metrics in D=6 with a Hamiltonian isometry, which requires a 4-dimensional hyperkähler structure as initial input. Particular solutions of the resulting nonlinear equation corresponding to complete Calabi-Yau metrics were found in [22], but surprisingly the equation gets harder to solve for general hyperkähler structures, due to the nontrivial curvature of the Ricci-flat 4-metric. In the present letter we suggest that the complications due to the choice of the hyperkähler structure may be avoided. We carefully analyze the assumptions made in those references and work out a construction which does not require such initial input. This is also generalized to higher dimensions. It should be emphasized that there is nothing wrong with the use of hyperkähler structures as a solution-generating technique; what is pointed out here is that this method is optional.

The assessment of phylogenetic network reconstruction methods requires the ability to compare phylogenetic networks. This is the first in a series of papers devoted to the analysis and comparison of metrics for tree-child time consistent phylogenetic networks on the same set of taxa. In this paper, we study three metrics that have already been introduced in the literature: the Robinson-Foulds distance, the tripartitions distance and the mu-distance. They generalize to networks the classical Robinson-Foulds or partition distance for phylogenetic trees. We analyze the behavior of these metrics by studying their least and largest values and when they achieve them. As a by-product of this study, we obtain tight bounds on the size of a tree-child time consistent phylogenetic network.
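For readers unfamiliar with the tree case these network metrics generalize, here is a minimal sketch of the Robinson-Foulds (partition) distance for rooted trees, encoded as nested tuples; conventions differ on whether the symmetric difference is halved, and the tree encoding is ours, not the paper's.

```python
# Robinson-Foulds distance for rooted trees given as nested tuples of leaf names.

def clades(tree):
    """Return (leaf set, set of non-trivial clades) of a nested-tuple tree."""
    if isinstance(tree, str):                     # a leaf
        return frozenset([tree]), set()
    leaves, inner = frozenset(), set()
    for child in tree:
        sub_leaves, sub_clades = clades(child)
        leaves |= sub_leaves
        inner |= sub_clades
    inner.add(leaves)                             # this subtree's own clade
    return leaves, inner

def robinson_foulds(t1, t2):
    return len(clades(t1)[1] ^ clades(t2)[1])     # symmetric difference size

t1 = (("a", "b"), ("c", ("d", "e")))
t2 = (("a", "c"), ("b", ("d", "e")))
print(robinson_foulds(t1, t2))                    # 4
```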

This work presents a partial discharge measuring system based on the traditional electrical detection technique which, unlike existing systems, uses an optical communication link. The optical fiber prevents electromagnetic interference in the environment from affecting the measurement and also permits online measurements. The system is controlled with a personal computer and an application program, which also stores and graphically displays the acquired data. The developed system obtains the magnitude and phase of every partial discharge occurring during a time interval that is fixed or selected by the user, as well as the total number of discharges. It graphically presents the magnitude and phase of each discharge, together with the corresponding frequency histogram.

The performance of the k-nearest neighbors (k-NN) classifier is highly dependent on the distance metric used to identify the k nearest neighbors of the query points. The standard Euclidean distance is commonly used in practice. This paper investigates the performance of the k-NN classifier with respect to different adaptive metrics in the context of medical imaging. We propose using adaptive metrics such that the structure of the data is better described, introducing some unsupervised learning knowledge into k-NN. Four different metrics are investigated: a theoretical metric based on the assumption that images are drawn from the Brownian Image Model (BIM), a normalized metric based on the variance of the data, an empirical metric based on the empirical covariance matrix of the unlabeled data, and an optimized metric obtained by minimizing the classification error. The spectral structure of the empirical covariance also lends itself to Principal Component Analysis (PCA), which yields the subspace metrics. The metrics are evaluated on two data sets: lateral X-rays of the lumbar aortic/spine region, where we use k-NN for abdominal aortic calcification detection; and mammograms, where we use k-NN for breast cancer risk assessment. The results show that an appropriate choice of metric can improve classification.
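As a hedged illustration of one of these ideas (not the paper's exact estimators), the sketch below runs k-NN under a Mahalanobis-type metric whose inverse covariance is estimated from the data, so distances adapt to the empirical structure:

```python
# k-NN with a covariance-adapted (Mahalanobis-type) metric, from scratch.
import numpy as np

def knn_predict(X_train, y_train, X_test, k=5, VI=None):
    """Majority vote among the k neighbors under squared Mahalanobis distance."""
    if VI is None:                                        # empirical metric
        VI = np.linalg.pinv(np.cov(X_train, rowvar=False))
    diffs = X_test[:, None, :] - X_train[None, :, :]      # shape (m, n, d)
    d2 = np.einsum('mnd,de,mne->mn', diffs, VI, diffs)    # squared distances
    idx = np.argsort(d2, axis=1)[:, :k]                   # k nearest per query
    return np.array([np.bincount(y_train[i]).argmax() for i in idx])

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = (X[:, 0] > 0).astype(int)
print(knn_predict(X, y, X[:5], k=3))
```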

The distance between a pair of spike trains, quantifying the differences between them, can be measured using various metrics. Here we introduce a new class of spike train metrics, inspired by the Pompeiu-Hausdorff distance, and compare them with existing metrics. Some of our new metrics (the modulus-metric and the max-metric) have characteristics that are qualitatively different from those of classical metrics like the van Rossum distance or the Victor and Purpura distance. The modulus-metric and the max-metric are particularly suitable for measuring distances between spike trains where information is encoded in bursts, but the number and the timing of spikes inside a burst do not carry information. The modulus-metric does not depend on any parameters and can be computed using a fast algorithm whose time depends linearly on the number of spikes in the two spike trains. We also introduce localized versions of the new metrics, which could have the biologically relevant interpretation of measuring the differences between spike trains as they are perceived at a particular moment in time by a neuron receiving these spike trains.
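For contrast with the burst-oriented metrics introduced above, here is a minimal sketch of the classical van Rossum distance they are compared against: each train is convolved with a causal exponential kernel of time constant tau, and the L2 distance of the resulting waveforms is taken (the grid step dt and the toy spike times are assumptions of this discretized sketch).

```python
# van Rossum spike-train distance on a discrete time grid.
import numpy as np

def van_rossum(train_a, train_b, tau=0.01, dt=0.001, t_max=1.0):
    t = np.arange(0.0, t_max, dt)
    def waveform(spikes):
        f = np.zeros_like(t)
        for s in spikes:
            f += (t >= s) * np.exp(-(t - s) / tau)  # causal exponential kernel
        return f
    diff = waveform(train_a) - waveform(train_b)
    return np.sqrt(np.sum(diff ** 2) * dt / tau)

print(van_rossum([0.10, 0.20, 0.35], [0.10, 0.22, 0.40]))
```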

Defining product metrics requires a rigorous and disciplined approach, because useful metrics depend, to a very large extent, on one's goals and assumptions about the studied software process. Unlike in more mature scientific fields, it appears difficult to devise a "universal" set of metrics in software engineering that can be used across application environments. We propose an approach for the definition of product metrics which is driven by the experimental goals of measurement, expressed via the Goal/Question/Metric (GQM) paradigm, and is based on the mathematical properties of the metrics. This approach integrates several research contributions from the literature into a consistent, practical and rigorous whole. The approach we outline should not be considered a complete and definitive solution, but a starting point for discussion about a product metric definition approach widely accepted in the software engineering community. At this point, we intend to provide an intellectual process that we think is necessary to define sound software product metrics. A precise and complete documentation of such an approach will provide the information needed to make the assessment and reuse of a new metric possible. Thus, product metrics are supported by a solid theory which facilitates their review and refinement. Moreover, their definition is made less exploratory and, as a consequence, one is less likely to identify spurious correlations between process and product metrics.

Reuse and reusability are two major aspects of object-oriented software which can be measured from an inheritance hierarchy. Reusability is a prerequisite of reuse, but both may or may not be measured using the same metric. This paper characterizes metrics of reuse and reusability in Object-Oriented Software Development (OOSD). Reuse metrics compute the extent to which classes have been reused, and reusability metrics compute the extent to which classes can be reused. In this paper five new metrics, namely Breadth of Inheritance Tree (BIT), Method Reuse Per Inheritance Relation (MRPIR), Attribute Reuse Per Inheritance Relation (ARPIR), Generality of Class (GC) and Reuse Probability (RP), are proposed. These metrics help to evaluate the reuse and reusability of object-oriented software. Four extensively validated existing object-oriented metrics, namely Depth of Inheritance Tree (DIT), Number of Children (NOC), Method Inheritance Factor (MIF) and Attribute Inheritance Factor (AIF), have been selected and investigated for comparison with the proposed metrics. All metrics can be computed from inheritance hierarchies and classified according to their characteristics. Further, the metrics are evaluated against a case study. These metrics are helpful in comparing alternative inheritance hierarchies at design time to select the best alternative, so that development time and cost can be reduced.
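As a concrete reading of two of the classical metrics above, the sketch below computes DIT and NOC from a hierarchy given as a child-to-parent mapping; the encoding and the class names are ours, for illustration only.

```python
# Depth of Inheritance Tree (DIT) and Number of Children (NOC)
# over a child -> parent mapping (None marks a root class).

def dit(cls, parent):
    depth = 0
    while parent[cls] is not None:
        cls = parent[cls]
        depth += 1
    return depth

def noc(cls, parent):
    return sum(1 for p in parent.values() if p == cls)

parent = {"Object": None, "Shape": "Object",
          "Circle": "Shape", "Square": "Shape", "Oval": "Circle"}
print(dit("Oval", parent))   # 3
print(noc("Shape", parent))  # 2
```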

Emerging metrics based on the article level do not exclude traditional metrics based on citations to the journal, but complement them. Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. This editorial describes the role of article-level metrics in publishing scientific papers. ALMs are rapidly emerging as important tools to quantify how individual articles are being discussed, shared, and used. Data sources depend on the tool, but they include classic citation-based indicators, academic social networks (Mendeley, CiteULike, Delicious) and social media (Facebook, Twitter, blogs, and YouTube). The most popular tools used to apply these new metrics are: Public Library of Science Article-Level Metrics, Altmetric, Impactstory and Plum Analytics. The Journal Impact Factor (JIF) does not consider impact or influence beyond citation counts, and these counts are reflected only through Thomson Reuters' Web of Science® database. The JIF provides an indicator related to the journal, but not to a published paper. Thus, altmetrics are now becoming an alternative for the performance assessment of individual scientists and their scholarly publications. Macedonian scholarly publishers have to work on implementing article-level metrics in their e-journals. It is a way to increase their visibility and impact in the world of science.

As the international standard of measurement, the metric system is one key to success in the global marketplace. International standards have become an important factor in international economic competition. Non-metric products are becoming increasingly unacceptable in world markets that favor metric products. Procurement is the primary federal tool for encouraging and helping U.S. industry to convert voluntarily to the metric system. Besides the perceived unwillingness of the customer, certain regulatory language, and certain legal definitions in some states, there are no major impediments to conversion of the remaining non-metric industries to metric usage. Instead, there are good reasons for changing, including an opportunity to rethink many industry standards and to take advantage of size standardization. Also, when the remaining industries adopt the metric system, they will come into conformance with federal agencies engaged in similar activities.

Numerical analysis of data from international trade and ecological networks has shown that the non-linear fitness-complexity metric is the best candidate to rank nodes by importance in bipartite networks that exhibit a nested structure. Despite its relevance for real networks, the mathematical properties of the metric and its variants remain largely unexplored. Here, we perform an analytic and numeric study of the fitness-complexity metric and a new variant, called minimal extremal metric. We rigorously derive exact expressions for node scores for perfectly nested networks and show that these expressions explain the non-trivial convergence properties of the metrics. A comparison between the fitness-complexity metric and the minimal extremal metric on real data reveals that the latter can produce improved rankings if the input data are reliable.
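The iteration itself is short; below is a hedged numpy sketch of the fitness-complexity map on a binary bipartite matrix M, with the usual normalization by the mean at each step (the iteration count and the toy matrix are ours):

```python
# Fitness-complexity iteration on a binary bipartite matrix M
# (rows: agents/countries, columns: products).
import numpy as np

def fitness_complexity(M, n_iter=100):
    F = np.ones(M.shape[0])
    Q = np.ones(M.shape[1])
    for _ in range(n_iter):
        F_new = M @ Q                    # fitness: sum of product complexities
        Q_new = 1.0 / (M.T @ (1.0 / F))  # complexity: penalized by weak rows
        F, Q = F_new / F_new.mean(), Q_new / Q_new.mean()
    return F, Q

# A perfectly nested 4x4 matrix, the case treated analytically in the paper.
M = np.triu(np.ones((4, 4)))[::-1]
F, Q = fitness_complexity(M)
print(np.round(F, 4), np.round(Q, 4))
```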

[Figure: bar chart of velocity (story points delivered by the team vs. sprint number); CMU/SEI-2013-TN-029.] In this fictional case, the … counter to tenets of Agile methods. Velocity is a local measure, used by an individual development team to gauge the realism of the commitments they make. The … is a metric primarily intended to guide the development team itself to understand the realism in delivery commitments. The feedback provided at the …

Since it was first discovered in 1963, the Kerr metric has been used by relativists as a test-bed for conjectures on wormholes, time travel, closed time-like loops, and the existence or otherwise of global Cauchy surfaces. More importantly, it has also been used by astrophysicists to investigate the effects of collapsed objects on their local environments. These two groups of applications should not be confused. Astrophysical black holes are not the same as the Kruskal solution and its generalisations.

A relativistic wave equation for spin 1/2 particles in the Melvin space-time, a space-time where the metric is determined by a magnetic field, is obtained. The energy levels for these particles are obtained as functions of the magnetic field and compared with the ones calculated with the Dirac equation in the flat Minkowski space-time. The numeric values for some magnetic fields of interest are shown. With these results, the effects of very intense magnetic fields on the energy levels, as intense as the ones expected to be produced in magnetars or in ultra-relativistic heavy-ion collisions, are investigated.

As in the field of "Invariant Distances and Metrics in Complex Analysis" there was and is continuous progress, this is the second, extended edition of the corresponding monograph. This comprehensive book is about the study of invariant pseudodistances (non-negative functions on pairs of points) and pseudometrics (non-negative functions on the tangent bundle) in several complex variables. It is an overview of a highly active research area at the borderline between complex analysis, functional analysis and differential geometry. New chapters cover the Wu, Bergman and several other met…

A physics-first derivation of the Schwarzschild metric is given. Gravitation is described in terms of the effects of tidal forces (or of spacetime curvature) on the volume of a small ball of test particles (a dust ball), freely falling after all particles were at rest with respect to each other initially. Because this formulation avoids the use of tensors, neither advanced tensor calculus nor sophisticated differential geometry are needed in the calculation. The derivation is not lengthy and it has visual appeal, so it may be useful in teaching.
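For reference, the line element such a derivation arrives at is the standard Schwarzschild metric in Schwarzschild coordinates (a textbook fact, reproduced here for orientation):

```latex
ds^2 = -\left(1 - \frac{2GM}{c^2 r}\right) c^2\, dt^2
       + \left(1 - \frac{2GM}{c^2 r}\right)^{-1} dr^2
       + r^2 \left( d\theta^2 + \sin^2\theta\, d\varphi^2 \right)
```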

Measuring true progress towards energy conservation goals requires the accurate reporting and accounting of energy consumption. An accurate energy metrics framework is also a critical element for verifiable greenhouse gas inventories. Energy conservation in government can reduce expenditures on energy costs, leaving more funds available for public services. In addition to monetary savings, conserving energy can help to promote energy security, air quality, and a reduction of carbon footprint. With energy consumption/GHG inventories recently produced at the federal level, state and local governments are beginning to produce their own energy metrics systems. In recent years, many states have passed laws and executive orders which require their agencies to reduce energy consumption. In June 2008, SC state government established a law to achieve a 20% energy usage reduction in state buildings by 2020. This study examines case studies from other states that have established similar goals, to uncover the methods used to establish an energy metrics system. Direct energy consumption in state government primarily comes from buildings and mobile sources. This study focuses exclusively on measuring energy consumption in state buildings. The case studies reveal that many states, including SC, have issues gathering the data needed to accurately measure energy consumption across all state buildings. Common problems include a lack of enforcement and of incentives that encourage state agencies to participate in any reporting system. The case studies are aimed at finding the leverage used to gather the needed data. The various approaches to coercing participation will hopefully reveal methods that SC can use to establish the accurate metrics system needed to measure progress towards its 20%-by-2020 energy reduction goal. Among the strongest incentives found in the case studies is the potential for monetary savings through energy efficiency. Framing energy conservation…

We show that, in the framework of Deformed Special Relativity (DSR), namely a (four-dimensional) generalization of the (local) space-time structure based on an energy-dependent "deformation" of the usual Minkowski geometry, two kinds of gauge symmetries arise, whose spaces either coincide with the deformed Minkowski space or are just internal spaces to it. This is why we named them "metric gauge theories". In the case of the internal gauge fields, they are a consequence of the deformed Minkowski space (DMS) possessing the structure of a generalized Lagrange space. Such a geometrical structure allows one to define curvature and torsion in the DMS.

Any surface is completely characterized by a metric and a symmetric tensor satisfying the Gauss-Codazzi-Mainardi equations (GCM), which identifies the latter as its curvature. We demonstrate that physical questions relating to a surface described by any Hamiltonian involving only surface degrees of freedom can be phrased completely in terms of these tensors without explicit reference to the ambient space: the surface is an emergent entity. Lagrange multipliers are introduced to impose GCM as constraints on these variables and equations describing stationary surface states derived. The behavior of these multipliers is explored for minimal surfaces, showing how their singularities correlate with surface instabilities.

Nine small volume classrooms in schools located in the Chicago suburbs were tested to quantify speech intelligibility at various seat locations. Several popular intelligibility metrics were investigated, including Speech Transmission Index (STI), %Alcons, Signal to Noise Ratios (SNR), and 80 ms Useful/Detrimental Ratios (U80). Incorrect STI values were experienced in high noise environments, while the U80s and the SNRs were found to be the most accurate methodologies. Test results are evaluated against the guidelines of ANSI S12.60-2002, and match the data from previous research.

Metric Propositional Neighborhood Logic (MPNL) over the natural numbers. MPNL features two modalities referring, respectively, to an interval that is "met by" the current one and to an interval that "meets" the current one, plus an infinite set of length constraints, regarded as atomic propositions. MPNL is decidable in double exponential time and expressively complete with respect to a well-defined sub-fragment of the two-variable fragment FO2[N,=,<] of first-order logic over the natural numbers. Moreover, we show that MPNL can be extended in a natural way…

We prove that the metric of a general holographic spacetime can be reconstructed (up to an overall conformal factor) from distinguished spatial slices - "light-cone cuts" - of the conformal boundary. Our prescription is covariant and applies to bulk points in causal contact with the boundary. Furthermore, we describe a procedure for determining the light-cone cuts corresponding to bulk points in the causal wedge of the boundary in terms of the divergences of correlators in the dual field theory. Possible extensions for determining the conformal factor and including the cuts of points outside of the causal wedge are discussed. We also comment on implications for subregion/subregion duality.

The generation of three-dimensional (3-D) digital models produced by optical technologies in some cases involves metric errors. This happens when small high-resolution 3-D images are assembled together in order to model a large object. In some applications, for example 3-D modeling of Cultural Heritage, metric accuracy is a major issue, and no methods are currently available for enhancing it. The authors present a procedure by which the metric reliability of the 3-D model, obtained through iterative alignments of many range maps, can be guaranteed to a known acceptable level. The goal is the integration of the 3-D range camera system with a close-range digital photogrammetry technique. The basic idea is to generate a global coordinate system determined by the digital photogrammetric procedure, measuring the spatial coordinates of optical targets placed around the object to be modeled. Such coordinates, set as reference points, allow the proper rigid motion of a few key range maps, including a portion of the targets, into the global reference system defined by photogrammetry. The other 3-D images are then aligned around these locked images with the usual iterative algorithms. Experimental results on an anthropomorphic test object, comparing the conventional and the proposed alignment methods, are finally reported.
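The rigid-motion step admits a compact closed-form solution; below is a hedged sketch (not the authors' implementation) using the SVD-based Kabsch/Procrustes fit of targets measured in a range map to their photogrammetric coordinates, after which the same R and t lock the whole range map in the global frame:

```python
# Least-squares rigid alignment (Kabsch): find R, t with R @ p + t ~= q.
import numpy as np

def rigid_align(P, Q):
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                      # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    return R, cQ - R @ cP

# Synthetic check: targets in the range-map frame vs. the global frame.
rng = np.random.default_rng(1)
P = rng.normal(size=(6, 3))
c, s = np.cos(0.3), np.sin(0.3)
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
Q = P @ R_true.T + np.array([0.5, -1.0, 2.0])
R, t = rigid_align(P, Q)
print(np.allclose(P @ R.T + t, Q))                 # True
```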

Let Mn be an n-dimensional submanifold without umbilical points in the (n+1)-dimensional unit sphere Sn+1. Four basic invariants of Mn under the Moebius transformation group of Sn+1 are a 1-form Φ called the Moebius form, a symmetric (0, 2) tensor A called the Blaschke tensor, a symmetric (0, 2) tensor B called the Moebius second fundamental form, and a positive definite (0, 2) tensor g called the Moebius metric. A symmetric (0, 2) tensor D = A + μB, where μ is constant, called the para-Blaschke tensor, is also a Moebius invariant. The para-Blaschke tensor is called isotropic if there exists a function λ such that D = λg. One of the basic questions in Moebius geometry is to classify the hypersurfaces with isotropic para-Blaschke tensor. When λ is not constant, all hypersurfaces with isotropic para-Blaschke tensor are explicitly expressed in this paper.

The use of steel catenary risers (SCRs) has been considered a strong candidate for future petroleum exploration in ultra-deep water. These pipes raise the petroleum production from a marine well on the ocean seabed up to the offshore oil platform. In manufacturing these pipes, the welding process joining the riser sections is considered critical. Imperfections in the weld or in the alignment of the sections can become stress concentrators and, consequently, reduce the useful life of the SCR. A failure in these pipes can produce a serious environmental disaster. The need has therefore been identified to develop a system capable of evaluating the geometric quality of the riser's welded joints. This work developed an optical system capable of characterizing the 3D geometric form of the welded ring in the riser's interior and measuring the transversal and angular misalignment of the adjacent welded sections. A geometric evaluation of the weld is made from the measurement. The developed system achieves the internal geometric measurement of the riser's welded joints using images acquired of the pipe's interior. A cloud of thousands of points is generated, characterizing the measured surface. The prototype was bench tested, showing itself perfectly capable of achieving the measurement within the metrological requirements of the application and producing excellent results.

In this research we propose an advanced metric to evaluate the effectiveness of learning objects for reuse in new contexts. Learning-object reusability achieves economic benefits from educational technology by saving time and improving quality, but choosing an unsuitable learning object may yield less benefit than creating the learning object from scratch. Learning-object reusability can also facilitate systems development and adaptation. Surveying the current evaluation metrics, we found that while they cover essential aspects, they enable all reviewers of learning objects to evaluate all criteria without paying attention to their roles in creating the learning object, which affects their capability to evaluate specific criteria. Our proposed approach (LOREM) evaluates learning objects based on a group of aspects which measure their level of effectiveness for reuse in other contexts. LOREM classifies reviewers into 3 categories: 1. Academic group: Subject Matter Expert (SME) and Instructor. 2. Technical group: Instructional Designer (ID), LO Developer and LO Designer. 3. Students group. The authorizations of reviewers in these categories are differentiated according to the reviewer's type (e.g., Instructor, LO Developer) and their area of expertise (their subject expertise for academic and student reviewers).

Context: intuitively, the concept of a set has been established as a collection of distinct elements; that is, a set is determined via the membership relation between an element of a universe and the whole. The question, of course, is whether an element belongs or does not belong; in a fuzzy subset, each element of the universe is associated with a degree of membership, which is a number between 0 and 1. Fuzzy subsets are thus established as a correspondence between each element of the universe and a degree of membership. Method: the study was based on previous work, in articles and books, where authors present ideas about the importance of fuzzy subsets and the need to build new theories and spaces with them. Results: by combining the two theories, a new setting is obtained in which the corresponding Pompeiu-Hausdorff distance extends and adjusts the notion of distance between nonempty compact subsets in the setting of metric spaces, most concretely in (Rn, du). Conclusions: the construction carried out yields a metric space with several good properties.
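For finite sets the sup/inf in the Pompeiu-Hausdorff definition reduce to max/min, so the distance fits in a few lines; this sketch uses the Euclidean metric du of (Rn, du), and the toy point sets are ours:

```python
# Pompeiu-Hausdorff distance between two finite nonempty subsets of R^n.
import numpy as np

def hausdorff(A, B):
    D = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)  # pairwise d_u
    return max(D.min(axis=1).max(),   # sup_{a in A} inf_{b in B} d(a, b)
               D.min(axis=0).max())   # sup_{b in B} inf_{a in A} d(a, b)

A = np.array([[0.0, 0.0], [1.0, 0.0]])
B = np.array([[0.0, 0.5], [1.0, 0.0], [3.0, 0.0]])
print(hausdorff(A, B))   # 2.0: the point (3, 0) is far from every point of A
```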

Towards addressing some of the fundamental mysteries in physics at the micro- and macro-cosm level that form the Key Science Projects (KSPs) for the Square Kilometer Array (SKA; such as Probing the Dark Ages and the Epoch of Reionization in the course of an Evolving Universe; Galaxy Evolution, Cosmology, and Dark Energy; and the Origin and Evolution of Cosmic Magnetism), a suitable interfacing of these goals has to be achieved with its optimally designed array configuration, by means of a critical evaluation of the radio imaging capabilities and metrics. Of the two forerunner sites, viz. Australia and South Africa, where pioneering advancements to the state of the art in synthesis-array radio astronomy instrumentation are being attempted in the form of pathfinders to the SKA, for its eventual deployment, a diversity of site-dependent topology and design metrics exists. Here, the discussion involves those KSPs that relate to galactic morphology and evolution, and explores their suitability as a scientific research goal from the perspective of location-driven instrument design specifications. Relative merits and adaptability with regard to either site are presented by invoking well-founded and established array-design and optimization principles designed into a customized software tool.

Community detection in a complex network is an important problem of much interest in recent years. In general, a community detection algorithm chooses an objective function and captures the communities of the network by optimizing it; various heuristics are then used to solve the optimization problem and extract the communities of interest for the user. In this article, we demonstrate a procedure to transform a graph into points of a metric space and develop methods of community detection with the help of a metric defined for pairs of points, as sketched below. We also study and analyze the community structure of the networks considered. The results obtained with our approach are very competitive with those of most well-known algorithms in the literature, and this is justified over a large collection of datasets. On the other hand, the time taken by our algorithm is considerably less than that of other methods, in line with the theoretical findings.
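The sketch below is only a hedged illustration of the general recipe, not the paper's own metric: each node is mapped to its vector of shortest-path distances (so the graph becomes a finite metric space), and nearby points are then grouped by an off-the-shelf clustering step.

```python
# Graph -> points of a metric space -> clusters, on a standard test graph.
import networkx as nx
from sklearn.cluster import KMeans

G = nx.karate_club_graph()
D = nx.floyd_warshall_numpy(G)        # row i = distances from node i
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(D)
for c in range(2):
    print(f"community {c}:", [v for v in G.nodes if labels[v] == c])
```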

Social media is now recognized as an excellent communication tool to connect directly with consumers. One of the most significant ways to connect with consumers through social networking sites (SNS) is to create a Facebook fan page with brand content and to place different posts periodically on these fan pages. In measuring the effectiveness of social networking sites, corporations now analyze metrics such as engagement rate, the number of comments/shares, and likes on fan pages. It is therefore very important for marketers to know the effectiveness of different content or posts on fan pages in order to increase fan responsiveness and the engagement rate. In this study the authors analyzed a total of 1,834 brand posts from 17 international brands of electronics companies. Nine months of data (December 2014 to August 2015), available online on the brands' fan pages, were collected for analysis. An econometric analysis is conducted using EViews 9 to determine the impact of different content on fan-page engagement. The study picked the four most frequently posted content types to determine their impact on the PTA (People Talking About) metric and fan-page engagement activities.

Thinking about marketing strategies from a resource-based perspective (Barney, 1991), with assets viewed as tangible, organizational, or human, and from Constantin and Lusch's vision (1994), where strategic resources can be tangible or intangible, internal or external to the firm, suggests a research approach spanning marketing and finance. According to Srivastava, Shervani and Fahey (1998) there are three types of market assets which generate firm value. Firm value can be measured by discounted cash flow, linking marketing activities with value-generation forecasts (Anderson, 1982; Day and Fahey, 1988; Doyle, 2000; Rust et al., 2004a). The economic value of marketing strategies and marketing metrics is attracting the attention of strategy researchers and marketing managers, making clear the need to build a bridge able to articulate marketing and finance from a strategic perspective. This article proposes an analytical framework based on different scientific approaches involving the risk and return generated by marketing strategies, and points out advances concerning both methodological approaches and marketing strategies and their impact on firm metrics and value, using Srinivasan and Hanssens (2009) as a starting point.

The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T&D losses. Displacing such a plant for one year would save 3 billion kWh per year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question, Dr. Arthur H. Rosenfeld.
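The arithmetic behind the unit is worth making explicit; in the check below, the CO2 factor of roughly 1 kg/kWh for existing coal is our assumption, chosen to reproduce the letter's rounded figures:

```python
# Back-of-the-envelope check of the 'Rosenfeld' plant parameters.
capacity_kw, capacity_factor, td_losses, hours = 500e3, 0.70, 0.07, 8760

generation_kwh = capacity_kw * hours * capacity_factor
at_meter_kwh = generation_kwh * (1 - td_losses)
co2_tonnes = generation_kwh * 1.0 / 1000      # ~1 kg CO2/kWh assumed for coal

print(f"busbar:   {generation_kwh / 1e9:.2f} billion kWh/yr")  # ~3.07
print(f"at meter: {at_meter_kwh / 1e9:.2f} billion kWh/yr")    # ~2.85, i.e. ~3
print(f"CO2:      {co2_tonnes / 1e6:.1f} million metric tons/yr")
```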

The satellite observatory LISA will be capable of detecting gravitational waves from extreme mass ratio inspirals (EMRIs), such as a small black hole orbiting a supermassive black hole. The gravitational effects of the much smaller mass can be treated as the perturbation of a known background metric, here the Schwarzschild metric. The perturbed Einstein field equations form a system of ten coupled partial differential equations. We solve the equations in the harmonic gauge, also called the Lorentz gauge or Lorenz gauge. Using separation of variables and Fourier transforms, we write the frequency domain solutions in terms of six radial functions which satisfy decoupled ordinary differential equations. The six functions are the Zerilli and five generalized Regge-Wheeler functions of spin 2,1,0. We use the solutions to calculate the gravitational self-force for circular orbits. The self-force gives the first order perturbative corrections to the equations of motion. Section 1.2 of the thesis has a more detailed ...

Sweeping processes are a class of evolution differential inclusions arising in elastoplasticity and were introduced by J.J. Moreau in the early seventies. The solution operator of the sweeping processes represents a relevant example of rate independent operator. As a particular case we get the so called play operator, which is a typical example of a hysteresis operator. The continuity properties of these operators were studied in several works. In this note we address the continuity with respect to the strict metric in the space of functions of bounded variation with values in the metric space of closed convex subsets of a Hilbert space. We provide counterexamples showing that for all BV-formulations of the sweeping process the corresponding solution operator is not continuous when its domain is endowed with the strict topology of BV and its codomain is endowed with the L1-topology. This is at variance with the play operator which has a BV-extension that is continuous in this case.

Cytogenetics plays a central role in the detection of chromosomal abnormalities and in the diagnosis of genetic diseases. A karyogram is an image representation of human chromosomes arranged in order of decreasing size and paired into 23 classes. In this paper we propose an approach to automatically pair the chromosomes into a karyogram, using the information obtained in a rough SVM-based classification step to help the pairing process, which is mainly based on similarity metrics between the chromosomes. Using a set of geometric and band-pattern features extracted from the chromosome images, the algorithm is formulated in a Bayesian framework, combining the similarity metric with the results from the classifier. The solution is obtained by solving a mixed integer program. Two datasets with contrasting quality levels and 836 chromosomes each were used to test and validate the algorithm. Relevant improvements with respect to the algorithm described by the authors in [1] were obtained, with average pairing rates above 92%, close to the rates obtained by human operators.
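Stripped of the SVM and Bayesian machinery, the pairing step can be read as a maximum-weight matching on a similarity graph; the sketch below, with a random similarity matrix standing in for the geometric and band-pattern features, is our simplification, not the authors' mixed integer program:

```python
# Homologue pairing as maximum-weight matching on a similarity graph.
import networkx as nx
import numpy as np

rng = np.random.default_rng(0)
n = 8                                   # 8 chromosomes -> 4 pairs, for brevity
S = rng.random((n, n))
S = (S + S.T) / 2                       # symmetric similarity matrix

G = nx.Graph()
G.add_weighted_edges_from((i, j, S[i, j])
                          for i in range(n) for j in range(i + 1, n))
pairs = nx.max_weight_matching(G, maxcardinality=True)
print(sorted(tuple(sorted(p)) for p in pairs))
```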

Cutting the emissions of Short-Lived Climate-Forcing Air Pollutants (SLCPs) is gaining increasing global attention as a mitigation policy option because of direct benefits for climate and co-benefits such as improvements in air quality. Including SLCPs as target components to abate within a single basket (e.g. the Kyoto Protocol) would, however, face issues with regard to: i) the additional assumptions that are required to compare SLCP emissions and CO2 emissions within a basket in terms of climatic effects, especially because of the difference in lifetimes; ii) the accountability of non-climatic effects in the emission trading between SLCPs and CO2. The idea of a two-basket approach was originally proposed as a climatic analogue to the Montreal Protocol dealing with ozone-depleting substances (Jackson 2009; Daniel et al. 2012; Smith et al. 2013). In a two-basket approach, emissions are allowed to be traded within a basket but not across baskets. While this approach potentially ensures scientifically supported emission trading (e.g. Smith et al. 2013), it leaves open the important issue of how to determine the relative weight between the two baskets. Determining the weight cannot be answered by science alone, as the question involves a value judgment, as stressed in metric studies (e.g. Tanaka et al. 2010; Tanaka et al. 2013). We discuss emission metrics in the context of a two-basket approach and present policy implications of such an approach. In a two-basket approach, the weight between the two baskets needs to be determined a priori or exogenously. Here, an opportunity arises to present synergetic policy options targeted at mitigating climate change and air pollution simultaneously. In other words, this could be a strategy to encourage policymakers to consider cross-cutting issues. Under a two-basket climate policy, policymakers would be exposed to questions such as: What type of damages caused by climate change does one choose to avoid? To what extent…

The Department of Homeland Security National Cyber Security Division supported development of a small set of security ideals as a framework to establish measurable control systems security. Based on these ideals, a draft set of proposed technical metrics was developed to allow control systems owner-operators to track improvements or degradations in their individual control systems security posture. The technical metrics development effort included review and evaluation of over thirty metrics-related documents. On the basis of complexity, ambiguity, or misleading and distorting effects, the metrics identified during the reviews were determined to be weaker than necessary to aid defense against the myriad threats posed by cyber-terrorism to human safety, as well as to economic prosperity. Using the results of our metrics review and the set of security ideals as a starting point for metrics development, we identified thirteen potential technical metrics, with at least one metric supporting each ideal. Two case-study applications of the ideals and the thirteen metrics to control systems were then performed to establish potential difficulties in applying both. The case studies resulted in no changes to the ideals, and only a few deletions and refinements to the thirteen potential metrics. This led to a final proposed set of ten core technical metrics. To further validate the security ideals, the modifications made to the original thirteen potential metrics, and the final proposed set of ten core metrics, seven separate control systems security assessments performed over the past three years were reviewed for findings and recommended mitigations. These findings and mitigations were then mapped to the security ideals and metrics to assess gaps in their coverage. The mappings indicated that there are no gaps in the security ideals and that the ten core technical metrics provide significant coverage of standard security issues, with 87% coverage. Based…

In crystal optics and quantum electrodynamics in gravitational vacua, the propagation of light is not described by a metric, but an area metric geometry. In this article, this prompts us to study conditions for linear electrodynamics on area metric manifolds to be well-posed. This includes an identification of the timelike future cones and their duals associated to an area metric geometry, and thus paves the ground for a discussion of the related local and global causal structures in standard fashion. In order to provide simple algebraic criteria for an area metric manifold to present a consistent spacetime structure, we develop a complete algebraic classification of area metric tensors up to general transformations of frame. This classification, valuable in its own right, is then employed to prove a theorem excluding the majority of algebraic classes of area metrics as viable spacetimes. Physically, these results classify and drastically restrict the viable constitutive tensors of non-dissipative linear optical media.

This study is an overview of network topology metrics and a computational approach to analyzing graph topology via multiple-metric analysis on graph ensembles. The paper cautions against studying single metrics or combining disparate graph ensembles from different domains to extract global patterns. This is because there often exists considerable diversity among graphs that share any given topology metric, patterns vary depending on the underlying graph construction model, and many real data sets are not actual statistical ensembles. As real data examples, we present five airline ensembles, comprising temporal snapshots of networks of similar topology. Wikipedia language networks are shown as an example of a nontemporal ensemble. General patterns in metric correlations, as well as exceptions, are discussed by representing the data sets via hierarchically clustered correlation heat maps. Most topology metrics are not independent and their correlation patterns vary across ensembles. In general, density-related metrics and graph distance-based metrics cluster and the two groups are orthogonal to each other. Metrics based on degree-degree correlations have the highest variance across ensembles and cluster the different data sets on par with principal component analysis. Namely, the degree correlation, the s metric, their elasticities, and the rich club moments appear to be most useful in distinguishing topologies.
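A minimal version of this multiple-metric analysis fits in a few lines; the sketch below, using a synthetic random-graph ensemble rather than the airline or Wikipedia data, computes several networkx metrics per graph and inspects their correlation matrix, of which the study's clustered heat maps are a refined version:

```python
# Correlations of topology metrics across a graph ensemble.
import networkx as nx
import numpy as np

metrics = {
    "density": nx.density,
    "clustering": nx.average_clustering,
    "assortativity": nx.degree_assortativity_coefficient,
    "transitivity": nx.transitivity,
}

ensemble = [nx.gnp_random_graph(60, 0.1, seed=s) for s in range(50)]
table = np.array([[f(G) for f in metrics.values()] for G in ensemble])

print(list(metrics))
print(np.round(np.corrcoef(table, rowvar=False), 2))
```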

Students' motivation for learning mathematical concepts can be increased by showing the usefulness of these concepts in practical problems. One important mathematical concept is that of a metric space and, closer to the applications, that of a metric function. In this work we aim to illustrate how important it is to appropriately choose the metric when dealing with a practical problem. In particular, we focus on the problem of detecting noisy pixels in colour images. In this context, it is very important to appropriately measure the distances and similarities between image pixels, which is done by means of an appropriate metric. We study the performance of different metrics, including recent fuzzy metrics, within a specific filter, to show that it is indeed a critical choice for solving the task appropriately.
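As a hedged illustration only (the paper's exact fuzzy metrics are not reproduced), the sketch below contrasts the Euclidean distance with a product-form fuzzy similarity of the kind used in the vector-filtering literature, where the constant K > 0 is a tuning parameter: close RGB vectors score near 1, impulse-like outliers score lower.

```python
# Euclidean distance vs. a product-form fuzzy similarity on RGB pixels.
import numpy as np

def euclidean(x, y):
    return np.linalg.norm(x - y)

def fuzzy_similarity(x, y, K=1024.0):
    return float(np.prod((np.minimum(x, y) + K) / (np.maximum(x, y) + K)))

p = np.array([120.0, 64.0, 32.0])
q = np.array([124.0, 60.0, 35.0])      # a close, noise-free neighbour
impulse = np.array([255.0, 0.0, 255.0])

print(euclidean(p, q), fuzzy_similarity(p, q))
print(euclidean(p, impulse), fuzzy_similarity(p, impulse))
```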

Metrics focus attention on what is important. Balanced metrics of primary health care inform purpose and aspiration as well as performance. Purpose in primary health care is about improving the health of people and populations in their community contexts. It is informed by metrics that include long-term, meaning- and relationship-focused perspectives. Aspirational uses of metrics inspire evolving insights and iterative improvement, using a collaborative, developmental perspective. Performance metrics assess the complex interactions among primary care tenets of accessibility, a whole-person focus, integration and coordination of care, and ongoing relationships with individuals, families, and communities; primary health care principles of inclusion and equity, a focus on people's needs, multilevel integration of health, collaborative policy dialogue, and stakeholder participation; basic and goal-directed health care, prioritization, development, and multilevel health outcomes. Environments that support reflection, development, and collaborative action are necessary for metrics to advance health and minimize unintended consequences.

A second-order expansion for the quantum fluctuations of the matter field is considered in the framework of the warm inflation scenario. The friction and Hubble parameters are expanded by means of a semiclassical approach. The fluctuations of the Hubble parameter generate fluctuations of the metric. These metric fluctuations produce an effective curvature term. The power spectrum for the metric fluctuations can be calculated in the infrared sector.

The problem of improving the accuracy of nonparametric entropy estimation for a stationary ergodic process is considered. New weak metrics are introduced, and relations between metrics, measures, and entropy are discussed. Based on weak metrics, a new nearest-neighbor entropy estimator is constructed; it has a parameter with which the estimator is optimized to reduce its bias. It is shown that the estimator's variance is upper-bounded by a nearly optimal Cramér-Rao lower bound.
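The weak-metric estimator itself is not reproduced here, but the classical nearest-neighbor baseline it refines is short enough to sketch: the Kozachenko-Leonenko estimator H = psi(N) - psi(k) + log(V_d) + (d/N) * sum_i log(eps_i), with eps_i the distance from x_i to its k-th nearest neighbor and V_d the volume of the d-dimensional unit ball.

```python
# Kozachenko-Leonenko k-NN estimator of differential entropy.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(X, k=1):
    N, d = X.shape
    eps = cKDTree(X).query(X, k=k + 1)[0][:, -1]   # k-th neighbor (skip self)
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(N) - digamma(k) + log_vd + d * np.mean(np.log(eps))

X = np.random.default_rng(0).normal(size=(5000, 1))
print(kl_entropy(X))   # close to 0.5 * log(2 * pi * e) ~ 1.419 for N(0, 1)
```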

The modernized metric system, known universally as the International System of Units (abbreviated SI, from the French name), was so named in 1960 by the world body on standards. A map shows 98 percent of the world using or moving toward adoption of SI units. Only Burma, Liberia, Brunei, and Southern Yemen are non-metric. The author describes a two-week session in Pretoria and Johannesburg on metrication, followed by additional meetings on metrication in Rhodesia.

The use of secondary metrics has become of special interest in bioequivalence studies. The applicability of the partial area method, truncated AUC, and Cmax/AUC has been argued by many authors. This study aims to evaluate the possible superiority of these metrics over the primary metrics (i.e. AUCinf, Cmax and Tmax). The suitability of truncated AUC for assessing the extent of absorption, as well as Cmax/AUC and partial AUC for evaluating the rate of absorption in bioequivalence determination, was investigated …

Let X be a metric space. We say that a continuous surjection f: X→X is a topological Anosov map (abbrev. TA-map) if f is expansive and has the pseudo-orbit tracing property with respect to some compatible metric for X. This paper studies the properties of TA-maps of non-compact metric spaces and gives some conditions for the map to be topologically mixing.

In this paper we present a construction of effective cosmological models which describe the propagation of a massive quantum scalar field on a quantum anisotropic cosmological spacetime. Each obtained effective model is represented by a rainbow metric in which particles of distinct momenta propagate on different classical geometries. Our analysis shows that upon certain assumptions and conditions on the parameters determining such anisotropic models, we surprisingly obtain a unique deformation parameter β in the modified dispersion relation of the modes, hence inducing an isotropic deformation despite the general starting considerations. We then ensure the recovery of the dispersion relation realized in the isotropic case, studied in [arXiv:1412.6000], when some proper symmetry constraints are imposed, and we estimate the value of the deformation parameter for this case in the loop quantum cosmology context.

The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent in independently maintained and operated software components. Because such an information 'pond' is disorganized, it is a difficult environment for business-intelligence analysis, i.e. troubleshooting, incident investigation, and trend spotting. The mission of the MCAS project is to deliver a software solution to help with the adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events generated by disjoint middleware.

The low frequency gravitational wave detectors like the evolved Laser Interferometer Space Antenna/New Gravitational Wave Observatory (eLISA/NGO) will give us the opportunity to test whether the supermassive compact objects lying at the centers of galaxies are indeed Kerr black holes. One way to do such a test is to compare the gravitational wave signals with templates of perturbed black hole spacetimes, the so-called bumpy black hole spacetimes. The Zipoy-Voorhees (ZV) spacetime (known also as the γ spacetime) can be included in the bumpy black hole family, since it can be considered as a perturbation of the Schwarzschild spacetime background. Several authors have suggested that the ZV metric corresponds to an integrable system. Contrary to this integrability conjecture, the present article shows by numerical examples that, in general, ZV belongs to the family of nonintegrable systems.

It is shown by explicit construction of new metrics that General Relativity can solve the exact Poincaré recurrence problem. In these solutions, the light cone flips periodically between past and future, due to a periodically alternating arrow of the proper time. The geodesics in these universes show periodic Loschmidt velocity reversal v → −v at critical points, which leads to recurrence. However, the matter tensors of some of these solutions exhibit unusual properties, such as periodic variations in density and pressure. While this is to be expected in periodic models, the physical basis for such a variation is not clear. The present paper can therefore be regarded as an extension of Tipler's "no-go theorem for recurrence in an expanding universe" to other space-time geometries.

Usage data is increasingly regarded as a valuable resource in the assessment of scholarly communication items. However, the development of quantitative, usage-based indicators of scholarly impact is still in its infancy. The Digital Library Research & Prototyping Team at the Los Alamos National Laboratory's Research Library has therefore started a program to expand the set of usage-based tools for the assessment of scholarly communication items. The two-year MESUR project, funded by the Andrew W. Mellon Foundation, aims to define and validate a range of usage-based impact metrics, and to issue guidelines with regard to their characteristics and proper application. The MESUR project is constructing a large-scale semantic model of the scholarly community that seamlessly integrates a wide range of bibliographic, citation and usage data. Functioning as a reference data set, this model is analyzed to characterize the intricate networks of typed relationships that exist in the scholarly community. The resulting c...

A physics-first derivation of the Schwarzschild metric is given. Gravitation is described in terms of the effects of tidal forces (or of spacetime curvature) on the volume of a small ball of test particles (a dust ball), freely falling after all particles were at rest with respect to each other initially. The possibility to express Einstein's equation this way and some of its ramifications have been enjoyably discussed by Baez and Bunn [Am. J. Phys. 73, 644 (2005)]. Since the formulation avoids the use of tensors, neither advanced tensor calculus nor sophisticated differential geometry are needed in the calculation. The derivation is not lengthy and it has visual appeal, so it may be useful in teaching.

In this paper we present a construction of effective cosmological models which describe the propagation of a massive quantum scalar field on a quantum anisotropic cosmological spacetime. Each obtained effective model is represented by a rainbow metric in which particles of distinct momenta propagate on different classical geometries. Our analysis shows that, under certain assumptions and conditions on the parameters determining such anisotropic models, we surprisingly obtain a unique deformation parameter β in the modified dispersion relation of the modes, hence inducing an isotropic deformation despite the generally anisotropic starting considerations. We then ensure the recovery of the dispersion relation realized in the isotropic case, studied in [M. Assanioussi, A. Dapor, and J. Lewandowski, Phys. Lett. B 751, 302 (2015), 10.1016/j.physletb.2015.10.043], when proper symmetry constraints are imposed, and we estimate the value of the deformation parameter for this case in the loop quantum cosmology context.
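
For orientation, a rainbow metric in the generic (Magueijo-Smolin-type) sense modifies the geometry and the dispersion relation through two energy-dependent functions $f$ and $g$; schematically, $$ds^2 = -\frac{dt^2}{f^2(E)} + \frac{\delta_{ij}\,dx^i dx^j}{g^2(E)}, \qquad E^2 f^2(E) - p^2 g^2(E) = m^2,$$ so that modes of different momenta effectively propagate on different classical geometries. This generic form is an illustration rather than the specific relation derived in the paper; a deformation parameter such as β typically enters through the leading-order expansion of these functions, e.g. $g^2(E) \approx 1 + \beta E/E_{\mathrm{Pl}}$.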

Each year, the U.S. Department of Energy asks its Clean Cities program coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterize the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction (IR) initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this 2013 Annual Metrics Report.

Evolving multiplex networks are a powerful model for representing the dynamics over time of different phenomena, such as social networks, power grids, and biological pathways. However, exploring the structure of multiplex network time series is still an open problem. Here we propose a two-step strategy to tackle this problem based on the concept of distance (metric) between networks. Given a multiplex graph, first a network of networks is built for each time step, and then a real-valued time series is obtained from the sequence of (simple) networks by evaluating each element's distance from the first element of the series. The effectiveness of this approach in detecting the changes occurring along the original time series is shown on a synthetic example first, and then on the Gulf dataset of political events.
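
A minimal sketch of the second step, assuming network snapshots given as adjacency matrices and using the Laplacian spectral distance as a stand-in for whichever inter-network metric one adopts (the abstract leaves the choice of distance open):

```python
import numpy as np

def laplacian_spectrum(adj):
    """Eigenvalues of the graph Laplacian L = D - A (symmetric graphs)."""
    deg = np.diag(adj.sum(axis=1))
    return np.sort(np.linalg.eigvalsh(deg - adj))

def spectral_distance(adj_a, adj_b):
    """Euclidean distance between Laplacian spectra: a simple graph metric."""
    return np.linalg.norm(laplacian_spectrum(adj_a) - laplacian_spectrum(adj_b))

def network_time_series(snapshots):
    """Distance of every snapshot from the first one -> real-valued series."""
    return np.array([spectral_distance(snapshots[0], a) for a in snapshots])

# Toy example: a random-graph sequence whose edge density drifts over time,
# so the resulting series should trend upward as the structure departs
# from the t = 0 snapshot.
rng = np.random.default_rng(0)
n, steps = 30, 20
snapshots = []
for t in range(steps):
    p = 0.1 + 0.02 * t                      # slowly increasing edge density
    a = (rng.random((n, n)) < p).astype(float)
    a = np.triu(a, 1); a = a + a.T          # symmetric, no self-loops
    snapshots.append(a)

print(network_time_series(snapshots).round(2))
```

A change point in the original multiplex dynamics would show up as a jump or trend break in this one-dimensional series, which can then be analyzed with standard time-series tools.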

We prove that the metric of a general holographic spacetime can be reconstructed (up to an overall conformal factor) from distinguished spatial slices—‘light-cone cuts’—of the conformal boundary. Our prescription is covariant and applies to bulk points in causal contact with the boundary. Furthermore, we describe a procedure for determining the light-cone cuts corresponding to bulk points in the causal wedge of the boundary in terms of the divergences of correlators in the dual field theory. Possible extensions for determining the conformal factor and including the cuts of points outside of the causal wedge are discussed. We also comment on implications for subregion/subregion duality.

String theory can accommodate black holes with the black hole parameters related to string moduli. It is a well known but remarkable feature that the near horizon geometry of a large class of black holes arising from string theory contains a BTZ part. A mathematical theorem (Sullivan's Theorem) relates the three dimensional geometry of the BTZ metric to the conformal structures of a two dimensional space, thus providing a precise kinematic statement of holography. Using this theorem it is possible to argue that the string moduli space in this region has to have negative curvature from the BTZ part of the associated spacetime. This is consistent with a recent conjecture of Ooguri and Vafa on string moduli space.

This paper presents an overview of economic metrics for wind energy projects. The attractiveness of a proposed wind energy project can vary considerably between private-sector and public-sector evaluations. The financing structure is a very important factor influencing the attractiveness of a wind energy project: in many cases, the economic agents financing the project do so in order to earn sufficient income to meet the needs of the investors and of the other economic agents involved. The paper also characterizes the assessment indicators and the economic-financial management of implemented renewable energy projects, exclusively for onshore wind energy systems. All indicators presented should be used in economic engineering analysis to meet specific information needs for decision making when evaluating investment opportunities in renewable energy projects.
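
As a concrete illustration of the kind of indicator surveyed here, a minimal sketch computing net present value (NPV) and internal rate of return (IRR) by bisection, for a hypothetical onshore-wind cash-flow profile (all figures are invented for illustration, not taken from the paper):

```python
def npv(rate, cashflows):
    """Net present value of cashflows[t] received at the end of year t."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-9):
    """Internal rate of return via bisection (assumes one sign change)."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Hypothetical project: 10 MEUR investment, 1.4 MEUR net revenue per year
# over a 20-year operating life.
flows = [-10.0] + [1.4] * 20
print(f"NPV at 8%: {npv(0.08, flows):.2f} MEUR")  # positive -> attractive at 8%
print(f"IRR: {irr(flows):.2%}")                   # roughly 12-13% here
```

The same cash flows can look attractive or unattractive depending on the discount rate applied, which is precisely why private and public evaluations of the same project can diverge.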

The primary objective of this project is to determine the feasibility of producing technology transfer metrics that answer the question: Do NASA/MSFC technical assistance activities impact economic growth? The data for this project resides in a 7800-record database maintained by Tec-Masters, Incorporated. The technology assistance data results from survey responses from companies and individuals who have interacted with NASA via a Technology Transfer Agreement, or TTA. The goal of this project was to determine if the existing data could provide indications of increased wealth. This work demonstrates that there is evidence that companies that used NASA technology transfer have a higher job growth rate than the rest of the economy. It also shows that the jobs being supported are jobs in higher wage SIC codes, and this indicates improvements in personal wealth. Finally, this work suggests that with correct data, the wealth issue may be addressed.

Quantitative versions of the central results of the metric theory of continued fractions were given primarily by C. De Vroedt. In this paper we give improvements of the bounds involved. For a real number $x$, let $$x=c_0+\dfrac{1}{c_1+\dfrac{1}{c_2+\dfrac{1}{c_3+\dfrac{1}{c_4+\ddots}}}}.$$ A sample result we prove is that, given $\epsilon > 0$, $$(c_1(x)\cdots c_n(x))^{\frac{1}{n}}=\prod^\infty_{k=1}\left( 1+\frac{1}{k(k+2)}\right)^{\frac{\log \, k}{\log \, 2}}+o\left(n^{-\frac{1}{2}}(\log \, n)^{\frac{3}{2}}(\log \, \log \, n)^{\frac{1}{2}+\epsilon}\right)$$
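
The infinite product on the right-hand side is Khinchin's constant $K_0 \approx 2.6854520$. A small sketch illustrating the almost-everywhere convergence of the geometric mean of the partial quotients; floating-point arithmetic limits a reliable expansion to roughly the first couple of dozen terms, so this is an illustration only, and for small $n$ the mean fluctuates strongly (π's famous quotient 292 inflates it early on):

```python
import math

def partial_quotients(x, n):
    """First n continued-fraction digits c_1..c_n of x (c_0 dropped)."""
    digits = []
    x = x - math.floor(x)
    for _ in range(n):
        x = 1.0 / x
        a = math.floor(x)
        digits.append(int(a))
        x -= a
    return digits

def geometric_mean(values):
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Khinchin's constant via truncation of the product over k (k = 1 term is 1).
K0 = math.prod((1 + 1.0 / (k * (k + 2))) ** (math.log(k) / math.log(2))
               for k in range(1, 100000))

for n in (5, 10, 20):
    gm = geometric_mean(partial_quotients(math.pi, n))
    print(f"n={n:2d}  (c_1...c_n)^(1/n) = {gm:.4f}   (K0 ~ {K0:.4f})")
```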

Each year, the U.S. Department of Energy asks its Clean Cities program coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterize the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction (IR) initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this 2014 Annual Metrics Report.

This paper considers a cloud computing setting in which similarity querying of metric data is outsourced to a service provider. The data is to be revealed only to trusted users, not to the service provider or anyone else. Users query the server for the data objects most similar to a query example. Outsourcing offers the data owner scalability and a low initial investment. The need for privacy may be due to the data being sensitive (e.g., in medicine), valuable (e.g., in astronomy), or otherwise confidential. Given this setting, the paper presents techniques that transform the data prior to supplying it to the service provider for similarity queries on the transformed data. Our techniques provide interesting trade-offs between query cost and accuracy. They are then further extended to offer an intuitive privacy guarantee. Empirical studies with real data demonstrate that the techniques are capable of offering privacy while enabling efficient and accurate processing of similarity queries.
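
A minimal sketch of one classical transformation of this general flavor: a pivot-based embedding in which the server sees only each object's distances to a secret set of reference objects, never the objects themselves. This illustrates the cost/accuracy trade-off idea, not the paper's specific scheme:

```python
import numpy as np

rng = np.random.default_rng(42)

# Owner side: the sensitive metric data (here, points in R^8 under L2).
data = rng.normal(size=(1000, 8))

# Secret pivots known only to the data owner and trusted users.
pivots = rng.normal(size=(12, 8))

def embed(x):
    """Map an object to its vector of distances to the secret pivots."""
    return np.linalg.norm(pivots - x, axis=1)

# The server stores only the embedded vectors (plus opaque record ids).
server_side = np.array([embed(x) for x in data])

def query(q, k=5):
    """Trusted user embeds the query; server ranks by pivot-space distance.

    By the triangle inequality, |d(q,p_i) - d(x,p_i)| <= d(q,x), so the
    Chebyshev distance in pivot space lower-bounds the true distance:
    rankings are approximate, which is the query-cost/accuracy trade-off.
    """
    qe = embed(q)
    scores = np.max(np.abs(server_side - qe), axis=1)
    return np.argsort(scores)[:k]

print(query(rng.normal(size=8)))
```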

A relativistic gas in a Schwarzschild metric is studied within the framework of a relativistic Boltzmann equation in the presence of gravitational fields, where Marle's model for the collision operator of the Boltzmann equation is employed. The transport coefficients of bulk and shear viscosity and of thermal conductivity are determined from the Chapman-Enskog method. It is shown that the transport coefficients depend on the gravitational potential. Expressions for the transport coefficients in the presence of weak gravitational fields in the non-relativistic (low-temperature) and ultra-relativistic (high-temperature) limiting cases are given. Apart from the temperature gradient, the heat flux has two relativistic terms. The first one, proposed by Eckart, is due to the inertia of energy and represents an isothermal heat flux when matter is accelerated. The other, suggested by Tolman, is proportional to the gravitational potential gradient and indicates that -- in the absence of an acceleration field -- a stat...
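
In symbols, the generalized Fourier law at issue reads, in one common convention (with $h^{\mu\nu}$ the projector orthogonal to the four-velocity $u^\mu$ and $a_\nu = u^\alpha\nabla_\alpha u_\nu$ the four-acceleration), $$q^\mu = -\kappa\, h^{\mu\nu}\left(\nabla_\nu T + T\,a_\nu\right),$$ where the $T a_\nu$ term is Eckart's inertia-of-energy contribution. In a static gravitational field the acceleration reduces to the gravitational potential gradient, reproducing Tolman's classic result that a gas in equilibrium satisfies $T\sqrt{g_{00}} = \text{const}$: the equilibrium temperature is not uniform in a gravitational field even in the absence of heat flow.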

Proteins manifest themselves as phenotypic traits, retained or lost in living systems via evolutionary pressures. Simply put, survival is essentially the ability of a living system to synthesize a functional protein that allows for a response to environmental perturbations (adaptation). Loss of functional proteins leads to extinction. Currently there are no universally applicable quantitative metrics at the molecular level for either measuring ‘evolvability’ of life or for assessing the conditions under which a living system would go extinct and why. In this work, we show emergence of the first such metric by utilizing the recently discovered stoichiometric margin of life for all known naturally occurring (and functional) proteins. The constraint of having well-defined stoichiometries of the 20 amino acids in naturally occurring protein sequences requires utilization of the full scope of degeneracy in the genetic code, i.e. usage of all codons coding for an amino acid, by only 11 of the 20 amino acids. This shows that the non-availability of individual codons for these 11 amino acids would disturb the fine stoichiometric balance resulting in non-functional proteins and hence extinction. Remarkably, these amino acids are found in close proximity of any given amino acid in the backbones of thousands of known crystal structures of folded proteins. On the other hand, stoichiometry of the remaining 9 amino acids, found to be farther/distal from any given amino acid in backbones of folded proteins, is maintained independent of the number of codons available to synthesize them, thereby providing some robustness and hence survivability.

In the context of Thurston's geometrisation program we address the question of which compact aspherical 3-manifolds admit Riemannian metrics of nonpositive curvature. We show that non-geometric Haken manifolds generically, but not always, admit such metrics. More precisely, we prove that a Haken manifold with (possibly empty) boundary of zero Euler characteristic admits metrics of nonpositive curvature if the boundary is non-empty or if at least one atoroidal component occurs in its canonical topological decomposition. Our arguments are based on Thurston's Hyperbolisation Theorem. We give examples of closed graph-manifolds with linear gluing graph and arbitrarily many Seifert components which do not admit metrics of nonpositive curvature.

Traditionally, the success of a researcher is assessed by the number of publications he or she publishes in peer-reviewed, indexed, high-impact journals. This essential yardstick, often referred to as the impact of a specific researcher, is assessed through the use of various metrics. While researchers may be acquainted with such metrics, many do not know how to use them to enhance their careers. In addition to these metrics, a number of other factors should be taken into consideration to objectively evaluate a scientist's profile as a researcher and academician. Moreover, each metric has its own limitations that need to be considered when selecting an appropriate metric for evaluation. This paper provides a broad overview of the wide array of metrics currently in use in academia and research. Popular metrics are discussed and defined, including traditional metrics and article-level metrics, some of which are applied to researchers for a greater understanding of a particular concept, including varicocele, the thematic area of this Special Issue of Asian Journal of Andrology. We recommend the combined use of quantitative and qualitative evaluation using judiciously selected metrics for a more objective assessment of scholarly output and research impact.

In the first part of this paper we recount how the Hua domains were introduced and summarize the main results on Hua domains. In the second part, the explicit complete Einstein-Kähler metric on a special type of Hua domain is given, and a sharp estimate of the holomorphic sectional curvature under this metric is obtained. We also prove that the complete Einstein-Kähler metric is equivalent to the Bergman metric on this special type of Hua domain.

We use a recent result by Cabezas et al. to build up an approximate solution for the gravitational field created by a rigidly rotating polytrope. We solve the linearized Einstein equations inside and outside the surface of zero pressure, including second-order corrections due to rotational motion, to get an asymptotically flat metric in a global harmonic coordinate system. We prove that if the metric and its first derivatives are continuous on the matching surface up to this order of approximation, the multipole moments of this metric cannot be fitted to those of the Kerr metric.

This paper consists of two results dealing with balanced metrics (in S. Donaldson's terminology) on noncompact complex manifolds. In the first we describe all balanced metrics on Cartan domains. In the second we show that the only Cartan-Hartogs domain which admits a balanced metric is the complex hyperbolic space. By combining these results with those obtained in [13] (Kaehler-Einstein submanifolds of the infinite dimensional projective space, to appear in Mathematische Annalen) we also provide the first example of a complete, Kähler-Einstein and projectively induced metric $g$ such that $\alpha g$ is not balanced for all $\alpha > 0$.

Presentation examines selected green chemistry breakthroughs by industrial leaders, and discusses tools and metrics companies are using to assess their sustainable and green chemistry and engineering efforts.

...determined to be conceptually different from one another. The metrics were classified by their meaning and interpretation, based on the types of information necessary to calculate them. Four different classes were identified: 1) sensitivity robustness metrics; 2) size of feasible design space... and to remove the ambiguities of the term robustness. By applying an exemplar metric from each class to a case study, the differences between the classes were further highlighted. These classes form the basis for the definition of four specific sub-definitions of robustness, namely the 'robust concept', 'robust...

Given a metric space with a Borel probability measure, for each integer $N$ we obtain a probability distribution on $N\times N$ distance matrices by considering the distances between pairs of points in a sample consisting of $N$ points chosen independently from the metric space with respect to the given measure. We show that this gives an asymptotically bi-Lipschitz relation between metric measure spaces and the corresponding distance matrices. This is an effective version of a result of Vershik that metric measure spaces are determined by the associated distributions on infinite random matrices.
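
A small sketch of the construction, assuming a concrete metric measure space (the unit circle with arc-length metric and uniform measure) purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_distance_matrix(n):
    """Draw n i.i.d. points from the uniform measure on the unit circle
    and return the n x n matrix of arc-length (geodesic) distances."""
    theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
    diff = np.abs(theta[:, None] - theta[None, :])
    return np.minimum(diff, 2.0 * np.pi - diff)   # geodesic distance on S^1

# Each call yields one draw from the distribution on N x N distance
# matrices that the abstract associates with this metric measure space;
# the law of these matrices (over many draws) is what determines the
# space up to measure-preserving isometry.
print(sample_distance_matrix(4).round(2))
```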

Matter can be consistently coupled to two metrics at once. This is allowed in the most general ghost-free, bimetric theory of gravity, and it unlocks an additional symmetry with respect to the exchange of the metrics. This double coupling, however, raises the problem of identifying the observables of the theory. It is shown that there is no physical metric to which matter would universally couple, and that moreover such an effective metric generically does not exist even for an individual matter species. A resolution is suggested in the context of Finsler geometry.

Background. Datasets consisting of synthetic neural data generated with quantifiable and controlled parameters are a valuable asset in the process of testing and validating directed functional connectivity metrics...

Stochastic processes with values in the space of metric measure spaces (complete separable metric spaces equipped with a probability measure) are becoming more and more important in probability theory, especially for the modelling of evolutionary systems, where at each time the whole phylogenetic tree is considered. Greven, Pfaffelhuber and Winter introduced the Gromov-Prohorov metric $d_{\mathrm{GPW}}$ on the space of metric measure spaces and showed that it induces the Gromov-weak topology. They also conjectured that this topology coincides with the topology induced by Gromov's $\Box_1$ metric. In this note, we show that this is indeed true, and that the metrics are even bi-Lipschitz equivalent. More precisely, $d_{\mathrm{GPW}} = \tfrac{1}{2}\Box_{1/2}$, and hence $d_{\mathrm{GPW}} \le \Box_1 \le 2\,d_{\mathrm{GPW}}$.

We introduce a model for Hermitian holomorphic Deligne cohomology on a projective algebraic manifold which allows one to incorporate singular hermitian structures along a normal crossing divisor. In the case of a projective curve, the cup-product in cohomology is shown to correspond to a generalization of the Deligne pairing to line bundles with "good" hermitian metrics in the sense of Mumford and others. A particular case is that of the tangent bundle of the curve twisted by the negative of the...

Wavefront sensorless schemes for correction of aberrations induced by biological specimens require a time-invariant property of an image as a measure of fitness. Image intensity cannot be used as a metric for Single Molecule Localization (SML) microscopy because the intensity of blinking fluorophores follows exponential statistics. Therefore a robust intensity-independent metric is required. We previously reported a Fourier Metric (FM) that is relatively intensity independent. The Fourier metric has been successfully tested on two machine learning algorithms, a Genetic Algorithm and Particle Swarm Optimization, for wavefront correction about 50 μm deep inside the Central Nervous System (CNS) of Drosophila. However, since the spatial frequencies that need to be optimized fall into regions of the Optical Transfer Function (OTF) that are more susceptible to noise, adding a level of denoising can improve performance. Here we present wavelet-based approaches to lower the noise level and produce a more consistent metric. We compare the performance of different wavelets, such as Daubechies, biorthogonal, and reverse biorthogonal wavelets of different degrees and orders, for pre-processing of images.
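
A minimal sketch of the pipeline, assuming PyWavelets for the denoising stage; the frequency-band limits and the soft-threshold rule below are illustrative placeholders, not the authors' calibrated values:

```python
import numpy as np
import pywt

def wavelet_denoise(img, wavelet="bior1.3", level=2, thresh=0.1):
    """Soft-threshold the detail coefficients; the wavelet family/order is
    swappable (e.g. 'db4', 'bior1.3', 'rbio1.3') to compare consistency."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    approx, details = coeffs[0], coeffs[1:]
    details = [tuple(pywt.threshold(d, thresh * np.abs(d).max(), mode="soft")
                     for d in level_details) for level_details in details]
    return pywt.waverec2([approx] + details, wavelet)

def fourier_metric(img, f_lo=0.05, f_hi=0.25):
    """Spectral energy in a mid-frequency annulus, normalized by total
    energy, in the spirit of an intensity-independent Fourier metric."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    n = img.shape[0]                                  # assumes square frames
    fy, fx = np.meshgrid(np.fft.fftshift(np.fft.fftfreq(n)),
                         np.fft.fftshift(np.fft.fftfreq(n)), indexing="ij")
    radius = np.hypot(fx, fy)
    band = (radius >= f_lo) & (radius <= f_hi)
    return spec[band].sum() / spec.sum()   # ratio cancels overall intensity

rng = np.random.default_rng(0)
img = rng.poisson(5.0, size=(128, 128)).astype(float)  # noisy stand-in frame
print(fourier_metric(wavelet_denoise(img, "db4")))
```

Because the metric is a ratio of spectral energies, a global change in image intensity cancels out, which is the property that blinking fluorophores demand.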

Automatic matching of multi-modal remote sensing images (e.g., optical, LiDAR, SAR and maps) remains a challenging task in remote sensing image analysis due to significant non-linear radiometric differences between these images. This paper addresses this problem and proposes a novel similarity metric for multi-modal matching using geometric structural properties of images. We first extend the phase congruency model with illumination and contrast invariance, and then use the extended model to build a dense descriptor called the Histogram of Orientated Phase Congruency (HOPC) that captures geometric structure or shape features of images. Finally, HOPC is integrated as the similarity metric to detect tie-points between images by designing a fast template matching scheme. This novel metric aims to represent geometric structural similarities between multi-modal remote sensing datasets and is robust against significant non-linear radiometric changes. HOPC has been evaluated with a variety of multi-modal images including optical, LiDAR, SAR and map data. Experimental results show its superiority to recent state-of-the-art similarity metrics (e.g., NCC and MI), and demonstrate its improved matching performance.
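
A toy sketch of the descriptor-plus-matching idea, substituting simple gradient orientation for the more elaborate phase-congruency orientation used by HOPC; it conveys the structure-not-intensity principle rather than reproducing the published method:

```python
import numpy as np

def orientation_histograms(img, bins=8, cell=8):
    """Dense per-cell histograms of gradient orientation (structure rather
    than intensity), a crude stand-in for phase-congruency orientation."""
    gy, gx = np.gradient(img.astype(float))
    ang = np.mod(np.arctan2(gy, gx), np.pi)        # orientation in [0, pi)
    mag = np.hypot(gx, gy)
    h, w = img.shape
    hc, wc = h // cell, w // cell
    desc = np.zeros((hc, wc, bins))
    idx = np.minimum((ang / np.pi * bins).astype(int), bins - 1)
    for i in range(hc):
        for j in range(wc):
            sl = (slice(i * cell, (i + 1) * cell),
                  slice(j * cell, (j + 1) * cell))
            desc[i, j] = np.bincount(idx[sl].ravel(), mag[sl].ravel(), bins)
    return desc

def ncc(a, b):
    """Normalized correlation between two flattened descriptors."""
    a, b = a.ravel() - a.mean(), b.ravel() - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Comparing a scene against a nonlinearly remapped copy (squaring, a
# monotone radiometric distortion) still scores high, because gradient
# orientation -- hence the descriptor -- is unchanged by monotone maps.
rng = np.random.default_rng(3)
scene = rng.random((64, 64)).cumsum(axis=0).cumsum(axis=1)
print(ncc(orientation_histograms(scene), orientation_histograms(scene ** 2)))
```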

This analysis determines the equations of the direction cosines for all fluxes of the optical spectrum in quasars. Studies of the Hausdorff metric will greatly enhance our understanding of quasar distances. This study completes steps in the classification of quasars by finding the minimum variance of flux using the Rao-Blackwell Theorem. The papers of C. R. Rao and D. Blackwell are examined to further clarify the above theorem. Keywords: Theory of Flux, SDSS, Quasars, Redshift (z), Population Parameters, Regression Analysis

The purpose of this paper is to prove some Presic-Boyd-Wong type fixed point theorems in ordered metric spaces. The results of this paper generalize the famous results of Presic and of Boyd and Wong to ordered metric spaces. We also establish a homotopy result in product spaces. Some examples are provided which illustrate the results proved herein.

The Caratheodory and Kobayashi metrics have proved to be important tools in the function theory of several complex variables. But they are less familiar in the context of one complex variable. Our purpose here is to gather in one place the basic ideas about these important invariant metrics for domains in the plane and to provide some illuminating examples and applications.

Purpose: To comprehensively evaluate the overall performance of a group or an individual in both bibliometrics and patentometrics. Design/methodology/approach: Trace metrics were applied to the top 30 universities in the 2014 Academic Ranking of World Universities (ARWU) in computer science, the top 30 ESI highly cited papers in the computer science field in 2014, as well as the top 30 assignees and the top 30 most cited patents in the National Bureau of Economic Research (NBER) computer hardware and software category. Findings: We found that, by applying trace metrics, the research or marketing impact efficiency, at both the group and individual levels, was clearly observed. Furthermore, trace metrics were more sensitive to different publication-citation distributions than the average citation and h-index were. Research limitations: Trace metrics consider publications with zero citations as negative contributions. One should clarify how one evaluates a zero-citation paper or patent before applying trace metrics. Practical implications: Decision makers could regularly examine the performance of their university/company by applying trace metrics and adjust their policies accordingly. Originality/value: Trace metrics can be applied in both bibliometrics and patentometrics and provide a comprehensive view. Moreover, the high sensitivity and unique impact-efficiency view provided by trace metrics can help decision makers in examining and adjusting their policies.

We discuss Caristi's fixed point theorem for mappings defined on a metric space endowed with a graph. This work should be seen as a generalization of the classical Caristi fixed point theorem. It extends some recent works on the extension of the Banach contraction principle to metric spaces with a graph.

... the Office of Small and Disadvantaged Business Utilization (SDB) will be obtained prior to... each fiscal year, each USAID/W procurement activity and each Mission will submit a copy of the metric waiver log for the year to the USAID Metric Executive. (Mission logs are to be consolidated in a Mission...

... business-related activities. Metric implementation may take longer where the use of the system is initially... (24 CFR 84.15, Metric system of measurement; Housing and Urban Development, Office of the Secretary, Department of Housing and Urban...)

We present an exact stationary {\it axially symmetric} vacuum solution of metric-affine gravity (MAG) which generalises the recently reported spherically symmetric solution. Besides the metric, it carries nonmetricity and torsion as post-Riemannian geometrical structures. The parameters of the solution are interpreted as mass and angular momentum and as dilation, shear and spin charges.

The ability to correctly identify the greenest of several syntheses is a particularly useful asset for young chemists in the growing green economy. The famous univariate metrics atom economy and environmental factor provide insufficient information to allow for a proper selection of a green process. Multivariate metrics, such as those used in…
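
The two univariate metrics in question are simple ratios, which is exactly why they can mislead when used alone: $$\text{atom economy} = \frac{M_{\text{desired product}}}{\sum_{\text{reactants}} M_i}\times 100\%, \qquad E\text{-factor} = \frac{\text{mass of waste}}{\text{mass of product}},$$ where $M_i$ denotes molecular weight. Neither ratio accounts for energy use, solvent hazards, or toxicity, which is why multivariate metrics are needed to identify the genuinely greenest synthesis.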

In this paper we prove that Sands' topological condition for Collet-Eckmann maps implies Tsujii's metrical condition; on the other hand, if a Collet-Eckmann map satisfies Tsujii's metrical condition, then it satisfies Sands' topological condition. Thus we obtain three different versions of the Benedicks-Carleson Theorem by using topological conditions.

The original M3 surveillance metrics assume that the baseline is known. In this article, adapted M3 metrics are presented for the case when the baseline is not known and must be estimated from available data. Deciding how much available data is enough is also discussed.

We present a method that combines textures, blending, and scattered-data interpolation to visualize several metrics defined on overlapping areas-of-interest in UML class diagrams. We aim to simplify the task of visually correlating the distribution and outlier values of a multivariate metric dataset.