Sample records for background information pertinent

This report summarizes sources of geologic and hydrologic information useful to water managers and others involved in the investigation, appraisal, development, and protection of ground-water resources in Rhode Island. The geographic scope of the report includes Rhode Island and small adjoining areas of Massachusetts and Connecticut, where drainage basins are shared with these States. The information summarized is found in maps and reports prepared by the U.S. Geological Survey and published by either the U.S. Geological Survey or by the State of Rhode Island. Information sources are presented in maps and tables. Reference maps show drainage divides, town lines, and the 7.5-minute grid of latitude and longitude for the State. Maps show availability of surficial geologic maps, bedrock geologic maps, and ground-water studies by 7.5-minute quadrangle, and show availability of ground-water studies by drainage basin, subbasin, and special study area. Sources of geologic and hydrologic information for the thirty-seven 7.5-minute quadrangles covering Rhode Island have been compiled based on the following information categories: surficial geology, bedrock geology, subsurface materials, altitude of bedrock surface, water-table altitudes, water-table contours, saturated thickness, hydraulic conductivity, transmissivity, drainage divides, recharge areas, ground-water reservoirs, induced infiltration, and ground-water quality. A table for each of the 37 quadrangles lists the major categories of information available for that quadrangle, provides references to the publications in which the information can be found, and indicates the format, scale, and other pertinent attributes of the information. A table organized by report series gives full citations for publications prepared by the U.S. Geological Survey pertaining to the geology and hydrology of Rhode Island. To facilitate location of information for particular municipalities, a table lists cities and towns in the State and

This supplement to the Basic Course in Italian developed by the Defense Language Institute provides area background information on a variety of topics. They include: (1) housing and servants, (2) dining and a glossary of gastronomic terminology, (3) driving in Italy, and (4) relations with the police. The appendix contains material on: the Italian…

This report provides background information for the report Energy Company Competitiveness: Little to Do With Subsidies (DOE 1994). The main body of this publication consists of data uncovered during the course of research on that DOE report. These data pertain to major government energy policies in each country studied. This report also provides a summary of the DOE report. In October 1993, the Office of Energy Intelligence, US Department of Energy (formerly the Office of Foreign Intelligence), requested that Pacific Northwest Laboratory prepare a report addressing policies and actions used by foreign governments to enhance the competitiveness of their energy firms. Pacific Northwest Laboratory prepared the report Energy Company Competitiveness: Little to Do With Subsidies (DOE 1994), which provided the analysis requested by DOE. An appendix was also prepared, which provided extensive background documentation for the analysis. Because of the length of the appendix, Pacific Northwest Laboratory decided to publish this information separately, as contained in this report.

To shed light on the fundamental problems posed by dark energy and dark matter, a large number of experiments have been performed and combined to constrain cosmological models. We propose a novel way of quantifying the information gained by updates on the parameter constraints from a series of experiments which can either complement earlier measurements or replace them. For this purpose, we use the Kullback-Leibler divergence or relative entropy from information theory to measure differences in the posterior distributions in model parameter space from a pair of experiments. We apply this formalism to a historical series of cosmic microwave background experiments ranging from Boomerang to WMAP, SPT, and Planck. Considering different combinations of these experiments, we thus estimate the information gain in units of bits and distinguish contributions from the reduction of statistical errors and the "surprise" corresponding to a significant shift of the parameters' central values. For this experiment series, we find individual relative entropy gains ranging from about 1 to 30 bits. In some cases, e.g. when comparing WMAP and Planck results, we find that the gains are dominated by the surprise rather than by improvements in statistical precision. We discuss how this technique provides a useful tool for both quantifying the constraining power of data from cosmological probes and detecting the tensions between experiments.
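As a hedged sketch of the quantity described above: for one-dimensional Gaussian posteriors the Kullback-Leibler divergence has a closed form, and dividing by ln 2 expresses it in bits. The numbers below are illustrative only, not values from the study.

```python
import numpy as np

def kl_bits(mu0, sig0, mu1, sig1):
    """Relative entropy D(p1 || p0) in bits for 1-D Gaussian posteriors,
    where p0 is the earlier constraint and p1 is the update."""
    nats = (np.log(sig0 / sig1)
            + (sig1**2 + (mu1 - mu0)**2) / (2.0 * sig0**2)
            - 0.5)
    return nats / np.log(2.0)

# Pure precision gain: same central value, statistical error halved
print(kl_bits(0.0, 1.0, 0.0, 0.5))
# "Surprise": same error bars, central value shifted by 3 sigma
print(kl_bits(0.0, 1.0, 3.0, 1.0))
```

The second case illustrates the abstract's point that a gain can be dominated by a parameter shift (surprise) rather than by improved precision.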

Since the release of the first NACA publication on fuel characteristics pertinent to the design of aircraft fuel systems (NACA-RM-E53A21), additional information has become available on MIL-F-7914(AER) grade JP-5 fuel and several of the current grades of fuel oils. In order to make this information available to fuel-system designers as quickly as possible, the present report has been prepared as a supplement to NACA-RM-E53A21. Although JP-5 fuel is of greater interest in current fuel-system problems than the fuel oils, the available data are not as extensive. It is believed, however, that the limited data on JP-5 are sufficient to indicate the variations in stocks that the designer must consider under a given fuel specification. The methods used in the preparation and extrapolation of data presented in the tables and figures of this supplement are the same as those used in NACA-RM-E53A21.

This report discusses the following information about the Superconducting Super Collider: Goals and milestones; civil construction; ring components; cryogenics; vacuum and cooling water systems; electrical power; instrumentation and control systems; and installation planning.

This paper is intended to provide a solid base of information about the treatment of indirect university research costs in various jurisdictions and to highlight some of the factors that have contributed to increased interest in the issues surrounding the funding of indirect costs of research. University research in Ontario has continued to evolve…

The NASA Lewis Research Center supports many research facilities housed in isolated buildings, including wind tunnels, test cells, and research laboratories. These facilities are all located on a 350-acre campus adjacent to Cleveland Hopkins Airport. The function of NASA-Lewis is to do basic and applied research in all areas of aeronautics, fluid mechanics, materials and structures, space propulsion, and energy systems. These functions require a great variety of remote high-speed, high-volume data communications for computing and interactive graphics capabilities. In addition, new requirements for local distribution of intercenter video teleconferencing and data communications via satellite have developed. To address these and future communications requirements for the next 15 years, a project team was organized to design and implement a new high-speed communication system that would handle both data and video information in a common lab-wide Local Area Network. The project team selected cable-television broadband coaxial cable technology as the communications medium, and the first installation of in-ground cable began in the summer of 1980. The Lewis Information Network (LINK) became operational in August 1982 and has become the backbone of all data communications and video.

In two experiments, we investigated whether reading background information benefits memory for text content by influencing the amount of content encoded or the organization of the encoded content. In Experiment 1, half of the participants read background information about the issues to be discussed in the text material, whereas half did not. All the participants were then tested for free recall and cued recall of text content. Free recall was greater for individuals who read issue information than for those who did not. The groups did not differ on cued recall, suggesting that background information did not facilitate the encoding of more text content. Measures of representational organization indicated that increased recall in the issue-information group resulted from better organization of content in memory. Experiment 2 extended these findings: using background information about text sources, it demonstrated that the efficacy of background information depends on the semantic relationship between that information and the text content. PMID:12219893

Discusses background and theory underlying a design study for an interactive information retrieval system funded by the British Library Research and Development Department which will determine structural representations of anomalous states of knowledge (ASKs) underlying information needs. References are cited. (EJS)

This report provides Minnesota legislators with background information on establishing state educational standards and periodic testing to measure student progress. Scientific management, the accountability movement, and the basic education movement were educational trends of the 1970s that pressured states to set standards to improve…

Instructors who are very familiar with a study area, as well as those who are not, find that the field-trip information acquisition and planning process is speeded and made more effective by organizing it in stages. The stages follow a deductive progression: from the associated context region, to the study area, to the specific sample window sites, and from generalized background information on the study region to specific technical data on the environmental and human-use systems to be interpreted at each site. On the class trip and in the follow-up laboratory, the learning/interpretive processes are at first deductive, in applying previously learned information and skills to analysis of the study site, then inductive, in reading and interpreting the landscape, imagery, and maps of the site, correlating them with information from other sample sites and building valid generalizations about the larger study area, its context region, and other (similar and/or contrasting) regions.

The concept of a cockpit display of traffic information (CDTI) includes the integration of air traffic, navigation, and other pertinent information in a single electronic display in the cockpit. Concise display symbology was developed for use in later full-mission simulator evaluations of the CDTI concept. Experimental variables included the update interval, the motion of the aircraft, the update type (that is, whether the two aircraft were updated at the same update interval or not), the background (grid pattern or no background), and the encounter type (straight or curved). Only the type of encounter affected performance.

Living systems are capable of processing multiple sources of information simultaneously. This is true even at the cellular level, where not only coexisting signals stimulate the cell, but also the presence of fluctuating conditions is significant. When information is received by a cell signaling network via one specific input, the existence of other stimuli can provide a background activity, or chatter, that may affect signal transmission through the network and, therefore, the response of the cell. Here we study the modulation of information processing by chatter in the signaling network of a human cell, specifically, in a Boolean model of the signal transduction network of a fibroblast. We observe that the level of external chatter shapes the response of the system to information carrying signals in a nontrivial manner, modulates the activity levels of the network outputs, and effectively determines the paths of information flow. Our results show that the interactions and node dynamics, far from being random, confer versatility to the signaling network and allow transitions between different information-processing scenarios. PMID:22174668
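A minimal sketch of the kind of Boolean-network experiment described above. The toy network and its update rules are invented for illustration (the actual fibroblast model is far larger); it only shows how a background "chatter" input can gate signal transmission and reroute information flow.

```python
# Toy Boolean signaling network (illustrative, not the fibroblast model):
# each node's next state is a Boolean function of the current states.
rules = {
    "receptor":  lambda s: s["signal"] or s["chatter"],
    "inhibitor": lambda s: s["chatter"],
    "kinase":    lambda s: s["receptor"] and not s["inhibitor"],
    "output":    lambda s: s["kinase"],
}

def step(state):
    """Synchronous update: all rule nodes change at once."""
    new = dict(state)
    for node, f in rules.items():
        new[node] = f(state)
    return new

def run(signal, chatter, steps=10):
    """Hold the two inputs fixed, iterate to a steady state, read the output."""
    state = {"signal": signal, "chatter": chatter,
             "receptor": False, "inhibitor": False,
             "kinase": False, "output": False}
    for _ in range(steps):
        state = step(state)
    return state["output"]

print(run(signal=True, chatter=False))  # True: signal transmitted
print(run(signal=True, chatter=True))   # False: chatter activates the inhibitor
```

In this toy model the chatter input activates an inhibitor node, so the same signal produces a different output depending on background activity, the qualitative effect the abstract describes.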

The cosmic microwave background (CMB) contains perturbations that are close to Gaussian and isotropic. This means that its information content, in the sense of the ability to constrain cosmological models, is closely related to the number of modes probed in CMB power spectra. Rather than making forecasts for specific experimental setups, here we take a more pedagogical approach and ask how much information we can extract from the CMB if we are only limited by sample variance. We show that, compared with temperature measurements, the addition of E-mode polarization doubles the number of modes available out to a fixed maximum multipole, provided that all of the TT, TE, and EE power spectra are measured. However, the situation in terms of constraints on particular parameters is more complicated, as we explain and illustrate graphically. We also discuss the enhancements in information that can come from adding B-mode polarization and gravitational lensing. We show how well one could ever determine the basic cosmological parameters from CMB data compared with what has been achieved with Planck, which has already probed a substantial fraction of the TT information. Lastly, we look at constraints on neutrino mass as a specific example of how lensing information improves future prospects beyond the current 6-parameter model.
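The mode counting behind these statements is simple to sketch: each multipole ℓ contributes 2ℓ+1 independent a_{ℓm} coefficients, and, as the abstract notes, adding E-mode polarization with all of TT, TE, and EE measured doubles the total. A back-of-envelope sketch (the ℓ range is an illustrative choice):

```python
def n_modes(lmax, lmin=2):
    """Number of independent spherical-harmonic modes a_{lm}
    between lmin and lmax: sum over l of (2l + 1)."""
    return sum(2 * ell + 1 for ell in range(lmin, lmax + 1))

lmax = 2500                 # an illustrative maximum multipole
n_temp = n_modes(lmax)      # temperature-only mode count
n_pol = 2 * n_temp          # doubled when E-mode polarization is added
print(n_temp, n_pol)
```

The sum telescopes to (lmax + 1)² − lmin², so the count grows quadratically with the maximum multipole probed, which is why the sample-variance limit improves so strongly with resolution.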

Developed as background material for the 1978 Ohio White House Conference on Library and Information Services, this document provides information in six areas of concern for Ohio libraries: (1) library and information services--library users, library collections, special user needs; (2) public awareness--definition, current status, suggestions for…

Can knowledge help viewers when they appreciate an artwork? Experts’ judgments of the aesthetic value of a painting often differ from the estimates of naïve viewers, and this phenomenon is especially pronounced in the aesthetic judgment of abstract paintings. We compared the changes in aesthetic judgments of naïve viewers while they were progressively exposed to five pieces of background information. The participants were asked to report their aesthetic judgments of a given painting after each piece of information was presented. We found that commentaries by the artist and a critic significantly increased the subjective aesthetic ratings. Does knowledge enable experts to attend to the visual features in a painting and to link it to the evaluative conventions, thus potentially causing different aesthetic judgments? To investigate whether a specific pattern of attention is essential for knowledge-based appreciation, we tracked the eye movements of subjects while they viewed a painting with a commentary by the artist and with a commentary by a critic. We observed that critics’ commentaries directed the viewers’ attention to the visual components that were highly relevant to the presented commentary. However, attention to specific features of a painting was not necessary for increasing the subjective aesthetic judgment when the artists’ commentary was presented. Our results suggest that at least two different cognitive mechanisms may be involved in knowledge-guided aesthetic judgments while viewers reappraise a painting. PMID:25945789

The evaluation of the need for remedial activities at hazardous waste sites requires quantification of risks of adverse health effects to humans and the ecosystem resulting from the presence of chemical and radioactive substances at these sites. The health risks from exposure to these substances are in addition to risks encountered because of the virtually unavoidable exposure to naturally occurring chemicals and radioactive materials that are present in air, water, soil, building materials, and food products. To provide a frame of reference for interpreting risks quantified for hazardous waste sites, it is useful to identify the relative magnitude of risks of both a voluntary and involuntary nature that are ubiquitous throughout east Tennessee. In addition to discussing risks from the ubiquitous presence of background carcinogens in the east Tennessee environment, this report also presents risks resulting from common, everyday activities. Such information should not be used to discount or trivialize risks from hazardous waste contamination, but rather to create a sensitivity to general risk issues, thus providing a context for better interpretation of risk information.

Discusses projects of the Graphic Information Research Unit at the Royal College of Art (England), which relates to the legibility of scientific and technical information. Summarizes the Unit's survey of problems in providing adequate guiding in libraries and museums, and reports two studies of Computer Output Microfilm library catalogues. (GT)

This paper considers questions related to the integrity and accessibility of new electronic information resources. It begins with a review of recent developments in networked information resources and the tools to identify, navigate, and use such resources. An overview is then given of the issues involved in access and integrity questions. Links…

We study certain features of the scaling behavior of mutual information during a process of thermalization; more precisely, we extend the time scaling behavior of mutual information discussed in [1] to time-dependent hyperscaling-violating geometries. We use the holographic description of entanglement entropy for a disjoint system consisting of two parallel strips whose widths are much larger than the separation between them. We show that during the thermalization process, the dynamical exponent plays a crucial role in determining the general time scaling behavior of mutual information (e.g., in the pre-local-equilibration regime). It is shown that the hyperscaling-violating parameter can be employed to define an effective dimension.
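As a definitional note (standard usage, not a result of the paper): the mutual information of the two strips A and B is the combination of entanglement entropies

```latex
I(A:B) = S(A) + S(B) - S(A \cup B),
```

where, holographically, each entropy is computed from the area of the corresponding extremal surface in the time-dependent hyperscaling-violating background. For widths much larger than the separation, the entropy of the union is dominated by a connected surface, which is what makes the mutual information sensitive to the thermalization dynamics.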

Council for Yukon Indians, Whitehorse (Yukon). Curriculum Development Program.

This booklet was designed as a source of information for teachers seeking a deeper understanding of Native American children and who want to take advantage of opportunities offered by a cross-cultural classroom. The first section is a collection of 13 articles from a wide variety of sources on various aspects of cross-cultural education. Each…

This document summarizes currently available information about the presence and significance of unexploded ordnance (UXO) in the two main areas of Aberdeen Proving Ground: Aberdeen Area and Edgewood Area. Known UXO in the land ranges of the Aberdeen Area consists entirely of conventional munitions. The Edgewood Area contains, in addition to conventional munitions, a significant quantity of chemical-munition UXO, which is reflected in the presence of chemical agent decomposition products in Edgewood Area ground-water samples. It may be concluded from current information that the UXO at Aberdeen Proving Ground has not adversely affected the environment through release of toxic substances to the public domain, especially not by water pathways, and is not likely to do so in the near future. Nevertheless, modest but periodic monitoring of groundwater and nearby surface waters would be a prudent policy.

Because of the importance of fuel properties in design of aircraft fuel systems the present report has been prepared to provide information on the characteristics of current jet fuels. In addition to information on fuel properties, discussions are presented on fuel specifications, the variations among fuels supplied under a given specification, fuel composition, and the pertinence of fuel composition and physical properties to fuel system design. In some instances the influence of variables such as pressure and temperature on physical properties is indicated. References are cited to provide fuel system designers with sources of information containing more detail than is practicable in the present report.

A continent-wide radio telescope system offering the greatest resolving power of any astronomical instrument operational today.

Overview: The National Science Foundation's VLBA is a system of ten identical radio-telescope antennas controlled from a common headquarters and working together as a single instrument. The radio signals received by each individual antenna contribute part of the information used to produce images of celestial objects with hundreds of times more detail than Hubble Space Telescope images.

Scientific Areas: The VLBA can contribute to any astronomical research area where quality, high-resolution radio images will advance knowledge of the field. In its first five years of full operation, the VLBA has produced dramatic new information in these areas:

* Stars: With the VLBA, astronomers have tracked gas motions in the atmosphere of a star other than the Sun for the first time; made the first maps of the magnetic field of a star other than the Sun; and studied the violent dances of double-star pairs in which one of the pair is a superdense neutron star or a black hole.

* Protostars, star formation, and protoplanetary disks: The VLBA has provided scientists with some of the best views yet of very young stars and the complex regions in which they are born. VLBA images have shown outflows of gas from young stars and disks of material orbiting these new stars - material that later may form planetary systems.

* Supernovae and Supernova Remnants: The VLBA has directly measured the expansion of a shell of exploded debris from the supernova SN 1993J, in the galaxy M81, some 11 million light-years from Earth. This has allowed scientists to learn new details about the explosion itself and its surroundings as well as calculate the distance to the supernova by using the VLBA data in conjunction with information from optical telescopes. VLBA images have shown regions of shocked gas in supernova remnants.

* The Milky Way: Radio waves from extragalactic objects

Forensic analysis of questioned documents can sometimes be extensively data intensive. A forensic expert might need to analyze a heap of document fragments, and in such cases, to ensure reliability, he/she should focus only on the relevant evidence hidden in those document fragments. Relevant document retrieval requires finding similar document fragments. One way of obtaining such similar documents is to use a document fragment's physical characteristics, such as color and texture. In this article we propose an automatic scheme to retrieve similar document fragments based on the visual appearance of the document paper and its texture. Multispectral color characteristics using biologically inspired color-differentiation techniques are implemented here, by projecting document color characteristics into the Lab color space. Gabor filter-based texture analysis is used to identify document texture. It is expected that document fragments from the same source will have similar color and texture. For clustering similar document fragments of our test dataset we use a Self-Organizing Map (SOM) of dimension 5×5, where the document color and texture information are used as features. We obtained an encouraging accuracy of 97.17% on 1063 test images.
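A hedged sketch of the clustering step: a minimal 5×5 SOM trained on synthetic feature vectors standing in for the Lab-color and Gabor-texture features. All names, numbers, and the training schedule below are illustrative, not the article's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(X, grid=(5, 5), iters=2000, lr0=0.5, sigma0=2.0):
    """Minimal Self-Organizing Map with a Gaussian neighbourhood and
    linearly decaying learning rate / radius (an illustrative sketch)."""
    h, w = grid
    weights = rng.random((h, w, X.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w),
                                  indexing="ij"), axis=-1)
    for t in range(iters):
        x = X[rng.integers(len(X))]
        # best-matching unit for this sample
        d = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(np.argmin(d), (h, w))
        # decaying learning rate and neighbourhood radius
        frac = t / iters
        lr = lr0 * (1.0 - frac)
        sigma = sigma0 * (1.0 - frac) + 1e-3
        dist2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
        nb = np.exp(-dist2 / (2.0 * sigma**2))[..., None]
        weights += lr * nb * (x - weights)
    return weights

def assign(weights, x):
    """Map a feature vector to its best-matching map unit."""
    d = np.linalg.norm(weights - x, axis=-1)
    return np.unravel_index(np.argmin(d), d.shape)

# Two well-separated synthetic "fragment" clusters in a toy 4-D feature space
A = rng.normal(0.2, 0.02, size=(50, 4))
B = rng.normal(0.8, 0.02, size=(50, 4))
W = train_som(np.vstack([A, B]))
print(assign(W, A[0]), assign(W, B[0]))  # the two clusters map to different units
```

In the real pipeline the feature vectors would combine the Lab color characteristics and Gabor texture responses of each fragment, and fragments falling on the same or neighbouring map units would be treated as candidates from the same source.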

For 30 years, Gamma Ray Bursts, now known to be the most energetic explosions in the sky, have intrigued scientists and constituted one of the greatest mysteries in astrophysics. Such basic details as their exact locations in the sky and their distances from Earth remained unknown or subject to intense debate until just last year. With the discovery of "afterglows" at X-ray, visible, infrared and radio wavelengths, scientists have been able to study the physics of these explosive fireballs for the first time. Radio telescopes, the NSF's VLA in particular, are vitally important in this quest for the answers about Gamma Ray Bursts. Planned improvements to the VLA will make it an even more valuable tool in this field. Since their first identification in 1967 by satellites orbited to monitor compliance with the atmospheric nuclear test ban, more than 2,000 Gamma Ray Bursts have been detected. The celestial positions of the bursts have only been well-localized since early 1997, when the Italian-Dutch satellite Beppo-SAX went into operation. Since Beppo-SAX began providing improved information on burst positions, other instruments, both orbiting and ground-based, have been able to study the afterglows. So far, X-ray afterglows have been seen in about a dozen bursts, visible-light afterglows in six and radio afterglows in three. The search for radio emission from Gamma Ray Bursts has been an ongoing, target-of-opportunity program at the VLA for more than four years, led by NRAO scientist Dale Frail. The detection of afterglows "opens up a new era in the studies of Gamma Ray Bursts," Princeton University theorist Bohdan Paczynski wrote in a recent scientific paper. Optical studies of GRB 970508 indicated a distance of at least seven billion light-years, the first distance measured for a Gamma Ray Burst. VLA studies of the same burst showed that the fireball was about a tenth of a light-year in diameter a few days after the explosion and that it was expanding at very

PEEX, as a long-term multidisciplinary integrated study, needs a systems design of a relevant information background. The idea of developing an Integrated Land Information System (ILIS) for the region as an initial step toward future advanced integrated observing systems is considered a promising way forward. The ILIS could serve (1) for introduction of a unified system of classification and quantification of environment, ecosystems and landscapes; (2) as a benchmark for tracing the dynamics of land use - land cover and ecosystem parameters, particularly for forests; (3) as a systems background for empirical assessment of indicators of interest (e.g., components of biogeochemical cycles); (4) for comparisons, harmonizing and mutual constraints of the results obtained by different methods; (5) for parameterization of surface fluxes for the 'atmosphere-land' system; (6) for use in diverse models and for model validation; (7) for downscaling of available information to a required scale; and (8) for understanding of gradients for up-scaling of "point" data, etc. The ILIS is presented in the form of a multi-layer and multi-scale GIS that includes a hybrid land cover (HLC) for a definite date and corresponding legends and attributive databases. The HLC is based on a relevant combination of a "multi" remote sensing concept that includes sensors of different types and resolutions and ground data. The ILIS includes inter alia (1) a general geographical and biophysical description of the territory (landscapes, soil, vegetation, hydrology, bioclimatic zones, permafrost, etc.); (2) diverse datasets of in situ measurements; (3) sets of empirical and semi-empirical aggregation and auxiliary models; (4) data from different inventories and surveys (forest inventory, land account, results of forest monitoring); (5) spatial and temporal descriptions of anthropogenic and natural disturbances; and (6) climatic data with relevant temporal resolution, etc. The ILIS should include only the data with known

The Talkeetna 1° by 3° quadrangle, which covers about 17,155 km² in south-central Alaska, was investigated by integrated field and laboratory studies in the disciplines of geology, geochemistry, geophysics, and Landsat data interpretation for the purpose of assessing its mineral resource potential. Past mineral production has been limited to gold from the Yentna district, but the quadrangle contains potentially significant resources of tin and silver and possibly a few other commodities, including chromite and copper. The results of the mineral resource assessment are given in a folio of maps accompanied by descriptive texts, diagrams, tables, and pertinent references. This Circular provides background information on these investigations and integrates the component maps. A bibliography cites both specific and general references to the geology and mineral deposits of the quadrangle.

This report was prepared for the Office of Conservation, Bonneville Power Administration. The report will be used by the Office as background information to support future analysis and implementation of electricity conservation programs for owners of residential rental housing in the Northwest. The principal objective of the study was to review market research information relating to attitudes and actions of Northwest rental housing owners and, to a lesser extent, tenants toward energy conservation and energy-efficiency improvements. Available market research data on these subjects were found to be quite limited. The most useful data were two surveys of Seattle rental housing owners conducted in late 1984 for Seattle City Light. Several other surveys provided supplemental market research information in selected areas. In addition to market research information, the report also includes background information on rental housing characteristics in the Northwest.

ABSTRACT

Background: Dry needling is an evidence-based treatment technique that is accepted and used by physical therapists in the United States. This treatment approach focuses on releasing or inactivating muscular trigger points to decrease pain, reduce muscle tension, and assist patients with an accelerated return to active rehabilitation.

Issue: While commonly used, the technique carries some patient risk, and the value of the treatment should be based on benefit compared with the potential risk. Adverse effects (AEs) with dry needling can be mild or severe, with overall incidence rates varying from zero to approximately 10 percent. While mild AEs are the rule, any procedure that involves a needle insertion has the potential for an AE, with select regions and the underlying anatomy increasing the risk. Known significant AEs from small-diameter needle insertion include pneumothorax, cardiac tamponade, hematoma, infection, central nervous system injury, and other complications.

Purpose/Objective: Underlying anatomy varies across individuals, requiring an in-depth knowledge of anatomy prior to any needle placement. This commentary is an overview of pertinent anatomy in the region of the thorax, with a 'part two' that addresses the abdomen, pelvis, back, vasovagal response, informed consent, and other pertinent issues. The purpose of the commentary is to minimize the risk of a dry needling AE.

Conclusions/Implications: Dry needling is an effective adjunct treatment procedure that is within the recognized scope of physical therapy practice. Physical therapy education and training provide practitioners with the anatomy, basic sciences, and clinical foundation to use this intervention safely and effectively. A safe and evidence-based implementation of the procedure is based on a thorough understanding of the underlying anatomy and the potential risks, with risks communicated to patients via informed consent.

Levels of Evidence: Level 5

PMID:27525188

In this article I ask whether disciplinary distinctions are pertinent to multicultural education. Are pedagogical prescriptions aimed at providing access and success to students of diverse backgrounds equally applicable across domains? I review cross-cultural cognitive research to depict defunct deficit and extant pluralistic approaches to…

This document contains background information for the Workshop in general and for the presentation entitled 'Identification and Summary Characterization of Materials Potentially Requiring Vitrification' that was given during the first morning of the workshop. Summary characteristics of 9 categories of US materials having some potential to be vitrified are given. This is followed by a 1- to 2-page elaboration for each of these 9 categories. References to more detailed information are included.

This report has been prepared to provide background information on White Oak Lake for the Oak Ridge National Laboratory Environmental and Safety Report. The paper presents the history of White Oak Dam and Lake and describes the hydrological conditions of the White Oak Creek watershed. Past and present sediment and water data are included; pathway analyses are described in detail.

The present study strongly supported the effectiveness of teacher-provided background information via e-mail in improving learners' writing ability through the students' own e-mail writing. A total of 50 advanced male EFL students aged between 25 and 40 at different branches of the Iran Language Institute in Tehran participated. Through the placement test of…

Examined the effects of providing students with background information about the structure and scoring of mathematics performance assessments (PA). Results for 187 elementary school students who had PA orientation and 182 who did not show the effects of test-wiseness training for average and above-average students, but not for below-average…

In aircraft and satellite multispectral scanner data, soil background signals are superimposed on or intermingled with information about vegetation. A procedure which accounts for soil background would, therefore, make a considerable contribution to an operational use of Landsat and other spectral data for monitoring the productivity of range, forest, and crop lands. A description is presented of an investigation which was conducted to obtain information for the development of such a procedure. The investigation included a study of the soil reflectance that supplies the background signal of vegetated surfaces. Landsat data as recorded on computer compatible tapes were used in the study. The results of the investigation are discussed, taking into account a study reported by Kauth and Thomas (1976). Attention is given to the determination of Kauth's plane of soils, sun angle effects, vegetation index modeling, and the evaluation of vegetation indexes. Graphs are presented which show the results obtained with a gray mapping technique. The technique makes it possible to display plant, soil, water, and cloud conditions for any Landsat overpass.
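The vegetation-index modeling discussed above can be illustrated with a soil-line index that explicitly accounts for the soil background signal. Below is a minimal sketch of a perpendicular vegetation index; the soil-line coefficients and the reflectance values are illustrative assumptions, not values from the study (in practice the soil line is fitted to bare-soil pixels in the Landsat data):

```python
import numpy as np

# Hypothetical soil line fitted to bare-soil pixels: NIR = a * red + b.
a, b = 1.2, 0.04   # assumed slope and intercept, for illustration only

def pvi(red, nir):
    """Perpendicular vegetation index: perpendicular distance from the
    soil line. Bare soil gives ~0; vegetation (high NIR, low red
    reflectance) gives larger positive values."""
    return (nir - a * red - b) / np.sqrt(1.0 + a**2)

red = np.array([0.20, 0.08])   # toy reflectances: bare soil, vegetation
nir = np.array([0.28, 0.45])
vals = pvi(red, nir)
print(vals)   # first pixel sits on the soil line, second is vegetated
```

Because the index measures distance from the soil line rather than a simple band ratio, pixels with different soil brightness but no vegetation map to similar (near-zero) values.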

Experiencing certain events triggers the acquisition of new memories. Although necessary, however, actual experience is not sufficient for memory formation. One-trial learning is also gated by knowledge of appropriate background information to make sense of the experienced occurrence. Strong neurobiological evidence suggests that long-term memory storage involves formation of new synapses. On the short time scale, this form of structural plasticity requires that the axon of the pre-synaptic neuron be physically proximal to the dendrite of the post-synaptic neuron. We surmise that such “axonal-dendritic overlap” (ADO) constitutes the neural correlate of background information-gated (BIG) learning. The hypothesis is based on a fundamental neuroanatomical constraint: an axon must pass close to the dendrites that are near other neurons it contacts. The topographic organization of the mammalian cortex ensures that nearby neurons encode related information. Using neural network simulations, we demonstrate that ADO is a suitable mechanism for BIG learning. We model knowledge as associations between terms, concepts or indivisible units of thought via directed graphs. The simplest instantiation encodes each concept by single neurons. Results are then generalized to cell assemblies. The proposed mechanism results in learning real associations better than spurious co-occurrences, providing definitive cognitive advantages. PMID:25767887

This document describes the process followed to develop the Nevada National Security Site (NNSS) Integrated Sampling Plan (referred to herein as the Plan). It provides the Plan’s purpose and objectives, and briefly describes the Underground Test Area (UGTA) Activity, including the conceptual model and regulatory requirements as they pertain to groundwater sampling. Background information on other NNSS groundwater monitoring programs—the Routine Radiological Environmental Monitoring Plan (RREMP) and Community Environmental Monitoring Program (CEMP)—and their integration with the Plan are presented. Descriptions of the evaluations, comments, and responses of two Sampling Plan topical committees are also included.

Background Although described in several reports, imported malaria in Europe has not been surveyed nationwide with overall coverage of patients and individually rechecked background information. Plasmodium falciparum infections have been reported despite regularly taken appropriate chemoprophylaxis, yet the reliability of such questionnaire-based retrospective data has been questioned. This was the starting point for conducting a prospective nationwide survey of imported malaria in which compliance data were double-checked. Methods Data were collected on all cases of imported malaria confirmed and recorded by the reference laboratory of Finland (population 5.4 million) from 2003 to 2011, and these were compared with those reported to the National Infectious Disease Register (NIDR). Background information was gathered by detailed questionnaires sent to the clinicians upon diagnosis; missing data were obtained by telephoning the clinician or patient. Special attention was paid to compliance with chemoprophylaxis: self-reported use of anti-malarials was rechecked for all cases of P. falciparum. Results A total of 265 malaria cases (average annual incidence rate 0.5/100,000 population) had been recorded by the reference laboratory, all of them also reported to NIDR: 54% were born in malaria-endemic countries; 86% were currently living in non-endemic regions. Malaria was mainly (81%) contracted in sub-Saharan Africa. Plasmodium falciparum proved to be the most common species (72%). Immigrants constituted the largest group of travellers (44%). Pre-travel advice was received by 20% of those born in endemic regions and 81% of those from non-endemic regions. Of those with P. falciparum, 4% reported regular use of appropriate chemoprophylaxis (mefloquine or atovaquone/proguanil or doxycycline for regions with chloroquine-resistant and atovaquone/proguanil or doxycycline for regions with mefloquine-resistant P. falciparum); after individual rechecking, however, it was found that none

We developed a new, generalized fitting algorithm for multiparameter models that incorporates varying and correlated errors. This was combined with geometrical methods of sampling to explore model prediction space, notably to plot geodesics and to determine the size and edges of the model manifold. We illustrate this using the microwave background spectra for all possible universes, as described by the standard Λ-cold dark matter (Λ-CDM) cosmological model. In this case, the predicted data are fluctuations that are highly correlated and have varying errors, resulting in a manifold with a varying metric (the natural metric to use is given by the Fisher information matrix). Furthermore, the model manifold shares the hyperribbon structure seen in other models, with the edges forming a strongly distorted image of a hypercube. Practical applications of such an analysis include optimizing experimental instrumentation designed to test more detailed cosmological theories. Funding supported in part by NSERC.

We summarize progress that has been made on the determination of atomic data pertinent to the fusion energy program. Work is reported on the identification of spectral lines of impurity ions, spectroscopic data assessment and compilations, expansion and upgrade of the NIST atomic databases, collision and spectroscopy experiments with highly charged ions on EBIT, and atomic structure calculations and modeling of plasma spectra.

Evidence from observational cosmology and astrophysics indicates that about one third of the universe is matter, but that known baryonic matter contributes only about 4%. A large fraction of the universe is cold, non-baryonic matter, which plays an important role in the formation and evolution of the universe's structure. The leading candidates for non-baryonic dark matter are Weakly Interacting Massive Particles (WIMPs), which arise naturally in supersymmetric theories of particle physics. The Cryogenic Dark Matter Search (CDMS) experiment searches for evidence of a WIMP scattering off an atomic nucleus in crystals of Ge and Si by simultaneously measuring the phonon energy and ionization energy of the interaction in the CDMS detectors. The WIMP interaction energy ranges from a few keV to tens of keV, with a rate of less than 0.1 events/kg/day. To reach the goal of WIMP detection, the CDMS experiment has been conducted in the Soudan mine with an active muon veto and multistage passive background shields. The CDMS detectors have a low energy threshold and background-rejection capabilities based on ionization yield. However, betas from contamination and other radioactive sources produce surface interactions, which have low ionization yield, comparable to that of bulk nuclear interactions. The low-ionization surface electron recoils must be removed in the WIMP-search data analysis. An emphasis of this thesis is on developing the method of surface-interaction rejection using location information of the interactions, phonon energy distributions, and phonon timing parameters. The result of the CDMS Soudan Run 118 92.3-live-day WIMP-search data analysis is presented, and represents the most sensitive search yet performed.

This compilation, a draft training manual containing technical background information on internal combustion engines and alcohol motor fuel technologies, is presented in 3 parts. The first is a compilation of facts from the state of the art on internal combustion engine fuels and their characteristics and requisites and provides an overview of fuel sources, fuels technology and future projections for availability and alternatives. Part two compiles facts about alcohol chemistry, alcohol identification, production, and use, examines ethanol as spirit and as fuel, and provides an overview of modern evaluation of alcohols as motor fuels and of the characteristics of alcohol fuels. The final section compiles cross references on the handling and combustion of fuels for I.C. engines, presents basic evaluations of events leading to the use of alcohols as motor fuels, reviews current applications of alcohols as motor fuels, describes the formulation of alcohol fuels for engines and engine and fuel handling hardware modifications for using alcohol fuels, and introduces the multifuel engines concept. (LCL)

Investigation of Peelle's Pertinent Puzzle (PPP) by analytical and numerical simulation shows that if covariances of the experimental data are determined within the framework of the rigorous maximum-likelihood method (MLM), then the least-squares method (LSM) gives correct but unusual-looking results for the PPP. It is also shown that some restrictions and corrections outside the rigorous MLM framework lead to incorrect results instead of improved ones.

Background model updating is a vital process for any background subtraction technique. This paper presents an updating mechanism that can be applied efficiently to any background subtraction technique. This updating mechanism exploits the color and spatial features to characterize each detected object. Spatial and color features are used to classify each detected object as a moving background object, a ghost, or a real moving object. The starting position of each detected object is the cue for updating background images. In addition, this paper presents a hybrid scheme to detect and remove cast shadows based on texture and color features. The robustness of the proposed method and its effectiveness in overcoming challenging problems such as gradual and sudden illumination changes, ghost appearance, non-stationary background objects, the stability of moving objects most of the time, and cast shadows are verified quantitatively and qualitatively.
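A minimal sketch of such a selective update rule follows. It assumes a simple running-average background model and made-up blending rates; the paper's actual mechanism derives the per-object classification from color and spatial features, which is not reproduced here:

```python
import numpy as np

def update_background(bg, frame, fg_mask, alpha=0.05, ghost_alpha=0.5):
    """Selective running-average update (a generic sketch, not the
    paper's exact rule). Background pixels blend slowly into the model;
    pixels flagged as ghosts (stationary regions with no real object)
    are absorbed quickly; real moving objects leave the model untouched.
    fg_mask codes: 0 = background, 1 = ghost, 2 = real moving object."""
    bg = bg.astype(float)
    rate = np.where(fg_mask == 0, alpha,
                    np.where(fg_mask == 1, ghost_alpha, 0.0))
    return bg + rate * (frame - bg)

bg = np.full((4, 4), 100.0)          # current background model
frame = np.full((4, 4), 120.0)       # new frame, globally brighter
mask = np.zeros((4, 4), dtype=int)
mask[0, 0] = 1                       # one ghost pixel
mask[1, 1] = 2                       # one real moving-object pixel
new_bg = update_background(bg, frame, mask)
# ghost pixel absorbed quickly, object pixel unchanged, rest blend slowly
```

Driving the per-pixel rate from an object-level classification is what lets the model adapt to sudden illumination changes and ghosts without absorbing genuinely moving objects.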

This report summarizes recent results of integrated geological, geochemical, and geophysical field and laboratory studies conducted by the U.S. Geological Survey in the Cordova and Middleton Island 1°×3° quadrangles of coastal southern Alaska. Published open-file reports and maps accompanied by descriptive and interpretative texts, tables, diagrams, and pertinent references provide background information for a mineral-resource assessment of the two quadrangles. Mines in the Cordova and Middleton Island quadrangles produced copper and byproduct gold and silver in the first three decades of the 20th century. The quadrangles may contain potentially significant undiscovered resources of precious and base metals (gold, silver, copper, zinc, and lead) in veins and massive sulfide deposits hosted by Cretaceous and Paleogene sedimentary and volcanic rocks. Resources of manganese also may be present in the Paleogene rocks; uranium resources may be present in Eocene granitic rocks; and placer gold may be present in beach sands near the mouth of the Copper River, in alluvial sands within the canyons of the Copper River, and in smaller alluvial deposits underlain by rocks of the Valdez Group. Significant coal resources are present in the Bering River area, but difficult access and structural complexities have discouraged development. Investigation of numerous oil and gas seeps near Katalla in the eastern part of the area led to the discovery of a small, shallow field from which oil was produced between 1902 and 1933. The field has been inactive since, and subsequent exploration and drilling onshore near Katalla in the 1960's and offshore near Middleton Island on the outer continental shelf in the 1970's and 1980's was not successful.

... Secretary, Department of the Interior (DOI) announces the proposed extension of a public information....doi.gov . FOR FURTHER INFORMATION CONTACT: Requests for additional information on this renewed.... SUPPLEMENTARY INFORMATION: I. Abstract DOI is below parity with the Relevant Civilian Labor Force...

High-density short oligonucleotide microarrays are a primary research tool for assessing global gene expression. Background noise on microarrays comprises a significant portion of the measured raw data. A number of statistical techniques have been developed to correct for this background noise. Here, we demonstrate that probe minimum folding energy and structure can be used to enhance a previously existing model for background noise correction. We estimate that probe secondary structure accounts for up to 3% of all variation on Affymetrix microarrays. PMID:17387043

... collection of information was published on April 12, 2012 (77 FR 21992). One comment was received. This... Civil Rights, Office of the Secretary, Department of the Interior (DOI) has submitted to OMB for renewal..._Anderson@ios.doi.gov . FOR FURTHER INFORMATION CONTACT: Requests for additional information on...

Despite growing use of patient-facing technologies such as patient portals to address information needs for outpatients, we understand little about how patients manage information and use information technologies in an inpatient context. Based on hospital observations and responses to an online questionnaire from previously hospitalized patients and caregivers, we describe the information workspace that patients have available to them in the hospital and the information items that patients and caregivers rate as important and difficult to access or manage while hospitalized. We found that patients and caregivers desired information—such as the plan of care and the schedule of activities—that is difficult to access as needed in a hospital setting. Within this study, we describe the various tools and approaches that patients and caregivers use to help monitor their care as well as illuminate gaps in information needs not typically captured by traditional patient portals. PMID:26958295

... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF JUSTICE Federal Bureau of Investigation National Instant Criminal Background Check System Section Agency... in accordance with Title 5, Code of Federal Regulations (CFR), Sec. 1320.10. If you have...

This document presents an overview of groundwater- and surface water-related laws, regulations, agreements, guidance documents, Executive Orders, and DOE orders pertinent to the Idaho National Engineering Laboratory. This document is a summary and is intended to help readers understand which regulatory requirements may apply to their particular circumstances. However, the document is not intended to be used in lieu of applicable regulations. Unless otherwise noted, the information in this report reflects a summary and evaluation completed July 1, 1995. This document is considered a Living Document, and updates on changing laws and regulations will be provided.

Hard X-ray detectors for astronomical observations are currently being designed with advanced background-rejection capabilities, based on a high level of pixelisation and on fast signal processing. The development of such devices, based on room-temperature semiconductors such as CdTe or CdZnTe, proceeds through extensive testing programs normally based on ground campaigns using radioactive sources, X-ray tubes, and particle-beam accelerators. These methods show their limits, however, especially for measurements of the response to the different types of hadrons. First, we briefly review the knowledge of the primary sources of background and of the different radiation environments at both space and balloon altitudes, for which typical fluxes/rates are given. Then, we discuss how flying prototypes on high-altitude balloons can greatly help to test detector performance in an environment almost as severe as the conditions found in orbit, with detectors responding at very similar rates.

Peelle's Pertinent Puzzle (PPP) states a seemingly plausible set of measurements with their covariance matrix, which produce an implausible answer. To answer the PPP question, we describe a reasonable experimental situation that is consistent with the PPP solution. The confusion surrounding the PPP arises in part because of its imprecise statement, which permits a variety of interpretations and resulting answers, some of which seem implausible. We emphasize the importance of basing the analysis on an unambiguous probabilistic model that reflects the experimental situation. We present several different models of how the measurements quoted in the PPP problem could be obtained, and interpret their solution in terms of a detailed probabilistic analysis. We suggest a probabilistic approach to handling uncertainties about which model to use.

We try to understand the long-standing problem of Peelle's Pertinent Puzzle (PPP) using the Monte Carlo technique. We allow the probability density functions to take arbitrary forms in order to assess the impact of the distribution, and obtain the least-squares solution directly from numerical simulations. We found that the standard least-squares method gives the correct answer if a weighting function is properly provided. Results from numerical simulations show that the correct answer to the PPP is 1.1 ± 0.25 if the common error is multiplicative. The thought-provoking answer of 0.88 is also correct if the common error is additive and the error is proportional to the measured values. The least-squares method correctly gives us the most probable case, where the additive component has a negative value. Finally, the standard method fails for the PPP due to a distorted (non-Gaussian) joint distribution.
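The least-squares arithmetic behind the puzzle is compact. The sketch below assumes the commonly quoted PPP setup (two measurements of the same quantity, 1.5 and 1.0, each with a 10% independent error and a fully correlated 20% normalization error, both taken relative to the measured values); generalized least squares with that covariance matrix reproduces the thought-provoking 0.88:

```python
import numpy as np

# Commonly quoted PPP setup: two measurements of one quantity, with
# errors built from the measured values (the convention that triggers
# the puzzle).
y = np.array([1.5, 1.0])
stat = 0.10 * y                       # independent statistical errors
norm = 0.20 * y                       # fully correlated normalization errors
V = np.diag(stat**2) + np.outer(norm, norm)

# Generalized least squares for a single common mean:
#   x_hat = (1' V^-1 y) / (1' V^-1 1)
ones = np.ones_like(y)
Vinv = np.linalg.inv(V)
x_hat = ones @ Vinv @ y / (ones @ Vinv @ ones)
x_err = np.sqrt(1.0 / (ones @ Vinv @ ones))

print(f"{x_hat:.2f} +/- {x_err:.2f}")   # prints 0.88 +/- 0.22
```

The estimate falls below both measurements, which is the "implausible" feature; as the abstract notes, whether 0.88 is actually wrong depends on whether the common error is additive or multiplicative.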

Using cross-sectional data, we examined the financial information sources, financial knowledge, and financial practices of young adults, many of whom are first generation college students, ethnic minorities, and immigrants or children of immigrants. Participants (n = 1,249) were undergraduate students at a large regional comprehensive university.…

The implementation of traceability in the food supply chain has reinforced adoption of technologies with the ability to track forward and trace back product-related information. Based on the premise that these technologies can be used as a means to provide product-related information to consumers, this paper explores the perceived benefits and drawbacks of such technologies. The aim is to identify factors that influence consumers' perceptions of such technologies, and furthermore to advise the agri-food business on issues that they should consider prior to the implementation of such technologies in their production lines. For the purposes of the study, a focus group study was conducted across 12 European countries, while a set of four different technologies used as a means to provide traceability information to consumers was the focal point of the discussions in each focus group. Results show that the amount of and confidence in the information provided, perceived levels of convenience, impact on product quality and safety, impact on consumers' health and the environment, and potential consequences on ethical and privacy liberties constitute important factors influencing consumers' perceptions of technologies that provide traceability. PMID:19631704

Community structure detection in complex networks is important since it can help better understand the network topology and how the network works. However, there is still not a clear and widely-accepted definition of community structure, and in practice, different models may give very different results of communities, making it hard to explain the results. In this paper, different from the traditional methodologies, we design an enhanced semi-supervised learning framework for community detection, which can effectively incorporate the available prior information to guide the detection process and can make the results more explainable. By logical inference, the prior information is more fully utilized. The experiments on both the synthetic and the real-world networks confirm the effectiveness of the framework.

Safety assessments and environmental impact statements for nuclear fuel cycle facilities require an estimate of the amount of radioactive particulate material initially airborne (source term) during accidents. Pacific Northwest Laboratory (PNL) has surveyed the literature, gathering information on the amount and size of these particles that has been developed from limited experimental work, measurements made from operational accidents, and known aerosol behavior. Information useful for calculating both liquid and powder source terms is compiled in this report. Potential aerosol generating events discussed are spills, resuspension, aerodynamic entrainment, explosions and pressurized releases, comminution, and airborne chemical reactions. A discussion of liquid behavior in sprays, sparging, evaporation, and condensation as applied to accident situations is also included.

The effect of educational television background music on selective exposure and information acquisition was studied. Background music of slow tempo, regardless of its appeal, had negligible effects on attention and information acquisition. Rhythmic, fast-tempo background music, especially when appealing, significantly reduced visual attention to…

We develop a Hamiltonian formalism which can be used to discuss the physics of a massless scalar field in a gravitational background of a Schwarzschild black hole. Using this formalism we show that the time evolution of the system is unitary and yet all known results such as the existence of Hawking radiation can be readily understood. We then point out that the Hamiltonian formalism leads to interesting observations about black hole entropy and the information paradox.

Modern target recognition systems suffer from the lack of human-like abilities to understand the visual scene, detect, unambiguously identify and recognize objects. As result, the target recognition systems become dysfunctional if target doesn't demonstrate remarkably distinctive and contrast features that allow for unambiguous separation from background and identification upon such features. This is somewhat similar to visual systems of primitive animals like frogs, which can separate and recognize only moving objects. However, human vision unambiguously separates any object from its background. Human vision combines a rough but wide peripheral, and narrow but precise foveal systems with visual intelligence that utilize both scene and object contexts and resolve ambiguity and uncertainty in the visual information. Perceptual grouping is one of the most important processes in human vision, and it binds visual information into meaningful patterns and structures. Unlike the traditional computer vision models, biologically-inspired Network-Symbolic models convert image information into an "understandable" Network-Symbolic format, which is similar to relational knowledge models. The equivalent of interaction between peripheral and foveal systems in the network-symbolic system is achieved via interaction between Visual and Object Buffers and the top-level system of Visual Intelligence. This interaction provides recursive rough context identification of regions of interest in the visual scene and their analysis in the object buffer for precise and unambiguous separation of the object from background/clutter with following recognition of the target.

More stringent controls on the quality of wastewater discharges have given rise to increasing volumes of sewage sludge for disposal, principally to land, using either land-spreading or sludge-to-landfill operations. Current sludge-to-landfill methods generally involve mixing the concentrated sludge with other solid waste in municipal landfills. However, stricter waste disposal legislation and higher landfill taxes are forcing the water industry to look for more efficient disposal strategies. Landfill operators are also increasingly reluctant to accept sludge material in the slurry state because of construction difficulties and the potential for instability of the landfill slopes. The engineering and drying properties of a municipal sewage sludge are presented and applied, in particular, to the design, construction, and performance of sewage sludge monofills. Sludge handling and landfill construction are most effectively conducted within the water-content range between 85% (the optimum water content for standard Proctor compaction) and 95% (the sticky limit of the sludge material). Standard Proctor compaction of the sludge within this water-content range also achieves the maximum dry density of approximately 0.56 tonne/m3, which maximizes the storage capacity and, hence, the operational life of the landfill site. Undrained shear strength-water content data (pertinent to the stability of the landfill body during construction) and effective stress-strength parameters, which take into account the landfill age and the effects of ongoing sludge digestion, are presented. Landfill subsidence, which occurs principally because of creep and decomposition of the solid organic particles, is significant and continues indefinitely but at progressively slower rates. PMID:16022414

Mercury is a naturally occurring and widely used element that can cause health and ecological problems when released to the environment through human activities. Though a national and even international issue, the health and environmental impacts of mercury are best understood when studied at the local level. "Mercury: An Educator's Toolkit"…

Lipid biomarkers and their pertinent indices have been used as the most effective proxies for paleoclimate and paleoenvironment conditions. This paper conducts a systematic review on a variety of lipid biomarkers in aquatic sediments and water column that are used as proxies tracing paleoclimate and paleoenvironment information. The sources of those lipid biomarkers are autochthonous and/or allochthonous. General mechanisms of lipid biomarkers used as paleoclimate and paleoenvironment archives include characteristics of carbon chain distribution, temperature adaptation and combined temperature and humidity adaptation. Different lipid indices underpinned by the mechanisms are surrogates for the past precipitation, temperature and humidity as well as plant succession. We propose that the combined use of lipid indices and other biomarkers can expand the outlook of individual index, and provide a better understanding of paleoclimate and paleoenvironment reconstruction.

Infrared moving small target detection under complex cloud backgrounds is one of the key techniques of infrared search and track (IRST) systems. This paper proposes a novel method based on in-frame and inter-frame information to detect infrared moving small targets accurately. For a single frame, in the spatial domain, a directional max-median filter is developed to perform pre-processing, and a background suppression filtering template is applied to the denoised image to highlight targets. Then, targets in cloud regions and non-cloud regions are extracted by different thresholds according to a cloud discrimination method, so that a spatial domain map (SDM) is acquired. In the frequency domain, we design an α-DoB band-pass filter to conduct coarse saliency detection and then apply an amplitude transformation with smoothing, the so-called elaborate saliency detection. Furthermore, a frequency domain map (FDM) is acquired by an adaptive binary segmentation method. Lastly, candidate targets in the single frame are extracted by a discrimination based on intensity and spatial-distance criteria. For consecutive frames, false-alarm suppression is conducted on account of differences in motion features between moving targets and false alarms, to improve detection accuracy further. Extensive experiments demonstrate that the proposed method has satisfying detection effectiveness and robustness for infrared moving small target detection under complex cloud backgrounds.
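The max-median pre-processing step can be sketched as follows. This is a generic max-median background estimator (the paper's directional variant may differ in detail), and the 9×9 test image with a single bright pixel is an illustrative assumption:

```python
import numpy as np

def max_median_filter(img, L=5):
    """Max-median background estimate. For each pixel, take the median
    of the L samples centred on it along each of four directions
    (horizontal, vertical, both diagonals), then the maximum of the
    four medians. Small targets vanish from the estimate, so
    subtracting it from the image highlights them."""
    r = L // 2
    pad = np.pad(img.astype(float), r, mode='edge')
    H, W = img.shape
    dirs = [(0, 1), (1, 0), (1, 1), (1, -1)]
    medians = []
    for dy, dx in dirs:
        # stack the L shifted copies of the image along one direction
        stack = [pad[r + k*dy : r + k*dy + H, r + k*dx : r + k*dx + W]
                 for k in range(-r, r + 1)]
        medians.append(np.median(np.stack(stack), axis=0))
    return np.max(np.stack(medians), axis=0)

img = np.zeros((9, 9))
img[4, 4] = 10.0                           # a single small "target"
residual = img - max_median_filter(img)    # background-suppressed image
```

Extended structures such as cloud edges survive in at least one directional median, so they stay in the background estimate and are suppressed in the residual, while a point target is suppressed in none of them.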

An attractive property of ensemble data assimilation methods is that they provide flow dependent background error covariance estimates which can be used to update fields of observed variables as well as fields of unobserved model variables. Two methods to estimate background error covariances are introduced which share the above property with ensemble data assimilation methods but do not involve the integration of multiple model trajectories. Instead, all the necessary covariance information is obtained from a single model integration. The Space Adaptive Forecast error Estimation (SAFE) algorithm estimates error covariances from the spatial distribution of model variables within a single state vector. The Flow Adaptive error Statistics from a Time series (FAST) method constructs an ensemble sampled from a moving window along a model trajectory. SAFE and FAST are applied to the assimilation of Argo temperature profiles into version 4.1 of the Modular Ocean Model (MOM4.1) coupled to the GEOS-5 atmospheric model and to the CICE sea ice model. The results are validated against unassimilated Argo salinity data. They show that SAFE and FAST are competitive with the ensemble optimal interpolation (EnOI) used by the Global Modeling and Assimilation Office (GMAO) to produce its ocean analysis. Because of their reduced cost, SAFE and FAST hold promise for high-resolution data assimilation applications.
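The FAST idea of sampling an ensemble from a moving window along a single trajectory can be sketched on a toy state vector. The window length, the random-walk "model", the observation operator, and the error variances below are all illustrative assumptions, not the configuration used with MOM4.1:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "model trajectory": T snapshots of an n-variable state vector.
T, n = 40, 6
trajectory = np.cumsum(rng.normal(size=(T, n)), axis=0)

# FAST-style sampling: build the ensemble from a moving window along
# the single trajectory instead of integrating multiple models.
window = trajectory[-10:]                    # last 10 states (assumed length)
anoms = window - window.mean(axis=0)
B = anoms.T @ anoms / (len(window) - 1)      # background error covariance

# Optimal-interpolation update with one observation of variable 0.
H = np.zeros((1, n)); H[0, 0] = 1.0          # observe the first variable only
R = np.array([[0.1]])                        # observation error variance
y = np.array([trajectory[-1, 0] + 0.5])      # synthetic observation
xb = trajectory[-1]                          # background state
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R) # gain spreads the innovation
xa = xb + (K @ (y - H @ xb)).ravel()         # analysis state
# unobserved variables move through their covariance with variable 0
```

The point of the flow-dependent B is visible in the last line: a single observed variable updates the unobserved ones in proportion to their sampled covariances, exactly the property the abstract attributes to ensemble methods.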

Malpractice actions against surgical pathologists are still relatively uncommon, but they have increased in frequency over time and are associated with sizable indemnity figures. This discussion categorizes areas of liability in surgical pathology into three groups: those that represent health system flaws (problems with specimen identification, or transportation, or both; lack of clinical information or erroneous information; sampling effects and defects; and poorly reproducible or poorly defined diagnostic or prognostic criteria), others that exist at the interface between the system and individuals (allowing clinicians to bypass pathologic review of referred specimens; acceding to clinical demands for inadvisable procedures; and working in a disruptive environment), and truly individual errors by pathologists (lapses in reasoning; deficiencies concerning continuity in the laboratory; invalid assumptions regarding recipients of surgical pathology reports; over-reliance on the results of "special" tests; and problems with peer consultation). Finally, two important topic areas are discussed that commonly enter into lawsuits filed against surgical pathologists; namely, "delay in diagnosis" of malignant neoplasms and "failure to provide adequate prognostic information." Based on a review of the pertinent literature, we conclude that the clinical courses of most common malignancies are not affected in a significant manner by delays in diagnosis. Moreover, the practice of using "personalized external validity" for supposedly prognostic tests is examined, with the resulting opinion that prognostication of tumor behavior in individual patients is not reliable using anything but anatomic staging systems. PMID:17633350

Results are presented of some landing studies that may serve as guidelines in the consideration of landing problems of glider-reentry configurations. The effect of the initial conditions of sinking velocity, angle of attack, and pitch rate on impact severity and the effect of locating the rear gear in various positions are discussed. Some information is included regarding the influence of landing-gear location on effective masses. Preliminary experimental results on the slideout phase of landing include sliding and rolling friction coefficients that have been determined from tests of various skids and all-metal wheels.

Presents the procedures and findings of a study designed to identify principles in astronomy, geology, meteorology, oceanography, and physical geography pertinent to general education programs in junior high schools. (LC)

The Butte 1°×2° quadrangle in west-central Montana was investigated as part of the U.S. Geological Survey's Conterminous United States Mineral Assessment Program (CUSMAP). These investigations included geologic mapping, geochemical surveys, gravity and aeromagnetic surveys, examinations of mineral deposits, and specialized geochronologic and remote-sensing studies. The data collected during these studies were compiled, combined with available published and unpublished data, analyzed, and used in a mineral-resource assessment of the quadrangle. The results, including data, interpretations, and mineral-resource assessments for nine types of mineral deposits, are published separately as a folio of maps. These maps are accompanied by figures, tables, and explanatory text. This circular provides background information on the Butte quadrangle, summarizes the studies and published maps, and lists a selected bibliography of references pertinent to the geology, geochemistry, geophysics, and mineral resources of the quadrangle. The Butte quadrangle, which includes the world-famous Butte mining district, has a long history of mineral production. Many mining districts within the quadrangle have produced large quantities of many commodities; the most important in dollar value of production were copper, gold, silver, lead, zinc, manganese, molybdenum, and phosphate. At present, mines at several locations produce copper, molybdenum, gold, silver, lead, zinc, and phosphate. Exploration, mainly for gold, has indicated the presence of other mineral deposits that may be exploited in the future. The results of the investigations by the U.S. Geological Survey indicate that many areas of the quadrangle are highly favorable for the occurrence of additional undiscovered resources of gold, silver, copper, molybdenum, tungsten, and other metals in several deposit types.

The Ad Hoc Committee on Future Information Services explored possible future directions for information services at Texas A&M University Library and developed a plan to guide the library into the next decade in terms of automated access to information. In exploring future directions for automated information services, the committee members…

Corrective Action Site (CAS) 02-37-02, Gas Sampling Assembly, is associated with nuclear test MULLET. MULLET was an underground safety test conducted on October 17, 1963. The experiment also involved prompt sampling of particulate material from the detonation, similar to CAS 09-99-06, Gas Sampling Assembly, which is associated with PLAYER/YORK. The sampling system at MULLET was similar to that of PLAYER/YORK and was used to convey gas from the MULLET emplacement hole (U2ag) to a sampling assembly. Beyond the sampling assembly, the system had a 'Y' junction with one branch running to a filter unit and the other running to a scrubber unit. The total system length was approximately 250 feet and is depicted on the attached drawing. According to the available background information, retrieval of the sample material from the MULLET event caused significant alpha (plutonium) contamination, limited to an area near ground zero (GZ). Test support Radiological Control Technicians did not detect contamination outside the immediate GZ area. In addition, vehicles, equipment, and workers that were contaminated were decontaminated on site. Soil contamination was addressed through the application of oil, and the site was decommissioned after the test. Any equipment that could be successfully decontaminated and had a future use was removed from the site. The contaminated equipment and temporary buildings erected to support the test were buried on site, most likely in the area under the dirt berm. The exact location of the buried equipment and temporary buildings is unknown. No information was found describing the disposition of the filter and scrubber, but they are not known to be at the site. The COMMODORE test was conducted at U2am on May 20, 1967, and formed the crater next to CAS 02-37-02. The COMMODORE test area had been surveyed prior to the test, and alpha contamination was not identified. Furthermore, alpha contamination was not identified during the COMMODORE re

This document provides some background on early childhood planning and system building around the country. Since mid-December, the author has been studying these efforts for Child Care, Inc. (CCI), interviewing national experts and reading widely. This outline provides insights and lessons learned from those inquiries. The goal at this meeting will…

This report examines access to lifelong learning opportunities on Canada's information highway. The report begins with a glossary and a learner-centered model in which the information highway links learners with learning opportunities provided through educational institutions, community organizations, government, and business and industry.…

Vapor explosions are explosive events resulting from the mixing of two liquids, one of which is heated to a temperature well above the boiling point of the second. Under some circumstances mixing of the liquids can boil part of the lower boiling liquid so quickly that the expanding vapor generates a strong pressure wave and explosion. If the lower boiling liquid is water, as is frequently the case, the event is called a "steam explosion". Analyses in support of the K-Reactor Probabilistic Risk Assessment have shown that steam explosions generated by the interaction of molten reactor fuel with water contribute significantly to the risk of reactor operation at the SRS. This calculated risk incorporates a conservative treatment of the uncertainties associated with such explosions. Study of steam explosions involving molten reactor materials has been included in the Severe Accident Analysis Program (SAAP) in order to obtain a better evaluation of their importance, and, if possible, to find ways to avoid them. This paper presents a brief review and summary of steam explosion experience from literature accounts, along with the results of experimental studies from the SAAP. It concludes with an evaluation of current knowledge, and suggestions for future development. 71 refs.

Measurements of exposure rates using thermoluminescent dosimeters placed within residences in the Oak Ridge/Knoxville area are presented. The objective of this investigation was to determine the radiation component acquired by Oak Ridge National Laboratory employee personnel dosimeter-security badges during residential badge storage and to develop a model to predict the radiation exposure rate in Oak Ridge/Knoxville-area homes. The exposure rates varied according to building material used and geographic location. Exposure rates were higher in the fall and lower in the spring; stone residences had a higher average dose equivalent rate than residences made of wood. An average yearly exposure rate was determined to be 78 millirems per year for the Oak Ridge-area homes. This value can be compared to the natural background radiation dose equivalent rate in the United States of 80 to 200 millirems per year.

Long-term stewardship is expected to be needed at more than 100 DOE sites after DOE's Environmental Management program completes disposal, stabilization, and restoration operations to address waste and contamination resulting from nuclear research and nuclear weapons production conducted over the past 50 years. From Cleanup to Stewardship provides background information on the Department of Energy (DOE) long-term stewardship obligations and activities. This document begins to examine the transition from cleanup to long-term stewardship, and it fulfills the Secretary's commitment to the President in the 1999 Performance Agreement to provide a companion report to the Department's Accelerating Cleanup: Paths to Closure report. It also provides background information to support the scoping process required for a study on long-term stewardship required by a 1998 Settlement Agreement.

This document is designed to assist the Council of Educational Facility Planners International (CEFP/I) in planning for the establishment of an information system for its members and other stakeholders who need information on educational facilities. The report focuses on the major activities to be accomplished and the issues to be considered when…

In this paper we present a modification to the standard two- alternative forced-choice (2AFC) experiment in an attempt to help the human detect signals by providing redundant information. We call the old experiment 2AFC_RAW, and the new experiment, 2AFC_FILTER. In the 2AFC_FILTER experiment, we provide the observer with the pair of raw data images (as in 2AFC_RAW) plus filtered versions of the raw data. The thought behind this modification is that the human might benefit from generic pre-processing of the data into multiple images, each extracting different information. We defined two different 2AFC_FILTER experiments, each using Laguerre-Gauss functions as the filters. The difference between the two was their defining Gaussian envelope. We tested human performance given a variety of image classes with the 2AFC_RAW and the two 2AFC_FILTER experiments. The same raw data were used in each. We found that there was a significant human performance increase from the 2AFC_RAW to the 2AFC_FILTER experiment. It was also seen that the choice of the filters made a difference. Specifically, human performance was better when the Gaussian envelope of the Laguerre-Gauss functions matched the signal.
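Laguerre-Gauss functions of the kind used as filters above can be generated from the Laguerre polynomials under a Gaussian envelope, a common form in model-observer work being LG_n(r) = exp(-u/2) L_n(u) with u = 2πr²/a², where a is the envelope width the study varied. The exact normalization used by the authors is not stated, so the following is a sketch; the grid size, envelope width, and number of orders are assumptions:

```python
import numpy as np
from scipy.signal import fftconvolve
from scipy.special import eval_laguerre

def laguerre_gauss_filter(size, n, a):
    """2-D rotationally symmetric Laguerre-Gauss function of order n with
    Gaussian envelope width `a` (pixels), on a size x size centred grid."""
    c = (size - 1) / 2.0
    y, x = np.mgrid[0:size, 0:size]
    r2 = (x - c) ** 2 + (y - c) ** 2
    u = 2.0 * np.pi * r2 / a ** 2
    return np.exp(-u / 2.0) * eval_laguerre(n, u)

# A small filter bank: the first four orders for one envelope width,
# applied to a stand-in "raw" image to produce the extra filtered views
# shown to the observer in a 2AFC_FILTER-style trial.
bank = [laguerre_gauss_filter(64, n, a=14.0) for n in range(4)]
img = np.random.default_rng(1).standard_normal((64, 64))   # stand-in raw image
views = [fftconvolve(img, f, mode="same") for f in bank]   # one view per order
```

Each order emphasizes a different radial frequency band, which matches the abstract's rationale that the filtered views extract different information from the same raw data.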

: The term neurotropic melanoma has been used to refer to malignant melanoma with associated infiltration of nerve or "neural differentiation"--that is, melanoma cells exhibiting cytological characteristics of nerve cells. Historically, neurotropic melanoma has generally been discussed within the context of desmoplastic melanoma. We report an exceptional case of melanoma notable for a very well-differentiated neural component that was contiguous with obvious overlying melanoma. After careful consideration of all pertinent histological features, the overall diagnostic impression was that of melanoma with associated "malignant neurotization." We have not encountered a previously reported case with such a well-differentiated neural component. The following article details our exceptional case of melanoma with "malignant neurotization" and presents a discussion of the differential diagnosis and brief review of the pertinent literature. PMID:23782676

Purpose of this study is to explore the status of R and D in Japan and the ability of US researchers to keep abreast of Japanese technical advances. US researchers familiar with R and D activities in Japan were interviewed in ten fields that are relevant to the more efficient use of energy: amorphous metals, biotechnology, ceramics, combustion, electrochemical energy storage, heat engines, heat transfer, high-temperature sensors, thermal and chemical energy storage, and tribology. The researchers were questioned about their perceptions of the strengths of R and D in Japan, comparative aspects of US work, and the quality of available information sources describing R and D in Japan. Of the ten related fields, the researchers expressed a strong perception that significant R and D is under way in amorphous metals, biotechnology, and ceramics, and that the US competitive position in these technologies will be significantly challenged. Researchers also identified alternative emphases in Japanese R and D programs in these areas that provide Japan with stronger technical capabilities. For example, in biotechnology, researchers noted the significant Japanese emphasis on industrial-scale bioprocess engineering, which contrasts with a more meager effort in the US. In tribology, researchers also noted the strength of the chemical tribology research in Japan and commented on the effective mix of chemical and mechanical tribology research. This approach contrasts with the emphasis on mechanical tribology in the US.

"Backgrounds in Language," a field-tested inservice course designed for use by groups of 15 or 25 language arts teachers, provides the subject matter background teachers need to make informed decisions about what curriculum materials to use in what way, at what time, and with which students. The course is comprised of eight 2-hour sessions,…

Basic dynamics, sensor, control, and related artificial intelligence issues pertinent to smart robotic hands for the Extra Vehicular Activity (EVA) Retriever system are summarized and discussed. These smart hands are to be used as end effectors on arms attached to manned maneuvering units (MMU). The Retriever robotic systems comprised of MMU, arm and smart hands, are being developed to aid crewmen in the performance of routine EVA tasks including tool and object retrieval. The ultimate goal is to enhance the effectiveness of EVA crewmen.

In this era when many Americans seem resigned to greater encroachments on their personal privacy due to the growth and ubiquity of electronic databases with information about almost every aspect of their lives, a recent statement from the American Association of University Professors (AAUP) seems timely. The statement highlighted the issue of…

... plant, but many compounds may be responsible for valerian's relaxing effect. Are botanical dietary supplements safe? Many ... before their full effects are achieved. For example, valerian may be effective as a sleep aid after ...

This paper describes an original approach to generating scenarios for the purpose of testing the algorithms used to detect special nuclear materials (SNM) that incorporates the use of ontologies. Separating the signal of SNM from the background requires sophisticated algorithms. To assist in developing such algorithms, there is a need for scenarios that capture a very wide range of variables affecting the detection process, depending on the type of detector being used. To provide such a capability, we developed an ontology-driven information system (ODIS) for generating scenarios that can be used in creating scenarios for testing of algorithms for SNM detection. The ontology-driven scenario generator (ODSG) is an ODIS based on information supplied by subject matter experts and other documentation. The details of the creation of the ontology, the development of the ontology-driven information system, and the design of the web user interface (UI) are presented along with specific examples of scenarios generated using the ODSG. We demonstrate that the paradigm behind the ODSG is capable of addressing the problem of semantic complexity at both the user and developer levels. Compared to traditional approaches, an ODIS provides benefits such as faithful representation of the users' domain conceptualization, simplified management of very large and semantically diverse datasets, and the ability to handle frequent changes to the application and the UI. Furthermore, the approach makes possible the generation of a much larger number of specific scenarios based on limited user-supplied information.

This report analyzes research results and other pertinent information on the biological effects of radiofrequency radiation (RFR). The frequency range of primary interest is 10 kHz to 300 GHz. The main purpose of this report is to serve as a basic reference for other documents dealing with the environmental impact of proposed or currently operating USAF emitter systems with regard to health and safety aspects of exposure to RFR. The report is divided into 7 sections covering various aspects of the present knowledge regarding biological effects of RFR. More than 600 references from the world's literature are cited.

The Nabesna quadrangle in south-central Alaska is the first of the 1:250,000-scale Alaskan quadrangles to be investigated by an interdisciplinary research team in order to furnish a mineral resource assessment of the State. The assessment of the 17,600-km² (6,800-mi²) quadrangle is based on field and laboratory investigations of the geology, geochemistry, geophysics, and satellite imagery. The results of the investigations are published as a folio of maps, diagrams, and accompanying discussions. This report provides background information on the investigations and integrates the published components of the resource assessment. A comprehensive bibliography cites both specific and general references to the geology and mineral deposits of the Nabesna quadrangle.

The clinical history and indication (CHI) provided with a radiological examination are critical components of a quality interpretation by the radiologist. A patient's chronic conditions offer the context in which acute symptoms and findings can be interpreted more accurately. Seven pertinent (potentially diagnosis altering) chronic conditions, which are fairly prevalent at our institution, were selected. We analyze if and how in 140 CHIs there was mention of a patient's previously reported chronic condition and if and how the condition was subsequently described in the radiology report using a four-item scheme (Mention/Specialization, Generalization, Common comorbidity, No mention). In 40.7% of CHIs, the condition was rated Mention/Specialization. Therefore, we reject our first hypothesis that the CHI is a reliable source for obtaining pertinent chronic conditions (≥ 90.0%). Non-oncological conditions were significantly more likely to be rated No mention in the CHI than oncological conditions (58.7 versus 8.3%, P < 0.0001). Stat cases were significantly more frequently rated No mention than non-stat cases (60.0 versus 31.3%, P = 0.0134). We accept our second hypothesis that the condition's rating in the CHI is significantly correlated with its rating in the final radiology report (χ² test, P < 0.00001). Our study demonstrates an alarming lack of communication of pertinent medical information to the radiologist, which may negatively impact interpretation quality. Presenting automatically aggregated patient information to the radiologist may be a potential avenue for improving interpretation and adding value of the radiology department to the care chain. PMID:25533493
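The reported association between the CHI rating and the report rating corresponds to a chi-squared test of independence on the 4×4 contingency table of rating pairs under the four-item scheme. A sketch with purely illustrative counts (chosen only to sum to 140 cases; they are not the study's data):

```python
from scipy.stats import chi2_contingency

# Hypothetical 4x4 contingency table: rows = CHI rating, columns = report
# rating, order: Mention/Specialization, Generalization, Common comorbidity,
# No mention. Counts are illustrative, not taken from the study.
table = [
    [40,  5,  3,  9],
    [ 4, 12,  2,  6],
    [ 2,  3,  8,  4],
    [ 6,  4,  5, 27],
]
chi2, p, dof, expected = chi2_contingency(table)
significant = p < 0.00001   # the significance threshold quoted in the abstract
```

For a 4×4 table the test has (4-1)×(4-1) = 9 degrees of freedom; a heavy diagonal like the one above is exactly the pattern that would drive the correlation the study reports.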

This manual provides information to assist schools and school systems to collect student background information as required by Education Ministers. The purpose is to enable nationally comparable reporting of students' outcomes against the National Goals for Schooling in the Twenty-First Century. It involves the collection of information on…

Concise background information on the People's Republic of China is provided. The publication begins with a profile of the country, outlining the people, geography, economy, and membership in international organizations. The bulk of the document then discusses in more detail China's people, geography, history, government, education, economy, and…

Argues that when introductory activities to the classics begin with background information, it can upstage or confine the life of the story, and shows little faith in the students as readers or in the literature itself. Suggests sometimes letting the literature begin, and then helping students make sense of it. Discusses examples from "To Kill a…

Neuroendocrine neoplasias pertinent to urology are increasingly attracting research contributions, mainly with regard to urinary tract paragangliomas, in addition to the neuroendocrine differentiation of prostate cancer. Among such visceral sympathetic paragangliomas, considerable attention is drawn to those involving the renal pelvis, urinary bladder and, particularly, the prostate gland. Essential catecholamine/adrenergic signal-mediated pathophysiological implications and outlined diagnostic approaches are considered here. In particular, an accurate functional diagnostic assessment requires both plasma and urine catecholamine level tests together with a 123I- or 131I-meta-iodobenzylguanidine (MIBG) scan; 131I-labeled MIBG, unlike 123I, also proves useful for targeted radionuclide therapy of sympathetic paragangliomas. Nevertheless, thorough diagnostic confirmation should be obtained by a proper histologic/immunohistochemical study, which highlights the typical "zellballen" cell arrangement and neuroendocrine tumor cell-specific biomarkers such as chromogranin-A, synaptophysin, and neuron-specific enolase. Open, laparoscopic, or robot-assisted surgical procedures are performed under α1- (doxazosin, prazosin) and β- (propranolol) adrenergic blockade to avoid the risk of an intraoperative adrenergic signal-triggered hypertensive crisis, which may also occur during cystoscopy and biopsy in cases of bladder or prostate paraganglioma. Given a conceivable likeness, in some adrenergic-mediated pathophysiological implications, between prostate paraganglioma and prostate cancer neuroendocrine transdifferentiation, although these are two obviously different diseases, the pathogenesis of prostate paraganglioma calls for novel research approaches. PMID:27381689

Model calculations for the production of cosmic ray events in IR detectors by energy impulses due to fast charged particles' ionization trails are presently compared to the pulse-amplitude spectrum observed from a balloon at an altitude of 38 km. The results are pertinent to the current understanding of cosmic ray backgrounds found in all high sensitivity bolometer applications. The observed signal transients are in all details consistent with the modeling of known cosmic charged particle flux characteristics and with the detector response. Generally, the optics design should minimize detector/substrate cross section.

Infrared thermography has been recently proposed as an access technology for individuals with disabilities, but body functions and structures pertinent to its use have not been documented. Seven clients (2 adults, 5 youth) with severe disabilities and their primary caregivers participated in this study. All clients had a Gross Motor Functional…

...-all method of support has been selected; (b) A certification that: (1) For each person who is included... required by § 152.86(a) for the use the any exclusive use study that would be pertinent to the applicant's... determine the amount and terms of compensation, if any, to be paid for use of any study; and (v)...

This document is a compilation of background readings for the user of Computerized Placement Tests (CPTs) developed by the College Board for student placement purposes. CPTs are computerized adaptive tests that test the individual abilities and backgrounds of examinees. CPTs are part of the ACCUPLACER student information management system. The…

This issue exemplifies the types of articles that JABFM publishes to advance family medicine. We have articles on the implications of health system organizational structures. Three of these are international articles at the level of the national health system (1 from China) and systematic local health interventions (1 from Canada and 1 from Netherlands). Inside the United States, where there are more family physicians, there is less obesity, and designation as a Patient Centered Medical Home is related to increased rates of colorectal cancer screening. Review articles on common clinical topics discuss treatments that are changing (acne in pregnancy) or lack consensus (distal radial fractures). We have articles on making life easier in the office, such as for predicting Vitamin D levels, osteoporosis, and pre-diabetes in normal weight adults. There are articles to raise awareness of the "newest" testing or treatments, that is, auditory brainstem implants. "Reminder" articles highlight known entities that need to be reinforced to prevent over-/underdiagnosis or treatment, for example, "cotton fever." Another article discusses the increased risk for postoperative complications with sleep apnea. We also provide "thought" pieces, in this case about the terminology we are using to extend our concept of patient-centered medical homes. PMID:26957371

A project whose purposes were to foster bilingualism, forestall anticipated difficulties in science, and provide the motivation and course requirements essential for success in high school is described and assessed in this report. Two seventh-grade classes of similar age, language background, and abilities were given the same or equivalent program…

The aim of this study was to obtain unbiased estimates of the diversity parameters, the population history, and the degree of admixture in Cika cattle which represents the local admixed breeds at risk of extinction undergoing challenging conservation programs. Genetic analyses were performed on the genome-wide Single Nucleotide Polymorphism (SNP) Illumina Bovine SNP50 array data of 76 Cika animals and 531 animals from 14 reference populations. To obtain unbiased estimates we used short haplotypes spanning four markers instead of single SNPs to avoid an ascertainment bias of the BovineSNP50 array. Genome-wide haplotypes combined with partial pedigree and type trait classification show the potential to improve identification of purebred animals with a low degree of admixture. Phylogenetic analyses demonstrated unique genetic identity of Cika animals. Genetic distance matrix presented by rooted Neighbour-Net suggested long and broad phylogenetic connection between Cika and Pinzgauer. Unsupervised clustering performed by the admixture analysis and two-dimensional presentation of the genetic distances between individuals also suggest Cika is a distinct breed despite being similar in appearance to Pinzgauer. Animals identified as the most purebred could be used as a nucleus for a recovery of the native genetic background in the current admixed population. The results show that local well-adapted strains, which have never been intensively managed and differentiated into specific breeds, exhibit large haplotype diversity. They suggest a conservation and recovery approach that does not rely exclusively on the search for the original native genetic background but rather on the identification and removal of common introgressed haplotypes would be more powerful. Successful implementation of such an approach should be based on combining phenotype, pedigree, and genome-wide haplotype data of the breed of interest and a spectrum of reference breeds which potentially have had

Background: Legionella species may colonize home water systems and cause Legionnaires’ disease (LD). We herein report two cases of sporadic LD associated with the solar energy-heated hot water systems of the patients’ houses. Case Report: A 60-year-old woman with chronic bronchitis and diabetes mellitus presented with a high fever, abdominal pain, and diarrhea. Physical examination revealed rales, and her chest radiograph showed a homogeneous density in the left lung. The Legionella urinary antigen test was positive, and an indirect fluorescent antibody test revealed a serum antibody titer of 1/520 for L. pneumophila serogroup 1. In the second case, a 66-year-old man with diabetes mellitus was treated for pneumonia at another hospital. After the patient’s general condition worsened and he required mechanical ventilation, he was referred to our hospital. The Legionella urinary antigen test was positive. Neither of the patients had been hospitalized or travelled within the previous month. Both patients used hot water storage tanks heated by solar energy; both also used an electrical device in the bathroom to heat the water when solar energy alone was insufficient. The hot water samples from the residences of both patients were positive for L. pneumophila serogroup 1. Conclusion: These cases show that domestic hot water systems heated by solar energy must be considered a possible source of community-acquired LD. PMID:27308081

The Strategic Defense Initiative Organization has created data centers for midcourse, plumes, and backgrounds phenomenologies. The Backgrounds Data Center (BDC) has been designated as the prime archive for data collected by SDIO programs. The BDC maintains a Summary Catalog that contains 'metadata,' that is, information about data, such as when the data were obtained, what the spectral range of the data is, and what region of the Earth or sky was observed. Queries to this catalog result in a listing of all data sets (from all experiments in the Summary Catalog) that satisfy the specified criteria. Thus, the user can identify different experiments that made similar observations and order them from the BDC for analysis. On-site users can use the Science Analysis Facility (SAF) for this purpose. For some programs, the BDC maintains a Program Catalog, which can classify data in as many ways as desired (rather than just by position, time, and spectral range as in the Summary Catalog). For example, data sets could be tagged with such diverse parameters as solar illumination angle, signal level, or the value of a particular spectral ratio, as long as these quantities can be read from the digital record or calculated from it by the ingest program. All unclassified catalogs and unclassified data will be remotely accessible.

The Backgrounds Data Center (BDC) is the designated archive for backgrounds data collected by Ballistic Missile Defense Organization (BMDO) programs, some of which include ultraviolet sensors. Currently, the BDC holds ultraviolet data from the IBSS, UVPI, UVLIM, and FUVCAM sensors. The BDC will also be the prime archive for Midcourse Space Experiment (MSX) data and is prepared to negotiate with program managers to handle other datasets. The purpose of the BDC is to make data accessible to users and to assist them in analyzing it. The BDC maintains the Science Catalog Information Exchange System (SCIES) allowing remote users to log in, read or post notices about current programs, search the catalogs for datasets of interest, and submit orders for data. On-site facilities are also available for the analysis of data, and consist of VMS and UNIX workstations with access to software analysis packages such as IDL, IRAF, and Khoros. Either on-site or remotely, users can employ the BDC-developed graphical user interface called the Visual Interface for Space and Terrestrial Analysis (VISTA) to generate catalog queries and to display and analyze data. SCIES and VISTA permit nearly complete access to BDC services and capabilities without the need to be physically present at the data center.

The Tonopah 1° by 2° quadrangle in south-central Nevada was studied by an interdisciplinary research team to appraise its mineral resources. The appraisal is based on geological, geochemical, and geophysical field and laboratory investigations, the results of which are published as a folio of maps, figures, and tables, with accompanying discussions. This circular provides background information on the investigations and integrates the information presented in the folio. The selected bibliography lists references to the geology, geochemistry, geophysics, and mineral deposits of the Tonopah 1° by 2° quadrangle.

The Reno 1° by 2° quadrangle in west-central Nevada was studied by an interdisciplinary research team to appraise its mineral resources. The assessment is based on geological, geochemical, and geophysical field and laboratory investigations, the results of which are published as a folio of maps, reports, figures, and tables, with accompanying discussions. This circular provides background information on the investigations and integrates the information presented in the folio. The selected bibliography lists references to the geology, geochemistry, geophysics, and mineral deposits of the Reno 1° by 2° quadrangle.

The Walker Lake 1° by 2° quadrangle in eastern California and western Nevada was studied by an interdisciplinary research team to appraise its mineral resources. The appraisal is based on geological, geochemical, and geophysical field and laboratory investigations, the results of which are published as a folio of maps, figures, and tables, with accompanying discussions. This circular provides background information on the investigations and integrates the information presented in the folio. The selected bibliography lists references to the geology, geochemistry, geophysics, and mineral deposits of the Walker Lake 1° by 2° quadrangle.

The Medford 1° by 2° quadrangle in southern Oregon and northern California was studied by an interdisciplinary research team to appraise its mineral resources. The appraisal is based on geological, geochemical, and geophysical field and laboratory investigations, the results of which are published as a folio of maps, figures, and tables, with accompanying discussions. This circular provides background information on the investigations and integrates the information presented in the folio. The bibliography lists selected references to the geology, geochemistry, geophysics, and mineral deposits of the Medford 1° by 2° quadrangle.

Firing coal mixtures and combusting coal jointly with solid biomass in coal-fired boilers can provide better conditions for igniting low-reactivity coal (anthracite), achieve a higher fuel burnout ratio, and alleviate the shortage of particular grades of coal. Results from studying the synergetic effect revealed previously during the combustion of coal mixtures in flames are presented. A similar effect was also obtained during joint combustion of coal and wood in a flame. The kinetics of combustion of char mixtures obtained from coals characterized by different degrees of metamorphism, and the kinetics of combustion of wood chars, were studied on the RSK-1D laboratory setup. The experiments showed that the combustion rate of char mixtures obtained from coals having close degrees of metamorphism equals the weighted mean rate with respect to the carbon content. The combustion rate of char mixtures obtained from coals having essentially different degrees of metamorphism is close to the combustion rate of the more reactive coal at the beginning of the process and to the combustion rate of the less reactive coal at the end of the process. A dependence of the specific burnout rate of carbon contained in the char of two wood fractions on reciprocal temperature in the range 663-833 K is obtained. The combustion mode of an experimental sample is determined together with the reaction rate constant and activation energy.
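
The reaction rate constant and activation energy mentioned above follow from the Arrhenius relation k = A·exp(-Ea/RT): plotting ln k against reciprocal temperature 1/T gives a straight line whose slope is -Ea/R. Below is a minimal sketch of such a fit on synthetic data (the function name and numbers are illustrative, not values measured in the study):

```python
import math

R = 8.314  # J/(mol*K), universal gas constant

def arrhenius_fit(T, k):
    """Least-squares fit of ln k = ln A - Ea/(R*T).
    Returns (A, Ea) with Ea in J/mol."""
    x = [1.0 / t for t in T]
    y = [math.log(v) for v in k]
    n = len(x)
    xm, ym = sum(x) / n, sum(y) / n
    slope = (sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y))
             / sum((xi - xm) ** 2 for xi in x))
    return math.exp(ym - slope * xm), -slope * R

# synthetic burnout rates over the 663-833 K range with Ea = 100 kJ/mol
T = [663, 700, 750, 800, 833]
k = [2.0e5 * math.exp(-100e3 / (R * t)) for t in T]
A, Ea = arrhenius_fit(T, k)
print(round(Ea / 1000, 1), "kJ/mol")
```

On exact synthetic data the fit recovers the input parameters; on measured burnout rates, scatter around the fitted line also indicates whether combustion stayed in the kinetic regime.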

The objective of this study was to provide a primer on the environmental effects that can affect the durability of nuclear power plant concrete structures. As concrete ages, changes in its properties will occur as a result of continuing microstructural changes (i.e., slow hydration, crystallization of amorphous constituents, and reactions between cement paste and aggregates), as well as environmental influences. These changes do not have to be detrimental to the point that concrete will not be able to meet its performance requirements. Concrete, however, can suffer undesirable changes with time because of improper specifications, a violation of specifications, or adverse performance of its cement paste matrix or aggregate constituents under either physical or chemical attack. Contained in this report is a discussion on concrete durability and the relationship between durability and performance, a review of the historical perspective related to concrete and longevity, a description of the basic materials that comprise reinforced concrete, and information on the environmental factors that can affect the performance of nuclear power plant concrete structures. Commentary is provided on the importance of an aging management program.

Attempts to determine costs in the intensive care unit (ICU) have so far been unsuccessful, as they failed to detect differences in costs between patients. The methodology and/or the instruments used might be at the origin of this failure. Based on the results of the European ICU studies and on descriptions of the activities of care in the ICU, we gathered and analysed the relevant literature concerning the monitoring of costs in the ICU. The aim was to formulate a methodology, from an economic perspective, in which future research may be framed. A bottom-up microcosting methodology makes it possible to distinguish costs between patients. The resulting information will at the same time support the decision-making of top management and be ready to include in the financial system of the hospital. Nursing staff accounts for about 30% of total costs. This relation remains constant irrespective of the annual nurse/patient ratio. In contrast with other scoring instruments, the nursing activities score (NAS) covers all nursing activities. (1) NAS is to be chosen for quantifying nursing activities; (2) an instrument for measuring the physician's activities is not yet available; (3) because the nursing activities have a large impact on total costs, the standardisation of the processes of care (following the system approach) will contribute to managing costs, also making the issue of quality of care reproducible; (4) the quantification of the nursing activities may be the required (proxy) input for the automated bottom-up monitoring of costs in the ICU. PMID:22967197

Important sources of background for PEP experiments are studied. Background particles originate from high-energy electrons and positrons which have been lost from stable orbits, γ-rays emitted by the primary beams through bremsstrahlung in the residual gas, and synchrotron radiation x-rays. The effect of these processes on the beam lifetime is calculated and estimates of background rates at the interaction region are given. Recommendations for the PEP design, aimed at minimizing background, are presented. 7 figs., 4 tabs.

This article makes the case for the importance of background knowledge in children's comprehension. It suggests that differences in background knowledge may account for differences in understanding text for low- and middle-income children. It then describes strategies for building background knowledge in the age of Common Core standards.

Previous studies have shown that the perceptual organization of the visual scene constrains the deployment of attention. Here we investigated how the organization of multiple elements into larger configurations alters their attentional weight, depending on the "pertinence" or behavioral importance of the elements' features. We assessed object-based effects on distinct aspects of the attentional priority map: top-down control, reflecting the tendency to encode targets rather than distracters, and the spatial distribution of attention weights across the visual scene, reflecting the tendency to report elements belonging to the same rather than different objects. In 2 experiments participants had to report the letters in briefly presented displays containing 8 letters and digits, in which pairs of characters could be connected with a line. Quantitative estimates of top-down control were obtained using Bundesen's Theory of Visual Attention (1990). The spatial distribution of attention weights was assessed using the "paired response index" (PRI), indicating responses for within-object pairs of letters. In Experiment 1, grouping along the task-relevant dimension (targets with targets and distracters with distracters) increased top-down control and enhanced the PRI; in contrast, task-irrelevant grouping (targets with distracters) did not affect performance. In Experiment 2, we disentangled the effect of target-target and distracter-distracter grouping: Pairwise grouping of distracters enhanced top-down control whereas pairwise grouping of targets changed the PRI. We conclude that object-based perceptual representations interact with pertinence values (of the elements' features and location) in the computation of attention weights, thereby creating a widespread pattern of attentional facilitation across the visual scene. PMID:26752732

The detection of small targets in maritime infrared surveillance is hampered by the presence of clutter. Sea surface structure, reflection and emission changes related to incident-angle variations, and surface effects are standard features governing the clutter behavior. Special effects such as sun glint and horizon effects also play an important role in clutter. To optimize the detection process, quantitative clutter estimates are of use for filter settings. We have recorded a large amount of infrared backgrounds in the last few years during common NATO trials. A wide range of meteorological conditions occurred during the various experiments. A first set of these data has been analyzed to obtain statistical data that represent the infrared scene. We have derived vertical temperature profiles, vertical fluctuation profiles, horizontal correlation coefficients, and temporal correlation functions. In this paper we present the first analysis of these data. We are in the process of obtaining a condensed database of information to regenerate clutter images from bulk meteo parameters and clutter parameters. The clutter and meteo parameters have been used to simulate various infrared scenes. Examples of this simulation process are shown in the presentation. The simulated images are statistically similar to the original images that were used to derive the parameters. A description of the image generation is presented. Future expansions of the model are discussed.
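
A temporal correlation function of the kind derived here is, at its simplest, the normalized autocorrelation of a pixel's intensity over the recorded frames. The sketch below shows one way to compute it (a minimal illustration with an assumed 1-D intensity series, not the actual trial data):

```python
def autocorr(series, max_lag):
    """Normalized temporal autocorrelation of a pixel intensity series,
    evaluated at lags 0..max_lag (lag 0 is 1 by construction)."""
    n = len(series)
    mean = sum(series) / n
    dev = [v - mean for v in series]
    var = sum(d * d for d in dev) / n
    return [sum(dev[i] * dev[i + lag] for i in range(n - lag))
            / ((n - lag) * var)
            for lag in range(max_lag + 1)]

# a pixel that flickers frame to frame: strong anti-correlation at lag 1
print(autocorr([1.0, 2.0, 1.0, 2.0, 1.0, 2.0, 1.0, 2.0], 2))
```

The horizontal correlation coefficients mentioned above can be computed the same way, only along a row of the image instead of along the time axis.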

To provide useful information during military operations, or as part of other security situations, a biological aerosol detector has to respond within seconds or minutes to an attack by virulent biological agents, and with low false-alarm rates. Within this time frame, measuring the virulence of a known microorganism is extremely difficult, especially if the microorganism has unknown antigenic or nucleic acid properties. Measuring "live" characteristics of an organism directly is not generally an option, yet only viable organisms are potentially infectious. Fluorescence-based instruments have been designed to optically determine whether aerosol particles have viability characteristics. Still, such commercially available biological aerosol detection equipment needs to be improved for use in military and civil applications. Air has an endogenous population of microorganisms that may interfere with alarm software technologies. To design robust algorithms, a comprehensive knowledge of the airborne biological background content is essential. For this reason, there is a need to study ambient live bacterial populations in as many locations as possible. Doing so will permit collection of data to define diverse biological characteristics that in turn can be used to fine-tune alarm algorithms. To avoid false alarms, improving software technologies for biological detectors is a crucial task requiring consideration of the various parameters that can be applied to suppress alarm triggers. This NATO Task Group aims to develop reference methods for monitoring biological aerosol characteristics to improve alarm algorithms for biological detection. Additionally, it will focus on developing reference standard methodology for monitoring biological aerosol characteristics to reduce false-alarm rates.

The Choteau 1° x 2° quadrangle in northwest Montana was studied by an interdisciplinary research team in order to appraise its mineral resource and hydrocarbon potential. The appraisal is based on field and laboratory investigations of the geology, geochemistry, and geophysics. The results of the investigations are published as a folio of maps, figures, tables, and accompanying discussions. This circular provides background information on the investigations and integrates the published components of the resource appraisal. A comprehensive bibliography cites both specific and general references to the geology, geochemistry, geophysics, and mineral deposits of the Choteau 1° x 2° quadrangle.

Encompassing about 21,000 km² in southwestern Arizona, the Ajo and Lukeville 1° by 2° quadrangles have been the subject of mineral resource investigations utilizing field and laboratory studies in the disciplines of geology, geochemistry, geophysics, and Landsat imagery. The results of these studies are published as a folio of maps, figures, and tables, with accompanying discussions. Past mineral production has been limited to copper from the Ajo Mining District. In addition to copper, the quadrangles contain potentially significant resources of gold and silver; a few other commodities, including molybdenum and evaporites, may also exist in the area as appreciable resources. This circular provides background information on the mineral deposits and on the investigations and integrates the information presented in the folio. The bibliography cites references to the geology, geochemistry, geophysics, and mineral deposits of the two quadrangles.

Lakes, streams, and wetlands serve many purposes for the people of the state of Kentucky and are necessary and valued elements of its natural resources. The Water Watch program promotes individual responsibility for a common resource, educates people about the use and protection of local water resources, provides recreational opportunities through…

By way of an introduction to the Workshop on Cold Moderators for Pulsed Neutron Sources, this paper surveys the highlights of early cold source developments, summarizes the world situation in existing pulsed neutron sources and advanced pulsed neutron source projects, and explores some of the general features and performance of cold moderators for pulsed neutron sources. (auth)

The Prostate, Lung, Colorectal and Ovarian (PLCO) Cancer Screening Trial is a large population-based randomized trial evaluating screening programs for these cancers. The primary goal of this long-term trial of the National Cancer Institute's (NCI) Division of Cancer Prevention (DCP) is to determine the effects of screening on cancer-related mortality and on secondary endpoints.

Many studies of climate variability in West Africa have been undertaken since the drastic drought recorded at the beginning of the 1970s. The variability highlighted by these studies relies in many cases on different baseline periods, chosen according to the data available or to the reference periods defined by the World Meteorological Organization (WMO). However, the significance of a change in a time series for a given period is determined from statistical tests. In this study we develop a statistical method to identify a pertinent reference period for rainfall and temperature variability analysis in the West African Soudano-Sahelian zone. The method is based on the application of three tests of homogeneity and three tests of shift detection in time series. The pertinent reference period is defined as a period of more than 20 years that is homogeneous with regard to the main climate parameters (rainfall and temperature). Applying the method to four different gridded climate datasets from 1901 to 2012 shows that 1945-1970 is the longest homogeneous period with regard to annual rainfall amount. An assessment of the significance of the difference between the 95% confidence interval around the average during this period and the annual rainfall time series shows that the normal amount lies between -10% and +10% of that average. Thus, with regard to that reference period, a wet (dry) year is defined by a surplus (deficit) of 10% in the annual rainfall amount above (below) this average. The decadal proportions of wet and dry years reveal that the 1971-1980 period contains the largest number of significant dry years, with 1984 the driest year over the whole 1901-2012 period. The drought periods recorded in the region are mainly characterized by runs of consecutive dry years that had severe impacts on crop production and livestock. Key words: climate variability; climate change; drought
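
The ±10% rule above amounts to a simple classifier: compute the reference-period mean, then label each year by its relative deviation from it. A minimal sketch with invented rainfall values (the function name and the numbers are hypothetical, not data from the study):

```python
def classify_years(rain, ref_years, years, thresh=0.10):
    """Label years wet/normal/dry by their relative deviation from the
    mean annual rainfall of the reference period."""
    ref_mean = sum(rain[y] for y in ref_years) / len(ref_years)
    labels = {}
    for y in years:
        dev = (rain[y] - ref_mean) / ref_mean
        labels[y] = "wet" if dev > thresh else ("dry" if dev < -thresh else "normal")
    return labels

# toy annual rainfall amounts (mm); the reference mean here is 100 mm
rain = {1945: 100, 1946: 100, 1984: 60, 1990: 115, 1995: 95}
print(classify_years(rain, [1945, 1946], [1984, 1990, 1995]))
```

In the study the reference period would be 1945-1970 and the series an annual rainfall record from the gridded datasets; decadal wet/dry proportions then follow by counting labels per decade.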

Data from a survey given to about 50 PER community members were analyzed to determine the backgrounds of the members. The type of college attended, the type of graduate school, when they chose physics, when they chose PER, and other interesting background information will be presented. Also presented will be gender analyses of background differences. Remarkably little difference in background was found between men and women in the sample.

Operators in N=4 super Yang-Mills theory with an R-charge of O(N^2) are dual to backgrounds which are asymptotically AdS_5 x S^5. In this article we develop efficient techniques that allow the computation of correlation functions in these backgrounds. We find that (i) contractions between fields in the string words and fields in the operator creating the background are the field theory accounting of the new geometry, (ii) correlation functions of probes in these backgrounds are given by the free field theory contractions but with rescaled propagators, and (iii) in these backgrounds there are no open string excitations with their special end-point interactions; we have only closed string excitations.

In the field of solar fuel cells, the development of efficient photo-converting semiconductors remains a major challenge. A rational analysis of experimental photocatalytic results obtained with materials in colloidal suspension is needed to access the fundamental knowledge required to improve the design and properties of new materials. In this study, a simple electron donor/nano-TiO2 system is considered and examined via spin-scavenging electron paramagnetic resonance as well as a panel of analytical techniques (composition, optical spectroscopy and dynamic light scattering) for selected types of nano-TiO2. Independent variables (pH, electron donor concentration and TiO2 amount) have been varied, and interdependent variables (aggregate size, aggregate surface vs. volume and acid/base group distribution) are discussed. This work shows that reliable understanding involves a thoughtful combination of interdependent parameters, whereas the specific surface area does not appear to be a pertinent parameter. The conclusion emphasizes the difficulty of identifying the key features of the mechanisms governing photocatalytic properties in nano-TiO2. PMID:26829277

The masseter muscle forms a cornerstone of anatomical facial reconstruction (FR) methods, yet it is only scantily described in the FR literature despite relatively intense research focus from other disciplines. This suggests that much more data exist for masseter prediction than are currently used in FR. This paper reviews the masseter muscle and finds highly pertinent anatomical and metric data to be available despite being overlooked in the FR literature. These include variances and means of the perimeter dimensions, thicknesses, cross-sectional areas, volumes, metrics associated with muscle attachment, and correlations with other biological and craniometric variables (such as sex, age, tooth loss, cranial breadths, facial heights, alveolar thicknesses, and gonial angles). The oversight of these metric data adds to a general pattern seen for other hallmark structures of the face in FR and, taken together, these observations hold major ramifications for longstanding debates of FR accuracy, reliability, and error. Irrespective, the data reviewed in this manuscript help set an improved basis for the quantification of FR techniques. PMID:20338704

Recent interest in establishing a dedicated underground laboratory in the United States prompted an experimental program to quantify the environmental backgrounds underground at the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. An outline of this program is provided along with recent experimental data on the cosmic ray muon flux at the 650 meter level of WIPP. The implications of the cosmic ray muon and fast neutron background at WIPP will be discussed in the context of new-generation, low-background experiments envisioned in the future.

The cosmic neutrino background is expected to consist of relic neutrinos from the big bang, of neutrinos produced during nuclear burning in stars, of neutrinos released by gravitational stellar collapse, and of neutrinos produced by cosmic ray interactions with matter and radiation in the interstellar and intergalactic medium. Formation of baryonic dark matter in the early universe, matter-antimatter annihilation in a baryonic symmetric universe, and dark matter annihilation could have also contributed significantly to the cosmic neutrino background. The purpose of this paper is to review the properties of these cosmic neutrino backgrounds, the indirect evidence for their existence, and the prospects for their detection.

An adaptive background model aimed at outdoor vehicle detection is presented in this paper. The model improves on PICA (pixel intensity classification algorithm): it classifies pixels into K distributions by color similarity, and then the hypothesis that the background pixel color appears in the image sequence with a high frequency is used to evaluate all the distributions and determine which represents the current background color. Experiments show that the model presented in this paper is robust, adaptive and flexible, and can deal with situations such as camera motion, lighting changes and so on.
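
The frequency hypothesis described above can be sketched per pixel: greedily cluster the pixel's intensity history by similarity, then take the centroid of the most frequently hit cluster as the background value. The code below is a minimal 1-D illustration of that idea (the names and the tolerance value are assumptions, not the paper's implementation):

```python
def background_intensity(samples, tol=10):
    """Greedy 1-D clustering of a pixel's intensity history; the centroid
    of the most frequent cluster is taken as the background intensity."""
    clusters = []  # each cluster is [running_sum, count, centroid]
    for v in samples:
        for c in clusters:
            if abs(v - c[2]) <= tol:          # close enough: join cluster
                c[0] += v
                c[1] += 1
                c[2] = c[0] / c[1]            # update centroid
                break
        else:                                 # no cluster matched: start new one
            clusters.append([v, 1, float(v)])
    return max(clusters, key=lambda c: c[1])[2]

# a pixel mostly showing road (~100) with passing vehicles (~200)
print(background_intensity([100, 102, 98, 200, 101, 99, 205, 100]))
```

A full model would run this per pixel and per color channel over the image sequence, re-evaluating the clusters as lighting drifts.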

In order to estimate the ability of the GLAST/LAT to reject unwanted background of charged particles, optimize the on-board processing, size the required telemetry and optimize the GLAST orbit, we developed a detailed model of the background particles that would affect the LAT. In addition to the well-known components of the cosmic radiation, we included splash and reentrant components of protons, electrons (e+ and e-) from 10 MeV and beyond as well as the albedo gamma rays produced by cosmic ray interactions with the atmosphere. We made estimates of the irreducible background components produced by positrons and hadrons interacting in the multilayered micrometeorite shield and spacecraft surrounding the LAT and note that because the orbital debris has increased, the shielding required and hence the background are larger than were present in EGRET. Improvements to the model are currently being made to include the east-west effect.

In order to estimate the ability of the GLAST/LAT to reject unwanted background of charged particles, optimize the on-board processing, size the required telemetry and optimize the GLAST orbit, we developed a detailed model of the background particles that would affect the LAT. In addition to the well-known components of the cosmic radiation, we included splash and reentrant components of protons, electrons (e+ and e-) from 10 MeV and beyond as well as the albedo gamma rays produced by cosmic ray interactions with the atmosphere. We made estimates of the irreducible background components produced by positrons and hadrons interacting in the multilayered micrometeorite shield and spacecraft surrounding the LAT and note that because the orbital debris has increased, the shielding required and hence the background are larger than were present in EGRET. Improvements to the model are currently being made to include the east-west effect.

Background is usually assumed to be uniform when evaluating the performance of thermal imaging systems; in reality, however, the impact of background on target acquisition cannot be ignored, and background character is an important research topic for thermal imaging technology. A background noise parameter σ was proposed in the MRTD model and used to describe background character. Background experiments were designed, and the character of some typical backgrounds (namely lawn, concrete pavement, trees and snow backgrounds) was analyzed through σ. MRTD including σ was introduced into the MRTD-Channel Width (CW) model, and the impact of the above typical backgrounds on target information quantity was analyzed by the MRTD-CW model with background character. Target information quantity for different backgrounds was calculated by MRTD-CW and compared with that of the TTP model. A target acquisition performance model based on MRTD-CW with background character will be investigated in future work.
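
A background noise parameter of this kind can be estimated directly from imagery as the sample standard deviation of the gray-level (or radiance) values in a background patch. A minimal sketch under that assumption (this is not necessarily the paper's exact definition of σ):

```python
def background_sigma(patch):
    """Sample standard deviation of a background patch's pixel values,
    used here as a simple background-character (noise) parameter."""
    n = len(patch)
    mean = sum(patch) / n
    return (sum((v - mean) ** 2 for v in patch) / (n - 1)) ** 0.5

# a smooth background (snow-like) vs. a textured one (tree-like)
print(background_sigma([50, 51, 50, 49, 50]))
print(background_sigma([30, 70, 45, 90, 20]))
```

Under this reading, a textured background (trees, lawn) yields a larger σ than a smooth one (snow, pavement), matching the intuition that clutter degrades target acquisition.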

Previous studies have shown that the perceptual organization of the visual scene constrains the deployment of attention. Here we investigated how the organization of multiple elements into larger configurations alters their attentional weight, depending on the “pertinence” or behavioral importance of the elements’ features. We assessed object-based effects on distinct aspects of the attentional priority map: top-down control, reflecting the tendency to encode targets rather than distracters, and the spatial distribution of attention weights across the visual scene, reflecting the tendency to report elements belonging to the same rather than different objects. In 2 experiments participants had to report the letters in briefly presented displays containing 8 letters and digits, in which pairs of characters could be connected with a line. Quantitative estimates of top-down control were obtained using Bundesen’s Theory of Visual Attention (1990). The spatial distribution of attention weights was assessed using the “paired response index” (PRI), indicating responses for within-object pairs of letters. In Experiment 1, grouping along the task-relevant dimension (targets with targets and distracters with distracters) increased top-down control and enhanced the PRI; in contrast, task-irrelevant grouping (targets with distracters) did not affect performance. In Experiment 2, we disentangled the effect of target-target and distracter-distracter grouping: Pairwise grouping of distracters enhanced top-down control whereas pairwise grouping of targets changed the PRI. We conclude that object-based perceptual representations interact with pertinence values (of the elements’ features and location) in the computation of attention weights, thereby creating a widespread pattern of attentional facilitation across the visual scene. PMID:26752732

The Cosmic Background Explorer (CBE), NASA's cosmological satellite which will observe a radiative relic of the big bang, is discussed. The major questions connected to the big bang theory which may be clarified using the CBE are reviewed. The satellite instruments and experiments are described, including the Differential Microwave Radiometer, which measures the difference between microwave radiation emitted from two points on the sky, the Far-Infrared Absolute Spectrophotometer, which compares the spectrum of radiation from the sky at wavelengths from 100 microns to one cm with that from an internal blackbody, and the Diffuse Infrared Background Experiment, which searches for the radiation from the earliest generation of stars.

We point out that, for Dirac neutrinos, in addition to the standard thermal cosmic neutrino background (CνB), there could also exist a nonthermal neutrino background with comparable number density. As the right-handed components are essentially decoupled from the thermal bath of standard model particles, relic neutrinos with a nonthermal distribution may exist until today. The relic density of the nonthermal (nt) background can be constrained by the usual observational bounds on the effective number of massless degrees of freedom Neff and can be as large as nν^nt ≲ 0.5 nγ. In particular, Neff can be larger than 3.046 in the absence of any exotic states. Nonthermal relic neutrinos constitute an irreducible contribution to the detection of the CνB and, hence, may be discovered by future experiments such as PTOLEMY. We also present a scenario of chaotic inflation in which a nonthermal background can naturally be generated by inflationary preheating. The nonthermal relic neutrinos, thus, may constitute a novel window into the very early Universe.

The Berkeley Low Background Facility (BLBF) at Lawrence Berkeley National Laboratory (LBNL) in Berkeley, California provides low background gamma spectroscopy services to a wide array of experiments and projects. The analysis of samples takes place within two unique facilities; locally within a carefully-constructed, low background laboratory on the surface at LBNL and at the Sanford Underground Research Facility (SURF) in Lead, SD. These facilities provide a variety of gamma spectroscopy services to low background experiments primarily in the form of passive material screening for primordial radioisotopes (U, Th, K) or common cosmogenic/anthropogenic products; active screening via neutron activation analysis for U,Th, and K as well as a variety of stable isotopes; and neutron flux/beam characterization measurements through the use of monitors. A general overview of the facilities, services, and sensitivities will be presented. Recent activities and upgrades will also be described including an overview of the recently installed counting system at SURF (recently relocated from Oroville, CA in 2014), the installation of a second underground counting station at SURF in 2015, and future plans. The BLBF is open to any users for counting services or collaboration on a wide variety of experiments and projects.

The infrared (IR) testing of the Olympus thermal model has provided a capability to perform cost effective thermal balance testing of satellites and satellite components. A high-accuracy monitored background radiometer was developed for the measurement of absorbed radiation heat flux encountered during IR thermal vacuum testing of spacecraft. The design, development, and calibration of this radiometer is described.

Ohio State Univ., Columbus. Center on Education and Training for Employment.

This document, which is designed for use in developing a tech prep competency profile for the occupation of health information technician, lists technical competencies and competency builders for 14 units pertinent to the health technologies cluster in general and 6 units specific to the occupation of emergency medical technician. The following…

Background checks involve gathering information from state and federal databases to determine if child care providers have a history of child abuse or other criminal convictions that would make them unacceptable for working with children. Background checks include state criminal history checks, state child abuse registry checks, and Federal Bureau…

This patent describes a testing apparatus for testing and evaluating the performance of laser seeking warheads for missiles, under simulated weather conditions. It comprises support means for supporting a warhead seeker; laser means for generating a laser beam and for directing a laser beam towards the seeker; a diffusion screen interposed between the seeker support means and the laser means for diffusing the laser beam; a collimating lens interposed between the diffusion screen and the seeker support means for collimating the diffused laser beam and for directing the collimated laser beam onto a warhead seeker, supported in the seeker support; background illuminator means for illuminating the seeker support and a seeker disposed therein, supported for movement into and out of an operating position between the diffusion means and the collimating lens for providing background lighting in simulation of weather lighting conditions; and control means for controlling the intensity of the light provided by the illuminator means to simulate various weather conditions.

Four tables of planetary and satellite data are presented which list satellite discoveries, planetary parameters, satellite orbits, and satellite physical properties respectively. A scheme for classifying the satellites is provided and it is noted that most known moons fall into three general classes: regular satellites, collisional shards, and irregular satellites. Satellite processes are outlined with attention given to origins, dynamical and thermal evolution, surface processes, and composition and cratering. Background material is provided for each family of satellites.

Hiring qualified staff is critical for a medical practice. As many physicians have discovered, taking applicants' resumes on faith can be a mistake. Background company searches show that one in three applicants provide false, inaccurate, misleading, or incomplete information. Fake degrees, false licenses and certifications, and criminal histories are a few of the problems that a proper background check can reveal. This article describes further why background checks are essential, how to incorporate a background check into your hiring process, and some of the legalities involved in the process. PMID:17494489

The booklet provides an overview on vision therapy to aid writers, editors, and broadcasters help parents, teachers, older adults, and all consumers learn more about vision therapy. Following a description of vision therapy or vision training, information is provided on how and why vision therapy works. Additional sections address providers of…

This paper describes the Program for Research on Private Higher Education (PROPHE), a program that seeks to build knowledge about private higher education around the world. The program focuses on discovery, analysis, and dissemination of information, as well as creation of an international base of trained researchers. The main mission of the…

The U.S. National Institute for Building Sciences (NIBS) started the development of the National Building Information Model Standard (NBIMS). Its goal is to define standard sets of data required to describe any given building in necessary detail so that any given AECO industry discipline application can find needed data at any point in the building lifecycle. This will include all data that are used in or are pertinent to building energy performance simulation and analysis. This paper describes the background that led to the development of NBIMS, its goals and development methodology, its Part 1 (Version 1.0), and its probable impact on building energy performance simulation and analysis.

Late last year the National Aeronautics and Space Administration launched its first satellite dedicated to the study of phenomena related to the origins of the universe. The satellite, called the Cosmic Background Explorer (COBE), carries three complementary detectors that will make fundamental measurements of the celestial radiation. Part of that radiation is believed to have originated in processes that occurred at the very dawn of the universe. By measuring the remnant radiation at wavelengths from one micrometer to one centimeter across the entire sky, scientists hope to be able to solve many mysteries regarding the origin and evolution of the early universe. Unfortunately, these radiative relics of the early universe are weak and veiled by local astrophysical and terrestrial sources of radiation. The wavelengths of the various cosmic components may also overlap, thereby making the understanding of the diffuse celestial radiation a challenge. Nevertheless, the COBE instruments, with their full-sky coverage, high sensitivity to a wide range of wavelengths and freedom from interference from the earth's atmosphere, will constitute for astrophysicists an observatory of unprecedented sensitivity and scope. The interesting cosmic signals will then be separated from one another and from noncosmic radiation sources by a comprehensive analysis of the data.

Intelligence, as the ability to reason, think abstractly, and adapt effectively to the environment, is a subject of research in psychology, neurobiology and, over the last twenty years, genetics as well. Twin studies carried out since the 20th century indicated the heritability of intelligence and thereby confirmed the influence of genetic factors on cognitive processes. Studies on the genetic background of intelligence focus on genes of the dopaminergic (DRD2, DRD4, COMT, SLC6A3, DAT1, CCKAR) and adrenergic (ADRB2, CHRM2) systems, as well as neurotrophins (BDNF) and oxidative stress genes (LTF, PRNP). A positive effect of the investigated polymorphisms was indicated for variant c.957C>T of the DRD2 gene (when thymine occupies the polymorphic site), polymorphism c.472G>A of the COMT gene (presence of adenine), and also the ADRB2 gene c.46A>G (guanine), CHRM2 (thymine at position c.1890A>T), and BDNF (guanine at position c.472G>A). The results obtained indicate that intelligence is a trait dependent not only on genetic but also on environmental factors. PMID:27333929

Forty and twenty years after the two books published by Einar Tandberg-Hanssen (Solar prominences (Geophysics and astrophysics monographs), Vol. 12. Dordrecht: D. Reidel Publishing Co., 1974; The nature of solar prominences, Astrophysics and space science library, Vol. 199. Dordrecht: Kluwer Academic Publishers, 1995) on solar prominences, it is time to update our knowledge and understanding of these fascinating solar structures. After a brief history that overviews the first eclipse observations (drawings and then photography) and the spectrographic, coronagraphic and later polarimetric measurements, the chapter presents samples of the most spectacular results of the last two decades, obtained either from space or from the ground. It discusses the contents of the book in order to encourage the reader to dip into the following 17 chapters, which provide comprehensive and detailed observations, information about the methods used, and interpretation of the results on the basis of the latest theoretical and modelling works.

The International Telecommunications Union (ITU) conceives the radio spectrum as primarily a resource for telecommunications. Indeed, most applications of radio are for communications, and other radio services, particularly the Radio Astronomy Service, are deemed to be 'pretend' communication services for spectrum management purposes. The language of radio spectrum management is permeated by the terminology of communications, some derived from the physics of radio and some from aspects of information theory. This contribution touches on all the essential concepts of radiocommunications which the author thinks should be the common mental equipment of the spectrum manager. The fundamental capacity of a communication channel is discussed in terms of the degrees of freedom and bandwidth of a signal and the signal-to-noise ratio. It is emphasized that an information-bearing signal is inherently unpredictable and must, at some level, be discontinuous; this has important consequences for the form of its power spectrum. The effect of inserting filters is discussed, particularly with regard to constant-amplitude signals and, in the context of non-linear power amplifiers, the phenomenon of 'sideband recovery'. All the common generic forms of modulation are discussed, including the very different case of 'no modulation', which applies in all forms of passive remote sensing. Whilst all are agreed that the radio spectrum should be used 'efficiently', there is no quantitative measure of spectral efficiency which embraces all relevant aspects of spectral usage; these various aspects are discussed. Finally, some aspects of antennae are briefly reviewed. It is pointed out that the recent introduction of so-called 'active antennae', which have properties unlike traditional passive antennae, has confused the interpretation of those ITU Radio Regulations which refer to antennae.
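The channel-capacity discussion in terms of bandwidth and signal-to-noise ratio follows Shannon's classic result, which is easy to compute directly. A minimal sketch (the example figures are illustrative):

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon capacity of an ideal band-limited channel,
    C = B * log2(1 + S/N), in bits per second.
    snr_linear is the linear (not dB) signal-to-noise ratio."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# A 1 MHz channel at a linear SNR of 15 (about 11.8 dB) supports at
# most 1e6 * log2(16) = 4 Mbit/s.
capacity = channel_capacity(1e6, 15)
```

The logarithmic dependence on SNR is why increasing bandwidth is usually a more effective route to higher data rates than increasing transmit power.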

The characterization and measurement of background radiation relevant to optical communications system performance is addressed. The necessary optical receiver parameters are described, and radiometric concepts required for the calculation of collected background power are developed. The most important components of optical background power are discussed, and their contribution to the total collected background power in various communications scenarios is examined.
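The radiometric calculation of collected background power can be sketched, under common small-angle assumptions, as the product of the background radiance and the receiver throughput (aperture area, field-of-view solid angle, optical filter bandwidth, and transmission). The formula and all parameter names here are a generic radiometric sketch, not taken from the cited paper:

```python
import math

def collected_background_power(radiance, aperture_diam_m,
                               fov_halfangle_rad, filter_bw_m,
                               optics_transmission=1.0):
    """Estimate collected background power for an optical receiver.

    radiance: spectral radiance of the background, W / (m^2 sr m).
    Uses the small-angle throughput P = L * A * Omega * d_lambda * T,
    with Omega ~ pi * theta^2 for field-of-view half-angle theta.
    Illustrative sketch; real receivers need obscuration, detector
    geometry, and spectral-shape corrections.
    """
    area = math.pi * (aperture_diam_m / 2.0) ** 2       # aperture area, m^2
    solid_angle = math.pi * fov_halfangle_rad ** 2       # receiver FOV, sr
    return radiance * area * solid_angle * filter_bw_m * optics_transmission

# Example: 10 cm aperture, 1 mrad half-angle FOV, 1 nm optical filter.
p_bg = collected_background_power(1e6, 0.1, 1e-3, 1e-9)
```

Note the linearity in each throughput factor: halving the filter bandwidth or the field-of-view solid angle halves the collected background power, which is the basic lever for background rejection in optical receivers.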

... 10 Energy 3 2011-01-01 2011-01-01 false Other information. 205.327 Section 205.327 Energy... International Boundaries § 205.327 Other information. The applicant may be required after filing the application to furnish such supplemental information as the ERA may deem pertinent. Such requests shall...

This Staff Brief was prepared for the Wisconsin Legislative Council's Special Committee on Acid Rain to provide an introduction to the issue of acid rain. It is divided into four parts. Part I provides an overview on the controversies surrounding the measurement, formation and effects of acid rain. As described in Part I, the term acid rain is used to describe the deposition of acidic components through both wet deposition (e.g., rain or snow) and dry deposition (e.g., direct contact between atmospheric constituents and the land, water or vegetation of the earth). Part II presents background information on state agency activities relating to acid rain in Wisconsin, describes what is known about the occurrence of, susceptibility to and effects of acid rain in Wisconsin, and provides information related to man-made sources of sulfur and nitrogen oxides in Wisconsin. Part III describes major policies and regulations relating to acid rain which have been or are being developed jointly by the United States and Canadian governments, by the United States government and by the State of Wisconsin. Part IV briefly discusses possible areas for Committee action.

Deep submergence vehicles bring us closer to the targets we want to see in better resolution. It is difficult to obtain very detailed seafloor or sub-bottom structure profiles with a conventional surface-towed survey system, in which the horizontal spatial resolution of the data is poor because of the great distance between the sensors and the targets (seafloor/sub-bottom). To improve the horizontal resolution of the profiling, we have been developing and using deep-tow profiling systems for more than two decades at the Geological Survey of Japan, AIST. In our presentation we show our latest tool for deep-sea mapping, named the DAI-PACK (Deep-sea Acoustic Imaging Package) system, which has now been tested and used in the field for more than four years. It is not easy, however, to bring the tool to depth and to maneuver it while keeping the towing altitude sufficiently close to the seafloor. To overcome this problem, we made the system stand-alone and also made it portable in size and weight so that it can be accommodated on ROVs (Remotely Operated Vehicles) or manned submersibles. We chose two sensors to install in the system: (1) a deep-sea side-scan sonar and (2) a deep-sea sub-bottom profiler. All components except the sensors are packaged in an aluminum pressure sphere, which can be installed as a piggyback payload on many vehicles available today. From our recent experiments using the DAI-PACK on several vehicles, e.g. ROVs (the Hyper Dolphin in '03-'06, the ROPOS in '04, the Kaiko7000II in '06) and an HOV (the Shinkai6500 in '04, '05), we have learned several pros and cons regarding the use of DSVs.
We can discuss topics such as (1) the noise problem of the vehicles (the ROVs were noisier than the Shinkai6500); (2) the importance of, and need for, good positioning of the vehicles (none of the DSVs we have used carried a navigation system good enough to be collocated with the DAI-PACK imagery data); and (3) the usefulness of sub-bottom profiling, based on real good and bad data we have collected. We have not yet had a chance to use AUVs, but we can also discuss the advantages and disadvantages of using AUVs to some extent.

Up to now most approaches to target and background characterization (and exploitation) concentrate solely on the information given by pixels. In many cases this is a complex and unprofitable task. During the development of automatic exploitation algorithms the main goal is the optimization of certain performance parameters. These parameters are measured during test runs while applying one algorithm with one parameter set to images that consist of image domains with very different characteristics (targets and various types of background clutter). Model based geocoding and registration approaches provide means for utilizing the information stored in GIS (Geographical Information Systems). The geographical information stored in the various GIS layers can define ROE (Regions of Expectations) and may allow for dedicated algorithm parametrization and development. ROI (Region of Interest) detection algorithms (in most cases MMO (Man-Made Object) detection) use implicit target and/or background models. The detection algorithms for ROIs utilize gradient direction models that have to be matched with transformed image domain data. In most cases simple threshold calculations on the match results discriminate target object signatures from the background. The geocoding approaches extract line-like structures (street signatures) from the image domain and match the graph constellation against a vector model extracted from a GIS (Geographical Information System) data base. Apart from geocoding, the algorithms can also be used for image-to-image registration (multi-sensor and data fusion) and may be used for the creation and validation of geographical maps.
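The final discrimination step described above, a simple threshold on the model-match results, can be sketched as follows. The score map and threshold value are invented for illustration; real systems would compute the scores from gradient-direction model matching:

```python
import numpy as np

def detect_rois(match_score_map, threshold):
    """Threshold a model-match score map: pixels whose match score
    exceeds the threshold are flagged as candidate ROIs (target
    signatures); the rest are treated as background clutter."""
    return np.asarray(match_score_map) > threshold

# Toy 2x2 score map (e.g. from gradient-direction model matching).
scores = np.array([[0.1, 0.9],
                   [0.3, 0.7]])
mask = detect_rois(scores, 0.5)
```

In practice the threshold would be tuned per background type, which is exactly the kind of dedicated parametrization the GIS-derived regions of expectation are meant to support.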

We summarize the experiences of the Collider Detector at Fermilab (CDF) experiment in the presence of backgrounds originating from the counter circulating beams in the Fermilab Tevatron. These backgrounds are measured and their sources identified. Finally, we outline the strategies employed to reduce the effects of these backgrounds on the experiment.

The objective of this investigation was to perform a spectral survey of the low energy diffuse X-ray background using the X-ray Background Survey Spectrometer (XBSS) on board the Space Station Freedom (SSF). XBSS obtains spectra of the X-ray diffuse background in the 11-24 Å and 44-84 Å wavelength intervals over the entire sky with 15 deg spatial resolution. These X-rays are almost certainly from a very hot (10^6 K) component of the interstellar medium that is contained in regions occupying a large fraction of the interstellar volume near the Sun. Astrophysical plasmas near 10^6 K are rich in emission lines, and the relative strengths of these lines, besides providing information about the physical conditions of the emitting gas, also provide information about its history and heating mechanisms.

This report provides background materials related to the California Senate Select Committee on Higher Education Admissions and Outreach and the California Senate Select Committee on Higher Education hearing on undergraduate admissions at the University of California (UC) and the Board of Regents' Special Proposal 1 (SP-1), which eliminated the use…

The plastic-flow behavior which controls the formation of bulk residual stresses during final heat treatment of powder-metallurgy (PM), nickel-base superalloys was quantified using conventional (isothermal) stress-relaxation (SR) tests and a novel approach which simulates concurrent temperature and strain transients during cooling following solution treatment. The concurrent cooling/straining test involves characterization of the thermal compliance of the test sample. In turn, this information is used to program the ram-displacement- vs-time profile to impose a constant plastic strain rate during cooling. To demonstrate the efficacy of the new approach, SR tests (in both tension and compression) and concurrent cooling/tension-straining experiments were performed on two PM superalloys, LSHR and IN-100. The isothermal SR experiments were conducted at a series of temperatures between 1144 K and 1436 K (871 °C and 1163 °C) on samples that had been supersolvus solution treated and cooled slowly or rapidly to produce starting microstructures comprising coarse gamma grains and coarse or fine secondary gamma-prime precipitates, respectively. The concurrent cooling/straining tests comprised supersolvus solution treatment and various combinations of subsequent cooling rate and plastic strain rate. Comparison of flow-stress data from the SR and concurrent cooling/straining tests showed some similarities and some differences which were explained in the context of the size of the gamma-prime precipitates and the evolution of dislocation substructure. The magnitude of the effect of concurrent deformation during cooling on gamma-prime precipitation was also quantified experimentally and theoretically.

Within all the eukaryotic cells there is an important group of biomolecules that has been potentially related to signalling functions: the myo-inositol phosphates (InsPs). In nature, the most abundant member of this family is the so called InsP6 (phytate, L(12-)), for which our group has strived in the past to elucidate its intricate chemical behaviour. In this work we expand on our earlier findings, shedding light on the inframolecular details of its protonation and complexation processes. We evaluate systematically the chemical performance of InsP6 in the presence and absence of alkali and alkaline earth metal ions, through (31)P NMR measurements, in a non-interacting medium and over a wide pH range. The analysis of the titration curves by means of a model based on the cluster expansion method allows us to describe in detail the distribution of the different protonated microspecies of the ligand. With the aid of molecular modelling tools, we assess the energetic and geometrical characteristics of the protonation sequence and the conformational transition suffered by InsP6 as the pH changes. By completely characterizing the protonation pattern, conformation and geometry of the metal complexes, we unveil the chemical and structural basis behind the influence that the physiologically relevant cations, Na(+), K(+), Mg(2+) and Ca(2+) have over the phytate chemical reactivity. This information is essential in the process of gaining reliable structural knowledge about the most important InsP6 species in the in vitro and in vivo experiments, and how these features modulate their probable biological functions. PMID:25058574

Mean nuclear backgrounds are large, but are arguably amenable to frame-to-frame subtraction. Striated backgrounds on the sensors for defensive interceptors could, however, cause clutter leak-through, which could make detection and track difficult. Nominal motions and backgrounds give signal to clutter ratios too low to be useful. Clutter leakage due to line-of-sight drift can be reduced by stabilizing the line of sight around the background clutter itself. Current interceptors have detector arrays large enough for operation independent of nuclear backgrounds in their fields of view. 6 refs., 2 figs.
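Frame-to-frame subtraction of a slowly varying background, as mentioned above, can be sketched with a toy example (all values invented):

```python
import numpy as np

def frame_subtract(frame_now, frame_prev):
    """Frame-to-frame subtraction: a slowly varying (e.g. mean
    nuclear) background cancels between consecutive frames, leaving
    target signal plus any clutter leak-through caused by
    line-of-sight drift between the frames."""
    return np.asarray(frame_now, dtype=float) - np.asarray(frame_prev, dtype=float)

# Static uniform background with a target appearing in frame 2.
bg = np.full((4, 4), 50.0)
frame1 = bg.copy()
frame2 = bg.copy()
frame2[2, 2] += 10.0          # target signal in one pixel
diff = frame_subtract(frame2, frame1)
```

With a perfectly stable line of sight the difference image contains only the target; any drift shifts the striated clutter between frames, which is why stabilizing the line of sight on the clutter itself reduces leakage.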

The use of Boolean "not" logic in selective dissemination of information produced greater user satisfaction, less nonpertinent information, and no apparent decrease in the number of pertinent retrievals. (9 references) (SJ)

Just as all perceptions are of figures differentiated from a larger background, a play takes place against the background of the audience's knowledge and feelings. While audience members generally bring to a performance a large body of background information--they evaluate the storyline, for example, using a lifetime of personal experience--at…

...The Department of Energy (DOE), pursuant to the Paperwork Reduction Act of 1995, intends to extend for three years an information collection request (OMB Control Number 1910-1700) with the Office of Management and Budget (OMB). The proposed voluntary collection will request that an individual or an authorized designee provide pertinent information for easy record retrieval allowing for......

... 1 General Provisions 1 2010-01-01 2010-01-01 false Sources of information. 20.5 Section 20.5... information. Pertinent sources of information useful to the public, in areas of public interest such as... should be provided with each agency statement. These sources of information shall plainly identify...

... 1 General Provisions 1 2011-01-01 2011-01-01 false Sources of information. 20.5 Section 20.5... information. Pertinent sources of information useful to the public, in areas of public interest such as... should be provided with each agency statement. These sources of information shall plainly identify...

The diffuse cosmic infrared background (CIB) consists of the cumulative radiant energy released in the processes of structure formation that have occurred since the decoupling of matter and radiation following the Big Bang. In this lecture I will review the observational data that provided the first detections and limits on the CIB, and the theoretical studies explaining the origin of this background. Finally, I will also discuss the relevance of this background to the universe as seen in high energy gamma-rays.

Low background rare event searches in underground laboratories seeking observation of direct dark matter interactions or neutrino-less double beta decay have the potential to profoundly advance our understanding of the physical universe. Successful results from these experiments depend critically on construction from extremely radiologically clean materials and accurate knowledge of subsequent low levels of expected background. The experiments must conduct comprehensive screening campaigns to reduce radioactivity from detector components, and these measurements also inform detailed characterisation and quantification of background sources and their impact, necessary to assign statistical significance to any potential discovery. To provide requisite sensitivity for material screening and characterisation in the UK to support our rare event search activities, we have re-developed our infrastructure to add ultra-low background capability across a range of complementary techniques that collectively allow complete radioactivity measurements. Ultra-low background HPGe and BEGe detectors have been installed at the Boulby Underground Laboratory, itself undergoing substantial facility re-furbishment, to provide high sensitivity gamma spectroscopy in particular for measuring the uranium and thorium decay series products. Dedicated low-activity mass spectrometry instrumentation has been developed at UCL for part per trillion level contaminant identification to complement underground screening with direct U and Th measurements, and meet throughput demands. Finally, radon emanation screening at UCL measures radon background inaccessible to gamma or mass spectrometry techniques. With this new capability the UK is delivering half of the radioactivity screening for the LZ dark matter search experiment.

Optical imaging spectroscopy is investigated as a method to estimate radiological background by spectral identification of soils, sediments, rocks, minerals and building materials derived from natural materials and assigning tabulated radiological emission values to these materials. Radiological airborne surveys are undertaken by local, state and federal agencies to identify the presence of radiological materials out of regulatory compliance. Detection performance in such surveys is determined by (among other factors) the uncertainty in the radiation background; increased knowledge of the expected radiation background will improve the ability to detect low-activity radiological materials. Radiological background due to naturally occurring radiological materials (NORM) can be estimated by reference to previous survey results, use of global 40K, 238U, and 232Th (KUT) values, reference to existing USGS radiation background maps, or by a moving average of the data as it is acquired. Each of these methods has its drawbacks: previous survey results may not include recent changes, the global average provides only a zero-order estimate, the USGS background radiation map resolutions are coarse and are accurate only to 1 km – 25 km sampling intervals depending on locale, and a moving average may essentially low pass filter the data to obscure small changes in radiation counts. Imaging spectroscopy from airborne or spaceborne platforms can offer higher resolution identification of materials and background, as well as provide imaging context information. AVIRIS hyperspectral image data is analyzed using commercial exploitation software to determine the usefulness of imaging spectroscopy to identify qualitative radiological background emissions when compared to airborne radiological survey data.

Family backgrounds of rural youth are discussed. The background provided by the family has implications for the adjustment of rural youth in an urbanized, highly technical society. The basic ecological conditions of rural areas influence the rate of social change, the importance of the family as a social unit, and the orientation toward legal…

This paper discusses the background reduction and rejection strategy of the Cryogenic Dark Matter Search (CDMS) experiment. Recent measurements of background levels from CDMS II at Soudan are presented, along with estimates for future improvements in sensitivity expected for a proposed SuperCDMS experiment at SNOLAB.

Electromagnetic properties of hadrons can be computed by lattice simulations of QCD in background fields. We demonstrate new techniques for the investigation of charged hadron properties in electric fields. Our current calculations employ large electric fields, motivating us to analyze chiral dynamics in strong QED backgrounds, and subsequently uncover surprising non-perturbative effects present at finite volume.

Tests G. Armstrong's and B. Greenberg's model of the effect of background television on cognitive performance, applied to reading comprehension and memory. Finds significant deleterious effects of background television, stronger and more consistent effects when testing immediately after reading, and more consistently negative effects resulting…

Measurements have been made to assess the characteristics and origins of background events in microchannel plates (MCPs). An overall background rate of about 0.4 events/sq cm per sec has been achieved consistently for MCPs that have been baked and scrubbed. The temperature and gain of the MCPs are found to have no significant effect on the background rate. Detection of 1.46-MeV gamma rays from the MCP glass confirms the presence of K-40, with a concentration of 0.0007 percent, in MCP glass. It is shown that beta decay from K-40 is sufficient to cause the background rate and spectrum observed. Anticoincidence measurements indicate that the background rate caused by cosmic-ray interactions is small (less than 0.016 events/sq cm per sec).
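
The K-40 attribution can be sanity-checked from the quoted 0.0007 percent concentration alone. The sketch below uses the standard K-40 half-life and beta branching fraction from nuclear data tables (not values from the paper), and the per-gram normalization is our own illustration; whether the resulting decay rate reproduces the observed 0.4 events/sq cm per sec depends on detector geometry and efficiency, which the paper models.

```python
import math

# Specific beta activity implied by 0.0007% (7e-6 by mass) K-40 in MCP glass.
# Half-life and branching fraction are standard nuclear-data values, assumed
# here for illustration rather than taken from the paper.
half_life_s = 1.248e9 * 3.156e7        # K-40 half-life: 1.248e9 yr, in seconds
lam = math.log(2) / half_life_s        # decay constant, 1/s
atoms_per_gram = 7e-6 * 6.022e23 / 40  # K-40 atoms in 1 g of glass
beta_fraction = 0.89                   # ~89% of K-40 decays are beta-

activity = lam * atoms_per_gram        # total decays per second per gram
print(round(activity, 2))              # ~1.85 decays/s per gram of glass
print(round(beta_fraction * activity, 2))  # ~1.65 betas/s per gram
```

An activity of order 1 to 2 decays per second per gram of glass makes the observed sub-1 events/sq cm per sec background rate plausible once only a thin contributing surface layer and finite detection efficiency are taken into account.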

This manual provides information to assist schools and school systems to implement changes required by Education Ministers to enrolment forms (and associated data collection and storage processes). This is to enable nationally comparable reporting of students' outcomes against the "National Goals for Schooling in the Twenty-First Century." The…

The neutrinoless double beta (0νββ) decay experiment GERDA at the LNGS of INFN started physics data taking in November 2011. This paper presents an analysis aimed at understanding and modeling the observed background energy spectrum, which plays an essential role in searches for a rare signal like 0νββ decay. A very promising preliminary model has been obtained, with the systematic uncertainties still under study. Important information can be deduced from the model, such as the expected background and its decomposition in the signal region. According to the model, the main background contributions around Qββ come from 214Bi, 228Th, 42K, 60Co and α-emitting isotopes in the 226Ra decay chain, with fractions depending on the assumed source positions.

MiniCLEAN is a dark matter experiment using a 150 kg fiducial mass of liquid cryogen (argon or neon) to search for Weakly Interacting Massive Particles (WIMPs). MiniCLEAN seeks to detect scintillation photons from WIMP-induced argon recoils. A potentially dominant background is from alpha decays on the inner surfaces of the containment vessel. Such events can mimic the prompt signal characteristic of nuclear recoils. This talk will show the expected background rates, methods of background discrimination, and their expected effectiveness.

Researchers should embrace differences in genetic background to build richer disease models that more accurately reflect the level of variation in the human population, posits Clement Chow. PMID:26659016

Pulse sequences for suppressing background signals from spinning modules used in magic-angle spinning NMR are described. These pulse sequences are based on spatially selective composite 90° pulses originally reported by Bax, which provide for no net excitation of spins outside the homogeneous region of the coil. We have achieved essentially complete suppression of background signals originating from our Vespel spinning module (which uses a free-standing coil) in both 1H and 13C spectra without notable loss in signal intensity. Successful modification of both Bloch decay and cross-polarization pulse sequences to include spatially selective pulses was essential to acquire background-free spectra for weak samples. Background suppression was also found to be particularly valuable for both T1 and T1ρ relaxation measurements.

The Low Background Facility (LBF) at Lawrence Berkeley National Laboratory in Berkeley, California provides low background gamma spectroscopy services to a wide array of experiments and projects. The analysis of samples takes place within two unique facilities; locally within a carefully-constructed, low background cave and remotely at an underground location that historically has operated underground in Oroville, CA, but has recently been relocated to the Sanford Underground Research Facility (SURF) in Lead, SD. These facilities provide a variety of gamma spectroscopy services to low background experiments primarily in the form of passive material screening for primordial radioisotopes (U, Th, K) or common cosmogenic/anthropogenic products, as well as active screening via Neutron Activation Analysis for specific applications. The LBF also provides hosting services for general R&D testing in low background environments on the surface or underground for background testing of detector systems or similar prototyping. A general overview of the facilities, services, and sensitivities is presented. Recent activities and upgrades will also be presented, such as the completion of a 3π anticoincidence shield at the surface station and environmental monitoring of Fukushima fallout. The LBF is open to any users for counting services or collaboration on a wide variety of experiments and projects.

About 400,000 years after the Big Bang the temperature of the Universe fell to a few thousand degrees. As a result, the previously free electrons and protons combined and the Universe became neutral. This released a radiation which we now observe as the cosmic microwave background (CMB). The tiny fluctuations in the temperature and polarization of the CMB carry a wealth of cosmological information. These so-called temperature anisotropies were predicted as the imprints of the initial density perturbations which gave rise to the present large-scale structures such as galaxies and clusters of galaxies. This relation between the present-day Universe and its initial conditions has made the CMB radiation one of the most preferred tools to understand the history of the Universe. The CMB radiation was discovered by radio astronomers Arno Penzias and Robert Wilson in 1965 [72] and earned them the 1978 Nobel Prize. This discovery was in support of the Big Bang theory and ruled out the only other available theory at that time, the steady-state theory. The crucial observations of the CMB radiation were made by the Far-Infrared Absolute Spectrophotometer (FIRAS) instrument on the Cosmic Background Explorer (COBE) satellite [86], in orbit from 1989 to 1996. COBE made the most accurate measurements of the CMB frequency spectrum and confirmed it as being a black body to within experimental limits. This made the CMB spectrum the most precisely measured black-body spectrum in nature. The CMB has a thermal black-body spectrum at a temperature of 2.725 K: the spectrum peaks at a microwave frequency of 160.2 GHz, corresponding to a 1.9 mm wavelength. The results of COBE inspired a series of ground- and balloon-based experiments, which measured CMB anisotropies on smaller scales over the next decade. During the 1990s, the first acoustic peak of the CMB power spectrum (see Figure 5.1) was measured with increasing sensitivity, and by 2000 the BOOMERanG experiment [26] reported
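
The numbers quoted for the CMB black body are internally consistent and can be checked with the frequency-form Wien displacement law, ν_peak ≈ (58.789 GHz/K)·T, and λ = c/ν. This quick check is our own sketch using standard physical constants, not a calculation from the record above:

```python
# Consistency check of the quoted CMB black-body numbers.
# Frequency-form Wien law: nu_peak = 58.789 GHz/K * T, i.e. the maximum of
# the Planck spectrum per unit frequency at h*nu = 2.8214 kT.
T_cmb = 2.725                       # K, CMB temperature
c = 2.99792458e8                    # m/s, speed of light

nu_peak_GHz = 58.789 * T_cmb        # peak frequency in GHz
wavelength_mm = c / (nu_peak_GHz * 1e9) * 1e3   # corresponding wavelength, mm

print(round(nu_peak_GHz, 1))        # 160.2 GHz, as quoted
print(round(wavelength_mm, 2))      # 1.87 mm, i.e. the quoted ~1.9 mm
```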

Accurately quantifying stream thermal regimes can be challenging because stream temperatures are often spatially and temporally heterogeneous. In this study, we present a novel modeling framework that combines stream temperature data sets that are continuous in either space or time. Specifically, we merged the fine spatial resolution of thermal infrared (TIR) imagery with hourly data from 10 stationary temperature loggers in a 100 km portion of the Big Hole River, MT, USA. This combination allowed us to estimate summer thermal conditions at a relatively fine spatial resolution (every 100 m of stream length) over a large extent of stream (100 km) during the warmest part of the summer. Rigorous evaluation, including internal validation, external validation with spatially continuous instream temperature measurements collected from a Lagrangian frame of reference, and sensitivity analyses, suggests the model was capable of accurately estimating longitudinal patterns in summer stream temperatures for this system. Results revealed considerable spatial and temporal heterogeneity in summer stream temperatures and highlighted the value of assessing thermal regimes at relatively fine spatial and temporal scales. Preserving spatial and temporal variability and structure in abiotic stream data provides a critical foundation for understanding the dynamic, multiscale habitat needs of mobile stream organisms. Similarly, enhanced understanding of spatial and temporal variation in dynamic water quality attributes, including temporal sequence and spatial arrangement, can guide strategic placement of monitoring equipment that will subsequently capture variation in environmental conditions directly pertinent to research and management objectives.

A new class of conformationally constrained oxa-bridged tricyclo-dicarboxamide (OTDA) ligand was rationally designed for the selective extraction of tetravalent actinides pertinent to the Plutonium Uranium Redox EXtraction (PUREX) process. Two of the designed diamide ligands were synthesized and extraction studies were performed for Pu(iv) from HNO3 medium. The mechanism of extraction was investigated by studying various parameters such as feed HNO3, NaNO3 and OTDA concentrations. The nature of the extracted species was found to be [Pu(NO3)4(OTDA)]. One of the OTDA ligands was tested in detail and showed selective extraction of Pu(iv) and Np(iv) over other actinide species, viz., U(vi), Np(v), Am(iii), lanthanides and fission products contained in a nuclear waste from the PUREX process. DFT calculations predicted the charge density on each of the coordinating 'O' atoms of OTDA, supporting its high Pu(iv) selectivity over the other ions studied, and also provided the energy-optimized structures of OTDA and its Pu(iv) complex. PMID:27054892

Solar axions could be converted into x-rays inside the strong magnetic field of an axion helioscope, triggering the detection of this elusive particle. Low background x-ray detectors are an essential component for the sensitivity of these searches. We report on the latest developments of the Micromegas detectors for the CERN Axion Solar Telescope (CAST), including technological pathfinder activities for the future International Axion Observatory (IAXO). The use of low background techniques and the application of discrimination algorithms based on the high granularity of the readout have led to background levels below 10⁻⁶ counts/keV/cm²/s, more than a factor 100 lower than the first generation of Micromegas detectors. The best levels achieved at the Canfranc Underground Laboratory (LSC) are as low as 10⁻⁷ counts/keV/cm²/s, showing good prospects for the application of this technology in IAXO. The current background model, based on underground and surface measurements, is presented, as well as the strategies to further reduce the background level. Finally, we will describe the R&D paths to achieve sub-keV energy thresholds, which could broaden the physics case of axion helioscopes.

One expects three Cosmic Backgrounds: (1) the Cosmic Microwave Background (CMB), which originated 380,000 years after the Big Bang (BB); (2) the Neutrino Background, which decoupled about one second after the BB; and (3) the Cosmic Gravitational Wave Background created by the inflationary expansion, which decoupled directly after the BB. Only the Cosmic Microwave Background (CMB) has been detected and is well studied. Its spectrum follows Planck's black-body radiation formula and shows a remarkably constant temperature of T0γ ≈ 2.7 K independent of the direction. The present photon density is about 370 photons per cm³. The size of the hot spots, which deviate only in the fifth decimal of the temperature from the average value, tells us that the universe is flat. About 380,000 years after the Big Bang, at a temperature of Tγ = 3000 K, already in the matter-dominated era, the electrons combined with the protons and 4He, and the photons moved freely in the neutral universe to form the CMB. So the temperature and distribution of the photons give us information on the universe 380,000 years after the Big Bang. The Cosmic Neutrino Background (CνB) decoupled from matter already one second after the BB, at a temperature of about 10¹⁰ K. Today its temperature is ~1.95 K, the average density of electron neutrinos is 56 per cm³, and the total density of all neutrinos is about 336 per cm³. Measurement of these neutrinos is an extremely challenging experimental problem which can hardly be solved with present technologies. On the other hand, it represents a tempting opportunity to check one of the key elements of Big Bang cosmology and to probe the early stages of the universe. The search for the CνB with the induced beta decay νe + 3H → 3He + e⁻ using KATRIN (KArlsruhe TRItium Neutrino experiment) is the topic of this contribution.
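
The ~1.95 K figure follows from the standard entropy argument: electron-positron annihilation after neutrino decoupling heats the photons but not the neutrinos, so T_ν = (4/11)^(1/3) T_γ. A sketch of the arithmetic (our own check, using T_γ = 2.725 K):

```python
# Neutrino background temperature from the photon temperature:
# after e+e- annihilation, T_nu / T_gamma = (4/11)**(1/3).
T_gamma = 2.725                     # K, present CMB temperature
T_nu = (4 / 11) ** (1 / 3) * T_gamma
print(round(T_nu, 3))               # 1.945 K, the quoted ~1.95 K

# The quoted densities are mutually consistent: 56 nu_e per cm^3 for one
# species implies 6 species (3 flavours plus antiparticles) in total.
n_total = 6 * 56
print(n_total)                      # 336 per cm^3, as quoted
```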

We calculate the tension of the D3-brane in the fivebrane background, which is described by the exactly solvable SU(2)_k × U(1) world-sheet conformal field theory with large Kac-Moody level k. The D3-brane tension is extracted from the amplitude of one closed string exchange between two parallel D3-branes, and the amplitude is calculated by utilizing the open-closed string duality. The tension of the D3-brane in this background does not coincide with the one in flat space-time, even in the flat space-time limit k → ∞. The finite-curvature effect should vanish in the flat space-time limit and only the topological effect can remain. Therefore, the deviation suggests the condensation of the gravitino and/or dilatino which has been expected in the fivebrane background as a gravitational instanton.

... 1203.200 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION INFORMATION SECURITY PROGRAM NASA Information Security Program § 1203.200 Background and discussion. (a) In establishing a... public inspection of that information that is classified to protect the national security. (b)...

The implications of the spectrum and anisotropy of the cosmic microwave background for cosmology are reviewed. Thermalization and processes generating spectral distortions are discussed. Anisotropy predictions are described and compared with observational constraints. If the evidence for large-scale power in the galaxy distribution in excess of that predicted by the cold dark matter model is vindicated, and the observed structure originated via gravitational instabilities of primordial density fluctuations, the predicted amplitude of microwave background anisotropies on angular scales of a degree and larger must be at least several parts in 10⁶.

The present experiment employed standardized test batteries to assess the effects of fast-tempo music on cognitive performance among 56 male and female university students. A linguistic processing task and a spatial processing task were selected from the Criterion Task Set developed to assess verbal and nonverbal performance. Ten excerpts from Mozart's music matched for tempo were selected. Background music increased the speed of spatial processing and the accuracy of linguistic processing. The findings suggest that background music can have predictable effects on cognitive performance. PMID:20865993

In this short paper (extended abstract), a new approach to the generation of electronic background music is presented. The Generative Electronic Background Music System (GEBMS) is positioned among other related approaches within the musical algorithm positioning framework proposed by Woller et al. The music composition process is performed by a number of mini-models parameterized by the properties described further on. The mini-models generate fragments of musical patterns used in the output composition. Musical pattern and output generation are controlled by a container for the mini-models, a host-model. The general mechanism is presented, including an example of the synthesized output compositions.

Physics goals of a Muon Collider (MC) can only be reached with appropriate design of the ring, interaction region (IR), high-field superconducting magnets, machine-detector interface (MDI) and detector. Results of the most recent realistic simulation studies are presented for a 1.5-TeV MC. It is shown that appropriately designed IR and MDI with sophisticated shielding in the detector have a potential to substantially suppress the background rates in the MC detector. The main characteristics of backgrounds are studied.

When string or M theory is compactified to lower dimensions, the U-duality symmetry predicts so-called exotic branes whose higher-dimensional origin cannot be explained by the standard string or M-theory branes. We argue that exotic branes can be understood in higher dimensions as nongeometric backgrounds or U folds, and that they are important for the physics of systems which originally contain no exotic charges, since the supertube effect generically produces such exotic charges. We discuss the implications of exotic backgrounds for black hole microstate (non-)geometries. PMID:20867363

This study examines interviews with 46 undergraduates to explore if participants with differing language and cultural backgrounds view plagiarism or textual appropriation primarily as a) a language problem because of a lack of words of one's own, or b) a cultural challenge as a result of either some first language (L1) cultural training to…

Pensions are an important but comparatively unexamined component of human resource policies in education. In an increasingly competitive world where employees are more mobile than ever, pension policies that were designed in the last century may be out of step with the needs of both individuals and schools. This background paper aims to foster…

CANDLES is a double beta decay experiment using 48Ca in CaF2 crystals. The measurement is being performed with the prototype detector (CANDLES III), aiming at highly sensitive measurements in the future. The recent status of detector improvements and background reduction techniques is described in this paper.

Parrondo's paradox states that there are losing gambling games which, when combined stochastically or in a suitable deterministic way, give rise to winning games. Here we investigate the probabilistic background. We show how the properties of the equilibrium distributions of the Markov chains under consideration give rise to the paradoxical behavior, and we provide methods for finding the best a priori strategies.
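
The equilibrium-distribution mechanism can be illustrated with the classic capital-mod-3 example. The specific win probabilities below are the textbook choice with bias ε = 0.005, an assumption for illustration rather than parameters from the paper: game B alone is losing in its stationary state, yet the 50/50 random mixture of A and B is winning.

```python
# Parrondo's paradox via stationary distributions of a Markov chain on
# capital mod 3.  Game A: win prob 0.5 - eps in every state.  Game B: win
# prob 0.1 - eps when capital % 3 == 0, else 0.75 - eps.  The randomly
# mixed game C averages the two win probabilities state by state.
eps = 0.005
p_A = [0.5 - eps] * 3
p_B = [0.1 - eps, 0.75 - eps, 0.75 - eps]
p_C = [(a + b) / 2 for a, b in zip(p_A, p_B)]

def mean_win_prob(p):
    """Stationary win probability of the chain on states {0, 1, 2}."""
    pi = [1 / 3] * 3                   # start uniform, then power-iterate
    for _ in range(10000):
        pi = [pi[(s - 1) % 3] * p[(s - 1) % 3]          # won into state s
              + pi[(s + 1) % 3] * (1 - p[(s + 1) % 3])  # lost into state s
              for s in range(3)]
    return sum(pi[s] * p[s] for s in range(3))

print(mean_win_prob(p_A) < 0.5)    # True: A is losing
print(mean_win_prob(p_B) < 0.5)    # True: B is losing in equilibrium
print(mean_win_prob(p_C) > 0.5)    # True: the mixture is winning
```

Computing the stationary distribution directly, rather than simulating play, is exactly the equilibrium-distribution viewpoint the abstract describes: the mixture shifts probability mass away from the unfavorable state 0, raising the average win probability above one half.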

Ambient gamma dose rates in air were measured at different locations (indoors and outdoors) to demonstrate the ubiquitous nature of natural background radiation in the environment and to show that levels vary from one location to another, depending on the underlying geology. The effect of a lead shield on a gamma radiation field was also…

A modeling system composed of the global model GEOS-Chem providing hourly lateral boundary conditions to the regional model CMAQ was used to calculate the policy-relevant background level of fine particulate matter. Simulations were performed for the full year of 2004 over the d...

It has proven very difficult to discriminate an actual BW threat from the naturally occurring ambient particulate aerosol, which includes a significant fraction of particles consisting of mixed mineral and biological material. The concentration of interferent (clutter) particles, biological and non-biological, varies widely by location, weather, season, and time of day. Naturally occurring background particulates are composed of fungal and bacterial spores, their fragments and components; plant fragments and debris; and animal fragments and debris, all of which may be associated with inert dust or combustion material. Some or all of these could act as interferents to a biological warfare detector and cause non-specific BW biodetector systems to raise false alarms. I will share analysis of current long-term background data sets.

The cosmic star formation history associated with baryon flows within the large-scale structure of the expanding Universe has many important consequences, such as cosmic chemical and galaxy evolution. Stars and accreting compact objects subsequently produce light, from the radio band to the highest photon energies, and dust within galaxies reprocesses a significant fraction of this light into the IR region. The Universe creates a radiation background that adds to the relic field from the Big Bang, the CMB. In addition, cosmic rays are created on various scales and interact with this diffuse radiation field, and neutrinos are added as well. A multi-messenger field is created whose evolution with redshift contains a tremendous amount of cosmological information. We discuss several aspects of this story, emphasizing the background in the HE regime and the neutrino sector, and the use of gamma-ray sources as probes.

The Cosmic Background Explorer (COBE) Mission will measure the diffuse radiation from the universe in the wavelength band 1 micron to 9.6 mm. The band includes the 3 K cosmic background radiation, the known relic of the primeval cosmic explosion. The COBE satellite will be launched from the Western Space and Missile Center (WSMC) via a Delta launch vehicle into a circular parking orbit of about 300 km. COBE will be placed into a 900-km altitude circular orbit. Coverage will be provided by the Deep Space Network (DSN) for COBE emergencies that would prevent communications via the normal channels of the Tracking and Data Relay Satellite System (TDRSS). Emergency support will be provided by the DSN 26-m subnetwork. Information is given in tabular form for DSN network support, frequency assignments, telemetry, and command.

The study of diffuse backgrounds has played an important role in the recent history of astronomy. From the microwave discovery of the 2.7 K background to the soft X-ray detection of coronal gas to the diffuse H2 emission from warm interstellar gas in our galaxy to the infrared mapping of wisps of dust at high galactic latitudes, diffuse background astronomy has provided fundamental insights into the nature of the universe. As the various regions of the electromagnetic spectrum have been explored, their diffuse backgrounds have been found to arise from the widest possible range of sources: from the local interstellar medium to the farthest reaches of the observable universe; from the wrinkled echo of the Big Bang to the million degree plasma between the stars. Most astronomers are "point-source" astronomers, and the history of astronomy space missions is that few have been dedicated to the elucidation of the nature of the truly diffuse radiation. And yet a large fraction of the total electromagnetic energy in the universe occurs in the form of diffuse radiation. In some spectral ranges, we do not yet know the fraction of radiation that is diffuse; we are dealing with genuinely unexplored frontiers. We will describe the extraordinary science that can be obtained through a MIDEX mission that is dedicated to the exploration of the diffuse emission in the far ultraviolet and soft X-ray regions of the spectrum, where the diffuse radiation is dominated by emission from the hottest components of the interstellar medium and, perhaps, from the intergalactic medium. HUBE currently enjoys the status of being NASA's MIDEX Alternate Astrophysics Mission. We are re-proposing HUBE in the current MIDEX competition with a much broader scientific set of goals, aiming at a definitive spectroscopic survey of the diffuse background over a greatly-expanded spectral range. Our HUBE proposal effort is being supported by Ball Aerospace Corporation.

In this analysis, a reference background stratospheric aerosol optical model is developed based on the nearly global SAGE I satellite observations in the non-volcanic period from March 1979 to February 1980. Zonally averaged profiles of the 1.0-micron aerosol extinction for the tropics and the mid- and high latitudes of both hemispheres are obtained and presented in graphical and tabulated form for the different seasons. In addition, analytic expressions for these seasonal global zonal means, as well as the yearly global mean, are determined according to a third-order polynomial fit to the vertical profile data set. This proposed background stratospheric aerosol model can be useful in modeling studies of stratospheric aerosols and in simulations of atmospheric radiative transfer and radiance calculations in atmospheric remote sensing.
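
The third-order polynomial fit to a vertical extinction profile described above can be sketched in a few lines. The altitude and extinction values below are illustrative placeholders, not SAGE I data, and the helper names (`polyfit3`, `polyval3`) are our own:

```python
# Sketch of the reference-model construction: a third-order (cubic)
# least-squares polynomial fit to a vertical profile of 1.0-micron
# aerosol extinction. Profile values are illustrative only.

def polyfit3(z, y):
    """Least-squares cubic fit y ~ c0 + c1*z + c2*z^2 + c3*z^3."""
    # Build the 4x4 normal equations A c = b for the monomial basis.
    A = [[sum(zi ** (i + j) for zi in z) for j in range(4)] for i in range(4)]
    b = [sum(yi * zi ** i for zi, yi in zip(z, y)) for i in range(4)]
    # Solve by Gaussian elimination with partial pivoting.
    for col in range(4):
        piv = max(range(col, 4), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 4):
            f = A[r][col] / A[col][col]
            for c in range(col, 4):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    c = [0.0] * 4
    for i in range(3, -1, -1):
        c[i] = (b[i] - sum(A[i][j] * c[j] for j in range(i + 1, 4))) / A[i][i]
    return c

def polyval3(c, z):
    """Evaluate the fitted cubic at altitude z."""
    return c[0] + c[1] * z + c[2] * z ** 2 + c[3] * z ** 3

# Illustrative altitudes (km) and log10 extinction values.
z = [15.0, 18.0, 21.0, 24.0, 27.0, 30.0]
y = [-3.2, -3.6, -4.1, -4.5, -5.0, -5.6]
coeffs = polyfit3(z, y)
```

The analytic expressions in the report play the same role as `coeffs` here: a compact stand-in for the tabulated profile that can be evaluated at any altitude.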

Information retrieval systems were among the first medical informatics applications, yet their use has changed substantially in this decade with the growth of end-user computers and the Internet. While early challenges revolved around how to increase the amount of information available in electronic form, more recent challenges center on how to manage the growing volume. Traditional information retrieval issues--such as how to organize and index information to make it more retrievable as well as how to evaluate the effectiveness of systems--are still as pertinent as ever. PMID:9929180

Five studies of the background levels of several perfluorocarbon compounds in Europe are presented here, together with measurements from the European Tracer Experiment (ETEX). The tracers used during the two ETEX releases were the perfluorocarbons (PFCs) perfluoromethylcyclohexane (C7F14, PMCH) and perfluoromethylcyclopentane (C6F12, PMCP). Their background concentrations were measured using both passive and active sampling techniques to define the spatial and temporal variation of the PFCs over Europe. The background variations of four isomers of the PFC compound perfluorodimethylcyclohexane (C8F16, PDCH) were also studied. The results were compared to other PFC tracer studies in the U.S.A. and Europe. The mean and median values of the measured PFCs were found to vary slightly and randomly in space and time. They were higher, and had a larger standard deviation, than the measurements from the American studies. The background concentrations were nevertheless found to be low and stable enough for PFCs to be highly suitable for use in tracer studies. The following concentrations were found: PMCP: 4.6±0.3 fl ℓ⁻¹, PMCH: 4.6±0.8 fl ℓ⁻¹, ocPDCH: 0.96±0.33 fl ℓ⁻¹, mtPDCH: 9.3±0.8 fl ℓ⁻¹, mcPDCH: 8.8±0.8 fl ℓ⁻¹, ptPDCH: 6.1±0.8 fl ℓ⁻¹. A study of the correlation between the measured PFC compounds showed a significant correlation between most of the compounds, which indicates that there are no major PFC sources in Europe.

The purpose of this study was to identify the earth science principles pertinent to current programs in general education at the junior high level, and to determine whether selected textbooks provide adequate representation of those principles. There were 121 principles identified as being essential to the understanding of these five earth science…

Within the derivative expansion of conformally reduced gravity, the modified split Ward identities are shown to be compatible with the flow equations if and only if either the anomalous dimension vanishes or the cutoff profile is chosen to have a power-law form. No solutions exist if the Ward identities are incompatible. In the compatible case, a clear reason is found for why Ward identities can still forbid the existence of fixed points; however, for any cutoff profile, a background-independent (and parametrization-independent) flow equation is uncovered. Finally, expanding in vertices, the combined equations are shown generically to become either overconstrained or highly redundant beyond the six-point level.

Nearly global SAGE I satellite observations in the nonvolcanic period from March 1979 to February 1980 are used to produce a reference background stratospheric aerosol optical model. Zonally averaged profiles of the 1.0-micron aerosol extinction for the tropics, midlatitudes, and high latitudes for both hemispheres are given in graphical and tabulated form for the different seasons. A third-order polynomial fit to the vertical profile data set is used to derive analytic expressions for the seasonal global means and the yearly global mean. The results have application to the simulation of atmospheric radiative transfer and to radiance calculations in atmospheric remote sensing.

Photon counting with lidar returns is usually limited to low light levels, while wide dynamic range is achieved by counting for long times. The broad emission spectrum of inexpensive high-power semiconductor lasers makes receiver filters pass too much background light for traditional photon counting in daylight. Very high-speed photon counting is possible, however, at more than 500 MHz, which allows the construction of eye-safe lidar operating in the presence of bright clouds. Detector improvements could permit counting at up to 20 GHz, producing a single-shot dynamic range of ten decades.

A 48-year-old woman in good general health was referred to the orofacial pain clinic of a centre for special dentistry with a toothache in the premolar region of the left maxillary quadrant. The complaints had existed for 15 years, and various dental treatments, including endodontic treatment, apical surgery, extraction and splint therapy, had not alleviated them. Because anti-epileptic drugs were able to reduce the pain, it was concluded that this 'toothache' satisfied the criteria of an atypical odontalgia: 'toothache' with a neuropathic background. PMID:26181392

Ambient gamma dose rates in air were measured at different locations (indoors and outdoors) to demonstrate the ubiquitous nature of natural background radiation in the environment and to show that levels vary from one location to another, depending on the underlying geology. The effect of a lead shield on a gamma radiation field was also demonstrated to emphasize the important role of shielding in radiation protection. The measurements were carried out with a Geiger-Muller (GM)-based dosimeter and a NaI scintillation gamma-ray spectrometer, which are normally available in physics laboratories. Radioactivity in household materials was demonstrated using a gas mantle as an example.
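
The lead-shield demonstration above follows the standard narrow-beam attenuation law I = I0·exp(−μx). The sketch below applies it with an assumed round-number attenuation coefficient for lead, chosen for illustration rather than taken from the exercise:

```python
import math

# Illustrative calculation of the shielding effect: the transmitted
# fraction of a narrow gamma beam after a lead absorber of given
# thickness. MU_LEAD is an assumed round number (1/cm), not a
# measured value from the demonstration described above.

MU_LEAD = 1.2  # assumed linear attenuation coefficient for lead, 1/cm

def transmitted_fraction(thickness_cm, mu=MU_LEAD):
    """Narrow-beam attenuation: fraction of photons surviving the shield."""
    return math.exp(-mu * thickness_cm)

# With this coefficient, a 2 cm lead shield passes only ~9% of the flux.
frac = transmitted_fraction(2.0)
```

The exponential form is why modest increases in shield thickness give large reductions in the measured dose rate, the effect the classroom demonstration is designed to show.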

Available information on gust structure, airplane reactions, and pertinent operating statistics has been examined. This report attempts to coordinate this information with reference to the prediction of gust loads on airplanes. The material covered represents research up to October 1947. (author)

... with cancer for whom a dose reconstruction must be conducted, as required under 20 CFR 30.115. (b) The... EEOICPA, has promulgated regulations at 20 CFR 30.210 and 30.213 that identify current members of the... AND HEALTH RESEARCH AND RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION UNDER THE...

... with cancer for whom a dose reconstruction must be conducted, as required under 20 CFR 30.115. (b) The... EEOICPA, has promulgated regulations at 20 CFR 30.210 and 30.213 that identify current members of the... compensation may be provided are cancers. There are two categories of covered employees with cancer...

Seven essays are presented that deal with the students, instruction, and administration of Miami-Dade Community College (M-DCC). First, John Losak considers the M-DCC student population since 1969, providing data on ethnicity, age of students, male/female enrollments, foreign student enrollments, program diversity, skill level of enrolling…

Responses to the following are provided: (A) Which isotopes do you (company, agency, university, community) currently use in your activities or distribute (repackage) to end-users? (B) Describe generally what these isotopes are used for, i.e., the science or application. (C) Which isotope(s) do you anticipate may have a significant future increase in demand? Identify the isotope(s), its priority, possible chemical form and for what purpose it would be used. (D) Are there other isotopes that you might use but that are currently unavailable or not available in sufficient quantities? If so, please identify the isotope, from whom you have tried to obtain it and for what purpose it would be used. (E) Do you have any specific issues with respect to the purity, availability, reliability of supply, etc. of isotopes at present?

A review of the US uranium mining industry has revealed a generally depressed industry situation. The 1982 U₃O₈ production from open-pit and underground mines declined to 3800 and 6300 tons, respectively, with the underground portion representing 46% of total production. US exploration and development continued downward in 1982. Employment in the mining and milling sectors dropped 31% and 17%, respectively, in 1982. Representative forecasts were developed for reactor fuel demand and U₃O₈ production for the years 1983 and 1990. Reactor fuel demand is estimated to increase from 15,900 tons to 21,300 tons U₃O₈ over that period. U₃O₈ production, however, is estimated to decrease from 10,600 tons to 9600 tons. A field examination was conducted of 29 selected underground uranium mines that represent 84% of the 1982 underground production. Data were gathered regarding population, land ownership and private property valuation. An analysis was conducted of the increased production cost resulting from the installation of 20-meter-high exhaust borehole vent stacks. An assessment was made of the current and future ²²²Rn emission levels for a group of 27 uranium mines. It is shown that ²²²Rn emission rates from 10 individual operating mines will increase through 1990 by factors of 1.2 to 3.8. For the group of 27 mines as a whole, however, a reduction of total ²²²Rn emissions is predicted because 17 of the mines will be shut down and sealed. The estimated total ²²²Rn emission rate for this group of mines will be 105 Ci/yr by year-end 1983, or 70% of the 1978-79 measured rate, and 124 Ci/yr by year-end 1990, or 83% of the 1978-79 measured rate.

Motion estimation (ME) and motion compensation (MC) using variable block size, sub-pixel search, and multiple reference frames (MRFs) are the major reasons for the improved coding performance of the H.264 video coding standard over other contemporary coding standards. The concept of MRFs is suitable for repetitive motion, uncovered background, non-integer pixel displacement, lighting change, etc. The requirement of index codes for the reference frames, the computational time in ME & MC, and the memory buffer for coded frames limit the number of reference frames used in practical applications. In typical video sequences, the immediately previous frame is used as the reference frame in 68-92% of cases. In this article, we propose a new video coding method using a reference frame [i.e., the most common frame in scene (McFIS)] generated by dynamic background modeling. McFIS is more effective in terms of rate-distortion and computational time performance than the MRF techniques. It also has an inherent capability of scene change detection (SCD) for adaptive group of pictures (GOP) size determination. As a result, we integrate SCD (for GOP determination) with reference frame generation. The experimental results show that the proposed coding scheme outperforms H.264 video coding with five reference frames, as well as two relevant state-of-the-art algorithms, by 0.5-2.0 dB with less computational time.
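
The idea of a background-modeled reference frame can be illustrated with a generic per-pixel running average: static background converges to a stable estimate while moving foreground is averaged out. This is a simplified sketch of dynamic background modeling in general, not the paper's McFIS construction or its H.264 integration; the frames and the blending weight are illustrative:

```python
# Toy per-pixel background model: exponentially weighted running average.
# Static pixels converge to their true value; a moving/changing pixel
# contributes only a small, decaying share per frame.

def update_background(background, frame, alpha=0.05):
    """Blend a new frame into the background estimate (flat pixel lists)."""
    return [(1 - alpha) * b + alpha * f for b, f in zip(background, frame)]

# Three tiny "frames"; the last pixel changes (foreground), the rest are static.
frames = [[10, 10, 200], [10, 10, 50], [10, 10, 220]]
bg = frames[0][:]
for f in frames[1:]:
    bg = update_background(bg, f)
```

A frame built this way can serve as a stable long-term reference: static regions predict well from it, while a large mismatch against it is itself a cue for scene-change detection.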

Pars planitis is defined as an intermediate uveitis of unknown cause, without underlying systemic disease, with characteristic formations such as vitreous snowballs, snowbanks and changes in the peripheral retina. The incidence of pars planitis varies between 2.4 and 15.4% of uveitis patients. The pathogenesis of the disease remains to be determined. Clinical and histopathological findings suggest an autoimmune etiology, most likely a reaction to an endogenous antigen of unknown source, with T cells predominant in both vitreous and pars plana infiltrations. T cell subsets play an important role as memory-effector peripheral cells. Snowbanks are formed as an effect of post-inflammatory glial proliferation of fibrous astrocytes. There is also a genetic predisposition for pars planitis through human leukocyte antigen and several other genes. A coexistence with multiple sclerosis and optic neuritis has been described in numerous studies. Epiretinal membrane, cataract, cystoid macular edema, retinal detachment, retinal vasculitis, neovascularization, peripheral vitreous traction, peripheral hole formation, vitreous hemorrhage and disc edema are common complications observed in pars planitis. There is a need to expand knowledge of the pathogenic and immunologic background of pars planitis in order to create accurate pharmacological treatment. PMID:26438050

One of the main areas of research is the theory of cosmic microwave background (CMB) anisotropies and analysis of CMB data. Using the four-year COBE data we were able to improve existing constraints on global shear and vorticity. We found that, in the flat case (which allows for the greatest anisotropy), (ω/H)₀ < 10⁻⁷, where ω is the vorticity and H is the Hubble constant. This is two orders of magnitude lower than the tightest previous constraint. We have defined a new set of statistics which quantify the amount of non-Gaussianity in small-field cosmic microwave background maps. By looking at the distribution of power around rings in Fourier space, and at the correlations between adjacent rings, one can identify non-Gaussian features which are masked by large-scale Gaussian fluctuations. This may be particularly useful for identifying unresolved localized sources and line-like discontinuities. Levin and collaborators devised a method to determine the global geometry of the universe through observations of patterns in the hot and cold spots of the CMB. We have derived properties of the peaks (maxima) of the CMB anisotropies expected in flat and open CDM models. We present results for angular resolutions ranging from 5 arcmin to 20 arcmin (antenna FWHM), scales that are relevant for the MAP and COBRA/SAMBA space missions and for ground-based interferometers. Results related to galaxy formation and evolution are also discussed.

Provides historical information on Cuba. Addresses early colonization, the advent of plantation agriculture, the role and presence of the United States in the Caribbean and Cuba, and the social and economic developments in Cuba after the revolution in 1959 led by Fidel Castro. (CMK)

Designed to serve as an introduction to some aspects of Korean culture and civilization, this text consists largely of lectures on various topics prepared by staff members of the Defense Language Institute. The major section on the Republic of South Korea includes information on: (1) the historical setting; (2) the politico-military complex; (3)…

A background-canceling long-range alpha detector is described which is capable of providing output proportional both to the alpha radiation emitted from a surface and to radioactive gas emanating from the surface. The detector operates by using an electric field between first and second signal planes, an enclosure, and the surface or substance to be monitored for alpha radiation. The first and second signal planes are maintained at the same voltage with respect to the electrically conductive enclosure, reducing leakage currents. In the presence of alpha radiation and radioactive gas decay, the signal from the first signal plane is proportional to both the surface alpha radiation and the airborne radioactive gas, while the signal from the second signal plane is proportional only to the airborne radioactive gas. The difference between these two signals is proportional to the surface alpha radiation alone.
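
The two-plane cancellation scheme reduces to a simple subtraction: plane 1 sees surface alphas plus the airborne gas component, plane 2 sees the airborne component alone. The sketch below just encodes that arithmetic; the current values are illustrative numbers, not instrument data:

```python
# Background cancellation by differencing the two signal planes:
#   plane 1 current ∝ surface alpha + airborne radioactive gas
#   plane 2 current ∝ airborne radioactive gas only
# so the difference isolates the surface-alpha contribution.

def surface_alpha_current(plane1_current, plane2_current):
    """Return the background-canceled surface-alpha ionization current."""
    return plane1_current - plane2_current

# Example: 12 pA total on plane 1, of which 5 pA is airborne gas seen
# equally by plane 2 (illustrative values, in amperes).
surface = surface_alpha_current(12.0e-12, 5.0e-12)
```

Because both planes sit at the same potential relative to the enclosure, the common-mode terms (airborne gas, leakage) cancel in the difference, which is the whole point of the design.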

The GERDA experiment operates bare germanium diodes enriched in ⁷⁶Ge in an environment of pure liquid argon to search for neutrinoless double beta decay. A very low radioactive background is essential for the success of the experiment. We present here the research done in order to remove radio-impurities coming from the liquid argon, the stainless steel cryostat and the front-end electronics. We found that liquid argon can be purified efficiently of ²²²Rn. The main source of ²²²Rn in GERDA is the cryostat, which emanates about 55 mBq. A thin copper shroud in the center of the cryostat was implemented to prevent radon from approaching the diodes. Gamma-ray screening of radio-pure components for front-end electronics resulted in the development of a pre-amplifier with a total ²²⁸Th activity of less than 1 mBq.

Double neutron stars are one of the most promising sources for terrestrial gravitational wave interferometers. For current interferometers and their planned upgrades, the probability of having a signal present in the data is small, but as the sensitivity improves the detection rate increases, and the waveforms may start to overlap, creating a confusion background that ultimately limits the capabilities of future detectors. The third-generation Einstein Telescope, with a horizon of z > 1 and a very low frequency "seismic wall", may be affected by such confusion noise. At a minimum, careful data analysis will be required to separate signals which will appear confused. This result should be borne in mind when designing highly advanced future instruments.

In this work we present an extension of the ROMA map-making code for data analysis of Cosmic Microwave Background polarization, with particular attention given to the inflationary polarization B-modes. The new algorithm takes into account a possible cross-correlated noise component among the different detectors of a CMB experiment. We tested the code on the observational data of the BOOMERanG (2003) experiment and show that it provides a better estimate of the power spectra; in particular, the error bars of the BB spectrum are up to 20% smaller at low multipoles. We point out the general validity of the new method. A possible future application is the LSPE balloon experiment, devoted to the observation of polarization at large angular scales.

Can the background affect a foreground target in distant, low-quality imagery? If it does, it might occur in our mind, or perhaps it may represent a snapshot of our early vision. An affirmative answer, one way or another, may affect our current understanding of this phenomenon and potentially its related applications. How can we be sure about this in the psycho-physical sense? We begin with the physiology of our brain's homeostasis, in which an isothermal equilibrium is characterized by the minimum of the Helmholtz isothermal free energy: A = U − T₀S ≥ 0, where T₀ = 37 °C, the Boltzmann entropy S = k_B ln(W), and U is the unknown internal energy to be computed.
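
The free-energy criterion quoted above is straightforward to evaluate numerically. In this sketch the internal energy U and multiplicity W are illustrative placeholders; only the arithmetic A = U − T₀·k_B·ln(W) comes from the text:

```python
import math

# Numerical sketch of the isothermal free-energy criterion A = U - T0*S >= 0
# with Boltzmann entropy S = k_B * ln(W), evaluated at body temperature.
# U and W below are illustrative placeholder values.

K_B = 1.380649e-23   # Boltzmann constant, J/K
T0 = 310.15          # 37 degrees C expressed in kelvin

def helmholtz_free_energy(U, W):
    """Return A = U - T0 * k_B * ln(W) for internal energy U (J), multiplicity W."""
    S = K_B * math.log(W)
    return U - T0 * S

A = helmholtz_free_energy(U=1.0e-19, W=1.0e6)  # joules, illustrative
```

Note that the temperature must enter in kelvin (310.15 K for 37 °C) for the T₀S term to carry the correct units of joules.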

It is a continuous and ongoing effort to maintain radioactivity in materials and in the environment surrounding most underground experiments at very low levels. These low levels are required so that experiments can achieve the sensitivities needed for the detection of low-energy neutrinos and for searches for dark matter and neutrinoless double-beta decay. SNOLAB has several facilities which are used to determine these low background levels in the materials and the underground environment. This paper describes the SNOLAB high-purity germanium detector, which has been in continuous use for the past five years, and gives results for many of the items that have been counted over that period. Brief descriptions of SNOLAB's alpha-beta and electrostatic counters are given, and the radon levels at SNOLAB are discussed.

We use numerical simulations to calculate the cosmic microwave background anisotropy induced by the evolution of a global texture field, with special emphasis on individual textures. Both spherically symmetric and general configurations are analyzed, and in the latter case we consider field configurations which exhibit unwinding events and also ones which do not. We compare the results given by evolving the field numerically under both the expanded core (XCORE) and non-linear sigma model (NLSM) approximations with the analytic predictions of the NLSM exact solution for a spherically symmetric self-similar (SSSS) unwinding. We find that the random unwinding configuration spots' typical peak height is 60-75% and angular size typically only 10% of those of the SSSS unwinding, and that random configurations without an unwinding event may nonetheless generate indistinguishable hot and cold spots. A brief comparison is made with other work.

An analytic expression for the expected nucleotide diversity is obtained for a neutral locus in a region with deleterious mutation and recombination. Our analytic results are used to predict levels of variation for the entire third chromosome of Drosophila melanogaster. The predictions are consistent with the low levels of variation that have been observed at loci near the centromeres of the third chromosome of D. melanogaster. However, the low levels of variation observed near the tips of this chromosome are not predicted using currently available estimates of the deleterious mutation rate and of selection coefficients. If considerably smaller selection coefficients are assumed, the low observed levels of variation at the tips of the third chromosome are consistent with the background selection model. 33 refs., 4 figs., 1 tab.

Most cosmologists now believe that we live in an evolving universe that has been expanding and cooling since its origin about 15 billion years ago. Strong evidence for this standard cosmological model comes from studies of the cosmic microwave background radiation (CMBR), the remnant heat from the initial fireball. The CMBR spectrum is blackbody, as predicted from the hot Big Bang model before the discovery of the remnant radiation in 1964. In 1992 the Cosmic Background Explorer (COBE) satellite finally detected the anisotropy of the radiation: fingerprints left by tiny temperature fluctuations in the initial bang. Careful design of the COBE satellite, and a bit of luck, allowed the 30 microK fluctuations in the CMBR temperature (2.73 K) to be pulled out of instrument noise and spurious foreground emissions. Further advances in detector technology and experiment design are allowing current CMBR experiments to search for predicted features in the anisotropy power spectrum at angular scales of 1 degree and smaller. If they exist, these features were formed at an important epoch in the evolution of the universe: the decoupling of matter and radiation at a temperature of about 4,000 K and a time about 300,000 years after the bang. CMBR anisotropy measurements directly probe some detailed physics of the early universe. Also, parameters of the cosmological model can be measured, because the anisotropy power spectrum depends on constituent densities and the horizon scale at a known cosmological epoch. As sophisticated experiments on the ground and on balloons pursue these measurements, two CMBR anisotropy satellite missions are being prepared for launch early in the next century. PMID:9419320

We report measurements of photon and neutron radiation levels observed while transmitting a 0.43 MW electron beam through millimeter-sized apertures and during beam-off, accelerating-gradient RF-on operation. These measurements were conducted at the Free-Electron Laser (FEL) facility of the Jefferson National Accelerator Laboratory (JLab) using a 100 MeV electron beam from an energy-recovery linear accelerator. The beam was directed successively through 6 mm, 4 mm, and 2 mm diameter apertures of length 127 mm in aluminum at a maximum current of 4.3 mA (430 kW beam power). This study was conducted to characterize radiation levels for experiments that need to operate in this environment, such as the proposed DarkLight experiment. We find that sustained transmission of a 430 kW CW beam through a 2 mm aperture is feasible with manageable beam-related backgrounds. We also find that during beam-off, RF-on operation, field emission inside the niobium cavities of the accelerator cryomodules is the primary source of ambient radiation.

Studies investigating neurobiological bases of negative symptoms of schizophrenia failed to provide consistent findings, possibly due to the heterogeneity of this psychopathological construct. We tried to review the findings published to date investigating neurobiological abnormalities after reducing the heterogeneity of the negative symptoms construct. The literature in electronic databases as well as citations and major articles are reviewed with respect to the phenomenology, pathology, genetics and neurobiology of schizophrenia. We searched PubMed with the keywords "negative symptoms," "deficit schizophrenia," "persistent negative symptoms," "neurotransmissions," "neuroimaging" and "genetic." Additional articles were identified by manually checking the reference lists of the relevant publications. Publications in English were considered, and unpublished studies, conference abstracts and poster presentations were not included. Structural and functional imaging studies addressed the issue of neurobiological background of negative symptoms from several perspectives (considering them as a unitary construct, focusing on primary and/or persistent negative symptoms and, more recently, clustering them into factors), but produced discrepant findings. The examined studies provided evidence suggesting that even primary and persistent negative symptoms include different psychopathological constructs, probably reflecting the dysfunction of different neurobiological substrates. Furthermore, they suggest that complex alterations in multiple neurotransmitter systems and genetic variants might influence the expression of negative symptoms in schizophrenia. On the whole, the reviewed findings, representing the distillation of a large body of disparate data, suggest that further deconstruction of negative symptomatology into more elementary components is needed to gain insight into underlying neurobiological mechanisms. PMID:25797499

The cosmic infrared background (CIB) contains information about galaxy luminosities over the entire history of the Universe and can be a powerful diagnostic of early populations otherwise inaccessible to telescopic studies. Its measurement is very difficult because of the strong IR foregrounds from the Solar system and the Galaxy. Nevertheless, substantial recent progress in measuring the CIB and its structure has been made. The measurements now allow significant constraints to be set on early galaxy evolution and, perhaps, even the detection of the elusive Population III era. We discuss briefly the theory behind the CIB, review the latest measurements of the CIB and its structure, and discuss their implications for detecting and/or constraining the first stars and their epochs.

Anisotropies in the Cosmic Microwave Background (CMB) contain a wealth of information about the past history of the universe and the present values of cosmological parameters. I outline some of the theoretical advances of the last few years. In particular, I emphasize that for a wide class of cosmological models, theorists can accurately calculate the spectrum to better than a percent. The spectrum of anisotropies today is directly related to the pattern of inhomogeneities present at the time of recombination. This recognition leads to a powerful argument that will enable us to distinguish inflationary models from other models of structure formation. If the inflationary models turn out to be correct, the free parameters in these models will be determined to unprecedented accuracy by the upcoming satellite missions.

There is a wealth of cosmological information encoded in the spatial power spectrum of temperature anisotropies of the cosmic microwave background. Experiments designed to map the microwave sky are returning a flood of data (time streams of instrument response as a beam is swept over the sky) at several different frequencies (from 30 to 900 GHz), all with different resolutions and noise properties. The resulting analysis challenge is to estimate, and to quantify our uncertainty in, the spatial power spectrum of the cosmic microwave background given the complexities of "missing data", foreground emission, and complicated instrumental noise. A Bayesian formulation of this problem allows consistent treatment of many complexities, including complicated instrumental noise and foregrounds, and can be implemented numerically with Gibbs sampling. Gibbs sampling has now been validated as an efficient, statistically exact, and practically useful method for low-resolution analysis (as demonstrated on WMAP 1-year and 3-year temperature and polarization data). Development is continuing for Planck: the goal is to exploit the unique capabilities of Gibbs sampling to directly propagate uncertainties in both foreground and instrument models to the total uncertainty in cosmological parameters.
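
Gibbs sampling, the workhorse method in the abstract above, draws from a joint distribution by cycling through exact conditional draws. The toy below samples a correlated bivariate Gaussian this way; it is a generic illustration of the technique, not the CMB power-spectrum pipeline, and all names and parameters are our own:

```python
import random

# Toy Gibbs sampler for a standard bivariate Gaussian with correlation rho.
# The exact conditionals are x|y ~ N(rho*y, 1-rho^2) and symmetrically for y,
# so alternating draws from them converges to the joint distribution.

def gibbs_bivariate_normal(rho, n_samples, seed=0):
    rng = random.Random(seed)
    cond_sd = (1.0 - rho * rho) ** 0.5  # sd of each conditional
    x, y = 0.0, 0.0
    xs, ys = [], []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, cond_sd)  # draw x | y
        y = rng.gauss(rho * x, cond_sd)  # draw y | x
        xs.append(x)
        ys.append(y)
    return xs, ys

xs, ys = gibbs_bivariate_normal(rho=0.8, n_samples=20000)
```

In the CMB application the roles of x and y are played by the sky signal and the power spectrum: each is drawn conditional on the other, so the chain explores the full joint posterior, foregrounds and noise included.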

Background The Parties to the United Nations Framework Convention on Climate Change (UNFCCC) are required to develop and report a national inventory of greenhouse gases not controlled by the Montreal Protocol. In the Asia region, "Workshops on Greenhouse Gas Inventories in Asia (WGIA)" have been organised annually since 2003 with the support of the government of Japan. The WGIAs promote information exchange in the region to support countries' efforts to improve the quality of greenhouse gas inventories. This paper reports the major outcomes of the WGIAs and discusses the key aspects of information exchange in the region for the improvement of inventories. Results The major outcomes of the WGIAs, intended to help countries improve GHG inventories, can be summarised as follows: (1) identification of common issues and possible solutions by sector, (2) reporting of country inventory practices, and (3) verification against the UNFCCC reporting requirements. Conclusion The workshops provided the opportunity for countries to share common issues and constraints pertinent to GHG inventories and to exchange information regarding possible solutions based on their own experience. The relevance of information exchange depends on the emission sources involved, their emitting mechanisms, and the technologies used. Information exchange about emission sources that are unique to Asia, such as those of the agriculture sector, contributes significantly to the accumulation of knowledge at the regional and global levels. Enabling countries to verify their national circumstances against the reporting requirements under the UNFCCC is also an essential part of the WGIA information exchange activities. PMID:16930465

Supernumerary teeth (ST) are an odontostomatologic anomaly characterized by the presence of an excessive number of teeth relative to the normal dental formula. This condition is commonly seen in several congenital genetic disorders such as Gardner's syndrome, cleidocranial dysostosis, and cleft lip and palate. Less common syndromes associated with ST are Fabry disease, Ellis-van Creveld syndrome, Nance-Horan syndrome, Rubinstein-Taybi syndrome, and Trico-Rhino-Phalangeal syndrome. ST can be an important component of a distinctive disorder and an important clue for early diagnosis. Early detection of these abnormalities enables correct management of the patient and is important for making well-informed decisions about long-term medical care and treatment. In this review, the genetic syndromes related to ST are discussed. PMID:25713500

Extensive source terms for beta, gamma, and neutrons following fission pulses are presented in various tabular and graphical forms. Neutron results from a wide range of fissioning nuclides (42) are examined and detailed information is provided for four fuels: ²³⁵U, ²³⁸U, ²³²Th, and ²³⁹Pu; these bracket the range of the delayed spectra. Results at several cooling (decay) times are presented. For β⁻ and γ spectra, only ²³⁵U and ²³⁹Pu results are given; fission-product data are currently inadequate for other fuels. The data base consists of all known measured data for individual fission products, extensively supplemented with nuclear model results. The process is evolutionary, and therefore the current base is summarized in sufficient detail for users to judge its quality. Comparisons with recent delayed neutron experiments and total β⁻ and γ decay energies are included. 27 refs., 47 figs., 9 tabs.

The recent Advanced LIGO detection of gravitational waves from the binary black hole GW150914 suggests there exists a large population of merging binary black holes in the Universe. Although most are too distant to be individually resolved by advanced detectors, the superposition of gravitational waves from many unresolvable binaries is expected to create an astrophysical stochastic background. Recent results from the LIGO and Virgo Collaborations show that this astrophysical background is within reach of Advanced LIGO. In principle, the binary black hole background encodes interesting astrophysical properties, such as the mass distribution and redshift distribution of distant binaries. However, we show that this information will be difficult to extract with the current configuration of advanced detectors (and using current data analysis tools). Additionally, the binary black hole background also constitutes a foreground that limits the ability of advanced detectors to observe other interesting stochastic background signals, for example, from cosmic strings or phase transitions in the early Universe. We quantify this effect.

In this study, a new method for suppressing the background odor effect is proposed. Since odor sensors respond to background odors in addition to a target odor, it is difficult to extract the target odor information. In conventional odor sensing systems, the effect of background odors is compensated by subtracting the response to the background odors (the baseline response). Although this simple subtraction is effective for constant background odors, it fails to compensate for time-varying background odors. The proposed background-suppression method is effective even for time-varying background odors.
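The contrast the abstract draws can be sketched as follows: plain baseline subtraction assumes a constant background, while even a simple adaptive baseline (here an exponential moving average, which is an illustrative stand-in and not the paper's actual method) tracks a drifting background:

```python
def adaptive_baseline_subtract(signal, alpha=0.05):
    """Subtract a slowly tracking baseline from a sensor time series.

    Plain baseline subtraction removes one fixed response and so fails
    under drift; here the baseline is an exponential moving average, so
    slow background changes are followed and removed while fast changes
    (a target odor arriving) still stand out. Generic illustration only.
    """
    baseline = signal[0]
    corrected = []
    for x in signal:
        baseline += alpha * (x - baseline)  # track slow drift
        corrected.append(x - baseline)      # what remains is the fast part
    return corrected
```

A step change in the input survives subtraction briefly (that transient carries the target information), while any constant or slowly drifting component is suppressed.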

Digital technologies enable the storage of vast amounts of information, accessible with remarkable ease. However, along with this facility comes the challenge of finding pertinent information among volumes of nonrelevant information. The present article describes the pearl-harvesting methodological framework for information retrieval. Pearl…

The cosmic infrared background (CIB) reflects the sum total of galactic luminosities integrated over the entire age of the universe. Because the CIB comprises red-shifted starlight together with dust-absorbed and re-radiated starlight, its measurement can be used to determine (or constrain) the rates of star formation and metal production as a function of time, and to deduce information about objects at epochs currently inaccessible to telescopic studies. This review discusses the state of current CIB measurements and the (mostly space-based) instruments with which these measurements have been made, the obstacles (the various foreground emissions), and the physics behind the CIB and its structure. Theoretical discussion of the CIB levels can now be normalized to the standard cosmological model, narrowing down theoretical uncertainties. We review the information behind and theoretical modeling of both the mean (isotropic) levels of the CIB and their fluctuations. The CIB is divided into three broad bands: near-IR (NIR), mid-IR (MIR), and far-IR (FIR). For each of the bands we review the main contributors to the CIB flux and the epochs at which the bulk of the flux originates. We also discuss the data on the various quantities relevant for correct interpretation of the CIB levels: the star-formation history, present-day luminosity function measurements, resolving the various galaxy contributors to the CIB, etc. The integrated light of all galaxies in the deepest NIR galaxy counts to date fails to match the observed mean level of the CIB, probably indicating a significant high-redshift contribution to the CIB. Additionally, Population III stars should have left a strong and measurable signature via their contribution to the CIB anisotropies for a wide range of their formation scenarios, and measuring the excess CIB anisotropies coming from high z would provide direct information on the epoch of the first stars.

Electron microprobe trace element analysis is a significant challenge, but can provide critical data when high spatial resolution is required. Because of the low peak intensity, the accuracy and precision of such analyses rely critically on background measurements and on the accuracy of any pertinent peak interference corrections. A linear regression between two points selected at appropriate off-peak positions is the classical approach to background characterization in microprobe analysis. However, this approach disallows an accurate assessment of background curvature (usually exponential). Moreover, if present, background interferences can dramatically affect the results if underestimated or ignored. Acquiring a quantitative WDS scan over the spectral region of interest is still a valuable option for determining the background intensity and curvature from a regression fitted to background portions of the scan, but this technique retains an element of subjectivity because the analyst has to select areas of the scan that appear to represent background. We present here a new method, "Multi-Point Background" (MPB), that allows the acquisition of up to 24 off-peak background measurements from wavelength positions around the peaks. The method aims to improve the accuracy, precision, and objectivity of trace element analysis. Overall efficiency is improved because no systematic WDS scan needs to be acquired in order to check for possible background interferences. Moreover, the method is less subjective because "true" backgrounds are selected by the statistical exclusion of erroneous background measurements, reducing the need for analyst intervention. This idea originated from efforts to refine EPMA monazite U-Th-Pb dating, where it was recognised that background errors (from peak interference or background curvature) could result in errors of several tens of millions of years in the calculated age. Results obtained on a CAMECA SX-100 "UltraChron" using monazite
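The exclusion-and-regression idea can be sketched as below. The exponential background model, the log-space linear fit, and the fixed number of excluded points are simplifying assumptions for illustration, not the published MPB procedure:

```python
import math

def fit_background(positions, counts, n_exclude=2):
    """Fit an exponential background I(x) = amp * exp(b * x) to off-peak
    measurements by linear regression in log space, after discarding the
    n_exclude points with the largest residuals from a first fit (a crude
    stand-in for the MPB statistical-exclusion step that rejects points
    hit by background interferences).
    """
    def linfit(pts):
        n = len(pts)
        sx = sum(p[0] for p in pts)
        sy = sum(p[1] for p in pts)
        sxx = sum(p[0] ** 2 for p in pts)
        sxy = sum(p[0] * p[1] for p in pts)
        b = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
        a = (sy - b * sx) / n
        return a, b

    pts = [(x, math.log(c)) for x, c in zip(positions, counts)]
    a, b = linfit(pts)                                 # first pass, all points
    resid = [abs(y - (a + b * x)) for x, y in pts]
    keep = sorted(range(len(pts)), key=lambda i: resid[i])[: len(pts) - n_exclude]
    a, b = linfit([pts[i] for i in keep])              # refit on "true" background
    return math.exp(a), b
```

Points sitting on an interference line stand well above the fitted curve, so ranking by residual tends to isolate them before the final fit.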

An image segmentation method refining background extraction in two phases is presented. In the first phase, the method detects homogeneous-background blocks and estimates the local background to be extracted throughout the image. A block is classified homogeneous if its left and right standard deviations are small. The second phase of the method refines background extraction in nonhomogeneous blocks by recomputing the shoulder thresholds. Rules that predict the final background extraction are derived by observing the behavior of successive background statistical measurements in the regions under the presence of dark and/or bright object pixels. Good results are shown for a number of outdoor scenes.

... UNEMPLOYMENT ASSISTANCE § 625.19 Information, reports and studies. (a) Routine responses. State agencies shall... Disaster Unemployment Assistance as the result of a major disaster in the State have been made, the State... chronological list of significant events, pertinent statistics about the Disaster Unemployment...

California State Dept. of Education, Sacramento. Career Education Task Force.

The annotated bibliography, listing source materials pertinent to career education, is arranged alphabetically within each of eight sections. The first four sections present selected sources of information on: (1) the nature of students in the 1970's, (2) the occupational market, (3) statements of objectives, and (4) methods of developing…

The author's conceptualization of an Information Commons (IC) is revisited and elaborated in reaction to Bailey and Tierney's article. The IC's role as testbed for instructional support and knowledge discovery is explored, and progress on pertinent research is reviewed. Prospects for media-rich learning environments relate the IC to the…

Background In the post-genomic era, systems-level studies are being performed that seek to explain complex biological systems by integrating diverse resources from fields such as genomics, proteomics or transcriptomics. New information management systems are now needed for the collection, validation and analysis of the vast amount of heterogeneous data available. Multiple alignments of complete sequences provide an ideal environment for the integration of this information in the context of the protein family. Results MACSIMS is a multiple alignment-based information management program that combines the advantages of both knowledge-based and ab initio sequence analysis methods. Structural and functional information is retrieved automatically from the public databases. In the multiple alignment, homologous regions are identified, and the retrieved data are evaluated and propagated from known to unknown sequences within these reliable regions. In a large-scale evaluation, the specificity of the propagated sequence features is estimated to be >99%, i.e. very few false positive predictions are made. MACSIMS is then used to characterise mutations in a test set of 100 proteins that are known to be involved in human genetic diseases. The number of sequence features associated with these proteins was increased by 60% compared with the features available in the public databases. An XML format output file allows automatic parsing of the MACSIMS results, while a graphical display using the JalView program allows manual analysis. Conclusion MACSIMS is a new information management system that incorporates detailed analyses of protein families at the structural, functional and evolutionary levels. MACSIMS thus provides a unique environment that facilitates knowledge extraction and the presentation of the most pertinent information to the biologist. A web server and the source code are available at . PMID:16792820

Disclosure of information prior to consent is a very complex area of medical ethics. On the surface it would seem to be quite clear cut, but on closer inspection the scope for 'grey areas' is vast. In practice, it could be argued that the number of cases resulting in complaint or litigation is comparatively small. However, this does not mean that wrong decisions or unethical scenarios do not occur. It would seem that in clinical practice these ethical grey areas concerning patients' full knowledge of their condition or treatment are quite common. One barometer of how much disclosure should be given prior to consent could be the feedback obtained from patients. Are they asking questions pertinent to their condition, and do they show a good understanding of the options available? This should be seen as a positive trait and welcomed by healthcare professionals. Ultimately it gives patients greater autonomy, and the healthcare professional can expand and build on the patient's knowledge as well as allay fears perhaps based on wrongly held information. Greater communication with the patient would help the healthcare professional pitch their explanations at the right level. Every case and scenario is different and unique and deserves to be treated as such. Studies have shown that most patients can understand their medical condition and treatment provided communication has been thorough (Gillon 1996). It is in patients' best interests to feel comfortable with the level of disclosure offered to them. It can only foster greater trust and respect between patients and the healthcare profession, to the benefit of both parties. PMID:16939165

Congress of the U.S., Washington, DC. Office of Technology Assessment.

This is the second publication from the Office of Technology Assessment's assessment on information technology and research, which was requested by the House Committee on Science and Technology and the Senate Committee on Commerce, Science, and Transportation. The first background paper, "High Performance Computing & Networking for Science,"…

The relationship between housing and family well-being and self-sufficiency is explored in this paper's three major sections. The first section provides background information on the relationship between housing and the physical and social well-being of individuals and families and describes the major federal housing initiatives that have been…

Information for newly appointed heads of graduate departments of psychology is presented as background material for the 1974 Chairman's Workshop. Topics include the following: the budgetary situation, pressures for increased teaching loads, effects upon recruiting faculty, faculty morale, graduate and undergraduate student morale, the intellectual…

Background information is given on the problems caused to anadromous fish migrations, especially salmon and steelhead trout, by the development of hydroelectric power dams on the Columbia River and its tributaries. Programs arising out of the Pacific Northwest Electric Power Planning and Conservation Act of 1980 to remedy these problems and restore fish and wildlife populations are described. (ACR)

In educational research the problem of student description is eternal. On what basis do researchers make decisions about aspects of students' material lives to count as data, interpretive categories, contextual information, results? This paper focuses specifically on the problem of "background" in researching the student subject. The paper argues…

Although it is lengthy, the Family Background Questionnaire (FBQ) provides reliable behaviorally specific family history information. Results from reliability and validity analyses suggest that a brief version of this instrument that assesses parental responsiveness, child maltreatment, and parental substance abuse would provide a useful screening…

In this article, part I of a series, the forensic methods used in "typing" human blood, which as physical evidence is often found in the dried state, are outlined. Background information about individualization, antibody typing, fresh blood, dried blood, and additional systems is provided. (CW)

Many employers in the United States have been initiating or expanding policies requiring background checks of prospective employees. The ability to perform such checks has been abetted by the growth of computerized databases and of commercial enterprises that facilitate access to personal information. Employers now have ready access to public…

An effective method for detecting small, dim moving targets against complicated backgrounds is proposed. The approach takes advantage of the non-local means filter and applies a novel weight-calculation model, based on a circular mask, to the original background-estimation pattern. By associating the similarity of the grayscale distribution of the images with temporal information, the extended method estimates the complicated background precisely and extracts point targets successfully. To compare existing target detection methods with the proposed one, signal-to-clutter ratio gain (SCRG) and background suppression factor (BSF) are employed for spatial performance comparison, and receiver operating characteristics (ROC) are used for detection-performance comparison along the target trajectory. Experimental results demonstrate good performance of the proposed method for infrared images of complicated scenes, especially images with low signal-to-noise ratio.
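The two spatial metrics named here have standard definitions in the infrared small-target literature; a minimal sketch (normalisation conventions vary slightly between papers):

```python
def scrg(s_in, c_in, s_out, c_out):
    """Signal-to-clutter ratio gain: (S/C)_out / (S/C)_in, where S is the
    target peak amplitude and C the local clutter standard deviation,
    measured before (in) and after (out) background suppression."""
    return (s_out / c_out) / (s_in / c_in)

def bsf(c_in, c_out):
    """Background suppression factor: ratio of the clutter standard
    deviation before suppression to that after suppression."""
    return c_in / c_out
```

For example, a filter that leaves a target's amplitude at 10 while shrinking clutter from sigma = 5 to sigma = 1 achieves an SCRG (and BSF) of 5.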

Because of the relatively large distance between an infrared imaging system and its target, or the wide field of the infrared optics, the imaging area of an infrared target is only a few pixels, appearing as an isolated spot in the field of view. Only intensity information (gray values) is available for target detection. At the same time, infrared images have many shortcomings, such as heavy noise and interference, so a small target is often buried in the background and noise. Such targets are difficult to detect, and in general a single frame does not permit reliable detection. The core of an infrared small-target detection algorithm is therefore background and noise suppression in a single frame. Addressing these problems, a shearlet-based background suppression algorithm for infrared images is proposed. The algorithm builds on shearlets, which are specifically designed to represent anisotropic and directional information at various scales. This transform provides an optimally efficient representation of images, greatly reducing the amount of information while preserving what is useful. The paper first introduces the principle of shearlets, then develops the theory of the algorithm and explains its implementation steps, and finally gives simulation results. In Matlab simulations with several sets of infrared images, the results conformed to the theory of shearlet-based background suppression. The results show that the method can effectively suppress background, improve the SCR, and achieve satisfactory results against sky backgrounds. The method is effective for target detection, recognition, and tracking in future infrared imaging systems.

Background: Schools are the major locations for implementing children’s dietary behavior related educational or interventional programs. Recently, there has been an increase in school-based nutrition interventions. The objective of this systematic review was to overview the evidence for the effectiveness of school-based nutrition interventions on fruit and vegetable consumption. Methods: PubMed was used to search for articles on school-based nutrition interventions that measured students’ fruit and vegetable consumption. Our search yielded 238 articles. An article was included if it was published in a peer-reviewed journal, written in English, administered in the United States, and conducted among a population-based sample of children in kindergarten through eighth grade. A total of 14 publications met the inclusion criteria. Results: Eight articles successfully showed a positive effect on increasing fruit and/or vegetable consumption while the other six did not. Several factors, including (but not limited to) intervention duration, type of theory used, style of intervention leadership, and positive effects on antecedents of fruit and vegetable consumption were compared; however, no dominant factor was found to be shared among the studies with significant findings. Given that the criteria for selection were high, the lack of consistency between interventions and positive outcomes was surprising. Conclusion: With high levels of scrutiny and budget constraints on school nutrition, it is imperative that more research be conducted to identify effective intervention components. PMID:27123430

Product aspect recognition is a key task in fine-grained opinion mining. Current methods primarily focus on the extraction of aspects from the product reviews. However, it is also important to cluster synonymous extracted aspects into the same category. In this paper, we focus on the problem of product aspect clustering. The primary challenge is to properly cluster and generalize aspects that have similar meanings but different representations. To address this problem, we learn two types of background knowledge for each extracted aspect based on two types of effective aspect relations: relevant aspect relations and irrelevant aspect relations, which describe two different types of relationships between two aspects. Based on these two types of relationships, we can assign many relevant and irrelevant aspects into two different sets as the background knowledge to describe each product aspect. To obtain abundant background knowledge for each product aspect, we can enrich the available information with background knowledge from the Web. Then, we design a hierarchical clustering algorithm to cluster these aspects into different groups, in which aspect similarity is computed using the relevant and irrelevant aspect sets for each product aspect. Experimental results obtained in both camera and mobile phone domains demonstrate that the proposed product aspect clustering method based on two types of background knowledge performs better than the baseline approach without the use of background knowledge. Moreover, the experimental results also indicate that expanding the available background knowledge using the Web is feasible. PMID:27561001
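A sketch of how the two background-knowledge sets could drive clustering follows. The similarity score (Jaccard overlap of relevant sets, penalised by relevant/irrelevant cross-overlap) and the single-link merge rule are illustrative assumptions, not the paper's exact formulation:

```python
def aspect_similarity(a, b, rel, irr):
    """Score two aspects from their background-knowledge sets: Jaccard
    overlap of the relevant sets, penalised when one aspect's relevant
    set overlaps the other's irrelevant set (hypothetical scoring)."""
    def jaccard(s, t):
        return len(s & t) / len(s | t) if s | t else 0.0
    penalty = jaccard(rel[a], irr[b]) + jaccard(rel[b], irr[a])
    return jaccard(rel[a], rel[b]) - 0.5 * penalty

def cluster_aspects(aspects, rel, irr, threshold=0.3):
    """Single-link agglomerative clustering: repeatedly merge the pair of
    clusters with the highest cross-pair similarity above threshold."""
    clusters = [{a} for a in aspects]
    while len(clusters) > 1:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                s = max(aspect_similarity(a, b, rel, irr)
                        for a in clusters[i] for b in clusters[j])
                if best is None or s > best[0]:
                    best = (s, i, j)
        if best[0] < threshold:
            break
        _, i, j = best
        clusters[i] |= clusters[j]   # merge the closest pair of clusters
        del clusters[j]
    return clusters
```

With this scoring, synonymous aspects such as "screen" and "display" merge through their shared relevant context, while aspects whose relevant sets collide with another's irrelevant set stay apart.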

The reported studies have aimed to investigate whether informational masking in a multi-talker background relies on semantic interference between the background and target using an adapted semantic priming paradigm. In 3 experiments, participants were required to perform a lexical decision task on a target item embedded in backgrounds composed of 1–4 voices. These voices were Semantically Consistent (SC) voices (i.e., pronouncing words sharing semantic features with the target) or Semantically Inconsistent (SI) voices (i.e., pronouncing words semantically unrelated to each other and to the target). In the first experiment, backgrounds consisted of 1 or 2 SC voices. One and 2 SI voices were added in Experiments 2 and 3, respectively. The results showed a semantic priming effect only in the conditions where the number of SC voices was greater than the number of SI voices, suggesting that semantic priming depended on prime intelligibility and strategic processes. However, even if backgrounds were composed of 3 or 4 voices, reducing intelligibility, participants were able to recognize words from these backgrounds, although no semantic priming effect on the targets was observed. Overall this finding suggests that informational masking can occur at a semantic level if intelligibility is sufficient. Based on the Effortfulness Hypothesis, we also suggest that when there is an increased difficulty in extracting target signals (caused by a relatively high number of voices in the background), more cognitive resources were allocated to formal processes (i.e., acoustic and phonological), leading to a decrease in available resources for deeper semantic processing of background words, therefore preventing semantic priming from occurring. PMID:25400572

The Cosmic Microwave Background (CMB) has been a rich source of information about the early Universe. Detailed measurements of its spectrum and spatial distribution have helped solidify the Standard Model of Cosmology. However, many questions still remain. Standard Cosmology does not explain why the early Universe is geometrically flat, expanding, homogeneous across the horizon, and riddled with a small anisotropy that provides the seed for structure formation. Inflation has been proposed as a mechanism that naturally solves these problems. In addition, inflation is expected to produce a spectrum of gravitational waves that will create a particular polarization pattern on the CMB. Detection of this polarized signal is a key test of inflation and will give a direct measurement of the energy scale at which inflation takes place. This polarized signature of inflation is expected to be roughly 9 orders of magnitude below the 2.7 K monopole level of the CMB. The measurement will require good control of systematic errors, an array of many detectors with the requisite sensitivity, a reliable method for removing polarized foregrounds, and nearly complete sky coverage. Ultimately, this measurement is likely to require a space mission. To this end, technology and mission concept development are currently underway.

Context. We present a method for determining the background of gamma-ray bursts (GRBs) observed by the Fermi Gamma-ray Burst Monitor (GBM) using the satellite positional information and a physical model. Since the polynomial fitting typically used for GRBs is generally only indicative of the background over relatively short timescales, this method is particularly useful for long GRBs, or for those with an autonomous repoint request (ARR) and a background with much variability on short timescales. Aims: Modern space instruments, like Fermi, follow specific motions to survey the sky and catch gamma-ray bursts in the most effective way. However, GBM bursts sometimes have highly varying backgrounds (with or without an ARR), and modelling them with a polynomial function of time is not efficient; one needs more complex, Fermi-specific methods. This article presents a new direction-dependent background fitting method and shows how it can be used for filtering the lightcurves. Methods: First, we investigate how the celestial position of the satellite influences the background and define three underlying variables with physical meaning: the celestial distance between the burst and the detector's orientation, the contribution of the Sun, and the contribution of the Earth. Then, we use multi-dimensional general least-squares fitting and the Akaike model selection criterion for the background fitting of the GBM lightcurves. Eight bursts are presented as examples, for which we computed durations using background-fitted cumulative lightcurves. Results: We give a direction-dependent background fitting (DDBF) method for separating the motion effects from the real data and calculate the durations (T90, T50, and confidence intervals) of the example bursts, two of which involved an ARR. We also summarize the features of our method and compare it qualitatively with the official GBM Catalogue. Conclusions: Our background filtering method uses a model based on the
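The model-selection step can be illustrated in one dimension: fit polynomial backgrounds of increasing order by least squares and keep the order that minimises the Akaike information criterion. The code below is a generic sketch (the paper's fit is multi-dimensional over the three direction-dependent variables, not a 1-D polynomial in time):

```python
import math

def aic_poly_fit(t, y, max_order=4):
    """Choose a polynomial background model by the Akaike information
    criterion, AIC = n*ln(RSS/n) + 2k with k fitted coefficients.
    A 1-D generic sketch of least-squares fitting plus AIC selection."""
    def polyfit(order):
        k = order + 1
        # Normal equations A c = b (adequate for low polynomial orders).
        A = [[sum(ti ** (i + j) for ti in t) for j in range(k)] for i in range(k)]
        b = [sum(yi * ti ** i for ti, yi in zip(t, y)) for i in range(k)]
        for col in range(k):  # Gaussian elimination with partial pivoting
            piv = max(range(col, k), key=lambda r: abs(A[r][col]))
            A[col], A[piv] = A[piv], A[col]
            b[col], b[piv] = b[piv], b[col]
            for r in range(col + 1, k):
                f = A[r][col] / A[col][col]
                for c in range(col, k):
                    A[r][c] -= f * A[col][c]
                b[r] -= f * b[col]
        coef = [0.0] * k
        for r in range(k - 1, -1, -1):  # back substitution
            coef[r] = (b[r] - sum(A[r][c] * coef[c]
                                  for c in range(r + 1, k))) / A[r][r]
        return coef

    n = len(t)
    best = None
    for order in range(max_order + 1):
        coef = polyfit(order)
        rss = sum((yi - sum(c * ti ** p for p, c in enumerate(coef))) ** 2
                  for ti, yi in zip(t, y))
        rss = max(rss, 1e-12)  # floor guards ln() for near-exact fits
        aic = n * math.log(rss / n) + 2 * (order + 1)
        if best is None or aic < best[0]:
            best = (aic, order, coef)
    return best[1], best[2]
```

The 2k penalty is what stops the selection from always preferring the highest order: extra coefficients must buy a real reduction in residuals.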

This supplement describes national adaptations made to the international version of the TIMSS 2011 background questionnaires. This information provides users with a guide to evaluate the availability of internationally comparable data for use in secondary analyses involving the TIMSS 2011 background variables. Background questionnaire adaptations…

Background: High-density short oligonucleotide microarrays are a primary research tool for assessing global gene expression. Background noise on microarrays comprises a significant portion of the measured raw data, which can have serious implications for the interpretation of the generated data if not estimated correctly. Results: We introduce an approach to calculate probe affinity based on sequence composition, incorporating nearest-neighbor (NN) information. Our model uses position-specific dinucleotide information, instead of the original single nucleotide approach, and adds up to 10% to the total variance explained (R2) when compared to the previously published model. We demonstrate that correcting for background noise using this approach enhances the performance of the GCRMA preprocessing algorithm when applied to control datasets, especially for detecting low intensity targets. Conclusion: Modifying the previously published position-dependent affinity model to incorporate dinucleotide information significantly improves the performance of the model. The dinucleotide affinity model enhances the detection of differentially expressed genes when implemented as a background correction procedure in GeneChip preprocessing algorithms. This is conceptually consistent with physical models of binding affinity, which depend on the nearest-neighbor stacking interactions in addition to base-pairing. PMID:18947404
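The position-specific dinucleotide model can be illustrated with a toy affinity function. The weight table keyed by (dinucleotide, position) is hypothetical; in the published model such weights are fitted to data as smooth functions of position rather than supplied by hand.

```python
def probe_affinity(seq, weights):
    """Sum a position-specific weight for each overlapping dinucleotide
    (nearest-neighbor pair) along a probe sequence; dinucleotides absent
    from the table contribute zero."""
    return sum(weights.get((seq[k:k + 2], k), 0.0)
               for k in range(len(seq) - 1))
```

A 25-mer probe thus contributes 24 dinucleotide terms, versus 25 single-nucleotide terms in the original model, which is how the stacking information enters.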

Hydro-sedimentology development is a great challenge in Peru due to limited data as well as sparse and confidential information. This study aimed to quantify and to understand the suspended sediment yield from the west-central Andes Mountains and to identify the main erosion-control factors and their relevance. The Tablachaca River (3132 km²) and the Santa River (6815 km²), located in two adjacent Andes catchments, showed similar statistical daily rainfall and discharge variability but large differences in specific suspended-sediment yield (SSY). In order to investigate the main erosion factors, daily water discharge and suspended sediment concentration (SSC) datasets of the Santa and Tablachaca rivers were analysed. Mining activity in specific lithologies was identified as the major factor that controls the high SSY of the Tablachaca (2204 t km⁻² yr⁻¹), which is four times greater than the Santa's SSY. These results show that the analysis of control factors of regional SSY at the Andes scale should be done carefully. Indeed, spatial data at kilometric scale and also daily water discharge and SSC time series are needed to define the main erosion factors along the entire Andean range.
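Specific suspended-sediment yields like those quoted above come from integrating paired discharge and concentration series over the record length and normalizing by catchment area. A minimal sketch under assumed units (Q in m³/s, SSC in g/L, i.e. kg/m³, area in km²), with an illustrative function name:

```python
def specific_sediment_yield(q_m3s, ssc_g_per_l, area_km2):
    """Specific suspended-sediment yield (t km^-2 yr^-1) from paired
    daily series of water discharge and suspended sediment concentration."""
    # daily load in tonnes: Q (m^3/s) * SSC (kg/m^3) * 86400 s/day / 1000 kg/t
    daily_t = [q * c * 86400.0 / 1000.0 for q, c in zip(q_m3s, ssc_g_per_l)]
    years = len(daily_t) / 365.25
    return sum(daily_t) / area_km2 / years
```

For example, a steady 100 m³/s carrying 1 g/L over a 3132 km² catchment yields roughly 1000 t km⁻² yr⁻¹, the order of magnitude reported for the Tablachaca.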

We develop the perturbation theory of double field theory around arbitrary solutions of its field equations. The exact gauge transformations are written in a manifestly background covariant way and contain at most quadratic terms in the field fluctuations. We expand the generalized curvature scalar to cubic order in fluctuations and thereby determine the cubic action in a manifestly background covariant form. As a first application we specialize this theory to group manifold backgrounds, such as SU(2) ≃ S³ with H-flux. In the full string theory this corresponds to a Wess-Zumino-Witten background CFT. Starting from closed string field theory, the cubic action around such backgrounds has been computed before by Blumenhagen, Hassler, and Lüst. We establish precise agreement with the cubic action derived from double field theory. This result confirms that double field theory is applicable to arbitrary curved background solutions, disproving assertions in the literature to the contrary.

COBE, the Cosmic Background Explorer spacecraft, and its mission are described. COBE was designed to study the origin and dynamics of the universe including the theory that the universe began with a cataclysmic explosion referred to as the Big Bang. To this end, earth's cosmic background - the infrared radiation that bombards earth from every direction - will be measured by three sophisticated instruments: the Differential Microwave Radiometer (DMR), the Far Infrared Absolute Spectrophotometer (FIRAS), and the Diffuse Infrared Background Experiment (DIRBE).

Background-oriented schlieren is a method of visualizing refractive disturbances by comparing digital images with and without a refractive disturbance distorting a background pattern. Traditionally, backgrounds consist of random distributions of high-contrast color transitions or speckle patterns. To image a refractive disturbance, a digital image correlation algorithm is used to identify the location and magnitude of apparent pixel shifts in the background pattern between the two images. Here, a novel method of using color gradient backgrounds is explored as an alternative that eliminates the need to perform a complex image correlation between the digital images. A simple image subtraction can be used instead to identify the location, magnitude, and direction of the image distortions. Gradient backgrounds are demonstrated to provide quantitative data only limited by the camera's pixel resolution, whereas speckle backgrounds limit resolution to the size of the random pattern features and image correlation window size. Quantitative measurement of density in a thermal boundary layer is presented. Two-dimensional gradient backgrounds using multiple colors are demonstrated to allow measurement of two-dimensional refractions. A computer screen is used as the background, which allows for rapid modification of the gradient to tune sensitivity for a particular application.
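The subtraction-only recovery described above can be sketched in one dimension; the function name is illustrative, and a two-dimensional measurement would use two color channels carrying orthogonal gradients.

```python
import numpy as np

def shift_from_gradient(ref, distorted, slope):
    """For a background whose intensity is a linear gradient of known
    slope (intensity units per pixel), the apparent pixel shift at each
    point is the intensity difference divided by the slope, so a simple
    image subtraction replaces the correlation step."""
    return (distorted - ref) / slope
```

Shifting the gradient by 3 pixels changes the recorded intensity by 3 × slope everywhere, so the subtraction recovers the shift exactly, up to camera quantization, which is why sensitivity is limited only by pixel resolution.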

Background oriented schlieren (BOS) imaging is a method of visualizing refractive disturbances through the comparison of digital images. By comparing images with and without a refractive disturbance visualizations can be achieved via a range of image processing methods. Traditionally, backgrounds consist of random distributions of high contrast speckle patterns. To image a refractive disturbance, a digital image correlation algorithm is used to identify the location and magnitude of apparent pixel shifts in the background pattern. Here a novel method of using color gradient backgrounds is explored as an alternative. The gradient background eliminates the need to perform an image correlation between the two digital images, as simple image subtraction can be used to identify the location, magnitude, and direction of the image distortions. This allows for quicker processing. Two-dimensional gradient backgrounds using multiple colors are shown. The gradient backgrounds are demonstrated to provide quantitative data limited only by the camera's pixel resolution, whereas speckle backgrounds limit resolution to the size of the random pattern features and image correlation window size. Additional results include the use of a computer screen as a background.

Technical Requirements Analysis and Control Systems/Initial Operating Capability (TRACS/IOC) computer program provides supplemental software tools for analysis, control, and interchange of project requirements so qualified project members have access to pertinent project information, even if in different locations. Enables users to analyze and control requirements, serves as focal point for project requirements, and integrates system supporting efficient and consistent operations. TRACS/IOC is HyperCard stack for use on Macintosh computers running HyperCard 1.2 or later and Oracle 1.2 or later.

Hydro-sedimentology development is a great challenge in Peru due to limited data as well as sparse and confidential information. Consequently, little is known at present about the relationship between the El Niño Southern Oscillation (ENSO), precipitation, runoff, land use and the sediment transport dynamics. The aim of this paper is to bridge this gap in order to quantify and understand the signal of magnitude and frequency of the sediment fluxes from the central western Andes; also, to identify the main erosion control factor and its relevance. The Tablachaca River (3132 km²) and the Santa River (6815 km²), two mountainous Andean catchments that are geographically close to each other, both showed similar statistical daily rainfall and discharge variability but high contrast in sediment yield (SY). In order to investigate which factors are of importance, the continuous water discharge and hourly suspended sediment concentrations (SSC) of the Santa River were studied. Firstly, the specific sediment yield (SSY) at the continental Andes range scale for the Pacific side is among the highest (2204 t km⁻² yr⁻¹). Secondly, no relationship between the water discharge (Q) and El Niño/La Niña events is found over a 54 yr time period. However, the Santa Basin is highly sensitive during mega Niños (1982-1983 and 1997-1998). Lastly, dispersed micro-mining and mining activity in specific lithologies are identified as the major factors that control the high SSY. These remarks make the Peruvian coast a key area for future research on Andean sediment rates.

... 32 National Defense 5 2014-07-01 2014-07-01 false Background. 732.1 Section 732.1 National Defense Department of Defense (Continued) DEPARTMENT OF THE NAVY PERSONNEL NONNAVAL MEDICAL AND DENTAL CARE General § 732.1 Background. When a U.S. Navy or Marine Corps member or a Canadian Navy or Marine Corps...

... 32 National Defense 5 2012-07-01 2012-07-01 false Background. 732.1 Section 732.1 National Defense Department of Defense (Continued) DEPARTMENT OF THE NAVY PERSONNEL NONNAVAL MEDICAL AND DENTAL CARE General § 732.1 Background. When a U.S. Navy or Marine Corps member or a Canadian Navy or Marine Corps...

... 32 National Defense 5 2010-07-01 2010-07-01 false Background. 732.1 Section 732.1 National Defense Department of Defense (Continued) DEPARTMENT OF THE NAVY PERSONNEL NONNAVAL MEDICAL AND DENTAL CARE General § 732.1 Background. When a U.S. Navy or Marine Corps member or a Canadian Navy or Marine Corps...

... 32 National Defense 5 2011-07-01 2011-07-01 false Background. 732.1 Section 732.1 National Defense Department of Defense (Continued) DEPARTMENT OF THE NAVY PERSONNEL NONNAVAL MEDICAL AND DENTAL CARE General § 732.1 Background. When a U.S. Navy or Marine Corps member or a Canadian Navy or Marine Corps...

... 32 National Defense 5 2013-07-01 2013-07-01 false Background. 732.1 Section 732.1 National Defense Department of Defense (Continued) DEPARTMENT OF THE NAVY PERSONNEL NONNAVAL MEDICAL AND DENTAL CARE General § 732.1 Background. When a U.S. Navy or Marine Corps member or a Canadian Navy or Marine Corps...

Presents a model of the relationship between social background and school continuation decisions among White males born between 1900 and 1950. The model predicts a decline in the effects of social background by the last school transition. Reprint available from Institute for Research on Poverty, University of Wisconsin, Madison, WI 53706. (AM)

Studies the impact of changes in family background on grade-level attainment for White males between 1907-1951. Findings show that the effects of social background on grade attainment decrease with increasing levels of attainment. Reprint available from Institute for Research on Poverty, University of Wisconsin-Madison, Madison WI 53706. (AM)

The INTEGRAL Spectrometer, like most gamma-ray instruments, is background dominated. Signal-to-background ratios of a few percent are typical. The background is primarily due to interactions of cosmic rays in the instrument and spacecraft. It characteristically varies by +/- 5% on time scales of days. This variation is caused mainly by fluctuations in the interplanetary magnetic field that modulates the cosmic ray intensity. To achieve the maximum performance from SPI it is essential to have a high quality model of this background that can predict its value to a fraction of a percent. In this poster we characterize the background and its variability, explore various models, and evaluate the accuracy of their predictions.

Gamma-ray background radiation significantly reduces detection sensitivity when searching for radioactive sources in the field, such as in wide-area searches for homeland security applications. Mobile detector systems in particular must contend with a variable background that is not necessarily known or even measurable a priori. This work will present measurements of the spatial and temporal variability of the background, with the goal of merging gamma-ray detection, spectroscopy, and imaging with contextual information: a "nuclear street view" of the ubiquitous background radiation. The gamma-ray background originates from a variety of sources, both natural and anthropogenic. The dominant sources in the field are the primordial isotopes potassium-40, uranium-238, and thorium-232, as well as their decay daughters. In addition to the natural background, many artificially-created isotopes are used for industrial or medical purposes, and contamination from fission products can be found in many environments. Regardless of origin, these backgrounds will reduce detection sensitivity by adding both statistical as well as systematic uncertainty. In particular, large detector arrays will be limited by the systematic uncertainty in the background and will suffer from a high rate of false alarms. The goal of this work is to provide a comprehensive characterization of the gamma-ray background and its variability in order to improve detection sensitivity and evaluate the performance of mobile detectors in the field. Large quantities of data are measured in order to study detector performance at very low false alarm rates. Two different approaches, spectroscopy and imaging, are compared in a controlled study in the presence of this measured background. Furthermore, there is additional information that can be gained by correlating the gamma-ray data with contextual data streams (such as cameras and global positioning systems) in order to reduce the variability in the background.
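The claim that large detector arrays become limited by background systematics can be made concrete with a standard significance approximation; the formula below (a fractional systematic uncertainty added in quadrature with the Poisson term) is a common convention, not one taken from this work.

```python
import math

def significance(source_counts, bkg_counts, sys_frac=0.0):
    """Approximate detection significance: source counts divided by the
    quadrature sum of Poisson background fluctuations and a fractional
    systematic background uncertainty."""
    return source_counts / math.sqrt(bkg_counts + (sys_frac * bkg_counts) ** 2)
```

With sys_frac = 0 the significance grows as the square root of the collection area, but once sys_frac × bkg exceeds sqrt(bkg) the systematic term dominates and additional detector area buys almost nothing, which is the limitation described above.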

The introduction points out that radiation backgrounds fluctuate across very short distances: factors include geology, soil composition, altitude, building structures, topography, and other manmade structures; and asphalt and concrete can vary significantly over short distances. Brief descriptions are given of the detection system, experimental setup, and background variation measurements. It is concluded that positive and negative gradients can greatly reduce the detection sensitivity of an MDS: negative gradients create opportunities for false negatives (nondetection), and positive gradients create a potentially unacceptable FAR (above 1%); the location of use for mobile detection is important to understand; spectroscopic systems provide more information for screening out false alarms and may be preferred for mobile use; and mobile monitor testing at LANL accounts for expected variations in the background.

Adequate research in the peripheral field of medical geography requires familiarity with the literature of medicine, geography, and other environmentally oriented fields. The pertinent literature of the two primary disciplines, as well as that of anthropology, nutrition, and human bioclimatology, is surveyed from a bibliographical point of view. A brief review of historical sources is presented, followed by a discussion of the contemporary organizations, both international and national, active in the field. Emphasis is placed on the publishing programs and projects, maps, atlases, symposia, reports, and other literature sponsored or stimulated by these organizations. Regional bibliographical surveys for East Africa, India, and the Soviet Union are also noted. Pertinent aspects of bibliographies, indexes, abstracts, library card catalogs and accession lists, and other resources are listed, with emphasis on the various subject headings and other approaches to them. Throughout, the sources of information are approached from a multidisciplinary and interdisciplinary viewpoint. PMID:5329543

We consider string pair production in non-homogeneous electric backgrounds. We study several particular configurations which can be addressed with the Euclidean world-sheet instanton technique, the analogue of the world-line instanton for particles. In the first case the string is suspended between two D-branes in flat space-time, in the second case the string lives in AdS and terminates on one D-brane (this realizes the holographic Schwinger effect). In some regions of parameter space the result is well approximated by the known analytical formulas, either the particle pair production in non-homogeneous background or the string pair production in homogeneous background. In other cases we see effects which are intrinsically stringy and related to the non-homogeneity of the background. The pair production is enhanced already for particles in time-dependent electric field backgrounds. The string nature enhances this even further. For spatially varying electric background fields the string pair production is less suppressed than the rate of particle pair production. We discuss in some detail how the critical field is affected by the non-homogeneity, for both time- and space-dependent electric field backgrounds. We also comment on what could be an interesting new prediction for the small field limit. The third case we consider is pair production in holographic confining backgrounds with homogeneous and non-homogeneous fields.

We study holographic thermalization of a strongly coupled theory inspired by two colliding shock waves in a vacuum confining background. Holographic thermalization means a black hole formation, in fact, a trapped surface formation. As the vacuum confining background, we considered the well-known bottom-up AdS/QCD model that provides the Cornell potential and reproduces the QCD β-function. We perturb the vacuum background by colliding domain shock waves that are assumed to be holographically dual to heavy-ion collisions. Our main physical assumption is that we can make a restriction on the time of trapped surface formation, which results in a natural limitation on the size of the domain where the trapped surface is produced. This limits the intermediate domain where the main part of the entropy is produced. In this domain, we can use an intermediate vacuum background as an approximation to the full confining background. We find that the dependence of the multiplicity on energy for the intermediate background has an asymptotic expansion whose first term depends on energy as E^(1/3), which is very similar to the experimental dependence of particle multiplicities on the colliding ion energy obtained from the RHIC and LHC. However, this first term, at the energies where the approximation of the confining metric by the intermediate background works, does not saturate the exact answer, and we have to take the nonleading terms into account.

We investigate possible origins of the extragalactic radio background reported by the ARCADE 2 collaboration. The surface brightness of the background is several times higher than that which would result from currently observed radio sources. We consider contributions to the background from diffuse synchrotron emission from clusters and the intergalactic medium, previously unrecognized flux from low surface brightness regions of radio sources, and faint point sources below the flux limit of existing surveys. By examining radio source counts available in the literature, we conclude that most of the radio background is produced by radio point sources that dominate at sub-μJy fluxes. We show that a truly diffuse background produced by electrons far from galaxies is ruled out because such energetic electrons would overproduce the observed X-ray/γ-ray background through inverse Compton scattering of the other photon fields. Unrecognized flux from low surface brightness regions of extended radio sources, or moderate flux sources missed entirely by radio source count surveys, cannot explain the bulk of the observed background, but may contribute as much as 10%. We consider both radio supernovae and radio quiet quasars as candidate sources for the background, and show that both fail to produce it at the observed level because of an insufficient number of objects and total flux, although radio quiet quasars contribute at the level of at least a few percent. We conclude that the most important population for production of the background is likely ordinary starforming galaxies above redshift 1, characterized by an evolving radio/far-infrared correlation that becomes increasingly radio-loud with redshift.

This review covers the measurements related to the extragalactic background light intensity from γ-rays to radio in the electromagnetic spectrum over 20 decades in wavelength. The cosmic microwave background (CMB) remains the best measured spectrum with an accuracy better than 1%. The measurements related to the cosmic optical background (COB), centred at 1 μm, are impacted by the large zodiacal light associated with interplanetary dust in the inner Solar System. The best measurements of COB come from an indirect technique involving γ-ray spectra of bright blazars with an absorption feature resulting from pair-production off of COB photons. The cosmic infrared background (CIB) peaking at around 100 μm established an energetically important background with an intensity comparable to the optical background. This discovery paved the way for large aperture far-infrared and sub-millimetre observations resulting in the discovery of dusty, starbursting galaxies. Their role in galaxy formation and evolution remains an active area of research in modern-day astrophysics. The extreme UV (EUV) background remains mostly unexplored and will be a challenge to measure due to the high Galactic background and absorption of extragalactic photons by the intergalactic medium at these EUV/soft X-ray energies. We also summarize our understanding of the spatial anisotropies and angular power spectra of intensity fluctuations. We motivate a precise direct measurement of the COB between 0.1 and 5 μm using a small aperture telescope observing either from the outer Solar System, at distances of 5 AU or more, or out of the ecliptic plane. Other future applications include improving our understanding of the background at TeV energies and spectral distortions of CMB and CIB. PMID:27069645

Sometimes when we look for one thing we stumble on something else. The Absolute Radiometer for Cosmology, Astrophysics, and Diffuse Emission (ARCADE) was designed to measure the blackbody spectrum of the cosmic microwave background to search for spectral distortions related to the epoch of reionization. Instead, the July 2006 flight found evidence for an extragalactic radio background with amplitude six times brighter than the expected contribution from faint radio sources. The author discusses the ARCADE instrument and the evidence for an extragalactic radio background.

Rejection and protection from background is a key issue for the next generation SuperCDMS SNOLAB experiment that will have a cross-section sensitivity of better than 8 × 10⁻⁴⁶ cm² for spin-independent WIMP-nucleon interactions. This paper presents the details of the methods used to reject electromagnetic backgrounds using the new iZIP detectors that are currently operated in the Soudan Underground Laboratory, MN and the methods the collaboration is investigating to protect against neutron background in the next generation SuperCDMS experiment.

Model observers have been compared to human performance detecting low contrast signals in a variety of computer-generated backgrounds, including white noise, correlated noise, lumpy backgrounds, and two component noise. The purpose of the present paper is to extend this work by comparing a number of previously proposed model observers to human visual detection performance in real anatomic backgrounds. Human and model observer performance are compared as a function of increasing added white noise. Our results show that three of the four models are good predictors of human performance.

Analysis of foreground objects in scenery via image processing often involves a background subtraction process. This process aims to improve blob (connected component) content in the image. Quality blob content is often needed for defining regions of interest for object recognition and tracking. Three techniques are examined which optimize the background to be subtracted - genetic algorithm, an analytic solution based on convex optimization, and a related application of the CVX solver toolbox. These techniques are applied to a set of images and the results are compared. Additionally, a possible implementation architecture that uses multiple optimization techniques with subsequent arbitration to produce the best background subtraction is considered.
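The convex-optimization idea can be illustrated in its simplest one-parameter form: choose the scale of a reference background image that minimizes the squared residual, which has a closed-form solution. This is an illustrative stand-in for the analytic and CVX-based approaches mentioned, not their actual formulation.

```python
import numpy as np

def subtract_scaled_background(frame, background):
    """Least-squares optimal scale a minimizing ||frame - a*background||^2,
    given in closed form by a = <background, frame> / <background, background>
    (a convex, one-parameter background optimization). Returns the residual
    image and the fitted scale."""
    a = np.vdot(background, frame) / np.vdot(background, background)
    return frame - a * background, a
```

The residual image is what a subsequent blob-extraction stage would threshold and connected-component label to define regions of interest.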

In coastal aquifers, mixing between fresh terrestrial water and seawater occurs, which influences groundwater quality. Due to this mixing, elevated chloride concentrations are often observed in coastal aquifers. In coastal areas terrestrial water-seawater mixing can be caused by anthropogenic activities or natural factors such as tides and sea level changes. Therefore, it is difficult or even impossible to characterize groundwater background concentrations in coastal aquifers. Although coastal aquifers are usually excluded when characterizing background concentrations, it is essential to accurately characterize naturally-affected groundwater quality in coastal areas because groundwater is a major water resource for potable, irrigation, and domestic uses. So in this work we define groundwater background concentrations as naturally occurring ambient concentrations, excluding the effects of groundwater abstraction. Based on this definition, we evaluate groundwater background concentrations in various geologic formations and analyze characteristics of groundwater quality in coastal aquifers by utilizing Groundwater Quality Monitoring System (GQMS) data. The results show that high concentrations of chloride are observed in some coastal areas but not always. Tidal effects and topographical characteristics are thought to be factors affecting such spatial variations. In some coastal areas high concentrations of chloride are observed together with high nitrate concentrations. This means that agricultural practices can contribute to the anthropogenic background, leading to elevated concentrations of nitrate. These results provide some essential information for groundwater resources management in coastal areas. Further data collection and analysis are required for evaluating the effect of tide and sea level changes on groundwater quality.

Fluorescence molecular imaging with exogenous probes improves specificity for the detection of diseased tissues by targeting unambiguous molecular signatures. Additionally, increased diagnostic sensitivity is expected with the application of multiple molecular probes. We developed a real-time multispectral fluorescence-reflectance scanning fiber endoscope (SFE) for wide-field molecular imaging of fluorescent dye-labeled molecular probes at nanomolar detection levels. Concurrent multichannel imaging with the wide-field SFE also allows for real-time mitigation of the background autofluorescence (AF) signal, especially when fluorescein, a U.S. Food and Drug Administration approved dye, is used as the target fluorophore. Quantitative tissue AF was measured for the ex vivo porcine esophagus and murine brain tissues across the visible and near-infrared spectra. AF signals were then transferred to the unit of targeted fluorophore concentration to evaluate the SFE detection sensitivity for sodium fluorescein and cyanine. Next, we demonstrated a real-time AF mitigation algorithm on a tissue phantom, which featured molecular probe targeted cells of high-grade dysplasia on a substrate containing AF species. The target-to-background ratio was enhanced by more than one order of magnitude when applying the real-time AF mitigation algorithm. Furthermore, a quantitative estimate of the fluorescein photodegradation (photobleaching) rate was evaluated and shown to be insignificant under the illumination conditions of SFE. In summary, the multichannel laser-based flexible SFE has demonstrated the capability to provide sufficient detection sensitivity, image contrast, and quantitative target intensity information for detecting small precancerous lesions in vivo.

Many of the experiments currently searching for dark matter, studying the properties of neutrinos, or searching for neutrinoless double beta decay require very low levels of radioactive backgrounds, both in their own construction materials and in the surrounding environment. These low background levels are necessary for the experiments to achieve the sensitivities their searches demand. SNOLAB has several facilities that are used to directly measure these radioactive backgrounds. These proceedings describe SNOLAB's high-purity germanium detectors, one of which has been in continuous use for the past seven years measuring materials for many experiments in operation or under construction at SNOLAB. A description of the characterisation of SNOLAB's new germanium well detector is presented. In addition, brief descriptions of SNOLAB's alpha-beta and electrostatic counters are given, along with a description of SNOLAB's future low background counting laboratory.

The LZ experiment, featuring a 7-tonne active liquid xenon target, is aimed at achieving unprecedented sensitivity to WIMPs, with the background expected to be dominated by astrophysical neutrinos. To reach this goal, extensive simulations are carried out to accurately calculate the electron recoil and nuclear recoil rates in the detector. Both internal (from the target material) and external (from detector components and the surrounding environment) backgrounds are considered. A very efficient suppression of the background rate is achieved with an outer liquid scintillator veto, a liquid xenon skin, and fiducialisation. Based on current measurements of the radioactivity of different materials, it is shown that LZ can reduce the total background for a WIMP search to about 2 events in 1000 live days for a 5.6-tonne fiducial mass.

We update a previous search for anisotropies in the neutrino background using the new results from the WMAP satellite. We find that the new data confirm the indication of anisotropies in the neutrino background at about the 2σ confidence level. Parametrizing the neutrino background through the viscosity parameter c_vis², we demonstrate that current cosmological data provide the constraint c_vis² > 0.145 at 95% c.l. We discuss the stability of the result when the assumptions of massless neutrinos and the number of massless neutrino families are relaxed. In particular, we find that an extra background of relativistic particles, still compatible with the data, prefers anisotropic stresses with c_vis² > 0.09 at 95% c.l.

The radiation environment of near-earth space and its effects on biological and hardware systems are examined in reviews and reports. Sections are devoted to particle interactions and propagation, data bases, instrument background and dosimetry, detectors and experimental progress, biological effects, and future needs and strategies. Particular attention is given to angular distributions and spectra of geomagnetically trapped protons in LEO, bremsstrahlung production by electrons, nucleon-interaction data bases for background estimates, instrumental and atmospheric background lines observed by the SMM gamma-ray spectrometer, the GRAD high-altitude balloon flight over Antarctica, space protons and brain tumors, a new radioprotective antioxidative agent, LEO radiation measurements on the Space Station, and particle-background effects on the Hubble Space Telescope and the Lyman FUV Spectroscopic Explorer.

We report on a lattice calculation demonstrating a novel method to extract the electric polarizability of charged pseudo-scalar mesons by analyzing two-point correlation functions computed in classical background electric fields.

The present knowledge about the overall spectrum of the isotropic extragalactic background of electromagnetic radiation is summarized. The role of the HEAO program is discussed. Spectral measurements from HEAO are examined.

An outline is given of estimates of the expected gravitational wave background, based on plausible pregalactic sources. Some cosmologically significant limits can be put on an incoherent gravitational wave background arising from pregalactic cosmic evolution. The spectral region of cosmically generated and cosmically limited radiation is at long periods, P greater than 1 year, in contrast to more recent cosmological sources, which have P of approximately 10 to 10⁻³.

CORSIKA is a simulation program for extensive air showers initiated by high-energy cosmic particles. These air showers create the majority of the muons and neutrinos that neutrino telescopes detect, and they are considered a background signature in searches for astrophysical neutrinos. This contribution discusses changes to CORSIKA that allow for faster high-energy background simulation. The theory, implementation, application, and performance of these modifications are presented.

We write down the supermembrane actions for M-theory backgrounds dual to general N=2 four-dimensional superconformal field theories. The actions are given to all orders in fermions and are in a particular κ-gauge. When an extra U(1) isometry is present, our actions reduce to κ-gauge fixed Green-Schwarz actions for the corresponding Type IIA backgrounds.

Basic aspects of the background of gravitational waves and its mathematical characterization are reviewed. The spectral energy density parameter Ω(f), commonly used as a quantifier of the background, is derived for an ensemble of many identical sources emitting at different times and locations. For such an ensemble, Ω(f) is generalized to account for the duration of the signals and of the observation, so that one can distinguish the resolvable and unresolvable parts of the background. The unresolvable part, often called confusion noise or stochastic background, is made up of signals that can be neither individually identified nor subtracted out of the data. To account for the resolvability of the background, the overlap function is introduced. This function is a generalization of the duty cycle, which has been commonly used in the literature, in some cases leading to incorrect results. The spectra produced by binary systems (stellar binaries and massive black hole binaries) are presented over the frequencies of all existing and planned detectors. A semi-analytical formula for Ω(f) is derived in the case of stellar binaries (containing white dwarfs, neutron stars, or stellar-mass black holes). Besides a realistic expectation of the level of the background, upper and lower limits are given to account for the uncertainties in some astrophysical parameters, such as binary coalescence rates. One interesting result concerns all current and planned ground-based detectors (including the Einstein Telescope): in their frequency range, the background of binaries is resolvable and only sporadically present. In other words, there is no stochastic background of binaries for ground-based detectors.
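
For reference, the spectral energy density parameter used above has a standard definition (this is the conventional textbook form, not a formula quoted from this abstract):

```latex
\Omega(f) \;=\; \frac{1}{\rho_c}\,\frac{\mathrm{d}\rho_{\mathrm{GW}}}{\mathrm{d}\ln f},
\qquad
\rho_c \;=\; \frac{3 H_0^2 c^2}{8\pi G},
```

where ρ_GW is the energy density in gravitational waves, ρ_c the critical density of the universe, and H₀ the Hubble constant, so that Ω(f) measures the fraction of the critical density carried by the background per logarithmic frequency interval.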

The Fermi-LAT Background Estimator (BKGE) is a publicly available open-source tool that can estimate the expected background of the Fermi-LAT for any observational configuration and duration. It produces results in the form of text files, ROOT files, gtlike source-model files (for LAT maximum likelihood analyses), and PHA I/II FITS files (for RMFit/XSpec spectral fitting analyses). Its core is written in C++ and its user interface in Python.

One of the essential ways in which nonlinear image restoration algorithms differ from linear, convolution-type image restoration filters is their capability to restrict the restoration result to nonnegative intensities. The iterative constrained Tikhonov-Miller (ICTM) algorithm, for example, incorporates the nonnegativity constraint by clipping all negative values to zero after each iteration. This constraint will be effective only when the restored intensities have near-zero values. Therefore the background estimation will have an influence on the effectiveness of the nonnegativity constraint of these algorithms. We investigated quantitatively the dependency of the performance of the ICTM, Carrington, and Richardson-Lucy algorithms on the estimation of the background and compared it with the performance of the linear Tikhonov-Miller restoration filter. We found that the performance depends critically on the background estimation: An underestimation of the background will make the nonnegativity constraint ineffective, which results in a performance that does not differ much from the Tikhonov-Miller filter performance. A (small) overestimation, however, degrades the performance dramatically, since it results in a clipping of object intensities. We propose a novel general method to estimate the background based on the dependency of nonlinear restoration algorithms on the background, and we demonstrate its applicability on real confocal images. PMID:10708022
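
The interplay between background estimation and the nonnegativity constraint can be illustrated with a minimal 1-D sketch. This is not the authors' implementation; the blur operator, regularization weight, and step size are illustrative assumptions, and the point is only that an underestimated background leaves the data on a positive pedestal, so the clip at zero never engages:

```python
import numpy as np

def blur_matrix(n, sigma=2.0):
    # Row-normalized Gaussian blur operator.
    x = np.arange(n)
    H = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * sigma ** 2))
    return H / H.sum(axis=1, keepdims=True)

def ictm(image, H, background, lam=0.01, steps=200, lr=0.5):
    # ICTM-style iteration: gradient step on the Tikhonov functional,
    # then clip negative intensities to zero (the constraint).
    g = image - background            # background-corrected data
    f = np.clip(g, 0.0, None)         # initial estimate
    for _ in range(steps):
        grad = H.T @ (H @ f - g) + lam * f
        f = np.clip(f - lr * grad, 0.0, None)
    return f

n = 64
truth = np.zeros(n)
truth[28:36] = 1.0
H = blur_matrix(n)
data = H @ truth + 0.2                # true background level: 0.2

good = ictm(data, H, background=0.2)  # accurate background estimate
under = ictm(data, H, background=0.0) # underestimate: clip stays inactive
```

With the accurate background, the restored object region sits near zero outside the object and the clip suppresses ringing; with the underestimate, the whole solution rides on a positive offset, so the constraint contributes nothing, mirroring the comparison reported in the abstract.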

To investigate annoyance of multiple noise sources, two experiments were conducted. The first experiment, which used 48 subjects, was designed to establish annoyance-noise level functions for three community noise sources presented individually: jet aircraft flyovers, air conditioner, and traffic. The second experiment, which used 216 subjects, investigated the effects of background noise on aircraft annoyance as a function of noise level and spectrum shape; and the differences between overall, aircraft, and background noise annoyance. In both experiments, rated annoyance was the dependent measure. Results indicate that the slope of the linear relationship between annoyance and noise level for traffic is significantly different from that of flyover and air conditioner noise and that further research was justified to determine the influence of the two background noises on overall, aircraft, and background noise annoyance (e.g., experiment two). In experiment two, total noise exposure, signal-to-noise ratio, and background source type were found to have effects on all three types of annoyance. Thus, both signal-to-noise ratio, and the background source must be considered when trying to determine community response to combined noise sources.

The Internet's growth and its impact on learners has been phenomenal and accessing the Web is the norm in our daily lives. Youths use the Internet to support their school activities in many ways. The National Science Foundation supported a qualitative research project that was designed to better understand the Internet's impact. It involved…

Renal transplantation is recognized as the treatment of choice for most patients with end-stage renal disease. The evaluation of the candidate for kidney transplantation has recently been the subject of clinical practice guidelines published by the European Renal Association-European Dialysis and Transplant Association and the American Society of Transplantation. The purpose of this article is to review the current literature on the urological evaluation and treatment of patients prior to renal transplantation. In India, urologists are involved in evaluating not only genitourinary problems but also vascular access, and vascular anatomy and pathology, especially related to the major pelvic vessels. Hence, evaluation of the transplant recipient should include assessment of vascular access for hemodialysis, access for peritoneal dialysis, assessment of the pelvic vessels to which the renal allograft vessels need to be anastomosed, and the genitourinary system. In addition, serological tests for infective viral diseases such as hepatitis and human immunodeficiency virus should always be reviewed before starting the clinical evaluation. The evaluations performed by other specialists, such as the nephrologist, cardiologist, endocrinologist, pulmonologist, and anesthetist, should also be reviewed. PMID:19718331

Usually, blood cell counting in the haematology laboratory relies on comparing IQC values to the target values proposed by the manufacturer. We intended to improve the monitoring of the proper functioning of our analytical measurement system for 17 main haematologic parameters. To set the allowable critical limits of IQC, we based our reflection on several elements: benchmarks and expert recommendations, clinical requirements, statistical indicators of the laboratory calculated using IQC values (3 levels, 2 different lots, 2 haematology analysers, and 2 passage modes), and external quality assessment (EEQ) values, over four months. We exploited the reports obtained from the middleware (our own IQC values) and the interlaboratory comparison reports (obtained from the SNCS and EuroCell websites), and we compared our performance to the Ricos objectives in order to set clearly argued allowable limits for IQC values. Finally, the allowable limits correspond to the imprecision limits stated by Ricos for 14 parameters (desirable for 11 parameters and minimal for 3 parameters) and to personalized limits (more stringent than the desirable Ricos limits) for 3 parameters of blood cell counting. PMID:25486666

Yield maintenance and improvement is a major area of concern in any integrated circuit manufacturing operation. A major aspect of this concern is controlling and reducing defect density. Obviously, large defect excursions must be immediately addressed in order to maintain yield levels. However, to enhance yields, the subtle defect mechanisms must be reduced or eliminated as well. In-line process control inspections are effective for detecting large variations in the defect density on a real time basis. Examples of in-line inspection strategies include after develop or after etch inspections. They are usually effective for detecting when a particular process segment has gone out of control. However, when a process is running normally, there exists a background defect density that is generally not resolved by in-line process control inspections. The inspection strategies that are frequently used to monitor the background defect density are offline inspections. Offline inspections are used to identify the magnitude and characteristics of the background defect density. These inspections sample larger areas of product wafers than the in-line inspections to allow identification of the defect generating mechanisms that normally occur in the process. They are used to construct a database over a period of time so that trends may be studied. This information enables engineering efforts to be focused on the mechanisms that have the greatest impact on device yield. Once trouble spots in the process are identified, the data base supplies the information needed to isolate and solve them. The key aspect to the entire program is to utilize a reliable data gathering mechanism coupled with a flexible information processing system. This paper describes one method of reducing the background defect density using automated wafer inspection and analysis. The tools used in this evaluation were the KLA 2020 Wafer Inspector, KLA Utility Terminal (KLAUT), and a new software package developed

The GERmanium Detector Array (Gerda) experiment at the Gran Sasso underground laboratory (LNGS) of INFN is searching for neutrinoless double beta (0νββ) decay of 76Ge. The signature of the signal is a monoenergetic peak at 2039 keV, the Q-value (Qββ) of the decay. To avoid bias in the signal search, the present analysis does not consider events that fall in a 40 keV wide region centered around Qββ. The main parameters needed for the analysis are described. A background model was developed to describe the observed energy spectrum. The model contains several contributions that are expected on the basis of material screening or that are established by the observation of characteristic structures in the energy spectrum. The model predicts a flat energy spectrum for the blinding window around Qββ, with a background index ranging from 17.6 to 23.8 × 10⁻³ cts/(keV kg yr). A part of the data not considered before has been used to test whether the predictions of the background model are consistent. The observed number of events in this energy region is consistent with the background model. The background at Qββ is dominated by close sources, mainly due to 42K, 214Bi, 228Th, 60Co, and α-emitting isotopes from the 226Ra decay chain. The individual fractions depend on the assumed locations of the contaminants. It is shown that, after removal of the known peaks, the energy spectrum can be fitted in an energy range of 200 keV around Qββ with a constant background. This gives a background index consistent with the full model, with uncertainties of the same size.

SNO+ is a large multi-purpose liquid scintillator experiment whose first aim is to detect the neutrinoless double beta decay of 130Te. It is located at SNOLAB, at a depth of 6000 m.w.e., and is based on the SNO infrastructure. SNO+ will contain approximately 780 tonnes of liquid scintillator loaded with 130Te inside an acrylic vessel (AV), with an external volume of ultra-pure water to reduce the external backgrounds. Light produced in the scintillator by the interaction of particles will be detected with about 9,000 photomultipliers. For the neutrinoless double beta decay phase, due to the extremely low rate expected, the control, knowledge, and reduction of the background are essential. This will also benefit other phases of the experiment, focused on the study of solar neutrinos, nucleon decay, geoneutrinos, and supernovae. In order to reduce the internal background level, a novel purification technique for tellurium-loaded scintillators has been developed by the collaboration that reduces the U/Th concentration and several cosmogenically activated isotopes by at least a factor of 10²-10³ in a single pass. In addition, different rejection techniques have been developed for the remaining internal backgrounds, based on Monte Carlo simulations. In this work, the scintillator purification technique and the background levels obtained with it are discussed. Furthermore, an overview of the different backgrounds for the double-beta phase is presented, highlighting some of the techniques developed to reject the remaining decays based on their expected timing differences.

This study presents the modification of bolometer detectors used in particle searches to veto or otherwise reject alpha radiation background and the statistical advantages of doing so. Several techniques are presented in detail: plastic film scintillator vetoes, metallic film ionization vetoes, and Cherenkov radiation vetoes. Plastic scintillator films are cooled to bolometer temperatures and bombarded with 1.4 MeV to 6.0 MeV alpha particles representative of documented detector background. Quantum dot based liquid scintillator is similarly bombarded to produce background-induced scintillation light. Photomultipliers detect this scintillation light and produce a veto signal. Layered metallic films of a primary metal, dielectric, and secondary metal, such as gold-polyethylene-gold films, are cooled to millikelvin temperatures and biased to produce a current signal veto when incident 1.4 MeV to 6.0 MeV alpha particles ionize conduction paths through the film. Calibration of veto signal to background energy is presented. These findings are extrapolated to quantify the statistical impact of such modifications to bolometer searches. Effects of these techniques on experiment duration and signal-to-background ratio are discussed.

The world is naturally radioactive, and approximately 82% of the radiation dose absorbed by humans, which is beyond human control, arises from natural sources such as cosmic, terrestrial, and inhaled or ingested radiation sources. In recent years, several international studies have been carried out, which have reported different values regarding the effect of background radiation on human health. Gamma radiation emitted from natural sources (background radiation) is largely due to primordial radionuclides, mainly the 232Th and 238U series and their decay products, as well as 40K, which exist at trace levels in the earth's crust. Their concentrations in soil, sands, and rocks depend on the local geology of each region of the world. Naturally occurring radioactive materials generally contain terrestrial-origin radionuclides left over since the creation of the earth. In addition, the existence of some springs and quarries increases the dose rate of background radiation in some regions, which are known as high level background radiation regions. The type of building materials used in houses can also affect the dose rate of background radiation. The present review article was carried out to consider all natural radiation sources, including cosmic, terrestrial, and food radiation. PMID:24223380

One of the pressing concerns in Dark Matter detection experiments is ensuring that the potential signal from exceedingly rare Dark Matter interactions is not obscured by background from interactions with more common particles. This work focuses on the ways in which alpha particles from primordial isotopes in the DEAP detector components can cause background events in the region of interest for Dark Matter search, based on both Monte Carlo simulations and data from the DEAP-1 prototype detector. The DEAP experiment uses liquid argon as a target for Dark Matter interactions and relies on the organic electroluminescent dye tetraphenyl butadiene (TPB) to shift the UV argon scintillation light to the visible range. The light yield and pulse shape of alpha particle induced scintillation of TPB, which is an essential input parameter for the simulations, was experimentally determined. An initial mismatch between simulated and measured background spectra could be explained by a model of geometric background events, which was experimentally confirmed and informed the design of certain parts of the DEAP-3600 detector that is under construction at the moment. Modification of the DEAP-1 detector geometry based on this model led to improved background rates. The remaining background was well described by the simulated spectra, and competitive limits on the contamination of acrylic with primordial isotopes were obtained. Purity requirements for the DEAP-3600 detector components were based on this work. The design and testing of a novel large area TPB deposition source, which will be used to make TPB coatings for the DEAP-3600 detector, is described.

The particle transport code MCNP has been used to produce a background radiation data file on a worldwide grid that can easily be sampled as a source in the code. Location-dependent cosmic showers were modeled by Monte Carlo methods to produce the resulting neutron and photon background flux at 2054 locations around Earth. An improved galactic-cosmic-ray feature was used to model the source term as well as data from multiple sources to model the transport environment through atmosphere, soil, and seawater. A new elevation scaling feature was also added to the code to increase the accuracy of the cosmic neutron background for user locations with off-grid elevations. Furthermore, benchmarking has shown the neutron integral flux values to be within experimental error.

These notes record three lectures given at the workshop "Higher symmetries in Physics", held at the Universidad Complutense de Madrid in November 2008. In them we explain how to construct a Lie (super)algebra associated to a spin manifold, perhaps with extra geometric data, and a notion of privileged spinors. The typical examples are supersymmetric supergravity backgrounds; although there are more classical instances of this construction. We focus on two results: the geometric constructions of compact real forms of the simple Lie algebras of type B4, F4 and E8 from S7, S8 and S15, respectively; and the construction of the Killing superalgebra of eleven-dimensional supergravity backgrounds. As an application of this latter construction we show that supersymmetric supergravity backgrounds with enough supersymmetry are necessarily locally homogeneous.

Background compensation in a device such as a hand and foot monitor is provided by digital means using a scaler. With no radiation level test initiated, the scaler is down-counted from zero according to the background measured. With a radiation level test initiated, the scaler is up-counted from the previous down-count position according to the radiation emitted from the monitored object, and an alarm is generated if, with the scaler having crossed zero in the positive-going direction, a particular number is exceeded in a specific time period after initiation of the test. If the test is initiated while the scaler is down-counting, the background count from the previous down-count, stored in a memory, is used as the initial starting point for the up-count.
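
The down-count/up-count logic can be sketched as follows. This is an illustrative model of the scheme described, not the patented circuit; the class and threshold names are hypothetical, and the time-window bookkeeping of the real device is omitted:

```python
class CompensatingScaler:
    """Signed scaler: background drives it negative, the test drives it
    positive; an alarm requires the net count to cross zero and exceed
    the threshold."""

    def __init__(self, alarm_threshold):
        self.count = 0
        self.alarm_threshold = alarm_threshold

    def measure_background(self, background_counts):
        # No test initiated: down-count according to the measured background.
        self.count -= background_counts

    def run_test(self, object_counts):
        # Test initiated: up-count from the stored down-count position.
        self.count += object_counts
        # Alarm only if the scaler has gone positive past the threshold.
        return self.count > self.alarm_threshold

clean = CompensatingScaler(alarm_threshold=10)
clean.measure_background(50)        # scaler now at -50
clean_alarm = clean.run_test(55)    # net +5, below threshold: no alarm

hot = CompensatingScaler(alarm_threshold=10)
hot.measure_background(50)
hot_alarm = hot.run_test(75)        # net +25, above threshold: alarm
```

The net count is automatically background-compensated: only radiation in excess of the previously measured background can push the scaler past the alarm threshold.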

The Majorana Collaboration is constructing a system containing 40 kg of HPGe detectors to demonstrate the feasibility and potential of a future tonne-scale experiment capable of probing the neutrino mass scale in the inverted-hierarchy region. To realize this, a major goal of the Majorana Demonstrator is to demonstrate a path forward to achieving a background rate at or below 1 cnt/(ROI-t-y) in the 4 keV region of interest around the Q-value at 2039 keV. This goal is pursued through a combination of a significant reduction of radioactive impurities in construction materials with analytical methods for background rejection, for example using powerful pulse shape analysis techniques profiting from the p-type point contact HPGe detectors technology. The effectiveness of these methods is assessed using simulations of the different background components whose purity levels are constrained from radioassay measurements.

We have determined the dispersion relation of a neutrino test particle propagating in the cosmic neutrino background. Describing the relic neutrinos and antineutrinos from the hot big bang as a dense medium, a matter potential or refractive index is obtained. The vacuum neutrino mixing angles are unchanged, but the energy of each mass state is modified. Using a matrix in the space of neutrino species, the induced potential is decomposed into a part which produces signatures in beta-decay experiments and another part which modifies neutrino oscillations. The low temperature of the relic neutrinos makes a direct detection extremely challenging. From a different point of view, the identified refractive effects of the cosmic neutrino background constitute an ultralow background for future experimental studies of nonvanishing Lorentz violation in the neutrino sector.
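
The induced matter potential referred to above has the standard Wolfenstein form for a neutral-current medium (a textbook expression, quoted here for orientation rather than taken from this abstract):

```latex
V \;=\; \pm\,\sqrt{2}\, G_F \left( n_{\nu} - n_{\bar{\nu}} \right),
```

where G_F is the Fermi constant, n_ν and n_ν̄ are the number densities of relic neutrinos and antineutrinos, and the sign distinguishes test neutrinos from antineutrinos; the smallness of the relic densities is what makes the refractive effect so hard to detect directly.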

In the original Generalized Geometry Holdup (GGH) model, all holdup deposits were modeled as points, lines, and areas [1, 5]. Two improvements [4] were recently made to the GGH model and are currently in use at the Y-12 National Security Complex: the finite-source correction CF_g and the self-attenuation correction. The finite-source correction corrects the average detector response for the width of point and line geometries, which, in effect, converts points and lines into areas. The result of a holdup measurement of an area deposit is a density-thickness, which is converted to mass by multiplying it by the area of the deposit. From the measured density-thickness, the true density-thickness can be calculated by correcting for the material self-attenuation. Therefore the self-attenuation correction is applied to finite point and line deposits as well as to areas. This report demonstrates that the finite-source and self-attenuation corrections also provide a means to better separate the gamma rays emitted by the material from the gamma rays emitted by background sources, for an improved background correction. Currently, the measured background radiation is attenuated for equipment walls in the case of area deposits but not for line and point sources, and it is not corrected for attenuation by the uranium material. In all of these cases, the background is overestimated, which causes a negative bias in the measurement. The finite-source correction and the self-attenuation correction will allow the measured background radiation to be corrected for both equipment attenuation and material attenuation for area sources as well as point and line sources.
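
The background correction described above amounts to exponential attenuation of the open-geometry background by the wall and by the deposit before subtraction. The sketch below is an illustration of that arithmetic under assumed names and coefficients, not the Y-12 measurement code:

```python
import math

def corrected_background(b_measured, mu_wall, t_wall, mu_rho, rho_x):
    # Attenuate the measured background by the equipment wall (linear
    # attenuation coefficient mu_wall times thickness t_wall) and by the
    # deposit itself (mass attenuation coefficient mu_rho times the
    # measured density-thickness rho_x).
    return b_measured * math.exp(-mu_wall * t_wall) * math.exp(-mu_rho * rho_x)

def net_counts(gross, b_measured, mu_wall, t_wall, mu_rho, rho_x):
    # Subtracting the attenuated (rather than raw) background removes the
    # negative bias described in the text.
    return gross - corrected_background(b_measured, mu_wall, t_wall,
                                        mu_rho, rho_x)
```

Because the attenuated background is always smaller than the raw measurement, the corrected net counts are always larger, which is exactly the direction of the bias being removed.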

A method and system for reducing background noise in a particle collider, comprises identifying an interaction point among a plurality of particles within a particle collider associated with a detector element, defining a trigger start time for each of the pixels as the time taken for light to travel from the interaction point to the pixel and a trigger stop time as a selected time after the trigger start time, and collecting only detections that occur between the start trigger time and the stop trigger time in order to thereafter compensate the result from the particle collider to reduce unwanted background detection.
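
The per-pixel gating described in the claim can be sketched directly: the start of each pixel's window is the light travel time from the interaction point, and the stop is that start plus a chosen gate. The geometry, units, and function names below are illustrative assumptions:

```python
import math

C_M_PER_NS = 0.2998  # speed of light in metres per nanosecond

def trigger_window(interaction_point, pixel_position, gate_ns):
    # Start time: earliest possible arrival of light from the interaction
    # point at this pixel; stop time: start plus the selected gate width.
    start = math.dist(interaction_point, pixel_position) / C_M_PER_NS
    return start, start + gate_ns

def accept(hit_time_ns, window):
    # Keep only detections inside the per-pixel window; everything outside
    # it is treated as background and discarded.
    start, stop = window
    return start <= hit_time_ns <= stop

# Pixel 1 m from the interaction point, with a 5 ns gate:
w = trigger_window((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), gate_ns=5.0)
```

A hit arriving before the earliest physically possible photon, or long after the gate closes, is rejected, which is how the scheme suppresses uncorrelated background detections.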

Photometric observations of the diffuse extreme-ultraviolet background with two photometers having bandpasses of 750-940 Å and 1040-1080 Å are reported. The payload, which was flown aboard an ARIES sounding rocket in June 1982, is described, including the electron detectors, filters, and calibration. The operation of the probe during the experiment, including its motions, is described. The primary experiment involved spectroscopic observation of the hot white dwarf HZ43. The photometer count rate is shown, and the measurements of the diffuse background are compared with theoretical predictions. Although lower limits were obtained using a narrowband detector, the measurements are not sensitive enough to support any firm astrophysical conclusions.

Inflation creates both scalar (density) and tensor (gravity wave) metric perturbations. We find that the tensor-mode contribution to the cosmic microwave background anisotropy on large angular scales can only exceed that of the scalar mode in models where the spectrum of perturbations deviates significantly from scale invariance. If the tensor mode dominates at large angular scales, then the value of ΔT/T predicted at 1 degree is smaller than if the scalar mode dominates, and, for cold-dark-matter models, bias factors greater than 1 can be made consistent with Cosmic Background Explorer (COBE) DMR results.

First results are reported from a program measuring the field-to-field fluctuation level of the cosmic diffuse background, using differences between the two background positions of each deep exposure with the High Energy X-ray Timing Experiment (HEXTE) instrument on the Rossi X-Ray Timing Explorer (RXTE). With 8 million live seconds accumulated to date, a fluctuation level in the 15-25 keV band is observed that is consistent with extrapolations from the High Energy Astrophysical Observatory-1 (HEAO-1) measurements. Positive results are expected eventually at higher energies. Models of active galactic nuclei (AGN) origin will eventually be constrained by this program.
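
A field-to-field fluctuation level of this kind is commonly estimated as the excess of the observed flux variance over the mean counting (statistical) variance. The sketch below is schematic only, not the HEXTE analysis; the input format is assumed.

```python
import statistics

def fluctuation_level(field_fluxes, counting_variances):
    """Excess variance estimate: observed variance across fields minus
    the mean counting variance, clipped at zero.  Inputs are assumed
    per-field flux estimates and their statistical variances."""
    obs_var = statistics.pvariance(field_fluxes)
    mean_count_var = sum(counting_variances) / len(counting_variances)
    return max(0.0, obs_var - mean_count_var)
```

If the fields scatter no more than counting statistics predict, the estimate is zero; a positive excess indicates genuine field-to-field fluctuations of the background.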

An exact formalism is developed for describing cosmological models with strong, long wavelength gravitational waves of general polarization propagating over backgrounds corresponding to Bianchi types I through VII. We introduce and discuss a new metric which exhibits the appropriate symmetries of two equivalent independent polarizations of gravitational waves. The formalism is applied to an empty type I cosmology, and it is shown how the original z-dependent chaotic singularity structure transforms itself into gravitational radiation propagating along the z-axis in a Bianchi I background.

We study a harmonic triangular lattice, which relaxes in the presence of an incommensurate short-wavelength potential. Monte Carlo simulations reveal that the elastic lattice exhibits only short-ranged translational correlations, despite the absence of defects in either lattice. Extended orientational order, however, persists in the presence of the background. Translational correlation lengths exhibit approximate power-law dependence upon cooling rate and background strength. Our results may be relevant to Wigner crystals, atomic monolayers on crystal surfaces, and flux-line and magnetic bubble lattices.

One of the results of the EINSTEIN/CfA X-ray stellar survey was a determination of the contribution of the disk stellar population to the galactic component of the diffuse soft (0.28-1.0 keV) X-ray background. This analysis employed both binned and unbinned nonparametric statistical methods developed by Avni et al. (1980). These methods permitted the use of the information contained in both the 22 detections and the 4 upper bounds on the luminosities of 26 dM stars in order to derive their luminosity function. Luminosity functions for earlier stellar types are not yet available; for these types, the median luminosities determined by Vaiana et al. (1981) are used, which underestimates their contribution to the background. We find that the M dwarfs dominate the disk-population stellar contribution to this background. To calculate the contribution of the stellar sources to the background, simple models are used both for the spatial distribution of the stars and for the properties of the intervening interstellar medium. A model is chosen in which all stellar classes have the same functional form for their spatial distribution: an exponentially decreasing distribution above the galactic equatorial plane, and a uniform distribution within the galactic plane over a region of several kiloparsecs centered on the Sun.
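
A spatial model of this form lends itself to a simple line-of-sight integration. The sketch below assumes purely illustrative values for the local density, scale height, per-star luminosity, and ISM absorption; none are taken from the survey, and the uniform-rule integration is a stand-in for whatever quadrature the authors used.

```python
import math

# Illustrative, assumed parameters -- NOT the survey's values.
N0    = 0.05       # local space density of dM stars, stars / pc^3
H     = 300.0      # exponential scale height above the plane, pc
LX    = 1e27       # median soft X-ray luminosity per star, erg/s
KAPPA = 1e-4       # effective ISM absorption coefficient, 1/pc

PC_CM = 3.086e18   # centimetres per parsec

def disk_intensity(b_deg, z_max=3000.0, dz=1.0):
    """Integrate stellar X-ray emissivity along a line of sight at
    galactic latitude b: density falls exponentially with height z
    above the plane, and the ISM attenuates each shell's light."""
    sin_b = math.sin(math.radians(b_deg))
    total = 0.0
    z = 0.5 * dz                            # midpoint rule in height
    while z < z_max:
        s = z / sin_b                       # path length to height z, pc
        n = N0 * math.exp(-z / H)           # stars / pc^3
        total += n * LX / (4.0 * math.pi) * math.exp(-KAPPA * s) * (dz / sin_b)
        z += dz
    return total / PC_CM**2                 # erg / s / cm^2 / sr
```

With weak absorption the intensity roughly follows a cosecant law, so lower latitudes (longer paths through the disk) are brighter than the pole.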

... AHRQ Effective Health Care Program. Access to published and unpublished pertinent scientific... commissioned the Effective Health Care (EHC) Program Evidence-based Practice Centers to complete a review of... information on indications not included in the review cannot be used by the Effective Health Care...

... AHRQ Effective Health Care Program. Access to published and unpublished pertinent scientific... Health Care (EHC) Program Evidence-based Practice Centers to complete a comparative effectiveness review... information on indications not included in the review cannot be used by the Effective Health Care...

... Information for Defense Purposes with Australia, Belgium, Denmark, France, the Federal Republic of Germany... Organization and European Regional Organizations (USRO), Paris, France, is the United States representative...

How the clumpy, structured universe we see today evolved from the smoothly distributed matter of the dark ages is one of the most pressing questions of modern cosmology. In the last few years, it has become clear that dusty star-forming galaxies are participating in this major change; indeed, they are a critical player in the assembly of stellar mass and the evolution of massive galaxies. Dusty star-forming galaxies at high redshift are very difficult to detect individually because they are so faint and numerous, compared with the angular resolution achievable in the far-IR to millimetre bands, that confusion plagues observations substantially. As a result, CMB experiments such as Planck can only see the brightest objects, which represent the tip of the iceberg in terms of galaxy mass halos and star formation rates. Fortunately, those experiments are sensitive enough to measure the cumulative IR emission from all galaxies throughout cosmic history: the cosmic IR background (CIB). The anisotropies detected in this background trace the large-scale distribution of star-forming galaxies and, to some extent, the underlying distribution of the dark matter haloes in which the galaxies reside. It is so bright that it represents (together with the shot noise) the main foreground contaminant to CMB temperature maps at small scales. I will review the current measurements of CIB anisotropies in Planck, but also in SPT, ACT, and Herschel, and I will discuss what we have learned from these measurements in the framework of galaxy evolution. I will show that most of the information from CIB anisotropies alone has been extracted; the future is now in cross-correlation. Because dusty galaxies trace the underlying dark matter, the CIB will correlate with any other tracer of the same dark matter field, provided that both overlap in redshift. The potential of the Planck maps, covering the whole sky, is tremendous. A good illustration of this promising future is the fact that the Planck discovered

Samples from hazardous waste site investigations frequently come from two or more statistical populations. Assessment of "background" levels of contaminants can be a significant problem. This problem is being investigated at the US EPA's EMSL in Las Vegas. This paper describes a ...
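
One common screening approach to this problem, shown here only as an illustrative sketch (not the EMSL method under investigation), is to set an empirical threshold from the background sample and flag site measurements that exceed it:

```python
def background_threshold(background, coverage=0.95):
    """Empirical upper percentile (nearest-rank) of the background
    sample, used as a simple screening threshold -- an illustrative
    stand-in for formal tolerance-limit methods."""
    ordered = sorted(background)
    k = max(0, min(len(ordered) - 1, round(coverage * len(ordered)) - 1))
    return ordered[k]

def flag_exceedances(site, background, coverage=0.95):
    """Return the site measurements exceeding the background threshold,
    i.e. candidates for the second (contaminated) population."""
    thr = background_threshold(background, coverage)
    return [x for x in site if x > thr]
```

With a background sample of 100 values, the default settings flag any site value above the 95th-ranked background observation.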

Examines three theoretical perspectives (family values, acculturation strategies, and social group identity) as predictors of the psychological well-being of adolescents from immigrant backgrounds. Reveals that the perspectives accounted for between 12% and 22% of variance of mental health, life satisfaction, and self-esteem, while social group…

... ENERGY (GENERAL PROVISIONS) COMPLIANCE WITH FLOODPLAIN AND WETLAND ENVIRONMENTAL REVIEW REQUIREMENTS General § 1022.1 Background. (a) Executive Order (E.O.) 11988—Floodplain Management (May 24, 1977) directs... effects of any action it may take in a floodplain are evaluated and that its planning programs and...

... ANTENNAS, TV ANTENNAS, AND SUPPORTING STRUCTURES § 1402.2 Background. As a result of numerous electrocutions which have occurred when consumers contacted powerlines with CB base station and outside TV... antennas, outside TV antennas, and supporting structures due to contact with overhead powerlines....

... 32 National Defense 5 2011-07-01 2011-07-01 false Background. 701.40 Section 701.40 National Defense Department of Defense (Continued) DEPARTMENT OF THE NAVY UNITED STATES NAVY REGULATIONS AND OFFICIAL RECORDS AVAILABILITY OF DEPARTMENT OF THE NAVY RECORDS AND PUBLICATION OF DEPARTMENT OF THE...

... 32 National Defense 5 2011-07-01 2011-07-01 false Background. 701.56 Section 701.56 National Defense Department of Defense (Continued) DEPARTMENT OF THE NAVY UNITED STATES NAVY REGULATIONS AND OFFICIAL RECORDS AVAILABILITY OF DEPARTMENT OF THE NAVY RECORDS AND PUBLICATION OF DEPARTMENT OF THE...

Spectrophone detects very small concentrations of trace gases. With gas in sample cell, laser is tuned to absorption line of interest. Molecular absorption in cell produces pulsed acoustical pressure at chopper frequency. Two optical paths with very different absorption lengths are used to pretune cell to balance out background absorption by cell windows.
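
The two-path balancing can be sketched with a simple signal model, S = k·(A_window + alpha·L), where the window absorption A_window is common to both paths; differencing the two signals cancels it and isolates the gas term. All numbers below are assumed for illustration only.

```python
def gas_absorption(signal_long, signal_short, path_long, path_short):
    """Two-path balancing: each photoacoustic signal is modelled as
    S = k * (A_window + alpha * L) with k = 1 for simplicity.  The
    window term cancels in the difference, leaving the gas absorption
    coefficient alpha."""
    return (signal_long - signal_short) / (path_long - path_short)

# Synthetic demonstration with assumed numbers: window absorption 0.02,
# gas absorption coefficient 1e-4 per cm, paths of 100 cm and 1 cm.
alpha, window = 1e-4, 0.02
s_long  = window + alpha * 100.0
s_short = window + alpha * 1.0
alpha_est = gas_absorption(s_long, s_short, 100.0, 1.0)
```

Because the window term appears identically in both signals, the recovered `alpha_est` equals the true gas absorption coefficient regardless of how strongly the windows absorb.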

The aim of the present study was to explore the educational background of the total population of inmates in Norwegian prisons. The sample consisted of all 3,289 inmates over 18 years of age in Norwegian prisons. The response rate was 71.1 percent. Ninety-four percent of the participants were men, and the mean age was 35 years. A questionnaire…

This paper tests a hypothesized linear relationship between social background and final grades in several political science courses that I taught at the University of Central Arkansas. I employ a cross-sectional research design and ordinary least squares (OLS) estimators to test the foregoing hypothesis. Relying on a sample of up to 204…
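
A minimal stand-in for the OLS estimator on a single predictor (the normal equations for slope and intercept); the data and function name are hypothetical, not drawn from the paper.

```python
def ols_slope_intercept(x, y):
    """Fit y = a + b*x by ordinary least squares via the normal
    equations: b = Sxy / Sxx, a = mean(y) - b * mean(x)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    return a, b
```

On exactly linear data the fit recovers the generating intercept and slope; on real grade data the slope estimate would be tested against zero.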

... 32 National Defense 5 2011-07-01 2011-07-01 false Background. 735.2 Section 735.2 National Defense Department of Defense (Continued) DEPARTMENT OF THE NAVY PERSONNEL REPORTING BIRTHS AND DEATHS IN COOPERATION...) policy is that military services will require their members to make official record of births,...
