Sample records for required log reduction

The Monte Carlo method enables detailed, explicit geometric, energy and angular representations, and hence is considered to be the most accurate method available for solving complex radiation transport problems. Because of its associated accuracy, the Monte Carlo method is widely used in the petroleum exploration industry to design, benchmark, and simulate nuclear well-logging tools. Nuclear well-logging tools, which contain neutron and/or gamma sources and two or more detectors, are placed in boreholes that contain water (and possibly other fluids) and that are typically surrounded by a formation (e.g., limestone, sandstone, calcites, or a combination). The response of the detectors to radiation returning from the surrounding formation is used to infer information about the material porosity, density, composition, and associated characteristics. Accurate computer simulation is a key aspect of this exploratory technique. However, because this technique involves calculating highly precise responses (at two or more detectors) based on radiation that has interacted with the surrounding formation, the transport simulations are computationally intensive, requiring significant use of variance reduction techniques, parallel computing, or both. Because of the challenging nature of these problems, nuclear well-logging problems have frequently been used to evaluate the effectiveness of variance reduction techniques (e.g., Refs. 1-4). The primary focus of these works has been on improving the computational efficiency associated with calculating the response at the most challenging detector location, which is typically the detector furthest from the source. Although the objective of nuclear well-logging simulations is to calculate the response at multiple detector locations, until recently none of the numerous variance reduction methods/techniques has been well-suited to simultaneous optimization of multiple detector (tally) regions. Therefore, a separate calculation is

Imagine that you are tasked to help a project improve its testing effort. In a realistic scenario it will quickly become clear that having an impact is difficult. First of all, it will likely be a challenge to suggest an alternative approach which is significantly more automated and/or more effective than current practice. The reality is that an average software system has complex input/output behavior. An automated testing approach will have to auto-generate test cases, each being a pair (i, o) consisting of a test input i and an oracle o. The test input i has to be somewhat meaningful, and the oracle o can be very complicated to compute. Second, even in cases where some testing technology has been developed that might improve current practice, it is likely difficult to change the established behavior of the testing team unless the technique is obviously superior and does everything already done by existing technology. So is there an easier way to incorporate formal methods-based approaches than a full-fledged test revolution? Fortunately, the answer is affirmative. A relatively simple approach is to benefit from the logging infrastructure that is likely already in place, since logging is part of most systems put in production. A log is a sequence of events, generated by special log recording statements, most often manually inserted in the code by the programmers. An event can be considered a data record: a mapping from field names to values. We can analyze such a log using formal methods, for example by checking it against a formal specification. This separates running the system from analyzing its behavior. It is not meant as an alternative to testing, since it does not address the important input generation problem. However, it offers a solution which testing teams might accept, since it has low impact on the existing process: a single person might be assigned to perform such log analysis, rather than the entire testing team changing its behavior.
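The idea above can be made concrete with a minimal sketch: a log is a list of events, each a record mapping field names to values, checked offline against a formal property. The field names ("time", "battery") and the property itself are invented for illustration, not taken from any particular system.

```python
# Hypothetical sketch of offline log analysis: each event is a record
# (dict of field names to values); we check the whole log against a
# property after the system has run.

def check_log(log):
    """Check two properties: timestamps never decrease, and the battery
    level never falls below 20. Returns a list of human-readable errors."""
    errors, last_time = [], None
    for i, event in enumerate(log):
        if last_time is not None and event["time"] < last_time:
            errors.append(f"event {i}: time went backwards")
        last_time = event["time"]
        if event["battery"] < 20:
            errors.append(f"event {i}: battery below threshold")
    return errors

log = [{"time": 0, "battery": 90},
       {"time": 5, "battery": 50},
       {"time": 3, "battery": 10}]   # out-of-order time and low battery
errors = check_log(log)
```

Because the checker only consumes a log, it can be run by one person after the fact, without touching how the system is executed or tested.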

... 47 Telecommunication 4 2010-10-01 2010-10-01 false General requirements related to the station log... requirements related to the station log. (a) The licensee of each station must maintain a station log as required by § 73.1820. This log shall be kept by station employees competent to do so, having...

... 26 Internal Revenue 16 2010-04-01 2010-04-01 true Reduction in tax for trucks used in logging. 41... Certain Highway Motor Vehicles § 41.4483-6 Reduction in tax for trucks used in logging. (a) In general. The tax imposed by section 4481 shall be reduced by 25 percent in the case of a truck used in...

There is a need to reduce emissions from conventional wood stoves in the short term while stove replacement takes place over the longer term. One possibility is to use fuels that burn cleaner than cordwood. Densified fuels have been commercially available for years and offer such a possibility. The objective of this project was to evaluate the emissions and efficiency performance of two commercially available densified log types in homes and compare their performance with cordwood. Researchers measured particulate matter (PM), carbon monoxide (CO), and volatile organic compound (VOC) emissions. Both total VOC and methane values are presented. Each home used an Automated Woodstove Emissions Sampler system, developed for the EPA and the Bonneville Power Administration, in a series of four week-long tests for each stove. The sequence of tests in each stove was cordwood, Pres-to-Logs, Eco-Logs, and a second, confirming test using Pres-to-Logs. Results show an average reduction of 52% in PM grams-per-hour emissions overall for the nine stoves using Pres-to-Logs. All nine stoves displayed a reduction in PM emissions. CO emissions were more modestly reduced, by 27%, and VOCs were reduced by 39%. The emissions reduction percentage was similar for both types of stoves.

Recent outbreaks of acid-resistant food pathogens in acid foods, including apple cider and orange juice, have raised concerns about the safety of acidified vegetable products. We determined pasteurization times and temperatures needed to assure a 5-log reduction in the numbers of Escherichia coli O157:H7, Listeria monocytogenes, and Salmonella strains in acidified cucumber pickle brines. Cocktails of five strains of each pathogen were (separately) used for heat-inactivation studies between 50 and 60 degrees C in brines that had an equilibrated pH value of 4.1. Salmonella strains were found to be less heat resistant than E. coli O157:H7 or L. monocytogenes strains. The nonlinear killing curves generated during these studies were modeled using a Weibull function. We found no significant difference in the heat-killing data for E. coli O157:H7 and L. monocytogenes (P = 0.9709). The predicted 5-log reduction times for E. coli O157:H7 and L. monocytogenes were found to fit an exponential decay function. These data were used to estimate minimum pasteurization times and temperatures needed to ensure safe processing of acidified pickle products and show that current industry pasteurization practices offer a significant margin of safety. PMID:15726973
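The Weibull model mentioned above is a standard way to describe nonlinear thermal-inactivation curves: the log10 survivor count follows -(t/delta)**p, so the time to any target log reduction can be obtained by inverting the model. The sketch below uses illustrative parameter values, not the study's fitted estimates.

```python
# Weibull survival model for thermal inactivation (illustrative parameters,
# not the study's fits): log10(N(t)/N0) = -(t/delta)**p.

def log10_survivors(t, delta, p):
    """Log10 of the surviving fraction at time t."""
    return -((t / delta) ** p)

def time_to_log_reduction(target_logs, delta, p):
    """Invert the model: time at which `target_logs` decades of kill occur."""
    return delta * target_logs ** (1.0 / p)

delta, p = 2.0, 1.5            # illustrative scale (min) and shape parameters
t5 = time_to_log_reduction(5, delta, p)
# consistency check: plugging t5 back in gives exactly a 5-log reduction
assert abs(log10_survivors(t5, delta, p) + 5) < 1e-9
```

With p != 1 the curve is nonlinear on a log scale (concave or convex), which is why a simple D-value (first-order) model would misestimate the 5-log reduction time for data like these.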

To reduce greenhouse gas emissions from deforestation, Indonesia instituted a nationwide moratorium on new license areas ("concessions") for oil palm plantations, timber plantations, and logging activity on primary forests and peat lands after May 2011. Here we indirectly evaluate the effectiveness of this policy using annual nationwide data on deforestation, concession licenses, and potential agricultural revenue from the decade preceding the moratorium. We estimate that on average granting a concession for oil palm, timber, or logging in Indonesia increased site-level deforestation rates by 17-127%, 44-129%, or 3.1-11.1%, respectively, above what would have occurred otherwise. We further estimate that if Indonesia's moratorium had been in place from 2000 to 2010, then nationwide emissions from deforestation over that decade would have been 241-615 MtCO2e (2.8-7.2%) lower without leakage, or 213-545 MtCO2e (2.5-6.4%) lower with leakage. As a benchmark, an equivalent reduction in emissions could have been achieved using a carbon price-based instrument at a carbon price of $3.30-7.50/tCO2e (mandatory) or $12.95-19.45/tCO2e (voluntary). For Indonesia to have achieved its target of reducing emissions by 26%, the geographic scope of the moratorium would have had to expand beyond new concessions (15.0% of emissions from deforestation and peat degradation) to also include existing concessions (21.1% of emissions) and address deforestation outside of concessions and protected areas (58.7% of emissions). Place-based policies, such as moratoria, may be best thought of as bridge strategies that can be implemented rapidly while the institutions necessary to enable carbon price-based instruments are developed. PMID:25605880

Lattice reduction (LR)-aided detectors have shown great potential in wireless communications owing to their low complexity and low bit-error-rate (BER) performance. The LR algorithms use a unimodular transformation to improve the orthogonality of the channel matrix. However, the LR algorithms utilize only the channel state information (CSI) and do not take into account the received signal, which is also important information for enhancing the performance of the detectors. In this paper, we readjust the received signal in the LR domain and propose a new scheme, based on the log-likelihood-ratio (LLR) criterion, to improve LR-aided detectors. The motivation for using the LLR criterion is that it utilizes both the received signal and the CSI, so that it can provide exact pairwise error probabilities (PEPs) of the symbols. In the proposed scheme, we design an LLR-based transformation algorithm (TA) which uses the unimodular transformation to minimize the PEPs of the symbols under the LLR criterion. Note that the PEPs of the symbols affect the error propagation in the vertical Bell Laboratories Layered Space-Time (VBLAST) detector, and decreasing the PEPs can reduce this error propagation; thus, our LLR-based TA-aided VBLAST detectors exhibit better BER performance than previous LR-aided VBLAST detectors. Both the BER performance and the computational complexity are demonstrated through simulation results.
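The LLR idea the abstract builds on can be illustrated generically: for a received vector y = H s + n with Gaussian noise, the log-likelihood ratio between two candidate symbol vectors reduces to a difference of squared Euclidean distances. This is a textbook illustration under assumed real-valued signals, not the paper's transformation algorithm.

```python
# Generic LLR between two symbol hypotheses for y = H s + n,
# n ~ N(0, sigma2 * I). Positive LLR means s1 is the more likely hypothesis.

def llr(y, H, s1, s2, sigma2):
    """log p(y|s1) - log p(y|s2), which simplifies to a distance difference."""
    def dist2(s):
        residual = [yi - sum(hij * sj for hij, sj in zip(row, s))
                    for yi, row in zip(y, H)]
        return sum(r * r for r in residual)
    return (dist2(s2) - dist2(s1)) / (2.0 * sigma2)

H = [[1.0, 0.2], [0.1, 1.0]]
s_true = [1.0, -1.0]
# noiseless received vector for clarity: y = H @ s_true
y = [1.0 * 1 + 0.2 * -1, 0.1 * 1 + 1.0 * -1]
val = llr(y, H, s_true, [1.0, 1.0], sigma2=0.5)  # positive: s_true favored
```

Because the LLR depends on both y and H, a criterion built on it uses strictly more information than lattice reduction alone, which only looks at H.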

The authors are building two algorithms for fault prediction using raw system-log data. This work is preliminary and has been applied only to a limited dataset; however, the results seem promising. The conclusions are that: (1) obtaining useful data from RAS logs is challenging; (2) extracting concentrated information improves efficiency and accuracy; and (3) function evaluation algorithms are fast and lend themselves well to scaling.

The detection of minimal residual disease (MRD) in myeloma using a 0.01% threshold (10−4) after treatment is an independent predictor of progression-free survival (PFS), but not always of overall survival (OS). However, MRD level is a continuous variable, and the predictive value of the depth of tumor depletion was evaluated in 397 patients treated intensively in the Medical Research Council Myeloma IX study. There was a significant improvement in OS for each log depletion in MRD level (median OS was 1 year for ≥10%, 4 years for 1% to <10%, 5.9 years for 0.1% to <1%, 6.8 years for 0.01% to <0.1%, and more than 7.5 years for <0.01% MRD). MRD level as a continuous variable determined by flow cytometry independently predicts both PFS and OS, with approximately 1 year median OS benefit per log depletion. The trial was registered at www.isrctn.com as #68454111. PMID:25645353

Health-care associated infections (HAIs) and the increasing number of antibiotic-resistant bacteria strains remain significant public health threats worldwide. Although the number of HAIs has decreased by using improved sterilization protocols, the cost related to HAIs is still quantified in billions of dollars. Furthermore, the development of multi-drug resistant strains is increasing exponentially, demonstrating that current treatments are inefficient. Thus, the quest for new methods to eradicate bacterial infection is increasingly important in antimicrobial, drug delivery and biomaterials research. Herein, the bactericidal activity of a water-soluble NO-releasing polysaccharide derivative was evaluated in nutrient broth media against three bacteria strains that are commonly responsible for HAIs. Data confirmed that this NO-releasing polysaccharide derivative induced an 8-logreduction in bacterial growth after 24h for Escherichia coli, Acinetobacter baumannii and Staphylococcus aureus. Additionally, the absence of bacteria after 72h of exposure to NO illustrates the inability of the bacteria to recover and the prevention of biofilm formation. The presented 8-logreduction in bacterial survival after 24h is among the highest reduction reported for NO delivery systems to date, and reaches the desired standard for industrially-relevant reduction. More specifically, this system represents the only water-soluble antimicrobial to reach such a significant bacterial reduction in nutrient rich media, wherein experimental conditions more closely mimic the in vivo environment than those in previous reports. Furthermore, the absence of bacterial activity after 72h and the versatility of using a water-soluble compound suggest that this NO-releasing polysaccharide derivative is a promising route for treating HAIs. PMID:26374942
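The "8-log reduction" reported above is simply the base-10 logarithm of the ratio of viable counts before and after treatment. A minimal helper makes the arithmetic explicit (the counts below are illustrative):

```python
# Log reduction = log10(N0 / N): the number of decades by which viable
# counts fell during treatment.
import math

def log_reduction(n_initial, n_final):
    """Decimal log of the ratio of initial to surviving counts (CFU)."""
    return math.log10(n_initial / n_final)

# e.g. 10^8 CFU/mL reduced to 1 CFU/mL is an 8-log reduction
eight_log = log_reduction(1e8, 1.0)
```

Note that each additional "log" is a further tenfold kill, so an 8-log reduction corresponds to 99.999999% of the starting population being inactivated.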

Selective timber harvesting operations, if uncontrolled, can severely degrade a forest. Although techniques for reducing logging damage are well-known and inexpensive to apply, incentives to adopt these techniques are generally lacking. Power companies and other emitters of {open_quotes}greenhouse{close_quotes} gases soon may be forced to reduce or otherwise offset their net emissions; one offset option is to fund programs aimed at reducing logging damage. To investigate the consequences of reductions in logging damage for ecosystem carbon storage, I constructed a model to simulate changes in biomass and carbon pools following logging of primary dipterocarp forests in southeast Asia. I adapted a physiologically-driven, tree-based model of natural forest gap dynamics (FORMIX) to simulate forest recovery following logging. Input variables included stand structure, volume extracted, stand damage (% stems), and soil disturbance (% area compacted). Output variables included total biomass, tree density, and total carbon storage over time. Assumptions of the model included the following: (1) areas with soil disturbances have elevated probabilities of vine colonization and reduced rates of tree establishment, (2) areas with broken canopy but no soil disturbance are colonized initially by pioneer tree species and 20 yr later by persistent forest species, (3) damaged trees have reduced growth and increased mortality rates. Simulation results for two logging techniques, conventional and reduced-impact logging, are compared with data from field studies conducted within a pilot carbon offset project in Sabah, Malaysia.
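The simulation idea above can be caricatured in a few lines: post-logging biomass recovers toward a carrying capacity, and heavier stand damage means a lower starting point, so reduced-impact logging stores more carbon throughout recovery. This toy logistic model is only a sketch of the concept; it is not FORMIX, and all parameter values are invented.

```python
# Toy logistic recovery of above-ground biomass (t/ha) after logging.
# Not the FORMIX model: a conceptual illustration with invented numbers.

def simulate_biomass(b0, carrying_capacity, growth_rate, years):
    """Annual-step logistic growth from post-harvest biomass b0."""
    b, series = b0, [b0]
    for _ in range(years):
        b += growth_rate * b * (1 - b / carrying_capacity)
        series.append(b)
    return series

K = 400.0   # illustrative primary-forest biomass, t/ha
conventional = simulate_biomass(b0=150.0, carrying_capacity=K,
                                growth_rate=0.05, years=60)
reduced_impact = simulate_biomass(b0=250.0, carrying_capacity=K,
                                  growth_rate=0.05, years=60)
# less damage -> more biomass (hence carbon) at every point of the recovery
assert all(r >= c for r, c in zip(reduced_impact, conventional))
```

The cumulative gap between the two trajectories is, in this caricature, the carbon-storage benefit a logging-damage-reduction offset program would claim.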

The aim of the present study was to evaluate the use of the envelope difference index (EDI) and log-likelihood ratio (LLR) to quantify the independent and interactive effects of wide dynamic range compression, digital noise reduction and directionality, and to carry out self-rated quality measures. A recorded sentence embedded in speech spectrum noise at +5 dB signal to noise ratio was presented to a four channel digital hearing aid and the output was recorded with different combinations of algorithms at 30, 45 and 70 dB HL levels of presentation through a 2 cc coupler. EDI and LLR were obtained in comparison with the original signal using MATLAB software. In addition, thirty participants with normal hearing sensitivity rated the output on the loudness and clarity parameters of quality. The results revealed that the temporal changes happening at the output are independent of the number of algorithms activated together in a hearing aid. However, at a higher level of presentation, temporal cues are better preserved if all of these algorithms are deactivated. The spectral components of speech tend to be affected by the presentation level. The results also indicate the importance of quality rating, as this helps in considering whether the spectral and/or temporal deviations created in the hearing aid are desirable or not. PMID:26557357
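An envelope difference index compares the amplitude envelopes of a processed and a reference signal. The exact normalization varies across implementations; the form below (0 for identical envelopes, 1 for completely non-overlapping ones) is one common choice and is assumed here rather than taken from the study.

```python
# Envelope difference index (EDI) sketch: sum of absolute envelope
# differences, normalized by the total envelope energy of both signals.
# Assumed normalization: 0 = identical, 1 = fully disjoint envelopes.

def edi(env_a, env_b):
    """EDI between two non-negative amplitude envelopes of equal length."""
    num = sum(abs(a - b) for a, b in zip(env_a, env_b))
    den = sum(a + b for a, b in zip(env_a, env_b))
    return num / den

identical = edi([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])  # 0.0: no temporal change
disjoint = edi([1.0, 0.0], [0.0, 1.0])             # 1.0: maximal change
```

In a study like the one above, a larger EDI between aided output and original signal indicates greater distortion of temporal envelope cues by the processing chain.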

Tygon(®) and other poly(vinyl chloride)-derived polymers are frequently used for tubing in blood transfusions, hemodialysis, and other extracorporeal circuit applications. These materials, however, tend to promote bacterial proliferation which contributes to the high risk of infection associated with device use. Antibacterial agents, such as nitric oxide donors, can be incorporated into these materials to eliminate bacteria before they can proliferate. The release of the antimicrobial agent from the device, however, is challenging to control and sustain on timescales relevant to blood transport procedures. Surface modification techniques can be employed to address challenges with controlled drug release. Here, surface modification using H2O (v) plasma is explored as a potential method to improve the biocompatibility of biomedical polymers, namely, to tune the nitric oxide-releasing capabilities from Tygon films. Film properties are evaluated pre- and post-treatment by contact angle goniometry, x-ray photoelectron spectroscopy, and optical profilometry. H2O (v) plasma treatment significantly enhances the wettability of the nitric oxide-releasing films, doubles film oxygen content, and maintains surface roughness. Using the kill rate method, the authors determine both treated and untreated films cause an 8-log reduction in the population of both Gram-negative Escherichia coli and Gram-positive Staphylococcus aureus. Notably, however, H2O (v) plasma treatment delays the kill rate of treated films by 24 h, yet antibacterial efficacy is not diminished. Results of nitric oxide release, measured via chemiluminescent detection, are also reported and correlated to the observed kill rate behavior. Overall, the observed delay in biocidal agent release caused by our treatment indicates that plasma surface modification is an important route toward achieving controlled drug release from polymeric biomedical devices. PMID:27440395

..., AND SOUTH ATLANTIC Shrimp Fishery of the South Atlantic Region § 622.207 Bycatch Reduction Device (BRD) requirements. (a) BRD requirement for South Atlantic shrimp. On a shrimp trawler in the South Atlantic EEZ... this section and is certified or provisionally certified for the area in which the shrimp trawler...

... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Reduction of matching contribution requirement. 92.222 Section 92.222 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development HOME INVESTMENT PARTNERSHIPS PROGRAM Program Requirements...

... Arrangements) of this chapter. (b) Steering gear, whistle, and means of communication. Prior to departure. See... service only. See § 196.15-5. (d) Verification of vessel compliance with applicable stability requirements.... See § 196.15-7. (e) Loading doors. Where applicable, every closing and any opening when not...

LogScope is a software package for analyzing log files. The intended use is for offline post-processing of such logs, after the execution of the system under test. LogScope can, however, in principle, also be used to monitor systems online during their execution. Logs are checked against requirements formulated as monitors expressed in a rule-based specification language. This language has similarities to a state machine language, but is more expressive, for example, in its handling of data parameters. The specification language is user friendly, simple, and yet expressive enough for many practical scenarios. The LogScope software was initially developed to specifically assist in testing JPL's Mars Science Laboratory (MSL) flight software, but it is very generic in nature and can be applied to any application that produces some form of logging information (which almost any software does).
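The "data parameters" feature mentioned above can be sketched as a monitor that binds a field value when one event occurs and expects a later matching event with the same value. The class and event names below are invented for illustration; this is not LogScope's actual specification language.

```python
# Hypothetical parameterized monitor in the spirit of a rule-based log
# checker: a `start` event binds its `key` field as an obligation, and a
# later `end` event with the same key discharges it.

class RequireFollowedBy:
    def __init__(self, start, end, key):
        self.start, self.end, self.key = start, end, key
        self.obligations = set()

    def step(self, event):
        """Feed one event (a dict of field names to values) to the monitor."""
        if event["type"] == self.start:
            self.obligations.add(event[self.key])
        elif event["type"] == self.end:
            self.obligations.discard(event[self.key])

    def verdict(self):
        """Obligations never discharged by the end of the log = violations."""
        return sorted(self.obligations)

monitor = RequireFollowedBy("dispatch", "complete", key="cmd")
for e in [{"type": "dispatch", "cmd": "PIC"},
          {"type": "dispatch", "cmd": "DRIVE"},
          {"type": "complete", "cmd": "PIC"}]:
    monitor.step(e)
unmatched = monitor.verdict()
```

Because the monitor consumes one event at a time, the same code works for offline post-processing of a finished log and, in principle, for online monitoring during execution.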

U.S. Navy ship acquisition programs such as DD 21 and CVNX are increasingly relying on top-down requirements analysis (TDRA) to define and assess design approaches for workload and manpower reduction, and for ensuring required levels of human performance, reliability, safety, and quality of life at sea. The human systems integration (HSI) approach to TDRA begins with a function analysis which identifies the functions derived from the requirements in the Operational Requirements Document (ORD). The function analysis serves as the function baseline for the ship, and also supports the definition of RDT&E and Total Ownership Cost requirements. A mission analysis is then conducted to identify mission scenarios, again based on requirements in the ORD, and the Design Reference Mission (DRM). This is followed by a mission/function analysis which establishes the function requirements to successfully perform the ship's missions. Function requirements of major importance for HSI are information, performance, decision, and support requirements associated with each function. An allocation of functions defines the roles of humans and automation in performing the functions associated with a mission. Alternate design concepts, based on function allocation strategies, are then described, and task networks associated with the concepts are developed. Task network simulations are conducted to assess workloads and human performance capabilities associated with alternate concepts. An assessment of the affordability and risk associated with alternate concepts is performed, and manning estimates are developed for feasible design concepts.

Discusses the use of transaction logging in Okapi-related projects to allow search algorithms and user interfaces to be investigated, evaluated, and compared. A series of examples is presented, illustrating logging software for character-based and graphical user interface systems, and demonstrating the usefulness of relational database management…

Recently, numerous product recalls and one devastating outbreak that claimed 21 lives were attributed to Listeria monocytogenes in ready-to-eat meat products. Consequently, the Food Safety and Inspection Service published a federal register notice requiring manufacturers of ready-to-eat meat and poultry products to reassess their hazard analysis and critical control point plans for these products as specified in 9 CFR 417.4(a). Lebanon bologna is a moist, fermented ready-to-eat sausage. Because of undesirable quality changes, Lebanon bologna is often not processed above 48.9 degrees C (120 degrees F). Therefore, the present research was conducted to validate the destruction of L. monocytogenes in Lebanon bologna batter in a model system. During production, fermentation of Lebanon bologna to pH 4.7 alone significantly reduced L. monocytogenes by 2.3 log10 CFU/g of the sausage mix (P < 0.01). Heating the fermented mix to 48.9 degrees C in 10.5 h destroyed at least 7.0 log10 CFU of L. monocytogenes per g of sausage mix. A combination of low pH (5.0 or lower) and high heating temperatures (> or =43.3 degrees C, 115 degrees F) destroyed more than 5 log10 CFU of L. monocytogenes per g of sausage mix during the processing of Lebanon bologna. In conclusion, an existing commercial process, which was validated for destruction of Escherichia coli O157:H7, was also effective for the destruction of more than 5 log10 CFU of L. monocytogenes. PMID:11403142

We developed a novel decontamination method to inactivate Escherichia coli O157:H7 on radish seeds without adversely affecting seed germination or product quality. The use of heat (55, 60, and 65 °C) combined with relative humidity (RH; 25, 45, 65, 85, and 100%) for 24 h was evaluated for effective microbial reduction and preservation of seed germination rates. A significant two-way interaction of heat and RH was observed for both microbial reduction and germination rate (P<0.0001). Increases in heat and RH were associated with corresponding reductions in E. coli O157:H7 and in germination rate (P<0.05). The order of lethality for the different treatments was generally as follows: no treatment <55 °C/25-65% RH ≈60 °C/25-45% RH ≈65 °C/25% RH <55 °C/85% RH =60 °C/65% RH <55 °C/100% RH =60 °C/85-100% RH =65 °C/45-100% RH. The most effective condition, 65 °C/45% RH, completely inactivated E. coli O157:H7 on the seeds (7.0 log CFU/g reduction) and had no significant effect on the germination rate (85.4%; P>0.05) or product quality. The method uses only heat and relative humidity without chemicals, and is thus applicable as a general decontamination procedure in sprout-producing plants where the use of growth chambers is the norm. PMID:25732001

Risk management for wastewater treatment and reuse has led to growing interest in understanding and optimising pathogen reduction during biological treatment processes. However, modelling pathogen reduction is often limited by poor characterization of the relationships between variables and incomplete knowledge of removal mechanisms. The aim of this paper was to assess the applicability of Bayesian belief network models to represent associations between pathogen reduction and operating conditions and monitoring parameters, and to predict activated sludge (AS) performance. Naïve Bayes and semi-naïve Bayes networks were constructed from an activated sludge dataset including operating and monitoring parameters, and removal efficiencies for two pathogens (native Giardia lamblia and seeded Cryptosporidium parvum) and five native microbial indicators (F-RNA bacteriophage, Clostridium perfringens, Escherichia coli, coliforms and enterococci). First, we defined the Bayesian network structures for the two pathogen log10 reduction value (LRV) class nodes discretized into two states (< and ≥ 1 LRV) using two different learning algorithms. Eight metrics, such as Prediction Accuracy (PA) and Area Under the receiver operating Curve (AUC), provided a comparison of model prediction performance, certainty and goodness of fit. This comparison was used to select the optimum models. The optimum Tree Augmented naïve models predicted removal efficiency with high AUC when all system parameters were used simultaneously (AUCs for C. parvum and G. lamblia LRVs of 0.95 and 0.87 respectively). However, metrics for individual system parameters showed only the C. parvum model was reliable. By contrast, individual parameters for G. lamblia LRV prediction typically obtained low AUC scores (AUC < 0.81). Useful predictors for C. parvum LRV included solids retention time, turbidity and total coliform LRV. The methodology developed appears applicable for predicting pathogen removal efficiency in water treatment
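The classifier described above can be sketched in miniature. The following is an illustrative Gaussian naïve Bayes model predicting the binary LRV class (0: < 1 LRV, 1: ≥ 1 LRV) from two operating parameters; the feature choices (solids retention time, turbidity) follow the predictors named in the abstract, but all data values are invented for the example.

```python
import math

# Minimal Gaussian naive Bayes sketch; not the paper's actual model or data.
def fit(X, y):
    model = {}
    for c in set(y):
        rows = [x for x, label in zip(X, y) if label == c]
        n = len(rows)
        model[c] = {
            "prior": n / len(y),
            # per-feature (mean, std); tiny floor on std avoids divide-by-zero
            "stats": [
                (sum(col) / n,
                 math.sqrt(sum((v - sum(col) / n) ** 2 for v in col) / n) or 1e-9)
                for col in zip(*rows)
            ],
        }
    return model

def predict(model, x):
    def log_gauss(v, mu, sd):
        return -0.5 * math.log(2 * math.pi * sd * sd) - (v - mu) ** 2 / (2 * sd * sd)
    scores = {
        c: math.log(p["prior"])
           + sum(log_gauss(v, mu, sd) for v, (mu, sd) in zip(x, p["stats"]))
        for c, p in model.items()
    }
    return max(scores, key=scores.get)

# Invented training data: [SRT (days), turbidity (NTU)] -> LRV class
X = [[12, 2.0], [15, 1.5], [14, 1.8], [4, 6.0], [5, 5.5], [3, 7.0]]
y = [1, 1, 1, 0, 0, 0]
model = fit(X, y)
print(predict(model, [13, 1.7]))  # → 1 (high SRT, low turbidity)
```

A Tree Augmented naïve model, as used in the study, additionally learns dependencies between features; the naïve version here assumes feature independence for brevity.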

This pilot study examined the feasibility of adherence to a low sodium diet in a sample of healthy New Zealand adults. It also addressed whether following a low sodium diet was accompanied by changes in intakes of other nutrients that influence cardiovascular risk. Eleven healthy adults provided dietary intake data and a 24-hour urine collection at baseline and follow-up. They then received nutritional counselling based on the World Health Organization recommendation for sodium intake (<2000 mg/day) and received ongoing nutritional support while undertaking a low sodium diet for four weeks. At the end of the four-week period, participants completed a semi-structured interview that elicited participants' opinions on barriers and facilitators to following a low sodium diet and explored changes in participants' dietary habits and behaviours. Thematic analysis revealed that adherence to a low sodium diet required substantial changes to participants' usual food purchasing and preparation habits. Participants reported that lack of control over the sodium content of meals eaten away from the home, the complex and time-consuming nature of interpreting nutrition information labels, and difficulty identifying suitable snacks were barriers to adherence. Detailed meal planning and cooking from scratch, using flavour replacements, reading food labels to identify low sodium foods, receiving support from other people and receiving tailored nutrition advice were facilitators. Mean sodium intake reduced over the period, accompanied by a decrease in mean intake of total fat. These factors suggest that sodium reduction in New Zealand adults was feasible. However, considerable changes to eating behaviours were required. PMID:27395412

The purpose of this document is to help developers of Grid middleware and application software generate log files that will be useful to Grid administrators, users, developers and Grid middleware itself. Currently, most generated log files are only useful to the author of the program. Good logging practices are instrumental to performance analysis, problem diagnosis, and security auditing tasks such as incident tracing and damage assessment. This document does not discuss the issue of a logging API. It is assumed that a standard log API such as syslog (C), log4j (Java), or logger (Python) is being used. Other custom logging APIs or even printf could be used. The key point is that the logs must contain the required information in the required format. At a high level of abstraction, the best practices for Grid logging are: (1) Consistently structured, typed, log events; (2) A standard high-resolution timestamp; (3) Use of logging levels and categories to separate logs by detail and purpose; (4) Consistent use of global and local identifiers; and (5) Use of some regular, newline-delimited ASCII text format. The rest of this document describes each of these recommendations in detail.
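The five practices above can be illustrated with a small event emitter. This is a hedged sketch, not an implementation from the document: the `log_event` helper and field names are invented, but each line it emits is structured key=value ASCII with a high-resolution UTC timestamp, an explicit level, and a global identifier tying related events together.

```python
import sys
import uuid
from datetime import datetime, timezone

# Illustrative structured log emitter following the five practices listed
# above; function name and field names are examples, not a standard API.
def log_event(event, level="INFO", stream=sys.stdout, **fields):
    # (2) high-resolution timestamp, microsecond precision, UTC
    ts = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%fZ")
    # (1) consistently structured, typed event; (3) explicit level
    parts = [f"ts={ts}", f"event={event}", f"level={level}"]
    parts += [f"{k}={v}" for k, v in sorted(fields.items())]
    # (5) regular, newline-delimited ASCII text
    stream.write(" ".join(parts) + "\n")

# (4) a global identifier correlates events across components
job_id = uuid.uuid4()
log_event("transfer.start", level="DEBUG", job=job_id, host="node01")
log_event("transfer.end", job=job_id, status="ok")
```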

This patent describes a method for conducting a gravimetry survey of an earth formation, comprising the steps of: continuously traversing the formation with a gravity logging tool having at least two piezoelectric force transducers mounted at spaced-apart positions within the tool, exciting the piezoelectric transducers to vibrate at a characteristic resonant frequency, measuring the periods of vibration of the piezoelectric transducers as the logging tool continuously traverses the formation, the periods of vibration changing in response to the force exerted on the piezoelectric transducer by the acceleration of gravity and acceleration due to tool motion along the formation, and determining the difference in the measured, temperature-compensated periods of vibration of the piezoelectric transducers, relating force to the periods of vibration, within the formation.

... OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE FISHERIES OF THE CARIBBEAN, GULF OF MEXICO, AND SOUTH ATLANTIC Shrimp Fishery of the Gulf of Mexico § 622.53 Bycatch reduction device (BRD... RA will notify the Gulf of Mexico Fishery Management Council in writing, and the public will...

Mn(IV) and Mn(II) are the most stable and prevalent forms of manganese in natural environments. The occurrence of Mn(III) in minerals and the detection of soluble Mn(III) in natural waters, however, suggest that Mn(III) is an intermediate in both the oxidation of Mn(II) and the reduction of Mn(IV). Mn(III) has recently been proposed as an intermediate during the oxidation of Mn(II) by Mn-oxidizing bacteria but has never been considered as an intermediate during the bio-reduction of Mn(IV). Here we show for the first time that microbial Mn(IV) reduction proceeds step-wise via two successive one-electron transfer reactions with production of soluble Mn(III) as transient intermediate. Incubations with mutant strains demonstrate that the reduction of both solid Mn(IV) and soluble Mn(III) occurs at the outer membrane of the cell. In addition, pseudo-first order rate constants obtained from these incubations indicate that Mn(IV) respiration involves only one of the two potential terminal reductases (c-type cytochrome MtrC and OmcA) involved in Fe(III) respiration. More importantly, only the second electron transfer step is coupled to production of dissolved inorganic carbon, suggesting that the first electron transfer reaction is a reductive solubilization step that increases Mn bioavailability. These findings oppose the long-standing paradigm that microbial Mn(IV) reduction proceeds via a single two-electron transfer reaction coupled to organic carbon oxidation, and suggest that diagenetic models should be revised to correctly account for the impact of manganese reduction in the global carbon cycle.

The Incubator Display Software Requirements Specification was initially developed by Intrinsyx Technologies Corporation (Intrinsyx) under subcontract to Lockheed Martin, Contract Number NAS2-02090, for the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC) Space Station Biological Research Project (SSBRP). The Incubator Display is a User Payload Application (UPA) used to control an Incubator subrack payload for the SSBRP. The Incubator Display functions on-orbit as part of the subrack payload laptop, on the ground as part of the Communication and Data System (CDS) ground control system, and also as part of the crew training environment.

A critical factor in ensuring the safety of acidified foods is the establishment of a thermal process that assures the destruction of acid-resistant vegetative pathogenic and spoilage bacteria. For acidified foods such as dressings and mayonnaises with pH values of 3.5 or higher, the high water phase acidity (acetic acid of 1.5 to 2.5% or higher) can contribute to lethality, but there is a lack of data showing how the use of common ingredients such as acetic acid and preservatives, alone or in combination, can result in a 5-log reduction for strains of Escherichia coli O157:H7, Salmonella enterica, and Listeria monocytogenes in the absence of a postpackaging pasteurization step. In this study, we determined the times needed at 10°C to achieve a 5-log reduction of E. coli O157:H7, S. enterica, and L. monocytogenes in pickling brines with a variety of acetic and benzoic acid combinations at pH 3.5 and 3.8. Evaluation of 15 different acid-pH combinations confirmed that strains of E. coli O157:H7 were significantly more acid resistant than strains of S. enterica and L. monocytogenes. Among the acid conditions tested, holding times of 4 days or less could achieve a 5-log reduction for vegetative pathogens at pH 3.5 with 2.5% acetic acid or at pH 3.8 with 2.5% acetic acid containing 0.1% benzoic acid. These data indicate the efficacy of benzoic acid for reducing the time necessary to achieve a 5-log reduction in target pathogens and may be useful for supporting process filings and the determination of critical controls for the manufacture of acidified foods. PMID:23834800
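The "5-log reduction" criterion used throughout these records is simple arithmetic: the log10 reduction is log10(N0/N), the base-10 logarithm of the ratio of initial to surviving counts. A worked example with illustrative numbers (not drawn from the study):

```python
import math

# log10 reduction from initial count n0 to surviving count n (e.g. CFU/g)
def log_reduction(n0, n):
    return math.log10(n0 / n)

# Reducing 2.0e7 CFU/g to 2.0e2 CFU/g is exactly a 5-log reduction,
# i.e. 99.999% of the population destroyed.
print(log_reduction(2.0e7, 2.0e2))  # → 5.0
```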

The bisC gene of Escherichia coli is tentatively identified as the structural gene for biotin sulfoxide reductase by the isolation of bisC(Ts) mutants that make thermolabile enzyme. The products of four other E. coli genes (chlA, chlB, chlE and chlG) are also needed for enzymatic activity. Mutations previously assigned to the bisA, bisB, and bisD genes belong to genes chlA, chlE, and chlG, respectively. The biotin sulfoxide reductase deficiency of a chlG mutant is partially reversed by the addition of 10 mM molybdate to the growth medium. Mutational inactivation of the chlD gene reduces the specific activity of biotin sulfoxide reductase about twofold. This effect is reversed by the addition of 1 mM molybdate to the growth medium. The specific activity of biotin sulfoxide reductase is decreased about 30-fold by the presence of tungstate in the growth medium, an effect that has been observed previously with nitrate reductase and other molybdoenzymes. The specific activity of biotin sulfoxide reductase is not elevated in a lysate prepared by derepressing a lambda cI857 chlG prophage. Whereas biotin sulfoxide reductase prepared by sonic extraction of growing cells is almost completely dependent on the presence of a small heat-stable protein resembling thioredoxin, much of the enzyme obtained from lysates of thermoinduced lambda cI857 lysogens does not require this factor. PMID:6460021

This document addresses requirements for post-test data reduction in support of the Orbital Flight Tests (OFT) mission evaluation team, specifically those which are planned to be implemented in the ODRC (Orbiter Data Reduction Complex). Only those requirements which have been previously baselined by the Data Systems and Analysis Directorate configuration control board are included. This document serves as the control document between Institutional Data Systems Division and the Integration Division for OFT mission evaluation data processing requirements, and shall be the basis for detailed design of ODRC data processing systems.

Ballistic lunar capture trajectories have been successfully utilized for lunar orbital missions since 1991. Recent interest in lunar landing trajectories has occurred due to a directive from President Bush to return humans to the Moon by 2015. NASA requirements for humans to return to the lunar surface include separation of crew and cargo missions, all lunar surface access, and anytime-abort to return to Earth. Such requirements are very demanding from a propellant standpoint. The subject of this paper is the application of lunar ballistic capture for the reduction of lunar landing propellant requirements. Preliminary studies of the application of weak stability boundary (WSB) trajectories and ballistic capture have shown that considerable savings in low Earth orbit (LEO) mission mass may be realized, on the order of 36% less than conventional Hohmann transfer orbit missions. Other advantages, such as reduction in launch window constraints and reduction of lunar orbit maintenance propellant requirements, have also surfaced from this study. PMID:16510407

The use of log-polar image sampling coordinates rather than conventional Cartesian coordinates offers a number of advantages for visual tracking and docking of space vehicles. Pixel count is reduced without decreasing the field of view, with commensurate reduction in peripheral resolution. Smaller memory requirements and reduced processing loads are the benefits in working environments where bulk and energy are at a premium. Rotational and zoom symmetries of log-polar coordinates accommodate range and orientation extremes without computational penalties. Separation of radial and rotational coordinates reduces the complexity of several target centering algorithms, described below.
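The symmetry properties claimed above follow directly from the coordinate transform. In the sketch below (an illustration, not the paper's implementation), a pixel at Cartesian (x, y) maps to (u, v) = (log r, θ) about the image center, so a zoom by factor k becomes a pure shift of log k in u and a rotation becomes a pure shift in v:

```python
import math

# Log-polar mapping about an (assumed) image center (cx, cy).
def to_log_polar(x, y, cx=0.0, cy=0.0):
    dx, dy = x - cx, y - cy
    r = math.hypot(dx, dy)          # radial distance from center
    return math.log(r), math.atan2(dy, dx)

# A zoom by factor 2 only shifts the log-radial coordinate by log(2),
# leaving the angular coordinate unchanged:
u1, v1 = to_log_polar(3.0, 4.0)
u2, v2 = to_log_polar(6.0, 8.0)     # same direction, doubled radius
print(round(u2 - u1, 6), round(v2 - v1, 6))  # → 0.693147 0.0
```

This shift-invariance is what lets range (zoom) and orientation (rotation) extremes be handled without the computational penalty of resampling, as the abstract notes.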

Several strategies have been employed to reduce the long in vivo half-life of our lead CB1 antagonist, triazolopyridazinone 3, to differentiate the pharmacokinetic profile versus the lead clinical compounds. An in vitro and in vivo clearance data set revealed a lack of correlation; however, when compounds with <5% free fraction were excluded, a more predictable correlation was observed. Compounds with log P between 3 and 4 were likely to have significant free fraction, so we designed compounds in this range to give more predictable clearance values. This strategy produced compounds with desirable in vivo half-lives, ultimately leading to the discovery of compound 46. The progression of compound 46 was halted due to the contemporaneous marketing and clinical withdrawal of other centrally acting CB1 antagonists; however, the design strategy successfully delivered a potent CB1 antagonist with the desired pharmacokinetic properties and a clean off-target profile. PMID:24182233

The analog to digital processing requirements of onsite test facilities are described. The source and medium of all input data to the Data Reduction Complex (DRC) and the destination and medium of all output products of the analog-to-digital processing are identified. Additionally, preliminary input and output data formats are presented along with the planned use of the output products.

Log files are typically semi- or un-structured. To be useable for visualization and machine learning, they need to be parsed into a standard, structured format. Log-tool is a tool for facilitating the parsing, structuring, and routing of log files (e.g. intrusion detection logs, web server logs, system logs). It consists of three main components: (1) Input: reads data from files, standard input, and syslog; (2) Parser: parses the log file based on regular expressions into structured data (JSON format); (3) Output: writes structured data to commonly used formats, including Redis (a database), standard output, and syslog.
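The parse step can be sketched as follows. This is an illustrative example, not log-tool's actual pattern or field names: a regular expression with named groups turns one line of a web-server-style log into structured JSON.

```python
import json
import re

# Example pattern for a common-log-format-style line; the field names
# (host, ts, request, status, size) are illustrative choices.
LINE_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<request>[^"]*)" '
    r"(?P<status>\d{3}) (?P<size>\d+|-)"
)

def parse_line(line):
    """Return the line as a JSON object string, or None if it doesn't match."""
    m = LINE_RE.match(line)
    return json.dumps(m.groupdict()) if m else None

line = '10.0.0.1 - - [01/Jan/2024:00:00:01 +0000] "GET /index.html HTTP/1.1" 200 5120'
print(parse_line(line))
```

Routing the resulting JSON to Redis, standard output, or syslog is then a matter of swapping the output sink, which is the separation of concerns the three-component design above describes.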

Adoption of reduced-impact logging (RIL) methods could reduce CO2 emissions by 30-50% across at least 20% of remaining tropical forests. We developed two cost-effective and robust indices for comparing the climate benefits (reduced CO2 emissions) due to RIL. The indices correct for variability in the volume of commercial timber among concessions. We determined that a correction for variability in terrain slope was not needed. We found that concessions certified by the Forest Stewardship Council (FSC, N = 3), when compared with noncertified concessions (N = 6), did not have lower overall CO2 emissions from logging activity (felling, skidding, and hauling). On the other hand, FSC certified concessions did have lower emissions from one type of logging impact (skidding), and we found evidence of a range of improved practices using other field metrics. One explanation of these results may be that FSC criteria and indicators, and associated RIL practices, were not designed to achieve overall emissions reductions. Also, commonly used field metrics are not reliable proxies for overall logging emissions performance. Furthermore, the simple distinction between certified and noncertified concessions does not fully represent the complex history of investments in improved logging practices. To clarify the relationship between RIL and emissions reductions, we propose the more explicit term 'RIL-C' to refer to the subset of RIL practices that can be defined by quantified thresholds and that result in measurable emissions reductions. If tropical forest certification is to be linked with CO2 emissions reductions, certification standards need to explicitly require RIL-C practices. PMID:24022913

1. GENERAL VIEW OF LOG POND AND BOOM FOR UNLOADING CEDAR LOGS FROM TRUCKS AT LOG DUMP, ADJACENT TO MILL; TRUCKS FORMERLY USED TRIP STAKES, THOUGH FOR SAFER HANDLING OF LOGS WELDED STAKES ARE NOW REQUIRED; AS A RESULT LOADING IS NOW DONE WITH A CRANE - Lester Shingle Mill, 1602 North Eighteenth Street, Sweet Home, Linn County, OR

This paper discusses the advantages of a statistical approach to log analysis. Statistical techniques use inverse methods to calculate formation parameters. The use of statistical techniques has been limited, however, by the complexity of the mathematics and lengthy computer time required to minimize traditionally used nonlinear equations.
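The inverse approach described above can be illustrated in one dimension. This sketch is not the paper's method: the forward model, its coefficients, and the single fitted parameter (porosity) are all invented, but it shows the core idea of fitting a formation parameter by minimizing the squared misfit between tool readings and a forward model, here with Newton iteration.

```python
# Hypothetical linear forward model: two tool readings as functions of
# porosity phi (coefficients are illustrative, not real petrophysics).
def forward(phi):
    return [2.65 - 1.65 * phi, 0.05 + 0.95 * phi]

def fit_porosity(readings, phi=0.5, iters=20):
    """Minimize sum of squared residuals via Newton's method (exact for a
    linear forward model, so this converges in a single step)."""
    derivs = (-1.65, 0.95)                     # d(reading_i)/d(phi)
    hess = 2 * sum(d * d for d in derivs)      # second derivative of misfit
    for _ in range(iters):
        pred = forward(phi)
        grad = sum(2 * (p - r) * d for p, r, d in zip(pred, readings, derivs))
        phi -= grad / hess
    return phi

readings = forward(0.23)                 # synthetic "measured" logs
print(round(fit_porosity(readings), 4))  # → 0.23
```

Real statistical log analysis minimizes nonlinear multi-parameter systems, which is exactly the computational burden the paper identifies; the linear single-parameter case here is only a toy.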

Field of view has always been a design feature paramount to helmet design, and in particular space suit design, where the helmet must provide an adequate field of view for a large range of activities, environments, and body positions. For Project Constellation, a slightly different approach to helmet requirement maturation was utilized; one that was less a direct function of body position and suit pressure and more a function of the mission segment in which the field of view is required. Through taxonomization of various parameters that affect suited FOV, as well as consideration for possible nominal and contingency operations during that mission segment, a reduction process was able to condense the large number of possible outcomes to only six unique field of view angle requirements that still captured all necessary variables without sacrificing fidelity. The specific field of view angles were defined by considering mission segment activities, historical performance of other suits, comparison between similar requirements (pressure visor up versus down, etc.), estimated requirements from other teams for field of view (Orion, Altair, EVA), previous field of view tests, medical data for shirtsleeve field of view performance, and mapping of visual field data to generate 45-degree off-axis field of view requirements. Full resolution of several specific field of view angle requirements warranted further work, which consisted of low and medium fidelity field of view testing in the rear entry ISuit and DO27 helmet prototype. This paper serves to document this reduction progress and follow-up testing employed to write the Constellation requirements for helmet field of view.

The purpose is to analyze network monitoring logs to support the computer incident response team: specifically, to gain a clear understanding of the Uniform Resource Locator (URL) and its structure, and to provide a way to break down a URL into protocol, host name, domain name, path, and other attributes. Finally, a method is provided to perform data reduction by identifying the different types of advertisements shown on a webpage for incident data analysis. The procedure used for analysis and data reduction is a computer program that analyzes the URL and identifies advertisement links among the actual content links.
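The URL breakdown step can be sketched with the standard library. This is an illustration of the decomposition described above, not the program itself; the field names follow Python's `urlparse` rather than the (unspecified) tool's own schema.

```python
from urllib.parse import urlparse

# Break a URL into the attributes named in the abstract:
# protocol, host, path, and other components.
def breakdown(url):
    p = urlparse(url)
    return {
        "protocol": p.scheme,
        "host": p.hostname,
        "path": p.path,
        "query": p.query,
    }

# A hypothetical ad-server URL pulled from a monitoring log:
print(breakdown("https://ads.example.com/banner/img.gif?id=123"))
```

Data reduction then follows by filtering: entries whose host matches known advertisement domains can be dropped before incident analysis, leaving only content links.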

The aim of this study was to assess PM10 pollution level and estimate required source emission reduction in the Belgrade area, the second largest urban center in the Balkans. Daily mass concentrations and trace metal content (As, Cd, Cr, Mn, Ni, Pb) of PM10 were evaluated for three air quality monitoring sites of different types: urban-traffic (Slavija), suburban (Lazarevac) and rural (Grabovac) under the industrial influence, during the period of 2012-13. Noncompliance with current Air Quality Standards (AQS) was noticeable: annual means were higher than AQS at Slavija and Lazarevac, and daily frequency threshold was exceeded at all three locations. Annual means of As at Lazarevac were about four times higher than the target concentration, which could be attributed to the proximity of coal-fired power plants, and dust resuspension from coal basin and nearby ash landfills. Additionally, levels of Ni and Cr were significantly higher than in other European cities. Carcinogenic health risk of inhabitants' exposure to trace metals was assessed as well. Cumulative cancer risk exceeded the upper limit of acceptable US EPA range at two sites, with Cr and As as the major contributors. To estimate source emission reduction, required to meet AQS, lognormal, Weibull and Pearson 5 probability distribution functions (PDFs) were used to fit daily PM10 concentrations. Based on the rollback equation and best fitting PDF, estimated reduction was within the range of 28-98%. Finally, the required reduction obtained using two-parameter exponential distribution suggested that risks associated to accidental releases of pollutants should be of greater concern. PMID:26252876
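The rollback equation referenced above, in its common form, gives the fractional source reduction needed to bring an observed concentration down to a standard. The numbers below are invented for illustration; the background term is often taken as zero when unknown, and the study's actual calculation combined this with the fitted distribution percentiles.

```python
# Rollback equation (common form, an assumption here, not quoted from the
# paper): R = (C_current - C_standard) / (C_current - background)
def rollback_reduction(c_current, c_standard, background=0.0):
    return (c_current - c_standard) / (c_current - background)

# e.g. an observed design-value PM10 of 125 ug/m3 against a 50 ug/m3
# daily AQS, with zero assumed background:
print(round(100 * rollback_reduction(125, 50), 1))  # → 60.0 (% reduction)
```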

This is an executable python script which offers two different conversions for well log data: 1) Conversion from a BoreholeLASLogData.xls model to a LAS version 2.0 formatted XML file. 2) Conversion from a LAS 2.0 formatted XML file to an entry in the WellLog Content Model. Example templates for BoreholeLASLogData.xls and WellLogsTemplate.xls can be found in the package after download.

Field of view has always been a design feature paramount to helmets, and in particular space suits, where the helmet must provide an adequate field of view for a large range of activities, environments, and body positions. For Project Constellation, a different approach to helmet requirement maturation was utilized; one that was less a direct function of body position and suit pressure and more a function of the mission segment in which the field of view will be required. Through taxonomization of various parameters that affect suited field of view, as well as consideration for possible nominal and contingency operations during that mission segment, a reduction process was employed to condense the large number of possible outcomes to only six unique field of view angle requirements that still captured all necessary variables while sacrificing minimal fidelity.

We report the characterization of the diheme cytochrome c peroxidase (CcP) from Shewanella oneidensis (So) using UV/Visible absorbance, Electron Paramagnetic Resonance spectroscopy, and Michaelis-Menten kinetics. While sequence alignment with other bacterial diheme cytochrome c peroxidases suggests that So CcP may be active in the as-isolated state, we find that So CcP requires reductive activation for full activity, similar to the canonical Pseudomonas-type of bacterial CcP enzyme. Peroxide turnover initiated with oxidized So CcP shows a distinct lag-phase, which we interpret as reductive activation in situ. A simple kinetic model is sufficient to recapitulate the lag-phase behavior of the progress curves and separate the contributions of reductive activation and peroxide turnover. The rates of catalysis and activation differ between MBP-fusion and tag-free So CcP, and also depend on the identity of the electron donor. Combined with Michaelis-Menten analysis these data suggest that So CcP can accommodate electron donor binding in several possible orientations, and that the presence of the MBP tag affects the availability of certain binding sites. To further investigate the structural basis of reductive activation in So CcP we introduced mutations into two different regions of the protein that have been suggested to be important for reductive activation in homologous bacterial CcPs. Mutations in a flexible loop region neighboring the low-potential heme significantly increased the activation rate, confirming the importance of flexible loop regions of the protein in converting the inactive, as-isolated enzyme into the activated form. PMID:22239664

In large-scale computing systems, the sheer volume of log data generated presents daunting challenges for debugging and monitoring of these systems. The Oak Ridge Leadership Computing Facility's premier simulation platform, the Cray XT5 known as Jaguar, can generate a few hundred thousand log entries in less than a minute for many system level events. Determining the root cause of such system events requires analysis and interpretation of a large number of log messages. Most often, the log messages are best understood when they are interpreted collectively rather than individually. In this paper, we present our approach to interpreting log messages by identifying their commonalities and grouping them into clusters. Given a set of log messages within a time interval, we group the messages based on source, target, and/or error type, and correlate the messages with hardware and application information. We monitor the Lustre log messages in the XT5 console log and show that such grouping of log messages assists in detecting the source of system events. By intelligent grouping and correlation of events in the log, we are able to provide system administrators with meaningful information in a concise format for root cause analysis.
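The grouping step described above can be sketched in a few lines. This is an illustration only: the message dictionaries and field names (`source`, `error`) are invented, not Jaguar's actual console-log schema, and the real system additionally correlates clusters with hardware and application information.

```python
from collections import defaultdict

# Cluster log messages within a time window by (source, error type).
def group_messages(messages):
    clusters = defaultdict(list)
    for msg in messages:
        clusters[(msg["source"], msg["error"])].append(msg)
    return clusters

# Invented sample messages from a hypothetical window:
messages = [
    {"t": 1, "source": "oss03",   "error": "lustre_timeout"},
    {"t": 2, "source": "oss03",   "error": "lustre_timeout"},
    {"t": 2, "source": "nid0042", "error": "ecc"},
]
clusters = group_messages(messages)
for key, msgs in sorted(clusters.items()):
    print(key, len(msgs))
```

Reporting one line per cluster instead of one per message is what collapses hundreds of thousands of entries into the concise summary administrators need for root cause analysis.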

... part. (iii) An entry of each test and activation of the Emergency Alert System (EAS) pursuant to the requirement of part 11 of this chapter and the EAS Operating Handbook. Stations may keep EAS data in a special EAS log which shall be maintained at a convenient location; however, this log is considered a part...

... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...

... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...

... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...

We reviewed the behavior of wildfire in riparian zones, primarily in the western United States, and the potential ecological consequences of postfire logging. Fire behavior in riparian zones is complex, but many aquatic and riparian organisms exhibit a suite of adaptations that allow relatively rapid recovery after fire. Unless constrained by other factors, fish tend to rebound relatively quickly, usually within a decade after a wildfire. Additionally, fire and subsequent erosion events contribute wood and coarse sediment that can create and maintain productive aquatic habitats over time. The potential effects of postfire logging in riparian areas depend on the landscape context and disturbance history of a site; however, available evidence suggests two key management implications: (1) fire in riparian areas creates conditions that may not require intervention to sustain the long-term productivity of the aquatic network and (2) protection of burned riparian areas gives priority to what is left rather than what is removed. Research is needed to determine how postfire logging in riparian areas has affected the spread of invasive species and the vulnerability of upland forests to insect and disease outbreaks, and how postfire logging will affect the frequency and behavior of future fires. The effectiveness of using postfire logging to restore desired riparian structure and function is therefore unproven, but such projects are gaining interest with the departure of forest conditions from those that existed prior to timber harvest, fire suppression, and climate change. In the absence of reliable information about the potential consequences of postfire timber harvest, we conclude that providing postfire riparian zones with the same environmental protections they received before they burned is justified ecologically. Without a commitment to monitor management experiments, the effects of postfire riparian logging will remain unknown and highly contentious. PMID:16922216

Log files are typically semi-structured or unstructured. To be usable for visualization and machine learning, they need to be parsed into a standard, structured format. Log-tool is a tool for facilitating the parsing, structuring, and routing of log files (e.g., intrusion detection logs, web server logs, system logs). It consists of three main components: (1) Input: reads data from files, standard input, and syslog; (2) Parser: parses the log file based on regular expressions into structured data (JSON format); (3) Output: writes structured data into commonly used formats, including Redis (a database), standard output, and syslog.
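The regex-to-JSON parsing step can be sketched as follows. The pattern and field names below are illustrative (a common web-server log layout), not Log-tool's actual configuration:

```python
import json
import re

# Hypothetical regex for one web-server log line; each named group becomes
# a field in the structured (JSON) record.
PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<size>\d+)'
)

def parse_line(line):
    """Parse one log line into a dict, or return None if it doesn't match."""
    m = PATTERN.match(line)
    return m.groupdict() if m else None

line = '192.0.2.1 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326'
record = parse_line(line)
print(json.dumps(record))
```

Unmatched lines return `None`, so malformed input can be routed elsewhere rather than silently dropped.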

Meeting a greenhouse gas (GHG) reduction target of 80% below 1990 levels in the year 2050 requires detailed long-term planning due to complexity, inertia, and path dependency in the energy system. A detailed investigation of supply and demand alternatives is conducted to assess requirements for future California energy systems that can meet the 2050 GHG target. Two components are developed here that build novel analytic capacity and extend previous studies: (1) detailed bottom-up projections of energy demand across the building, industry and transportation sectors; and (2) a high-resolution variable renewable resource capacity planning model (SWITCH) that minimizes the cost of electricity while meeting GHG policy goals in the 2050 timeframe. Multiple pathways exist to a low-GHG future, all involving increased efficiency, electrification, and a dramatic shift from fossil fuels to low-GHG energy. The electricity system is found to have a diverse, cost-effective set of options that meet aggressive GHG reduction targets. This conclusion holds even with increased demand from transportation and heating, but the optimal levels of wind and solar deployment depend on the temporal characteristics of the resulting load profile. Long-term policy support is found to be a key missing element for the successful attainment of the 2050 GHG target in California.

The purpose of this document is to investigate Oracle database log buffer queuing and its effect on the ability to load data using a specialized data loading system. Experiments were carried out on a Linux system using an Oracle 9.2 database. Previous experiments on a Sun 4800 running Solaris had shown that 100,000 entities per minute was an achievable rate. The question was then asked: can we do this on Linux, and where are the bottlenecks? A secondary question was also lurking: how can the loading be further scaled to handle even higher throughput requirements? Testing was conducted using a Dell PowerEdge 6650 server with four CPUs and a Dell PowerVault 220s RAID array with 14 36GB drives and 128 MB of cache. Oracle Enterprise Edition 9.2.0.4 was used for the database and Red Hat Linux Advanced Server 2.1 was used for the operating system. This document details the maximum observed throughputs using the same test suite that was used for the Sun tests, describes the testing performed along with an analysis of the bottlenecks encountered, details issues related to Oracle and Linux, and makes some recommendations based on the findings.

This invention relates to an improved method for determining the oil saturation of subsurface earth formations in the vicinity of a well borehole. High energy neutrons irradiate the subsurface earth formations and gamma rays caused by inelastic scatter with the subsurface earth formation constituent materials are measured. For a chosen borehole depth, gamma ray logs are taken in different situations: first, with the formation fluid water and oil mixture in an undisturbed state; second, after flushing the formation with alcohol to displace the formation water and oil mixture; and, finally, after flushing the alcohol from the formation with water to obtain a measurement with no oil in the formation. The gamma ray measurements obtained are then used to determine the oil saturation without requiring knowledge of the porosity of the earth formation, borehole conditions or formation type. When the original oil content of the formation is at a naturally flushed, or residual, oil saturation, the present invention may be used to determine the residual oil saturation.

We propose an architecture for an audit log system for forensic photography, which ensures that the chain of evidence of a photograph taken by a photographer at a crime scene is maintained from the point of image capture to its end application at trial. The requirements for such a system are specified and the results of experiments are presented which demonstrate the feasibility of the proposed approach.
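One common way to preserve a chain of evidence in an audit log is hash chaining, where each entry commits to its predecessor. The sketch below illustrates that general idea only; the record fields and hashing scheme are assumptions, not the paper's actual architecture:

```python
import hashlib
import json
import time

def append_entry(log, image_hash, photographer):
    """Append an audit entry whose hash covers the previous entry's hash."""
    prev = log[-1]["entry_hash"] if log else "0" * 64
    entry = {
        "timestamp": time.time(),
        "photographer": photographer,
        "image_hash": image_hash,
        "prev_hash": prev,
    }
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["entry_hash"] = hashlib.sha256(prev.encode() + payload).hexdigest()
    log.append(entry)
    return entry

def verify(log):
    """Recompute every link; any tampered entry breaks the chain."""
    prev = "0" * 64
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "entry_hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if entry["entry_hash"] != hashlib.sha256(prev.encode() + payload).hexdigest():
            return False
        prev = entry["entry_hash"]
    return True
```

Because each entry's hash depends on the one before it, altering or deleting any record after the fact invalidates every subsequent link.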

Enhanced anaerobic dechlorination is a promising technology for in situ remediation of chlorinated ethene DNAPL source areas. However, the build-up of organic acids and HCl in the source zone can lead to significant groundwater acidification. The resulting pH drop inhibits the activity of the dechlorinating microorganisms and thus may stall the remediation process. Source zone remediation requires extensive dechlorination, such that it may be common for soil's natural buffering capacity to be exceeded, and for acidic conditions to develop. In these cases bicarbonate addition (e.g., NaHCO3, KHCO3) is required for pH control. As a design tool for treatment strategies, we have developed BUCHLORAC, a Windows Graphical User Interface based on an abiotic geochemical model that allows the user to predict the acidity generated during dechlorination and associated buffer requirements for their specific operating conditions. BUCHLORAC was motivated by the SABRE (Source Area BioREmediation) project, which aims to evaluate the effectiveness of enhanced reductive dechlorination in the treatment of chlorinated solvent source zones.
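The acidity bookkeeping that BUCHLORAC automates can be illustrated with a back-of-the-envelope calculation. The assumptions here are mine, not the paper's: complete dechlorination of TCE (C2HCl3) to ethene releases 3 mol HCl per mol TCE, each mole of HCl consumes 1 mol of bicarbonate, and the organic-acid contribution is ignored:

```python
# HCO3- + HCl -> Cl- + H2O + CO2  (1:1 neutralization, HCl term only)
MW_TCE = 131.4      # g/mol, trichloroethene
MW_NAHCO3 = 84.0    # g/mol, sodium bicarbonate

def bicarbonate_demand_g(tce_mass_g):
    """Grams of NaHCO3 to neutralize the HCl from fully dechlorinating TCE."""
    mol_tce = tce_mass_g / MW_TCE
    mol_hcl = 3.0 * mol_tce          # 3 Cl atoms released per TCE molecule
    return mol_hcl * MW_NAHCO3

# e.g. 1 kg of TCE DNAPL:
print(round(bicarbonate_demand_g(1000.0), 1))
```

A full design calculation would also account for fermentation acids and the soil's natural buffering capacity, which is what motivates a geochemical model rather than stoichiometry alone.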

Two logs of EE-3A were performed during the last couple of weeks. The first was a Temperature/Casing-Collar Locator (CCL) log, which took place on Friday, December 10, 1993. The second was a Caliper log, done in cooperation with the Dia-Log Company of Odessa, TX, on Monday, December 13, 1993.

This is the final report and addresses all of the work performed on this program. Specifically, it covers vehicle architecture background, definition of six baseline engine cycles, the reliability baseline (Space Shuttle Main Engine QRAS), component-level reliability/performance/cost for the six baseline cycles, and selection of three cycles for further study. This report further addresses technology improvement selection and component-level reliability/performance/cost for the three cycles selected for further study, as well as risk reduction plans and recommendations for future studies.

After 8 years of extensive R&D in the new technology of coal log pipeline (CLP), a pilot plant is being built to demonstrate and test a complete CLP system for coal transportation. The system consists of a coal log fabrication plant; a 3,000-ft-long, 6-inch-diameter underground pipeline loop to transport 5.4-inch-diameter coal logs; a log injection/ejection system; a pump bypass; a reservoir that serves as both the intake and the outlet of the CLP system; an instrumentation system that includes pressure transducers, coal log sensors, and flowmeters; and an automatic control system that includes PLCs and a central computer. The pilot plant is to be completed in May 2000. Upon completion of construction, the pilot plant will be used for running various types of coal, testing the degradation rate of drag reduction in CLP using Polyox (polyethylene oxide), testing the reliability of a special coal log sensor invented at the University of Missouri, testing the reliability and efficiency of the pump-bypass system for pumping coal log trains through the pipe, and testing various hardware components and software for operating the pilot plant. Data collected from the tests will be used for designing future commercial systems of CLP. The pilot plant experiments are to be completed in two years. Then, the technology of CLP will be ready for commercial use.

Logs record the events that have happened within a system, so they are considered the history of system activities. They are one of the objects that digital forensic investigators examine when a security incident happens. However, logs were initially created for troubleshooting and are not purposefully designed for digital forensics. Thus, enormous and redundant log data make analysis tasks complicated and time-consuming when searching for valuable information, and make logging-related techniques difficult to utilize in some systems, such as embedded systems. In this paper, we reconsider the data logging mechanism in terms of forensics and, consequently, propose purpose-based forensic logging. In purpose-based forensic logging, we collect only the required logs according to a specific purpose, which could decrease the space that logs occupy and may mitigate the analysis tasks during forensic investigations.
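The core idea of purpose-based collection can be sketched as a filter that keeps only event types required for a declared purpose. The purposes and event types below are hypothetical examples, not the paper's taxonomy:

```python
# Map each forensic purpose to the set of event types it requires.
PURPOSES = {
    "authentication-audit": {"login", "logout", "auth_failure"},
    "file-integrity": {"file_write", "file_delete", "perm_change"},
}

def purpose_filter(events, purpose):
    """Keep only events whose type is required for the given purpose."""
    required = PURPOSES[purpose]
    return [e for e in events if e["type"] in required]

events = [
    {"type": "login", "user": "alice"},
    {"type": "file_write", "path": "/etc/passwd"},
    {"type": "heartbeat"},
]
print(purpose_filter(events, "authentication-audit"))
```

Everything not required for the stated purpose is never stored, which is the source of the space savings the abstract describes.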

An increasing prevalence of obesity all over the world reflects a lack of effective measures in both prevention and treatment of obesity. Obesity as a disease has been underestimated by the lay public as well as health care providers. However, obesity represents a substantial health problem associated with a decreased quality of life. Obesity is linked to numerous chronic diseases (cardiovascular diseases, diabetes, hyperlipidemia, gout, osteoarthritis, gallstones, and bowel, breast and genitourinary cancers) that lead to premature disability and mortality. Health risks increase with a body mass index (BMI) over 25 in individuals 19-35 years of age and with a BMI over 27 in those 35 years of age and older. Health risks also increase with an excess accumulation of visceral fat manifested as an increase in waist circumference (> 100 cm) or in waist to hip ratio (> 0.85 for females and > 1.00 for males). According to studies carried out in different countries, current economic costs of obesity represent 5-8% of all direct health costs. In contrast, effective treatment of obesity results in a substantial decrease in expenditures associated with pharmacotherapy of hypertension, diabetes, hyperlipidemia and osteoarthritis. Both scientists and clinicians involved in obesity research and treatment recommend introducing long-term weight management programs that focus more on the overall health of the participants than the weight loss per se. Therefore, it will be necessary to establish new realistic goals in obesity management that reflect reasonable weights and the recently demonstrated beneficial health effects of modest (5-10%) weight loss. Comprehensive obesity treatment consisting of low fat diet, exercise, behavioral modification, drug therapy and surgical procedures requires differentiated weight management programs modified according to the degree and type of obesity as well as to current health complications present. The Czech Society for the Study of Obesity

Technologies including NMR logging apparatus and methods are disclosed. Example NMR logging apparatus may include surface instrumentation and one or more downhole probes configured to fit within an earth borehole. The surface instrumentation may comprise a power amplifier, which may be coupled to the downhole probes via one or more transmission lines, and a controller configured to cause the power amplifier to generate a NMR activating pulse or sequence of pulses. Impedance matching means may be configured to match an output impedance of the power amplifier through a transmission line to a load impedance of a downhole probe. Methods may include deploying the various elements of disclosed NMR logging apparatus and using the apparatus to perform NMR measurements.

Currently existing message logging protocols demonstrate a classic pessimistic vs. optimistic tradeoff. We show that the optimistic-pessimistic tradeoff is not inherent to the problem of message logging. We construct a message-logging protocol that has the positive features of both optimistic and pessimistic protocols: our protocol prevents orphans and allows simple failure recovery; however, it requires no blocking in failure-free runs. Furthermore, this protocol does not introduce any additional message overhead compared to one implemented for a system in which messages may be lost but processes do not crash.

This paper stochastically models a single-node telecommunications system both with and without the use of intelligent multiplexing. Intelligent multiplexers take advantage of the idle periods or silences that occur during the course of speech transmissions to merge (or multiplex) packetized talkspurts from more than one source onto a single channel. This allows for a more efficient use of available bandwidth, thereby reducing the amount of bandwidth required to carry a particular traffic load. Digitizing speech into packets of equal size also allows for compression, further reducing bandwidth needs. By comparing the models for systems both with and without multiplexing, we are able to determine the reduction in bandwidth which may be expected for a particular grade of service (measured by blocking probabilities). A bivariate continuous time Markov chain model for a multiplexer is presented. An approximation is introduced to calculate limiting blocking probabilities much more quickly and for larger systems than is possible by solving a set of linear equations for the bivariate model. The accuracy of the approximation is explored through comparison with the bivariate model; the approximation provides a somewhat conservative estimate of blocking, but is close enough to be used as a tool for the range of relevant values. The approximation is then used to compare blocking probabilities for three different levels of speech activity. Results are shown in tabular form.
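As a point of reference for the blocking probabilities discussed above, the classic Erlang B formula gives the blocking probability for a fixed number of channels offered a given traffic load with no multiplexing gain. This is standard teletraffic theory, not the paper's bivariate model:

```python
def erlang_b(channels, offered_load_erlangs):
    """Erlang B blocking probability via the numerically stable recursion
    B(0) = 1;  B(n) = a*B(n-1) / (n + a*B(n-1)),  a = offered load."""
    b = 1.0
    for n in range(1, channels + 1):
        b = offered_load_erlangs * b / (n + offered_load_erlangs * b)
    return b

# 10 channels offered 5 erlangs of traffic:
print(round(erlang_b(10, 5.0), 4))
```

The recursion avoids the factorials in the closed-form expression, so it remains accurate for large systems, which is the same motivation the abstract gives for its approximation over solving the full set of linear equations.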

A method is described of logging earth formations traversed by a well bore and utilizing a logging tool having a neutron source and a short spaced and a long spaced thermal neutron detector which produce an independent response as a function of depth of the logging tool in a well bore. The method comprises: moving the logging tool through a well bore to locate a section of the earth formations which has minimum porosity and obtaining measurement responses from each of the long and short spaced detectors; normalizing the responses of the long and short spaced detectors by matching the sensitivity of response of the long spaced detector to the sensitivity of response of the short spaced detector for an earth formation which has minimum porosity so that the normalized responses track one another in an earth formation which has minimum porosity; and moving the tool over the length of the well bore to be surveyed while recording the normalized responses of the long and short spaced neutron detectors as a function of depth.
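The normalization step above can be sketched as fitting a gain and offset that map the long-spaced response onto the short-spaced response over the minimum-porosity section. A least-squares fit is one way to do this; the method description does not prescribe a particular fit, and the count values below are illustrative:

```python
def fit_normalization(short_counts, long_counts):
    """Least-squares gain/offset so that gain*long + offset matches short."""
    n = len(short_counts)
    mean_s = sum(short_counts) / n
    mean_l = sum(long_counts) / n
    cov = sum((l - mean_l) * (s - mean_s)
              for l, s in zip(long_counts, short_counts))
    var = sum((l - mean_l) ** 2 for l in long_counts)
    gain = cov / var
    offset = mean_s - gain * mean_l
    return gain, offset

def normalize(long_counts, gain, offset):
    return [gain * l + offset for l in long_counts]

# Detector responses over a minimum-porosity interval (illustrative numbers):
short = [1000, 1010, 990, 1005]
long_ = [480, 486, 474, 483]
g, o = fit_normalization(short, long_)
print([round(x, 1) for x in normalize(long_, g, o)])
```

Once fitted over the minimum-porosity section, the same gain and offset are applied to the long-spaced response over the whole surveyed interval, so the two normalized curves track one another wherever porosity is at its minimum.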

A classroom lecture at Capistrano Connections Academy in Southern California involves booting up the home computer, logging on to a Web site, and observing a teacher conducting a PowerPoint presentation of that day's lesson entirely online. Through microphone headsets, students can watch on their home computers, respond to the teacher's questions,…

The authors created an interactive reflective log (IRL) to provide teachers with an opportunity to use a journal approach to record, evaluate, and communicate student understanding of science concepts. Unlike a traditional journal, the IRL incorporates prompts to encourage students to discuss their understanding of science content and science…

A Perl module designed to read and parse the voluminous set of event or accounting log files produced by a Portable Batch System (PBS) server. This module can filter on date-time and/or record type. The data can be returned in a variety of formats.
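The parsing task can be sketched in a few lines (shown here in Python rather than Perl). PBS accounting records are semicolon-delimited lines of the general form `date time;record type;job id;key=value ...`; the filtering interface below is my own illustration, not the module's API:

```python
def parse_pbs_records(lines, record_type=None):
    """Parse PBS accounting lines, optionally keeping one record type."""
    records = []
    for line in lines:
        try:
            timestamp, rtype, job_id, message = line.rstrip("\n").split(";", 3)
        except ValueError:
            continue  # skip lines that don't have all four fields
        if record_type and rtype != record_type:
            continue
        records.append({"time": timestamp, "type": rtype,
                        "id": job_id, "message": message})
    return records

log = [
    "05/01/2020 10:00:00;S;123.server;user=alice queue=batch",
    "05/01/2020 11:30:00;E;123.server;user=alice Exit_status=0",
]
print(parse_pbs_records(log, record_type="E"))
```

Filtering at parse time (here by record type, e.g. `E` for job end) is what keeps analysis tractable on the voluminous files the abstract mentions.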

The major events of the first manned moon landing mission, Apollo 11, are presented in chronological order from launch time until arrival of the astronauts aboard the U.S.S. Hornet. The log is descriptive, non-technical, and includes numerous color photographs of the astronauts on the moon. (PR)

We review and formulate results concerning log-concavity and strong log-concavity in both discrete and continuous settings. We show how preservation of log-concavity and strong log-concavity on ℝ under convolution follows from a fundamental monotonicity result of Efron (1969). We provide a new proof of Efron's theorem using the recent asymmetric Brascamp-Lieb inequality due to Otto and Menz (2013). Along the way we review connections between log-concavity and other areas of mathematics and statistics, including concentration of measure, log-Sobolev inequalities, convex geometry, MCMC algorithms, Laplace approximations, and machine learning. PMID:27134693
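For reference, the central definitions, stated here in the one-dimensional continuous form (these are the standard formulations in the literature; the review itself should be consulted for its exact conventions):

```latex
% f is log-concave if \log f is concave, i.e.
f\bigl(\lambda x + (1-\lambda) y\bigr) \;\ge\; f(x)^{\lambda}\, f(y)^{1-\lambda},
\qquad x, y \in \mathbb{R},\ \lambda \in [0,1].

% f is strongly log-concave (with parameter \sigma^2) if it factors as a
% log-concave perturbation of a Gaussian density \phi_{\sigma}:
f(x) \;=\; g(x)\,\phi_{\sigma}(x), \qquad g \ \text{log-concave}.
```

Preservation under convolution means that if $f$ and $h$ are both log-concave (respectively strongly log-concave), then so is $f * h$; the first statement is the classical Prékopa result, and the review derives both from Efron's monotonicity theorem.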

Uncoordinated checkpointing for message-passing systems allows maximum process autonomy and general nondeterministic execution, but suffers from potential domino effect and the large space overhead for maintaining checkpoints and message logs. Traditionally, it has been assumed that only obsolete checkpoints and message logs before the global recovery line can be garbage-collected. Recently, an approach to identifying all garbage checkpoints based on recovery line transformation and decomposition has been developed. We show in this paper that the same approach can be applied to the problem of identifying all garbage message logs for systems requiring message logging to record in-transit messages. Communication trace-driven simulation for several parallel programs is used to evaluate the proposed algorithm.

A log is described for construction of a log building which comprises: an elongated peeled log of substantially uniform diameter along its length with parallel end faces, a bottom surface of the log having a concave surface configuration centered on a diametrical line of the log, a rounded top surface directly opposite from the concave bottom surface which mates with a concave surface of another log when placed upon the rounded top surface, a vertically extending longitudinal slot in the top surface of the log that extends the length of the log, a vertically extending longitudinal slot along at least one side of the log with the slot extending vertically substantially parallel with the diametrical line with the slot being formed outwardly of the concave surface, the log including at least one butt end, the butt end including an end slot along the diametrical line which extends from a top of the log down through the butt end to the concave surface; and the butt includes at least one short, longitudinally extending arcuate groove near an outer surface of the log which extends from a line juxtaposed the end slot down to at least one longitudinal slot in the log.

A well logging system includes a logging tool adapted to be passed through a borehole traversing an earth formation. The logging tool contains a sensor sensing a condition of the earth formation and providing electrical pulses corresponding in number and peak amplitude to the sensed condition. A first electrical pulse from the sensor occurring during each predetermined time period of a plurality of predetermined time periods, is stretched and then converted to parallel digital signals. A register receives the parallel digital signals and provides a serial digital signal in response to the shift pulses. A network provides an electrical synchronization pulse each time period prior to the occurrence of the shift pulses. A light emitting diode converts the synchronization pulses and the serial digital signals to corresponding light pulses. A cable including a fiber optic conductor transmits the light pulses uphole to the surface. Surface electronics includes a light-to-electrical converter for providing corresponding electrical pulses in accordance with the light pulses, so that the light-to-electrical converter provides a synchronization pulse followed by a serial digital signal each time period. Another circuit provides a set of shift pulses in response to the synchronizing pulse from the light-to-electrical converter, and an output circuit provides parallel output digital signals corresponding to the sensed condition in accordance with the shift pulses and the serial digital signals from the light-to-electrical converter.

12. Upstream view showing the lower log pond log chute in the main channel of the Hudson River. The log chute in the dam can be seen in the background. Facing southwest. - Glens Falls Dam, 100' to 450' West of U.S. Route 9 Bridge Spanning Hudson River, Glens Falls, Warren County, NY

SedLog is a free multi-platform software package for creating graphic sediment logs providing an intuitive graphical user interface. The graphic sediment logs generated by SedLog can be exported as PDF, Scalable Vector Graphics (SVG), or JPEG for use by other drawing applications or for publications. Log data can be imported and exported in Comma Separated Values (CSV) format. The logs can also be printed to any paper size the user wants. Zoom In, Zoom Out, Fit page, Fit Height and Fit Width facilities are also provided to enable the user to customise the workspace size.

A method is described for conducting a gravimetry survey of an earth formation, comprising the steps of: (a) continuously traversing the earth formation with a gravity logging tool having a column of fluid within the tool, (b) measuring a first pressure difference along a first interval within the column of fluid, (c) measuring a second pressure difference along a second interval within the column of fluid, (d) differencing the first and second pressure differences to determine the gravity gradient along the earth formation between the first and second intervals.
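The differencing step can be illustrated with the hydrostatic relation: over a vertical interval of a fluid column, dP = rho * g * h, so g = dP / (rho * h), and differencing g between the two intervals yields the gravity gradient. The fluid density, interval lengths, and pressure readings below are my own illustrative numbers:

```python
RHO = 1000.0  # kg/m^3, assumed water in the tool's fluid column

def gravity_from_dp(delta_p_pa, interval_m, rho=RHO):
    """Local gravity inferred from a pressure difference over an interval."""
    return delta_p_pa / (rho * interval_m)

# First and second intervals, each 10 m long (hypothetical readings):
g1 = gravity_from_dp(98066.0, 10.0)
g2 = gravity_from_dp(98061.0, 10.0)
print(round(g1 - g2, 7))  # gravity difference between the intervals, m/s^2
```

Because only the difference of the two pressure differences enters, systematic offsets common to both measurements cancel, which is the appeal of a differential survey.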

... is codifying the control numbers that have been issued by the Office of Management and Budget (OMB... numbers assigned by the Office of Management and Budget (OMB) pursuant to the Paperwork Reduction Act of... CFR 1022.43. 0145 640; 12 CFR 1022.70. 0150 318. 0156 12 CFR part 1014. 0157 12 CFR part 1015. ]...

Log files are a necessary record of events on any system. However, as systems scale, so does the volume of data captured. To complicate matters, this data can be distributed across all nodes within the system. This creates challenges in ways to obtain these files as well as archiving them in a consistent manner. It has become commonplace to develop a custom written utility for each system that is tailored specifically to that system. For computer centers that contain multiple systems, each system would have their own respective utility for gathering and archiving log files. Each time a new log file is produced, a modification to the utility is necessary. With each modification, risks of errors could be introduced as well as spending time to introduce that change. This is precisely the purpose of logjam. Once installed, the code only requires modification when new features are required. A configuration file is used to identify each log file as well as where to harvest it and how to archive it. Adding a new log file is as simple as defining it in a configuration file and testing can be performed in the production environment.

Results from various surveys are reviewed as regards X-ray source counts at high galactic latitudes and the luminosity functions determined for extragalactic sources. Constraints on the associated log N-log S relation provided by the extragalactic X-ray background are emphasized in terms of its spatial fluctuations and spectrum as well as absolute flux level. The large number of sources required for this background suggests that there is not a sharp boundary in the redshift distribution of visible matter.
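The benchmark against which such counts are compared is standard: for a uniform, non-evolving population of sources of fixed luminosity in Euclidean space, the number brighter than flux $S$ follows

```latex
N(>S) \;\propto\; S^{-3/2},
\qquad\text{equivalently}\qquad
\log N(>S) \;=\; -\tfrac{3}{2}\,\log S \;+\; \mathrm{const},
```

since the distance out to which a source exceeds flux $S$ scales as $S^{-1/2}$ and the enclosed volume as the cube of that distance. Departures of the observed log N-log S relation from the $-3/2$ slope are what constrain evolution in the source population and its contribution to the background.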

The coal log pipeline (CLP) is an innovative means for long-distance transportation of coal. In the CLP concept, coal is pressed into the form of cylinders--coal logs--that are propelled by water flowing through underground pipe. A coal log pipeline has many advantages when compared to coal transport by unit train, slurry pipeline and long-distance trucking: low cost, low energy consumption, low water consumption, simple dewatering at pipeline exit, safe, and environmentally friendly. The coal logs travel butted together, as trains. Between the coal log "trains," some space is allowed for valve switching. The optimum diameter of a coal log is approximately 90 to 95% the inside diameter of the pipe. The coal-to-water ratio is about 4 to 1. A 200 mm diameter CLP can transport about 2 million tonnes of coal per year. The coal logs at their destination come out of the pipeline onto a moving conveyer which transports the logs to a crusher or stock pile. Coal logs are crushed to match the size of existing fuel. The water effluent is treated and reused at the power plant; there is no need for its discharge. Coal logs can be manufactured with and without the use of binder. By using less than 2 percent emulsified asphalt as binder, no heat is required to compact coal logs. Binderless coal logs can be compacted at less than 90 °C. Compaction pressures, for coal logs made with or without binder, are about 70 MPa. The coal particle size distribution and moisture content must be controlled. The economics of coal log pipeline system have been studied. Results indicate that a new coal log pipeline is cost-competitive with existing railroads for distances greater than 80 km, approximately. CLP is much more economical than coal slurry pipeline of the same diameter. This paper describes the current R&D and commercialization plan for CLP. 4 refs.

The requirements for and kinetics of the activation of the nickel-deficient (apo) CO dehydrogenase from Rhodospirillum rubrum by exogenous nickel have been investigated. The activation is strictly dependent upon the presence of a low-potential one-electron reductant. Sodium dithionite and reduced methylviologen are suitable reductants, whereas reduced indigo carmine and the two-electron reductants sodium borohydride, NADH, and dithiothreitol are ineffective in stimulating activation. The midpoint potential for activation was observed at approximately -475 mV. The ability of a reductant to stimulate activation is correlated with the reduced state of the enzyme Fe4-S4 centers. The activation follows apparent first-order kinetics in a saturable fashion, yielding a rate constant of 0.157 min^-1 at a saturating concentration of nickel. The initial rate at which the enzyme is activated by NiCl2 is also a saturable process, yielding a dissociation constant (K_D) of 755 µM for the initial association of nickel and enzyme. Cadmium(II), zinc(II), cobalt(II), and iron(II) are competitive inhibitors of nickel activation with inhibition constants of 2.44, 4.16, 175, and 349 µM, respectively. Manganese(II), calcium(II), and magnesium(II) exhibit no inhibition of activation.
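The two saturable behaviors reported above can be combined into a simple numerical sketch: the apparent activation rate constant follows hyperbolic binding with K_D = 755 µM, and activation itself is first-order with k = 0.157 min^-1 at saturating nickel. Composing them this way is my illustration, not the authors' fitted model:

```python
import math

K_MAX = 0.157   # min^-1, first-order rate constant at saturating Ni(II)
K_D = 755.0     # uM, dissociation constant for the initial Ni-enzyme complex

def rate_constant(ni_um):
    """Apparent activation rate constant at a given Ni(II) concentration."""
    return K_MAX * ni_um / (K_D + ni_um)

def fraction_active(ni_um, t_min):
    """Fraction of apoenzyme activated after t minutes (first-order)."""
    return 1.0 - math.exp(-rate_constant(ni_um) * t_min)

# At [Ni] = K_D the apparent rate constant is half-maximal:
print(round(fraction_active(755.0, 10.0), 3))
```

Competitive inhibition by Cd(II), Zn(II), etc. would enter by replacing K_D with K_D * (1 + [I]/K_I) in the same expression.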

... electronically controlled engines broadcast their speed and output shaft torque. (g) A remanufacturing system certified for locomotive engines under 40 CFR part 1033 may be deemed to also meet the requirements of this.... See paragraph (g) of this section for special provisions related to remanufacturing systems...

... under this subpart only if they have NOX emissions equivalent to or less than baseline NOX levels and PM... certified for locomotive engines under 40 CFR part 1033 may be deemed to also meet the requirements of this... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Emission standards and...

New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in a formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil-bearing formation. Repetitively frequency-sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which results in the ability in general to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores.
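The "approx. 2 kHz" excitation frequency quoted above is consistent with the proton Larmor frequency in the Earth's magnetic field. A quick check (the 50 μT field strength is an assumed typical value, not from the abstract):

```python
import math

GAMMA_PROTON = 2.675221e8  # proton gyromagnetic ratio, rad s^-1 T^-1

def larmor_frequency_hz(b_tesla):
    """Larmor precession frequency f = gamma * B / (2*pi)."""
    return GAMMA_PROTON * b_tesla / (2.0 * math.pi)

f_earth = larmor_frequency_hz(50e-6)  # roughly 2.1 kHz in a 50 uT field
```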

This paper describes the analysis on integration requirements, CO2 compressor in particular, for integration of Carbon Dioxide Removal Assembly (CDRA) and CO2 Reduction Assembly (CRA) as a part of the Node 3 project previously conducted at JSC/NASA. A system analysis on the volume and operation pressure range of the CO2 accumulator was conducted. The hardware and operational configurations of the CO2 compressor were developed. The performance and interface requirements of the compressor were specified. An existing Four-Bed Molecular Sieve CO2 removal computer model was modified into a CDRA model and used in analyzing the requirements of the CDRA CO2 compressor. This CDRA model was also used in analyzing CDRA operation parameters that dictate CO2 pump sizing. Strategy for the pump activation was also analyzed.

Background Ribonucleotide reduction is the only de novo pathway for synthesis of deoxyribonucleotides, the building blocks of DNA. The reaction is catalysed by ribonucleotide reductases (RNRs), an ancient enzyme family comprising three classes. Each class has distinct operational constraints, and all three are broadly distributed across organisms from all three domains, though few class I RNRs have been identified in archaeal genomes, and classes II and III likewise appear rare across eukaryotes. In this study, we examine whether this distribution is best explained by presence of all three classes in the Last Universal Common Ancestor (LUCA), or by horizontal gene transfer (HGT) of RNR genes. We also examine to what extent environmental factors may have impacted the distribution of RNR classes. Results Our phylogenies show that the Last Eukaryotic Common Ancestor (LECA) possessed a class I RNR, but that the eukaryotic class I enzymes are not directly descended from class I RNRs in Archaea. Instead, our results indicate that archaeal class I RNR genes have been independently transferred from bacteria on two occasions. While LECA possessed a class I RNR, our trees indicate that this is ultimately bacterial in origin. We also find convincing evidence that eukaryotic class I RNR has been transferred to the Bacteroidetes, providing a stunning example of HGT from eukaryotes back to Bacteria. Based on our phylogenies and available genetic and genomic evidence, class II and III RNRs in eukaryotes also appear to have been transferred from Bacteria, with subsequent within-domain transfer between distantly related eukaryotes. Under the three-domains hypothesis the RNR present in the last common ancestor of Archaea and eukaryotes appears, through a process of elimination, to have been a dimeric class II RNR, though limited sampling of eukaryotes precludes a firm conclusion as the data may be equally well accounted for by HGT. Conclusions Horizontal gene transfer has clearly played an

Log data are of prime importance in acquiring petrophysical data from hydrocarbon reservoirs. Reliable log analysis in a hydrocarbon reservoir requires a complete set of logs. For many reasons, such as incomplete logging in old wells, destruction of logs due to inappropriate data storage and measurement errors due to problems with logging apparatus or hole conditions, log suites are either incomplete or unreliable. In this study, fuzzy logic and artificial neural networks were used as intelligent tools to synthesize petrophysical logs including neutron, density, sonic and deep resistivity. The petrophysical data from two wells were used for constructing intelligent models in the Fahlian limestone reservoir, Southern Iran. A third well from the field was used to evaluate the reliability of the models. The results showed that fuzzy logic and artificial neural networks were successful in synthesizing wireline logs. The combination of the results obtained from fuzzy logic and neural networks in a simple averaging committee machine (CM) showed a significant improvement in the accuracy of the estimations. This committee machine performed better than fuzzy logic or the neural network model in the problem of estimating petrophysical properties from well logs.
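The simple averaging committee machine described above takes the unweighted mean of each model's estimate at every depth sample. A minimal sketch (the log values below are invented for illustration):

```python
import numpy as np

def committee_average(predictions):
    """Simple averaging committee machine: combine estimates of the same
    log from several models (e.g. a fuzzy-logic model and a neural
    network) by taking their unweighted mean at each depth sample."""
    return np.mean(np.asarray(predictions, dtype=float), axis=0)

# Hypothetical sonic-log estimates (us/ft) at four depth samples:
fuzzy_est = [88.0, 90.5, 95.2, 99.0]
ann_est = [86.0, 91.5, 94.8, 101.0]
combined = committee_average([fuzzy_est, ann_est])
```

Weighted committees are also common, but the study reports that even this plain average outperformed either individual model.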

The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US-Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3% of all concessions supervised by authorities were suspected of major violations. Of the 609 total concessions, nearly 30% have been cancelled for violations, and we expect this percentage to increase as investigations continue. Moreover, the nature of the violations indicates that the permits associated with legal concessions are used to harvest trees in unauthorized areas, thus threatening all forested areas. Many of the violations pertain to the illegal extraction of CITES-listed timber species outside authorized areas. These findings highlight the need for additional reforms. PMID:24743552

Several species belonging to the genus Entamoeba can colonize the mouth or the human gut; however, only Entamoeba histolytica is pathogenic to the host, causing the disease amoebiasis. This illness is responsible for one hundred thousand human deaths per year worldwide, affecting mainly underdeveloped countries. Throughout its entire life cycle and invasion of human tissues, the parasite is constantly subjected to stress conditions. Under in vitro culture, this microaerophilic parasite can tolerate up to 5 % oxygen concentrations; however, during tissue invasion the parasite has to cope with the higher oxygen content found in well-perfused tissues (4-14 %) and with reactive oxygen and nitrogen species derived from both host and parasite. In this work, the role of the amoebic oxygen reduction pathway (ORP) and heat shock response (HSP) are analyzed in relation to E. histolytica pathogenicity. The data suggest that in contrast with non-pathogenic E. dispar, the higher level of ORP and HSPs displayed by E. histolytica enables its survival in tissues by diminishing and detoxifying intracellular oxidants and repairing damaged proteins to allow metabolic fluxes, replication and immune evasion. PMID:26589893

Borehole geophysical logging for site characterization in the volcanic rocks at the proposed nuclear waste repository at Yucca Mountain, Nevada, requires data collection under rather unusual conditions. Logging tools must operate in rugose, dry holes above the water table in the unsaturated zone. Not all logging tools will operate in this environment; therefore, careful consideration must be given to selection and calibration. A sample suite of logs is presented that demonstrates correlation of geological formations from borehole to borehole, the definition of zones of altered mineralogy, and quantitative estimates of rock properties. We show the results of an exploratory calculation of porosity and water saturation based upon density and epithermal neutron logs. Comparison of the results with a few core samples is encouraging, particularly because the logs can provide continuous data in boreholes where core samples are not available.
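Porosity from a density log is conventionally computed by interpolating the bulk density between matrix and fluid end members. A sketch with illustrative default densities (the matrix and fluid values are assumptions, not values from the study):

```python
def density_porosity(rho_bulk, rho_matrix=2.65, rho_fluid=1.0):
    """Density-log porosity phi = (rho_ma - rho_b) / (rho_ma - rho_f),
    densities in g/cm^3. Defaults assume a quartz sandstone matrix and
    fresh-water pore fluid."""
    return (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid)
```

For example, a measured bulk density of 2.32 g/cm^3 implies about 20% porosity under these assumptions.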

Californium-252 is an intense neutron emitter that has only recently become available for experimental well logging. The purpose of this research is to investigate the application of well logging to groundwater hydrology; however, most of the techniques and purposes are quite similar to applications in the petroleum industry.

This paper describes a method of generating pseudovelocity logs using measurements of electrical resistivity. A theoretical relation between electrical resistivity and transit time, which is applicable to a wide range of lithologies, has been developed. The application of this relation using a method which defines lithoresistivity zones as lithological intervals related to the same formation and showing small resistivity variations, has been tested in the Reconcavo sedimentary basin in Bahia, Brazil. A comparison of derived pseudovelocity logs with actual sonic logs for five wells shows the validity of the present approach.
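The paper's own resistivity-transit-time relation is not reproduced in the abstract, but the general idea can be illustrated with Faust's classical empirical form, in which velocity grows with the depth-resistivity product:

```python
def faust_velocity_ftps(depth_ft, resistivity_ohmm, c=1948.0):
    """Faust-type empirical velocity, V = c * (Z * R)**(1/6) in ft/s,
    with depth Z in ft and resistivity R in ohm-m. Shown only to
    illustrate the pseudovelocity idea; the paper develops its own
    relation calibrated per lithoresistivity zone."""
    return c * (depth_ft * resistivity_ohmm) ** (1.0 / 6.0)

def pseudo_transit_time_uspf(depth_ft, resistivity_ohmm):
    """Pseudo-sonic transit time in microseconds per foot, the quantity
    plotted on a pseudovelocity log."""
    return 1.0e6 / faust_velocity_ftps(depth_ft, resistivity_ohmm)
```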

We combined measurements of tree growth and carbon dioxide exchange to investigate the effects of selective logging on the Aboveground Live Biomass (AGLB) of a tropical rain forest in the Amazon. Most of the measurements began at least 10 months before logging and continued at least 36 months after logging. The logging removed ˜15% of the trees with Diameter at Breast Height (DBH) greater than 35 cm, which resulted in an instantaneous 10% reduction in AGLB. Both wood production and mortality increased following logging, while Gross Primary Production (GPP) was unchanged. The ratio of wood production to GPP (the wood Carbon Use Efficiency or wood CUE) more than doubled following logging. Small trees (10 cm < DBH < 35 cm) accounted for most of the enhanced wood production. Medium trees (35 cm < DBH < 55 cm) that were within 30 m of canopy gaps created by the logging also showed increased growth. The patterns of enhanced growth are most consistent with logging-induced increases in light availability. The AGLB continued to decline over the study, as mortality outpaced wood production. Wood CUE and mortality remained elevated throughout the 3 years of postlogging measurements. The future trajectory of AGLB and the forest's carbon balance are uncertain, and will depend on how long it takes for heterotrophic respiration, mortality, and CUE to return to prelogging levels.
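Wood carbon use efficiency as used above is simply the ratio of wood production to GPP, so a doubling of wood production at constant GPP doubles the CUE. The numbers below are invented for illustration, not the study's measurements:

```python
def wood_cue(wood_production, gpp):
    """Wood carbon use efficiency: fraction of gross primary production
    allocated to wood growth (both in the same units, e.g. MgC/ha/yr)."""
    return wood_production / gpp

# Hypothetical values: GPP unchanged by logging, wood production doubled.
cue_before = wood_cue(3.0, 30.0)
cue_after = wood_cue(6.0, 30.0)
```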

We assess climate impacts of global warming using ongoing observations and paleoclimate data. We use Earth's measured energy imbalance, paleoclimate data, and simple representations of the global carbon cycle and temperature to define emission reductions needed to stabilize climate and avoid potentially disastrous impacts on today's young people, future generations, and nature. A cumulative industrial-era limit of ∼500 GtC fossil fuel emissions and 100 GtC storage in the biosphere and soil would keep climate close to the Holocene range to which humanity and other species are adapted. Cumulative emissions of ∼1000 GtC, sometimes associated with 2°C global warming, would spur "slow" feedbacks and eventual warming of 3-4°C with disastrous consequences. Rapid emissions reduction is required to restore Earth's energy balance and avoid ocean heat uptake that would practically guarantee irreversible effects. Continuation of high fossil fuel emissions, given current knowledge of the consequences, would be an act of extraordinary witting intergenerational injustice. Responsible policymaking requires a rising price on carbon emissions that would preclude emissions from most remaining coal and unconventional fossil fuels and phase down emissions from conventional fossil fuels. PMID:24312568

Treatment of rats with reserpine, an inhibitor of the vesicular monoamine transporter (VMAT), depletes norepinephrine (NE) and regulates NE transporter (NET) expression. The present study examined the molecular mechanisms involved in regulation of the NET by reserpine using cultured cells. Exposure of rat PC12 cells to reserpine for a period as short as 5 min decreased [3H]NE uptake capacity, an effect characterized by a robust decrease in the Vmax of the transport of [3H]NE. As expected, reserpine did not displace the binding of [3H]nisoxetine from the NET in membrane homogenates. The potency of reserpine for reducing [3H]NE uptake was dramatically lower in SK-N-SH cells that have reduced storage capacity for catecholamines. Reserpine had no effect on [3H]NE uptake in HEK-293 cells transfected with the rat NET (293-hNET), cells that lack catecholamine storage vesicles. NET regulation by reserpine was independent of trafficking of the NET from the cell surface. Pre-exposure of cells to inhibitors of several intracellular signaling cascades known to regulate the NET, including Ca2+/Ca2+-calmodulin dependent kinase and protein kinases A, C and G, did not affect the ability of reserpine to reduce [3H]NE uptake. Treatment of PC12 cells with the catecholamine depleting agent, α-methyl-p-tyrosine, increased [3H]NE uptake and eliminated the inhibitory effects of reserpine on [3H]NE uptake. Reserpine non-competitively inhibits NET activity through a Ca2+-independent process that requires catecholamine storage vesicles, revealing a novel pharmacological method to modify NET function. Further characterization of the molecular nature of reserpine's action could lead to the development of alternative therapeutic strategies for treating disorders known to be benefitted by treatment with traditional competitive NET inhibitors. PMID:20176067

Supplemental measurements from induced nuclear spectrometry tools are examined to demonstrate what additional information they provide about the well and reservoir conditions. Logs in shut-in wells from Indonesia provide examples of oxygen activation measurements showing cross-flow from one reservoir to another via open perforations. Leaking squeezed perforations were also observed. An example from Alaska shows radioactive scale build-up in the casing which spectral analysis identifies as a mixture of uranium and thorium salts. Another log, where the casing fluid was replaced with crude oil, demonstrates a technique for identifying cement channels. Logs from Nigeria comparing oil saturation estimates before and after a squeeze operation illustrate the effect of casing fluid flushing of the formation through open perforations. Understanding the diagnostic character of these curves leads to higher confidence in the overall log interpretation process.

The purpose of this article is to offer a reliable and easily formulated alternative to random technique selection or control panel roulette when producing diagnostic radiographs. This system requires only minutes to complete and will reduce the radiation dose to patients, the radiographic film wasted, and the time lost repeating examinations. PMID:523624

Fibrous insulation and refractory concrete are used for logs as well as fireproof walls, incinerator bricks, planters, and roof shingles. Insulation is lighter and more shock resistant than fireclay. Lightweight slag bonded with refractory concrete serves as aggregate.

We investigate whether non-linear effects on the large-scale power spectrum of dark matter, namely the increase in small-scale power and the smearing of baryon acoustic oscillations, can be decreased by a log-transformation or emulated by an exponential transformation of the linear spectrum. To that end we present a formalism to convert the power spectrum of a log-normal field to the power spectrum of the logarithmic Gaussian field and vice versa. All ingredients of our derivation can already be found in various publications in cosmology and other fields. We follow a more pedagogical approach providing a detailed derivation, application examples, and a discussion of implementation subtleties in one text. We use the formalism to show that the non-linear increase in small-scale power in the matter power spectrum is significantly smaller for the log-transformed spectrum, which fits the linear spectrum (with less than 20% error) for redshifts down to 1 and k ≤ 1.0 h Mpc⁻¹. For lower redshifts the fit to the linear spectrum is not as good, but the reduction of non-linear effects is still significant. Similarly, we show that applying the linear growth factor to the logarithmic density leads to an automatic increase in small-scale power for low redshifts fitting to third-order perturbation spectra and Cosmic Emulator spectra with an error of less than 20%. Smearing of baryon acoustic oscillations is at least three times weaker, but still present.
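The core of the log-normal/Gaussian conversion operates on two-point correlation functions: xi_G(r) = ln(1 + xi_LN(r)). A power spectrum is converted by Fourier-transforming to the correlation function, applying this map point-wise, and transforming back. A minimal sketch of the map itself (the mean and variance bookkeeping of the full formalism is omitted):

```python
import numpy as np

def xi_gauss_from_lognormal(xi_ln):
    """Correlation function of the Gaussian (log) field from that of the
    log-normal field: xi_G = ln(1 + xi_LN)."""
    return np.log1p(np.asarray(xi_ln, dtype=float))

def xi_lognormal_from_gauss(xi_g):
    """Inverse map: xi_LN = exp(xi_G) - 1."""
    return np.expm1(np.asarray(xi_g, dtype=float))
```

For |xi| << 1 the two correlation functions coincide, which is why the transformation only matters on small, non-linear scales.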

This project consists of the following nine tasks: Machine design for coal log fabrication; Very rapid compaction of coal logs; Rapid compaction of coal logs; Fast-track experiments on coal log compaction; Coal log fabrication using hydrophobic binders; Drag reduction in large diameter hydraulic capsule pipeline; Automatic control of coal log pipeline system; Hydraulics of CLP (Coal Log Pipeline); and Coal heating system research. The purpose of the task, the work accomplished during this report period, and work proposed for the next quarter are described for each task.

Calculating well logs is a time-consuming process. This template uses input parameters consisting of well name, location county, state, formation name, starting depth, repeat interval, resistivity of shale, and irreducible bulk volume water, which provide heading information for printouts. The required information from basic well logs is porosity, conductivity (optional), formation resistivity, resistivity of the formation water for the zone being calculated, resistivity of the mud filtrate, the porosity cutoff for pay in the zone being calculated, and the saltwater saturation cutoff for the pay zone. These parameters are used to calculate apparent water resistivity, saltwater saturation, bulk volume water, the ratio of apparent water resistivity to input water resistivity, irreducible saltwater saturation, resistivity volume of shale, permeability, and a derived porosity value. A printout of the results is available through the Lotus print function. Using this template allows maximum control of the input parameters and reduces hand-calculation time.
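Several of the quantities listed above follow from Archie-form relations. A sketch using conventional default exponents (a = 1, m = n = 2 are assumed illustrative values; a real template would take these per zone):

```python
def apparent_water_resistivity(rt, porosity, a=1.0, m=2.0):
    """Rwa = Rt * phi**m / a: the water resistivity the zone would
    imply if it were fully water saturated."""
    return rt * porosity ** m / a

def water_saturation(rt, porosity, rw, a=1.0, m=2.0, n=2.0):
    """Archie water saturation Sw = ((a * Rw) / (phi**m * Rt))**(1/n)."""
    return ((a * rw) / (porosity ** m * rt)) ** (1.0 / n)

def bulk_volume_water(sw, porosity):
    """BVW = Sw * phi; roughly constant across a zone at irreducible
    water saturation."""
    return sw * porosity
```

For example, Rt = 10 ohm-m, phi = 0.2 and Rw = 0.04 ohm-m give Sw of about 32% under these assumptions.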

The operating instructions for the software package MAIL LOG, developed for the Scout Project Automatic Data System (SPADS), are provided. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG program has the following four modes of operation: (1) INPUT, putting new records into the data base; (2) REVISE, changing or modifying existing records in the data base; (3) SEARCH, finding particular records in the data base; and (4) ARCHIVE, storing away existing records in the data base. The output includes special printouts of records in the data base and results from the INPUT and SEARCH modes. The MAIL LOG data base consists of three main subfiles: incoming and outgoing mail correspondence; Design Information Releases and Reports; and Drawings and Engineering Orders.

... engine's fuel must meet the following requirements: (1) The diagnostic system must monitor reductant...) The onboard computer log must record in nonvolatile computer memory all incidents of engine operation... such operation in nonvolatile computer memory. You are not required to monitor NOX...

A method is described for generating a log of the formation phase shift, resistivity and spontaneous potential of an earth formation from data obtained from the earth formation with a multi-electrode induced polarization logging tool. The method comprises obtaining data samples from the formation at measurement points equally spaced in time of the magnitude and phase of the induced voltage and the magnitude and phase of the current supplied by a circuit through a reference resistance R0 to a survey current electrode associated with the tool.

A student log book for outdoor education was developed to aid Oakland County (Michigan) teachers and supervisors of outdoor education in preparing student campers for their role and responsibilities in the total program. A sample letter to sixth graders explains the purpose of the booklet. General camp rules (10) are presented, followed by 6 woods…

In response to high numbers of preventable fatal accidents in the logging industry, the Occupational Safety and Health Administration (OSHA) developed a week-long logger safety training program that includes hands-on learning of safety techniques in the woods. Reaching small operators has been challenging; outreach initiatives in Maine, North…

A dual spectra well logging system includes a well logging tool which is adapted to pass through a bore hole in an earth formation. The well logging tool includes at least two sensors which sense at least one condition of the earth formation and provide corresponding pulse signals. A circuit connected to the sensors provides a combined pulse signal wherein the pulses of the pulse signal from one sensor have one polarity and the pulses of the pulse signal from the other sensor have the opposite polarity. A circuit applies the combined pulse signal to a well logging cable which conducts the combined pulse signal to the surface of the earth formation. Surface apparatus includes a network connected to the cable which provides control signals in accordance with the polarity of the pulses in the combined pulse signal. A network connected to the cable inverts the combined pulse signal and provides a combined pulse signal and an inverted combined pulse signal. A first switching network receiving the combined pulse signal passes the pulses derived from the pulses of the one polarity in accordance with the control signals to provide a first pulse signal, while a second switching network receiving the inverted combined pulse signal passes the pulses derived from the pulses of the opposite polarity in accordance with the control signals to provide a second pulse signal. An output network processes the two pulse signals to provide an indication of the earth's condition in accordance with the processed pulse signals.

Stabilizing or stimulating oil production in old oil fields requires density logging in cased holes where open-hole logging data are either missing or of bad quality. However, measured values from cased-hole density logging are more severely influenced by factors such as fluid, casing, cement sheath and the outer diameter of the open-hole well compared with those from open-hole logging. To correctly apply the cased-hole formation density logging data, one must eliminate these influences on the measured values and study the characteristics of how the cased-hole density logging instrument responds to these factors. In this paper, a Monte Carlo numerical simulation technique was used to calculate the responses of the far detector of a cased-hole density logging instrument to in-hole fluid, casing wall thickness, cement sheath density and the formation and thus to obtain influence rules and response coefficients. The obtained response of the detector is a function of in-hole liquid, casing wall thickness, the casing's outer diameter, cement sheath density, open-hole well diameter and formation density. The ratio of the counting rate of the detector in the calibration well to that in the measurement well was used to get a fairly simple detector response equation and the coefficients in the equation are easy to acquire. These provide a new way of calculating cased-hole density through forward modelling methods.

The evaluation of new energetic nitroaromatic compounds (NACs) for use in green munitions formulations requires models that can predict their environmental fate. Previously invoked linear free energy relationships (LFER) relating the log of the rate constant for this reaction (log(k)) and one-electron reduction potentials for the NAC (E1NAC) normalized to 0.059 V have been re-evaluated and compared to a new analysis using a (nonlinear) free-energy relationship (FER) based on the Marcus theory of outer-sphere electron transfer. For most reductants, the results are inconsistent with simple rate limitation by an initial, outer-sphere electron transfer, suggesting that the linear correlation between log(k) and E1NAC is best regarded as an empirical model. This correlation was used to calibrate a new quantitative structure-activity relationship (QSAR) using previously reported values of log(k) for nonenergetic NAC reduction by Fe(II) porphyrin and newly reported values of E1NAC determined using density functional theory at the M06-2X/6-311++G(2d,2p) level with the COSMO solvation model. The QSAR was then validated for energetic NACs using newly measured kinetic data for 2,4,6-trinitrotoluene (TNT), 2,4-dinitrotoluene (2,4-DNT), and 2,4-dinitroanisole (DNAN). The data show close agreement with the QSAR, supporting its applicability to other energetic NACs. PMID:25671710
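The empirical LFER above is a straight-line fit of log(k) against E1NAC normalized to 0.059 V. A sketch of the calibration step with invented data (the real calibration uses measured rate constants and DFT-computed E1NAC values, not the numbers below):

```python
import numpy as np

def fit_lfer(e1_volts, log_k):
    """Least-squares fit of the empirical LFER
    log(k) = slope * (E1 / 0.059 V) + intercept.
    Returns (slope, intercept)."""
    x = np.asarray(e1_volts, dtype=float) / 0.059
    slope, intercept = np.polyfit(x, np.asarray(log_k, dtype=float), 1)
    return slope, intercept

# Hypothetical calibration set: three NACs with known E1 (V) and log k.
slope, intercept = fit_lfer([-0.30, -0.40, -0.50], [1.0, 0.0, -1.0])
```

Once calibrated, predicting log(k) for a new energetic NAC requires only its computed one-electron reduction potential.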

Cable logging practices were observed on a production logging operation. Using a marked leave tree stand, damage at each phase of the operation was quantified. Log stability, motion and sweep area were also observed for each turn. These variables were evaluated in relation to the system geometry, terrain and logging practices. The results identify variables which influence log stability, motion and sweep area. Logging damage was closely related to operator log control, both for felling and for yarding. Good control could usually be maintained on slopes of less than 35% but special techniques and equipment were required on slopes of more than 35%. Silvicultural prescription, marking quality, planning and layout also affected the level of logging damage.

During wireline logging operations, tools occasionally become stuck in the borehole and require fishing. A typical fishing job can take anywhere from 1.5 to 4 days. In the Gulf of Mexico, a fishing job can easily cost between $100,000 and $500,000. These costs result from nonproductive time during the fishing trip, associated wiper trip and relogging the well. Logging while fishing (LWF) technology is a patented system capable of retrieving a stuck fish and completing the logging run during the same pipe descent. Completing logging operations using the LWF method saves time and money. The technique also provides well information where data may not otherwise have been obtained. Other benefits include reduced fishing time and an increased level of safety.

A well logging tool adapted for use in a borehole traversing an earth formation includes at least one sensor sensing at least one characteristic of the earth formation. Another sensor senses the ambient temperature and provides a corresponding temperature signal. An output circuit provides a temperature compensated output signal corresponding to the sensed characteristic of the earth formation in accordance with the temperature signal and the characteristic signal.

The presence of geothermal aquifers can be detected while drilling in geothermal formations by maintaining a chemical log of the ratio of the concentrations of calcium to carbonate and bicarbonate ions in the return drilling fluid. A continuous increase in the ratio of the concentrations of calcium to carbonate and bicarbonate ions is indicative of the existence of a warm or hot geothermal aquifer at some increased depth.

An early prediction of physicochemical properties is highly desirable during drug discovery to find a viable lead candidate. Although there are several methods available to determine the partition coefficient (log P), distribution coefficient (log D) and ionization constant (pKa), none of them involves simple, fixed, miniaturized protocols for a diverse set of compounds. Therefore, it is necessary to establish simple, uniform and medium-throughput protocols requiring small sample quantities for the determination of these physicochemical properties. Log P and log D were determined by the shake flask method, wherein the compound was partitioned between presaturated n-octanol and a water phase (water/PBS pH 7.4) and the concentration of compound in each phase was determined by HPLC. The pKa determination made use of UV spectrophotometric analysis in a 96-well microtiter plate containing a series of aqueous buffers ranging from pH 1.0 to 13.0. The medium-throughput miniaturized protocols described herein, for determination of log P, log D and pKa, are straightforward to set up and require very small quantities of sample (< 5 mg for all three properties). All established protocols were validated using a diverse set of compounds. PMID:27137915
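
The shake-flask read-out above reduces to a log10 ratio of the HPLC-measured concentrations in the two phases. A minimal sketch of that calculation (function names and concentration values are illustrative, not from the protocol):

```python
import math

def log_p(conc_octanol, conc_water):
    """Partition coefficient: log10 ratio of the compound's concentration
    in the n-octanol phase to that in the water phase."""
    return math.log10(conc_octanol / conc_water)

def log_d(conc_octanol, conc_pbs):
    """Distribution coefficient at pH 7.4: the same ratio, but the aqueous
    phase is PBS pH 7.4, so ionized species are included."""
    return math.log10(conc_octanol / conc_pbs)

# hypothetical HPLC-derived concentrations (mg/mL)
p = log_p(2.5, 0.025)  # 100-fold partitioning into octanol
d = log_d(2.5, 0.25)   # only 10-fold once ionization at pH 7.4 is included
```

A log D below log P, as in this toy example, is what one expects for a compound partly ionized at pH 7.4.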

Repeated gamma logging in cased holes represents a cost-effective means to monitor gamma-emitting contamination in the deep vadose zone over time. Careful calibration and standardization of gamma log results are required to track changes and to compare results over time from different detectors and logging systems. This paper provides a summary description of Hanford facilities currently available for calibration of logging equipment. Ideally, all logging organizations conducting borehole gamma measurements at the Hanford Site will take advantage of these facilities to produce standardized and comparable results. (authors)

The harmful effects of child poverty are well documented. Despite this, progress in poverty reduction in Canada has been slow. A significant gap exists between what is known about eradicating poverty and its implementation. Paediatricians can play an important role in bridging this gap by understanding and advancing child poverty reduction. Establishment of a comprehensive national poverty reduction plan is essential to improving progress. The present review identifies the key components of an effective poverty reduction strategy. These elements include effective poverty screening, promoting healthy child development and readiness to learn, ensuring food and housing security, providing extended health care coverage for the uninsured and using place-based solutions and team-level interventions. Specific economic interventions are also reviewed. Addressing the social determinants of health in these ways is crucial to narrowing disparities in wealth and health so that all children in Canada reach their full potential. PMID:26038640

This study investigated the minimally required feedback elements of a computer-tailored dietary fat reduction intervention to be effective in improving fat intake. In all, 588 healthy Dutch adults were randomly allocated to one of four conditions in a randomized controlled trial: (i) feedback on dietary fat intake [personal feedback (P feedback)],…

Selective logging is one of the most common forms of forest use in the tropics. Although the effects of selective logging on biodiversity have been widely studied, there is little agreement on the relationship between life-history traits and tolerance to logging. In this study, we assessed how species traits and logging practices combine to determine species responses to selective logging, based on over 4000 observations of the responses of nearly 1000 bird species to selective logging across the tropics. Our analysis shows that species traits, such as feeding group and body mass, and logging practices, such as time since logging and logging intensity, interact to influence a species' response to logging. Frugivores and insectivores were most adversely affected by logging and declined further with increasing logging intensity. Nectarivores and granivores responded positively to selective logging for the first two decades, after which their abundances decreased below pre-logging levels. Larger species of omnivores and granivores responded more positively to selective logging than smaller species from either feeding group, whereas this effect of body size was reversed for carnivores, herbivores, frugivores and insectivores. Most importantly, species most negatively impacted by selective logging had not recovered approximately 40 years after logging cessation. We conclude that selective timber harvest has the potential to cause large and long-lasting changes in avian biodiversity. However, our results suggest that the impacts can be mitigated to a certain extent through specific forest management strategies such as lengthening the rotation cycle and implementing reduced impact logging. PMID:25994673

There is a lack of quantitative information on the effectiveness of selective-logging practices in ameliorating the effects of logging on faunal communities. We conducted a large-scale replicated field study in 3 selectively logged moist semideciduous forests in West Africa at varying times after timber extraction to assess post-logging effects on amphibian assemblages. Specifically, we assessed whether the diversity, abundance, and assemblage composition of amphibians changed over time for forest-dependent species and those tolerant of forest disturbance. In 2009, we sampled amphibians in 3 forests (total of 48 study plots, each 2 ha) in southwestern Ghana. In each forest, we established plots in undisturbed forest, recently logged forest, and forest logged 10 and 20 years previously. Logging intensity was constant across sites, with 3 trees/ha removed. Recently logged forests supported substantially more species than unlogged forests. This was due to an influx of disturbance-tolerant species after logging. Simultaneously, Simpson's index decreased, with an increase in the dominance of a few species. As time since logging increased, richness of disturbance-tolerant species decreased until 10 years after logging, when their composition was indistinguishable from unlogged forests. Simpson's index increased with time since logging and was indistinguishable from unlogged forest 20 years after logging. Forest specialists decreased after logging and recovered slowly. However, after 20 years amphibian assemblages had returned to a state indistinguishable from that of undisturbed forest in both abundance and composition. These results demonstrate that even with low-intensity logging (≤3 trees/ha) a minimum 20-year rotation of logging is required for effective conservation of amphibian assemblages in moist semideciduous forests. Furthermore, remnant patches of intact forests retained in the landscape and the presence of permanent brooks may aid in the effective recovery of amphibian assemblages.

The Tucker Wireline unit ran a suite of open hole logs right behind the RMOTC logging contractor for comparison purposes. The tools included Dual Laterolog, Phased Induction, BHC Sonic, and Density-Porosity.

There is a gap in aerial logging techniques that has to be filled. The need for a simple, safe, sizeable system has to be developed before aerial logging will become effective and accepted in the logging industry. This paper presents such a system designed on simple principles with realistic cost and ecological benefits.

A log is a recording of a system's activity, aimed at helping a system administrator trace back an attack, find the causes of a malfunction, and assist generally with troubleshooting. The fact that logs may be the only information an administrator has about an incident makes the logging system a crucial part of an IT infrastructure. In large-scale infrastructures, such as LHCb Online, where quite a few GB of logs are produced daily, it is impossible for a human to review all of these logs. Moreover, a great percentage of them is just "noise". That makes clear that a more automated and sophisticated approach is needed. In this paper, we present a low-cost centralized logging system which allows us to do in-depth analysis of every log.
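
One building block of such in-depth analysis is separating "noise" from lines that deserve an administrator's attention. A minimal sketch of that filtering step (the patterns and log lines are invented for illustration; the LHCb Online system is not described at this level of detail here):

```python
import re
from collections import Counter

# Illustrative patterns for messages considered safe to drop as noise.
NOISE_PATTERNS = [
    re.compile(r"heartbeat"),
    re.compile(r"connection closed cleanly"),
]

def filter_noise(lines):
    """Return the lines worth keeping, plus a per-pattern tally of drops."""
    kept, dropped = [], Counter()
    for line in lines:
        for pat in NOISE_PATTERNS:
            if pat.search(line):
                dropped[pat.pattern] += 1
                break
        else:  # no noise pattern matched: keep the line for analysis
            kept.append(line)
    return kept, dropped

logs = [
    "host1 sshd: Failed password for root from 10.0.0.7",
    "host2 daemon: heartbeat ok",
    "host2 daemon: heartbeat ok",
    "host3 smtp: connection closed cleanly",
]
kept, dropped = filter_noise(logs)  # keeps only the sshd line
```

Keeping a per-pattern tally rather than silently discarding lines lets a sudden change in the volume of the "noise" itself be flagged as well.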

Rivers cover only a small part of the Earth's surface, yet they transfer sediment in globally significant quantities. In mountainous regions, the majority of the total channel length occurs in headwater streams. Those mountain channels are influenced in terms of sediment connectivity by processes on the slopes. For example, in such a sediment routing system, sediment originating from debris flows on the slopes is delivered along sediment pathways to the channel system and can be transported further downstream as solid load. Interruption of instream coarse sediment connectivity is closely related to the existence of channel-blocking barriers, which can also be formed by biota. By storing sediment, large wood (LW) log jams disrupt in-channel sediment connectivity. We present a study design to decipher the short- to long-term (c. 10^-2 to 10^2 years) sediment (dis)connectivity effects of large wood. The study areas are two basins in mountain ranges in Germany and Austria. In Austria the drainage area of the river Fugnitz was chosen, which is located in the National Park Thayatal. The other drainage area, of the river Sieber in Saxony-Anhalt, Germany, is located in the Harz National Park. Since studies on LW and its geomorphological effects in Central European rivers are still rare, the main goals of the project are: • to identify important triggers for LW transport from slopes into the channels • to examine the spatial distribution and characterization of LW in main and slope channels by mapping and dGPS measurements • to determine the effects of LW on channel hydraulic parameters (e.g. slope, width, grain size composition, roughness) by field measurements of channel long profiles and cross sections with dGPS and Wolman particle counts • to quantify the direct effects of LW on discharge and bed load transport by measuring flow velocity with an Ott-Nautilus current meter and to measure bed load up- and downstream of log jams using a portable Helley

A portable, high speed, computer-based data logging system for field testing systems or components located some distance apart employs a plurality of spaced mass spectrometers and is particularly adapted for monitoring the vacuum integrity of a long string of superconducting magnets such as used in high energy particle accelerators. The system provides precise tracking of a gas such as helium through the magnet string when the helium is released into the vacuum by monitoring the spaced mass spectrometers, allowing for control, display and storage of various parameters involved with leak detection and localization. A system user can observe the flow of helium through the magnet string on a real-time basis from the exact moment of opening of the helium input valve. Graph readings can be normalized to compensate for magnet sections that deplete vacuum faster than other sections between tests, permitting repetitive testing of vacuum integrity in reduced time. 18 figs.

The determination of water saturation from electrical resistivity measurements to evaluate the potential of reservoirs is a fundamental tool of the oil industry. Shaly sandstones are difficult to evaluate because clays are conductive and they lower the resistivity of the rock. A review of shaly-sandstone research concerning "volume-of-shale" equations reveals three theoretical categories: (1) laminated clay equations, (2) dispersed clay equations, and (3) equations that assume that the effect of the clays on the conductivity measurement is directly related to water saturation. A new model for predicting the relative amounts of laminated and dispersed shales and accounting for their effects according to their abundance can be used for any sandstone, clean or shaly. Equations representing each of the three theoretical categories and the new equation were tested on cored Wilcox sandstones from two wells. Cores were analyzed to determine the volume and distribution of clays and to correlate porosity with the well logs.

Network forensics involves capturing, recording and analysing network audit trails. A crucial part of network forensics is to gather evidence at the server level, proxy level and from other sources. A web proxy relays URL requests from clients to a server. Analysing web proxy logs can give unobtrusive insights to the browsing behavior of computer users and provide an overview of the Internet usage in an organisation. More importantly, in terms of network forensics, it can aid in detecting anomalous browsing behavior. This paper demonstrates the use of a self-organising map (SOM), a powerful data mining technique, in network forensics. In particular, it focuses on how a SOM can be used to analyse data gathered at the web proxy level.
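
As an illustration of the SOM step, a tiny map can be trained on per-user browsing feature vectors and then used to flag profiles with a large quantization error. Everything below (grid size, features, training schedule) is an invented toy, not the configuration used in the paper:

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small self-organising map on the rows of `data`."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    # grid coordinates, used by the Gaussian neighbourhood function
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)               # learning rate decays to 0
        sigma = sigma0 * (1 - t / epochs) + 1e-3  # neighbourhood radius shrinks
        for x in data[rng.permutation(len(data))]:
            d = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(np.argmin(d), d.shape)  # best-matching unit
            dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=2)
            pull = np.exp(-dist2 / (2 * sigma**2))[..., None]
            weights += lr * pull * (x - weights)  # drag neighbourhood toward x
    return weights

def qerror(weights, x):
    """Quantization error: distance from x to its best-matching unit."""
    d = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(np.argmin(d), d.shape)
    return float(np.linalg.norm(weights[bmu] - x))

# toy per-user features: [requests/hour (scaled), fraction blocked, mean URL length (scaled)]
normal = np.array([[0.20, 0.00, 0.30],
                   [0.25, 0.01, 0.35],
                   [0.22, 0.00, 0.32]])
odd = np.array([0.95, 0.90, 0.90])  # a very different browsing profile
W = train_som(normal)
```

A profile whose quantization error is much larger than that of the training data maps poorly onto the learned map; that mismatch is the anomaly signal a SOM-based forensic analysis exploits.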

Archaeal methane formation from methylamines is initiated by distinct methyltransferases with specificity for monomethylamine, dimethylamine, or trimethylamine. Each methylamine methyltransferase methylates a cognate corrinoid protein, which is subsequently demethylated by a second methyltransferase to form methyl-coenzyme M, the direct methane precursor. Methylation of the corrinoid protein requires reduction of the central cobalt to the highly reducing and nucleophilic Co(I) state. RamA, a 60-kDa monomeric iron-sulfur protein, was isolated from Methanosarcina barkeri and is required for in vitro ATP-dependent reductive activation of methylamine:CoM methyl transfer from all three methylamines. In the absence of the methyltransferases, highly purified RamA was shown to mediate the ATP-dependent reductive activation of the Co(II) corrinoid to the Co(I) state for the monomethylamine corrinoid protein, MtmC. The ramA gene is located near a cluster of genes required for monomethylamine methyltransferase activity, including MtbA, the methylamine-specific CoM methylase, and the pyl operon required for co-translational insertion of pyrrolysine into the active site of methylamine methyltransferases. RamA possesses a C-terminal ferredoxin-like domain capable of binding two tetranuclear iron-sulfur clusters. Multiple ramA homologs were identified in genomes of methanogenic Archaea, often encoded near methylotrophic methyltransferase genes. RamA homologs are also encoded in a diverse selection of bacterial genomes, often located near genes for corrinoid-dependent methyltransferases. These results suggest that RamA mediates reductive activation of corrinoid proteins and that it is the first functional archetype of COG3894, a family of redox proteins of unknown function. PMID:19043046

The viability of replacing Americium–Beryllium (Am–Be) radiological neutron sources in compensated porosity nuclear well logging tools with D–T or D–D accelerator-driven neutron sources is explored. The analysis consisted of developing a model for a typical well-logging borehole configuration and computing the helium-3 detector response to varying formation porosities using three different neutron sources (Am–Be, D–D, and D–T). The results indicate that, when normalized to the same source intensity, the use of a D–D neutron source has greater sensitivity for measuring the formation porosity than either an Am–Be or D–T source. The results of the study provide operational requirements that enable compensated porosity well logging with a compact, low power D–D neutron generator, which the current state-of-the-art indicates is technically achievable.

Selective logging is a major contributor to the social, economic, and ecological dynamics of Brazilian Amazonia. Logging activities have expanded from low-volume floodplain harvests in past centuries to high-volume operations today that take about 25 million m3 of wood from the forest each year. The most common high-impact conventional and often illegal logging practices result in major collateral forest damage, with cascading effects on ecosystem processes. Initial carbon losses and forest recovery rates following timber harvest are tightly linked to initial logging intensity, which drives changes in forest gap fraction, fragmentation, and the light environment. Other ecological processes affected by selective logging include nutrient cycling, hydrological function, and postharvest disturbance such as fire. This chapter synthesizes the ecological impacts of selective logging, in the context of the recent socioeconomic conditions throughout Brazilian Amazonia, as determined from field-based and remote sensing studies carried out during the Large-Scale Biosphere-Atmosphere Experiment in Amazonia program.

Independent (uncoordinated) checkpointing for parallel and distributed systems allows maximum process autonomy but suffers from possible domino effects and the associated storage space overhead for maintaining multiple checkpoints and message logs. In most research on checkpointing and recovery, it was assumed that only the checkpoints and message logs older than the global recovery line can be discarded. It is shown how recovery line transformation and decomposition can be applied to the problem of efficiently identifying all discardable message logs, thereby achieving optimal garbage collection. Communication trace-driven simulation for several parallel programs is used to show the benefits of the proposed algorithm for message log reclamation.
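
The baseline rule mentioned above (discard only what lies at or before the global recovery line) can be sketched as follows. The record layout is invented for illustration, and the paper's contribution is precisely to reclaim more logs than this conservative rule does:

```python
def discardable_logs(recovery_line, message_logs):
    """Classical garbage-collection rule: a logged message is reclaimable once
    the checkpoint interval in which its sender produced it lies at or before
    the sender's checkpoint on the global recovery line."""
    return [m for m in message_logs
            if m["send_interval"] <= recovery_line[m["sender"]]]

# recovery line: latest checkpoint index per process forming a consistent cut
recovery_line = {"P0": 2, "P1": 1}
message_logs = [
    {"sender": "P0", "send_interval": 1, "msg": "a"},  # before P0's checkpoint 2
    {"sender": "P0", "send_interval": 3, "msg": "b"},  # after it: must be kept
    {"sender": "P1", "send_interval": 1, "msg": "c"},
]
reclaimable = discardable_logs(recovery_line, message_logs)  # "a" and "c"
```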

An apparatus for remotely measuring and logging the flow rate of groundwater seepage into surface water bodies. As groundwater seeps into a cavity created by a bottomless housing, it displaces water through an inlet and into a waterproof sealed upper compartment, at which point, the water is collected by a collection bag, which is contained in a bag chamber. A magnet on the collection bag approaches a proximity switch as the collection bag fills, and eventually enables the proximity switch to activate a control circuit. The control circuit then rotates a three-way valve from the collection path to a discharge path, enables a data logger to record the time, and enables a pump, which discharges the water from the collection bag, through the three-way valve and pump, and into the sea. As the collection bag empties, the magnet leaves the proximity of the proximity switch, and the control circuit turns off the pump, resets the valve to provide a collection path, and restarts the collection cycle.

Iron uptake in Saccharomyces cerevisiae involves at least two steps: reduction of ferric to ferrous ions extracellularly and transport of the reduced ions through the plasma membrane. We have cloned and molecularly characterized FRE2, a gene which is shown to account, together with FRE1, for the total membrane-associated ferric reductase activity of the cell. Although not similar at the nucleotide level, the two genes encode proteins with significantly similar primary structures and very similar hydrophobicity profiles. The FRE1 and FRE2 proteins are functionally related, having comparable properties as ferric reductases. FRE2 expression, like FRE1 expression, is induced by iron deprivation, and at least part of this control takes place at the transcriptional level, since 156 nucleotides upstream of the initiator AUG conferred iron-dependent regulation when fused to a heterologous gene. However, the two gene products have distinct temporal regulation of their activities during cell growth. PMID:8164662

The system is focused on the Employee Business Travel Event. The system must be able to CRUD (Create, Retrieve, Update, Delete) instances of the Travel Event as well as have the ability to CRUD frequent flyer mileage associated with airline travel. Additionally, the system must provide for a compliance reporting system to monitor reductions in travel costs and lost opportunity costs (i.e., not taking advantage of business class or 7 day advance tickets).
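
As a sketch of the required CRUD surface, an in-memory store might look like the following. All class, field, and method names are illustrative; the actual system's schema is not specified in the record above:

```python
import itertools

class TravelEventStore:
    """In-memory CRUD store for travel events and frequent-flyer mileage."""
    def __init__(self):
        self._events = {}
        self._ids = itertools.count(1)  # simple auto-incrementing event id

    def create(self, employee, destination, cost, ff_miles=0):
        eid = next(self._ids)
        self._events[eid] = {"employee": employee, "destination": destination,
                             "cost": cost, "ff_miles": ff_miles}
        return eid

    def retrieve(self, eid):
        return self._events[eid]

    def update(self, eid, **fields):
        self._events[eid].update(fields)

    def delete(self, eid):
        del self._events[eid]

    def total_cost(self, employee):
        # simple aggregate of the kind a compliance report on travel-cost
        # reductions would build on
        return sum(e["cost"] for e in self._events.values()
                   if e["employee"] == employee)

store = TravelEventStore()
eid = store.create("jdoe", "Denver", 850.0, ff_miles=1200)
store.update(eid, cost=640.0)  # e.g. rebooked with a 7-day advance fare
```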

Irrigation agriculture plays an increasingly important role in food supply. Many evapotranspiration models are used today to estimate the water demand for irrigation. They consider different stages of crop growth by empirical crop coefficients to adapt evapotranspiration throughout the vegetation period. We investigate the importance of the model structural versus model parametric uncertainty for irrigation simulations by considering six evapotranspiration models and five crop coefficient sets to estimate irrigation water requirements for growing wheat in the Murray-Darling Basin, Australia. The study is carried out using the spatial decision support system SPARE:WATER. We find that structural model uncertainty among reference ET is far more important than model parametric uncertainty introduced by crop coefficients. These crop coefficients are used to estimate irrigation water requirement following the single crop coefficient approach. Using the reliability ensemble averaging (REA) technique, we are able to reduce the overall predictive model uncertainty by more than 10%. The exceedance probability curve of irrigation water requirements shows that a certain threshold, e.g. an irrigation water limit due to water right of 400 mm, would be less frequently exceeded in case of the REA ensemble average (45%) in comparison to the equally weighted ensemble average (66%). We conclude that multi-model ensemble predictions and sophisticated model averaging techniques are helpful in predicting irrigation demand and provide relevant information for decision making.

Irrigation agriculture plays an increasingly important role in food supply. Many evapotranspiration models are used today to estimate the water demand for irrigation. They consider different stages of crop growth by empirical crop coefficients to adapt evapotranspiration throughout the vegetation period. We investigate the importance of the model structural vs. model parametric uncertainty for irrigation simulations by considering six evapotranspiration models and five crop coefficient sets to estimate irrigation water requirements for growing wheat in the Murray-Darling Basin, Australia. The study is carried out using the spatial decision support system SPARE:WATER. We find that structural model uncertainty is far more important than model parametric uncertainty to estimate irrigation water requirement. Using the Reliability Ensemble Averaging (REA) technique, we are able to reduce the overall predictive model uncertainty by more than 10%. The exceedance probability curve of irrigation water requirements shows that a certain threshold, e.g. an irrigation water limit due to water right of 400 mm, would be less frequently exceeded in case of the REA ensemble average (45%) in comparison to the equally weighted ensemble average (66%). We conclude that multi-model ensemble predictions and sophisticated model averaging techniques are helpful in predicting irrigation demand and provide relevant information for decision making.
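
The REA weighting step can be illustrated with a toy calculation: each model's estimate is weighted by the inverse of its absolute bias against a reference, so poorly performing models contribute less than under equal weighting. This is a simplified sketch of the REA idea only (the full criterion also factors in model convergence), and all numbers are invented:

```python
def rea_average(model_values, model_biases, eps=1e-6):
    """Reliability-weighted ensemble mean: each model's weight is the inverse
    of its absolute bias against a reference, normalised to sum to one."""
    weights = [1.0 / (abs(b) + eps) for b in model_biases]
    total = sum(weights)
    return sum(w * v for w, v in zip(weights, model_values)) / total

# hypothetical irrigation-requirement estimates (mm) from three ET models,
# with each model's bias (mm) against a reference period
values = [380.0, 420.0, 520.0]
biases = [5.0, 10.0, 60.0]
equal_weight = sum(values) / len(values)  # plain ensemble mean: 440 mm
rea = rea_average(values, biases)         # ~400 mm: the biased model counts less
```

Down-weighting the strongly biased third model pulls the ensemble estimate below the equal-weight mean, which is the mechanism behind the lower exceedance frequency reported for the REA average.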

Fluctuations in the availability of electron donor (petroleum hydrocarbons) affected the competition between sulfate-reducing bacteria (SRB) and methanogenic bacteria (MB) for control of electron flow in a petroleum hydrocarbon-contaminated aquifer. The data suggest that abundant electron donor availability allowed MB to sequester a portion of the electron flow even when sulfate was present in sufficient concentrations to support sulfate reduction. For example, in an area of abundant electron-donor availability, SRB appeared to be unable to sequester the electron flow from MB in the presence of 1.4 mg/L sulfate. The data also suggest that when electron-donor availability was limited, SRB outcompeted MB for available substrate at a lower concentration of sulfate than when electron donor was plentiful. For example, in an area of limited electron-donor availability, SRB appeared to maintain dominance of electron flow at sulfate concentrations less than 1 mg/L. The presence of abundant electron donor and a limited amount of sulfate reduced competition for available substrate, allowing both SRB and MB to metabolize available substrates concurrently.

Recently, there have been significant improvements in computerized vibration and online performance monitoring systems. However, despite all the developments, the importance of monitoring rotating equipment through operator log sheets must not be overlooked or neglected. Operator log sheets filled out during shifts can be very useful tools in detecting problems early, provided they are diligently completed and evaluated during the operating shift. In most cases, performance deviations can be corrected by measures within the control of the operator. If the operator understands the purpose of log sheets, and knows the cause and effect of deviations in operating parameters, he or she will be motivated to complete the log sheets to increase equipment reliability. Logged data should include any operating data from equipment that reveals its mechanical condition or performance. The most common data logged are pressure, temperature, flow, power and vibration. The purposes of log sheets are to: establish and recognize the normal operating parameters and identify deviations in performance data; perform timely corrective actions on deviations to avoid unplanned shutdowns and catastrophic failures; avoid repetitive failures and increase mean time between failures; and provide base line data for troubleshooting. Two case histories are presented to illustrate the usefulness of logs: a compressor thrust bearing problem and steam turbine blade washing.

Discover Presidential Log Cabins is a set of materials designed to help educate 6-8 grade students about the significance of three log cabin sites occupied by George Washington, Ulysses Grant, Abraham Lincoln, and Theodore Roosevelt. This teacher's discussion guide is intended for use as part of a larger, comprehensive social studies program, and…

Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square kilometers per year (+/-14%) between 1999 and 2002, equivalent to 60 to 123% of previously reported deforestation area. Up to 1200 square kilometers per year of logging were observed on conservation lands. Each year, 27 million to 50 million cubic meters of wood were extracted, and a gross flux of ~0.1 billion metric tons of carbon was destined for release to the atmosphere by logging.

Project tasks include: (1) Perform the necessary testing and development to demonstrate that the amount of binder in coal logs can be reduced to 8% or lower to produce logs with adequate strength to eliminate breakage during pipeline transportation, under conditions experienced in long distance pipeline systems. Prior to conducting any testing and demonstration, grantee shall perform an information search and make full determination of all previous attempts to extrude or briquette coal, upon which the testing and demonstration shall be based. (2) Perform the necessary development to demonstrate a small model of the most promising injection system for coal logs, and test the logs produced. (3) Conduct economic analysis of coal-log pipeline, based upon the work to date. Refine and complete the economic model. (VC)

Purpose: This article aims to validate the Leadership Daily Practice (LDP) log, an instrument for conducting research on leadership in schools. Research Design: Using a combination of data sources--namely, a daily practice log, observations, and open-ended cognitive interviews--the authors evaluate the validity of the LDP log. Participants: Formal…

Optical communication is likely to significantly speed up parallel computation because the vast bandwidth of the optical medium can be divided to produce communication networks of very high degree. However, the problem of contention in high-degree networks makes the routing problem in these networks theoretically (and practically) difficult. In this paper we examine Valiant's h-relation routing problem, which is a fundamental problem in the theory of parallel computing. The h-relation routing problem arises both in the direct implementation of specific parallel algorithms on distributed-memory machines and in the general simulation of shared-memory models such as the PRAM on distributed-memory machines. In an h-relation routing problem each processor has up to h messages that it wishes to send to other processors and each processor is the destination of at most h messages. We present a lower bound for routing an h-relation (for any h > 1) on a complete optical network of size n. Our lower bound applies to any randomized distributed algorithm for this task. Specifically, we show that the expected number of communication steps required to route an arbitrary h-relation is Ω(h + √(log log n)). This is the first known lower bound for this problem which does not restrict the class of algorithms under consideration.
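The contention that drives this lower bound can be illustrated with a toy simulation (this is an illustrative sketch, not the paper's algorithm or model): each processor transmits its first pending message per step, and each destination accepts at most one message per step.

```python
import random

def route_h_relation(sends, seed=0):
    """Greedy direct routing of an h-relation with single-receive contention.

    `sends` maps each processor to a list of destination processors
    (each processor appears as a destination at most h times).
    Each step, every processor transmits its first pending message;
    each destination accepts one message per step (the rest retry).
    Returns the number of communication steps used.
    """
    rng = random.Random(seed)
    pending = {p: list(d) for p, d in sends.items() if d}
    steps = 0
    while pending:
        steps += 1
        offers = {}  # destination -> senders transmitting to it this step
        for p, dests in pending.items():
            offers.setdefault(dests[0], []).append(p)
        for dest, senders in offers.items():
            winner = rng.choice(senders)  # one accept per destination
            pending[winner].pop(0)
            if not pending[winner]:
                del pending[winner]
    return steps

# A 1-relation (a permutation) routes in a single step, while a worst-case
# h-relation with every message aimed at one destination needs h*n/n... at
# least as many steps as that destination has incoming messages.
print(route_h_relation({0: [1], 1: [2], 2: [0]}))        # -> 1
print(route_h_relation({0: [3, 3], 1: [3, 3], 2: [3, 3]}))  # -> 6
```

Note how the second case serializes entirely at the shared destination, which is exactly the congestion the Ω(h + √(log log n)) bound formalizes.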

Enteric viruses are a major problem in the food industry, especially as human noroviruses are the leading cause of nonbacterial gastroenteritis. Chitosan is known to be effective against some enteric viral surrogates, but more detailed studies are needed to determine the precise application variables. The main objective of this work was to determine the effect of increasing chitosan concentration (0.7-1.5% w/v) on the cultivable enteric viral surrogates feline calicivirus (FCV-F9), murine norovirus (MNV-1), and bacteriophages (MS2 and phiX174) at 37 °C. Two chitosans (53 and 222 kDa) were dissolved in water (53 kDa) or 1% acetic acid (222 kDa) at 0.7-1.5% and then mixed with each virus to obtain a titer of ~5 log plaque-forming units (PFU)/mL. These mixtures were incubated for 3 h at 37 °C. Controls included untreated viruses in phosphate-buffered saline, and viruses were enumerated by plaque assays. The 53 kDa chitosan at the concentrations tested reduced FCV-F9, MNV-1, MS2, and phiX174 by 2.6-2.9, 0.1-0.4, 2.6-2.8, and 0.7-0.9 log PFU/mL, respectively, while reduction by the 222 kDa chitosan was 2.2-2.4, 0.8-1.0, 2.6-5.2, and 0.5-0.8 log PFU/mL, respectively. The 222 kDa chitosan at 1 and 0.7% w/v in acetic acid (pH 4.5) caused the greatest reductions of MS2, by 5.2 logs and 2.6 logs, respectively. Overall, chitosan treatments showed the greatest reduction of MS2, followed by FCV-F9, phiX174, and MNV-1. These two chitosans may contribute to the reduction of enteric viruses at the concentrations tested but would require the use of other hurdles to eliminate foodborne viruses. PMID:26162243
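The "log reduction" figures quoted above follow the standard definition, the base-10 logarithm of the ratio of untreated to treated titers. A minimal sketch (the sample numbers are hypothetical, chosen only to mirror the scale of the study):

```python
import math

def log_reduction(initial_pfu_per_ml, final_pfu_per_ml):
    """Log10 reduction between an untreated and a treated viral titer."""
    return math.log10(initial_pfu_per_ml / final_pfu_per_ml)

# Hypothetical numbers in the spirit of the study: a ~5 log PFU/mL
# starting titer reduced to ~2.4 log PFU/mL is a 2.6-log reduction.
print(round(log_reduction(1e5, 10**2.4), 1))  # -> 2.6
```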

Direct push technologies have recently seen broad development, providing several tools for in situ parameterization of unconsolidated sediments. One of these techniques is the measurement of soil colors - a proxy that relates to soil/sediment properties. We introduce the direct push driven color logging tool (CLT) for real-time and depth-resolved investigation of soil colors within the visible spectrum. Until now, no routines exist on how to handle high-resolution (mm-scale) soil color data. To develop such a routine, we transform raw data (CIEXYZ) into soil color surrogates of selected color spaces (CIExyY, CIEL*a*b*, CIEL*c*h*, sRGB) and denoise small-scale natural variability by Haar and Daublet4 wavelet transformation, gathering interpretable color logs over depth. However, interpreting color log data as a single application remains challenging. Additional information, such as site-specific knowledge of the geological setting, is required to correlate soil color data to specific layer properties. Hence, we provide exemplary results from a joint interpretation of in situ-obtained soil color data and 'state-of-the-art' direct push based profiling tool data and discuss the benefit of the additional data. The developed routine is capable of transferring the information obtained as colorimetric data into interpretable color surrogates. Soil color data proved to correlate with small-scale lithological/chemical changes (e.g., grain size, oxidative and reductive conditions), especially when combined with additional direct push vertical high-resolution data (e.g., cone penetration testing and soil sampling). Thus, the technique allows enhanced profiling by providing another reproducible high-resolution parameter for analyzing subsurface conditions. This opens potential new areas of application and new outputs for such data in site investigation. It is our intention to improve color measurements by means of method of application and data
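Wavelet denoising of a depth log of the kind described can be sketched with a plain Haar transform and soft thresholding (an illustrative sketch only; the CLT's actual processing chain and threshold rule are not reproduced here, and the universal-threshold heuristic below is an assumption):

```python
import numpy as np

def haar_denoise(signal, levels=3, k=3.0):
    """Denoise a depth log by Haar wavelet shrinkage.
    Signal length must be divisible by 2**levels."""
    approx, details = np.asarray(signal, dtype=float), []
    for _ in range(levels):
        even, odd = approx[0::2], approx[1::2]
        details.append((even - odd) / np.sqrt(2))  # detail coefficients
        approx = (even + odd) / np.sqrt(2)         # running average
    # soft-threshold the detail coefficients (small-scale variability)
    for d in details:
        sigma = np.median(np.abs(d)) / 0.6745 if d.size else 0.0
        t = k * sigma
        d[:] = np.sign(d) * np.maximum(np.abs(d) - t, 0.0)
    # inverse Haar transform
    for d in reversed(details):
        even = (approx + d) / np.sqrt(2)
        odd = (approx - d) / np.sqrt(2)
        approx = np.empty(even.size + odd.size)
        approx[0::2], approx[1::2] = even, odd
    return approx

depth_log = np.repeat([10.0, 40.0], 128)  # two synthetic "layers"
noisy = depth_log + np.random.default_rng(1).normal(0, 1, depth_log.size)
denoised = haar_denoise(noisy)
print(np.abs(denoised - depth_log).mean() < np.abs(noisy - depth_log).mean())
```

The thresholding suppresses small-scale natural variability while the layer boundary, which lives in the coarse approximation, survives.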

This article speaks to a common problem in a lot of industrial and institutional boilerhouses. Most boilerhouses do an excellent job at collecting information. Circular chart recorders churn out pressures, temperatures, and flows for everything from steam to natural gas to city water consumption. At most facilities, this stuff all gets chucked into a drawer or file cabinet daily. Have you ever wondered why you collect and record what you do? What were people thinking when the existing logs were set up? This article attempts to challenge the original thought process and hopes to evoke in the reader a renewed vision of what should be collected, how, and then what can be done with it. The goal of this article is not to define new and expensive data acquisition or control system projects. It is instead to show how to develop systems that only require paper, pencils, and people who are motivated and care. These people are probably already being paid to do most of this work. Experience is that if these people are treated with respect and given some simple tools they will do amazing things beyond what was thought possible. This is a low-tech humanistic approach that has a fabulous rate of return. It's also something that can be immediately implemented.

Flux compactifications of string theory seem to require the presence of a fine-tuned constant in the superpotential. We discuss a scheme where this constant is replaced by a dynamical quantity which we argue to be a 'continuous Chern-Simons term'. In such a scheme, the gaugino condensate generates the hierarchically small scale of supersymmetry breakdown rather than adjusting its size to a constant. A crucial ingredient is the appearance of the hierarchically small quantity exp(-) which corresponds to the scale of gaugino condensation. Under rather general circumstances, this leads to a scenario of moduli stabilization which is endowed with a hierarchy between the mass of the lightest modulus, the gravitino mass and the scale of the soft terms, m_modulus ≈ 4π² m_3/2 ≈ (4π²)² m_soft. The 'little hierarchy' is given by the logarithm of the ratio of the Planck scale and the gravitino mass, log(M_Pl/m_3/2) ≈ 4π². This exhibits a new mediation scheme of supersymmetry breakdown, called mirage mediation. We highlight the special properties of the scheme, and their consequences for phenomenology and cosmology.

Several dry well surveillance logs from 1975 through 1995 for the SX Tank Farm have been examined to identify potential subsurface zones of radioactive contaminant migration. Several dynamic conditions of the gamma-ray-emitting radioactive contaminants have been identified.

A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time savings and unique scientific advantages.

The Common Message Logging (CMLOG) system is an object-oriented and distributed system that not only allows applications and systems to log data (messages) of any type into a centralized database but also lets applications view incoming messages in real-time or retrieve stored data from the database according to selection rules. It consists of a concurrent Unix server that handles incoming logging or searching messages, a Motif browser that can view incoming messages in real-time or display stored data in the database, a client daemon that buffers and sends logging messages to the server, and libraries that can be used by applications to send data to or retrieve data from the database via the server. This paper presents the design and implementation of the CMLOG system and also addresses the integration of CMLOG into existing control systems.

Event logs are a ubiquitous source of system feedback from computer systems, but have widely ranging formats and can be extremely numerous, particularly from systems with many logging components. Inspection of these logs is fundamental to system debugging; increased capability to quickly extract meaningful information will impact MTTR (mean time to repair) and may impact MTBF (mean time between failures). Sisyphus is a machine-learning analysis system whose goal is to enable content-novice analysts to efficiently understand evolving trends, identify anomalies, and investigate cause-effect hypotheses in large multiple-source log sets. The toolkit comprises a framework for utilizing the third-party frequent-itemset data mining tools Teiresias and SLCT, software to cluster messages according to time statistics, and an interactive results viewer.
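The frequent-itemset idea behind tools like SLCT can be sketched in a few lines (a rough illustration of the approach, not the actual Sisyphus or SLCT code): tokens that occur frequently at a given position in a message become template constants, and infrequent tokens become wildcards.

```python
from collections import Counter

def log_templates(messages, min_support=2):
    """Cluster log messages into SLCT-style templates: a token is kept
    if it appears at that position in at least `min_support` messages;
    otherwise it is replaced by the wildcard '*'."""
    position_counts = Counter()
    for msg in messages:
        for i, tok in enumerate(msg.split()):
            position_counts[(i, tok)] += 1
    templates = Counter()
    for msg in messages:
        tmpl = ' '.join(
            tok if position_counts[(i, tok)] >= min_support else '*'
            for i, tok in enumerate(msg.split()))
        templates[tmpl] += 1
    return templates

logs = ["sshd login failed for user alice",
        "sshd login failed for user bob",
        "kernel oom killed pid 4242"]
print(log_templates(logs))
```

On this tiny input the two sshd lines collapse into the single template "sshd login failed for user *", which is the kind of volume reduction that lets a content-novice analyst scan trends rather than raw lines.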

Industrial logging has become the most extensive land use in Central Africa, with more than 600,000 square kilometers (30%) of forest currently under concession. With use of a time series of satellite imagery for the period from 1976 to 2003, we measured 51,916 kilometers of new logging roads. The density of roads across the forested region was 0.03 kilometer per square kilometer, but areas of Gabon and Equatorial Guinea had values over 0.09 kilometer per square kilometer. A new frontier of logging expansion was identified within the Democratic Republic of Congo, which contains 63% of the remaining forest of the region. Tree felling and skid trails increased disturbance in selectively logged areas. PMID:17556578

Email server logs contain records of all email exchanged through the server. Often we would like to analyze those emails not separately but in conversation threads, especially when we need to analyze a social network extracted from those email logs. Unfortunately, each mail is a separate record, and those records are not tied to each other in any obvious way. In this paper a method for discussion-thread extraction is proposed, together with experiments on two different data sets, Enron and WrUT.
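One common heuristic for tying such records together (a sketch of the general idea, not the paper's method) is to chase Message-ID / In-Reply-To headers to a thread root:

```python
def build_threads(messages):
    """Group email records into threads by following In-Reply-To links
    to each message's thread root. `messages` is a list of dicts with
    an 'id' key and an optional 'in_reply_to' key."""
    root_of = {}

    def find_root(mid):
        while mid in root_of:
            mid = root_of[mid]
        return mid

    for m in messages:
        parent = m.get('in_reply_to')
        if parent:
            root_of[m['id']] = find_root(parent)
    threads = {}
    for m in messages:
        threads.setdefault(find_root(m['id']), []).append(m['id'])
    return threads

msgs = [{'id': 'a'}, {'id': 'b', 'in_reply_to': 'a'},
        {'id': 'c', 'in_reply_to': 'b'}, {'id': 'd'}]
print(build_threads(msgs))  # -> {'a': ['a', 'b', 'c'], 'd': ['d']}
```

Real logs are messier - missing headers, quoted-subject matching, broken chains - which is precisely why a dedicated extraction method is needed.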

There has been a lack of an effective NDE technique to locate internal defects within wooden logs. The few available elastic wave propagation based techniques are limited to predicting E values. Other techniques such as X-rays have not been very successful in detecting internal defects in logs. If defects such as embedded metals could be identified before the sawing process, the saw mills could significantly increase their production by reducing the probability of damage to the saw blade and the associated downtime and the repair cost. Also, if the internal defects such as knots and decayed areas could be identified in logs, the sawing blade can be oriented to exclude the defective portion and optimize the volume of high valued lumber that can be obtained from the logs. In this research, GPR has been successfully used to locate internal defects (knots, decays and embedded metals) within the logs. This paper discusses GPR imaging and mapping of the internal defects using both 2D and 3D interpretation methodology. Metal pieces were inserted in a log and the reflection patterns from these metals were interpreted from the radargrams acquired using 900 MHz antenna. Also, GPR was able to accurately identify the location of knots and decays. Scans from several orientations of the log were collected to generate 3D cylindrical volume. The actual location of the defects showed good correlation with the interpreted defects in the 3D volume. The time/depth slices from 3D cylindrical volume data were useful in understanding the extent of defects inside the log.

We summarize the documented and potential impacts of salvage logging--a form of logging that removes trees and other biological material from sites after natural disturbance. Such operations may reduce or eliminate biological legacies, modify rare postdisturbance habitats, influence populations, alter community composition, impair natural vegetation recovery, facilitate the colonization of invasive species, alter soil properties and nutrient levels, increase erosion, modify hydrological regimes and aquatic ecosystems, and alter patterns of landscape heterogeneity. These impacts can be assigned to three broad and interrelated effects: (1) altered stand structural complexity; (2) altered ecosystem processes and functions; and (3) altered populations of species and community composition. Some impacts may be different from or additional to the effects of traditional logging that is not preceded by a large natural disturbance because the conditions before, during, and after salvage logging may differ from those that characterize traditional timber harvesting. The potential impacts of salvage logging often have been overlooked, partly because the processes of ecosystem recovery after natural disturbance are still poorly understood and partly because potential cumulative effects of natural and human disturbance have not been well documented. Ecologically informed policies regarding salvage logging are needed prior to major natural disturbances so that, when they occur, ad hoc and crisis-mode decision making can be avoided. These policies should lead to salvage-exemption zones and limits on the amounts of disturbance-derived biological legacies (e.g., burned trees, logs) that are removed where salvage logging takes place. Finally, we believe new terminology is needed. The word salvage implies that something is being saved or recovered, whereas from an ecological perspective this is rarely the case. PMID:16922212

Biomass and rates of disturbance are major factors in determining the net flux of carbon between terrestrial ecosystems and the atmosphere, and neither of them is well known for most of the earth's surface. Satellite data over large areas are beginning to be used systematically to measure rates of two of the most important types of disturbance, deforestation and reforestation, but these are not the only types of disturbance that affect carbon storage. Other examples include selective logging and fire. In northern mid-latitude forests, logging and subsequent regrowth of forests have, in recent decades, contributed more to the net flux of carbon between terrestrial ecosystems and the atmosphere than any other type of land use. In the tropics logging is also becoming increasingly important. According to the FAO/UNEP assessment of tropical forests, about 25% of the total area of productive forests had been logged one or more times in the 60-80 years before 1980. The fraction must be considerably greater at present. Thus, deforestation by itself accounts for only a portion of the emissions of carbon from land. Furthermore, as rates of deforestation become more accurately measured with satellites, uncertainty in biomass will become the major factor accounting for the remaining uncertainty in estimates of carbon flux. An approach is needed for determining the biomass of terrestrial ecosystems. Selective logging is increasingly important in Amazonia, yet it has not been included in region-wide, satellite-based assessments of land-cover change, in part because it is not as striking as deforestation. Nevertheless, logging affects terrestrial carbon storage both directly and indirectly. Besides the losses of carbon directly associated with selective logging, logging also increases the likelihood of fire.

... Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... bell book shall be maintained as an adjunct to the engineering log. (c) The Chief of Naval Operations shall prescribe regulations governing the contents and preparation of the deck and engineering logs...

On March 12, 2003, data were gathered at Yuma Proving Grounds, in Arizona, using a Tensor Magnetic Gradiometer System (TMGS). This report shows how these data were processed and explains concepts required for successful TMGS data reduction. Important concepts discussed include extreme attitudinal sensitivity of vector measurements, low attitudinal sensitivity of gradient measurements, leakage of the common-mode field into gradient measurements, consequences of thermal drift, and effects of field curvature. Spatial-data collection procedures and a spin-calibration method are addressed. Discussions of data-reduction procedures include tracking of axial data by mathematically matching transfer functions among the axes, derivation and application of calibration coefficients, calculation of sensor-pair gradients, thermal-drift corrections, and gradient collocation. For presentation, the magnetic tensor at each data station is converted to a scalar quantity, the I2 tensor invariant, which is easily found by calculating the determinant of the tensor. At important processing junctures, the determinants for all stations in the mapped area are shown in shaded relief map-view. Final processed results are compared to a mathematical model to show the validity of the assumptions made during processing and the reasonableness of the ultimate answer obtained.
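The report's scalar presentation step - converting the tensor at each station to the I2 invariant via the determinant - is a one-liner (the example tensor values below are purely illustrative, not survey data):

```python
import numpy as np

def i2_invariant(gradient_tensor):
    """Scalar I2 invariant of a magnetic gradient tensor, computed as
    the determinant of the 3x3 tensor, as described in the report."""
    return float(np.linalg.det(np.asarray(gradient_tensor)))

# A magnetic gradient tensor is symmetric and traceless, so it has five
# independent components; these values are illustrative only (nT/m).
g = np.array([[ 1.0,  0.2,  0.3],
              [ 0.2, -0.5,  0.1],
              [ 0.3,  0.1, -0.5]])
print(round(i2_invariant(g), 4))  # determinant of the tensor
```

Mapping this single number in shaded relief over all stations, as the report does, sidesteps the attitudinal-sensitivity problems of displaying the raw vector components.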

In a posteriori access control, users are accountable for actions they performed and must provide evidence, when required, for instance, by legal authorities, to prove that these actions were legitimate. Generally, log files contain the data needed to achieve this goal. This logged data can be recorded in several formats; we consider here IHE-ATNA (Integrating the Healthcare Enterprise-Audit Trail and Node Authentication) as the log format. The difficulty lies in extracting useful information regardless of the log format. A posteriori access control frameworks often include a log filtering engine that provides this extraction function. In this paper we define and enforce this function by building an IHE-ATNA-based ontology model, which we query using SPARQL, and show how a posteriori security controls are made effective and easier based on this function. PMID:22874291
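The role of such an extraction function can be illustrated with a format-agnostic filter over already-parsed audit records (a minimal sketch only; the paper implements this with an IHE-ATNA ontology queried via SPARQL, and the field names below are hypothetical):

```python
def filter_log(records, **criteria):
    """Select audit records matching all given field=value criteria -
    the 'extraction function' of a log filtering engine, reduced to
    its simplest form over dict-shaped records."""
    return [r for r in records
            if all(r.get(k) == v for k, v in criteria.items())]

audit = [{'user': 'dr_smith', 'action': 'read',  'record': 'patient42'},
         {'user': 'dr_jones', 'action': 'write', 'record': 'patient42'},
         {'user': 'dr_smith', 'action': 'write', 'record': 'patient7'}]
print(filter_log(audit, user='dr_smith'))
```

A legal authority asking "what did this user do?" then becomes a single call, independent of how the underlying log entries were originally serialized.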

The direct on-site assessment of the vertical distribution of discontinuities in rock masses is very important since it can give a first estimation of the hydraulic properties of the strata and has many practical applications, such as groundwater resources investigations, radioactive and toxic waste disposal, dam foundation site investigations, etc. In the present work, the effect that fractures have upon some geophysical parameters which can easily be determined from the analysis of conventional normal resistivity logs is examined and a new technique for the on-site processing of resistivity logging data is introduced. Using a microcomputer in series with the logging unit, a zonation process was applied to the logs, which were interpreted in terms of a series of beds, each having a specific thickness and resistivity, and a new parameter defined by the difference between transverse and longitudinal resistivities was computed (T-L log). In almost all the cases in which the method was applied, the obtained results were satisfactory, and the microcomputer-based software and hardware package that was developed for the automatic processing of the data proved to be very efficient.
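The T-L parameter can be sketched with the standard layered-earth formulas (an illustration of the definition, not the paper's implementation): transverse resistivity is the thickness-weighted mean of the bed resistivities, longitudinal resistivity is their harmonic mean, and the difference vanishes for a uniform stack and grows with layering-induced anisotropy.

```python
def t_minus_l(beds):
    """T-L parameter for a stack of beds, each given as
    (thickness_m, resistivity_ohm_m): transverse resistivity
    (current across the layering) minus longitudinal resistivity
    (current along the layering)."""
    total = sum(h for h, _ in beds)
    rt = sum(h * r for h, r in beds) / total    # transverse (series)
    rl = total / sum(h / r for h, r in beds)    # longitudinal (parallel)
    return rt - rl

print(t_minus_l([(2.0, 50.0), (2.0, 50.0)]))          # uniform stack: ~0
print(round(t_minus_l([(2.0, 10.0), (2.0, 100.0)]), 1))
```

A fractured, strongly layered interval therefore stands out on a T-L log even when the individual bed resistivities look unremarkable.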

... OFFICIAL RECORDS UNITED STATES NAVY REGULATIONS AND OFFICIAL RECORDS The Commanding Officer Commanding Officers Afloat § 700.846 Status of logs. The deck log, the engineering log, the compass record,...

Data gathered during NSF-supported scientific research cruises represent an important component of the overall oceanographic data collection. The Rolling Deck to Repository (R2R) pilot project aims to improve access to basic shipboard data and ultimately reduce the work required to provide that access. Improved access will be achieved through identification of best practices for shipboard data management, identification of standard metadata and data products from research cruises, development of metadata schemas to describe a research cruise, and development of a prototype data discovery system that could be used by the entire NSF-supported academic research fleet. Shoreside data managers will work collaboratively with ship technicians and vessel operators to develop approaches that scale from smaller coastal vessels to larger open ocean research vessels. One of the coordinated subprojects within the R2R project will focus on development of a shipboard event logging system that would incorporate best practice guidelines, a metadata schema and new and existing applications to generate a scientific sampling event log in the form of a digital text file. A cruise event logging system enables researchers to record digitally all scientific sampling events and assign a unique event identifier to each entry. Decades of work conducted within large coordinated ocean research programs (JGOFS, GLOBEC, WOCE and RIDGE) have shown that creation of a shipboard sampling event log can facilitate greatly the subsequent integration of data sets from individual investigators. In addition to providing a quick way to determine what types of data might have been collected during a cruise, the sampling event log can be used to visualize the relationship, both temporally and geospatially, between the diverse types of sampling events conducted during a research cruise. Research questions in marine ecology or modeling projects are inherently multi-disciplinary and require access to a variety
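The core of such an event logging system - appending timestamped sampling events, each with a unique identifier, to a digital text file - can be sketched as follows (a minimal illustration; R2R's actual schema, fields, and identifier format are defined by the project, and the instrument names and positions below are made up):

```python
import csv, io, uuid
from datetime import datetime, timezone

def log_event(writer, instrument, action, lat, lon):
    """Append one scientific sampling event to a cruise event log,
    assigning the entry a unique event identifier."""
    event_id = uuid.uuid4().hex[:8]
    writer.writerow([event_id,
                     datetime.now(timezone.utc).isoformat(timespec='seconds'),
                     instrument, action, f'{lat:.4f}', f'{lon:.4f}'])
    return event_id

buf = io.StringIO()  # stands in for the on-disk log file
w = csv.writer(buf)
w.writerow(['event_id', 'time_utc', 'instrument', 'action', 'lat', 'lon'])
log_event(w, 'CTD', 'deploy', 41.5265, -70.6731)
log_event(w, 'CTD', 'recover', 41.5270, -70.6735)
print(buf.getvalue())
```

Because every event carries an identifier, time, and position, the resulting file supports exactly the temporal and geospatial cross-referencing between investigators' data sets that the abstract describes.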

The fusion mechanism for application in stereo analysis of range restricted the depth of field and therefore required a shift-variant mechanism in the peripheral area to find disparity. Misregistration was prevented by restricting the disparity detection range to a neighborhood spanned by the directional edge-detection filters. This transformation was essentially accomplished by a nonuniform resampling of the original image in the horizontal direction. While this is easily implemented for digital processing, the approach does not (in the peripheral vision area) model the log-conformal mapping which is known to occur in the human mechanism. This paper therefore modifies the original fusion concept in the peripheral area to include the polar exponential grid to log-conformal tessellation. Examples of the fusion process resulting in accurate disparity values are given.
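A log-polar resampling of the kind invoked here can be sketched in a few lines of numpy (an illustrative nearest-neighbor approximation of the log-conformal mapping, not the paper's tessellation):

```python
import numpy as np

def to_log_polar(img, n_r=32, n_theta=64):
    """Resample a 2-D image onto a log-polar grid about its center:
    ring radii grow exponentially, mimicking the coarsening of
    peripheral resolution in the human visual mapping."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r_max = min(cy, cx)
    # radii spaced uniformly in log(r): exponentially growing rings
    radii = np.exp(np.linspace(0.0, np.log(r_max), n_r))
    thetas = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    ys = np.clip(np.rint(cy + radii[:, None] * np.sin(thetas)), 0, h - 1).astype(int)
    xs = np.clip(np.rint(cx + radii[:, None] * np.cos(thetas)), 0, w - 1).astype(int)
    return img[ys, xs]  # nearest-neighbor sampling for brevity

img = np.arange(64 * 64, dtype=float).reshape(64, 64)
lp = to_log_polar(img)
print(lp.shape)  # (n_r, n_theta) = (32, 64)
```

Dense sampling near the center and sparse sampling at the periphery is exactly the nonuniform resampling the paper exploits for shift-variant disparity detection.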

This Technical Implementation Procedure (TIP) describes the field operation and the management of data records pertaining to neutron logging and density logging in welded tuff. This procedure applies to all borehole surveys performed in support of Engineered Barrier System Field Tests (EBSFT), including the Large Block Tests (LBT) and Initial Engineered Barrier System Field Tests (IEBSFT) - WBS 1.2.3.12.4. The purpose of this TIP is to provide guidelines so that other equally trained and qualified personnel can understand how the work is performed or how to repeat the work if needed. The work will be documented by the use of Scientific Notebooks (SNs) as discussed in 033-YMP-QP 3.4. The TIP will provide a set of guidelines which the scientists will take into account in conducting the measurements. The use of this TIP does not imply that this is repetitive work that does not require professional judgment.

This patent describes a well logging system for determining the dielectric constant and/or resistivity of earth formations, some of which have been invaded by drilling fluid, traversed by a borehole. It comprises: a well logging sonde adapted to be passed through the borehole including: means for transmitting electromagnetic energy into the earth formation at a frequency which enables the electromagnetic energy to propagate throughout the surrounding earth formation; first, second and third receiver means; means connected to the three receiver means for processing the three receiver signals to provide a combined signal for application to well logging cable means, well logging cable means for conducting the combined signal from the signal processing means out of the borehole; and surface electronics. The surface electronics includes indication means connected to the well logging cable means for providing an indication of the dielectric constant and/or the resistivity of the earth formation in accordance with portions of the combined signal conducted by the cable means representative of secondary electromagnetic fields at two of the three receiving means locations.

Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux focus on annual timescales, but overlook potentially important process response on shorter intervals immediately following timber harvest. We resolve such dynamics from non-parametric Quantile Regression Forests (QRF) of high-frequency (3-min) measurements of stream discharge and sediment concentrations in similar-sized (~0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the Random Forest algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors, which in turn provides model uncertainties. We find that, where no logging occurred, ~80% of the total sediment load was transported during extremely variable runoff events during only 5% of the monitoring period. Dry-season logging in particular dampened the role of these rare, extreme sediment-transport events by increasing load efficiency during moderate events. We conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment fluxes at high temporal resolution.
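The step that distinguishes QRF from an ordinary Random Forest - reading a quantile off a weighted empirical distribution, where each training observation's weight reflects how often it shares a leaf with the query point - can be sketched in isolation (a schematic of that one estimator step, not the authors' implementation; the sample values are made up):

```python
import numpy as np

def weighted_quantile(y, weights, q):
    """Quantile of a weighted empirical distribution: the core step of
    Quantile Regression Forests, where the weights come from leaf
    co-membership between the query point and training observations."""
    order = np.argsort(y)
    y, w = np.asarray(y, float)[order], np.asarray(weights, float)[order]
    cdf = np.cumsum(w) / w.sum()
    return y[np.searchsorted(cdf, q)]

# Illustrative sediment concentrations with equal leaf weights: the
# median ignores the rare extreme value, the 95th percentile captures it.
y = np.array([1.0, 2.0, 3.0, 10.0])
w = np.array([0.25, 0.25, 0.25, 0.25])
print(weighted_quantile(y, w, 0.5), weighted_quantile(y, w, 0.95))
```

Reporting a spread of quantiles rather than a single mean is what gives QRF the per-prediction uncertainty bands the abstract refers to.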

Most of the direct knowledge of the geology of the subsurface is gained from the examination of core and drill-cuttings recovered from boreholes drilled by the petroleum and water industries. Wireline logs run in these same boreholes generally have been restricted to tasks of lithostratigraphic correlation and the location of hydrocarbon pay zones. However, the range of petrophysical measurements has expanded markedly in recent years, so that log traces now can be transformed to estimates of rock composition. Increasingly, logs are available in a digital format that can be read easily by a desktop computer and processed by simple spreadsheet software methods. Taken together, these developments offer accessible tools for new insights into subsurface geology that complement the traditional, but limited, sources of core and cutting observations.

Shear has been the most effective method to create long-range order of micro- or nano-structures in soft materials. When shear is applied, soft particles or polymers tend to align along the shear direction to minimize the viscous dissipation, thus transverse (so-called "log-rolling") alignment is unfavored. In this study, for the first time we report the transverse alignment of cylinder-forming block copolymers. Poly(styrene-b-methyl methacrylate), PS-PMMA, can form a metastable hemicylinder structure when confined in a thin film, and this hemicylinder structure can align either along the shear direction or transverse to the shear direction ("log-rolling"), depending on the shearing temperature. This unusual "log-rolling" behavior is explained by the different chain mobility of the two blocks in PS-PMMA; the rigidity of the core cylinder is the critical parameter determining the direction of shear alignment.

Americium-Beryllium (AmBe) radiological neutron sources have been widely used in the petroleum industry for well logging purposes. There is strong desire on the part of various governmental and regulatory bodies to find alternate sources due to the high activity and small size of AmBe sources. Other neutron sources are available, both radiological (252Cf) and electronic accelerator driven (D-D and D-T). All of these, however, have substantially different neutron energy spectra from AmBe and thus cause significantly different responses in well logging tools. We report on simulations performed using unconventional sources and techniques to attempt to better replicate the porosity and carbon/oxygen ratio responses a well logging tool would see from AmBe neutrons. The AmBe response of these two types of tools is compared to the response from 252Cf, D-D, D-T, filtered D-T, and T-T sources.

Logging is now the most dangerous U.S. occupation. The Occupational Safety and Health Administration (OSHA) developed specialized safety training for the logging industry but has been challenged to reach small operators. An OSHA-approved state program in Minnesota provides annual safety seminars to about two-thirds of the state's full-time…

Test-hole data that may be used to determine the hydrogeology of the zone of high permeability in Palm Beach County, Fla., are presented. Lithologic logs from 46 test wells and geophysical logs from 40 test wells are contained in this report. (USGS)

... certificate holder's manual; (2) Include a certification that— (i) The work was performed in accordance with the requirements of the certificate holder's manual; (ii) All items required to be inspected were... repair station located outside the United States, the airworthiness release or log entry required...

This article introduces the development of a pulsed neutron uranium logging instrument. By analyzing the temporal distribution of epithermal neutrons generated from the thermal-neutron-induced fission of 235U, we propose a new method with a uranium-bearing index to calculate the uranium content of the formation. An instrument employing a D-T neutron generator and two epithermal neutron detectors has been developed. The logging response is studied using Monte Carlo simulation and experiments in calibration wells. The simulation and experimental results show that the uranium-bearing index is linearly correlated with the uranium content, and that the porosity and thermal neutron lifetime of the formation can be acquired simultaneously.
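
Given the reported linear correlation, calibrating the uranium-bearing index against wells of known uranium content reduces to a least-squares line fit. A minimal sketch; the index values and uranium contents below are hypothetical and purely illustrative, not data from the article:

```python
import numpy as np

# Hypothetical calibration data: uranium-bearing index measured in
# calibration wells of known uranium content (ppm). Values are illustrative.
index_values = np.array([0.12, 0.25, 0.38, 0.52, 0.66])
uranium_ppm = np.array([50.0, 100.0, 150.0, 200.0, 250.0])

# Fit the linear model U = a * index + b by least squares.
a, b = np.polyfit(index_values, uranium_ppm, deg=1)

def uranium_content(index):
    """Estimate uranium content (ppm) from the uranium-bearing index."""
    return a * index + b
```

Once calibrated in this way, the fitted line converts logged index values directly into formation uranium content.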

In this paper the authors provide the missing important link between permeability and acoustic velocities by generating a permeability-dependent synthetic sonic log in a carbonate reservoir. The computations are based on Akbar's theory that relates wave velocity to frequency, rock properties (e.g., lithology, permeability, and porosity), and fluid saturation and properties (viscosity, density, and compressibility). An inverted analytical expression of the theory is used to extract permeability from sonic velocity. The synthetic sonic and the computed permeability are compared with the observed sonic log and with plug permeability, respectively. The results demonstrate, as predicted by theory, that permeability can be related directly to acoustic velocities.

The summary and specifications to obtain the software package MAIL LOG, developed for the Scout Project Automatic Data System (SPADS), are provided. The MAIL LOG program has four modes of operation: (1) input - putting new records into the data base; (2) revise - changing or modifying existing records in the data base; (3) search - finding special records existing in the data base; and (4) archive - storing or putting away existing records in the data base. The output includes special printouts of records in the data base and results from the input and search modes.

A Kentucky coal was found to be difficult to compact into large, strong logs. The study showed that the compaction parameters affecting the strength of compacted coal logs can be categorized into three groups. The first group comprises inherent coal properties, such as elasticity and coefficient of friction; the second group comprises machine properties, such as mold geometry; and the third group comprises coal mixture preparation parameters, such as particle size distribution. Theoretical analysis showed that an appropriate backpressure can reduce the surface cracks that occur during ejection. This has been confirmed by the experiments conducted.

The sulfate-methane transition (SMT), a biogeochemical zone where sulfate and methane are metabolized, is commonly observed at shallow depths (1-30 mbsf) in methane-bearing marine sediments. Two processes consume sulfate at and above the SMT, anaerobic oxidation of methane (AOM) and organoclastic sulfate reduction (OSR). Differentiating the relative contribution of each process is critical to estimate methane flux into the SMT, which, in turn, is necessary to predict deeper occurrences of gas hydrates in continental margin sediments. To evaluate the relative importance of these two sulfate reduction pathways, we developed a diagenetic model to compute the pore water concentrations of sulfate, methane, and dissolved inorganic carbon (DIC). By separately tracking DIC containing 12C and 13C, the model also computes δ13C-DIC values. The model reproduces common observations from methane-rich sediments: a well-defined SMT with no methane above and no sulfate below, and a δ13C-DIC minimum at the SMT. The model also highlights the role of upward-diffusing 13C-enriched DIC in contributing to the carbon isotope mass balance of DIC. A combination of OSR and AOM, each consuming similar amounts of sulfate, matches observations from Site U1325 (Integrated Ocean Drilling Program Expedition 311, northern Cascadia margin). Without AOM, methane diffuses above the SMT, which contradicts existing field data. The modeling results are generalized with a dimensional analysis to the range of SMT depths and sedimentation rates typical of continental margins. The modeling shows that AOM must be active to establish an SMT wherein methane is quantitatively consumed and the δ13C-DIC minimum occurs. The presence of an SMT generally requires active AOM. Copyright 2011 by the American Geophysical Union.

Well-log analysis requires several vectors of input data to be inverted with a physical model that produces more vectors of output data. The problem is inherently suited to either vectorization or parallelization. PLATO (parallel log analysis, timely output) is a research prototype system that uses a parallel-architecture computer with memory-mapped graphics to invert vector data and display the result rapidly. By combining this high-performance computing and display system with a graphical user interface, the analyst can interact with the system in "real time" and can visualize the result of changing parameters on up to 1,000 levels of computed volumes and reconstructed logs. It is expected that such "instant" inversion will remove the main disadvantages frequently cited for simultaneous analysis methods, namely difficulty in assessing sensitivity to different parameters and slow output response. Although the prototype system uses highly specific features of a parallel processor, a subsequent version has been implemented on a conventional (serial) workstation with less performance but adequate functionality to preserve the apparently instant response. PLATO demonstrates the feasibility of petroleum computing applications combining an intuitive graphical interface, high-performance computing of physical models, and real-time output graphics.

The use of neutron and gamma ray measurements for the analysis of material composition has become well established in the last 40 years. Schlumberger has pioneered the use of this technology for logging wells drilled to produce oil and gas, and for this purpose has developed neutron generators that allow measurements to be made in deep (5000 m) boreholes under adverse conditions. We also make ruggedized neutron and gamma ray detector packages that can be used to make reliable measurements on the drill collar of a rotating drill string while the well is being drilled, where the conditions are severe. Modern nuclear methods used in logging measure rock formation parameters like bulk density and porosity, fluid composition, and element abundances by weight including hydrogen concentration. The measurements are made with high precision and accuracy. These devices (well logging sondes) share many of the design criteria required for remote sensing in space; they must be small, light, rugged, and able to perform reliably under adverse conditions. We see a role for the adaptation of this technology to lunar or planetary resource assessment missions.

In order to correct the unphysical log-layer mismatch commonly encountered in detached eddy simulation (DES) of flows with attached boundary layers, a function ℓM,ML is defined, which has a multi-layer structure with scaling laws in each layer and a plateau related to the Kármán constant. The height of this plateau is found to be crucial for obtaining the correct log layer. A target scaling function is designed that equals ℓM,ML in the near-wall region but with the plateau height determined analytically from the classical log-law. The resolved turbulent fluctuations are renormalized against this target function in order to recover the plateau height prescribed by the log-law. The renormalization procedure guarantees the height of ℓM,ML required by the log-law, resulting in the correct log-layer slope. The method is also shown to maintain similar turbulent properties in the large eddy simulation (LES) region of the DES method, and hence it predicts the turbulent intensity correctly. The results demonstrate the relationship between a constant ℓM,ML and the log-law profile of the mean velocity, and relate the Kármán constant to turbulent fluctuations, implying a complete description of the turbulent structural ensemble dynamics. The proposed method can be extended to more general flows with log layers, since it uses only the log-law with the Kármán constant as input, while the intercept of the log layer depends on the solution of the Spalart-Allmaras (SA) model in the near-wall field, where Reynolds-averaged Navier-Stokes (RANS) solutions are accurate.

The government of Indonesia, which presides over 10% of the world's tropical forests, has set ambitious targets to cut its high deforestation rates through an REDD+ scheme (Reducing Emissions from Deforestation and forest Degradation). This will require strong law enforcement to succeed. Yet, strategies that have accomplished this are rare and, along with past failures, tend not to be documented. We evaluated a multistakeholder approach that seeks to tackle illegal logging in the carbon-rich province of Aceh, Sumatra. From 2008 to 2009, Fauna & Flora International established and supported a community-based informant network for the 738,000 ha Ulu Masen ecosystem. The network reported 190 forest offenses to local law enforcement agencies, which responded with 86 field operations that confiscated illicit vehicles, equipment, and timber, and arrested 138 illegal logging suspects. From 45 cases subsequently monitored, 64.4% proceeded to court, from which 90.0% of defendants received a prison sentence or a verbal warning for a first offense. Spatial analyses of illegal logging and timber storage incidents predicted that illegal activities would be more effectively deterred by law enforcement operations that targeted the storage sites. Although numerous clusters of incidents were identified, they were still widespread, reflecting the ubiquity of illegal activities. The multistakeholder results were promising, but illegal logging still persisted at apparently similar levels at the project's end, indicating that efforts need to be further strengthened. Nevertheless, several actions contributed to the law enforcement achievements: strong political will; strong stakeholder support; and funding that could be promptly accessed. These factors are highlighted as prerequisites for achieving Indonesia's ambitious REDD+ goals. PMID:24628366

Logging technologies developed for hydrocarbon resource evaluation have not migrated into geothermal applications even though data so obtained would strengthen reservoir characterization efforts. Two causative issues have impeded progress: (1) there is a general lack of vetted, high-temperature instrumentation, and (2) the interpretation of log data generated in a geothermal formation is in its infancy. Memory-logging tools provide a path around the first obstacle by providing quality data at a low cost. These tools feature on-board computers that process and store data, and newer systems may be programmed to make decisions. Since memory tools are completely self-contained, they are readily deployed using the slick line found on most drilling locations. They have proven to be rugged, and a minimum training program is required for operator personnel. Present tools measure properties such as temperature and pressure, and the development of noise, deviation, and fluid conductivity logs based on existing hardware is relatively easy. A more complex geochemical tool aimed at a quantitative analysis of potassium, uranium, and thorium is in the calibration phase, and it is expandable into all nuclear measurements common in the hydrocarbon industry. A fluid sampling tool is in the design phase. All tools are designed for operation at conditions exceeding 400 C, and for deployment in the slim holes produced by mining-coring operations. Partnerships are being formed between the geothermal industry and scientific drilling programs to define and develop inversion algorithms relating raw tool data to more pertinent information. These cooperative efforts depend upon quality guidelines such as those under development within the international Ocean Drilling Program.

... AND OFFICIAL RECORDS UNITED STATES NAVY REGULATIONS AND OFFICIAL RECORDS The Commanding Officer Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... Naval Operations. (b) A compass record shall be maintained as an adjunct to the deck log. An...

Building structures made from logs appeared in the eastern United States during the late 17th century, and immigrants from Sweden, Finland, and Germany are credited with their construction. There were two types of structures: the horizontal design introduced by the Scandinavians and the German or Pennsylvania Dutch model that was used by the…

Wettability is an important factor in controlling the distribution of oil and water. However, its evaluation has so far been a difficult problem because no log data can directly indicate it. In this paper, a new method is proposed for quantitatively predicting reservoir wettability via well-log analysis. Specifically, based on the J function, diagenetic facies classification, and piecewise power functions, capillary pressure curves are constructed from conventional logs and from a nuclear magnetic resonance (NMR) log, respectively. Under the influence of wettability, the latter is distorted while the former remains unaffected. Therefore, the ratio of the median radii obtained from the two kinds of capillary pressure curve is calculated to reflect wettability, and a quantitative relationship between the ratio and reservoir wettability is then established. Using low-permeability core sample capillary pressure curves, NMR T2 spectra, and contact angle experimental data from the bottom of the Upper Triassic reservoirs in the western Ordos Basin, China, the two capillary pressure curve construction models and a predictive wettability model are calibrated. The wettability model is verified against the Amott wettability index and the saturation exponent from resistivity measurements, and their determined wettability levels are comparable, indicating that the proposed model is quite reliable. In addition, the model performs well in the field study. Thus, the model proposed in this paper for quantitatively predicting reservoir wettability provides an effective tool for formation evaluation, field development, and the improvement of oil recovery.
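
The heart of the method, taking the ratio of the median radii from the two kinds of capillary pressure curve, can be sketched numerically. The pore-radius distributions below are hypothetical stand-ins for the J-function- and NMR-derived curves described above, not data from the paper:

```python
import numpy as np

def median_radius(saturation, radius):
    """Pore radius at 50% cumulative saturation, by linear interpolation.
    `saturation` must be increasing; `radius` is the corresponding
    pore-throat radius (micrometers)."""
    return float(np.interp(0.5, saturation, radius))

# Hypothetical pore-radius distributions derived from two capillary
# pressure curves: one constructed from conventional logs, one from
# the NMR log. The simple linear profiles are purely illustrative.
sat = np.linspace(0.05, 0.95, 10)
radius_conventional = np.linspace(5.0, 0.5, 10)
radius_nmr = np.linspace(4.0, 0.4, 10)

r50_conventional = median_radius(sat, radius_conventional)
r50_nmr = median_radius(sat, radius_nmr)

# Ratio of the two median radii: it departs from 1 as wettability
# distorts the NMR-derived curve, which is the proposed indicator.
wettability_ratio = r50_nmr / r50_conventional
```

In the paper's workflow this ratio would then be mapped to a wettability level through the calibrated quantitative relationship.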

Tropical rainforests are among the ecosystems most threatened by large-scale fragmentation due to human activities such as heavy logging and agricultural clearance, even though they provide crucial ecosystem goods and services, such as sequestering carbon from the atmosphere, protecting watersheds, and conserving biodiversity. In several countries, forest resource extraction has shifted from clearcutting to selective logging in order to maintain significant forest cover and a stock of living biomass. However, knowledge of the short- and long-term effects of removing selected species from tropical rainforests is scarce and needs further investigation. One of the main effects of selective logging on forest dynamics appears to be local disturbance, which involves the invasion of open space by weeds, vines, and climbers at the expense of the late-successional cenosis. We present a simple deterministic model that describes the dynamics of tropical rainforests subject to selective logging, to understand how and why weeds displace native species. We argue that the selective removal of the tallest tropical trees opens gaps of light that allow weeds, vines, and climbers to prevail over native species, inhibiting the recovery of the original vegetation. Our results show that different regime shifts may occur depending on the type of forest management adopted. This hypothesis is supported by a dataset of tree heights and weed/vine cover that we collected from 9 plots located in untouched and managed areas of Central and West Africa.

Although it is unspectacular in appearance, dead wood is one of the most ecologically important resources in forests. Fallen logs, dead standing trees, stumps, and even cavities in live trees fulfill a wide range of roles. Prominent among these is that they provide habitat for many organisms, especially insects. Fourth-grade students at Fox…

The design log is a record of observations, diagnoses, prescriptions, and performance specifications for each space in a structure. It is a systematic approach to design that integrates information about user needs with traditional architectural programming and design. (Author/MLF)

Information relevant to the MAIL LOG program theory is documented. The L-files for mail correspondence, design information release/report, and the drawing/engineering order are given. In addition, sources for miscellaneous external routines and special support routines are documented along with a glossary of terms.

... expressed in Coordinated Universal Time (UTC). (2) “ON WATCH” must be entered by the operator beginning a... until the claim or complaint has been satisfied or barred by statute limiting the time for filing suits... log by the operator's signature. (2) The date and time of making an entry must be shown opposite...

A high temperature spectral gamma tool has been designed and built for use in small-diameter geothermal exploration wells. Several engineering judgments are discussed regarding operating parameters, well model selection, and signal processing. An actual well log at elevated temperatures is given, with spectral gamma readings showing repeatability.

Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux focus on annual timescales, but potentially overlook important geomorphic responses on shorter time scales immediately following timber harvest. Sediment fluxes are commonly estimated from linear regression of intermittent measurements of water and sediment discharge using sediment rating curves (SRCs). However, these often unsatisfactorily reproduce non-linear effects such as discharge-load hystereses. We resolve such important dynamics from non-parametric Quantile Regression Forests (QRF) of high-frequency (3 min) measurements of stream discharge and sediment concentrations in similar-sized (~ 0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the Random Forest (RF) algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors. The algorithm belongs to the family of decision-tree classifiers, which allow quantifying relevant predictors in high-dimensional parameter space. We find that, where no logging occurred, ~ 80% of the total sediment load was transported during rare but high magnitude runoff events during only 5% of the monitoring period. The variability of sediment flux of these rare events spans four orders of magnitude. In particular, dry-season logging dampened the role of these rare, extreme sediment-transport events by increasing load efficiency during more moderate events. We show that QRF outperforms traditional SRCs in terms of accurately simulating short-term dynamics of sediment flux, and conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment discharge at high temporal resolution.

Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux often focus on annual timescales but overlook potentially important process response on shorter intervals immediately following timber harvest. We resolve such dynamics with non-parametric quantile regression forests (QRF) based on high-frequency (3 min) discharge measurements and sediment concentration data sampled every 30-60 min in similar-sized (~0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the random forest algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors. The algorithm belongs to the family of decision-tree classifiers, which allow quantifying relevant predictors in high-dimensional parameter space. We find that, where no logging occurred, ~80% of the total sediment load was transported during extremely variable runoff events during only 5% of the monitoring period. In particular, dry-season logging dampened the relative role of these rare, extreme sediment-transport events by increasing load efficiency during more efficient moderate events. We show that QRFs outperform traditional sediment rating curves (SRCs) in terms of accurately simulating short-term dynamics of sediment flux, and conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment fluxes at high temporal resolution.

Production log flow profiles provide a valuable tool to evaluate well and reservoir performance. Horizontal wellbores and their associated completion designs present several challenges to profile interpretation for conventional production logging sensors and techniques. A unique approach combining pulsed neutron capture (PNC) log data with conventional production logging measurements is providing improved flow profile answers in slotted liner, horizontal well completions on the North Slope of Alaska. Identifying and eliminating undesirable gas production is one of the chief goals of production logging on the North Slope. This process becomes difficult in horizontal wellbores as fluid segregation affects the area investigated by the various logging sensors and also the velocities of the individual phases. Typical slotted liner completions further complicate analysis as fluids are able to flow in the liner/openhole annulus. Analysis of PNC log data provides two good qualitative indicators of formation permeability. The first technique is derived from the difference of the formation sigma response before and after injecting a high-capture cross-section borax solution. The second technique uses the difference of the formation sigma response and the formation porosity measured while injecting the formation with crude or seawater. Further analysis of PNC log runs show that the two techniques closely correlate with production flow profiles under solution gas-oil ratio (GOR) conditions. These two techniques in combination with conventional production logging measurements of temperature, capacitance, pressure, and spinner improve flow profile results. PNC results can be combined with temperature and pressure data in the absence of valid spinner data to provide an approximate flow profile. These techniques have been used to successfully determine profiles in both cemented and slotted liner completions with GORs in excess of 15,000 scf/bbl.

Particle size distribution (PSD) is a fundamental physical property of soils. Traditionally, the PSD curve was generated by hand from the limited data of particle size analysis, which is subjective and may lead to significant uncertainty in the freehand PSD curve and in graphically estimated cumulative particle percentages. To overcome these problems, a log-cubic method was proposed for generating the PSD curve, based on a monotone piecewise-cubic interpolation method. The log-cubic method and the commonly used log-linear and log-spline methods were evaluated by leave-one-out cross-validation for 394 soil samples extracted from the UNSODA database. The mean error and root mean square error of the cross-validation show that the log-cubic method outperforms the two other methods. More importantly, the PSD curve generated by the log-cubic method meets the essential requirements of a PSD curve: it passes through all measured data and is both smooth and monotone. The proposed log-cubic method provides an objective and reliable way to generate a PSD curve from limited soil particle analysis data. The method and the generated PSD curve can be used in the conversion of different soil texture schemes, assessment of grading pattern, and estimation of soil hydraulic parameters and erodibility factor. PMID:23766698
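
The log-cubic construction, monotone piecewise-cubic interpolation of the cumulative curve against the logarithm of particle diameter, can be sketched with SciPy's PCHIP interpolator, a standard monotone piecewise-cubic scheme. The sieve data below are hypothetical, purely for illustration:

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

# Hypothetical sieve data: particle diameters (mm) and cumulative
# percentages finer. Values are illustrative, not from the paper.
diameters = np.array([0.002, 0.02, 0.05, 0.25, 2.0])   # clay .. sand
percent_finer = np.array([12.0, 30.0, 45.0, 80.0, 100.0])

# "Log-cubic": monotone piecewise-cubic (PCHIP) interpolation of the
# cumulative curve against log10(diameter). Monotonicity of PCHIP
# guarantees the fitted PSD curve never decreases between data points.
psd_curve = PchipInterpolator(np.log10(diameters), percent_finer)

def percent_passing(d_mm):
    """Interpolated cumulative percentage finer than diameter d_mm."""
    return float(psd_curve(np.log10(d_mm)))
```

The curve passes through every measured point and stays monotone, which is exactly the pair of requirements the paper imposes on a valid PSD curve.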

The expansion of selective logging in tropical forests may be an important source of global carbon emissions. However, the effects of logging practices on the carbon cycle have never been quantified over long periods of time. We followed the fate of more than 60 000 tropical trees over 23 years to assess changes in aboveground carbon stocks in 48 1.56-ha plots in French Guiana that represent a gradient of timber harvest intensities, with and without intensive timber stand improvement (TSI) treatments to stimulate timber tree growth. Conventional selective logging led to emissions equivalent to more than a third of aboveground carbon stocks in plots without TSI (85 Mg C/ha), while plots with TSI lost more than one-half of aboveground carbon stocks (142 Mg C/ha). Within 20 years of logging, plots without TSI sequestered aboveground carbon equivalent to more than 80% of aboveground carbon lost to logging (-70.7 Mg C/ha), and our simulations predicted an equilibrium aboveground carbon balance within 45 years of logging. In contrast, plots with intensive TSI are predicted to require more than 100 years to sequester aboveground carbon lost to emissions. These results indicate that in some tropical forests aboveground carbon storage can be recovered within half a century after conventional logging at moderate harvest intensities. PMID:19769089

Setting up the working stages in forest operations is conditioned by environmental protection and forest health requirements. This paper presents a method for improving the decision-making process by choosing the most environmentally effective logging system according to terrain configuration and stand characteristics. Such a methodology for selecting machines or logging systems that accounts for the environment and safety as well as economics becomes mandatory in the context of sustainable management of forests with multiple functions. Based on analytic hierarchy process analysis, the following ranking of environmental performance was obtained for the four alternatives considered: skyline system (42.43%), forwarder system (20.22%), skidder system (19.92%), and horse logging system (17.43%). Further, an environmental risk matrix was produced for the 28 most important risk factors specific to work equipment used in forest operations. Finally, a multicriterial analysis generated a risk index RI ranging between 1.0 and 3.5, which can help in choosing the optimal combination of logging system and logging equipment with low environmental impact. To demonstrate the usefulness of the proposed approach, a simple application under the specific conditions of a harvesting site is presented. - Highlights: • We propose a decision-making algorithm to select eco-friendly logging systems. • The analytic hierarchy process was applied to rank 4 types of logging systems. • An environmental risk matrix with 28 risk factors in forest operations was made up.
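
The analytic hierarchy process ranking above rests on the principal eigenvector of a pairwise-comparison matrix. A minimal sketch; the comparison matrix below is illustrative, not the one used in the paper:

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for four logging systems
# (skyline, forwarder, skidder, horse) on Saaty's 1-9 scale.
# Entry A[i, j] expresses how strongly system i is preferred over
# system j; values are illustrative, not those used in the paper.
A = np.array([
    [1.0, 3.0, 3.0, 2.0],
    [1/3, 1.0, 1.0, 2.0],
    [1/3, 1.0, 1.0, 2.0],
    [1/2, 1/2, 1/2, 1.0],
])

# AHP priorities: the principal (largest-eigenvalue) right eigenvector
# of A, normalized so the weights sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
priorities = principal / principal.sum()
```

Normalizing the principal eigenvector yields the percentage weights of the kind reported for the four alternatives.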

A factor analysis technique is proposed in this research for interpreting the combination of nuclear well logs (natural gamma ray, density, and neutron porosity) and the long- and short-normal electrical logs, in order to characterize the large extended basaltic areas in southern Syria. Kodana well logging data are used for testing and applying the proposed technique. The four resulting score logs enable the lithological score cross-section of the studied well to be established. The established cross-section clearly shows the distribution and identification of four kinds of basalt: hard massive basalt, hard basalt, pyroclastic basalt, and the basalt alteration product, clay. The factor analysis technique is successfully applied to the Kodana well logging data in southern Syria, and can be used efficiently when several wells and large well logging datasets with a high number of variables need to be interpreted. PMID:24296157
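
A principal-component-style sketch of such a factor analysis: standardize the logs, eigendecompose their correlation matrix, and project the data onto the leading factors to obtain score logs. This is a simplified stand-in for the paper's method, and the data below are synthetic rather than the Kodana logs:

```python
import numpy as np

def factor_scores(logs, n_factors=4):
    """Principal-factor score logs for a (depths x variables) matrix.

    Standardize each log, eigendecompose the correlation matrix,
    and project the standardized data onto the leading factors.
    """
    z = (logs - logs.mean(axis=0)) / logs.std(axis=0)  # standardize
    corr = np.corrcoef(z, rowvar=False)                # correlation matrix
    eigvals, eigvecs = np.linalg.eigh(corr)            # ascending order
    leading = eigvecs[:, ::-1][:, :n_factors]          # top factors
    return z @ leading                                 # score logs

# Synthetic demonstration: 100 depth samples of five logs
# (gamma ray, density, neutron porosity, long normal, short normal).
rng = np.random.default_rng(0)
logs = rng.normal(size=(100, 5))
scores = factor_scores(logs, n_factors=4)
```

Each column of `scores` plays the role of one of the four score logs from which the lithological cross-section is built.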

"Lonely guy" (LOG) has been identified as a cytokinin-producing enzyme in plants and plant-interacting fungi. The gene product of Cg2612 from the soil-dwelling bacterium Corynebacterium glutamicum was annotated as an LDC. However, the facts that C. glutamicum lacks an LDC and Cg2612 has high amino acid similarity with LOG proteins suggest that Cg2612 is possibly an LOG protein. To investigate the function of Cg2612, we determined its crystal structure at a resolution of 2.3 Å. Cg2612 functions as a dimer and shows an overall structure similar to other known LOGs, such as LOGs from Arabidopsis thaliana (AtLOG), Claviceps purpurea (CpLOG), and Mycobacterium marinum (MmLOG). Cg2612 also contains a "PGGXGTXXE" motif that contributes to the formation of an active site similar to other LOGs. Moreover, biochemical studies on Cg2612 revealed that the protein has phosphoribohydrolase activity but not LDC activity. Based on these structural and biochemical studies, we propose that Cg2612 is not an LDC family enzyme, but instead belongs to the LOG family. In addition, the prenyl-binding site of Cg2612 (CgLOG) comprised residues identical to those seen in AtLOG and CpLOG, albeit dissimilar to those in MmLOG. The work provides structural and functional implications for LOG-like proteins from other microorganisms. PMID:27507425

The requirements for the reporting of polychlorinated biphenyl (PCB)-related activities are found in 40 Code of Federal Regulations (CFR) 761 Subpart J, "General Records and Reports." The PCB Annual Document Log is a detailed record of the PCB waste handling activities at the facility. The facility must prepare it each year by July 1 and maintain it at the facility for at least 3 years after the facility ceases using or storing PCBs and PCB items. While submittal of the PCB Annual Document Log to the U.S. Environmental Protection Agency (EPA) is not required by regulation, EPA has verbally requested in telephone conversations that this report be submitted to them on an annual basis. The Annual Document Log section of this report meets the requirements of 40 CFR 761.180(a)(2), as applicable, while the Annual Records section meets the requirement of 40 CFR 761.180(a)(1).

Accurately detecting and quantifying gas hydrate or free gas in sediments from seismic data requires downhole well-log data to calibrate the physical properties of the gas hydrate-/free gas-bearing sediments. As part of the Gulf of Mexico Joint Industry Program, a series of wells were either cored or drilled in the Gulf of Mexico to characterize the physical properties of gas hydrate-bearing sediments, to calibrate geophysical estimates, and to evaluate source and transport mechanisms for gas within the gas hydrates. Downhole acoustic logs were used sparingly in this study because of degraded log quality due to adverse wellbore conditions. However, reliable logging-while-drilling (LWD) electrical resistivity and porosity logs were obtained. To tie the well-log information to the available 3-D seismic data in this area, a velocity log was calculated from the available resistivity log at the Keathley Canyon 151-2 well, because the acoustic log and vertical seismic data acquired at the nearby Keathley Canyon 151-3 well were either of poor quality or had limited depth coverage. Based on the gas hydrate saturations estimated from the LWD resistivity log, the modified Biot-Gassmann theory was used to generate a synthetic acoustic log, and the synthetic seismogram computed from it agreed fairly well with a seismic profile crossing the well site. Based on the well-log information, a faintly defined bottom-simulating reflection (BSR) in this area is interpreted as a reflection from gas hydrate-bearing sediments with about 15% saturation overlying partially gas-saturated sediments with 3% saturation. Gas hydrate saturations over 30-40% are estimated from the resistivity log in two distinct intervals at 220-230 and 264-300 m below the sea floor, but gas hydrate was not physically recovered in cores. It is speculated that the poor recovery of cores and the gas hydrate morphology are responsible for the lack of physical gas hydrate recovery.
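
Saturation estimates of this kind are commonly derived from resistivity logs with Archie's equation. The sketch below illustrates the general approach; all parameter values (rw, a, m, n) and the example inputs are illustrative assumptions, not values from the study.

```python
# Sketch: gas hydrate saturation from a resistivity log via Archie's
# equation, a standard approach. Parameter values are illustrative
# assumptions, not taken from the study described above.
def hydrate_saturation(rt, porosity, rw=0.35, a=1.0, m=2.0, n=2.0):
    """S_h = 1 - S_w, with Archie water saturation
    S_w = (a * rw / (porosity**m * rt))**(1/n)."""
    sw = (a * rw / (porosity**m * rt)) ** (1.0 / n)
    return max(0.0, 1.0 - min(sw, 1.0))

# A resistive interval in high-porosity marine sediment:
print(round(hydrate_saturation(rt=3.0, porosity=0.45), 2))  # 0.24
# A water-saturated background interval clips to zero hydrate:
print(hydrate_saturation(rt=0.5, porosity=0.45))  # 0.0
```

Applied sample-by-sample down the LWD resistivity curve, this yields the saturation profile that feeds the Biot-Gassmann velocity modeling.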

The planning and design of any coal mine development requires, among other things, a thorough investigation of the geological, geotechnical and hydrogeological subsurface conditions. As part of a coal mine exploration program we conducted heat-pulse vertical flow meter testing. The flow data were combined with absolute and differential temperature logging data to gain information about the hydraulic characteristics of two different coal seams and their over- and interburden. Hydraulic properties were quantified for the strata that were localised from geophysical logging data, including density, gamma-ray and resistivity logs. We demonstrate that the temperature log response complements the flow meter log response. A coupling of both methods is therefore recommended to gain insight into the hydraulic conditions in a coal seam and its overburden.

This patent describes a logging sonde for use in a borehole traversing an earth formation. The logging sonde comprises: an elongated sonde body; a plurality of measuring means for measuring a characteristic of the earth formation, each of the measuring means comprising a central element, a first measuring flap hingably connected to the central element, and a second measuring flap hingably connected to the central element, the measuring flaps being disposed on either side of the central element, the first measuring flap staggered relative to the second measuring flap along the longitudinal direction of the sonde body; means operatively connected between the sonde body and the first and second measuring flaps for applying a resilient force to each of the measuring flaps, thereby tending to move the flaps away from the sonde body; and means connected between the sonde body and each of the measuring means for translocating the measuring means away from and back to the sonde body.

Tree biomass estimation, which is being integrated into the U.S. Forest Service Renewable Resources Evaluation Program, will give foresters the ability to estimate the amount of logging residues they might expect from harvested treetops and branches and residual rough, rotten, and small trees before the actual harvest. With planning, and increased demand for such timber products as pulpwood and fuelwood, product recovery could be increased by up to 43% in softwood stands and 99% in hardwoods. Recovery levels affect gross product receipts and site preparation costs. An example of product recovery and residue generation is presented for three harvesting options in Pennsylvania hardwood stands. Under the whole-tree harvesting option, 46% more product was recovered than in single-product harvesting, and logging residue levels were reduced by 58%.

SPIRES, an aging high-energy physics publication database, is in the process of being replaced by INSPIRE. In order to ease the transition from SPIRES to INSPIRE, it is important to understand user behavior and the drivers for adoption. The goal of this project was to address some questions regarding the presumed two-thirds of users still using SPIRES. These questions are answered through analysis of the log files from both websites. A series of scripts was developed to collect and interpret the data contained in the log files. Common search patterns and usage comparisons are made between INSPIRE and SPIRES, and a method for detecting user frustration is presented. The analysis reveals a more even split between the two systems than originally thought, as well as the expected trend of users transitioning to INSPIRE.
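
As a rough illustration of the kind of log-file scripting described, the snippet below counts search queries in access-log lines. The Apache-style line format and the `/search?p=` query pattern are assumptions for the example, not the actual SPIRES/INSPIRE log formats.

```python
import re
from collections import Counter

# Hypothetical access-log lines in a generic Apache-like format.
lines = [
    '1.2.3.4 - - [01/Jan/2011] "GET /search?p=higgs+boson HTTP/1.1" 200',
    '5.6.7.8 - - [01/Jan/2011] "GET /search?p=top+quark HTTP/1.1" 200',
    '1.2.3.4 - - [02/Jan/2011] "GET /search?p=higgs+boson HTTP/1.1" 200',
]

# Tally how often each search string appears across the log.
queries = Counter()
for line in lines:
    m = re.search(r'/search\?p=([^ "]+)', line)
    if m:
        queries[m.group(1)] += 1

print(queries.most_common(1))  # [('higgs+boson', 2)]
```

Repeated identical queries from one client in a short window are one simple signal that could feed the kind of frustration detection the abstract mentions.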

As educators we must ask ourselves if we are meeting the needs of today's students. The science world is adapting to our ever-changing society; are the methodology and philosophy of our educational system keeping up? In this article, you'll learn why web logs (also called blogs) are an important Web 2.0 tool in your science classroom and how they…

The need to work with and understand different types of graphs is a common occurrence in everyday life. Examples include anything having to do with investments, being an educated juror in a case that involves evidence presented graphically, and understanding many aspects of our current political discourse. Within a science class, graphs play a crucial role in presenting and interpreting data. In astronomy, where the range of graphed values spans many orders of magnitude, log axes must be used and understood. Experience shows that students do not understand how to read and interpret log axes or how they differ from linear ones. Alters (1996), in a study of college students in an algebra-based physics class, found little understanding of log plotting. The purpose of this poster is to show the method and progression I have developed for use in my "ASTRO 101" class, with the goal of helping students better understand the H-R diagram, the mass-luminosity relationship, and digital spectra.
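
A small numeric illustration of the core idea: on a base-10 log axis, equal distances correspond to equal *ratios*, which is what lets quantities spanning many orders of magnitude fit on one plot. The code is a generic sketch, not taken from the poster.

```python
import math

# Why equal steps on a log axis mean equal ratios, not equal
# differences -- the point students often miss when reading H-R diagrams.
def axis_position(value, decade_width=1.0):
    """Map a value to its position on a base-10 log axis (in 'decades')."""
    return decade_width * math.log10(value)

# Luminosities spanning four orders of magnitude (solar units):
for L in [0.01, 0.1, 1, 10, 100]:
    print(f"L = {L:>6}  ->  axis position {axis_position(L):+.1f}")
# Each factor-of-10 jump moves the point by the same distance (one decade),
# whereas on a linear axis the first four points would be indistinguishable.
```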

We show that the linearizations of all exact solutions of classical chiral gravity around the AdS_3 vacuum have positive energy. Nonchiral and negative-energy solutions of the linearized equations are infrared divergent at second order, and so are removed from the spectrum. In other words, chirality is confined and the equations of motion have linearization instabilities. We prove that the only stationary, axially symmetric solutions of chiral gravity are BTZ black holes, which have positive energy. It is further shown that classical log gravity--the theory with logarithmically relaxed boundary conditions--has finite asymptotic symmetry generators but is not chiral and hence may be dual at the quantum level to a logarithmic conformal field theory (CFT). Moreover, we show that log gravity contains chiral gravity within it as a decoupled charge superselection sector. We formally evaluate the Euclidean sum over geometries of chiral gravity and show that it gives precisely the holomorphic extremal CFT partition function. The modular invariance and integrality of the expansion coefficients of this partition function are consistent with the existence of an exact quantum theory of chiral gravity. We argue that the problem of quantizing chiral gravity is the holographic dual of the problem of constructing an extremal CFT, while quantizing log gravity is dual to the problem of constructing a logarithmic extremal CFT.

Neutron and density logs acquired in boreholes at Yucca Mountain, Nevada, are used to determine porosity and water content as a function of depth. Computation of porosity requires an estimate of grain density, which is provided by core data or mineralogical data, or is inferred from rock type where neither core nor mineralogical data are available. The porosity estimate is merged with mineralogical data acquired by X-ray diffraction to compute the volumetric fractions of major mineral groups. The resulting depth-based portrayal of bulk rock composition is equivalent to a whole-rock analysis of mineralogy and porosity. Water content is computed from epithermal and thermal neutron logs. In the unsaturated zone, the density log is required along with a neutron log. Water content can also be computed from dielectric logs, which were acquired in only a fraction of the boreholes, whereas neutron logs were acquired in all boreholes. Mineralogical data are used to compute a structural (or bound) water estimate, which is subtracted from the total water estimate from the neutron-density combination. Structural water can be subtracted only from intervals where mineralogical analyses are available; otherwise only total water can be reported. The algorithms and procedures are applied to logs acquired during 1979 to 1984 at Yucca Mountain, and examples illustrate the results. Comparison between computed porosity and core measurements shows systematic differences ranging from 0.005 to 0.04. These values are consistent with a sensitivity analysis using uncertainty parameters for good logging conditions. Water content from core measurements is available in only one borehole, yielding a difference between computed and core-based water content of 0.006.
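
The porosity computation from a density log is the standard matrix-fluid mixing relation. A minimal sketch, with an assumed quartz grain density of 2.65 g/cc standing in for the core- or mineralogy-derived value the text describes:

```python
# Standard density-log porosity:
#   phi = (rho_grain - rho_bulk) / (rho_grain - rho_fluid).
# The grain density (2.65 g/cc, quartz) and fluid density (1.00 g/cc,
# water) are illustrative defaults, not values from the study.
def density_porosity(rho_bulk, rho_grain=2.65, rho_fluid=1.00):
    return (rho_grain - rho_bulk) / (rho_grain - rho_fluid)

print(round(density_porosity(2.20), 3))  # 0.273 for a 2.20 g/cc bulk reading
```

The sensitivity to grain density is why the abstract stresses where that estimate comes from: with the same 2.20 g/cc bulk reading, changing the assumed grain density shifts the computed porosity directly.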

The conventional and most accepted method of measuring the lytic activity of a phage against its bacterial host is the plaque assay. This method is laborious, time-consuming and expensive, especially in high-throughput analyses where multiple phage-bacterial interactions must be monitored simultaneously. It can also vary considerably with the experimenter and with the growth and plating conditions. Alternatively, the lytic activity can be measured indirectly by following the decrease in optical density of the bacterial cultures owing to lysis. Here we describe an automated, high-throughput, indirect liquid lysis assay to evaluate phage growth using the OmniLog™ system. The OmniLog™ system uses redox chemistry, employing cell respiration as a universal reporter. During active growth of bacteria, cellular respiration reduces a tetrazolium dye and produces a color change that is measured in an automated fashion. On the other hand, successful phage infection and subsequent growth of the phage in its host bacterium result in reduced bacterial growth and respiration and a concomitant reduction in color. Here we show that microtiter plate wells inoculated with Bacillus anthracis and phage show decreased or no growth, compared with wells containing bacteria only or phage-resistant bacteria plus phage. We also show differences in the kinetics of bacterial growth and the timing of appearance of phage-resistant bacteria in the presence of individual phages or a cocktail of B. anthracis-specific phages. The results of these experiments indicate that the OmniLog™ system can be used reliably for indirectly measuring phage growth in high-throughput host range and phage-antibiotic combination studies. PMID:23275867

This report summarizes the results of ABLE's design study of the FORTE deployable log-periodic antenna. The resulting Baseline Design of the antenna is the basis for ABLE's proposal for Phase II of this program. ABLE's approach to meeting the requirements is to use a coilable ABLE mast as the deployable structural "backbone" of the antenna and to use deployable tubes for the log-periodic dipole elements of the antenna. This general approach was adopted at the outset of the Phase I Design Study. The remainder of the study was devoted to detailed design and analysis to properly size these types of mast and antenna elements and to design their deployment mechanisms. Demonstration models of the mast and antenna element deployer were fabricated as part of the Phase I study. The study showed that ABLE's design approach is feasible and can meet all the specified design requirements except the mass limit of 13.5 kg. Results of the design and analysis studies are summarized in this report. The mast and dipole element deployer are to be demonstrated to LANL personnel at the conclusion of this Phase I study.

Wildfire and large-scale forest harvesting are the two major disturbances in the Russian boreal forests. Non-recovered logged sites total about a million hectares in Siberia. Logged sites are characterized by higher fire hazard than forest sites due to the presence of generally untreated logging slash (i.e., available fuel), which dries out much more rapidly than understory fuels. Moreover, most logging sites can be easily accessed by the local population, which increases the risk of fire ignition. Fire impacts on the overstory trees, subcanopy woody layer, and ground vegetation biomass were estimated on 14 logged and unlogged comparison sites in the Lower Angara Region in 2009-2010 as part of the NASA-funded NEESPI project, The Influence of Changing Forestry Practices on the Effects of Wildfire and on Interactions Between Fire and Changing Climate in Central Siberia. Based on calculated fuel consumption, we estimated carbon emission from fires on both logged and unlogged burned sites. Carbon emission from fires on logged sites appeared to be twice that on unlogged sites. Soil respiration decreased on both site types after fires. This reduction may partially offset fire-produced carbon emissions. Carbon emissions from fire and post-fire ecosystem damage on logged sites are expected to increase under changing climate conditions and as a result of anticipated increases in future forest harvesting in Siberia.

Log-periodic oscillations have been found to decorate the usual power-law behavior found to describe the approach to a critical point, when the continuous scale-invariance symmetry is partially broken into a discrete scale-invariance symmetry. For Ising or Potts spins with ferromagnetic interactions on hierarchical systems, the relative magnitude of the log-periodic corrections is usually very small, of order 10^-5. In growth processes [diffusion-limited aggregation (DLA)], rupture, earthquakes, and financial crashes, log-periodic oscillations with amplitudes of the order of 10% have been reported. We suggest a "technical" explanation for this four-order-of-magnitude difference based on the property of the "regular function" g(x) embodying the effect of the microscopic degrees of freedom summed over in a renormalization group (RG) approach F(x) = g(x) + μ^(-1) F(γx) of an observable F as a function of a control parameter x. For systems for which the RG equation has not been derived, the previous equation can be understood as a Jackson q-integral, which is the natural tool for describing discrete scale invariance. We classify the "Weierstrass-type" solutions of the RG into two classes characterized by the amplitudes A_n of the power-law series expansion. These two classes are separated by a novel "critical" point. Growth processes (DLA), rupture, earthquakes, and financial crashes thus seem to be characterized by oscillatory or bounded regular microscopic functions that lead to a slow power-law decay of A_n, giving strong log-periodic amplitudes. If, in addition, the phases of A_n are ergodic and mixing, the observable presents self-affine nondifferentiable properties. In contrast, the regular function of statistical physics models with "ferromagnetic"-type interactions at equilibrium involves unbounded logarithms of polynomials of the control variable that lead to a fast exponential decay of A_n, giving weak log-periodic amplitudes and smoothed observables. PMID
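
The RG recursion F(x) = g(x) + μ^(-1) F(γx) can be unrolled into a truncated Weierstrass-type sum, F(x) = Σ_n μ^(-n) g(γ^n x). The sketch below checks this functional equation numerically for an illustrative bounded g; the choices g = cos and the values of μ and γ are assumptions for demonstration, not taken from the paper.

```python
import math

# Truncated Weierstrass-type solution of F(x) = g(x) + F(gamma*x)/mu,
# i.e. F(x) = sum over n of g(gamma**n * x) / mu**n. The bounded
# choice g = cos and the values mu = 2, gamma = 3 are illustrative.
def F(x, g, mu=2.0, gamma=3.0, terms=200):
    return sum(g(gamma**n * x) / mu**n for n in range(terms))

# The truncated sum satisfies the RG equation up to a tiny tail term:
lhs = F(0.5, math.cos)
rhs = math.cos(0.5) + F(1.5, math.cos) / 2.0
print(abs(lhs - rhs) < 1e-8)  # True
```

Because F(γx) = μ(F(x) − g(x)), rescaling x by γ reproduces the function up to the factor μ, which is exactly the discrete scale invariance that produces oscillations periodic in log x.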

At the Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) for the International Space Station (ISS), each flight controller maintains detailed logs of activities and communications at their console position. These logs are critical for accurately controlling flight in real time as well as providing a historical record and troubleshooting tool. This paper describes logging methods and electronic formats used at the POIC and provides food for thought on their strengths and limitations, plus proposes some innovative extensions. It also describes an inexpensive PC-based scheme for capturing and/or transcribing audio clips from communications consoles. Flight control activity (e.g., interpreting computer displays, entering data/issuing electronic commands, and communicating with others) can become extremely intense. It's essential to document it well, but the effort to do so may conflict with actual activity. This can be more than just annoying, as what's in the logs (or, just as importantly, not in them) often feeds back directly into the quality of future operations, whether short-term or long-term. In earlier programs, such as Spacelab, log keeping was done on paper, often using position-specific shorthand, and any other reader was at the mercy of the writer's penmanship. Today, user-friendly software solves the legibility problem and can automate date/time entry, but some content may take longer to finish due to individual typing speed and less use of symbols. File layout can be used to great advantage in making types of information easy to find, and creating searchable master logs for a given position is very easy and a real lifesaver in reconstructing events or researching a given topic. We'll examine log formats from several console positions, and the types of information that are included and (just as importantly) excluded. We'll also look at when a summary or synopsis is effective, and when extensive detail is needed.

For an arbitrary open set \Omega\subset I^2=[0,1)^2 and an arbitrary function f\in L\log^+L\log^+\log^+L(I^2) such that f=0 on \Omega, the double Fourier series of f with respect to the trigonometric system \Psi=\mathscr{E} and the Walsh-Paley system \Psi=W is shown to converge to zero (over rectangles) almost everywhere on \Omega. Thus, it is proved that generalized localization almost everywhere holds on arbitrary open subsets of the square I^2 for the double trigonometric Fourier series and the Walsh-Fourier series of functions in the class L\log^+L\log^+\log^+L (in the case of summation over rectangles). It is also established that such localization breaks down on arbitrary sets that are not dense in I^2, in the classes \Phi_\Psi(L)(I^2) for the orthonormal system \Psi=\mathscr{E} and an arbitrary function such that \Phi_{\mathscr{E}}(u)=o(u\log^+\log^+u) as u\to\infty, or for \Phi_W(u)=u(\log^+\log^+u)^{1-\varepsilon}, 0

Field investigations and a literature review were conducted to determine whether existing well-logging techniques are suitable for measuring ²²⁶Ra at remedial action sites. These methods include passive gamma-ray measurement techniques using NaI(Tl) and, occasionally, intrinsic germanium (IG) detectors. Parameters that must be considered when logging boreholes at remedial action sites include: (1) casing material and thickness, (2) water in the borehole, (3) borehole diameter, (4) disequilibrium between uranium and its daughters when using scintillation detectors, and (5) spatial distribution of the tailings material. Information from the uranium exploration industry demonstrates that borehole logging is a better method for estimating radionuclide concentrations in subsurface soils than core and drill-cutting analysis. Field measurements using NaI(Tl) and IG detectors at Edgemont, South Dakota, have shown that NaI(Tl) detectors log boreholes faster than IGs. However, if NaI(Tl) detectors are used, additional time is required after logging to obtain representative samples of any anomalies found during logging, conform those samples to a constant geometry, and then count the samples using IG detectors to determine whether the materials are tailings. 16 references, 13 figures.

An induced polarization logging tool is described for measuring parameters of a formation surrounding a borehole. The logging tool consists of: a non-conductive logging sonde; a plurality of electrodes disposed on the sonde, the electrodes including at least a survey current electrode and guard electrodes disposed on opposite sides of the survey current electrode, a non-polarizing voltage measuring electrode, a non-polarizing voltage reference electrode and a current return electrode, both the voltage reference and current return electrodes being located a greater distance from the survey current electrode than the guard electrodes; means connected to the survey current electrode and the guard electrodes for generating a signal representative of the potential difference in the formation between the survey current electrode and the guard electrodes; first control means directly coupled to the survey current electrode, the first control means controlling the current flow to the survey current electrode in response to the potential difference signal; a second control means directly coupled to the guard electrodes to control the current flow to the guard electrodes in response to the potential difference signal; a source of alternating current located at the surface, one end of the source being coupled to the two control means and the other to the current return electrode, the source supplying alternating current at various discrete frequencies between substantially 0.01 and 100 Hz; measurement means directly coupled to the voltage measurement and survey current electrodes to measure the amplitude and phase of the voltage induced in the formation and the amplitude and phase of the current flow to the survey electrode; and transmission means for transmitting the measurements to the surface.

Gas hydrates are crystalline substances composed of water and gas, in which a solid water lattice accommodates gas molecules in a cage-like structure. Gas hydrates are globally widespread in permafrost regions and beneath the sea in sediment of outer continental margins. While methane, propane, and other gases can be included in the clathrate structure, methane hydrates appear to be the most common in nature. The amount of methane sequestered in gas hydrates is probably enormous, but estimates are speculative and range over three orders of magnitude from about 100,000 to 270,000,000 trillion cubic feet. The amount of gas in the hydrate reservoirs of the world greatly exceeds the volume of known conventional gas reserves. Gas hydrates also represent a significant drilling and production hazard. A fundamental question linking gas hydrate resource and hazard issues is: What is the volume of gas hydrates and included gas within a given gas hydrate occurrence? Most published gas hydrate resource estimates have, of necessity, been made by broad extrapolation of only general knowledge of local geologic conditions. Gas volumes that may be attributed to gas hydrates depend on a number of reservoir parameters, including the areal extent of the gas-hydrate occurrence, reservoir thickness, hydrate number, reservoir porosity, and the degree of gas-hydrate saturation. Two of the most difficult reservoir parameters to determine are porosity and degree of gas-hydrate saturation. Well logs often serve as a source of porosity and hydrocarbon saturation data; however, well-log calculations within gas-hydrate-bearing intervals are subject to error. The primary reason for this difficulty is the lack of quantitative laboratory and field studies. The primary purpose of this paper is to review the response of well logs to the presence of gas hydrates.

Developing a gamma-ray log profile of an outcrop with a hand-held scintillometer has many applications to subsurface petroleum geology. The outcrop gamma-ray log provides a readily understandable bridge between what is observed in outcrop and what is to be interpreted on well logs and seismic records. Several examples are presented in this paper that demonstrate major applications. An outcrop from the Cretaceous Mesaverde Group in Colorado provides an excellent example of the use of outcrop gamma-ray logs to better visualize spatial variability of depositional settings for improved well-log correlations. Outcrops from the Cretaceous Almond Formation, Niobrara Formation, and Graneros Shale in Colorado serve as examples of outcrop gamma-ray logging used to correlate outcrops with their subsurface equivalents for improved lithologic and stratigraphic interpretation of well logs. Outcrops of the Cretaceous Sharon Springs Member of the Pierre Shale in Colorado and the Eocene Green River Formation in Wyoming provide examples of the application of outcrop gamma-ray logging to identify and characterize organic-rich shales in outcrops and on well logs. Outcrops of the Pennsylvanian Jackfork Formation in Arkansas demonstrate the use of outcrop logging to yield improved interpretation of reservoir quality on well logs and for one- and two-dimensional seismic modeling. An outcrop of Precambrian and Cambro-Ordovician rocks from Algeria provides an example of outcrop logging to recognize unconformities and other major surfaces on well logs. An outcrop of the Niobrara Formation in Colorado is used as an example for improved understanding of horizontal gamma-ray log response. The example logs presented were all derived with a hand-held scintillometer. This technique is simple, quick, and relatively inexpensive, and so is recommended for any outcrop work that is intended to be applied to subsurface well logs or seismic interpretation.

A German log rodmeter of the pitot static type was calibrated in Langley tank no. 1 at speeds up to 34 knots and angles of yaw from 0 deg to plus or minus 10 3/4 degrees. The dynamic head approximated the theoretical head at 0 degrees yaw but decreased as the yaw was increased. The static head was negative and in general became more negative with increasing speed and yaw. Cavitation occurred at speeds above 31 knots at 0 deg yaw and 21 knots at 10 3/4 deg yaw.

Motivation: With the explosion of biomedical literature and the evolution of online and open access, scientists are reading more articles from a wider variety of journals. Thus, the list of core journals relevant to their research may be less obvious and may often change over time. To help researchers quickly identify appropriate journals to read and publish in, we developed a web application for finding related journals based on the analysis of PubMed log data. Availability: http://www.ncbi.nlm.nih.gov/IRET/Journals Contact: luzh@ncbi.nlm.nih.gov Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19734155

In this work, we present VAFLE, an interactive network security visualization prototype for the analysis of firewall log events. Keeping it simple yet effective for analysts, we provide multiple coordinated interactive visualizations augmented with clustering capabilities customized to support anomaly detection and cyber situation awareness. We evaluate the usefulness of the prototype in a use case with network traffic datasets from previous VAST Challenges, illustrating its effectiveness at promoting fast and well-informed decisions. We explain how a security analyst may spot suspicious traffic using VAFLE. We further assess its usefulness through a qualitative evaluation involving network security experts, whose feedback is reported and discussed.

This patent describes a method for acoustically logging an earth formation surrounding a borehole which contains a liquid, where the approximate shear wave velocity v of the formation is known. The method consists of: vibrating a dipole source in the liquid to generate in the liquid a guided wave whose frequencies include a critical frequency f given by f = v/2a, where a is the borehole radius, so that the fastest component of the guided wave has velocity substantially equal to v; and detecting the arrival of the fastest component of the guided wave at at least one location in the liquid spaced longitudinally along the borehole from the dipole source.

We investigate the derivation of the K - log P relation for RR Lyrae stars based upon the simplest possible theoretical pulsation equation. Making use of recent theoretical advances due to Marconi et al. (2015), this leads to a direct determination of M_V for individual stars, but at the same time we rediscover a simple method using the reddening-free and metallicity-independent combination W(B,V) = V - 3.06(B-V) to do the same. We discuss the relation between the two approaches and compare with other determinations in the literature. A consistent distance of the LMC is derived directly from measurements of its RR Lyrae stars.
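
The reddening-free property of W(B,V) follows because interstellar extinction shifts V by A_V while reddening the color by E(B-V), with A_V ≈ 3.06 E(B-V). A quick numerical check with made-up magnitudes:

```python
# Reddening-free Wesenheit-type combination from the text:
#   W(B,V) = V - 3.06*(B - V).
# Extinction adds A_V to V and E(B-V) to the color, with
# A_V ~ 3.06*E(B-V), so the two shifts cancel in W.
def W(B, V):
    return V - 3.06 * (B - V)

# Made-up star, then the same star reddened by E(B-V) = 0.1
# (A_V = 0.306, A_B = A_V + E(B-V) = 0.406):
B0, V0 = 19.8, 19.5
reddened = W(B0 + 0.406, V0 + 0.306)
print(round(W(B0, V0), 3), round(reddened, 3))  # both 18.582
```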

We have developed a series of Electronic Student Encounter Log (ESEL) programs intended to introduce medical students to promising medical informatics concepts. We attempted to expand ESEL's scope from ambulatory settings to all hospital venues and to track progress toward educational goals. Students were confused and frustrated by a previously fast interface and delayed feedback. Numerous scaling problems emerged. Our attempt to address these problems in an extensive revision, developed for the latest affordable hardware and operating system, failed due to new data-corrupting crashes. Risks of scaling up and other familiar software development lessons are reinforced. PMID:18998815

A logarithmic amplifier circuit provides pole-zero compensation for improved stability and response time over 6-8 decades of input signal frequency. The amplifier circuit includes a first operational amplifier with a first feedback loop which includes a second, inverting operational amplifier in a second feedback loop. The compensated output signal is provided by the second operational amplifier, with the log elements, i.e., resistors, and the compensating capacitors in each of the feedback loops having equal values so that each break point or pole is offset by a compensating break point or zero.
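
The pole-zero cancellation described here can be sanity-checked numerically. The sketch below uses a single-pole model of our own devising (not the patented circuit): placing a compensating zero at the same corner frequency as a pole flattens the magnitude response.

```python
import numpy as np

# Minimal single-pole model (our own illustration, not the patented
# circuit): an uncompensated stage rolls off past its pole; a zero at the
# same corner frequency cancels the pole, flattening the response -- the
# "each break point or pole is offset by a compensating zero" idea.
def gain_db(freq_hz, pole_hz, zero_hz=None):
    s = 2j * np.pi * freq_hz
    h = 1.0 / (1.0 + s / (2 * np.pi * pole_hz))    # single pole
    if zero_hz is not None:
        h = h * (1.0 + s / (2 * np.pi * zero_hz))  # compensating zero
    return 20.0 * np.log10(np.abs(h))
```

A decade above an uncompensated 100 Hz pole the gain is down roughly 20 dB; with a matched zero at 100 Hz the response is flat.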

We derive and test a new nonlinear filter that implements Bayes' rule using an ODE rather than with a pointwise multiplication of two functions. This avoids one of the fundamental and well known problems in particle filters, namely "particle collapse" as a result of Bayes' rule. We use a log-homotopy to construct this ODE. Our new algorithm is vastly superior to the classic particle filter, and we do not use any proposal density supplied by an EKF or UKF or other outside source. This paper was written for normal engineers, who do not have homotopy for breakfast.

The volume and economic value of fire-killed and fire-damaged timber can decrease dramatically due to the impacts of deterioration agents after fire. Insect and stain-fungus damage causes significant reductions in the economic value of timber, especially in the first year after fire. Fire-killed and fire-damaged trees should be extracted quickly to recover their economic value. Logging cost is the main factor that affects the net value of the trees after fire; therefore, the logging system should be carefully planned to maximize the net value recovered from fire-killed and fire-damaged trees. In this study, the cost efficiency of ground-based logging systems in extracting fire-killed and fire-damaged trees was analyzed in a partially burned Brutian Pine (Pinus brutia) forest located in Kahramanmaras. In addition, to select the optimum skidding distance with minimum logging cost, the logging systems were examined with respect to various skidding distances and associated forest road lengths.

Forests in Southeast Asia are rapidly being logged and converted to oil palm. These changes in land-use are known to affect species diversity but consequences for the functional diversity of species assemblages are poorly understood. Environmental filtering of species with similar traits could lead to disproportionate reductions in trait diversity in degraded habitats. Here, we focus on dung beetles, which play a key role in ecosystem processes such as nutrient recycling and seed dispersal. We use morphological and behavioural traits to calculate a variety of functional diversity measures across a gradient of disturbance from primary forest through intensively logged forest to oil palm. Logging caused significant shifts in community composition but had very little effect on functional diversity, even after a repeated timber harvest. These data provide evidence for functional redundancy of dung beetles within primary forest and emphasize the high value of logged forests as refugia for biodiversity. In contrast, conversion of forest to oil palm greatly reduced taxonomic and functional diversity, with a marked decrease in the abundance of nocturnal foragers, a higher proportion of species with small body sizes and the complete loss of telecoprid species (dung-rollers), all indicating a decrease in the functional capacity of dung beetles within plantations. These changes also highlight the vulnerability of community functioning within logged forests in the event of further environmental degradation. PMID:25821399

A new computationally efficient framework for vehicle tracking on a mobile platform is proposed. The principal component of the framework is the log-polar transformation applied to video frames captured from a standard uniformly sampled camera. The log-polar transformation provides two major benefits for real-time vehicle tracking from a mobile vehicle platform moving along a single- or multi-lane road. First, it significantly reduces the amount of data to be processed, since it collapses the original Cartesian video frames into log-polar images of much smaller dimensions. Second, the log-polar transformation can mitigate perspective distortion due to its scale-invariance property. This second aspect is of interest for vehicle tracking because the target vehicle's appearance is preserved at all distances from the observer (camera). This works, however, only if the center of the log-polar transformation coincides with the vanishing point of the perspective view. Therefore, a road-following algorithm is proposed to keep the center of the log-polar transform on the vanishing point at every video frame, compensating for the carrying vehicle's movements. Since the algorithm is intended for mobile embedded devices, it is developed to achieve both mathematical simplicity and algorithmic efficiency while avoiding computationally expensive mathematical functions. The use of trigonometric and exponential functions is minimized compared to the log-Hough transform traditionally used in log-polar space. The new algorithm focuses on straight radial line fragments, thus shifting its mathematical engine to the domain of linear equations.
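
As a rough sketch of the transformation itself (a minimal nearest-neighbour version of our own, not the paper's implementation), a Cartesian frame can be resampled onto a log-polar grid about a chosen center:

```python
import numpy as np

def to_log_polar(img, center, n_rho=64, n_theta=128):
    """Map a grayscale image to log-polar coordinates about `center`.

    Radial scaling about `center` becomes a shift along the rho axis,
    which is the scale-invariance property the abstract relies on.
    """
    h, w = img.shape
    cx, cy = center
    r_max = np.hypot(max(cx, w - cx), max(cy, h - cy))
    # logarithmically spaced radii and uniformly spaced angles
    rho = np.exp(np.linspace(0.0, np.log(r_max), n_rho))
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    # nearest-neighbour sampling back into the Cartesian frame
    xs = np.clip((cx + rho[:, None] * np.cos(theta)).round().astype(int), 0, w - 1)
    ys = np.clip((cy + rho[:, None] * np.sin(theta)).round().astype(int), 0, h - 1)
    return img[ys, xs]
```

Centering the grid on the vanishing point, as the proposed road-following algorithm does, makes a receding vehicle's scale change appear as a simple shift along the rho axis.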

Various conventional geophysical borehole measurements were made in conjunction with measurements using a recently designed, low-frequency, acoustic-waveform probe and slow velocity flowmeter for characterization of a fractured mafic intrusion in southern Ontario, Canada. Conventional geophysical measurements included temperature, caliper, gamma, acoustic, single-point resistance, and acoustic televiewer logs. Hole stability problems prevented the use of neutron and gamma-gamma logs, because these logs require that a radioactive source be lowered into the borehole. Measurements were made in three boreholes as much as 850 m deep and penetrating a few tens of meters into granitic basement. All rocks within the mafic intrusion were characterized by minimal gamma radiation and acoustic velocities of about 6.9 km/sec. The uniformity of the acoustic velocities and the character of acoustic-waveform logs made with a conventional high-frequency logging source correlated with the density of fractures evident on televiewer logs. Sample intervals of high-frequency waveform logs were transformed into interpretations of effective fracture opening using a recent model for acoustic attenuation in fractured rocks. The new low-frequency sparker source did not perform as expected at depths below 250 m because of previously unsuspected problems with source firing under large hydrostatic heads. A new heat-pulse, slow velocity flowmeter was used to delineate in detail the flow regime indicated in a general way by temperature logs. The flowmeter measurements indicated that water was entering 2 of the boreholes at numerous fractures above a depth of 200 m, with flow in at least 2 of the boreholes exiting through large isolated fractures below a depth of 400 m. (Author's abstract)

Introduction: Dislocation of the metacarpophalangeal joint (MCPJ) of the thumb is rare in children, and delayed presentation of this injury is even more uncommon in the literature. We report two cases, both children, who presented to fracture clinic with a dislocated thumb over one week after the initial injury. In each case closed reduction was attempted but failed, and open reduction was necessary. Case Report: Case Presentation 1: A 4-year-old right-hand dominant girl sustained a hyper-extension injury to her right thumb while on holiday abroad. She was told she had “sprained” her thumb. On review in fracture clinic 10 days later, the MCPJ of her thumb remained swollen and bruised. Radiographs showed a dorsally dislocated MCPJ of the right thumb. Case Presentation 2: A four-year-old right-hand dominant boy presented to fracture clinic after being referred from A&E with a left ‘thumb injury’; his thumb had accidentally been jammed in a door 1 week previously. Radiographs were reviewed and repeated, confirming an MCPJ dislocation. Conclusion: Dislocation of the MCPJ of the thumb is extremely uncommon in children and therefore the diagnosis can easily be missed. Two unusual cases of dislocated MCPJ of the thumb in children that presented late, because both radiological and clinical features had been missed, are described. Closed reduction should always be attempted first, but it should be recognised that conversion to an open reduction may be needed, particularly if there is a delay in presentation. There are various surgical options for open reduction, including volar and dorsal approaches and arthroscopic procedures. The optimal method is controversial. We describe a successful open reduction using a dorsal approach. In both cases the volar plate was found to be interposed within the joint, blocking reduction. At follow-up the patients had regained a full range of movement and normal power and grip strength. PMID:27299055

Presents the use of a group log in which members analyze the content and process of each session using a suggested format. The log promotes dialogue between the leader and each group member and involves members more fully in the group process. Feedback indicates the log is valuable. (JAC)

In this article, the author discusses one of the most functional forms of writing to learn, the two-column learning logs. Two-column learning logs are based on the premise that collecting information and processing information are two very different aspects of learning. Two-column logs allow students to connect the facts and theories of science to…

... by station licensees shall be retained by them for a period of 2 years. However, logs involving... inspection of logs pursuant to § 73.1226, availability to FCC of station logs and records. (2) Reproduction... any application; or placed in the station's local public inspection file as part of an application;...

Use of student learning logs is recommended as a means for both students and teacher to assess second-language learning. The approach encourages learners to analyze their learning difficulties and plan for overcoming them. Incorporated into portfolios, logs can be used to analyze progress. Sample log sheet and chart used as a framework for…

The ability to log tutorial interactions in comprehensive, longitudinal, fine-grained detail offers great potential for educational data mining--but what data is logged, and how, can facilitate or impede the realization of that potential. We propose guidelines gleaned over 15 years of logging, exploring, and analyzing millions of events from…

The biological and chemical effects of three types of log storage on water quality were investigated. Three flow-through log ponds, two wet deck operations, and five log rafting areas were studied. Both biological and chemical aspects of stream quality can be adversely affected b...

Nitrogen reduction by ferrous iron has been suggested as an important mechanism in the formation of ammonia on pre-biotic Earth. This paper examines the effects of adsorption of ferrous iron onto a goethite (α-FeOOH) substrate on the thermodynamic driving force and rate of a ferrous iron-mediated reduction of N2 as compared with the homogeneous aqueous reaction. Utilizing density functional theory and Marcus theory of proton-coupled electron transfer reactions, the following two reactions were studied: Fe²⁺(aq) + N₂(aq) + H₂O(aq) → N₂H• + FeOH²⁺(aq) and ≡Fe²⁺(ads) + N₂(aq) + 2H₂O(aq) → N₂H• + α-FeOOH(s) + 2H⁺(aq). Although the rates of both reactions were calculated to be approximately zero at 298 K, the model results suggest that adsorption alters the thermodynamic driving force for the reaction but has no other effect on the direct electron transfer kinetics. Given that simply altering the thermodynamic driving force will not reduce dinitrogen, we can make mechanistic connections between possible prebiotic pathways and biological N2 reduction. The key to reduction in both cases is N2 adsorption to multiple transition metal centers with competitive H2 production.

The second phase of the Hot Dry Rock (HDR) Geothermal Development Program at Fenton Hill, New Mexico, consists of two boreholes, directionally drilled in a northeast direction, inclined at an angle of 35°, with a vertical separation of 365 m (1200 ft). The two boreholes will be connected by 12 to 15 vertical parallel fractures to make a geothermal reservoir calculated to produce 20 MW(e) for 20 years. Accurate temperature measurements, borehole caliper logs, and directional surveys are required for the successful development and operation of this man-made system. Obtaining these data is extremely difficult because of the bottom-hole static temperature of 335 °C (635 °F) at a depth of 4660 m (15,289 ft), the 35° deviation, the abrasive formation, and the presence of sticky drilling residue products. The efforts during July, August, and September 1980 to obtain these data are presented as a case history. The temperature logs and borehole directional survey produced realistic results, but the borehole caliper measurements were inconsistent and unreliable due to the developmental stage of the caliper tools.

Schlumberger is an international oilfield service company with nearly 80,000 employees of 140 nationalities, operating globally in 80 countries. As a market leader in oilfield services, Schlumberger has developed a suite of technologies to assess the downhole environment, including, among others, electromagnetic, seismic, chemical, and nuclear measurements. In the past 10 years there has been a radical shift in the oilfield service industry from traditional wireline measurements to logging-while-drilling (LWD) analysis. For LWD measurements, the analysis is performed and the instruments are operated while the borehole is being drilled. The high temperature, high shock, and extreme vibration environment of LWD imposes stringent requirements for the devices used in these applications. This has a significant impact on the design of the components and subcomponents of a downhole tool. Another significant change in the past few years for nuclear-based oilwell logging tools is the desire to replace the sealed radioisotope sources with active, electronic ones. These active radiation sources provide great benefits compared to the isotopic sources, ranging from handling and safety to nonproliferation and well contamination issues. The challenge is to develop electronic generators that have a high degree of reliability for the entire lifetime of a downhole tool. LWD tool testing and operations are highlighted with particular emphasis on electronic radiation sources and nuclear detectors for the downhole environment.

Handoffs are a critical component of coordinated patient care; however, poor handoffs have been associated with near misses and adverse events. To address this, national agencies have recommended standardizing handoffs, for example through the use of handoff documentation tools. Recent research suggests that handoff tools, typically designed for physicians, are often used by non-physician providers as information sources. In this study, we investigated patterns of edits of an electronic handoff tool in a large teaching hospital through examination of its usage log data. Qualitative interviews with clinicians were used to triangulate log data findings. The analysis showed that despite its primary focus on facilitating transitions of care, information in the handoff documentation tool was updated throughout the day. Interviews with residents confirmed that they purposefully updated information to make it available for other members of their patient care teams. This further reiterates the view of electronic handoff tools as facilitators of team communication and coordination. However, the study also showed considerable variability in the frequency of updates between different units and across different patients. Further research is required to understand what factors drive such diversity in the use of the electronic handoff tool and whether this diversity can be used to make inferences about patients’ conditions. PMID:25954381

It is traditionally assumed that Zipf's law implies the power-law growth of the number of different elements with the total number of elements in a system—the so-called Heaps' law. We show that a careful definition of Zipf's law leads to the violation of Heaps' law in random systems, with growth curves that have a convex shape in log-log scale. These curves fulfill universal data collapse that only depends on the value of Zipf's exponent. We observe that real books behave very much in the same way as random systems, despite the presence of burstiness in word occurrence. We advance an explanation for this unexpected correspondence.
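
A quick simulation of the random-system case described here is easy to set up (the parameters below are arbitrary choices of ours): sample tokens from a Zipf-like distribution over a finite set of types and record how the number of distinct types grows with the total number of tokens.

```python
import random

# Illustrative simulation (not the paper's code): draw word tokens from a
# Zipf-like rank-frequency distribution and track vocabulary growth N(t).
# For a finite type inventory the growth curve bends over (is convex in
# log-log scale) rather than following a pure Heaps power law.
def vocabulary_growth(n_tokens, n_types=10_000, alpha=1.5, seed=0):
    rng = random.Random(seed)
    weights = [1.0 / (r ** alpha) for r in range(1, n_types + 1)]
    seen, curve = set(), []
    for tok in rng.choices(range(n_types), weights=weights, k=n_tokens):
        seen.add(tok)
        curve.append(len(seen))  # distinct types after each token
    return curve
```

Plotting `curve` against token count on log-log axes shows the decelerating, convex growth the abstract describes.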

New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in a formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil-bearing formation. Repetitively frequency-sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction-coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which results in the ability in general to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores. The Dual Excitation Acoustic Paramagnetic Logging Tool employing two acoustic sources is also described.

This patent describes a logging tool for use in a borehole traversing an earth formation, the tool having a multiple dipole source connected to the tool in a fixed position and orientation to generate shear wave radiation in multiple directions in the formation upon excitation. It comprises: generating shear wave radiation in a first direction; receiving at each detector at least a portion of the generated shear wave in a second and a third direction; generating shear wave radiation in a fourth direction; receiving at each detector at least a portion of the generated shear wave in a fifth and a sixth direction; determining, for each detector, a composite dipole waveform for azimuthal directions based on at least a portion of the received waveforms in the second, third, fifth and sixth directions received at each detector; determining for each azimuthal direction, a shear wave velocity based on the composite dipole waveforms determined for each azimuthal direction; and obtaining the magnitude and direction of formation anisotropy, relative to the position of the logging tool.

The Free-Form Deformation (FFD) algorithm is a widely used method for non-rigid registration. Modifications have previously been proposed to ensure topology preservation and invertibility within this framework. However, in practice, none of these yield the inverse transformation itself, and one loses the parsimonious B-spline parametrisation. We present a novel log-Euclidean FFD approach in which a spline model of a stationary velocity field is exponentiated to yield a diffeomorphism, using an efficient scaling-and-squaring algorithm. The log-Euclidean framework allows easy computation of a consistent inverse transformation, and offers advantages in group-wise atlas building and statistical analysis. We optimise the Normalised Mutual Information plus a regularisation term based on the Jacobian determinant of the transformation, and we present a novel analytical gradient of the latter. The proposed method has been assessed against a fast FFD implementation (F3D) using simulated T1- and T2-weighted magnetic resonance brain images. The overlap measures between propagated grey matter tissue probability maps used in the simulations show similar results for both approaches; however, our new method obtains more reasonable Jacobian values, and yields inverse transformations.
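
The scaling-and-squaring step can be illustrated on a plain matrix exponential (a stand-in of ours for the spline velocity field, not the paper's registration code): scale the velocity down so it is small, exponentiate cheaply with a truncated series, then square the result repeatedly. The same construction gives a consistent inverse by exponentiating the negated velocity.

```python
import numpy as np

# Sketch of scaling-and-squaring, shown on a matrix exponential:
# exp(A) = exp(A / 2**n) ** (2**n).
def expm_scaling_squaring(A, n_squarings=8, n_terms=10):
    B = A / (2.0 ** n_squarings)          # scaling step: B is small
    E = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, n_terms):           # truncated Taylor series of exp(B)
        term = term @ B / k
        E = E + term
    for _ in range(n_squarings):          # squaring step
        E = E @ E
    return E
```

For the rotation generator A = [[0, 1], [-1, 0]] this reproduces the 1-radian rotation matrix, and exponentiating -A yields its exact inverse, mirroring how the log-Euclidean framework obtains inverse transformations.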

"Close Call Action Log Form" ("CCALF") is the name of both a computer program and a Web-based service provided by the program for creating an enhanced database of close calls (in the colloquial sense of mishaps that were avoided by small margins) assigned to the Center Operations Directorate (COD) at Johnson Space Center. CCALF provides a single facility for on-line collaborative review of close calls. Through CCALF, managers can delegate responses to employees. CCALF utilizes a pre-existing e-mail system to notify managers that there are close calls to review, but eliminates the need for the prior practices of passing multiple e-mail messages around the COD, then collecting and consolidating them into final responses: CCALF now collects comments from all responders for incorporation into reports that it generates. Also, whereas it was previously necessary to manually calculate metrics (e.g., numbers of maintenance-work orders necessitated by close calls) for inclusion in the reports, CCALF now computes the metrics, summarizes them, and displays them in graphical form. The reports and all pertinent information used to generate the reports are logged, tracked, and retained by CCALF for historical purposes.

This report focuses on Sandia National Laboratories' effort to create high-temperature logging tools for geothermal applications without the need for heat shielding. One of the mechanisms for failure in conventional downhole tools is temperature: they can survive only a limited number of hours in high-temperature environments. For the first time since the evolution of integrated circuits, components are now commercially available that are qualified to 225 °C, with many continuing to work up to 300 °C. These components are primarily based on silicon-on-insulator (SOI) technology. Sandia has developed and tested a simple data logger based on this technology that operates up to 300 °C, with a few limiting components operating only to 250 °C, without thermal protection. An actual well log to 240 °C without shielding is discussed. The first prototype high-temperature tool measures pressure and temperature using a wireline for power and communication. The tool is based around the HT83C51 microcontroller. A brief discussion of the background and status of the High Temperature Instrumentation program at Sandia, objectives, data logger development, and future project plans is given.

Dihydroorotate dehydrogenase B (DHODB) catalyzes the oxidation of dihydroorotate (DHO) to orotate and is found in the pyrimidine biosynthetic pathway. The Lactococcus lactis enzyme is a dimer of heterodimers containing FMN, FAD, and a 2Fe-2S center. Lys-D48 is found in the catalytic subunit and its side-chain adopts different positions, influenced by ligand binding. Based on crystal structures of DHODB in the presence and absence of orotate, we hypothesized that Lys-D48 has a role in facilitating electron transfer in DHODB, specifically in stabilizing negative charge in the reduced FMN isoalloxazine ring. We show that mutagenesis of Lys-D48 to an alanine, arginine, glutamine, or glutamate residue (mutants K48A, K48R, K48Q, and K48E) impairs catalytic turnover substantially (approximately 50-500-fold reduction in turnover number). Stopped-flow studies demonstrate that loss of catalytic activity is attributed to poor rates of FMN reduction by substrate. Mutation also impairs electron transfer from the 2Fe-2S center to FMN. Addition of methylamine leads to partial rescue of flavin reduction activity. Nicotinamide coenzyme oxidation and reduction at the distal FAD site is unaffected by the mutations. Formation of the spin-interacting state between the FMN semiquinone-reduced 2Fe-2S centers observed in wild-type enzyme is retained in the mutant proteins, consistent with there being little perturbation of the superexchange paths that contribute to the efficiency of electron transfer between these cofactors. Our data suggest a key charge-stabilizing role for Lys-D48 during reduction of FMN by dihydroorotate, or by electron transfer from the 2Fe-2S center, and establish a common mechanism of FMN reduction in the single FMN-containing A-type and the complex multicenter B-type DHOD enzymes. PMID:16624811

The requirements for the reporting of polychlorinated biphenyl (PCB)-related activities are found in 40 Code of Federal Regulations (CFR) 761 Subpart J, "General Records and Reports." The PCB Annual Document Log is a detailed record of the PCB waste handling activities at the facility. The facility must prepare it each year by July 1 and maintain it at the facility for at least 3 years after the facility ceases using or storing PCBs and PCB items. While submittal of the PCB Annual Document Log to the U.S. Environmental Protection Agency (EPA) is not required by regulation, EPA has verbally requested in telephone conversations that this report be submitted to them on an annual basis. The Annual Records are not required to be submitted to EPA and are not considered to be part of the Annual Document Log, but are included to provide the complete disposition history or status of all PCB activities during the year. The Annual Document Log section of this report (Section 2.0) meets the requirements of 40 CFR 761.180(a)(2), as applicable, while the Annual Records section (Section 3.0) meets the requirement of 40 CFR 761.180(a)(1).

A fuzzy analysis technique is proposed in this research for interpreting the combination of nuclear and electrical well-logging data; the nuclear logs include natural gamma ray, density, and neutron-porosity, while the electrical logs include long and short normal resistivity. The main objective of this work is to describe, characterize, and establish the lithology of the large extended basaltic areas in southern Syria. Kodana well-logging measurements have been used and interpreted for testing and applying the proposed technique. The established lithological cross section shows the distribution and identification of four kinds of basalt: hard massive basalt, hard basalt, pyroclastic basalt, and the basalt alteration product, clay. The fuzzy analysis technique is successfully applied to the Kodana well-logging data and can therefore be utilized as a powerful tool for interpreting huge well-logging datasets with the higher number of variables required for lithological estimations. PMID:26275816

The forests of southeastern Alaska remain largely intact and contain a substantial proportion of Earth's remaining old-growth temperate rainforest. Nonetheless, industrial-scale logging has occurred since the 1950s within a relatively narrow range of forest types that has never been quantified at a regional scale. We analyzed historical patterns of logging from 1954 through 2004 and compared the relative rates of change among forest types, landform associations, and biogeographic provinces. We found a consistent pattern of disproportionate logging at multiple scales, including large-tree stands and landscapes with contiguous productive old-growth forests. The highest rates of change were among landform associations and biogeographic provinces that originally contained the largest concentrations of productive old growth (i.e., timber volume >46.6 m³/ha). Although only 11.9% of productive old-growth forests have been logged region wide, large-tree stands have been reduced by at least 28.1%, karst forests by 37%, and landscapes with the highest volume of contiguous old growth by 66.5%. Within some island biogeographic provinces, loss of rare forest types may place local viability of species dependent on old growth at risk of extirpation. Examination of historical patterns of change among ecological forest types can facilitate planning for conservation of biodiversity and sustainable use of forest resources. PMID:23866037

Different methods of lithology prediction from geophysical data have been developed over the last 15 years. The geophysical logs used for predicting lithology are the conventional logs: sonic, neutron-neutron, gamma (total natural gamma) and density (backscattered gamma-gamma). Prompt gamma neutron activation analysis (PGNAA) is another established geophysical logging technique for in situ element analysis of rocks in boreholes. The work described in this paper was carried out to investigate the application of PGNAA to lithology interpretation. The data interpretation was conducted using the automatic interpretation program LogTrans, based on statistical analysis. Limited testing suggests that PGNAA logging data can be used to predict lithology. A success rate of 73% for lithology prediction was achieved from PGNAA logging data only. PGNAA can also be used in conjunction with the conventional geophysical logs to enhance the lithology prediction. PMID:16140021

Log-density gradient estimation is a fundamental statistical problem and possesses various practical applications such as clustering and measuring nongaussianity. A naive two-step approach of first estimating the density and then taking its log gradient is unreliable because an accurate density estimate does not necessarily lead to an accurate log-density gradient estimate. To cope with this problem, a method to directly estimate the log-density gradient without density estimation has been explored and demonstrated to work much better than the two-step method. The objective of this letter is to improve the performance of this direct method in multidimensional cases. Our idea is to regard the problem of log-density gradient estimation in each dimension as a task and apply regularized multitask learning to the direct log-density gradient estimator. We experimentally demonstrate the usefulness of the proposed multitask method in log-density gradient estimation and mode-seeking clustering. PMID:27171983
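
For contrast, here is a minimal sketch (not from the letter) of the naive two-step approach it argues against: fit a Gaussian kernel density estimate and take its log-gradient, which is available in closed form. The function name and bandwidth handling are our own:

```python
import numpy as np

def kde_log_density_gradient(x_query, data, bandwidth):
    """Two-step log-density gradient: Gaussian KDE, then analytic log-gradient.

    For p(x) proportional to sum_i exp(-||x - x_i||^2 / (2 h^2)),
    grad log p(x) = sum_i w_i(x) (x_i - x) / h^2 / sum_i w_i(x).
    """
    diffs = data[None, :, :] - x_query[:, None, :]          # (q, n, d)
    w = np.exp(-np.sum(diffs**2, axis=2) / (2 * bandwidth**2))  # (q, n)
    num = np.einsum('qn,qnd->qd', w, diffs) / bandwidth**2
    return num / np.sum(w, axis=1, keepdims=True)
```

The direct estimator favored by the letter avoids this intermediate density fit entirely; the sketch only illustrates the baseline being improved upon.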

Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense. PMID:23030125
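
A plug-in version of the estimator for first-order discrete processes can be sketched as follows; the helper is illustrative, and for such models the paper's log-likelihood ratio statistic corresponds, up to scaling by the sample size, to this plug-in value:

```python
import math
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of TE_{Y->X} (nats) for first-order discrete processes."""
    n = len(x) - 1
    triples  = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))          # (x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))           # (x_{t+1}, x_t)
    singles  = Counter(x[:-1])                       # x_t
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_cond_joint = c / pairs_xy[(x0, y0)]          # p(x_{t+1} | x_t, y_t)
        p_cond_self = pairs_xx[(x1, x0)] / singles[x0] # p(x_{t+1} | x_t)
        te += (c / n) * math.log(p_cond_joint / p_cond_self)
    return te
```

Under the null hypothesis of zero transfer entropy the estimate vanishes; when Y drives X, it is strictly positive.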

We demonstrate the accommodation of log-scale concentration gradients of inhibitors on a single microfluidic chip with a semi-direct dilution capability of reagents for the determination of the half-maximal inhibitory concentration, or IC50. The chip provides a unique tool for hosting a wide range of concentration gradients for studies that require an equal distribution of measuring points on a logarithmic scale. Using matrix metalloproteinase IX and three of its inhibitors, marimastat, batimastat and CP471474, we evaluated the IC50 of each inhibitor with a single experiment. The present work could be applied to the systematic study of biochemical binding and inhibition processes, particularly in the field of mechanistic enzymology and the pharmaceutical industry. PMID:21696192
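
Once inhibition has been measured on a log-spaced concentration grid, the IC50 can be read off by interpolation on the log axis. A sketch, assuming a monotonically increasing inhibition curve that crosses 50% (the helper name is ours):

```python
import numpy as np

def ic50_from_curve(conc, inhibition):
    """Interpolate the 50%-inhibition concentration on a log-concentration axis.

    Assumes `inhibition` (percent) increases monotonically with `conc`
    and crosses 50% inside the measured range.
    """
    logc = np.log10(conc)
    idx = np.searchsorted(inhibition, 50.0)   # first point at or above 50%
    lo, hi = idx - 1, idx
    frac = (50.0 - inhibition[lo]) / (inhibition[hi] - inhibition[lo])
    return 10 ** (logc[lo] + frac * (logc[hi] - logc[lo]))
```

With log-spaced measuring points, interpolation error is roughly uniform across the dynamic range, which is the benefit of the chip's equal log-scale spacing.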

This patent describes a nuclear well logging operation. A method is described for determining a parameter responsive to the condition of a borehole traversing a subsurface earth formation, comprising the steps of: cyclically irradiating the subsurface earth formation with bursts of high energy neutrons; detecting for one or more burst cycles the impingement of gamma radiation upon a first gamma radiation detector means during and between each of the bursts; determining a first count of detected impingements of primarily inelastic gamma radiation upon the first detector means; and normalizing the first count to remove the effects upon the first count of variations in the bursts of high energy neutrons, the normalized first count producing the parameter responsive to the condition of the borehole.

In a nuclear well logging operation, a method is described for indicating the presence of gas in a fluid filled zone of a subsurface earth formation, comprising the steps of: cyclically irradiating the subsurface earth formation with bursts of high energy neutrons; detecting for one or more burst cycles the impingement of gamma radiation upon a first gamma radiation detector means during and between each of the bursts; determining a first parameter indicative of the count of detected impingements of primarily inelastic gamma radiation upon the first detector means; determining a second parameter indicative of the count of detected impingements of primarily capture gamma radiation upon the first detector means; and comparing the first and second parameters to determine the presence of gas.

One of the many tasks of my project was to revise the code of the Simulation Control Graphical User Interface (SIM GUI) to enable logging functionality to a file. I was also tasked with developing a script that directed the startup and initialization flow of the various LCS software components. This ensures that a software component will not spin up until all of its dependencies have been configured properly. I also assisted hardware modelers in verifying the configuration of models after they were upgraded to a new software version, and I developed code that analyzes the MDL files to determine whether any errors were generated by the upgrade process. Another project assigned to me was supporting the End-to-End Hardware/Software Daily Tag-up meeting.

Multiconductor armored well-logging cable is used extensively by the oil and natural gas industry to lower various instruments used to measure the geological and geophysical parameters into deep wellbores. Advanced technology in oil-well drilling makes it possible to achieve borehole depths of 9 km (30,000 ft). The higher temperatures in these deeper boreholes demand advancements in the design and manufacturing of wireline cable and in the electrical insulating and armoring materials used as integral components. If geothermal energy proves to be an abundant economic resource, drilling temperatures approaching and exceeding 300°C will become commonplace. The adaptation of Teflons as electrical insulating material permitted use of armored cable in geothermal wellbores where temperatures are slightly in excess of 200°C, and where the concentrations of corrosive minerals and gases are high. Teflon materials presently used in wireline cables, however, are not capable of continuous operation at the anticipated higher temperatures.

The present procedure for finding lower confidence bounds for the quantiles of Weibull populations, on the basis of the solution of a quadratic equation, is more accurate than current Monte Carlo tables and extends to any location-scale family. It is shown that this method is accurate for all members of the log gamma(K) family, where K = 1/2 to infinity, and works well for censored data, while also extending to regression data. An even more accurate procedure involving an approximation to the Lawless (1982) conditional procedure, with numerical integrations whose tables are independent of the data, is also presented. These methods are applied to the case of failure strengths of ceramic specimens from each of three billets of Si3N4, which have undergone flexural strength testing.

Most sedimentary rocks contain movable fluid in the pores. Hydrodynamic effects due to wave-induced oscillatory fluid flow can lead to significant changes of velocities and attenuations of elastic waves in these rocks. In this paper, we consider the influence of squirt flow (local flow between pores of different compressibility) on the sonic log response. The calculations are performed using a unified model describing the joint influence of squirt flow and Biot's global flow. The results show that the influence of squirt flow increases with signal frequency. This influence is relatively small in the case of the Stoneley wave but is significant in the case of P and S waves.

This paper proposes a learning based method that can automatically determine how likely a student is to give a correct answer to a problem in an intelligent tutoring system. Only log files that record students' actions with the system are used to train the model; therefore, the modeling process doesn't require expert knowledge for identifying…

Acid sulfate soils (ASS) with sulfuric material can be remediated through microbial sulfate reduction stimulated by adding organic matter (OM) and increasing the soil pH to >4.5, but the effectiveness of this treatment is influenced by soil properties. Two experiments were conducted using ASS with sulfuric material. In the first experiment with four ASS, OM (finely ground mature wheat straw) was added at 2-6% (w/w) and the pH adjusted to 5.5. After 36 weeks under flooded conditions, the concentration of reduced inorganic sulfur (RIS) and pore water pH were greater in all treatments with added OM than in the control without OM addition. The RIS concentration increased with OM addition rate. The increase in RIS concentration between 4% and 6% OM was significant but smaller than that between 2% and 4%, suggesting other factors limited sulfate reduction. In the second experiment, the effect of nitrate addition on sulfate reduction at different OM addition rates was investigated in one ASS. Organic matter was added at 2 and 4% and nitrate at 0, 100, and 200 mg nitrate-N kg(-1). After 2 weeks under flooded conditions, soil pH and the concentration of FeS measured as acid volatile sulfur (AVS) were lower with nitrate added at both OM addition rates. At a given nitrate addition rate, pH and AVS concentration were higher at 4% OM than at 2%. It can be concluded that sulfate reduction in ASS at pH 5.5 can be limited by low OM availability and high nitrate concentrations. Further, the inhibitory effect of nitrate can be overcome by high OM addition rates. PMID:25600239

This paper presents an empirical relationship that quantitatively links electromagnetic (EM) borehole recordings to the total dissolved solids (TDS) in pore water in the Quaternary deposits of the Belgian coastal plain. First, the long normal (LN) logs are linked to EM logs; then the already developed relationships between LN resistivity measurements and TDS values are rewritten for EM recordings. The main parameter in these equations is the formation factor, which is derived from ground water analyses and LN logs through Archie's law. EM recording has several advantages over LN logging. EM analysis allows measuring in PVC-cased wells and is not hindered by the invasion zone around the well. Furthermore, it has a high vertical resolution. LN logs can be measured only once, after drilling a well; EM recordings can be repeated several times in monitoring wells, which allows the gathering of time-dependent data over a complete vertical cross section. Such data could be obtained with LN logs only in wells with screens over the full depth interval, which causes a hydraulic short circuit. This short circuit can result in a large artificial flow through the well between different levels, producing a salinity profile that is no longer representative of the studied site. A remedy against short circuiting is a reduction of the screened interval, which strongly reduces the gathered information. One application of the derived equations is setting up a monitoring network along the Belgian coast to monitor trends in salinity levels and to compare present salinity levels with older LN recordings, to investigate the salinity changes over the last 30 years. Deep wells already present in the Belgian coastal plain can then be used to monitor both the fresh water head changes and the salt water evolution. The technique has also been used for parameter identification where real concentration measurements were needed. PMID:12533073
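
The conversion chain described above (bulk resistivity, to formation factor, to pore-water resistivity, to TDS) can be sketched as follows; the constant k and the inverse TDS relation are placeholders, not the calibrated values from this study:

```python
def pore_water_tds(bulk_resistivity_ohmm, formation_factor, k=6.4):
    """Estimate pore-water TDS (g/L) from an EM-derived bulk resistivity.

    Archie's law for a fully saturated formation gives F = R0 / Rw,
    so Rw = R0 / F.  TDS is then taken from an assumed empirical
    inverse relation TDS ~= k / Rw, where k is a placeholder constant,
    not the value calibrated from the Belgian ground water analyses.
    """
    rw = bulk_resistivity_ohmm / formation_factor   # pore-water resistivity
    return k / rw
```

In practice the formation factor itself is derived per layer from LN logs and water samples, as the abstract describes, before this conversion is applied.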

The era of petascale computing brought machines with hundreds of thousands of processors. The next generation of exascale supercomputers will make available clusters with millions of processors. In those machines, mean time between failures will range from a few minutes to a few tens of minutes, making the crash of a processor the common case instead of a rarity. Parallel applications running on those large machines will need to simultaneously survive crashes and maintain high productivity. To achieve that, fault tolerance techniques will have to go beyond checkpoint/restart, which requires all processors to roll back in case of a failure. Incorporating some form of message logging will provide a framework where only a subset of processors are rolled back after a crash. In this paper, we discuss why a simple causal message logging protocol seems a promising alternative to provide fault tolerance in large supercomputers. As opposed to pessimistic message logging, it has low latency overhead, especially in collective communication operations. Besides, it saves messages when more than one thread is running per processor. Finally, we demonstrate that a simple causal message logging protocol has a faster recovery and a low performance penalty when compared to checkpoint/restart. Running NAS Parallel Benchmarks (CG, MG and BT) on 1024 processors, simple causal message logging has a latency overhead below 5%.

This paper assesses the level of interest, awareness, and adoption of ISO 14001 and Forest Stewardship Council (FSC) certification schemes among logging companies in Cameroon. Eleven logging companies located in Douala in the Littoral Region of Cameroon were assessed through a structured interview using an administered questionnaire which was mostly analyzed qualitatively thereafter. The findings indicated that none of the companies was certified for ISO 14001; however 63.64% of them were already FSC-certified. Four companies (36.36%) were neither FSC- nor ISO 14001 EMS-certified. Among the factors found to influence the adoption rate was the level of awareness about ISO 14001 and FSC certification schemes. The main drivers for pursuing FSC certification were easy penetration into international markets, tax holiday benefits, and enhancement of corporate image of the logging companies through corporate social responsibility fulfillments. Poor domestic market for certified products was found to be the major impediment to get certified. To make logging activities more environmentally friendly and socially acceptable, logging companies should be encouraged to get certified through the ISO 14001 EMS scheme which is almost nonexistent so far. This requires awareness creation about the scheme, encouraging domestic markets for certified products and creating policy incentives. PMID:27355041

A novel optical temporal log-slope difference mapping approach is proposed for cancerous breast tumor detection. In this method, target tissues are illuminated by near-infrared (700-1000 nm) ultrashort laser pulses from various surface source points, and backscattered time-resolved light signals are collected at the same surface points. By analyzing the log-slopes of decaying signals over all points on the source-detection grid, a log-slope distribution on the surface is obtained. After administration of absorption contrast agents, the presence of cancerous tumors increases the decaying steepness of the transient signals. The mapping of the log-slope difference between native tissue and absorption-enhanced cancerous tissue indicates the location and projection of tumors on the detection surface. In this paper, we examine this method in the detection of breast tumors in two model tissue phantoms through computer simulation. The first model has a spherical tumor 6 mm in diameter embedded at the tissue center. The second model is a large tissue phantom embedded with a non-centered spherical tumor 8 mm in diameter. Monte Carlo methods were employed to simulate the light transport and signal measurement. It is shown that the tumor in both tissue models is accurately projected on the detection surface by the proposed log-slope difference mapping method. The image processing is very fast and does not require any inverse optimization in image reconstruction. PMID:16389079
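
The core computation, extracting the late-time log-slope of a decaying time-resolved signal, can be sketched as follows (the tail-fraction heuristic is our assumption, not the paper's):

```python
import numpy as np

def log_slope(t, signal, tail_fraction=0.5):
    """Least-squares slope of log(signal) over the decaying tail.

    Only the last `tail_fraction` of samples is used, where time-resolved
    diffuse reflectance is roughly single-exponential.
    """
    start = int(len(t) * (1 - tail_fraction))
    return np.polyfit(t[start:], np.log(signal[start:]), 1)[0]
```

The mapped quantity is then the difference of this slope between the native and the contrast-enhanced measurement at each grid point; a steeper (more negative) post-contrast slope flags increased absorption beneath that point.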

We have selected a flux-limited serendipitous sample of galaxies from the cross-correlation of the BMW (Brera Multiscale Wavelet) ROSAT HRI and the LEDA (Lyon-Meudon Extragalactic Database) catalogues. This sample is used to study the X-ray properties of normal galaxies in the local universe. We also find that the logN-logS distribution we derived for a serendipitous subsample, optically and X-ray flux limited, is consistent with the Euclidean slope in the flux range FX(0.5-2) ≈ 1.1-110 × 10^-14 erg cm^-2 s^-1. We further show that the same law is valid over 4 decades, from the bright sample derived from the RASS data to the very faint detections in deep XMM-Newton fields.
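
The Euclidean slope of -3/2 for the cumulative logN-logS relation can be checked with a small simulation of sources distributed uniformly in a Euclidean volume (all parameters below are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(0)
n, r_max, lum = 200_000, 1.0, 1.0

# Sources uniform in volume: r = R * u^(1/3); flux falls off as 1/r^2.
r = r_max * rng.random(n) ** (1.0 / 3.0)
flux = lum / (4 * np.pi * r**2)

# Cumulative counts N(>S) at thresholds safely above the sample's flux floor.
s_min = lum / (4 * np.pi * r_max**2)
thresholds = np.geomspace(4 * s_min, 100 * s_min, 10)
counts = np.array([(flux > s).sum() for s in thresholds])

# For a uniform Euclidean population, log10 N(>S) vs log10 S has slope -1.5.
slope = np.polyfit(np.log10(thresholds), np.log10(counts), 1)[0]
```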

One of the objectives of the Gulf of Mexico Gas Hydrate Joint Industry Project Leg II (GOM JIP Leg II) was the collection of a comprehensive suite of logging-while-drilling (LWD) data within gas-hydrate-bearing sand reservoirs in order to make accurate estimates of the concentration of gas hydrates under various geologic conditions and to understand the geologic controls on the occurrence of gas hydrate at each of the sites drilled during this expedition. The LWD sensors just above the drill bit provided important information on the nature of the sediments and the occurrence of gas hydrate. There have been significant advancements in the use of downhole well-logging tools to acquire detailed information on the occurrence of gas hydrate in nature, from the use of electrical resistivity and acoustic logs to identify gas hydrate occurrences in wells, to the routine use of wireline and advanced logging-while-drilling tools to examine the petrophysical nature of gas hydrate reservoirs and the distribution and concentration of gas hydrates within various complex reservoir systems. Recent integrated sediment coring and well-log studies have confirmed that electrical resistivity and acoustic velocity data can yield accurate gas hydrate saturations in sediment grain supported (isotropic) systems such as sand reservoirs, but more advanced log analysis models are required to characterize gas hydrate in fractured (anisotropic) reservoir systems. In support of the GOM JIP Leg II effort, well-log data montages have been compiled and presented in this report, which include downhole logs obtained from all seven wells drilled during this expedition with a focus on identifying and characterizing the potential gas-hydrate-bearing sedimentary section in each of the wells. Also presented and reviewed in this report are the gas-hydrate saturation and sediment porosity logs for each of the wells as calculated from available downhole well logs.

Modern wireline log combinations give highly diagnostic information that goes beyond the basic shale content, pore volume, and fluid saturation of older logs. Pattern recognition of geology from logs is made conventionally through either the examination of log overlays or log crossplots. Both methods can be combined through the use of color as a medium of information by setting the three color primaries of blue, green, and red light as axes of three dimensional color space. Multiple log readings of zones are rendered as composite color mixtures which, when plotted sequentially with depth, show lithological successions in a striking manner. The method is extremely simple to program and display on a color monitor. Illustrative examples are described from the Kansas subsurface.
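
The color-composite idea can be sketched as follows: scale each of three log curves to [0, 1] and assign it to one color primary, so every depth sample becomes a point in the color cube. The min-max scaling choice is ours, and it assumes each curve is non-constant:

```python
import numpy as np

def logs_to_rgb(log_a, log_b, log_c):
    """Map three log curves to per-depth RGB components in [0, 1].

    Each curve is min-max scaled and assigned to one color primary,
    so plotting the rows sequentially with depth yields a color strip
    of the lithological succession.
    """
    def scale(v):
        v = np.asarray(v, dtype=float)
        return (v - v.min()) / (v.max() - v.min())
    return np.stack([scale(log_a), scale(log_b), scale(log_c)], axis=-1)
```

The returned array can be passed directly to an image display routine (one pixel row per depth sample) to reproduce the color-strip presentation the abstract describes.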

In today's well-connected environments of the Internet, intranets, and extranets, protecting the Microsoft Windows network can be a daunting task for the security engineer. Intrusion Detection Systems are a must-have for most companies, but few have either the financial resources or the people resources to implement and maintain full-scale intrusion detection systems for their networks and hosts. Many will at least invest in intrusion detection for their Internet presence, but others have not yet stepped up to the plate with regard to internal intrusion detection. Unfortunately, most attacks will come from within. Microsoft Windows server operating systems are widely used across both large and small enterprises. Unfortunately, there is no intrusion detection built in to the Windows server operating system. The security logs are valuable but can be difficult to manage even in a small to medium sized environment. So the question arises: can one effectively detect and identify an inside intruder using the native tools that come with Microsoft Windows Server operating systems? One such method is to use Net Logon Service debug logging to identify and track malicious user activity. This paper discusses how to use Net Logon debug logging to identify and track malicious user activity both in real time and for forensic analysis.
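
A sketch of such log-based detection: scan lines for SamLogon results and count failures per account/machine pair. The line shape and regex below are illustrative assumptions modeled loosely on Netlogon debug entries, not an authoritative specification of the log format:

```python
import re
from collections import Counter

# Assumed line shape (illustrative), e.g.:
#   05/12 14:03:22 [LOGON] SamLogon: Network logon of CORP\alice from WS01 Returns 0xC000006A
LINE = re.compile(
    r"SamLogon: .* logon of (?P<account>\S+) from (?P<machine>\S+)"
    r" Returns (?P<status>0x[0-9A-Fa-f]+)"
)

def failed_logons(lines, success="0x0"):
    """Count non-success SamLogon results per (account, machine) pair."""
    fails = Counter()
    for line in lines:
        m = LINE.search(line)
        if m and m.group("status").lower() != success:
            fails[(m.group("account"), m.group("machine"))] += 1
    return fails
```

A burst of failures for one account from one machine is the kind of pattern the paper proposes to watch for, either in real time (tailing the log) or forensically.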

The evaluation of porosity, water saturation and clay content of oil-bearing igneous rocks with well logs is difficult due to the mineralogical complexity of this type of rock. The log responses to rhyolite and rhyolite tuff; andesite, dacite and zeolite tuff; and diabase and basalt have been studied from examples in western Argentina and compared with values observed in other countries. Several field examples show how these log responses can be used in a complex lithology program to make a complete evaluation.

An advanced logging research program is one major aspect of the Western Tight Sands Program. Purpose of this workshop is to help BETC define critical logging needs for tight gas sands and to allow free interchange of ideas on all aspects of the current logging research program. Sixteen papers and abstracts are included together with discussions. Separate abstracts have been prepared for the 12 papers. (DLC)

Preliminary research in the development and use of computerized clinical log records began in 1987 in an allied health college at a midwestern academic health center. This article reviews development and implementation of a computerized system for managing clinical log records to improve and enhance allied health educational programs in the radiation sciences. These clinical log databases are used for quantitative and qualitative analyses of student participation in clinical procedures, and educational planning for each student. Collecting and recording data from clinical log records serves as a valuable instructional tool for students, with both clinical and didactic applications. PMID:10389054

In Canada and the United States pressure to recoup financial costs of wildfire by harvesting burned timber is increasing, despite insufficient understanding of the ecological consequences of postfire salvage logging. We compared the species richness and composition of deadwood-associated beetle assemblages among undisturbed, recently burned, logged, and salvage-logged, boreal, mixed-wood stands. Species richness was lowest in salvage-logged stands, largely due to a negative effect of harvesting on the occurrence of wood- and bark-boring species. In comparison with undisturbed stands, the combination of wildfire and logging in salvage-logged stands had a greater effect on species composition than either disturbance alone. Strong differences in species composition among stand treatments were linked to differences in quantity and quality (e.g., decay stage) of coarse woody debris. We found that the effects of wildfire and logging on deadwood-associated beetles were synergistic, such that the effects of postfire salvage logging could not be predicted reliably on the basis of data on either disturbance alone. Thus, increases in salvage logging of burned forests may have serious negative consequences for deadwood-associated beetles and their ecological functions in early postfire successional forests. PMID:20735453

Usability evaluations collect subjective and objective measures; an example of the latter is the time to complete a task. This paper describes use cases of a log analyser for haptic feedback. The log analyser reads a log file and extracts information such as the time of each practice and assessment session, analyses whether the user goes off curve, and measures the force applied. A case study using the analyser was performed with a PHANToM haptic learning environment application that is used to teach young visually impaired students the subject of polynomials. The paper answers six questions to illustrate further use cases of the log analyser.

The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. This paper discusses how the general-purpose continuous-energy Monte Carlo code MCNP (Monte Carlo neutron photon), part of the LARTCS, provides a computational predictive capability for many applications of interest to the nuclear well logging community. The generalized three-dimensional geometry of MCNP is well suited for borehole-tool models. SABRINA, another component of the LARTCS, is a graphics code that can be used to interactively create a complex MCNP geometry. Users can define many source and tally characteristics with standard MCNP features. The time-dependent capability of the code is essential when modeling pulsed sources. Problems with neutrons, photons, and electrons as either single particle or coupled particles can be calculated with MCNP. The physics of neutron and photon transport and interactions is modeled in detail using the latest available cross-section data.

Web usage mining can be described as the discovery and analysis of user access patterns through mining of log files and associated data from a particular website. Large numbers of visitors interact daily with web sites around the world; enormous amounts of data are generated, and this information can be very valuable to a company for understanding customer behavior. This paper presents a complete preprocessing methodology, comprising data cleaning and user and session identification activities, to improve the quality of the data. User identification, a key issue in the preprocessing phase, aims to identify unique web users. Traditional user identification is based on site structure, supported by heuristic rules, which reduces its efficiency. To overcome this difficulty, we introduce a proposed technique, DUI (Distinct User Identification), based on IP address, agent, session time, and pages referred during the desired session time. It can be used in counterterrorism, fraud detection, and detection of unusual access to secure data, and, by detecting the regular access behavior of users, it can improve the overall design and performance of subsequent preprocessing.
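
A minimal sketch of user and session identification along these lines, grouping requests by (IP address, user agent) and splitting sessions on an inactivity timeout; the record layout and 30-minute default are our assumptions, not the paper's exact DUI rules:

```python
from datetime import datetime, timedelta

def identify_sessions(records, timeout_minutes=30):
    """Group web-log records into per-user sessions.

    A distinct user is approximated by the (IP, user agent) pair; a new
    session starts when the gap since that user's previous request
    exceeds the timeout.  `records` is an iterable of tuples
    (timestamp, ip, agent, page).
    """
    timeout = timedelta(minutes=timeout_minutes)
    sessions, last_seen = {}, {}
    for ts, ip, agent, page in sorted(records):
        user = (ip, agent)
        if user not in last_seen or ts - last_seen[user] > timeout:
            sessions.setdefault(user, []).append([])   # open a new session
        sessions[user][-1].append(page)
        last_seen[user] = ts
    return sessions
```

The paper's DUI technique additionally uses referred pages to split users sharing one IP/agent behind a proxy; that refinement is omitted here.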

Several factors involved in coal log fabrication, storage, and handling, such as curing time, aspect ratio, and particle size distribution, were evaluated during the fourth quarter of 1994. When Orimulsion is used for coal log fabrication, a certain period of time is required to build up the strength of the coal log. The test results show that the longer the curing period, the greater the wear resistance of the coal log. Previous studies found the coal log length-to-diameter (aspect) ratio to be an important factor affecting coal log performance during the pipeline degradation test. In the 2-inch pipeline degradation tests, coal logs with aspect ratios ranging from 1.6 to 2.2 traveled in a more stable manner and had lower weight loss than coal logs with aspect ratios less than 1.6. The influence of particle size on the performance of a coal log was evaluated to determine the optimum particle size for coal log fabrication, based on practical and economic considerations.

An approach is proposed for structuring a planetary mission set wherein the peak annual funding is minimized to meet the annual budget constraint. One aspect of the approach is to have a transportation capability that can launch a mission in any planetary opportunity; such capability can be provided by solar electric propulsion. Another cost-reduction technique is to structure the mission set in a time-sequenced fashion that could utilize essentially the same spacecraft for the implementation of several missions. A third technique would be to fulfill a scientific objective over several sequential missions rather than attempt to accomplish all of the objectives with one mission. The application of the approach is illustrated by an example involving the Solar Orbiter Dual Probe mission.

In rodents, the hypothalamo-pituitary-adrenal (HPA) axis is controlled by a precise regulatory mechanism that is influenced by circulating gonadal and adrenal hormones. In males, gonadectomy increases the adrenocorticotropic hormone (ACTH) and corticosterone (CORT) response to stressors, and androgen replacement returns the response to that of the intact male. Testosterone (T) actions in regulating HPA activity may be through aromatization to estradiol, or by 5α-reduction to the more potent androgen, dihydrotestosterone (DHT). To determine if the latter pathway is involved, we assessed the function of the HPA axis response to restraint stress following hormone treatments, or after peripheral or central treatment with the 5α-reductase inhibitor, finasteride. Initially, we examined the time course whereby gonadectomy alters the CORT response to restraint stress. Enhanced CORT responses were evident within 48 h following gonadectomy. Correspondingly, treatment of intact male rats with the 5α-reductase inhibitor, finasteride, for 48 h enhanced the CORT and ACTH response to restraint stress. Peripheral injections of gonadectomized male rats with DHT or T for 48 h reduced the ACTH and CORT response to restraint stress. The effects of T, but not DHT, could be blocked by third ventricle administration of finasteride prior to stress application. These data indicate that the actions of T in modulating HPA axis activity involve 5α-reductase within the central nervous system. These results further our understanding of how T acts to modulate the neuroendocrine stress responses and indicate that 5α-reduction to DHT is a necessary step for T action. PMID:23880372

Scientific LogAnalyzer is a platform-independent interactive Web service for the analysis of log files. Scientific LogAnalyzer offers several features not available in other log file analysis tools--for example, organizational criteria and computational algorithms suited to aid behavioral and social scientists. Scientific LogAnalyzer is highly flexible on the input side (unlimited types of log file formats), while strictly keeping a scientific output format. Features include (1) free definition of log file format, (2) searching and marking dependent on any combination of strings (necessary for identifying conditions in experiment data), (3) computation of response times, (4) detection of multiple sessions, (5) speedy analysis of large log files, (6) output in HTML and/or tab-delimited form, suitable for import into statistics software, and (7) a module for analyzing and visualizing drop-out. Several methodological features specifically needed in the analysis of data collected in Internet-based experiments have been implemented in the Web-based tool and are described in this article. A regression analysis with data from 44 log file analyses shows that the size of the log file and the domain name lookup are the two main factors determining the duration of an analysis. It is less than a minute for a standard experimental study with a 2 x 2 design, a dozen Web pages, and 48 participants (ca. 800 lines, including data from drop-outs). The current version of Scientific LogAnalyzer is freely available for small log files. Its Web address is http://genpsylab-logcrunsh.unizh.ch/. PMID:15354696
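Feature (3), computation of response times, can be sketched as the time difference between consecutive page requests within the same session. The tab-delimited record layout below is an assumed example only, since Scientific LogAnalyzer accepts user-defined log formats.

```python
from datetime import datetime

# Sketch of response-time computation from a tab-delimited web log:
# each record is "session<TAB>ISO timestamp<TAB>page". The response time
# for a page is the seconds elapsed since the same session's previous
# request. The record layout is an illustrative assumption.

def response_times(lines):
    """Return (session, page, seconds since previous request) tuples."""
    last_seen = {}   # session id -> datetime of previous request
    times = []
    for line in lines:
        session, stamp, page = line.rstrip("\n").split("\t")
        t = datetime.fromisoformat(stamp)
        if session in last_seen:
            times.append((session, page,
                          (t - last_seen[session]).total_seconds()))
        last_seen[session] = t
    return times
```

The first request of each session yields no response time, matching the usual convention that response times are defined only between consecutive pages.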

Logs made by water-well drillers were analyzed in conjunction with test-hole drilling and geophysical logging to evaluate usefulness of the driller's log in delineating areas that would be suitable for artificial recharge of the Ogallala Formation. This preliminary study indicates that lack of detailed and accurate information in many drillers' logs prevents their use as a reliable source of lithologic information. For many applications, such as evaluation of potential areas for artificial recharge, the value of more complete and more accurate information will be readily apparent as these applications become more widespread. More effort will be required in collecting lithologic information as part of the drilling operations.

... denial for failure to comply with requirements relating to performance of private security functions. 216... performance of private security functions. (a) In accordance with section 862 of the National Defense... contract with regard to the performance of private security functions in an area of contingency...

The Russian boreal zone supports a huge terrestrial carbon pool. Moreover, it is a tremendous reservoir of wood products concentrated mainly in Siberia. The main natural disturbance in these forests is wildfire, which modifies the carbon budget and has potentially important climate feedbacks. In addition, both legal and illegal logging increase landscape complexity and fire hazard. We investigated a number of sites in different regions of Siberia to evaluate the impacts of fire and logging on fuel loads, carbon emissions, tree regeneration, soil respiration, and microbocenosis. We found large variations of fire and logging effects among regions depending on growing conditions and type of logging activity. Partial logging had no negative impact on forest conditions or the carbon cycle. Illegal logging resulted in increased fire hazard and higher carbon emissions than legal logging. The highest fuel loads and carbon emissions were found on repeatedly burned unlogged sites where the first fire resulted in total tree mortality. Repeated fires together with logging activities in drier conditions and on large burned sites resulted in insufficient regeneration, or even a total lack of tree seedlings. Soil respiration was lower on both burned and logged areas than in undisturbed forest. The highest structural and functional disturbances of the soil microbocenosis were observed on logged burned sites. Understanding current interactions between fire and logging is important for modeling ecosystem processes and for managers to develop strategies of sustainable forest management. Changing patterns in the harvest of wood products increase landscape complexity and can be expected to increase emissions and ecosystem damage from wildfires, inhibit recovery of natural ecosystems, and exacerbate impacts of wildland fire on changing climate and air quality. The research was supported by NASA LCLUC Program, RFBR grant # 12-04-31258, and Russian Academy of Sciences.

Flowing wellbore electrical-conductivity logging provides a means to determine hydrologic properties of fractures, fracture zones, or other permeable layers intersecting a borehole in saturated rock. The method involves analyzing the time-evolution of fluid electrical-conductivity logs obtained while the well is being pumped and yields information on the location, hydraulic transmissivity, and salinity of permeable layers, as well as their initial (or ambient) pressure head. Earlier analysis methods were restricted to the case in which flows from the permeable layers or fractures were directed into the borehole. More recently, a numerical model for simulating flowing-conductivity logging was adapted to permit treatment of both inflow and outflow, including analysis of natural regional flow in the permeable layer. However, determining the fracture properties with the numerical model by optimizing the match to the conductivity logs is a laborious trial-and-error procedure. In this paper, we identify the signatures of various inflow and outflow features in the conductivity logs to expedite this procedure and to provide physical insight for the analysis of these logs. Generally, inflow points are found to produce a distinctive signature on the conductivity logs themselves, enabling the determination of location, inflow rate, and ion concentration in a straightforward manner. Identifying outflow locations and flow rates, on the other hand, can be done with a more complicated integral method. Running a set of several conductivity logs with different pumping rates (e.g., half and double the original pumping rate) provides further information on the nature of the feed points. In addition to enabling the estimation of flow parameters from conductivity logs, an understanding of the conductivity log signatures can aid in the design of follow-up logging activities.

CCSD logging engineering brings together many modern high technologies and employs various advanced logging tools to survey the borehole wall continuously, yielding physical, chemical, geometrical, and other in-situ information along the borehole profile. Well logging is therefore one of the most important and pivotal technologies in the CCSD project. The main logging methods in CCSD-MH (0-2000 m) are laterolog (Rd, Rs), gamma ray (GR), natural gamma spectrometry (U, Th, K), density (DEN), photoelectric cross-section index (Pe), compensated neutron (CNL), multipole array acoustic (Vp, Vs, Vst), simultaneous acoustic-resistivity imaging (Star-II), temperature (T), magnetic susceptibility (MS), three-component borehole magnetic, and redox potential logs. The various metamorphic rocks can be classified from the logging curves, and their physical parameters can be acquired by analyzing and statistically summarizing the log responses of the various metamorphic rocks. Using logging cross-plots, we can study the clustering of the metamorphic rocks' physical properties. Five lithologic segments can be identified from the logging curves. The GR, Th, U, and K values of segment 1 are lower than those of the third, fourth, and fifth segments and higher than those of segment 2; the DEN and Pe values of segment 1 are higher than those of the third, fourth, and fifth segments. The main rocks in segments 1-5 are eclogites, serpentinites, paragneiss, orthogneiss, and eclogites (containing silicon and muscovite), respectively. Generally, eclogite contains rutile, silicon, muscovite, and other minerals, which have obvious responses on the log curves. There are also rutile, ilmenite, and pyrite mineralized layers, which can be well demarcated using the DEN, Pe, and susceptibility log values. For example, on a rutile mineralized layer the logging curves show obviously high density and Pe. The key data for the synthetic seismic record is wave impedance. In this paper, we utilize the data of the AC and DEN curves to calculate the

The system is focused on the Employee Business Travel Event. The system must be able to CRUD (Create, Retrieve, Update, Delete) instances of the Travel Event, as well as CRUD the frequent-flyer mileage associated with airline travel. Additionally, the system must provide a compliance reporting system to monitor reductions in travel costs and lost opportunity costs (i.e., not taking advantage of business class or 7-day advance tickets).
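A minimal in-memory sketch of the stated CRUD requirement could look like the following. The class name and all field names (employee, destination, cost, miles) are invented for illustration; the actual system's data model is not given in the record.

```python
import itertools

# Sketch of CRUD for travel events plus per-employee frequent-flyer
# mileage. An in-memory dict stands in for whatever persistence layer
# the real system would use; all names here are assumptions.

class TravelEventStore:
    def __init__(self):
        self._events = {}
        self._ids = itertools.count(1)

    def create(self, employee, destination, cost, miles=0):
        event_id = next(self._ids)
        self._events[event_id] = {"employee": employee,
                                  "destination": destination,
                                  "cost": cost, "miles": miles}
        return event_id

    def retrieve(self, event_id):
        return self._events[event_id]

    def update(self, event_id, **fields):
        self._events[event_id].update(fields)

    def delete(self, event_id):
        del self._events[event_id]

    def total_miles(self, employee):
        """Aggregate frequent-flyer mileage, a building block for reporting."""
        return sum(e["miles"] for e in self._events.values()
                   if e["employee"] == employee)
```

The `total_miles` aggregate hints at how the compliance-reporting requirement would sit on top of the same store.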

Molecular dynamics (MD) simulations based on the generalized Born (GB) model of implicit solvation offer a number of important advantages over the traditional explicit solvent based simulations. Yet, in MD simulations, the GB model has not been able to reach its full potential partly due to its computational cost, which scales as ∼n², where n is the number of solute atoms. We present here an ∼n log n approximation for the generalized Born (GB) implicit solvent model. The approximation is based on the hierarchical charge partitioning (HCP) method (Anandakrishnan and Onufriev, J. Comput. Chem. 2010, 31, 691-706) previously developed and tested for electrostatic computations in gas-phase and distance-dependent dielectric models. The HCP uses the natural organization of biomolecular structures to partition the structures into multiple hierarchical levels of components. The charge distribution for each of these components is approximated by a much smaller number of charges. The approximate charges are then used for computing electrostatic interactions with distant components, while the full set of atomic charges are used for nearby components. To apply the HCP concept to the GB model, we define the equivalent of the effective Born radius for components. The component effective Born radius is then used in GB computations for points that are distant from the component. This HCP approximation for GB (HCP-GB) is implemented in the open source MD software, NAB in AmberTools, and tested on a set of representative biomolecular structures ranging in size from 632 atoms to ∼3 million atoms. For this set of test structures, the HCP-GB method is 1.1-390 times faster than the GB computation without additional approximations (the reference GB computation), depending on the size of the structure. Similar to the spherical cutoff method with GB (cutoff-GB), which also scales as ∼n log n, the HCP-GB is relatively simple. However, for the structures considered here, we show
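The component-summary idea at the heart of HCP can be illustrated with a toy Coulomb sum. This is a sketch of the concept only, not the HCP-GB implementation in NAB: the grouping, the distance threshold, and the unit-free charges are assumptions, and the real method uses multiple hierarchical levels and GB effective radii rather than the single-level, geometric-center version shown here.

```python
import math

# Toy version of charge partitioning: atoms are grouped into components,
# and for evaluation points far from a component the full atom list is
# replaced by a single net charge placed at the component's geometric
# center. Threshold and grouping are illustrative assumptions.

def coulomb_exact(point, atoms):
    """Sum q / r over atoms = [(x, y, z, q), ...] (units dropped)."""
    return sum(q / math.dist(point, (x, y, z)) for x, y, z, q in atoms)

def component_summary(atoms):
    """Net charge and geometric center of a component."""
    qtot = sum(a[3] for a in atoms)
    cx = sum(a[0] for a in atoms) / len(atoms)
    cy = sum(a[1] for a in atoms) / len(atoms)
    cz = sum(a[2] for a in atoms) / len(atoms)
    return (cx, cy, cz, qtot)

def coulomb_hcp(point, components, threshold=10.0):
    """Exact sum for nearby components, one-charge approximation for far ones."""
    total = 0.0
    for atoms in components:
        cx, cy, cz, qtot = component_summary(atoms)
        if math.dist(point, (cx, cy, cz)) > threshold:
            total += qtot / math.dist(point, (cx, cy, cz))
        else:
            total += coulomb_exact(point, atoms)
    return total
```

The payoff is the reduced cost: distant components contribute one term instead of one term per atom, which, with hierarchical grouping, is what yields the ∼n log n scaling the abstract reports.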

Some of the measures that EFM personnel can take to further reduce their estates' carbon footprint at a time when pressure to cut energy consumption must be balanced both against the requirement to create the best possible patient environment, and new medical technology that may require substantial energy to operate, were the focus of a recent IHEEM carbon reduction seminar in London. The one-day event, "Planning to achieve Carbon Reduction Commitment targets for healthcare premises", also included a look at the key steps affected healthcare organisations, and especially their estates teams, need to be taking already to ensure compliance with the new Carbon Reduction Commitment scheme. PMID:20597381

Log jams in mountain streams are preferred storage sites for bedload material and woody debris. The resulting formation of steps and pools within a channel reduces flow velocities and thereby mitigates natural hazards in case of flood events. However, this requires analysing the resilience of log jams during high discharge events, since a failing jam can release large amounts of stored material. In this study we investigate log jams in the Erlenbach mountain stream in the Swiss Prealps regarding their storage function for woody debris and the residence times of stored logs. Nine log jams were surveyed in detail regarding their position, extent, and volume. Artificially introduced wood pieces were tagged with Radio Frequency Identification (RFID) transponders and tracked along a study reach for five months. By calculating tracer point densities, these tracers confirmed the hypothesis that debris dams are a preferred storage site for dead wood in mountain streams. The average point density for obstruction-free channel reaches amounts to 0.13 pieces per m², while it increases to 0.46 pieces per m² for channel areas covered by log jams. The size and position of the log jams are mainly determined by bank erosion and hillslope activity, as log jams are situated in highly active zones. Large logs of coniferous wood within the jams were dated using tree-ring analysis, and their residence times within the channel were determined based on the year of tree dieback. The residence times of large logs stored within the jams show a strong connection to the last two exceptional discharge events that occurred at the Erlenbach in 2007 and 2010 (flood events with return times of 50 and 20 years, respectively). The highest number of logs died back in 2007; the year with the second largest number of introduced logs is 2010. The consecutive years after those two high discharge events showed a declining number of trees entering the stream. So both events presumably caused a reactivation

... rotating work platform meeting the requirements of 29 CFR 1910.68. (v) The employer shall assure that each... operations near overhead electric lines shall be done in accordance with the requirements of 29 CFR 1910.333... Register in accordance with 5 U.S.C. 552(a) and 1 CFR part 51. Copies may be obtained from the Society...

... rotating work platform meeting the requirements of 29 CFR 1910.67. (v) The employer shall assure that each... operations near overhead electric lines shall be done in accordance with the requirements of 29 CFR 1910.333... Register in accordance with 5 U.S.C. 552(a) and 1 CFR part 51. Copies may be obtained from the Society...

Molecular hydrophobicity (lipophilicity), usually quantified as log P, where P is the partition coefficient, is an important molecular characteristic in medicinal chemistry and drug design. The log P coefficient is one of the principal parameters for estimating the lipophilicity and pharmacokinetic properties of chemical compounds. The…
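For reference, the partition coefficient P underlying this descriptor has a standard definition: the ratio of equilibrium concentrations of the (neutral) solute in n-octanol and water, reported on a base-10 logarithmic scale.

```latex
P = \frac{[\text{solute}]_{\text{octanol}}}{[\text{solute}]_{\text{water}}},
\qquad
\log P = \log_{10} P
```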

Discusses the use of a strip log as a diagrammatic representation of the information available in a sequence of sedimentary rocks. Describes the design of the strip log (both symbolically and by visual/spatial patterns) and some of the possible interpretations that can be made using them. (TW)

To most people, the log file would not be considered an exciting area in technology today. However, these relatively benign, slowly growing data sources can drive large business transformations when combined with modern-day analytics. Accenture Technology Labs has built a new framework that helps to expand existing vendor solutions to create new methods of gaining insights from these benevolent information springs. This framework provides a systematic and effective machine-learning mechanism to understand, analyze, and visualize heterogeneous log files. These techniques enable an automated approach to analyzing log content in real time, learning relevant behaviors, and creating actionable insights applicable in traditionally reactive situations. Using this approach, companies can now tap into a wealth of knowledge residing in log file data that is currently being collected but underutilized because of its overwhelming variety and volume. By using log files as an important data input into the larger enterprise data supply chain, businesses have the opportunity to enhance their current operational log management solution and generate entirely new business insights: no longer limited to the realm of reactive IT management, but extending from proactive product improvement to defense from attacks. As we will discuss, this solution has immediate relevance in the telecommunications and security industries. However, the most forward-looking companies can take it even further. How? By thinking beyond the log file and applying the same machine-learning framework to other log file use cases (including logistics, social media, and consumer behavior) and any other transactional data source. PMID:27447306

The long-term viability of a forest industry in the Amazon region of Brazil depends on the maintenance of adequate timber volume and growth in healthy forests. Using extensive high-resolution satellite analyses, we studied the forest damage caused by recent logging operations and the likelihood that logged forests would be cleared within 4 years after timber harvest. Across 2,030,637 km2 of the Brazilian Amazon from 1999 to 2004, at least 76% of all harvest practices resulted in high levels of canopy damage sufficient to leave forests susceptible to drought and fire. We found that 16 ± 1% of selectively logged areas were deforested within 1 year of logging, with a subsequent annual deforestation rate of 5.4% for 4 years after timber harvests. Nearly all logging occurred within 25 km of main roads, and within that area, the probability of deforestation for a logged forest was up to four times greater than for unlogged forests. In combination, our results show that logging in the Brazilian Amazon is dominated by highly damaging operations, often followed rapidly by deforestation decades before forests can recover sufficiently to produce timber for a second harvest. Under the management regimes in effect at the time of our study in the Brazilian Amazon, selective logging would not be sustained. PMID:16901980

Surveys some of the benefits claimed for data-logging methods identified through research. Discusses findings from research that sought to explore the translation of these benefits into the real-world of science classrooms and identify the range of influences on science teachers adopting and developing data-logging methods. (SAH)

The logarithm of the retention time (log RT) of organic chemicals on a permanently bonded (C-18) reverse-phase high-pressure liquid chromatography system is shown to be linearly related to the logarithm of the n-octanol/water partition coefficient (log P). A rapid, inexpensive te...
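The reported linear relationship, log P ≈ a · log RT + b for some calibration constants a and b, can be demonstrated with an ordinary least-squares fit on synthetic data. The slope 1.2 and intercept 0.4 used to generate the points below are made up for the demonstration and are not the paper's calibration.

```python
# Least-squares fit illustrating a linear log P vs. log RT calibration.
# The generating coefficients (1.2, 0.4) are illustrative assumptions.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

log_rt = [0.1, 0.4, 0.7, 1.0, 1.3]            # retention-time logs (synthetic)
log_p = [1.2 * x + 0.4 for x in log_rt]       # exact line, no noise
slope, intercept = fit_line(log_rt, log_p)
```

In practice the calibration line is fit once against compounds of known log P, after which a single retention-time measurement predicts log P for a new compound.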

A Ubiquitous Learning Log (ULL) is defined as a digital record of what a learner has learned in daily life using ubiquitous computing technologies. In this paper, a project which developed a system called SCROLL (System for Capturing and Reusing Of Learning Log) is presented. The aim of developing SCROLL is to help learners record, organize,…

A study was undertaken to investigate the feasibility of sonic logging in a cased borehole. Results were obtained from a scaled-model laboratory experiment and from computer simulations. The waveforms from the computer model indicate that sonic logging can be successful in bonded and unbonded cased holes. A slowness-time semblance signal processing technique is used to obtain wave velocities from waveforms.

Analyzing system logs provides useful insights for identifying system/application anomalies and helps in better usage of system resources. Nevertheless, it is simply not practical to scan through the raw log messages on a regular basis for large-scale systems. First, the sheer volume of unstructured log messages affects readability, and second, correlating the log messages to system events is a daunting task. These factors limit large-scale system logs primarily to generating alerts on known system events, and to post-mortem diagnosis for identifying previously unknown system events that impacted the system's performance. In this paper, we describe a log monitoring framework that enables prompt analysis of system events in real time. Our web-based framework provides a summarized view of console, netwatch, consumer, and apsched logs in real time. The logs are parsed and processed to generate views of applications, message types, individual or groups of compute nodes, and sections of the compute platform. Also, from past application runs we build a statistical profile of user/application characteristics with respect to known system events, recoverable/non-recoverable error messages, and resources utilized. The web-based tool is being developed for Jaguar XT5 at the Oak Ridge Leadership Computing Facility.
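The parse-and-summarize step such a framework relies on can be sketched as follows: collapse raw console lines into message templates by masking variable fields, then count occurrences per template. The masking rules below are simplified assumptions, not the framework's actual parsers.

```python
import re
from collections import Counter

# Collapse raw log lines into templates by masking hex ids and numbers,
# then count occurrences per template. Two lines differing only in node
# id or address then land in the same bucket, which is the basis for
# message-type views and alerting on unusual counts.

def template(line):
    line = re.sub(r"0x[0-9a-fA-F]+", "<HEX>", line)   # mask hex addresses first
    line = re.sub(r"\d+", "<NUM>", line)              # then remaining numbers
    return line

def summarize(lines):
    """Map each message template to its occurrence count."""
    return Counter(template(l) for l in lines)
```

Real log-template mining (e.g. masking hostnames, paths, or timestamps) needs more rules, but the bucketing principle is the same.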

This paper focuses on developing a relatively simple method for analyzing web-logs. It also explores the challenges and benefits of web-log analysis. The study of student behavior on this site provides insights into website design and the effectiveness of this site in particular. Another benefit realized from the paper is the ease with which these…

... boots. (4) The employer is not required to pay for: (i) The logging boots required by 29 CFR 1910.266(d... sunscreen. (5) The employer must pay for replacement PPE, except when the employee has lost or...

A study was initiated to correlate log parameters and core data from West Pembina Nisku (D-2) pinnacle reefs. The primary objective was to derive basic input data for making volumetric estimates of reserves and for providing initial input data for enhanced recovery model studies. A secondary objective was to determine whether a set of log analysis parameters could be derived that would work universally in the many pinnacle reefs of the West Pembina area. The results of the study indicate that porosity log response in the West Pembina Nisku reefs deviates considerably from the standards used in log analysis chart books. A multilinear regression formula for determining porosity was derived that worked well for the 18 wells studied for which both log and core data were available. A brief description of the Nisku reef geology, a description of the methods used in the study, and a graphic presentation of the results are included.
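The kind of multilinear regression the study describes can be sketched with synthetic data: fit porosity as a linear combination of log readings by least squares against core porosity. The variables (bulk density and neutron porosity), the coefficients, and the log readings below are illustrative assumptions, not the West Pembina results.

```python
# Fit phi = a*density + b*neutron + c against core porosity using the
# normal equations, solved with a small Gauss-Jordan routine. All data
# and coefficients here are synthetic illustrations.

def solve3(M, v):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    A = [row[:] + [v[i]] for i, row in enumerate(M)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        for r in range(3):
            if r != col:
                f = A[r][col] / A[col][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    return [A[i][3] / A[i][i] for i in range(3)]

def fit_porosity(density, neutron, core_phi):
    """Least-squares (a, b, c) for phi = a*density + b*neutron + c."""
    cols = [density, neutron, [1.0] * len(density)]
    M = [[sum(x * y for x, y in zip(ci, cj)) for cj in cols] for ci in cols]
    v = [sum(x * p for x, p in zip(ci, core_phi)) for ci in cols]
    return solve3(M, v)
```

Once calibrated against cored wells, the same coefficients can be applied to log readings in uncored wells, which is how such a transform supports volumetric reserve estimates.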

This paper presents a method that uses core and log measurements to quantify the influence of saturation changes on elastic properties at core, log, and seismic scales, in order to investigate the feasibility of using seismic measurements to monitor gas production on the Heimdal North Sea field. In the first part, laboratory measurements of P- and S-wave velocities, porosity, density, and mineralogy are used to determine the influence of partial saturation on elastic parameters. These measurements are compared with theoretical predictions using Gassmann's model. Second, the authors invert from logs the rock parameters that are necessary to simulate the impact of water rise on elastic properties at log scale. Third, the well logs are modelled at seismic scales of resolution and are used to identify the effect of fluid movement on the seismic response. Finally, the results are compared to repeat seismic measurements.

Project tasks: Perform the necessary testing and development to demonstrate that the amount of binder in coal logs can be reduced to 8% or lower while producing logs with adequate strength to eliminate breakage during pipeline transportation, under conditions experienced in long-distance pipeline systems. Prior to conducting any testing and demonstration, the grantee shall perform an information search and make a full determination of all previous attempts to extrude or briquette coal, upon which the testing and demonstration shall be based. Perform the necessary development to demonstrate a small model of the most promising injection system for coal logs, and test the logs produced in Task 1. Conduct an economic analysis of coal-log pipelines based upon the work to date. Refine and complete the economic model. Prepare a final report for DOE.

Vast expanses of tropical forests worldwide are being impacted by selective logging. We evaluate the environmental impacts of such logging and conclude that natural timber-production forests typically retain most of their biodiversity and associated ecosystem functions, as well as their carbon, climatic, and soil-hydrological ecosystem services. Unfortunately, the value of production forests is often overlooked, leaving them vulnerable to further degradation including post-logging clearing, fires, and hunting. Because logged tropical forests are extensive, functionally diverse, and provide many ecosystem services, efforts to expand their role in conservation strategies are urgently needed. Key priorities include improving harvest practices to reduce negative impacts on ecosystem functions and services, and preventing the rapid conversion and loss of logged forests. PMID:25092495

The Russian boreal zone supports a huge terrestrial carbon pool. Moreover, it is a tremendous reservoir of wood products concentrated mainly in Siberia. The main natural disturbance in these forests is wildfire, which modifies the carbon budget and has potentially important climate feedbacks. In addition, both legal and illegal logging increase landscape complexity and affect burning conditions and fuel consumption. We investigated 100 individual sites with different histories of logging and fire on a total of 23 study areas in three different regions of Siberia to evaluate the impacts of fire and logging on fuel loads, carbon emissions, and tree regeneration in pine and larch forests. We found large variations of fire and logging effects among regions, depending on growing conditions and type of logging activity. Logged areas in the Angara region had the highest surface and ground fuel loads (up to 135 t ha⁻¹), mainly due to logging debris. This resulted in high carbon emissions where fires occurred on logged sites (up to 41 tC ha⁻¹). The Shushenskoe/Minusinsk and Zabaikal regions are characterized by better slash removal and a smaller amount of carbon emitted to the atmosphere during fires. Illegal logging, which is widespread in the Zabaikal region, resulted in an increased fire hazard and higher carbon emissions than legal logging. The highest fuel loads (on average 108 t ha⁻¹) and carbon emissions (18-28 tC ha⁻¹) in the Zabaikal region are on repeatedly burned unlogged sites where trees fell to the ground following the first fire event. Partial logging in the Shushenskoe/Minusinsk region has insufficient impact on stand density, tree mortality, and other forest conditions to substantially increase fire hazard or affect carbon stocks. Repeated fires on logged sites resulted in insufficient tree regeneration and transformation of forest to grasslands. We conclude that negative impacts of fire and logging on air quality, the carbon cycle, and ecosystem

Cross-contamination of ready-to-eat (RTE) foods with pathogens on contaminated tableware and food preparation utensils is an important factor associated with foodborne illnesses. To prevent this, restaurants and food service establishments are required to achieve a minimum microbial reduction of 5 logs from these surfaces. This study evaluated the sanitization efficacies of ware-washing protocols (manual and mechanical) used in restaurants to clean tableware items. Ceramic plates, drinking glasses and stainless steel forks were used as the food contact surfaces. These were contaminated with cream cheese and reduced-fat milk inoculated with murine norovirus (MNV-1), Escherichia coli K-12 and Listeria innocua. The sanitizing solutions tested were sodium hypochlorite (chlorine), quaternary ammonium (QAC) and tap water (control). During the study, the survivability and response of the bacterial species to the experimental conditions were compared with those of MNV-1. The results showed that current ware-washing protocols used to remove bacteria from tableware items were not sufficient to achieve a 5-log reduction in MNV-1 titer. After washing, at most a 3-log reduction in virus titer was obtained. It was concluded that MNV-1 appeared to be more resistant to both the washing process and the sanitizers than E. coli K-12 and L. innocua. PMID:23227163
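The "5-log reduction" benchmark above is simply the base-10 logarithm of the ratio of initial to surviving organisms; a minimal helper makes the arithmetic explicit (the counts below are hypothetical, not data from the study):

```python
import math

def log_reduction(initial_count, surviving_count):
    """Log10 reduction achieved by a sanitization step."""
    return math.log10(initial_count / surviving_count)

# A 5-log reduction means 99.999% of organisms are inactivated,
# e.g. 10^7 CFU reduced to 10^2 CFU:
print(log_reduction(1e7, 1e2))  # 5.0
# A wash step that only reaches ~3 logs leaves 10^4 of 10^7 viable:
print(log_reduction(1e7, 1e4))  # 3.0
```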

Supercomputers are prone to frequent faults that adversely affect their performance, reliability and functionality. System logs collected on these systems are a valuable source of information about their operational status and health. However, their massive size, complexity, and lack of a standard format make it difficult to automatically extract information that can be used to improve system management. In this work we propose a novel method to succinctly represent the contents of supercomputing logs, using textual clustering to automatically find the syntactic structures of log messages. This information is used to automatically classify messages into semantic groups via an online clustering algorithm. Further, we describe a methodology for using the temporal proximity between groups of log messages to identify correlated events in the system. We apply our proposed methods to two large, publicly available supercomputing logs and show that our technique achieves nearly perfect accuracy for online log classification and extracts meaningful structural and temporal message patterns that can be used to improve the accuracy of other log analysis techniques.
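The syntactic-structure clustering described above can be illustrated, in a much-simplified form, by masking the variable fields of each message so that messages sharing a template fall into the same group (this toy sketch is not the authors' online algorithm, and the log lines are invented):

```python
import re
from collections import defaultdict

def template(message):
    """Mask variable fields (hex ids, then numbers) so that messages
    with the same syntactic structure map to the same key."""
    masked = re.sub(r'0x[0-9a-fA-F]+', '<HEX>', message)
    masked = re.sub(r'\d+', '<NUM>', masked)
    return masked

def cluster(log_lines):
    """Group raw log lines by their masked template."""
    groups = defaultdict(list)
    for line in log_lines:
        groups[template(line)].append(line)
    return groups

logs = [
    "node 17 temperature 81C exceeds threshold",
    "node 3 temperature 79C exceeds threshold",
    "ECC error at address 0x7f3a on node 17",
    "ECC error at address 0x11b0 on node 4",
]
groups = cluster(logs)
print(len(groups))  # 2 syntactic groups
```

Real systems refine such templates incrementally and online; the grouping-by-structure idea is the same.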

The strain rate of cold glacial ice depends mainly on the stress tensor, temperature, grain size, and crystal habit. Lab measurements cannot be made at both the low stresses and low temperatures relevant to flow of cold glacial ice. Field studies with inclinometers measure only the horizontal components of flow. We have developed a new method for measuring the 3D strain-rate field at −40°C to −15°C, using the AMANDA neutrino-detecting array frozen into deep ice at South Pole. Each strain detector consists of a photomultiplier tube (PMT) in its pressure vessel. AMANDA has ~600 PMTs at depths of 1500 to 2300 m in a ~0.02 km³ volume. The coordinates of each PMT relative to a coordinate system moving downslope at 9 m yr⁻¹ can be measured with s.d. <1 m in 1 day by mapping trajectories of down-going cosmic-ray muons that pass through the array. The PMTs record the arrival times of the Cherenkov light emitted along the muon trajectory. Use of maximum likelihood for 10⁵ muon tracks allows PMT positions to be determined; their positions are then updated at six-month intervals. We will report results of strain-rate measurements in three dimensions, made in 2000, 2001, and 2002 at T ≈ −30°C. Applying the same technique to the future 1 km³ IceCube array, by averaging over subsets of the 5000 detectors, values of the strain-rate tensor as small as 3×10⁻⁵ yr⁻¹ can be measured as a function of temperature and lateral position. The vertical strain rate due to snow accumulation, estimated to be ~3×10⁻⁵ yr⁻¹, can be measured and will serve as a check on the method. The second new method is designed to measure mean grain size in the ice surrounding a borehole. We will adapt an all-digital logging tool originally developed by Advanced Logic Technology (Luxembourg) for geophysics prospecting in rock boreholes. A 1.3 MHz transducer emits acoustic pulses horizontally into the ice in increments of 5° in azimuth and records the wave train back-scattered from grain boundaries. For

Data gathered aboard research vessels coordinated by the University-National Oceanographic Laboratory System (UNOLS) represent an important component of the overall oceanographic data collection. The nascent Rolling Deck to Repository (R2R) project aims to improve access to basic shipboard data and ultimately reduce the work required to provide that access. The ultimate vision of R2R is to assist in transforming the academic fleet into an integrated global observing system. One of the coordinated subprojects within the R2R project is development of a shipboard scientific event logging system that would incorporate best-practice guidelines, a metadata schema, and new and existing applications to generate a scientific sampling event log in the form of a digital text file. A cruise event logging system enables researchers to digitally record all scientific sampling events and assign a unique event identifier to each entry. Decades of work conducted within large coordinated ocean research programs (JGOFS, GLOBEC, WOCE and RIDGE) have shown that creation of a shipboard sampling event log can greatly facilitate the subsequent integration of data sets from individual investigators. A prototype event logger application, based on ELOG, has been developed and tested; results and lessons learned from this development effort will be shared.

Several techniques to improve the accuracy of radionuclide concentration estimates as a function of depth from gamma-ray logs have appeared in the literature. Much of that work was driven by interest in uranium as an economic mineral. More recently, the problem of mapping and monitoring artificial gamma-emitting contaminants in the ground has rekindled interest in improving the accuracy of radioelement concentration estimates from gamma-ray logs. We are looking at new approaches to accomplishing such improvements. The first step in this effort has been to develop a new computational model of a spectral gamma-ray logging sonde in a borehole environment. The model supports attenuation in any combination of materials arranged in 2-D cylindrical geometry, including any combination of attenuating materials in the borehole, formation, and logging sonde. The model can also handle any distribution of sources in the formation. The model considers unscattered radiation only, as represented by the background-corrected area under a given spectral photopeak as a function of depth. Benchmark calculations using the standard Monte Carlo model MCNP show excellent agreement with total gamma flux estimates with a computation time of about 0.01% of the time required for the MCNP calculations. This model lacks the flexibility of MCNP, although for this application a great deal can be accomplished without that flexibility.
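The unscattered-radiation model described above rests on exponential attenuation of uncollided photons through successive materials; a minimal point-source sketch (ignoring the 2-D cylindrical geometry of the actual model, and using made-up attenuation coefficients and source strength) illustrates the core calculation:

```python
import math

def unscattered_flux(source_strength, distance_cm, layers):
    """Uncollided photon flux from an isotropic point source,
    attenuated through successive material layers.

    layers: list of (linear attenuation coefficient in 1/cm,
                     path length in cm) tuples along the ray.
    """
    optical_depth = sum(mu * x for mu, x in layers)
    # Exponential attenuation times the 1/(4*pi*r^2) geometric factor
    return source_strength * math.exp(-optical_depth) / (4 * math.pi * distance_cm ** 2)

# Hypothetical borehole geometry: photons crossing 5 cm of borehole
# water (mu ~ 0.086 /cm) and 20 cm of formation (mu ~ 0.2 /cm).
flux = unscattered_flux(1e6, 25.0, [(0.086, 5.0), (0.2, 20.0)])
print(flux)
```

Because only a sum of exponentials and a geometric factor are evaluated per depth point, such a model is orders of magnitude faster than a full Monte Carlo transport calculation, consistent with the speedup the abstract reports.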

Acoustic logging-while-drilling (LWD) technology has been commercially used in the petroleum industry. However, it remains a rather difficult task to invert formation compressional and shear velocities from acoustic LWD signals due to the unwanted strong collar wave, which masks or interferes with signals from the formation. In this paper, seismoelectric LWD is investigated as a way of solving that problem. The seismoelectric field is calculated by solving a modified Poisson's equation, whose source term is the electric disturbance induced electrokinetically by the travelling seismic wave. The seismic wavefield itself is obtained by solving Biot's equations for poroelastic waves. From the simulated waveforms and the semblance plots for monopole, dipole and quadrupole sources, it is found that the electric field accompanies the collar wave as well as the other wave groups of the acoustic pressure, despite the fact that seismoelectric conversion occurs only in porous formations. The collar wave in the electric field, however, is significantly weakened compared with that in the acoustic pressure, in terms of its amplitude relative to the other wave groups in the full waveforms. Thus fewer and shallower grooves are required to damp the collar wave if the seismoelectric LWD signals are recorded for extracting formation compressional and shear velocities.

Both single crystal scintillators and germanium semiconductor detectors are used in oil well-logging tools for gamma-ray detection. Since the scintillator crystals range in size up to 3 inches in diameter and 12 inches long, extremely high crystal quality is necessary to prevent attenuation of the scintillation light over the long light paths. In addition, the elimination of impurities that quench the scintillation light is crucial. NaI(Tl) is the most common scintillator crystal due to its intense emission and good energy resolution. However, recent advances in the crystal growth of Bi₄Ge₃O₁₂, BaF₂, and CdWO₄ have improved their scintillation properties and made them viable alternatives for certain applications. The only semiconductor crystal in current use is high-purity germanium. Other semiconductors such as CdTe and HgI₂ require improvements in crystal growth techniques to improve stoichiometry and remove defects and impurities which inhibit efficient charge collection.

Graduate student journals of research projects and their supervision are suggested as a means of structuring the supervisory process, making it more accountable, and facilitating students' successful completion of their academic and research tasks. However, the method also requires skill in successful thesis production on the supervisor's part.…

Species-abundance (SA) pattern is one of the most fundamental aspects of biological community structure, providing important information regarding species richness, the species-area relation, and succession. To better describe the SA distribution (SAD) in a community, based on the widely used lognormal (LN) distribution model with exp(−x²) roll-off on Preston's octave scale, this study proposed two additional models, log-Cauchy (LC) and log-sech (LS), with roll-offs of x⁻² and e⁻ˣ, respectively. The estimation of the theoretical total number of species in the whole community, S*, including very rare species not yet collected in the sample, was derived from the left-truncation of each distribution. We fitted these three models by Levenberg-Marquardt nonlinear regression and measured the model fit to the data using the coefficient of determination of the regression, t-tests on the parameters, and the Kolmogorov-Smirnov (KS) test on the distribution. Examining the SA data from six forest communities (five in the lower subtropics and one in the tropics), we found that: (1) on a log scale, all three models, which are bell-shaped and left-truncated, statistically adequately fitted the observed SADs, and the LC and LS did better than the LN; (2) for each model and each community the S* values estimated by the integral and summation methods were almost equal, allowing us to estimate S* using a simple integral formula and to estimate its asymptotic confidence intervals by regression of a transformed model containing it; (3) following the order LC, LS, LN, the fitted distributions became lower in the peak, less concave in the sides, and shorter in the tail; overall the LC tended to overestimate, the LN tended to underestimate, and the LS was intermediate but slightly tended to underestimate the observed SADs (particularly the number of common species in the right tail); (4) the six communities had some similar structural properties such as following similar distribution models, having a common
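A Levenberg-Marquardt fit of Preston's lognormal on the octave scale, as used in the study, can be sketched with scipy (the octave counts below are synthetic, not data from the six forest communities; `scipy.optimize.curve_fit` defaults to the Levenberg-Marquardt algorithm for unbounded problems):

```python
import numpy as np
from scipy.optimize import curve_fit

# Preston lognormal on the octave (log2-abundance) scale:
# S(x) = S0 * exp(-((x - x0) / sigma)**2)
def preston_ln(x, s0, x0, sigma):
    return s0 * np.exp(-((x - x0) / sigma) ** 2)

# Synthetic octave counts (species per log2-abundance class)
octaves = np.arange(10, dtype=float)
species = np.array([3, 8, 15, 22, 25, 21, 14, 7, 3, 1], dtype=float)

# Levenberg-Marquardt nonlinear regression
popt, _ = curve_fit(preston_ln, octaves, species, p0=[25.0, 4.0, 2.0])
s0, x0, sigma = popt

# Theoretical total species S*: integral of the untruncated curve,
# which for a Gaussian bump is S0 * |sigma| * sqrt(pi)
s_star = s0 * abs(sigma) * np.sqrt(np.pi)
print(round(s_star, 1))
```

The closed-form integral is what lets S* (including uncollected rare species) be estimated from the fitted parameters alone.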

Evaluation of the existence of superoxide radicals (O₂•⁻), the site of generation, and the conditions required for one-electron transfer to oxygen from biological redox systems is a prerequisite for understanding the deregulation of O₂ homeostasis leading to oxidative stress. Mitochondria are increasingly considered the major O₂•⁻ source in a great variety of diseases and the aging process. Contradictory reports on mitochondrial O₂•⁻ release prompted us to critically investigate frequently used O₂•⁻ detection methods for their suitability. Due to the impermeability of the outer mitochondrial membrane to most constituents of O₂•⁻ detection systems, we decided to follow the stable dismutation product H₂O₂. This metabolite was earlier shown to readily permeate into the cytosol. With the exception of tetramethylbenzidine, none of the chemical reactants indicating the presence of H₂O₂ by horseradish peroxidase-catalyzed absorbance change were suited, due to solubility problems or low extinction coefficients. Tetramethylbenzidine-dependent H₂O₂ detection was counteracted by re-reduction of the dye through electron carriers of the respiratory chain. Although the fluorescent dyes scopoletin and homovanillic acid were found to be suited for the detection of mitochondrial H₂O₂ release, the fluorescence change was strongly affected by mitochondrial protein constituents. The present study resolved this problem by separating the detection system from H₂O₂-producing mitochondria. PMID:10514548

Kathy Wendolkowski is a citizen scientist. It's a term that Wendolkowski considers far too lofty for what she claims is simply a happy addiction that she and others have for transcribing old logs from naval ships and other vessels. They perform this task to glean the regularly recorded weather data from those logs for the benefit of science. For Wendolkowski, though, greater satisfaction comes from reading what the logs also reveal about the daily lives of the sailors as well as any accompanying historical drama.

This Open-File Report consists of fluid temperature logs compiled during studies of the geohydrology and low temperature geothermal resources of eastern Washington. The fluid temperature logs are divided into two groups. Part A consists of wells which are concentrated in the Moses Lake-Ritzville-Connell area. Full geophysical log suites for many of these wells are presented in Stoffel and Widness (1983) and discussed in Widness (1983, 1984). Part B consists of wells outside of the Moses Lake-Ritzville-Connell study area.

A temperature log and downhole water sample run were conducted in EE-2 on July 13, 1983. The temperature log was taken to show any changes which had occurred in the fracture-to-wellbore intersections as a result of the Experiment 2020 pumping and to locate fluid entries for taking the water sample. The water sample was requested primarily to determine the arsenic concentration in EE-2 fluids (see memo from C. Grigsby, June 28, 1983, concerning arsenic in EE-3 samples). The temperature log was run using the thermistor in the ESS-6 water sampler.

An in-depth study of the state of the art in Geothermal Well Log Interpretation has been made encompassing case histories, technical papers, computerized literature searches, and actual processing of geothermal wells from New Mexico, Idaho, and California. A classification scheme of geothermal reservoir types was defined which distinguishes fluid phase and temperature, lithology, geologic province, pore geometry, salinity, and fluid chemistry. Major deficiencies of Geothermal Well Log Interpretation are defined and discussed with recommendations of possible solutions or research for solutions. The Geothermal Well Log Interpretation study and report has concentrated primarily on Western US reservoirs. Geopressured geothermal reservoirs are not considered.

Nonthermal technologies are being investigated as viable alternatives or supplements to thermal pasteurization in the food-processing industry. In this study, the effect of ultraviolet (UV)-C light on the inactivation of seven milkborne pathogens (Listeria monocytogenes, Serratia marcescens, Salmonella Senftenberg, Yersinia enterocolitica, Aeromonas hydrophila, Escherichia coli, and Staphylococcus aureus) was evaluated. The pathogens were suspended in ultra-high-temperature whole milk and treated at UV doses between 0 and 5000 J/L at a flow rate of 4300 L/h in a thin-film turbulent flow-through pilot system. Of the seven milkborne pathogens tested, L. monocytogenes was the most UV resistant, requiring 2000 J/L of UV-C exposure to reach a 5-log reduction. The most sensitive bacterium was S. aureus, requiring only 1450 J/L to reach a 5-log reduction. This study demonstrated that the survival curves were nonlinear: sigmoidal inactivation curves were observed for all tested bacterial strains, and nonlinear modeling of the inactivation data fit better than the traditional log-linear approach. Results obtained from this study indicate that UV illumination has the potential to be used as a nonthermal method to reduce microorganism populations in milk. PMID:25884367
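Under the traditional log-linear model that this study found inadequate, the dose needed for a given log reduction is simply the decimal reduction dose (D10) times the number of decades; the sketch below uses hypothetical D10 values chosen to be consistent with the doses quoted above, not parameters reported by the study:

```python
def log_linear_survivors(n0, dose, d10):
    """Classical log-linear inactivation: each D10 of dose removes one decade."""
    return n0 * 10 ** (-dose / d10)

def dose_for_reduction(d10, log_reduction):
    """Dose required for a target log reduction under the log-linear model."""
    return d10 * log_reduction

# Hypothetical D10 values (J/L per decade):
# L. monocytogenes ~400 J/L, S. aureus ~290 J/L.
print(dose_for_reduction(400, 5))  # 2000 J/L for a 5-log reduction
print(dose_for_reduction(290, 5))  # 1450 J/L for a 5-log reduction
```

A sigmoidal survival curve breaks this proportionality: the shoulder and tail mean the dose per decade is no longer constant, which is why the study needed a nonlinear model.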

Obesity at conception and excess gestational weight gain pose significant risks for adverse health consequences in human offspring. This study evaluated the effects of reducing dietary intake of obese/overfed ewes beginning in early gestation on fetal development. Sixty days prior to conception, ewes were assigned to a control diet [CON: 100% of National Research Council (NRC) recommendations], a diet inducing maternal obesity (MO: 150% of NRC recommendations), or a maternal obesity intervention diet (MOI: 150% of NRC recommendations to day 28 of gestation, then 100% NRC) until necropsy at midgestation (day 75) or late (day 135) gestation. Fetal size and weight, as well as fetal organ weights, were greater (P < 0.05) at midgestation in MO ewes than those of CON and MOI ewes. By late gestation, whereas fetal size and weight did not differ among dietary groups, cardiac ventricular weights and wall thicknesses as well as liver and perirenal fat weights remained elevated in fetuses from MO ewes compared with those from CON and MOI ewes. MO ewes and fetuses exhibited elevated (P < 0.05) plasma concentrations of triglycerides, cholesterol, insulin, glucose, and cortisol at midgestation compared with CON and MOI ewes and fetuses. In late gestation, whereas plasma triglycerides and cholesterol, insulin, and cortisol remained elevated in MO vs. CON and MOI ewes and fetuses, glucose concentrations were elevated in both MO and MOI fetuses compared with CON fetuses, which was associated with elevated placental GLUT3 expression in both groups. These data are consistent with the concept that reducing maternal diet of obese/overfed ewes to requirements from early gestation can prevent subsequent alterations in fetal growth, adiposity, and glucose/insulin dynamics. PMID:23921140

The lack of fault tolerance is becoming a limiting factor for application scalability in HPC systems. MPI does not provide standardized fault tolerance interfaces and semantics. The MPI Forum's Fault Tolerance Working Group is proposing a collective fault-tolerant agreement algorithm for the next MPI standard. Such algorithms play a central role in many fault-tolerant applications. This paper combines a log-scaling two-phase commit agreement algorithm with a reduction operation to provide the necessary functionality for the new collective without any additional messages. Error handling mechanisms are described that preserve the fault tolerance properties while maintaining overall scalability.
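The log-scaling behaviour of a tree-based agreement can be illustrated with a toy simulation (this is not the paper's algorithm or an MPI implementation; it only shows why a reduce-plus-broadcast agreement costs O(log p) message rounds, with the commit decision folded into an AND-reduction):

```python
import math

def tree_agreement(local_flags):
    """Simulate two-phase agreement over p ranks: a binomial-tree
    AND-reduction of local success flags to rank 0, followed by a
    broadcast of the decision. Message rounds scale as O(log p)
    per phase, so ~2*log2(p) total."""
    p = len(local_flags)
    decision = all(local_flags)            # result of the reduction phase
    rounds = 2 * math.ceil(math.log2(p))   # reduce phase + broadcast phase
    return decision, rounds

# All 64 ranks report success: the collective commits.
print(tree_agreement([True] * 64))
# A single failed rank vetoes the commit for everyone.
flags = [True] * 64
flags[17] = False
print(tree_agreement(flags))
```

Folding the agreement vote into the reduction itself is what lets the real algorithm avoid any messages beyond those the collective already sends.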

Logging and hunting are two key direct threats to the survival of wildlife in the tropics, and also disrupt important ecosystem processes. We investigated the impacts of these two factors on the different stages of the seed dispersal cycle, including abundance of plants and their dispersers and dispersal of seeds and recruitment, in a tropical forest in north-east India. We focused on hornbills, which are important seed dispersers in these forests, and their food tree species. We compared abundances of hornbill food tree species in a site with high logging and hunting pressures (heavily disturbed) with a site that had no logging and relatively low levels of hunting (less disturbed) to understand logging impacts on hornbill food tree abundance. We compared hornbill abundances across these two sites. We then compared the scatter-dispersed seed arrival of five large-seeded tree species and the recruitment of four of those species. Abundances of hornbill food trees that are preferentially targeted by logging were two times higher in the less disturbed site as compared to the heavily disturbed site, while that of hornbills was 22 times higher. The arrival of scatter-dispersed seeds was seven times higher in the less disturbed site. Abundances of recruits of two tree species were significantly higher in the less disturbed site. For another species, abundances of younger recruits were significantly lower while those of older recruits were higher in the heavily disturbed site. Our findings suggest that logging reduces food plant abundance for an important frugivore-seed disperser group, while hunting diminishes disperser abundances, with an associated reduction in seed arrival and altered recruitment of animal-dispersed tree species in the disturbed site. Based on our results, we present a conceptual model depicting the relationships and pathways between vertebrate-dispersed trees, their dispersers, and the impacts of hunting and logging on these pathways. PMID:25781944

Background A Clinical Log was introduced as part of a medical student learning portfolio, aiming to develop a habit of critical reflection while learning was taking place, and provide feedback to students and the institution on learning progress. It was designed as a longitudinal self-directed structured record of student learning events, with reflection on these for personal and professional development, and actions planned or taken for learning. As incentive was needed to encourage student engagement, an innovative Clinical Log station was introduced in the OSCE, an assessment format with established acceptance at the School. This study questions: How does an OSCE Clinical Log station influence Log use by students? Methods The Log station was introduced into the formative, and subsequent summative, OSCEs with careful attention to student and assessor training, marking rubrics and the standard setting procedure. The scoring process sought evidence of educational use of the log, and an ability to present and reflect on key learning issues in a concise and coherent manner. Results Analysis of the first cohort’s Log use over the four-year course (quantified as number of patient visits entered by all students) revealed limited initial use. Usage was stimulated after introduction of the Log station early in third year, with some improvement during the subsequent year-long integrated community-based clerkship. Student reflection, quantified by the mean number of characters in the ‘reflection’ fields per entry, peaked just prior to the final OSCE (mid-Year 4). Following this, very few students continued to enter and reflect on clinical experience using the Log. Conclusion While the current study suggested that we can’t assume students will self-reflect unless such an activity is included in an assessment, ongoing work has focused on building learner and faculty confidence in the value of self-reflection as part of being a competent physician. PMID:23140250

Invertebrates are dominant species in primary tropical rainforests, where their abundance and diversity contributes to the functioning and resilience of these globally important ecosystems. However, more than one-third of tropical forests have been logged, with dramatic impacts on rainforest biodiversity that may disrupt key ecosystem processes. We find that the contribution of invertebrates to three ecosystem processes operating at three trophic levels (litter decomposition, seed predation and removal, and invertebrate predation) is reduced by up to one-half following logging. These changes are associated with decreased abundance of key functional groups of termites, ants, beetles and earthworms, and an increase in the abundance of small mammals, amphibians and insectivorous birds in logged relative to primary forest. Our results suggest that ecosystem processes themselves have considerable resilience to logging, but the consistent decline of invertebrate functional importance is indicative of a human-induced shift in how these ecological processes operate in tropical rainforests. PMID:25865801

With the network's growth and popularization, network security experts face ever-larger network security logs. Network security logs are a valuable and important record of various network behaviors, and are characterized by large scale and high dimensionality. Therefore, how to analyze these network security logs to enhance network security has become the focus of many researchers. In this paper, we first design a frequent-attack-sequence-based hypergraph clustering algorithm to mine network security logs, and then improve this algorithm with a synthetic measure of hyperedge weight and two optimization functions for the clustering result. The experimental results show that the synthetic measure and optimization functions significantly improve the coverage and precision of the clustering result. The optimized hypergraph clustering algorithm provides a data analysis method for intrusion detection and active forewarning in networks.

The symposium contains papers about developments in borehole logging instrumentation that can withstand downhole temperatures in excess of 300/sup 0/C and pressures greater than 103 MPa. Separate abstracts have been prepared for individual papers. (ACR)

Presently there are no suitable non-invasive methods for precisely detecting the subsurface defects in logs in real time. Internal defects such as knots, decays, and embedded metals are of greatest concern for lumber production. Nondestructive scanning of logs using Ground Penetrating Radar (GPR) to detect defects in logs prior to sawing can greatly increase the productivity and yield of high value lumber, and prevent damage to saw blade from embedded metals. In this research, the GPR scanned data has been analyzed to detect subsurface defects such as metals, decays, and knots. Also, GPR offers high speed scanning capability which is needed for future on-line implementation in saw mills. This paper explains the advantages of the GPR technique, experimental setup and parameters used, and data processing for detection of subsurface defects in logs. The results show that GPR can be a very promising technique for future on-line implementation in saw mills.
