GROUP-THEORETIC ORBIT DECIDABILITY ENRIC VENTURA Abstract. A recent collection of papers connects orbit decidability with the conjugacy problem, following work by Bogopolski–Martino–Ventura in [2]. All the consequences up to date, published. Supported by the Government through grant number MTM2011-25955. endomorphisms A = End(X, X

Ventura College Portland State University Transfer Worksheet If you are taking classes that are part of the Intersegmental General Education Transfer Curriculum (IGETC) at Ventura College (VC), you 2. DEGREE REQUIREMENTS The majority of majors at PSU

Pattern Classification Using a Quantum System Dan Ventura Brigham Young University Department of Computer Science Provo, UT 84602 USA ventura@cs.byu.edu http://axon.cs.byu.edu/Dan We consider and compare

No Free Lunch in the Search for Creativity Dan Ventura Computer Science Department Brigham Young University ventura@cs.byu.edu Abstract We consider computational creativity as a search process and give discover good artefacts, and these ideas have been formalized elsewhere (Ritchie 2007; Ventura 2008

In 1985, Ventura County Environmental Health Department began a technical assistance program to encourage hazardous waste generators to reduce their dependence on land disposal. In order to accomplish this, information from the California State Hazardous Waste Manifest Information System was analyzed to identify the types, quantities and disposition of hazardous waste produced by companies in Ventura County. All generators that rely on land disposal were also surveyed to determine future waste management plans. Waste audits were conducted at each site to determine if alternative waste handling methods were feasible and to ensure that reuse, recycling and waste reduction methods are used when possible. This article summarizes these findings and projects future hazardous waste generation and disposal patterns for industries in Ventura County. It also identifies barriers to volume reduction and provides a framework for future local hazardous waste alternative technology/volume reduction program activities.

Since 1990, the Department of Conservation's Division of Mines and Geology (DMG) has provided geologic information and conducted several research projects on geology and radon for the California Department of Health Services (DHS) Radon Program. This article provides a brief overview of radon's occurrence and impact on human health, and summarizes a recent DMG project for DHS that used geologic, geochemical, and indoor radon measurement data to produce detailed radon potential zone maps for Santa Barbara and Ventura counties.

Intersection Type System with de Bruijn Indices Daniel Lima Ventura and Mauricio Ayala-Rincón, Scotland {ventura,ayala}@mat.unb.br, fairouz@macs.hw.ac.uk September 30, 2008 Abstract The λ-calculus in de

EXTENSIONS OF LINKING SYSTEMS WITH p-GROUP KERNEL BOB OLIVER AND JOANA VENTURA Abstract. We study. Oliver is partially supported by UMR 7539 of the CNRS. J. Ventura is partially supported by FCT. Another problem is that in general, when L is a linking system and A

, AND JOANA VENTURA Abstract. We study reduced fusion systems from the point of view of their, and by project ANR BLAN08-2_338236, HGRT. J. Ventura was partially supported by FCT through program POCI. The situation we want to study is the following

Historical Ecology of the Lower Santa Clara River, Ventura River, and Oxnard Plain: An Analysis. Historical ecology of the lower Santa Clara River, Ventura River, and Oxnard Plain: an analysis of terrestrial layers are available on SFEI's website, at www.sfei.org/projects/VenturaHE. Permissions rights for images

Ventura Abstract--We use a type of reservoir computing called the liquid state machine (LSM) to explore Ventura are with the Computer Science Department, Brigham Young University, Provo, Utah (email: ghotikun@hotmail.com, ventura@cs.byu.edu). Fig. 1. Diagram of a liquid state machine. (a, b) The input signal is transformed

AND JOANA VENTURA Abstract. We study extensions of p-local finite groups where the kernel is of the CNRS. J. Ventura is partially supported by FCT/POCTI/FEDER and grant PDCT/MAT/58497/2004. Another problem is that in general, when L is a linking system and A

The rootless Ventura Avenue, San Miguelito, and Rincon anticlines (Ventura fold belt) in Pliocene-Pleistocene turbidites are fault-propagation folds related to south-dipping reverse faults rising from a decollement in Miocene shale. To the east, the Sulfur Mountain anticlinorium overlies and is cut by the Sisar, Big Canyon, and Lion south-dipping thrusts that merge downward into the Sisar decollement in lower Miocene shale. Shortening of the Miocene and younger sequence is approximately 3 km greater than that of underlying competent Paleogene strata in the Ventura fold belt and approximately 7 km greater farther east at Sulfur Mountain. Cross-section balancing requires that this difference be taken up by the Paleogene sequence at the Oak Ridge fault to the south. Convergence is northeast to north-northeast on the basis of earthquake focal mechanisms, borehole breakouts, and piercing-point offset of the South Mountain seaknoll by the Oak Ridge fault. A northeast-trending line connecting the west end of Oak Ridge and the east end of Sisar fault separates an eastern domain where late Quaternary displacement is taken up entirely on the Oak Ridge fault and a western domain where displacement is transferred to the Sisar decollement and its overlying rootless folds. This implies that (1) the Oak Ridge fault near the coast presents as much seismic risk as it does farther east, despite negligible near-surface late Quaternary movement; (2) ground-rupture hazard is high for the Sisar fault set in the upper Ojai Valley; and (3) the decollement itself could produce an earthquake analogous to the 1987 Whittier Narrows event in Los Angeles.

Laboratories, and Dan Ventura, Brigham Young University Abstract-- The applicability of complex networks- odm@sandia.gov). Dan Ventura is with the Department of Computer Science, Brigham Young University, Provo, UT 84602, USA (email: ventura@cs.byu.edu). 1T is defined as the set of real numbers, R, with some

A direct method for modeling and unfolding developable surfaces and its application to the Ventura Mountain area in the Ventura basin). This particular structure has already been balanced by a trial. © 2005 Published by Elsevier Ltd. Keywords: Developable; Mesh; Restoration; Unfold; Ventura Basin

A SOM-based Multimodal System for Musical Query-by-Content Kyle Dickerson and Dan Ventura Abstract proposed the design of a system that uses a SOM to reduce the dimensionality Kyle Dickerson and Dan Ventura.dickerson@gmail.com, ventura@cs.byu.edu). of the feature space, allowing the use of simple (Euclidean) distance metrics [13

Epigenetic uranium deposits with potential commercial value have been found in the lower part of the upper Eocene to lower Miocene Sespe Formation near Ojai, in Ventura County, California. This report describes the geological and geochemical setting of these deposits and postulates a model for their origin. Several uranium deposits are located on Superior Ridge, a topographic high about 3 miles long located just south of White Ledge Peak and 6 to 9 miles west of Ojai (Photo 1). A single uranium deposit on Laguna Ridge is located about 3 miles south of Superior Ridge, and was included with the Superior Ridge deposits in the White Ledge Peak district. A few small deposits are known to exist in other parts of Ventura County. A preliminary model for uranium mineralization in the Sespe Formation postulated that the organic material necessary for concentrating the uranium by chemical reduction or precipitation originated as terrestrial humic acid or humate.

Improved decoding of limb-state feedback from natural sensors J.B. Wagenaar, V. Ventura and D is with the department of BioEngineering, University of Pittsburgh, jbw14@pitt.edu V. Ventura is with faculty

Personality and Patterns of Facebook Usage Yoram Bachrach Microsoft Research yobach.ac.uk ABSTRACT We show how users' activity on Facebook relates to their personality, as measured by the standard Five Factor Model. Our dataset consists of the personality profiles and Facebook profile data

A Generalization-Based Approach to Clustering of Web Usage Sessions Yongjian Fu, Kanwalpreet Sandhu,ksandhu,mingyig@umr.edu Abstract. The clustering of Web usage sessions based on the access patterns is studied. Access patterns of Web users are extracted from Web server log files, and then organized into sessions which represent

Nehovah: Creativity in Generating Neologisms Michael R. Smith and Ryan S. Hintze and Dan Ventura information from social media to incorporate a dynamic source of pop culture into the neologisms and also

Nonparametric Bootstrap Recycling Valérie Ventura, Department of Statistics, Baker Hall 132 adjustments. The amount of computation involved is usually considerable, and recycling provides a less computer intensive alternative. Recycling consists of using repeatedly the same samples drawn from
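The core idea, drawing once from a recycling distribution G and reusing those samples for several targets via reweighting rather than redrawing, can be sketched as follows. This is a toy importance-sampling illustration with Gaussian targets, not the paper's actual bootstrap estimators; all distributions and sample sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Draw ONE set of samples from the recycling distribution G = N(0, 2).
g_mu, g_sigma = 0.0, 2.0
x = rng.normal(g_mu, g_sigma, size=100_000)

def recycled_mean(h, f_mu, f_sigma):
    """Estimate E_F[h(X)] for a target F = N(f_mu, f_sigma), reusing the
    same draws from G via self-normalized importance weights w = f(x)/g(x)."""
    w = normal_pdf(x, f_mu, f_sigma) / normal_pdf(x, g_mu, g_sigma)
    return np.sum(w * h(x)) / np.sum(w)

# The same samples serve several targets -- no redrawing needed.
m1 = recycled_mean(lambda t: t, 0.5, 1.0)       # E[X]   under N(0.5, 1), near 0.5
m2 = recycled_mean(lambda t: t * t, 0.0, 1.0)   # E[X^2] under N(0, 1),   near 1.0
```

The saving is exactly the one the abstract describes: the expensive sampling step happens once, and only the cheap reweighting is repeated.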

The Santa Barbara and Ventura basins are tectonically active and are economically important because millions of barrels of oil have been produced there since the 1800s. This guidebook focuses on structural and sedimentological aspects of two main structural trends in the basin: the Rincon-Ventura anticlinorium, and the Oakridge-South Mountain uplift. Section One of the publication is a roadlog which summarizes geologic features. Section Two focuses on the sedimentation of the principal reservoirs and source rocks in the main oil fields in the two basins. Section Three presents four original papers on the oil fields and tectonic evolution of the area.

This study is intended to inform society and those interested in hydrogen energy, and it is presented to the reader as a technical note for the exchange of information between people working on this subject. Coal, petroleum, and natural gas, the fossil fuels we use today, are our main energy sources. They are being exhausted through careless use over recent decades throughout the world, and humanity will face a shortage of energy sources in the near future. Moreover, because fossil fuels pollute the environment, hydrogen is important as an alternative energy source. Owing to slow progress in hydrogen production, storage, and conversion into electrical energy, hydrogen has not yet found wide technological application. Hydrogen storage occupies an important position in the development of hydrogen energy technologies: hydrogen is a fuel of low volumetric energy density, so for hydrogen energy to meet the demands of present-day technologies and to compete economically and physically with fossil fuels, storage technologies must be developed accordingly. Today the most common storage method is the high-pressure composite tank; hydrogen is stored in liquid or gaseous phase in composite tanks under very high pressure. This requires high-technology composite materials that are durable under high pressure and unaffected by hydrogen embrittlement and chemical conditions. [1]

DOI: 10.1785/0120040126. Neotectonics of the Offshore Oak Ridge Fault near Ventura, Southern California. The Oak Ridge fault is a large-offset, south-dipping reverse fault that forms the south boundary of the Ventura Basin in southern California. Previous research indicates that the Oak Ridge fault south of the town of Ventura has been

This paper reports the findings of a preliminary assessment of the cost effectiveness of distributed energy resources at Naval Base Ventura County (NBVC) Building 1512. This study was conducted in response to the base's request for design assistance to the Federal Energy Management Program. Given the current tariff structure there are two main decisions facing NBVC: whether to install distributed energy resources (DER), or whether to continue the direct access energy supply contract. At the current effective rate, given assumptions about the performance and structure of building energy loads and available generating technology characteristics, the results of this study indicate that if the building installed a 600 kW DER system with absorption cooling and heat capabilities chosen by cost minimization, the energy cost savings would be about 14 percent, or $55,000 per year. However, under current conditions, this study also suggests that significant savings could be obtained if Building 1512 changed from the direct access contract to a SCE TOU-8 (Southern California Edison time of use tariff number 8) rate without installing a DER system. At current SCE TOU-8 tariffs, the potential savings from installation of a DER system would be about 4 percent, or $15,000 per year.

ADAPTIVE WEB SITES BY WEB USAGE MINING Yongjian Fu Mario Creado Ming­Yi Shih Department of Computer An approach for reorganizing a Web site based on user access patterns is proposed. The Web server's log files and the Web pages on the site are first preprocessed to obtain the access statistics of the Web pages
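The preprocessing step this snippet describes, turning raw Web server log lines into per-page access statistics, might look like the following minimal sketch. The log lines and field positions assume Common Log Format and are illustrative, not taken from the paper.

```python
from collections import Counter

# Hypothetical server log lines (Common Log Format; values are made up).
log_lines = [
    '10.0.0.1 - - [01/Jan/2000:10:00:00] "GET /index.html HTTP/1.0" 200 512',
    '10.0.0.1 - - [01/Jan/2000:10:00:05] "GET /courses.html HTTP/1.0" 200 1024',
    '10.0.0.2 - - [01/Jan/2000:10:00:07] "GET /index.html HTTP/1.0" 200 512',
]

def page_counts(lines):
    """Tally how often each page was requested."""
    counts = Counter()
    for line in lines:
        # The requested page is the second token inside the quoted request,
        # e.g. 'GET /index.html HTTP/1.0' -> '/index.html'.
        request = line.split('"')[1]
        page = request.split()[1]
        counts[page] += 1
    return counts

stats = page_counts(log_lines)
# stats["/index.html"] == 2, stats["/courses.html"] == 1
```

Statistics like these are the raw material for the reorganization step: pages with high joint access counts become candidates for closer linking.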

This report provides the results of the 2011 Radioactive Materials Usage Survey for Unmonitored Point Sources (RMUS), which was updated by the Environmental Protection (ENV) Division's Environmental Stewardship (ES) at Los Alamos National Laboratory (LANL). ES classifies LANL emission sources into one of four Tiers, based on the potential effective dose equivalent (PEDE) calculated for each point source. Detailed descriptions of these tiers are provided in Section 3. The usage survey is conducted annually; in odd-numbered years the survey addresses all monitored and unmonitored point sources and in even-numbered years it addresses all Tier III and various selected other sources. This graded approach was designed to ensure that the appropriate emphasis is placed on point sources that have higher potential emissions to the environment. For calendar year (CY) 2011, ES has divided the usage survey into two distinct reports, one covering the monitored point sources (to be completed later this year) and this report covering all unmonitored point sources. This usage survey includes the following release points: (1) all unmonitored sources identified in the 2010 usage survey, (2) any new release points identified through the new project review (NPR) process, and (3) other release points as designated by the Rad-NESHAP Team Leader. Data for all unmonitored point sources at LANL is stored in the survey files at ES. LANL uses this survey data to help demonstrate compliance with Clean Air Act radioactive air emissions regulations (40 CFR 61, Subpart H). The remainder of this introduction provides a brief description of the information contained in each section. Section 2 of this report describes the methods that were employed for gathering usage survey data and for calculating usage, emissions, and dose for these point sources. It also references the appropriate ES procedures for further information. 
Section 3 describes the RMUS and explains how the survey results are organized. The RMUS Interview Form with the attached RMUS Process Form(s) provides the radioactive materials survey data by technical area (TA) and building number. The survey data for each release point includes information such as: exhaust stack identification number, room number, radioactive material source type (i.e., potential source or future potential source of air emissions), radionuclide, usage (in curies) and usage basis, physical state (gas, liquid, particulate, solid, or custom), release fraction (from Appendix D to 40 CFR 61, Subpart H), and process descriptions. In addition, the interview form also calculates emissions (in curies), lists mrem/Ci factors, calculates PEDEs, and states the location of the critical receptor for that release point. [The critical receptor is the maximum exposed off-site member of the public, specific to each individual facility.] Each of these data fields is described in this section. The Tier classification of release points, which was first introduced with the 1999 usage survey, is also described in detail in this section. Section 4 includes a brief discussion of the dose estimate methodology, and includes a discussion of several release points of particular interest in the CY 2011 usage survey report. It also includes a table of the calculated PEDEs for each release point at its critical receptor. Section 5 describes ES's approach to Quality Assurance (QA) for the usage survey. Satisfactory completion of the survey requires that team members responsible for Rad-NESHAP (National Emissions Standard for Hazardous Air Pollutants) compliance accurately collect and process several types of information, including radioactive materials usage data, process information, and supporting information. 
They must also perform and document the QA reviews outlined in Section 5.2.6 (Process Verification and Peer Review) of ES-RN, 'Quality Assurance Project Plan for the Rad-NESHAP Compliance Project' to verify that all information is complete and correct.
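The usage-to-dose chain the survey report describes (usage × release fraction → emissions; emissions × mrem/Ci factor → PEDE at the critical receptor) can be sketched as follows. The release-fraction categories follow the usual Appendix D structure, but the specific placement of "particulate" and the example numbers are illustrative assumptions, not values from the survey.

```python
# Sketch of the dose chain: usage -> emissions -> PEDE.
# Confirm release fractions against 40 CFR 61, Subpart H, Appendix D;
# the "particulate" placement here is an assumption.
APPENDIX_D_RELEASE_FRACTIONS = {
    "gas": 1.0,
    "liquid": 1e-3,
    "particulate": 1e-3,
    "solid": 1e-6,
}

def pede_mrem(usage_ci, physical_state, mrem_per_ci):
    """Potential effective dose equivalent (mrem) at the critical receptor."""
    emissions_ci = usage_ci * APPENDIX_D_RELEASE_FRACTIONS[physical_state]
    return emissions_ci * mrem_per_ci

# e.g. 2 Ci of a liquid source with a 0.5 mrem/Ci receptor factor:
dose = pede_mrem(2.0, "liquid", 0.5)   # 2 * 1e-3 * 0.5 = 1e-3 mrem
```

The mrem/Ci factor encapsulates dispersion modeling and receptor location, which is why the interview form lists it per release point.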

The Miley reservoir of the Rincon field is located in the Central Transverse Ranges of southern California on a structural high that borders the Santa Barbara Channel. The east-west-trending Rincon and Ventura anticlines are part of a major oil-productive trend containing the Rincon, San Miguelito, and Ventura Avenue fields, which have estimated ultimate recovery of 1.7 billion BOE. Hydrocarbon accumulations in the multiple and stacked reservoirs within these three fields are controlled by the complex interplay of late Pleistocene folding and reverse fault development. The detailed interpretation reported here combines reservoir performance data with subsurface structural geology and sequential tectonic development to provide a new understanding of the relationship of migration barriers to oil accumulation and production. The Miley reservoir is an axial- and fault-controlled accumulation on the eastern terminus of the Rincon anticline. It is located in a structural saddle formed by the doubly plunging Rincon and Ventura anticlinal trend. Three operative trapping mechanisms confine oil pools: (1) axial accumulations associated with reverse fault closures; (2) traps on the hanging wall of dip-slip reverse faults; and (3) a permeability barrier developed in response to flexural slip folding. Oil trapped within the Rincon-Miley reservoir was primarily generated beneath the Santa Barbara Channel and migrated up the south flank of the anticlinal trend. Four stages of structural development and hydrocarbon migration, encompassing the last 700,000 years, have implications for the enhanced development of reservoirs on this anticlinal trend.

The Ventura Basin, southern California, is located near the Big Bend area of the San Andreas fault system, within the Transverse Ranges physiographic province. Continuous equilibrium temperature logs were measured in 12 idle oil wells located within the onshore Ventura Avenue, San Miguelito, Fillmore, Oxnard, and West Montalvo fields to an average depth of about 3100 m (10,200 feet). Thermal conductivities were measured on all available samples. Heat flows were calculated with the aid of a thermostratigraphic scheme based on correlative gradient intervals and average thermal conductivity for the appropriate units. Negative curvature of the Ventura Avenue temperature profiles may be explained by an increase in thermal conductivity associated with tectonic compaction of the underlying Pliocene clastic sequence. Temperature profiles at Fillmore are enigmatic but suggest highly unusual geotectonic conditions. Basinwide, heat flow averages about 48 mW/m², a value which is low relative to most of southern California. As heat flow does not vary systematically to the maximum measured depth of about 4 km, this anomaly is not easily explained in terms of hydrologic effect or recent uplift and erosion. However, a diminution of heat flow is an expectable consequence of the accumulation of cold sediments (up to 12 km) since Eocene time. If 70 mW/m² is accepted as the background heat flow, then the sedimentation effect is probably sufficient to explain the anomaly.
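The heat-flow computation described above amounts to Fourier's law applied per gradient interval, q = k · dT/dz, averaged over the thermostratigraphic units. The sketch below uses illustrative gradients and conductivities, not the study's measured values.

```python
# Sketch of a heat-flow estimate from correlative gradient intervals.
# All numbers are illustrative, not values from the study.
intervals = [
    # (temperature gradient, degC/km ; mean thermal conductivity, W/m/K)
    (30.0, 1.6),
    (25.0, 2.0),
    (20.0, 2.4),
]

def interval_heat_flows(data):
    # degC/km * W/m/K -> mW/m^2 (the factor of 1000 in km cancels the mW).
    return [grad * k for grad, k in data]

q = interval_heat_flows(intervals)   # per-unit heat flows in mW/m^2
mean_q = sum(q) / len(q)             # basin-style average, near 48-49 mW/m^2 here
```

Consistency of q across units at different depths is what lets the study argue the low value is a basin-wide property rather than a shallow hydrologic artifact.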

This report is the second of a two-part study by Berkeley Lab of a DER (distributed energy resources) system at Naval Base Ventura County (NBVC). First, a preliminary assessment of the cost effectiveness of distributed energy resources at NBVC Building 1512 was conducted in response to the base's request for design assistance to the Federal Energy Management Program (Bailey and Marnay, 2004). That report contains a detailed description of the site and the DER-CAM (Consumer Adoption Model) parameters used. This second report contains sensitivity analyses of key parameters in the DER system model of Building 1512 at NBVC and additionally considers the potential for absorption-powered refrigeration. The prior analysis found that under the current tariffs, and given assumptions about the performance and structure of building energy loads and available generating technology characteristics, installing a 600 kW DER system with absorption cooling and recovery heat capabilities could deliver cost savings of about 14 percent, worth $55,000 per year. However, under current conditions, this study also suggested that significant savings could be obtained if Building 1512 changed from its current direct access contract to an SCE TOU-8 (Southern California Edison time of use tariff number 8) rate without installing a DER system. Evaluated on this tariff, the potential savings from installation of a DER system would be about 4 percent of the total bill, or $16,000 per year.
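As a quick arithmetic check, the quoted percentage and dollar figures imply annual electricity bills of roughly $400,000. The implied-bill values below are derived from the report's own numbers ($55,000 at 14 percent; $16,000 at 4 percent), not stated in it.

```python
# Back-of-envelope check of the savings figures quoted in the two reports.
savings_direct = 55_000    # $/yr, ~14% under the direct access rate
savings_tou8 = 16_000      # $/yr, ~4% under SCE TOU-8

implied_bill_direct = savings_direct / 0.14   # ~ $393k/yr implied annual bill
implied_bill_tou8 = savings_tou8 / 0.04       # ~ $400k/yr implied annual bill
```

The two implied bills agree to within a few percent, which is consistent with the percentages being rounded figures against roughly the same total energy bill.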

Decisions regarding materials and construction of a building are made all the time in the architectural process, but thought is not always given to how those choices may affect the building's ultimate energy usage and the ...

Middle Eocene strata in the Topatopa Mountains, northeastern Ventura Basin, California, were deposited as five unconformity-bounded depositional sequences in a seismically-active basin characterized by rapid, episodic subsidence. Analysis of stratigraphic geometries, detailed facies analysis, and backstripping-derived subsidence rates indicate that tectonically-induced differential subsidence caused basin-wide migrations of the Topatopa depocenter at least every 2 m.y. Unusually abundant convolute laminations and a slumped interval may record the influence of earthquakes. The convolute laminations do not grade downward into ripple laminations and are not associated with dewatering dikes and pipes as would be expected if the laminations were formed by shear from an overlying current or by dewatering during rapid burial. Thus, formation during shock-induced liquefaction is more likely. Also, a 20-m thick slumped interval in the uppermost Cozy Dell Formation underlies a sequence boundary interpreted as a surface across which depocenter migration took place. Association of this slumped interval with a tectonically-formed surface is consistent with deposition in a seismically-active environment. Rapid, differential subsidence in the Topatopa depocenter was probably episodic and associated with seismic events. Rapid, episodic subsidence with attending earthquakes is recorded in Holocene strata of the Humboldt basin, an analog to the Topatopa depocenter. Also, a middle Tertiary slumped interval in the southernmost San Joaquin basin is attributed to a seismic event. Similar earthquake-related events are recorded by sedimentation patterns in the Middle Eocene Ventura Basin and may be evident in the strata of other active-margin basins as well.

SPIRES, an aging high-energy physics publication database, is in the process of being replaced by INSPIRE. In order to ease the transition from SPIRES to INSPIRE it is important to understand user behavior and the drivers for adoption. The goal of this project was to address some questions regarding the presumed two-thirds of users still using SPIRES. These questions are answered through analysis of the log files from both websites. A series of scripts was developed to collect and interpret the data contained in the log files. Common search patterns and usage comparisons are made between INSPIRE and SPIRES, and a method for detecting user frustration is presented. The analysis reveals a more even split than originally thought, as well as the expected trend of user transition to INSPIRE.
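The report does not spell out its frustration heuristic. As one plausible sketch in the same spirit, flag users who issue several consecutive searches without ever viewing a record; the event format, action names, and threshold below are all assumptions for illustration.

```python
from collections import defaultdict

# Hypothetical (user, action) event stream reconstructed from log files.
events = [
    ("u1", "search"), ("u1", "search"), ("u1", "search"), ("u1", "search"),
    ("u2", "search"), ("u2", "view"),
]

def frustrated_users(log, threshold=3):
    """Flag users with `threshold` or more searches in a row and no view."""
    streak = defaultdict(int)
    flagged = set()
    for user, action in log:
        if action == "search":
            streak[user] += 1
            if streak[user] >= threshold:
                flagged.add(user)
        else:
            streak[user] = 0   # a view ends the unproductive streak
    return flagged

frustrated_users(events)   # flags only u1, who searched four times with no view
```

A heuristic of this shape only needs per-user ordering of log entries, which is exactly what simple log-parsing scripts can provide.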

This handbook documents many of the standard interface file formats that have been adopted by the US Department of Energy to facilitate communications between, and portability of, various large reactor physics and radiation transport software packages. The emphasis is on those files needed for use of the VENTURE/PC diffusion-depletion code system. File structures, contents and some practical advice on use of the various files are provided.

VEHICLE USAGE AGREEMENT DEPARTMENT OF BIOLOGICAL SCIENCE All drivers of vehicles must certify to the following: 1. I certify that I have a valid driver's license appropriate for the vehicle type and will abide belts. 2. I have read and understand the vehicle operating policies and procedures as defined

University Town Management Office Town Green Usage Background The UTown Management Office (UTMO the sustainability of the Green as the Office of Estate & Development (OED) has highlighted that the landscape consultant has advised that it would require about a one-year period for the soil to settle to its final level

of the Mobile Learning Portal, at mobilelearningportal.org. This Web site is a Learning Technology Center Report on Activities and Usage Statistics of Learning Technology Center Services and Facilities 2009-2010. Learning Technology Center Review 2009-2010, December 2010. Report on Activities

Controlling the usage of computer systems particularly those operated for the federal government, is an important topic today. Audit requirements have grown to the point where they can be a significant burden to the proprietors of the system. The paper briefly mentions several proposals for responding to increased audit requirements and for monitoring a system to detect unauthorized activity. A technique is proposed for situations where the proscribed or the intended activity can be characterized in terms of program or system performance parameters. The design of a usage monitoring system is outlined. The design is based on enhancing the audit data provided by the monitored system, capturing the audit data in a separate system to protect it from user access, and implementing one of the audit trail analysis systems currently under development.
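The proposed technique, characterizing intended activity in terms of program or system performance parameters and flagging audit records that fall outside that profile, can be sketched as follows. The field names, programs, and thresholds are assumptions for illustration, not from the paper.

```python
# Per-program performance profile: what "intended activity" looks like.
# (Illustrative limits; a real profile would be derived from audit history.)
profile = {
    # program: (max CPU seconds, max I/O operations) considered normal
    "payroll": (10.0, 5_000),
    "backup": (120.0, 1_000_000),
}

def flag_anomalies(audit_records):
    """Return audit records whose performance parameters fall outside profile."""
    flagged = []
    for rec in audit_records:
        limits = profile.get(rec["program"])
        if limits is None:
            flagged.append(rec)            # unknown program: always review
        elif rec["cpu_s"] > limits[0] or rec["io_ops"] > limits[1]:
            flagged.append(rec)            # outside its performance profile
    return flagged

records = [
    {"program": "payroll", "cpu_s": 2.0, "io_ops": 100},
    {"program": "payroll", "cpu_s": 55.0, "io_ops": 100},  # CPU far above profile
]
flag_anomalies(records)   # flags only the second record
```

Running this analysis on a separate system, as the design proposes, keeps both the profile and the audit trail out of reach of the monitored users.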

REDUCING ENERGY USAGE IN EXTRACTIVE DISTILLATION A. C. Saxena V. A. Bhandari Polysar Limited Sarnia, Ontario, Canada Abstract 1,3-Butadiene is separated from other C4 hydrocarbons by extractive distillation in a sieve plate tower.... To improve the energy efficiency, butadiene recovery and productivity of the extractive distillation process, many process changes have been made. Their rationale, the methodology used to implement the various changes, and how they affected the process...

PREDICTING ENERGY USAGE IN A SUPERMARKET A Thesis by DEREK WAYNE SCHROCK Submitted to the Office of Graduate Studies of Texas A&M University in partial fulfillment of the requirements for the degree of MASTER OF SCIENCE December 1989 Major...), which consume 24.8% of the energy used by all commercial buildings [1]. This thesis reports on work intended to identify conservation opportunities in the mercantile services category by focusing on operational and maintenance problems. A major...

Register file soft error recovery including a system that includes a first register file and a second register file that mirrors the first register file. The system also includes an arithmetic pipeline for receiving data read from the first register file, and error detection circuitry to detect whether the data read from the first register file includes corrupted data. The system further includes error recovery circuitry to insert an error recovery instruction into the arithmetic pipeline in response to detecting the corrupted data. The inserted error recovery instruction replaces the corrupted data in the first register file with a copy of the data from the second register file.
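The recovery scheme described above can be modeled in software. This toy class mirrors every write into a second register file, detects corruption on read, and restores from the mirror; the even-parity check is an assumption for illustration, since the text does not specify the detection code used by the error detection circuitry.

```python
class MirroredRegisterFile:
    """Toy software model: primary file, mirror copy, parity-based detection."""

    def __init__(self, n):
        self.primary = [0] * n
        self.mirror = [0] * n
        self.parity = [0] * n            # even parity per register (assumed code)

    def write(self, i, value):
        # Writes go to both files, and the parity bit is updated.
        self.primary[i] = self.mirror[i] = value
        self.parity[i] = bin(value).count("1") % 2

    def read(self, i):
        word = self.primary[i]
        if bin(word).count("1") % 2 != self.parity[i]:   # corrupted data detected
            # Software analog of the inserted error recovery instruction:
            # replace the corrupted primary entry with the mirror copy.
            self.primary[i] = word = self.mirror[i]
        return word

rf = MirroredRegisterFile(4)
rf.write(0, 0b1011)
rf.primary[0] ^= 0b0100        # simulate a single-bit soft error in the primary
rf.read(0)                     # parity mismatch detected; returns mirrored 0b1011
```

In the patented hardware the recovery happens in the arithmetic pipeline rather than in the read path, but the data flow (detect on read, repair from the mirror) is the same.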

Some of the efforts of CERN to optimize its energy usage are presented. Work is proceeding in several areas: campus and infrastructure, accelerators and beam lines, as well as R&D for future accelerators. The existing building stock is being renovated and new buildings apply an integrated energy concept. The energy efficiency of the accelerators and beam lines can be further enhanced by powering more equipment dynamically, only during the time it is really needed. For designing new accelerators, novel approaches are being investigated to re-use energy which in today's installations gets dumped thermally.

This report provides a comparative study of advanced supercomputing usage in Japan and the United States as of Spring 1994. It is based on the findings of a group of US scientists whose careers have centered on programming, evaluating, and designing high-performance supercomputers for over ten years. The report is a follow-on to an assessment of supercomputing technology in Europe and Japan that was published in 1993. Whereas the previous study focused on supercomputer manufacturing capabilities, the primary focus of the current work was to compare where and how supercomputers are used. Research for this report was conducted through both literature studies and field research in Japan.

Two recent assessments of the undiscovered oil and gas resources of Los Padres National Forest lands in the Ventura Basin Province using different methodologies and personnel show remarkable coincidence of estimated resources. The 1989 U.S. Geological Survey assessment was part of a National appraisal. In the Ventura Basin Province, two separate plays were assessed and a percentage of resources from these plays was allocated to Federal lands. By this allocation, the undiscovered oil and gas resources of this part of the Los Padres National Forest are estimated to range from <10-140 MMBO (mean probability 60 MMBO, million barrels of oil) and 10-250 BCFG (mean probability 110 BCFG, billion cubic feet of gas). In 1993, the U.S. Forest Service completed an oil and gas assessment of the entire 1.8 million-acre Los Padres National Forest as part of a Reasonably Foreseeable Oil and Gas Development Scenario. In those areas of the forest considered to have high potential for the occurrence of oil and gas deposits, a deposit simulation model was used. This method is based on a fundamental reservoir engineering formula in the USGS computer program, FASPU (Fast Appraisal System for Petroleum-Universal). By this method, the undiscovered oil and gas resources of this part of the Los Padres National Forest are estimated to range from 0-182 MMBO (mean probability 56 MMBO) and 9-233 BCFG (mean probability 103 BCFG). An additional 6 MMBO (mean probability) is allocated to forest lands with medium potential within this province but not to any specific prospects. The remarkable coincidence of estimated resources resulting from such different assessment methods and personnel is noteworthy and appears to provide an increased measure of confidence in the estimates.

This report was prepared in partial fulfillment of Subcontract No. C90-103207 by Baxter D. Honeycutt, P.E., Richardson, Texas, for the Idaho National Engineering Laboratory (INEL) and the US DOE. INEL requirements for the requested report were outlined in a letter dated September 4, 1990, and included the following: process flow diagrams and descriptive discussions of technical operations; mass and energy balances; a summary of energy-saving opportunities, with cross-cutting technologies emphasized; trends of oil and gas production versus energy expended to achieve new production; and conclusions and recommendations for future research. The National Energy Account (NEA) data on energy usage in oil- and gas-related extraction processes are reproduced for reference. Energy cost and production figures are given for oil and gas well drilling, crude oil production, natural gas production, and natural gas liquid production.

KOINOTITES: A Web Usage Mining Tool for Personalization Dimitrios Pierrakos Inst. of Informatics@iit.demokritos.gr SUMMARY This paper presents the Web Usage Mining system KOINOTITES, which uses data mining techniques for the construction of user communities on the Web. User communities model groups of visitors in a Web site, who have

EQPT is a data file preprocessor for the EQ3/6 software package. EQ3/6 currently contains five primary data files, called data0 files. These files comprise alternative data sets. These data files contain both standard-state and activity coefficient-related data. Three (com, sup, and nea) support the use of the Davies or B-dot equations for the activity coefficients; the other two (hmw and pit) support the use of Pitzer's (1973, 1975) equations. The temperature range of the thermodynamic data on these data files varies from 25°C only to 0-300°C. The principal modeling codes in EQ3/6, EQ3NR and EQ6, do not read a data0 file directly, however. Instead, these codes read an unformatted equivalent called a data1 file. EQPT writes a data1 file, using the corresponding data0 file as input. In processing a data0 file, EQPT checks the data for common errors, such as unbalanced reactions. It also conducts two kinds of data transformation. Interpolating polynomials are fit to data which are input on temperature grids. The coefficients of these polynomials are then written on the data1 file in place of the original temperature grids. A second transformation pertains only to data files tied to Pitzer's equations. The commonly reported observable Pitzer coefficient parameters are mapped into a set of primitive parameters by means of a set of conventional relations. These primitive-form parameters are then written onto the data1 file in place of their observable counterparts. Usage of the primitive-form parameters makes it easier to evaluate Pitzer's equations in EQ3NR and EQ6. EQPT and the other codes in the EQ3/6 package are written in FORTRAN 77 and have been developed to run under the UNIX operating system on computers ranging from workstations to supercomputers.
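The interpolating-polynomial transformation can be sketched in a few lines. The temperature grid and log K values below are illustrative stand-ins, not data from an actual data0 file, and the degree-4 fit is an assumption rather than EQPT's actual choice of polynomial order.

```python
import numpy as np

# Hypothetical temperature grid (degrees C) and log K values for one reaction,
# standing in for the 0-300 C grids found on a data0 file.
temps = np.array([0.0, 25.0, 60.0, 100.0, 150.0, 200.0, 250.0, 300.0])
log_k = np.array([14.94, 13.99, 13.02, 12.26, 11.64, 11.28, 11.17, 11.30])

# Fit an interpolating polynomial; EQPT stores coefficients like these on the
# data1 file in place of the original temperature grid.
coeffs = np.polyfit(temps, log_k, deg=4)

def log_k_at(t):
    """Evaluate the fitted polynomial at temperature t (degrees C)."""
    return np.polyval(coeffs, t)

print(log_k_at(25.0))  # should closely reproduce the grid value at 25 C
```

Storing five coefficients instead of an eight-point grid is what lets EQ3NR and EQ6 evaluate the data at arbitrary temperatures without re-interpolating.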

coherent directory structure for users. Particular files are directed to appropriate underlying file systems by intercepting system calls connecting the Virtual File System (VFS) to the underlying file systems. Files are evaluated by a policy module...

A growing number of building owners are benchmarking their building energy use. This requires the building owner to acquire monthly whole-building energy usage information, which can be challenging for buildings in which individual tenants have their own utility meters and accounts with the utility. Some utilities and utility regulators have turned to aggregation of customer energy use data (CEUD) as a way to give building owners whole-building energy usage data while protecting customer privacy. Meter profile aggregation adds a layer of protection that decreases the risk of revealing CEUD as the number of meters aggregated increases. The report statistically characterizes the similarity between individual energy usage patterns and whole-building totals at various levels of meter aggregation.
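The threshold idea behind meter aggregation can be sketched as follows; the meter names, readings, and four-meter threshold are illustrative assumptions, not values from the report.

```python
# Threshold-based meter aggregation: the whole-building total is released
# only when enough tenant meters are combined, protecting individual
# customer energy use data (CEUD).
MIN_METERS = 4  # assumed privacy threshold for illustration

def aggregate(monthly_kwh):
    """Return the whole-building monthly total, or None if too few meters."""
    if len(monthly_kwh) < MIN_METERS:
        return None  # releasing this could reveal an individual tenant's usage
    return sum(monthly_kwh.values())

meters = {"tenant_a": 1210.5, "tenant_b": 987.0,
          "tenant_c": 1540.2, "tenant_d": 860.3}
print(aggregate(meters))                 # total across 4 meters
print(aggregate({"tenant_a": 1210.5}))   # None: below threshold
```

The report's point is that the risk of recovering any single meter's profile from the total shrinks as the number of aggregated meters grows.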

Tracking software usage is important for HPC centers, computer vendors, code developers and funding agencies to provide more efficient and targeted software support, and to forecast needs and guide HPC software effort towards the Exascale era. However, accurately tracking software usage on HPC systems has been a challenging task. In this paper, we present a tool called Automatic Library Tracking Database (ALTD) that has been developed and put in production on several Cray systems. The ALTD infrastructure prototype automatically and transparently stores information about libraries linked into an application at compilation time and also the executables launched in a batch job. We will illustrate the usage of libraries, compilers and third party software applications on a system managed by the National Institute for Computational Sciences.
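A toy version of ALTD-style bookkeeping might look like the following; the table layout and library names are invented for illustration and are not the actual ALTD schema.

```python
import sqlite3

# Record which libraries were linked into which executable at build time,
# then query usage counts per library (the kind of report ALTD enables).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE link_log (executable TEXT, library TEXT)")

def record_link(executable, libraries):
    """Store one row per library linked into an executable."""
    db.executemany("INSERT INTO link_log VALUES (?, ?)",
                   [(executable, lib) for lib in libraries])

record_link("a.out", ["libsci", "libmpich"])
record_link("sim.x", ["libmpich", "libhdf5"])

rows = db.execute("SELECT library, COUNT(*) FROM link_log "
                  "GROUP BY library ORDER BY COUNT(*) DESC").fetchall()
print(rows)  # libmpich appears twice, the others once
```

The real system does this transparently by intercepting the linker and the batch launcher, so users never populate the database by hand.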

Improving Home Automation by Discovering Regularly Occurring Device Usage Patterns Edwin O in an environment can be mined to discover significant patterns, which an intelligent agent could use to automate of two prediction algorithms, thus demonstrating multiple uses for a home automation system. Finally, we

Statler College Research Data Center Access and Usage Policy Objective Access to the Statler under WVU and federal guidelines. Scope This policy applies to all Statler staff, faculty, administrators, officers, contractors and students. Policy Access to the RDC shall be limited to those authorized

Towards activity-relevant attribution of IT energy usage From a previous study we were able to disaggregate per socket energy consumption from whole domicile energy consumption. We discovered that Home preparation/storage, it is the largest contributor to personal energy consumption. Understanding how and when

A Web Usage Mining Framework for Web Directories Personalization Dimitrios Pierrakos Department framework that combines Web personalization and Web directories, which results in the concept of Community Web Directories. Community Web directories is a novel form of personalization performed on Web

Personalizing Web Directories with the Aid of Web Usage Data Dimitrios Pierrakos, Member, IEEE of Community Web Directories, a concept that we introduced in our recent work, applying personalization to Web directories. In this context, the Web directory is viewed as a thematic hierarchy and personalization

A nuclear fuel bundle includes a square array of fuel rods each having a concentration of enriched uranium and plutonium. Each rod of an interior array of the rods also has a concentration of gadolinium. The interior array of rods is surrounded by an exterior array of rods void of gadolinium. By this design, usage of plutonium in the nuclear reactor is enhanced.

computing field such as electronic structure calculations, and in several other contexts. We are considering sizes M like those arising in scientific computing such as electronic structure calculations [1, 2, 3]. Dynamic Memory Usage Optimization using ILP. A. Allam and J. Ramanujam, Electrical Engineering

Transportation & Work: Exploring Car Usage and Employment Outcomes in the LSAL Data Field Area of 1996, or "welfare reform," attention turned to the role of transportation in job search and employment information on car ownership as well as employment history, literacy proficiency, and measures of social

Exploiting Memory Usage Patterns to Improve Garbage Collections in Java. Liangliang Tong, Department of Computer Science, The University of Hong Kong, fcmlau@cs.hku.hk. Copying-based garbage collectors. It is a common belief, however, that objects' survival rates are generally too low to make full use

criminals to hide themselves. In this paper, we provide usage and geo-location analysis of major- linkability, i.e., observers cannot link an agent to a specific message or action. Equally contributing a dichotomous issue in both social life and cyberspace. Anonymity technologies have been used for criminal

Personalized Power Saving Profiles Generation Analyzing Smart Device Usage Patterns Soumya Kanti interactions of smart devices. This paper describes a client-server architecture that proposes personalized and they are sent back to the smart devices. These profiles are highly personalized since they are developed

As consumption in the US grows, so does concern about sustainable materials usage. Increasing recycling is a key component within a broad arsenal of strategies for moving towards sustainable materials usage. There are many ...

Residential insecticide usage in northern California homes with young children XIANGMEI (MAY) WUa of Health and Environment, Seoul National University, Seoul, South Korea Residential insecticide usage and August 2008. Structured telephone interviews were conducted collecting information on residential use

Economic Status) affected children's creativity scores and computer game usage. Children using computer games heavily showed significantly higher scores on the scale of Figural Originality than those with moderate usage. Highly structured activity students...

CREATION OF CHIMERA THROUGH THE USAGE OF AN INSPIRATIONAL SYSTEM. A Thesis by BRANDI NICOLE PARISH, submitted to the Office of Graduate Studies of Texas A&M University in partial fulfillment of the requirements for the degree of MASTER OF SCIENCE. Approved by: Chair of Committee, Carol LaFayette; Committee Members, Frederic Parke, Kevin Glowacki; Head of Department, Tim McLaughlin. May 2013. Major Subject: Visualization. Copyright 2013 Brandi Nicole Parish.

By improving our understanding of residential lighting-energy usage and quantifying it across many different parameters, the new study will be of use to anyone doing energy estimates  such as utilities, market and investment analysts, and government agencies. It will also help manufacturers design products that not only better serve consumers' needs, but that maximize the energy savings that technologies like SSL make possible.

is to present a comparison study of SVC and K-means for Web usage data mining using real-life Web logs (http://www.hippocrates.ouhsc.edu). We chose K-means for comparison because it is a widely used algorithm for clustering (Shahabi et al.). Section 2 describes SVC and K-means. Sections 3 and 4 present the comparison studies and experimental

The summary report presents an overview of the level of traveler usage (e.g., how many vehicles use the freeways) and travel performance (e.g., how fast they are traveling, where and how often congestion occurs) on the principal urban freeways in the central Puget Sound area for 1997. Data presented in this report were collected by the Washington State Department of Transportation's (WSDOT's) freeway surveillance system.

Federal regulations exist to ensure the proper distribution and usage of veterinary drugs and to prevent adulteration of the food supply with illegal drug residues through drug misuse in food-producing animals. The Food and Drug Administration... of omission or commission, violative residues in livestock and poultry (by irresponsible and illegal distribution and use of drugs) violates state and federal laws. When FSIS inspectors detect violative drug residues in food products derived from animals...

Session Lengths and IP Address Usage of Smartphones in a University Campus WiFi Network. Compared to laptops and desktop PCs, network usage characteristics of smartphones may differ significantly because they may be used more opportunistically. In this paper, we study two important network usage characteristics

Results of Comparing Bandwidth Usage and Latency: Service Location Protocol and Jini. Javier Govea. This paper focuses on analyzing and evaluating two of these limitations, bandwidth usage and latency, of two present the results of our bandwidth usage analysis and the results of a comparison of the latency of SLP

Red Storm is an Advanced Simulation and Computing (ASC) funded massively parallel supercomputer located at Sandia National Laboratories (SNL). The Red Storm Usage Model (RSUM) documents the capabilities and the environment provided for the FY05 Tri-Lab Level II Limited Availability Red Storm User Environment Milestone and the FY05 SNL Level II Limited Availability Red Storm Platform Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and SNL. Additionally, the Red Storm Usage Model maps the provided capabilities to the Tri-Lab ASC Computing Environment (ACE) requirements. The ACE requirements reflect the high performance computing requirements for the ASC community and have been updated in FY05 to reflect the community's needs. For each section of the RSUM, Appendix I maps the ACE requirements to the Limited Availability User Environment capabilities and includes a description of ACE requirements met and those requirements that are not met in that particular section. The Red Storm Usage Model, along with the ACE mappings, has been issued and vetted throughout the Tri-Lab community.

How to batch upload video files with Unison. There are two different ways to upload already existing video files into Unison: upload from the New Session page (only allows one video file to be uploaded at a time), or launch the editor in Composer (allows

The large-scale analysis of scholarly artifact usage is constrained primarily by current practices in usage data archiving, privacy issues concerned with the dissemination of usage data, and the lack of a practical ontology for modeling the usage domain. As a remedy to the third constraint, this article presents a scholarly ontology that was engineered to represent those classes for which large-scale bibliographic and usage data exists, supports usage research, and whose instantiation is scalable to the order of 50 million articles along with their associated artifacts (e.g. authors and journals) and an accompanying 1 billion usage events. The real-world instantiation of the presented abstract ontology is a semantic network model of the scholarly community which lends the scholarly process to statistical analysis and computational support. We present the ontology, discuss its instantiation, and provide some example inference rules for calculating various scholarly artifact metrics.
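A miniature instantiation of such an ontology as subject-predicate-object triples, with one example inference rule (usage events per author), might look like this; the class and predicate names are illustrative, not the ontology's actual vocabulary.

```python
# Toy semantic network: articles, authors, and usage events as triples.
# Predicate names ("authoredBy", "requests") are invented for illustration.
triples = [
    ("article:1", "authoredBy", "author:smith"),
    ("article:2", "authoredBy", "author:smith"),
    ("article:2", "authoredBy", "author:jones"),
    ("event:a",   "requests",   "article:1"),
    ("event:b",   "requests",   "article:2"),
    ("event:c",   "requests",   "article:2"),
]

def usage_per_author(triples):
    """Inference rule: join usage events to articles, articles to authors."""
    article_authors = [(s, o) for s, p, o in triples if p == "authoredBy"]
    requested = [o for s, p, o in triples if p == "requests"]
    counts = {}
    for article in requested:
        for art, author in article_authors:
            if art == article:
                counts[author] = counts.get(author, 0) + 1
    return counts

print(usage_per_author(triples))
```

At the scale the article targets (50 million articles, a billion events), the same rule would run over a proper triple store rather than Python lists, but the join structure is the same.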

Nanofiltration (NF) can effectively treat cooling-tower water to reduce water consumption and maximize the water usage efficiency of thermoelectric power plants. A pilot is being run to verify theoretical calculations. A side stream of water from a 900 gpm cooling tower is being treated by NF, with the permeate returning to the cooling tower and the concentrate being discharged. The membrane recovery efficiency is over 50%. Salt rejection ranges from 77-97%, with higher rejection for divalent ions. The pilot has demonstrated a reduction of makeup water of almost 20% and a reduction of discharge of over 50%.
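The side-stream split can be sketched as simple arithmetic; the side-stream flow and recovery fraction below are assumptions for illustration (only the 900 gpm circulating flow and the "over 50%" recovery figure come from the text).

```python
# Back-of-the-envelope mass balance for the NF side stream.
side_stream_gpm = 90.0  # assumed fraction of the 900 gpm circulating flow
recovery = 0.55         # assumed recovery, consistent with "over 50%"

permeate_gpm = side_stream_gpm * recovery         # returned to the tower
concentrate_gpm = side_stream_gpm - permeate_gpm  # discharged

print(permeate_gpm, concentrate_gpm)
```

Every gallon of permeate returned is a gallon of makeup water avoided, which is how a modest side stream yields the reported ~20% makeup reduction.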

Nowadays, high volatile bituminous coals are broadly used for metallurgical coke production in Russia. The share of such coals in the coking blend varies from 20 to 40% by weight. There are some large coal deposits in the Kuznetskii basin which have coals with low caking tendency. The low caking properties of such coals limit their application in the coking process. At the same time, the usage of low caking coals for coke production would allow flexibility in the feedstock for coke production. Preliminary tests carried out in COAL-C's lab have shown some differences in coal properties depending on the size distribution. That is why the separation of the well-caking fraction from petrographically heterogeneous coals and its further usage in the coking process may be promising. Another way to apply low caking coals in the coke industry is briquette production from such coals. This method has been known for a very long time. It may be divided into two possible directions. The first is direct coking of briquettes from the low caking coals. The other is adding briquettes to coal blends in defined proportions for combined coking. The possibility of applying the coal beneficiation methods mentioned above was investigated in the present work.

Despite continual improvements in the performance and reliability of large scale file systems, the management of file system metadata has changed little in the past decade. The mismatch between the size and complexity of large scale data stores and their ability to organize and query their metadata has led to a de facto standard in which raw data is stored in traditional file systems, while related, application-specific metadata is stored in relational databases. This separation of data and metadata requires considerable effort to maintain consistency and can result in complex, slow, and inflexible system operation. To address these problems, we have developed the Quasar File System (QFS), a metadata-rich file system in which files, metadata, and file relationships are all first class objects. In contrast to hierarchical file systems and relational databases, QFS defines a graph data model composed of files and their relationships. QFS includes Quasar, an XPath-extended query language for searching the file system. Results from our QFS prototype show the effectiveness of this approach. Compared to the de facto standard, the QFS prototype shows superior ingest performance and comparable query performance on user metadata-intensive operations, and superior performance on normal file metadata operations.
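The graph data model can be illustrated with a minimal sketch; the file names, edge labels, and one-step query helper are invented for illustration (QFS itself is queried through Quasar, its XPath-extended language).

```python
# Files as nodes with attributes; relationships as directed, labeled edges.
files = {"raw.dat": {"type": "data"},
         "run.cfg": {"type": "config"},
         "out.h5":  {"type": "result"}}

# Each edge is (source, label, target) — first-class, like the files.
edges = [("out.h5", "derivedFrom", "raw.dat"),
         ("out.h5", "configuredBy", "run.cfg")]

def related(name, label):
    """Follow edges with a given label from one file: a one-step query."""
    return [dst for src, lbl, dst in edges if src == name and lbl == label]

print(related("out.h5", "derivedFrom"))  # ['raw.dat']
```

Because relationships live in the file system itself, a query like "which inputs produced out.h5" needs no side database to stay consistent with the files.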

Although recording of usage data is common in scholarly information services, its exploitation for the creation of value-added services remains limited due to concerns regarding, among others, user privacy, data validity, and the lack of accepted standards for the representation, sharing and aggregation of usage data. A technical, standards-based architecture for sharing usage information is presented. In this architecture, OpenURL-compliant linking servers aggregate usage information of a specific user community as it navigates the distributed information environment that it has access to. This usage information is made OAI-PMH harvestable so that usage information exposed by many linking servers can be aggregated to facilitate the creation of value-added services with a reach beyond that of a single community or a single information service.

Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
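The aggregate-and-index scheme described above can be sketched as follows; the metadata layout and file names are assumptions for illustration, not the patented format.

```python
import io

# Concatenate many small files into one stream and keep (offset, length)
# metadata so any individual file can be unpacked later.
def aggregate(files):
    blob, meta, offset = io.BytesIO(), {}, 0
    for name, data in files.items():
        blob.write(data)
        meta[name] = {"offset": offset, "length": len(data)}
        offset += len(data)
    return blob.getvalue(), meta

def unpack(blob, meta, name):
    """Recover one file from the aggregated blob using its metadata entry."""
    entry = meta[name]
    return blob[entry["offset"]:entry["offset"] + entry["length"]]

blob, meta = aggregate({"rank0.out": b"alpha", "rank1.out": b"beta"})
print(unpack(blob, meta, "rank1.out"))  # b'beta'
```

The payoff in a parallel file system is that thousands of per-process files become one large sequential write, while the offset/length index preserves random access to each original file.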

GDCT Initialization File [gdct.ini] Format Specification Guide. Written by: Jeremy Bradbury, June 22, 2000. Below is the layout of the gdct.ini file. It is important to note the following: · If the gdct.ini to represent that a file does not exist under the "Recent Files" section of the gdct.ini file. [Internal

Saving Output to a File (Using Codeblocks or Dev-C++) Saving Your Output to a File To save | New | Source File. d. In the new window, right-click and select Paste. e. Then select "File | Save as" to save and name the file. i. In the window that pops up, the bottom fill-in box is labelled "Save as type

community. In response to this demand, Columbia University Libraries (CUL). While the aggregate results indicate that e-book use continues to increase, usage rates are not uniform across

1 A taxonomy of virtual worlds usage in education Ishbel Duncan, Alan Miller and Shangyi Jiang from around the world. A taxonomy is then derived from these articles, delineating current theoretical and practical work on Virtual World usage, specifically in the field of education. The taxonomy identifies rich

The User Side of Sustainability: Modeling Behavior and Energy Usage in the Home. Chao Chen, Diane, in everyday home environments to examine the relationship between behavioral patterns and energy consumption; technologies that examine energy usage in homes and to encourage energy-efficient behaviors, in addition

Clustering Educational Digital Library Usage Data: A Comparison of Latent Class Analysis and K-Means Analysis is superior to K-means on all three comparisons. In particular, LCA is more immune to the variance, the widely used K-means and the model-based Latent Class Analysis, are compared, using usage data from

1 Web Usage Mining as a Tool for Personalization: a survey D. Pierrakos+ , G. Paliouras+ , C GREECE Abstract: This paper is a survey of recent work in the field of Web Usage Mining for the benefit of research on the personalization of Web-based information services. The essence of personalization

Water usage dropping on campus, but UT hopes to lower it more. For six years, UT has worked to decrease its water usage, but the University still has a ways to go. The University was using one billion gallons of water per year across buildings, irrigation, and chilling stations

The Comparison of Usage and Availability Measurements for Evaluating Resource Preference. Douglas H.

Modeling the Performance and the Energy Usage of Wireless Sensor Networks by Retrial Queueing and the energy usage of the sensor network. Two operations are compared. In the first case, only the event-driven requests can initiate reaching the radio transmission (RF) unit. Time-driven requests have to wait

Usage Patterns of the Java Standard API. Homan Ma, Robert Amor, Ewan Tempero, Department of Computer Science. The Java Standard API has grown enormously since Java's beginnings, now consisting of over 3 to help determine the "typical" usage of the Standard API. We find that, in an extensive corpus of open

Instructions for transmitting Collector files to KFS In order for a department's KFS Collector file. The name of the KFS Collector file transferred to the Information Systems server by each department should in .xml. Once a department's Collector file has been processed by KFS it will be removed from

Contributing Storage using the Transparent File System JAMES CIPAR and MARK D. CORNER and EMERY D barrier to the adoption of contributory storage systems is that contributing a large quantity of local--all of the currently available space-- without impacting the performance of ordinary file access operations. We show

Figure 1. HomeWindow showing the energy usage of several monitors, displays, and computers with colored auras. Red is high usage while green is low usage. HomeWindow: An Augmented Reality Domestic that reflects its current and historical energy use.

energy sectors of the Scottish market) to 30 December 2010. The information gained from the study is used. Woodfuel Demand and Usage in Scotland Report 2011, Hudson Consulting, September 2011.

This paper will highlight the key revisions to existing District Rule 74.17, Solid Waste Disposal Sites, and the key requirements of new District Rule 74.17.1, Municipal Solid Waste Landfills, to meet new federal requirements. The rule action is necessary to incorporate and implement the requirements of a New Source Performance Standard (NSPS) in Title 40 CFR, Part 60, Subpart Cc -- Emission Guidelines and Compliance Times for Municipal Solid Waste Landfills. The Ventura County Air Pollution Control District (District) is one of only a few districts in California that had adopted a landfill gas control rule before the federal EG requirements were adopted by the US Environmental Protection Agency (EPA) in March of 1996. Also, because existing District Rule 74.17 requirements were adopted into the State Implementation Plan (SIP) by the EPA in 1994, several key requirements are carried forward into new District Rule 74.17.1 to prevent a relaxation of the requirements that existing MSW landfills already fulfill. The goal of the rule action was to develop revisions to existing District Rule 74.17 and requirements for new District Rule 74.17.1 that, at a minimum, would incorporate and implement the requirements specified by the EG without causing a relaxation of the existing rule requirements. Because existing District Rule 74.17 and the EG have different non-methane organic compound (NMOC) emission limits, staff gave considerable evaluation to this difference and concluded that, in general, the emission limits are equivalent. Also, based on all of the information reviewed, it is District staff's opinion that the NMOC emissions controlled by the requirements in new District Rule 74.17.1 are, in general, equivalent to the NMOC emissions controlled by the requirements in existing Rule 74.17.

An Analysis of Web File Sizes: New Methods and Models A Thesis presented by Brent Tworetzky consider such models and how to improve their fits. This thesis contributes to file size research-improved file size estimations over type-blind models. We therefore present a range of useful new file size

This study examined the perspectives and usage of technology by Arabic language teachers in various schools all across The United Arab Emirates. Barriers to integrating technology were closely examined. Dimensions investigated included...

This technical appendix accompanies report PNNL23786 Commercial Building Tenant Energy Usage Data Aggregation and Privacy. The objective is to provide background information on the methods utilized in the statistical analysis of the aggregation thresholds.

A variety of improvements to the MIT Design Advisor, a whole-building energy usage modeling tool intended for use during early design stages, are investigated. These include changes to the thermal mass temperature distribution ...

This presentation describes the computing environment at Argonne National Laboratory and the actions underway to implement a coherent hierarchy of computing systems connected through a heterogeneous file transfer network. A major goal of the Computing Services Division is to integrate heterogeneous computing elements incrementally into a network, with the goal of having everything somehow connected to everything else. Using standard IBM networking protocols, we have already built a full-function computer-to-computer file transfer network of IBM and DEC VAX systems. Currently, the users on the IBM MVS and VM/CMS systems can use standard IBM commands to send files and mail to DEC VAX users and output devices, and they can receive files from the DEC VAXes as if they had been sent from other IBM systems; similarly, the DEC VAX users can use standard DEC commands to send files and mail to IBM users and output devices, and they can receive files from the IBM systems as if they had been sent from other DEC VAX systems. In fact, the VAXes can exchange files and mail among themselves via the IBM NJE-based network without the need for DECnet links between the VAXes. Because this integrated heterogeneous file transfer network uses the standard IBM peer-to-peer communications protocol, all of the Laboratory's IBM and DEC computers easily communicate with the approximately 170 other computers in the Bitnet university network. Plans call for further integration of existing HP 3000 systems and future word processing systems such as Exxon, NBI, or Wang; we believe it is vitally important to provide smooth paths into this network for users of personal desktop computers. 17 references.

A mechanism is provided for group communications using a MULTI-PIPE synthetic file system. A master application creates a multi-pipe synthetic file in the MULTI-PIPE synthetic file system, the master application indicating a multi-pipe operation to be performed. The master application then writes a header-control block of the multi-pipe synthetic file specifying at least one of a multi-pipe synthetic file system name, a message type, a message size, a specific destination, or a specification of the multi-pipe operation. Any other application participating in the group communications then opens the same multi-pipe synthetic file. A MULTI-PIPE file system module then implements the multi-pipe operation as identified by the master application. The master application and the other applications then either read or write operation messages to the multi-pipe synthetic file and the MULTI-PIPE synthetic file system module performs appropriate actions.
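The header-control block described above could be encoded roughly as follows; the field order, sizes, and struct format are assumptions for illustration, not the actual layout used by the MULTI-PIPE file system.

```python
import struct

# Assumed fixed-size header: pipe name, message type, message size, destination.
HEADER_FMT = "<16sII16s"

def pack_header(name, msg_type, msg_size, destination):
    """Encode a header-control block for a multi-pipe synthetic file."""
    return struct.pack(HEADER_FMT,
                       name.encode().ljust(16, b"\0"),
                       msg_type, msg_size,
                       destination.encode().ljust(16, b"\0"))

def unpack_header(raw):
    """Decode the header so the file system module can dispatch the operation."""
    name, msg_type, msg_size, dest = struct.unpack(HEADER_FMT, raw)
    return name.rstrip(b"\0").decode(), msg_type, msg_size, dest.rstrip(b"\0").decode()

hdr = pack_header("bcast-pipe", 1, 4096, "all")
print(unpack_header(hdr))  # ('bcast-pipe', 1, 4096, 'all')
```

In the described mechanism, the master application writes a block like this, and participating applications open the same synthetic file before exchanging operation messages.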

a file handle. When an NFS client performs an operation, it passes the file handle to the server, which decodes the file handle to determine what object the file handle refers to. Since NFS is a stateless

The objective of the study was to update the heavy-duty truck (HDT) population, activity (e.g., vehicle miles traveled (VMT), numbers of starts and trips, trip duration, etc.), and usage patterns (type of service/business (e.g., delivery, construction, etc.) and area of operation (i.e., local, short-haul, long-haul)) for HDTs registered and/or operated in California. The population and activity estimates were done on a weight-class-specific basis (light-heavy-duty, medium-heavy-duty, and heavy-heavy-duty). Population, activity, and usage estimates were based primarily on Department of Motor Vehicles (DMV) registration data and Truck Inventory and Usage Survey (TIUS) data. In addition to the analysis of existing data (i.e., DMV and TIUS), 42 HDTs were fitted with on-board data loggers that recorded numbers of trips and starts, daily VMT, and travel by time of day.

[Tables: Analysis of Variance With Respect to Index of Language Usage; Mean Index by Sex, Place of Residence, and Border/Non-Border Residence; Results of ANOVA and Mean Index of Language Usage of Mexican American Males and Females; Frequency Distribution for Index by Sex; Frequency Distribution for Index by Border/Non-Border Residence]

Patent Subsidy and Patent Filing in China. By Zhen Lei, Zhen Sun, and Brian Wright. We examine the effect of patent subsidy policies on patent filings in China. China has seen rapid growth in patenting in recent years and became number one in patent filings in 2011. We study five neighboring cities in Jiangsu province

, BAE 97]. Caches can bring files nearer the client (with a possible reduction in latency) and reduce server load. A popularity curve plots the number of requests for each file against the file's popularity ranking. It is often said that this popularity curve follows Zipf's law, Popularity = K · ranking^(−a), with a being close to 1.
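The popularity curve mentioned above can be sketched numerically; the constant K and exponent a below are illustrative choices, not values from any cited trace.

```python
# Zipf-like file popularity: popularity(rank) = K * rank^(-a).
# K and a are invented for illustration; web traces typically report
# a close to 1, so each step down in rank divides popularity by
# roughly rank/(rank+1) of its previous value.

def zipf_popularity(num_files, K=1000.0, a=1.0):
    """Return expected request counts for files ranked 1..num_files."""
    return [K * rank ** (-a) for rank in range(1, num_files + 1)]

pops = zipf_popularity(5)
assert pops[0] == 1000.0          # top-ranked file dominates
assert abs(pops[1] - 500.0) < 1e-9
```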

The talk will discuss the ten operational capabilities that have made AFS unique in the distributed file system space and how these capabilities are being expanded upon to meet the needs of the 21st century. Derrick Brashear and Jeffrey Altman will present a technical road map of new features and technical innovations that are under development by the OpenAFS community and Your File System, Inc., funded by a U.S. Department of Energy Small Business Innovative Research grant. The talk will end with a comparison of AFS to its modern-day competitors.

The construction of a reference ontology for a large domain remains a hard human task. The process is sometimes assisted by software tools that facilitate information extraction from a textual corpus. Despite the widespread use of XML Schema files on the internet, and especially in the B2B domain, tools that offer a complete semantic analysis of XML schemas are rare. In this paper we introduce Janus, a tool for automatically building a reference knowledge base starting from XML Schema files. Janus also provides different useful views to simplify B2B application integration.
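A minimal sketch of the kind of extraction such a tool performs, assuming nothing about Janus's actual algorithm: pull named elements and their declared types out of an XML Schema using Python's standard library.

```python
# Extract (element name, declared type) pairs from an XSD document.
# This is a toy first step toward concept extraction, not the Janus tool.
import xml.etree.ElementTree as ET

XSD_NS = "{http://www.w3.org/2001/XMLSchema}"

def extract_concepts(xsd_text):
    """Return (name, type) pairs for every named xs:element in the schema."""
    root = ET.fromstring(xsd_text)
    return [(el.get("name"), el.get("type"))
            for el in root.iter(XSD_NS + "element")
            if el.get("name")]

sample = """<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="Invoice" type="xs:string"/>
  <xs:element name="Amount" type="xs:decimal"/>
</xs:schema>"""

assert extract_concepts(sample) == [("Invoice", "xs:string"),
                                    ("Amount", "xs:decimal")]
```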

ON THE USAGE OF ANTENNAS IN MIMO AND MISO INTERFERENCE CHANNELS. Mariam Kaynia, Andrea J. Goldsmith. The case of a MISO channel is considered, where exact expressions for the ergodic capacity of a MISO broadcast channel with a random beamformer are derived. However, the impact

Optimal Power Procurement and Demand Response with Quality-of-Usage Guarantees. Longbo Huang, Jean... allows the utility company to jointly perform power procurement and demand response so as to maximize the social welfare... are the integration of renewable energy technologies [1] and the design of efficient user demand-response

Semantic Analysis of Web Site Audience by Integrating Web Usage Mining and Web Content Mining. Jean... (http://www.jrc.it/langtech). Abstract. With the emergence of the World Wide Web, analyzing and improving Web communication has become essential to adapt the Web content to the visitors' expectations. Web communication analysis

) protocol in the area of efficient wireless video transmission and its possible usage in cross-layer... approaches for adaptation of the sender's power transmission level based on TFRC feedback information, and therefore user experience, without unnecessary power consumption. INTRODUCTION. Networking complexity has led

modern access control systems. Although the UCON study has been inspired largely by digital rights... only by administrative actions. However, in modern information systems these attributes are often... and modern access control policies. 1. Introduction. The notion of usage control has been introduced recently

Navy Leadership Disturbed by "Spice" Usage Rise. Navy leaders are expressing alarm at the recent rise in use of products mimicking the chemical compounds found in the drug. These products are banned for Navy personnel. Since NAVADMIN 108/10 in March 2010, which reemphasized the Navy's drug policy, the U.S. Navy has been

POLITICAL REGULATION AND REGULATION OF USE OVER WORKING TIME, by G. de Terssac, J... In this paper we discuss "social regulations" as defined by Reynaud (1999). These regulations, in our case (working-)time regulations, are multiple. Their combination

as much energy as it consumes. When completed in 2014, the 130-acre UC Davis West Village will be home to 3... Consumers: lifestyle studies · market demand · usage patterns. Funding: Calif. Energy Commission, BMW. ...operation · energy savings. Funding: Chrysler, US Dept of Energy. Lead researcher: Kevin Nesbitt, Ph

overhead, we introduce distributed community gateways as proxies of the utility company to timely respond are managed adaptively to meet the electricity generation and distribution capabilities at any time be optimally utilized in real time. In this paper, we propose a usage-based dynamic pricing (UDP) scheme

This work assessed Hybrid Wing Body (HWB) aircraft in the context of Liquefied Natural Gas (LNG) fuel usage and payload/range scalability at three scales: H1 (B737), H2 (B787) and H3 (B777). The aircraft were optimized for ...

Mobile Application and Device Power Usage Measurements. Rahul Murmuria, Jeffrey Medsger, Angelos... Gaithersburg, MD 20899. Email: jeff.voas@nist.gov. Abstract--Reducing power consumption has become a crucial... by the operating system's power-management module. We present the model's capability to further calculate the power

fuels (Multiple 2009). In the United States, 41% of all energy is consumed in residential and commercial buildings. Providing people with feedback about their energy use can itself produce behavior changes that significantly reduce... the value of normative energy feedback, showing users how their usage relates to that of their peers

The OffPAD: Requirements and Usage. Kent Are Varmedal, Henning Klevjer, Joakim Hovlandsvåg... followed by descriptions of different applications that can be implemented with contemporary technology. Finally... platform". This certainly provides great flexibility and support for many new business models, but it also

Promiscuous Usage of Nucleotides by the DNA Helicase of Bacteriophage T7 DETERMINANTS OF NUCLEOTIDE activity. Very little is known regarding the architecture of the nucleotide binding site in determining nucleotide specificity. Crystal structures of the T7 helicase domain with bound dATP or dTTP identified Arg

Logo Standards. In this guide, you will find usage standards for the new College of Charleston logo system. This guide is provided as a supplement to the College of Charleston Brand Manual, which is available at http://marcomm.cofc.edu. The new logo, which combines the existing College of Charleston wordmark

Optimisation of Fuel Usage and Steam Availability in the Power and Steam Plant of a Paper Mill. KEYWORDS: Model Predictive Control, Improved Efficiency, Optimisation, Power and Steam Supply System. ...the medium-pressure manifold (nominally operated at 14 bar), through a steam turbine that can be used

furnish of 150k odt/yr in 2012. 9. Including wood going for the production of pellets, usage of wood in progress 3.3. Wood fuel usage by fuel category 3.4. Pellet plants 3.5. Greenhouse gas emissions 4 to 1.073 million odt in 2014. 8. Four wood pellet manufacturing plants in Scotland used in total some

Seer is a multipurpose package for performing trigger, signal determination and cuts of an arbitrary number of collider processes stored in the LHCO file format. This article details the use of Seer, including the necessary details for users to customize the code for investigating new kinematic variables.

DEEP SEA DRILLING PROJECT DATA FILE DOCUMENTS. Ocean Drilling Program, Texas A&M University Technical... prepared as an account of work performed under the international Ocean Drilling Program, which is managed... Funding for the program is provided by the following agencies: Department of Energy, Mines and Resources (Canada), Deutsche...

Despite continual improvements in the performance and reliability of large scale file systems, the management of user-defined file system metadata has changed little in the past decade. The mismatch between the size and complexity of large scale data stores and their ability to organize and query their metadata has led to a de facto standard in which raw data is stored in traditional file systems, while related, application-specific metadata is stored in relational databases. This separation of data and semantic metadata requires considerable effort to maintain consistency and can result in complex, slow, and inflexible system operation. To address these problems, we have developed the Quasar File System (QFS), a metadata-rich file system in which files, user-defined attributes, and file relationships are all first class objects. In contrast to hierarchical file systems and relational databases, QFS defines a graph data model composed of files and their relationships. QFS incorporates Quasar, an XPATH-extended query language for searching the file system. Results from our QFS prototype show the effectiveness of this approach. Compared to the de facto standard, the QFS prototype shows superior ingest performance and comparable query performance on user metadata-intensive operations and superior performance on normal file metadata operations.
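The graph data model described above can be illustrated with a toy sketch; the class and function names are hypothetical and not the actual QFS API or Quasar query language.

```python
# Illustrative graph data model in the spirit of QFS: files, user-defined
# attributes, and file relationships are all first-class objects.

class File:
    def __init__(self, name, **attributes):
        self.name = name
        self.attributes = dict(attributes)   # user-defined metadata
        self.links = []                      # outgoing labeled edges

def relate(src, label, dst):
    """Record a directed, labeled relationship between two files."""
    src.links.append((label, dst))

def follow(src, label):
    """Query step: follow edges with the given label (an XPath-like axis)."""
    return [dst for (edge_label, dst) in src.links if edge_label == label]

raw = File("scan_001.dat", instrument="telescope-A")
derived = File("scan_001.png", format="png")
relate(raw, "renderedAs", derived)

assert [f.name for f in follow(raw, "renderedAs")] == ["scan_001.png"]
```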

A library tracking database has been developed to monitor software/library usage. This Automatic Library Tracking Database (ALTD) automatically and transparently stores, into a database, information about the libraries linked into an application at compilation time and also the executables launched in a batch job. Information gathered into the database can then be mined to provide reports. Analyzing the results from the data collected will help to identify, for example, the most frequently used and the least used libraries and codes, and those users that are using deprecated libraries or applications. We will illustrate the usage of libraries and executables on the Cray XT platforms hosted at the National Institute for Computational Sciences and the Oak Ridge Leadership Computing Facility (both located at Oak Ridge National Laboratory).
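The link-time recording idea can be sketched with an in-memory SQLite table; the schema and names below are illustrative, not ALTD's actual schema.

```python
# Minimal sketch of the ALTD idea: record which libraries each application
# links, then mine the table for usage counts.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE link_line (app TEXT, library TEXT)")
db.executemany("INSERT INTO link_line VALUES (?, ?)",
               [("sim.x", "libfftw"), ("sim.x", "libmpi"),
                ("md.x", "libmpi")])

# Report: most frequently linked libraries first.
rows = db.execute("""SELECT library, COUNT(*) AS n FROM link_line
                     GROUP BY library ORDER BY n DESC""").fetchall()
assert rows[0] == ("libmpi", 2)
```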

This thesis will examine the adoption, usage and outcomes of a mobile money service called MPESA. Since being launched in 2007, the service has seen phenomenal growth in Kenya. Over 7.5 million users, or 34% of the adult ...

3.0 Exporting Models
Save model as OBJ file:
o Make sure that the model, and the UV-unwrapping, are open and displayed.
o File > Export > Wavefront (.obj).
o Navigate to desired directory (a single RETURN accepts the name, or use the Export Wavefront button in the upper right).
o Click on the Export button.

Energy Efficient Prefetching with Buffer Disks for Cluster File Systems. Adam Manzanares, Xiaojun... the energy-efficiency of large-scale parallel storage systems. To address these issues we introduce EEVFS (Energy Efficient Virtual File System), which is able to manage data placement and disk states to help

Techniques are provided for storing files in a parallel computing system based on a user-specified parser function. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a parser from the distributed application for processing the plurality of files prior to storage; and storing one or more of the plurality of files in one or more storage nodes of the parallel computing system based on the processing by the parser. The plurality of files comprise one or more of a plurality of complete files and a plurality of sub-files. The parser can optionally store only those files that satisfy one or more semantic requirements of the parser. The parser can also extract metadata from one or more of the files and the extracted metadata can be stored with one or more of the plurality of files and used for searching for files.
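The parser-mediated storage path can be sketched as follows; the function names and the sample CSV parser are illustrative assumptions, not the patented interface.

```python
# Hedged sketch: a parser supplied by the application filters files and
# extracts metadata before storage, as the abstract describes.

def store_with_parser(files, parser, storage):
    """Store only files the parser accepts, alongside extracted metadata."""
    for name, data in files.items():
        accepted, metadata = parser(name, data)
        if accepted:                      # semantic requirement satisfied
            storage[name] = {"data": data, "meta": metadata}

def csv_parser(name, data):
    """Accept .csv files; record their row count as searchable metadata."""
    if not name.endswith(".csv"):
        return False, {}
    return True, {"rows": data.count("\n") + 1}

storage = {}
store_with_parser({"a.csv": "1,2\n3,4", "b.bin": "\x00"}, csv_parser, storage)
assert list(storage) == ["a.csv"]
assert storage["a.csv"]["meta"]["rows"] == 2
```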

AN INVESTIGATION OF WINTER TEXANS' TIME USAGE IN THE LOWER RIO GRANDE VALLEY. A Thesis Submitted to the Office of Graduate Studies of Texas A&M University in partial fulfillment of the requirements for the degree of MASTER OF SCIENCE... will increase. Operational Definitions. The following definitions were used to guide this study: Winter Texans - retired persons who stay in the Lower Rio Grande Valley of Texas, consisting of four counties (Cameron...), for at least one month during the winter.

U.S. Geological Survey Open-File Report 02-328; Geological Survey of Canada Open File 4350. August... University of Victoria, P.O. Box 3055, STN CSC, Victoria, BC, V8W 3P6, Canada. ISBN: 0... of Canada and the University of Victoria. This meeting was held at the University of Victoria's Dunsmuir

In this paper, we propose a general operating scheme which allows the utility company to jointly perform power procurement and demand response so as to maximize the social welfare. Our model takes into consideration the effect of renewable energy and the multi-stage feature of the power procurement process. It also enables the utility company to provide a quality-of-usage (QoU) guarantee to the power consumers, which ensures that the average power usage level meets the target value for each user. To maximize the social welfare, we develop a low-complexity algorithm called the welfare maximization algorithm (WMA), which performs joint power procurement and dynamic pricing. WMA is constructed based on a two-timescale Lyapunov optimization technique. We prove that WMA achieves a close-to-optimal utility and ensures that the QoU requirement is met with bounded deficit. WMA can be implemented in a distributed manner and is robust with respect to system dynamics uncertainty.
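The bounded-deficit QoU guarantee can be illustrated with a standard Lyapunov-style deficit-queue update; this is only one ingredient of an algorithm like WMA, and the numbers below are invented.

```python
# Deficit queue for a quality-of-usage target: the queue grows when a
# user's power usage falls below target and shrinks when it exceeds it.
# Keeping the queue bounded forces the long-run average usage to meet
# the target. Parameters are illustrative, not from the paper.

def update_deficit(deficit, target, usage):
    """One-slot update: Z(t+1) = max(Z(t) + target - usage, 0)."""
    return max(deficit + target - usage, 0.0)

# If average usage matches the target, the deficit stays bounded.
z = 0.0
usage_trace = [0.8, 1.2, 1.0, 0.9, 1.1]   # averages to 1.0
for u in usage_trace:
    z = update_deficit(z, target=1.0, usage=u)
assert z <= 0.4
```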

The report explains how the EVSE are being used along the corridors between the EV Project cities. The EV Project consists of a nationwide collaboration between Idaho National Laboratory (INL), ECOtality North America, Nissan, General Motors, and more than 40 other city, regional and state governments, and electric utilities. The purpose of the EV Project is to demonstrate the deployment and use of approximately 14,000 Level II (208-240V) electric vehicle supply equipment (EVSE) and 300 fast chargers in 16 major cities. This research investigates the usage of all currently installed EV Project commercial EVSE along major interstate corridors. ESRI ArcMap software products are utilized to create geographic EVSE data layers for analysis and visualization of commercial EVSE usage. This research locates the crucial interstate corridors lacking sufficient commercial EVSE and targets locations for future commercial EVSE placement. The results and methods introduced in this research will be used by INL for the duration of the EV Project.

In order to quantify the usage of ground water in an operating electric power plant, a study was initiated in 1978 aimed at determining where in the energy conversion process ground water is used and to suggest, if possible, alternate means for providing this water and/or methods to reduce consumption. The Chalk Point Plant was selected for this study because of its difference in fuel source (coal and oil), and because of its high consumption of ground water (approximately 800,000 to 1,200,000 gallons per day). Located at the confluence of the Patuxent River and Swanson Creek in the southeast corner of Prince George's County, Maryland, it consists of three operating units, with a fourth unit under construction during the survey time. (unit 4 has recently become operational). With the cooperation of the Potomac Electric Power Company, the plant was selected for study and was instrumented with several flow meters. Over a two-year period, this plant has been monitored to provide the information needed to make an assessment of the rates, nature, and methods of water usage in an operating power plant.

Most researchers with little high-performance computing (HPC) experience have difficulty using supercomputing resources productively. To address this issue, we investigated usage behaviors on Kraken, the world's fastest academic supercomputer, and built a knowledge-based recommendation system to improve user productivity. Six clustering techniques, along with three cluster validation measures, were implemented to investigate the underlying patterns of usage behaviors. Besides manually defining a category for very large job submissions, six behavior categories were identified, which cleanly separated data-intensive jobs from computation-intensive jobs. Job statistics of each behavior category were then used to develop a knowledge-based recommendation system that provides users with instructions about choosing appropriate software packages, setting job parameter values, and estimating job queuing time and runtime. Experiments were conducted to evaluate the performance of the proposed recommendation system, covering 127 job submissions by users from different research fields. Positive feedback indicated the usefulness of the provided information. An average runtime estimation accuracy of 64.2%, with a 28.9% job termination rate, was achieved in the experiments, almost double the average accuracy in the Kraken dataset.
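The clustering step can be illustrated with a toy one-dimensional k-means on an invented "I/O fraction" feature; the study itself used six clustering techniques on much richer job statistics.

```python
# Minimal 1-D k-means (Lloyd's iterations) separating data-intensive
# from compute-intensive jobs by an invented per-job I/O-fraction feature.

def kmeans_1d(points, centers, iters=10):
    """Run Lloyd's iterations on scalar features; return final centers."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:
            # assign each point to its nearest center
            i = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            groups[i].append(p)
        # move each center to the mean of its group (keep it if empty)
        centers = [sum(g) / len(g) if g else c
                   for g, c in zip(groups, centers)]
    return centers

io_fraction = [0.05, 0.08, 0.10, 0.75, 0.80, 0.90]  # invented job features
c = sorted(kmeans_1d(io_fraction, [0.0, 1.0]))
assert c[0] < 0.2 and c[1] > 0.6   # compute- vs data-intensive clusters
```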

High performance computing has experienced tremendous gains in system performance over the past 20 years. Unfortunately other system capabilities, such as file I/O, have not grown commensurately. In this activity, we present the results of our tests of two leading file systems (GPFS and Lustre) on the same physical hardware. This hardware is the standard commodity storage solution in use at LLNL and, while much smaller in size, is intended to enable us to learn about differences between the two systems in terms of performance, ease of use and resilience. This work represents the first hardware consistent study of the two leading file systems that the authors are aware of.

Wireless Asynchronous Transfer Mode (WATM) networks pose new traffic management problems. One example is the effect of user mobility on Usage Parameter Control (UPC). If the UPC algorithm resets after each handoff between wireless cells, then users can cheat on their traffic contract. This paper derives explicit relationships between a user's traffic parameters (Peak Cell Rate, Sustained Cell Rate and Maximum Burst Size), their transit time per wireless cell, their maximum sustained cheating rate and the Generic Cell Rate Algorithm's (GCRA's) Limit (L) parameter. It also shows that the GCRA can still effectively police Constant Bit Rate (CBR) traffic, but not some types of realistic Variable Bit Rate (VBR) traffic.
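The GCRA referenced above can be sketched in its standard virtual-scheduling form, with increment I and limit L; the arrival times below are invented for illustration.

```python
# Generic Cell Rate Algorithm, virtual-scheduling form: a cell arriving
# at time t conforms if t >= TAT - L, where TAT is the theoretical
# arrival time; conforming cells advance TAT by the increment I.

def make_gcra(increment, limit):
    """Return a policer: policer(t) -> True if a cell at time t conforms."""
    state = {"tat": 0.0}               # theoretical arrival time
    def policer(t):
        if t < state["tat"] - limit:
            return False               # non-conforming: arrived too early
        state["tat"] = max(t, state["tat"]) + increment
        return True
    return policer

police = make_gcra(increment=10.0, limit=2.0)
results = [police(t) for t in [0, 10, 15, 22, 25]]
assert results == [True, True, False, True, False]
```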

Prolonged cold storage of red blood cells by oxygen removal and additive usage. A cost-effective 4 °C storage procedure that preserves red cell quality and prolongs post-transfusion in vivo survival is described. The improved in vivo survival and the preservation of adenosine triphosphate levels, along with reduction in hemolysis and membrane vesicle production of red blood cells stored at 4 °C for prolonged periods of time, is achieved by reducing the oxygen level therein at the time of storage; in particular, by flushing the cells with an inert gas, and storing them in an aqueous solution which includes adenine, dextrose, mannitol, citrate ion, and dihydrogen phosphate ion, but no sodium chloride, in an oxygen-permeable container which is located in an oxygen-free environment containing oxygen-scavenging materials.

A comprehensive evaluated neutronic data file for elemental cobalt is described. The experimental data base, the calculational methods, the evaluation techniques and judgments, and the physical content are outlined. The file contains: neutron total and scattering cross sections and associated properties, (n,2n) and (n,3n) processes, neutron radiative capture processes, charged-particle-emission processes, and photon-production processes. The file extends from 10^-5 eV to 20 MeV, and is presented in the ENDF/B-VI format. Detailed attention is given to the uncertainties and correlations associated with the prominent neutron-induced processes. The numerical contents of the file have been transmitted to the National Nuclear Data Center, Brookhaven National Laboratory. 143 refs., 16 figs., 5 tabs.

Proposed database model and file structures for arthropod collection management. Ronald A... for taxonomic analysis or behavioral, physiological, and ecological information. The database model describes how specific computerization projects can be related to each other. The proposed database model

In the distributed computing model of LHCb the File Catalog (FC) is a central component that keeps track of each file and replica stored on the Grid. It federates the LHCb data files in a logical namespace used by all LHCb applications. As a replica catalog, it is used for brokering jobs to sites where their input data is meant to be present, but also by jobs for finding alternative replicas if necessary. The LCG File Catalog (LFC), used originally by LHCb and other experiments, is now being retired and needs to be replaced. The DIRAC File Catalog (DFC) was developed within the framework of the DIRAC Project and presented during CHEP 2012. From the technical point of view, the code powering the DFC follows an aspect-oriented programming (AOP) style: each type of entity manipulated by the DFC (users, files, replicas, etc.) is treated as a separate 'concern' in the AOP terminology. Hence, the database schema can also be adapted to the needs of a Virtual Organization. LHCb opted for a highly tuned MySQL datab...

WeatherMaker is a weather-data utility for use with the ENERGY-10 design-tool computer program. The three main features are: Convert--Weather files can be converted from one format to another. For example, a TMY2 format file can be converted to an ENERGY-10 binary file that can be used in a simulation. This binary file can then be converted to a text format that allows it to be read and/or manipulated in WordPad or Excel. Evaluate--ENERGY-10 weather files can be studied in great detail. There are 8 graphical displays that provide insight into the data, and summary tables that present results calculated from the hourly data. Adjust--Hourly temperature data can be adjusted starting with hourly data from a nearby TMY2 site. Dry-bulb and wet-bulb temperatures are adjusted up or down as required to match given monthly statistics. This feature can be used to generate weather files for any of 3,958 sites in the US where such monthly statistics are tabulated. The paper shows a variety of results, explains the methods used, and discusses the rationale for making the adjustments. It is anticipated that WeatherMaker will be released by the time of the ASES Solar 99 conference.

Simulation of the Rungis Wholesale Market: lessons on the calibration, validation and usage... based on a simulation of the Rungis Wholesale Market (in France) using cognitive agents. The implication of using... of the system. Our case, the Fruits and Vegetables wholesale market of the Rungis Food Market, constitutes

IDENTIFYING THE USAGE PATTERNS OF METHYL TERT-BUTYL ETHER (MTBE) AND OTHER OXYGENATES IN GASOLINE 1608 Mt. View Rapid City, SD 57702 Methyl tert-butyl ether (MTBE) is commonly added to gasoline. In 1998, 11.9 billion liters of MTBE were produced in the U.S. MTBE has been detected frequently

Investigation into Waste Heat Recovery for Usage by a Rooftop Greenhouse. Rohit Singla, Jeremy Lord, Jorden Hetherington. April 4, 2013. Dr. Naoko Ellis, APSC 262. ...of a microbrewery, an excess amount of waste heat in the form of steam is produced. In the sustainability principles

The electricity consumption of miscellaneous electronic loads (MELs) in the home has grown in recent years, and is expected to continue rising. Consumer electronics, in particular, are characterized by swift technological innovation, with varying impacts on energy use. Desktop and laptop computers make up a significant share of MELs electricity consumption, but their national energy use is difficult to estimate, given uncertainties around shifting user behavior. This report analyzes usage data from 64 computers (45 desktop, 11 laptop, and 8 unknown) collected in 2012 as part of a larger field monitoring effort of 880 households in the San Francisco Bay Area, and compares our results to recent values from the literature. We find that desktop computers are used for an average of 7.3 hours per day (median = 4.2 h/d), while laptops are used for a mean 4.8 hours per day (median = 2.1 h/d). The results for laptops are likely underestimated since they can be charged in other, unmetered outlets. Average unit annual energy consumption (AEC) for desktops is estimated to be 194 kWh/yr (median = 125 kWh/yr), and for laptops 75 kWh/yr (median = 31 kWh/yr). We estimate national annual energy consumption for desktop computers to be 20 TWh. National annual energy use for laptops is estimated to be 11 TWh, markedly higher than previous estimates, likely reflective of laptops drawing more power in On mode in addition to greater market penetration. This result for laptops, however, carries relatively higher uncertainty compared to desktops. Different study methodologies and definitions, changing usage patterns, and uncertainty about how consumers use computers must be considered when interpreting our results with respect to existing analyses. 
Finally, as energy consumption in On mode is predominant, we outline several energy savings opportunities: improved power management (defaulting to low-power modes after periods of inactivity as well as power scaling), matching the rated power of power supplies to computing needs, and improving the efficiency of individual components.
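The unit-AEC arithmetic above can be reproduced back-of-the-envelope; the average On-mode wattage below is an assumption chosen to match the reported figure, not a value from the report.

```python
# Annual energy consumption (AEC) from daily usage hours and an assumed
# average power draw during use. The 73 W figure is illustrative only.

def annual_energy_kwh(hours_per_day, avg_watts):
    """AEC in kWh/yr for a device drawing avg_watts while in use."""
    return hours_per_day * 365 * avg_watts / 1000.0

# With the report's 7.3 h/d desktop usage, an average draw of ~73 W
# would reproduce an AEC near the reported 194 kWh/yr.
aec = annual_energy_kwh(7.3, 73.0)
assert 190 < aec < 200
```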

to represent a wide range of codon usages and secondary structure free energies across the translation differences (fig. S7A). New metrics that take into account both tRNA availability and usage (nTE) show

The purpose of ACQUIRE is to provide scientists and managers quick access to a comprehensive, systematic, computerized compilation of aquatic toxicity data. Scientific papers published both nationally and internationally on the toxicity of chemicals to aquatic organisms and plants are collected and reviewed for ACQUIRE. Independently compiled data files that meet ACQUIRE parameter and quality assurance criteria are also included. Selected toxicity test results and related testing information for any individual chemical, from laboratory and field aquatic toxicity studies, are included for tests with freshwater and marine organisms. The total number of data records in ACQUIRE is now over 105,300. This includes data from 6000 references, for 5200 chemicals and 2400 test species. A major data file, Acute Toxicity of Organic Chemicals (ATOC), has been incorporated into ACQUIRE. The ATOC file contains laboratory acute test data on 525 organic chemicals using juvenile fathead minnows.

Inventory Routing and On-line Inventory Routing File Format. M. Sevaux, M. J. Geiger. Helmut... needs in the Inventory Routing Problem types. Instead of creating a new file format or putting ASCII... is an extension of the TSPLIB file format description proposed in [1] to be used for the Inventory Routing Problem

CFM Calibration File Manager. Morgan Burke. Ver. 2.2, 1993 Oct 1. 1. Introduction. What is CFM? CFM, short for Calibration File Manager, is a relational database developed by Brookhaven Experiment 787 to keep track of large numbers of data files containing calibration and related information for a high

The Evaluated Nuclear Structure Data File (ENSDF) is a leading resource for experimental nuclear data. It is maintained and distributed by the National Nuclear Data Center, Brookhaven National Laboratory. The file is mainly contributed to by an international network of evaluators under the auspices of the International Atomic Energy Agency. The ENSDF is updated, generally by mass number, i.e., by evaluating together all isobars for a given mass number. If, however, experimental activity in an isobaric chain is limited to a particular nuclide, then only that nuclide is updated. The evaluations are published in the journal Nuclear Data Sheets (Academic Press, a division of Elsevier).

The Franklin Cray XT4 at the NERSC center was equipped with the server-side I/O monitoring infrastructure Cerebro/LMT, which is described here in detail. Insights gained from the data produced include a better understanding of instantaneous data rates during file system testing, file system behavior during regular production time, and long-term average behaviors. Information and insights gleaned from this monitoring support efforts to proactively manage the I/O infrastructure on Franklin. A simple model for I/O transactions is introduced and compared with the 250 million observations sent to the LMT database from August 2008 to February 2009.

1-Hydroxypyrene (1-OHP) is a biomarker of recent exposure to polycyclic aromatic hydrocarbons (PAHs). We investigated whether urinary 1-OHP concentrations in Chinese coke oven workers (COWs) are modulated by job category, respirator usage, and cigarette smoking. The present cross-sectional study measured urinary 1-OHP concentrations in 197 COWs from Coking plant I and 250 COWs from Coking plant II, as well as 220 unexposed referents from Control plant I and 56 referents from Control plant II. Urinary 1-OHP concentrations (geometric mean, µmol/mol creatinine) were 5.18 and 4.21 in workers from Coking plants I and II, respectively. The highest 1-OHP levels in urine were found among topside workers including lidmen, tar chasers, and whistlers. Benchmen had higher 1-OHP levels than other workers at the sideoven. Above 75% of the COWs exceeded the recommended occupational exposure limit of 2.3 µmol/mol creatinine. Respirator usage and increased body mass index (BMI) slightly reduced 1-OHP levels in COWs. Cigarette smoking significantly increased urinary 1-OHP levels in unexposed referents but had no effect in COWs. Chinese COWs, especially topside workers and benchmen, are exposed to high levels of PAHs. Urinary 1-OHP concentrations appear to be modulated by respirator usage and BMI in COWs, as well as by smoking in unexposed referents.

34 Sun's Network File System (NFS). One of the first uses of distributed client/server computing... of data across clients. Thus, if you access a file on one machine... even in the best such case, no network traffic need be generated

January 2001. Open File Report 2001-0027. DESKTOP DATABASE COMPILATION OF SACS UNIT INFORMATION... (2001). Acknowledgements for use of this database should refer to the "SACS database compilation prepared by the Council for Geoscience, Pretoria; as documented by Eglington et al. (2001)". This open

CAP88 (Clean Air Act Assessment Package 1988) is a computer model developed for the US Environmental Protection Agency to assess the potential dose from radionuclide emissions to air and to demonstrate compliance with the Clean Air Act. It has options to calculate either individual doses, in units of mrem, or a collective dose, also called population dose, in units of person-rem. To calculate the collective dose, CAP88 uses a population file such as LANL.pop, that lists the number of people in each sector (N, NNE, NE, etc.) as a function of distance (1 to 2 km, etc.) out to a maximum radius of 80 km. Early population files are described in the Los Alamos National Laboratory (LANL) Environmental Reports for 1985 (page 14) and subsequent years. LA-13469-MS describes a population file based on the 1990 census. These files have been updated several times, most recently in 2006 for CAP88 version 3. The 2006 version used the US census for 2000. The present paper describes the 2012 updates, using the 2010 census.
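The collective-dose bookkeeping a population file enables can be sketched as a population-weighted sum over sector/distance cells; all numbers below are invented for illustration.

```python
# Collective (population) dose in person-rem: sum over each
# sector/distance cell of (people in cell) x (individual dose there).
# CAP88 reports individual doses in mrem, so convert mrem -> rem.

def collective_dose(population, dose_mrem):
    """Sum person-rem over (sector, ring) cells; doses given in mrem."""
    total = 0.0
    for cell, people in population.items():
        total += people * dose_mrem[cell] / 1000.0   # mrem -> rem
    return total

population = {("N", "1-2 km"): 1200, ("NNE", "1-2 km"): 300}
dose_mrem = {("N", "1-2 km"): 0.5, ("NNE", "1-2 km"): 0.4}
assert abs(collective_dose(population, dose_mrem) - 0.72) < 1e-9
```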

Truffles -- Secure File Sharing With Minimal System Administrator Intervention (Peter Reiher et al.). Truffles supports secure file sharing between arbitrary users at arbitrary sites connected by a network. Truffles is an interesting service, but services of this kind have the potential of greatly increasing the workload of system administrators if they are not designed carefully.

Availability in the Sprite Distributed File System (Mary Baker, John Ousterhout). Tolerating faults means recovering from them quickly, and our position is that fast server recovery is the most cost-effective way of providing such availability.

This manual is a guide to use of the file protection mechanisms available on the Martin Marietta Energy Systems, Inc. KSV VAXes. User identification codes (UICs) and general identifiers are discussed as a basis for understanding UIC-based and access control list (ACL) protection.

The Quality Assurance procedures used for the initial phase of the Model Catalog Project were developed to attain two objectives, referred to as basic functionality and visualization. To ensure that the Monte Carlo N-Particle (MCNP) model input files posted to the ModCat database meet those objectives, all models considered as candidates for the database are tested, revised, and re-tested.

Samples of Drug Testing Language

Memorandum for Human Resources File
DATE:
TO: Employee
FROM: Supervisor
RE: Drug Testing

As you know, the position that you have been selected for requires that you pass a pre-employment drug and alcohol test. In addition to the pre-employment test, you will also be subject

Maximizing the usage of renewable energy will reduce both our reliance on dwindling natural resources and environmental pollution. Batteries are an important enabling technology for renewable energy, portable electronics, and modern transportation systems such as hybrid electric vehicles. However, limitation

About 26% of the energy consumption at data centers is attributed to disk storage [1], and the growing number of spinning drives has created an emerging and growing energy usage concern at data centers. We present part of the solution to this challenge with an energy-efficient approach to information retrieval across data centers.

This analysis examines the relationship between energy demand and residential building attributes, demographic characteristics, and behavioral variables using the U.S. Department of Energy's Residential Energy Consumption Survey 2005 microdata. This study investigates the applicability of the smooth backfitting estimator to statistical analysis of residential energy consumption via nonparametric regression. The methodology utilized in the study extends nonparametric additive regression via local linear smooth backfitting to categorical variables. The conventional methods used for analyzing residential energy consumption are econometric modeling and engineering simulations. This study suggests an econometric approach that can be utilized in combination with simulation results. A common weakness of previously used econometric models is a very high likelihood that any suggested parametric relationships will be misspecified. Nonparametric modeling does not have this drawback: its flexibility allows for uncovering more complex relationships between energy use and the explanatory variables than can possibly be achieved by parametric models. Building simulation models have traditionally overestimated the effects of energy efficiency measures when compared to actual "as-built" observed savings; focusing on technical efficiency, they do not account for behavioral or market effects. The magnitude of behavioral or market effects may have a substantial influence on the final energy savings resulting from implementation of various energy conservation measures and programs. Moreover, variability in behavioral aspects and user characteristics appears to have a significant impact on total energy consumption. Inaccurate estimates of energy consumption and potential savings also impact investment decisions. The existing modeling literature, whether it relies on parametric specifications or engineering simulation, does not accommodate inclusion of a behavioral component.
This study attempts to bridge that gap by analyzing behavioral data and investigating the applicability of additive nonparametric regression to this task. This study evaluates the impact of 31 regressors on residential natural gas usage. The regressors include weather, economic variables, demographic and behavioral characteristics, and building attributes related to energy use. In general, most of the regression results were in line with previous engineering and economic studies in this area. There were, however, some counterintuitive results, particularly with regard to thermostat controls and behaviors. There are a number of possible reasons for these counterintuitive results, including the inability to control for regional climate variability due to data sanitization (performed to prevent identification of respondents), inaccurate data caused by self-reporting, and the fact that not all relevant behavioral variables were included in the data set, so we were not able to control for them in the study. The results of this analysis could be used as an in-sample prediction for approximating the energy demand of a residential building whose characteristics are described by the regressors in this analysis, even when that particular combination of values does not exist in the real world. In addition, this study has potential applications for benefit-cost analysis of residential upgrades and retrofits under a fixed budget, because the results contain information on how natural gas consumption might change once a particular characteristic or attribute is altered. Finally, the results of this study can help establish a relationship between natural gas consumption and changes in the behavior of occupants.
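The backfitting idea behind the estimator can be illustrated in a few lines: each additive component is repeatedly re-estimated by smoothing the partial residuals against its own covariate. The sketch below uses a simple Nadaraya-Watson smoother on synthetic data; the study's method uses local linear smooth backfitting and handles categorical variables, so this is only the classical backfitting loop, not the paper's estimator.

```python
import numpy as np

def kernel_smooth(x, y, x_eval, bandwidth):
    """Nadaraya-Watson (local constant) smoother, standing in for the
    local linear smoother used in the study."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x[None, :]) / bandwidth) ** 2)
    return (w @ y) / w.sum(axis=1)

def backfit(X, y, bandwidth=0.3, n_iter=20):
    """Fit the additive model y = alpha + sum_j f_j(x_j) by backfitting:
    cycle over components, smoothing partial residuals against x_j."""
    n, p = X.shape
    alpha = y.mean()
    f = np.zeros((n, p))
    for _ in range(n_iter):
        for j in range(p):
            partial = y - alpha - f.sum(axis=1) + f[:, j]
            f[:, j] = kernel_smooth(X[:, j], partial, X[:, j], bandwidth)
            f[:, j] -= f[:, j].mean()  # center components for identifiability
    return alpha, f

# synthetic data with an additive truth: f1 = sin(pi*x1), f2 = x2^2
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + rng.normal(0.0, 0.1, 200)

alpha, f = backfit(X, y)
resid = y - alpha - f.sum(axis=1)
print(f"residual MSE: {np.mean(resid ** 2):.4f} (response variance: {np.var(y):.4f})")
```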

Handling of multimedia files in the Invenio software is motivated by the need to integrate multimedia files into the open-source, large-scale digital library software Invenio, developed and used at CERN, the European Organisation for Nuclear Research. In recent years, digital assets like pictures, presentations, podcasts, and videos have become abundant in these systems, and digital libraries have grown out of their classic role of only storing bibliographic metadata. The thesis focuses on digital video as a type of multimedia and covers the complete workflow of handling video material in the Invenio software: from the ingestion of digital video material, to its processing, on to storage and preservation, and finally the streaming and presentation of videos to the user. The potential technologies to realise a video submission workflow are discussed in depth and evaluated for integration with Invenio. The focus is set on open and free technologies, which can be redistributed with the Invenio software.

Uncertainty may be introduced into RADTRAN analyses by distributing input parameters. The MELCOR Uncertainty Engine (Gauntt and Erickson, 2004) has been adapted for use in RADTRAN to determine the parameter shape and minimum and maximum of the distribution, to sample on the distribution, and to create an appropriate RADTRAN batch file. Coupling input parameters is not possible in this initial application. It is recommended that the analyst be very familiar with RADTRAN and able to edit or create a RADTRAN input file using a text editor before implementing the RADTRAN Uncertainty Analysis Module. Installation of the MELCOR Uncertainty Engine is required for incorporation of uncertainty into RADTRAN. Gauntt and Erickson (2004) provides installation instructions as well as a description and user guide for the uncertainty engine.
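In outline, the module's sampling step amounts to drawing each uncertain parameter from its specified shape/minimum/maximum and emitting one RADTRAN run per sample. A hypothetical sketch (the parameter names, the distribution set, and the output layout are invented for illustration and do not reflect RADTRAN's actual batch-file format):

```python
import random

# Hypothetical sketch only: parameter names, distributions, and output
# layout are illustrative; RADTRAN's actual batch-file format differs.
def sample_runs(n_samples, specs, seed=42):
    """Draw independent samples for each uncertain parameter.

    specs maps a parameter name to (shape, minimum, maximum); only
    'uniform' and 'triangular' shapes are sketched here.
    """
    rng = random.Random(seed)
    runs = []
    for _ in range(n_samples):
        run = {}
        for name, (shape, lo, hi) in specs.items():
            if shape == "uniform":
                run[name] = rng.uniform(lo, hi)
            elif shape == "triangular":
                run[name] = rng.triangular(lo, hi)  # mode defaults to midpoint
            else:
                raise ValueError(f"unsupported shape: {shape}")
        runs.append(run)
    return runs

specs = {
    "DOSERATE": ("uniform", 10.0, 14.0),   # illustrative name and bounds
    "STOPTIME": ("triangular", 0.5, 1.5),  # illustrative name and bounds
}
runs = sample_runs(3, specs)
for i, run in enumerate(runs, 1):
    print(f"run {i}: " + " ".join(f"{k}={v:.3f}" for k, v in run.items()))
```

Coupling (correlating) parameters, which the text notes is not possible in the initial application, would require sampling from a joint distribution instead of the independent draws shown here.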

addresses were extracted from the records and placed in key directories. These different methods of locating records by key information are today known as list structures (5, 6, 3). Although there are many variations of list structuring techniques..., it is helpful to observe two extremes of list structures with most others being combinations of them. The association of key data and DASD addresses permits retrieval of only those records required for processing without requiring a file search (6). Since...
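In modern terms, the key directory described above reduces to an index mapping each key value to the addresses of the records that contain it. A minimal sketch (record layout and addresses are invented for illustration):

```python
# Key directory sketch: the key field is extracted from each record and
# mapped to the record's storage (DASD) address, so a lookup retrieves
# only the records needed instead of searching the whole file.
records = [
    {"addr": 0x10, "key": "A37", "payload": "first"},
    {"addr": 0x24, "key": "B12", "payload": "second"},
    {"addr": 0x38, "key": "A37", "payload": "third"},  # duplicate key
]

directory = {}
for rec in records:
    directory.setdefault(rec["key"], []).append(rec["addr"])

print(directory["A37"])  # all addresses holding records with key A37
```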

This encryption algorithm is designed primarily for secure file transfer on low-privilege servers, as well as within secured environments. The methodology will be implemented in the data center and in other important data-transaction sectors of the organisation: the encoding side of the software will be operated by the database administrator or system administrators, and the decoding side by their trusted clients. The software will not be circulated to unauthorised customers.


The file management component of a database management system (DBMS) has to be specially designed to meet the performance demands of large database applications. Operating system (OS) file systems are typically not suitable for storing...

The Evonik Degussa Corporation is the global market leader in the specialty chemicals industry. Innovative products and system solutions make an indispensable contribution to our customers' success; we refer to this as "creating essentials". In fiscal 2004, Degussa's 45,000 employees worldwide generated sales of 11.2 billion euros and operating profits (EBIT) of 965 million euros. Evonik Degussa Corporation has performed a plant-wide energy usage assessment at the Mapleton, Illinois facility, which consumed 1,182,330 MMBTU in 2003. The purpose of this study was to identify opportunities for improvement in the plant's utility requirements specific to its operation. Production relies mainly on natural gas for steam, process heating, and hydrogen production. The current high price of natural gas in the US is not very competitive compared with other countries, so every effort must be made to minimize utility consumption in order to maximize market position and minimize fixed-cost increases due to rising energy costs. The main objective of this plant-wide assessment was to use a methodology called Site Energy Modelling (SitE Modelling) to identify areas of potential improvement for energy savings, either by implementing a single process change or by changing the way different processes interact with each other. The overall goal was to achieve energy savings of more than 10% compared to the 2003 energy figures of the Mapleton site. The final savings breakdown is provided below:
- 4.1% savings for steam generation and delivery, accomplished through better control schemes, more constant and optimized loading of the boilers, and increased boiler efficiency through an advanced control scheme.
- 1.6% savings for plant chemical processing, accomplished through optimized process-heating efficiency and batch recipes, as well as an optimized production schedule to help equalize the boiler load (e.g. steam consumption).
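As a quick arithmetic check, the quoted percentages can be converted to absolute savings, assuming (as seems intended) that they apply to the site's total 2003 consumption of 1,182,330 MMBTU:

```python
# Converting the quoted percentage savings into absolute terms, assuming
# the percentages apply to the site's total 2003 consumption.
baseline_mmbtu = 1_182_330  # Mapleton site consumption in 2003

savings_frac = {
    "steam generation and delivery": 0.041,
    "plant chemical processing": 0.016,
}
savings_mmbtu = {area: baseline_mmbtu * frac for area, frac in savings_frac.items()}

for area, mmbtu in savings_mmbtu.items():
    print(f"{area}: {savings_frac[area]:.1%} -> {mmbtu:,.0f} MMBTU/yr")

total_frac = sum(savings_frac.values())
print(f"identified in these two areas: {total_frac:.1%} of the >10% goal")
```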