Sample records for analysis indicators tool from the National Library of Energy Beta (NLEBeta)

Note: This page contains sample records for the topic "analysis indicators tool" from the National Library of Energy Beta (NLEBeta).
While these samples are representative of the content of NLEBeta,
they are not comprehensive nor are they the most current set.
We encourage you to perform a real-time search of NLEBeta
to obtain the most current and comprehensive results.

During the three years of AMETIST, the performance and usability of the existing tools for analysing timed automata models have improved enormously. In addition to traditional verification of timed models, emphasis has been placed on retargeting the technology towards optimal scheduling and performance analysis, directions pursued in a number of newly developed tools.

VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high overall operational efficiency of the National Ignition Facility (NIF) by combining near- and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project comprises two primary tool suites, for shot planning and optics demand. The shot planning component provides a web-based interface for selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as shot planners refer to them, provide for planning both near-term shots in the facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and with the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modeling of the maintenance the final optics will require as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are run on an analytical cluster of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operations optimization tools.
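The flaw-prediction "rules" this record describes can be sketched in miniature. The model form (exponential growth above a fluence threshold), the coefficients, and the function names below are illustrative assumptions, not NIF's actual physics rules:

```python
import math

def predict_flaw_size(initial_size_um, shot_fluences, threshold=8.0, growth_coeff=0.1):
    """Predict flaw size (microns) after a sequence of proposed shots.

    Hypothetical rule: each shot whose fluence (J/cm^2) exceeds `threshold`
    multiplies the flaw size by exp(growth_coeff * (fluence - threshold)).
    """
    size = initial_size_um
    for fluence in shot_fluences:
        if fluence > threshold:
            size *= math.exp(growth_coeff * (fluence - threshold))
    return size

def needs_blocker(size_um, limit_um=300.0):
    """Decision-support output: advise a beam blocker once the predicted
    flaw exceeds a size limit (limit is an illustrative placeholder)."""
    return size_um > limit_um
```

Running such a rule over every flaw and every lane of proposed shots is what turns a shot plan into the maintenance-exchange and blocker-placement advisories mentioned above.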

The Solar Energy Research Institute (SERI) is developing a procedure for the validation of Building Energy Analysis Simulation Codes (BEAS). These codes are being used increasingly in the building design process, both directly and as the basis for simplified design tools and guidelines. The importance of the validity of the BEAS in predicting building energy performance is obvious when one considers the money and energy that could be wasted by energy-inefficient designs. However, to date, little or no systematic effort has been made to ensure the validity of the various BEAS. The validation work at SERI consists of three distinct parts: Comparative Study, Analytical Verification, and Empirical Validation. The procedures have been developed for the first two parts and have been implemented on a sampling of the major BEAS; results have shown major problems in one of the BEAS tested. Furthermore, when one building design was run using several of the BEAS, large differences were found in the predicted annual cooling and heating loads. The empirical validation procedure has been developed, and five two-zone test cells have been constructed for validation; a summer validation run will take place as soon as the data acquisition system is completed. Additionally, a test validation exercise is now in progress using the low-cal house to fine-tune the empirical validation procedure and better define monitoring data requirements.

A new software tool enables the easy and quick selection of applicable regulatory guidelines as a starting point for human factors engineering (HFE) analyses. Once selected, each guideline can be viewed on screen. The software tracks and reports the ...

Solar and Wind Energy Resource Assessment (SWERA) Analysis Tools: OpenCarto houses the SWERA web-based GIS application and provides the tools and data to support a variety of user communities in both small and large project planning, feasibility assessment, policy making, and decision support. The interface is designed to support collaboration across industries, geography, and research domains by providing interoperability ...

Models and Tools Archive: Through the years, NREL has developed and supported several models and tools to assess, analyze, and optimize renewable energy and energy efficiency technologies. Some of these have been transferred to the private market. This page lists tools we have supported but that are no longer active. ADVISOR (ADvanced VehIcle SimulatOR): simulate and analyze conventional, advanced, light, and heavy vehicles, including hybrid electric and fuel cell vehicles; in 2003, ADVISOR was commercialized by AVL Powertrain Engineering, Inc. Hybrid2: conduct detailed long-term performance and economic analysis on a wide variety of hybrid power systems. RET Finance: calculate the cost of energy of renewable electricity generation ...

This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

IGATE-E is an industrial energy analysis tool, intended as a decision-support and planning tool for a wide spectrum of energy analysts, engineers, researchers, government organizations, private consultants, industry partners, and the like. The tool applies statistical modeling to multiple datasets and provides information at the geospatial resolution of zip code using bottom-up approaches. Within each zip code, the current version of the tool estimates the electrical energy consumption of manufacturing industries by industry type, using information from DOE's Industrial Assessment Center database (IAC-DB) and DOE's Energy Information Administration Manufacturing Energy Consumption Survey database (EIA-MECS DB), in addition to commercially available databases such as the Manufacturing News database (MNI, Inc.). Ongoing and future work includes adding modules for the prediction of fuel energy consumption streams, manufacturing process step energy consumption, and major energy intensive processes (EIPs) within each industry type, among other metrics of interest. The tool uses the DOE EIA-MECS energy survey data to validate bottom-up estimates and permits several statistical examinations.
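The bottom-up approach this record describes amounts to summing facility-level estimates up to zip-code totals. A minimal sketch, with made-up facility records and intensity factors standing in for the IAC, EIA-MECS, and MNI data the real tool uses:

```python
from collections import defaultdict

# Hypothetical facility records: (zip_code, industry type, employee count)
facilities = [
    ("37830", "food", 120),
    ("37830", "chemicals", 300),
    ("37831", "food", 80),
]

# Hypothetical electricity intensity, MWh per employee per year, by industry type
intensity_mwh_per_employee = {"food": 25.0, "chemicals": 60.0}

def estimate_by_zip(facilities, intensity):
    """Bottom-up aggregation: sum per-facility estimates into zip-code
    totals (MWh/yr). Each facility's estimate is employees * intensity."""
    totals = defaultdict(float)
    for zip_code, industry, employees in facilities:
        totals[zip_code] += employees * intensity[industry]
    return dict(totals)
```

The resulting zip-code totals are what a survey like EIA-MECS can then be used to validate at the state level.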

The growing interest in commissioning is creating a demand that will increasingly be met by mechanical contractors and less experienced commissioning agents. They will need tools to help them perform commissioning effectively and efficiently. The widespread availability of standardized procedures, accessible in the field, will allow commissioning to be specified with greater certainty as to what will be delivered, enhancing the acceptance and credibility of commissioning. In response, a functional test data analysis tool is being developed to analyze the data collected during functional tests for air-handling units. The tool is designed to analyze test data, assess the performance of the unit under test, and identify the likely causes of any failure. It has a convenient user interface to facilitate manual entry of measurements made during a test. A graphical display shows the measured performance versus the expected performance, highlighting significant differences that indicate the unit is not able to pass the test. The tool is described as semiautomated because the measured data need to be entered manually, instead of being passed from the building control system automatically; the data analysis and visualization, however, are fully automated. The tool is designed to be used by commissioning providers conducting functional tests as part of either new building commissioning or retro-commissioning, as well as by building owners and operators interested in conducting routine tests periodically to check the performance of their HVAC systems.
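The core of the analysis the record describes is a measured-versus-expected comparison with a tolerance. A minimal sketch; the test-point names, values, and tolerances are illustrative, not taken from the actual tool:

```python
def assess_test_point(measured, expected, tolerance):
    """One functional-test measurement: returns (passed, deviation)."""
    deviation = measured - expected
    return abs(deviation) <= tolerance, deviation

def assess_unit(points):
    """points: list of (name, measured, expected, tolerance) tuples entered
    manually from a functional test. Returns the names of failed points;
    an empty list means the unit passed."""
    failures = []
    for name, measured, expected, tolerance in points:
        passed, _ = assess_test_point(measured, expected, tolerance)
        if not passed:
            failures.append(name)
    return failures
```

In the real tool this comparison feeds a graphical display and diagnostic rules that suggest likely causes of each failure.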

AN ASSESSMENT OF ENERGY INTENSITY INDICATORS AND THEIR ROLE AS POLICY-MAKING TOOLS, by Mallika. The use of energy intensity indicators as a basis for policy-making has been on the rise. Estimates of economic energy intensity from decomposition analyses are found to be data ...

The Desktop Analysis Reporting Tool (DART) is a software package that allows a user to easily view and analyze radiation portal monitor (RPM) daily files that span long periods. DART gives users the capability to determine the state of health of a monitor, troubleshoot and diagnose problems, and view data over various time frames to perform trend analysis. In short, it converts the data strings written in the daily files into meaningful tables and plots. DART is an application-based program designed to maximize the benefit of a centralized data repository while distributing the workload to individual desktop machines. This networked approach requires a more complex database manager (SQL Server); however, SQL Server is not currently provided with the DART installation disk. For local data analysis, SQL Express is sufficient; it must be installed, along with DART, on each machine intended for analysis.
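The conversion DART performs, from raw daily-file data strings to records suitable for tables, plots, and trending, can be sketched as below. The line format (timestamp, monitor id, gamma counts, neutron counts) is invented for illustration; actual RPM daily-file formats differ:

```python
def parse_daily_lines(lines):
    """Parse raw comma-separated daily-file strings into typed records."""
    records = []
    for line in lines:
        ts, monitor_id, gamma, neutron = line.strip().split(",")
        records.append({
            "timestamp": ts,
            "monitor": monitor_id,
            "gamma": int(gamma),
            "neutron": int(neutron),
        })
    return records

def mean_background(records, field="gamma"):
    """A simple state-of-health statistic: the average count rate
    across a file, suitable for plotting as a long-term trend."""
    return sum(r[field] for r in records) / len(records)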

BBAT was written to meet the need for an interactive graphical tool to explore the longitudinal phase space. It is well suited to quickly testing new ideas or new tricks, and is intended for machine physicists and operations staff alike, both in the control room during machine studies and off-line to analyze data. The heart of the package is a set of C routines that do the number crunching. The graphics part is built with the scripting language Tcl/Tk and BLT. The C routines are general enough that one can write new applications, such as an animation of the bucket as a machine parameter is varied via a sliding scale. BBAT deals with a single rf system. For a double rf system, one can use Dr. BBAT, which stands for Double rf Bunch and Bucket Analysis Tool. One use of Dr. BBAT is to visualize the process of bunch coalescing and flat bunch creation.

In this paper we discuss the use of expert system components to enhance reliability analysis software tools. The discussion focuses on two expert systems built to enhance STAR and SUPER, reliability analysis tools developed at AT&T Bell Laboratories. ...

For the past three years, the Office of Security Policy has been aggressively pursuing substantial improvements in the U.S. Department of Energy (DOE) regulations and directives related to safeguards and security (S&S). An initial effort focused on areas where specific improvements could be made. This revision was completed during 2009 with the publication of a number of revised manuals. Developing these revisions involved more than 100 experts in the various disciplines involved, yet the changes made were only those that could be identified and agreed upon based largely on expert opinion. The next phase of changes will be more analytically based. A thorough review of the entire S&S directives set will be conducted using software tools to analyze the present directives with a view toward 1) identifying areas of positive synergism among topical areas, 2) identifying areas of unnecessary duplication within and among topical areas, and 3) identifying requirements that are less than effective in achieving the intended protection goals. This paper will describe the software tools available and in development that will be used in this effort. Some examples of the output of the tools will be included, as will a short discussion of the follow-on analysis that will be performed when these outputs are available to policy analysts.

Distribution System Analysis Tools for Studying High Penetration of PV with Grid Support Features ...

Buildings Performance Database Analysis Tools: The Buildings Performance Database (BPD) will offer four analysis tools for exploring building data and forecasting financial and energy savings: a Peer Group Tool, a Retrofit Analysis Tool, a Data Table Tool, and a Financial Forecasting Tool. Available now, the Peer Group Tool allows users to peruse the BPD, define peer groups, and analyze their performance. Users can create peer groups by filtering the dataset based on parameters such as building type, location, floor area, age, and occupancy, and on system characteristics such as lighting and HVAC type. The graphs show the energy performance distribution of those ...

NREL has developed and maintains a variety of infrastructure analysis models for the U.S. Department of Energy. Business case analysis has recently been added to this tool set. This presentation focuses on cash flow analysis. Cash flows depend upon infrastructure costs, optimized spatially and temporally, and assumptions about financing and revenue. NREL has incorporated detailed metrics on financing and incentives into the models. Next steps in modeling include continuing to collect feedback on regional/local infrastructure development activities and 'roadmap' dynamics, and incorporating consumer preference assumptions on infrastructure to provide direct feedback between vehicles and station rollout.

PyRAT is a radiography analysis tool used to reconstruct images of unknown 1-D objects. The tool is written in Python and developed for use on Linux and Windows platforms. It is capable of performing nonlinear inversions of the images with minimal manual interaction in the optimization process, using the NOMAD mixed-variable optimization tool to perform the optimization.

This documentation describes the tool development. It was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. This documentation also provides guidance on how to apply the tool.

Declining natural resources, rising oil prices, looming climate change and the introduction of nuclear energy partnerships, such as GNEP, have reinvigorated global interest in nuclear energy. The convergence of such issues has prompted countries to move ahead quickly to deal with the challenges that lie ahead. However, developing countries, in particular, often lack the domestic infrastructure and public support needed to implement a nuclear energy program in a safe, secure, and nonproliferation-conscious environment. How might countries become ready for nuclear energy? What is needed is a framework for assessing a country's readiness for nuclear energy. This paper suggests that a Nuclear Energy Readiness Indicator (NERI) Index might serve as a meaningful basis for assessing a country's status in terms of progress toward nuclear energy utilization under appropriate conditions. The NERI Index is a benchmarking tool that measures a country's level of 'readiness' for nonproliferation-conscious nuclear energy development. NERI first identifies 8 key indicators that have been recognized by the International Atomic Energy Agency as key nonproliferation and security milestones to achieve prior to establishing a nuclear energy program. It then measures a country's progress in each of these areas on a 1-5 point scale. In doing so NERI illuminates gaps or underdeveloped areas in a country's nuclear infrastructure with a view to enable stakeholders to prioritize the allocation of resources toward programs and policies supporting international nonproliferation goals through responsible nuclear energy development. 
On a preliminary basis, the indicators selected include: (1) demonstrated need; (2) expressed political support; (3) participation in nonproliferation and nuclear security treaties, international terrorism conventions, and export and border control arrangements; (4) national nuclear-related legal and regulatory mechanisms; (5) nuclear infrastructure; (6) the utilization of IAEA technical assistance; (7) participation in regional arrangements; and (8) public support for nuclear power. In this paper, the Index aggregates the indicators and evaluates and compares the level of readiness in seven countries that have recently expressed various degrees of interest in establishing a nuclear energy program. The NERI Index could be a valuable tool to be utilized by: (1) country officials who are considering nuclear power; (2) the international community, desiring reassurance of a country's capacity for the peaceful, safe, and secure use of nuclear energy; (3) foreign governments/NGOs, seeking to prioritize and direct resources toward developing countries; and (4) private stakeholders interested in nuclear infrastructure investment opportunities.
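The scoring scheme described above, eight indicators each rated 1-5 and aggregated into one index, can be sketched as follows. Equal weighting is an assumption here; the record does not specify how the indicators are weighted:

```python
# The eight preliminary NERI indicators, as listed in the record
INDICATORS = [
    "demonstrated_need", "political_support", "treaty_participation",
    "legal_regulatory", "nuclear_infrastructure", "iaea_assistance",
    "regional_arrangements", "public_support",
]

def neri_index(scores):
    """scores: dict mapping each indicator to an integer 1-5.
    Returns the (equally weighted) mean readiness score."""
    if set(scores) != set(INDICATORS):
        raise ValueError("scores must cover all eight indicators")
    for v in scores.values():
        if not 1 <= v <= 5:
            raise ValueError("each indicator is scored on a 1-5 scale")
    return sum(scores.values()) / len(INDICATORS)

def gaps(scores, threshold=3):
    """Indicators scoring below the threshold, i.e. the underdeveloped
    areas the index is meant to illuminate."""
    return [k for k in INDICATORS if scores[k] < threshold]
```

Scoring seven candidate countries this way, then comparing their indices and gap lists, is essentially the benchmarking exercise the paper performs.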

The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met, including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

EERE Analysis Tools: The U.S. Department of Energy funds energy analysis activities and produces analysis resources. On this website, energy analysts and policymakers can access energy analysis products related to renewable energy and energy efficiency, including published analyses, energy data and maps, and analysis tools such as models, software, and calculators. While the website provides links to select and important energy analysis products, it is not a comprehensive source of such products. Energy analysis affords an understanding of issues and opportunities that effective energy program management requires. http://www1.eere.energy.gov/analysis/index.html The category "EERE Analysis Tools" contains 43 pages.

This paper describes the Alpaca runtime tools. These tools leverage the component infrastructure of the Cactus Framework in a novel way to enable runtime steering, monitoring, and interactive control of a simulation. Simulation data can be observed graphically, ... Keywords: frameworks, runtime visualization, software/program verification

Summary: ProMAT is a software tool for statistically analyzing data from ELISA microarray experiments. The software estimates standard curves, sample protein concentrations, and their uncertainties for multiple assays. ProMAT generates a set of comprehensive figures for assessing results and diagnosing process quality. The tool is available for Windows or Mac, and is distributed as open-source Java and R code. Availability: ProMAT is available at http://www.pnl.gov/statistics/ProMAT. ProMAT requires Java version 1.5.0 and R version 1.9.1 (or more recent versions), which are distributed with the tool.
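A standard curve of the kind ProMAT estimates relates assay response to known concentrations, so that sample concentrations can be back-calculated. As an illustration (not ProMAT's actual algorithm), here is the four-parameter logistic (4PL) model commonly used for ELISA standard curves, together with its inverse:

```python
def fourpl(x, a, b, c, d):
    """4PL standard-curve response for concentration x:
    a = minimum asymptote, d = maximum asymptote,
    c = inflection point (EC50), b = slope factor."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def inverse_fourpl(y, a, b, c, d):
    """Back-calculate a sample concentration from its observed
    response y on the fitted curve (valid for y between a and d)."""
    return c * (((a - d) / (y - d)) - 1.0) ** (1.0 / b)
```

In practice the parameters a, b, c, d are fitted to the known standards by nonlinear least squares, and the uncertainty of each back-calculated concentration is propagated from the fit.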

Novel Tool Allows Quicker, More Versatile Analysis of Energy Production Technologies (June 19, 2012, Washington, DC) - A new energy production technology analysis tool that could lead to cost-effective improvements for energy generation and lower costs for consumers is now available on the National Energy Technology Laboratory (NETL) website. Available at no cost, the Power Systems Life Cycle Analysis Tool (Power LCAT) compares seven energy-production technologies: natural gas combined cycle, integrated gasification combined cycle, existing and supercritical pulverized coal, existing and new nuclear, and onshore wind. An option for capturing and sequestering carbon dioxide emissions is also included for ...

New Tool Yields Custom Environmental Data for Lifecycle Analysis (September 10, 2012, Washington, DC) - A new, free online tool developed by a Department of Energy (DOE) laboratory allows users to customize and analyze the environmental impact of various fuels before they are used to create power. Information from the Excel-based Upstream Dashboard - developed by the Office of Fossil Energy's National Energy Technology Laboratory (NETL) - can be used with other data or models to build an emissions inventory of various feedstocks as part of a comprehensive lifecycle analysis of the fuels. Lifecycle analysis is a new and innovative way to analyze and compare different pathways for producing power and transportation fuels.

Concurrency introduces a high degree of combinatorial complexity, which may be the source of subtle mistakes. We present a new tool, Quasar, which is based on ASIS and which makes full use of the concept of patterns. The analysis of a concurrent Ada program by our tool proceeds ...

We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed offline from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving offline data analysis environment on the USC computers.

Biomass Analysis Tool Is Faster, More Precise (February 26, 2013) - A screening tool from the U.S. Department of Energy's National Renewable Energy Laboratory (NREL) eases and greatly quickens one of the thorniest tasks in the biofuels industry: determining cell wall chemistry to find plants with ideal genes. [Photo caption: This tray of 80-milliliter samples was taken from a standard poplar tree; it is ready to be loaded into NREL's molecular beam mass spectrometer for rapid analysis of the cell wall chemistry of each sample. Credit: Dennis Schroeder]

Overhang Annual Analysis lets users visualize the shading performance of a window overhang for an entire year. The tool accepts a location and the dimensions of the window and overhang as inputs, and produces a chart depicting the amount by which the overhang shades the window for each hour of the day, for each month of the year. Keywords: window, overhang, shading, solar. Validation/Testing: outputs verified against an overhang shadow calculator with an independent methodology, also produced by the author. Expertise required: none. Users: unknown; new tool, released August 2007. Audience: architects, builders, homeowners, passive solar designers, energy analysts. Inputs: location, window dimensions, overhang dimensions.
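The geometry behind such a shading chart can be sketched with a textbook simplification: for a given solar altitude, an overhang of depth D mounted a gap G above a window of height H shades part of the window. This ignores solar azimuth and location-specific sun paths, which the actual tool accounts for:

```python
import math

def shaded_fraction(altitude_deg, depth, gap, window_height):
    """Fraction of window height shaded by a horizontal overhang,
    assuming the sun faces the window directly (azimuth ignored).
    depth, gap, and window_height share the same length unit."""
    if altitude_deg <= 0:
        return 0.0  # sun below the horizon: no direct beam to block
    # Vertical extent of the shadow below the window head
    shadow_drop = depth * math.tan(math.radians(altitude_deg)) - gap
    return max(0.0, min(1.0, shadow_drop / window_height))
```

Evaluating this for each hour of each month, using the sun's altitude at the given location, yields the kind of annual chart the tool produces.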

FUTURE POWER GRID INITIATIVE - Market Design Analysis Tool. OBJECTIVE: Power market design plays a critical role in outcomes related to power system reliability and market efficiency. However, translation of market rules and designs into the complex mathematical market clearing mechanism is not a trivial ...
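At its simplest, the "market clearing mechanism" mentioned above is a merit-order dispatch: sort generator offers by price and accept them until demand is met, with the last accepted offer setting the clearing price. This is a stylized illustration, not the tool's actual formulation, which involves a far more complex optimization with network and unit constraints:

```python
def clear_market(offers, demand):
    """offers: list of (generator, quantity_mw, price_per_mwh) tuples.
    Returns (dispatch dict, clearing price) under uniform pricing."""
    dispatch, remaining, price = {}, demand, 0.0
    for gen, qty, p in sorted(offers, key=lambda o: o[2]):  # cheapest first
        if remaining <= 0:
            break
        take = min(qty, remaining)
        dispatch[gen] = take
        remaining -= take
        price = p  # the marginal (last accepted) offer sets the price
    if remaining > 0:
        raise ValueError("insufficient supply to meet demand")
    return dispatch, price
```

Changing a market rule (say, the pricing scheme or offer caps) changes this clearing outcome, which is exactly the kind of sensitivity a market design analysis tool is built to study.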

This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

The Adversarial Route Analysis Tool is a kind of Google Maps for adversaries: a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations by predicting where an adversary might be. It is easily accessible and maintainable, and simple to use without much training.

Foundation Analysis and Design (FAD) Tools is a software package for the design of transmission line foundations: drilled shafts and direct-embedded poles in soil and rock for single pole structures (MFAD) and H-frame structures (HFAD), and drilled shafts for steel towers (TFAD). The software ha...

IGATE-E is an energy analysis tool for industrial energy evaluation. The tool applies statistical modeling to multiple publicly available datasets and provides information at the geospatial resolution of zip code using bottom-up approaches. Within each zip code, the current version of the tool estimates the electrical energy consumption of manufacturing industries by industry type, using DOE's Industrial Assessment Center database (IAC-DB) and DOE's Energy Information Administration Manufacturing Energy Consumption Survey database (EIA-MECS DB), in addition to other commercially available databases such as the Manufacturing News database (MNI, Inc.). Ongoing and future work includes adding modules for the prediction of fuel energy consumption streams, manufacturing process step energy consumption, and major energy intensive processes (EIPs) within each industry type, among other metrics of interest. The tool provides validation against DOE's EIA-MECS state-level energy estimates and permits several statistical examinations. IGATE-E is intended as a decision-support and planning tool for a wide spectrum of energy analysts, researchers, government organizations, private consultants, industry partners, and the like.

The Dynamic Maps, Geographic Information System (GIS) Data and Analysis Tools website provides maps, data, and tools for renewable energy resources that help determine which energy technologies are viable solutions in domestic and international regions. MapSearch: while this site contains detailed information and quality data, if you want to search for the latest and most up-to-date maps created by NREL, please visit our MapSearch: http://www.nrel.gov/gis/mapsearch/ Renewable Energy Technical Potential: The National Renewable Energy Laboratory's GIS team analyzes wind, solar, biomass, geothermal, and other energy resources and inputs the data into the GIS. Read more about how NREL's GIS staff and capabilities enhance the

The sun as an oscillator produces frequencies which propagate in the heliosphere, via solar wind, to the terrestrial magnetosphere. We searched for those frequencies in the parameters of the near Earth solar plasma and the geomagnetic indices for the past four solar cycles. The solar wind parameters used in this work are the interplanetary magnetic field, plasma beta, Alfven Mach number, solar wind speed, plasma temperature, plasma pressure, plasma density and the geomagnetic indices DST, AE, Ap and Kp. We found out that each parameter of the solar wind exhibit certain periodicities which di?erentiate in each cycle. Our results indicate intermittent periodicities in our data, some of them shared between the solar wind parameters and geomagnetic indices.
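A minimal version of such a periodicity search is a power spectrum of one parameter's time series. The signal below is synthetic, standing in for a solar wind parameter with a known period:

```python
import cmath
import math

def power_spectrum(x):
    """Naive DFT power spectrum (O(n^2); real analyses would use an FFT,
    or a Lomb-Scargle periodogram for unevenly sampled solar wind data)."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(1, n // 2)]

# Synthetic stand-in for a solar wind parameter with a known 16-sample period
n, period = 128, 16
series = [math.sin(2 * math.pi * t / period) for t in range(n)]
spectrum = power_spectrum(series)
k_peak = spectrum.index(max(spectrum)) + 1  # +1 because k starts at 1
print(n / k_peak)  # → 16.0, the recovered period in samples
```

Repeating this per solar cycle and per parameter, and comparing which spectral peaks recur, is the essence of the analysis described above.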

We present the Tool for Astrophysical Data Analysis (TA-DA), new software aimed at greatly simplifying and improving the analysis of stellar photometric data in comparison with theoretical models, and at allowing the derivation of stellar parameters from multi-band photometry. Its flexibility allows one to address a number of such problems: from the interpolation of stellar models, or sets of stellar physical parameters in general, to the computation of synthetic photometry in arbitrary filters or units; from the analysis of observed color-magnitude diagrams to a Bayesian derivation of stellar parameters (and extinction) based on multi-band data. TA-DA is available as a pre-compiled Interactive Data Language widget-based application; its graphical user interface makes it considerably user-friendly. In this paper, we describe the software and its functionalities.
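The model-interpolation step can be illustrated with a toy grid. The mass-to-(Teff, log L) values below are invented, not taken from any stellar evolution library:

```python
def interpolate_model(grid, mass):
    """Linearly interpolate a 1-D stellar-model grid mapping mass to
    (Teff, log L); multi-band synthetic photometry interpolates many
    more quantities in the same way, often over several dimensions."""
    masses = sorted(grid)
    for m0, m1 in zip(masses, masses[1:]):
        if m0 <= mass <= m1:
            f = (mass - m0) / (m1 - m0)
            return tuple(a + f * (b - a) for a, b in zip(grid[m0], grid[m1]))
    raise ValueError("mass outside grid")

toy_grid = {1.0: (5800.0, 0.0), 2.0: (9000.0, 1.2)}  # invented grid points
print(interpolate_model(toy_grid, 1.5))  # → (7400.0, 0.6)
```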

This paper presents a new concept called the "Control Sensitivity Index," or CSI, for the stability analysis of HVdc converters connected to weak ac systems. The CSI for a particular control mode can be defined as the ratio of incremental changes in the two system variables that are most relevant to that control mode. The index provides valuable information on the stability of the system and, unlike other approaches, aids in the design of the controller. It also plays an important role in defining non-linear gains for the controller. This paper offers a generalized formulation of the CSI and demonstrates its application through an analysis of the CSI for three modes of HVdc control. The conclusions drawn from the analysis are confirmed by a detailed electromagnetic transients simulation of the ac/dc system. The paper concludes that the CSI can be used to improve the controller design and that, for an inverter in a weak ac system, the conventional voltage control mode is more stable than the conventional {gamma} control mode.
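As a sketch of the idea, a CSI-style ratio can be approximated numerically as the incremental change in one variable per incremental change in the other, evaluated about an operating point. The operating curve below is a toy, not the paper's converter model:

```python
import math

def control_sensitivity_index(response, x0, dx=1e-6):
    """Central-difference approximation of a CSI-style sensitivity: the
    incremental change in the controlled variable per incremental change
    in the control variable, about an operating point x0."""
    return (response(x0 + dx) - response(x0 - dx)) / (2 * dx)

# Invented operating curve: per-unit dc voltage vs. firing angle (toy)
v_dc = lambda alpha: 1.35 * math.cos(alpha)
csi = control_sensitivity_index(v_dc, math.radians(15))
print(csi)  # negative: dc voltage falls as the firing angle rises
```

In the paper the two variables are chosen per control mode and the sign and magnitude of the ratio indicate stability margin; this sketch only shows the ratio-of-increments mechanics.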

Maps: NREL's GIS team develops maps for various renewable resources and for specific projects. As a benefit to the public, a majority of the static maps and Google Map (KML/KMZ) files are offered through a tool called MapSearch. Biomass Maps: maps showing the biomass resources available in the United States by county. Feedstock categories include crop residues; forest residues; primary and secondary mill residues; urban wood waste; and methane emissions from manure management, landfills, and domestic wastewater treatment. Federal Energy Management Program: the Federal Energy Management Program (FEMP) teamed with Geospatial Analysis staff at NREL to update the analysis for this project and created an interactive FEMP Screening Map application. The previous maps have been

Cooling towers are used extensively for numerous industrial, residential, and commercial applications. Yet despite their ubiquity, operators often do not know how to properly evaluate and optimize their performance. This is due to the complex and variable nature of all of the factors that can influence performance: fan speed, wind speed, sump temperature, heat load, ambient temperature, relative humidity, etc. This can be overwhelming for a regular operator, resulting in many cooling towers being set to a default operating condition and forgotten. This paper introduces a web-based cooling tower analysis tool being developed to help users understand and optimize operational efficiency. The calculations, evaluations, and models are discussed in detail to highlight important design considerations and issues, including how Merkel theory, psychrometric properties, tower types, and historical weather data are incorporated into the analysis.
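The Merkel integral at the heart of such an analysis can be evaluated numerically. The saturation-enthalpy curve below is a crude linear stand-in, not real psychrometric data:

```python
def merkel_number(t_cold, t_hot, h_air_in, l_over_g, h_sat, cpw=4.186, n=1000):
    """Approximate the Merkel number KaV/L = ∫ cpw dT / (h_sat(T) - h_air(T))
    by midpoint quadrature, with air enthalpy rising along the tower per the
    energy balance dh_air = (L/G) * cpw * dT."""
    dt = (t_hot - t_cold) / n
    total = 0.0
    for i in range(n):
        t = t_cold + (i + 0.5) * dt
        h_air = h_air_in + l_over_g * cpw * (t - t_cold)
        total += cpw * dt / (h_sat(t) - h_air)
    return total

# Crude linear stand-in for the saturated-air enthalpy curve (kJ/kg, invented)
h_sat = lambda t: 4.0 * t - 20.0
me = merkel_number(t_cold=30.0, t_hot=40.0, h_air_in=60.0, l_over_g=1.0, h_sat=h_sat)
print(me)  # dimensionless tower demand for this toy duty
```

A production tool would substitute a proper psychrometric routine for h_sat and iterate over weather data, but the quadrature structure is the same.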

Data Resources: NREL's Geographic Information System (GIS) team develops technology-specific GIS data maps for a variety of areas, as well as targeted analysis tools that can help determine the availability of renewable energy resources. Geographic Information System Data: NREL's GIS team develops technology-specific GIS data maps for a variety of areas, including biomass, geothermal, solar, wind, and renewable hydrogen. The team has made some of our datasets available for download through this Web site. 10km and 40km solar datasets are available for the United States and some international sites. 50m wind datasets are available for specific states, regions, and some international sites. 25km wind datasets are available for the United States. 90m offshore wind datasets are available

Future space exploration missions and campaigns will require sophisticated tools to help plan and analyze logistics. To encourage their use, space logistics tools must be usable: a design concept encompassing terms such ...

We describe the integrated development of PowerDOE, a new version of the DOE-2 building energy analysis program, and the Building Design Advisor (BDA), a multimedia-based design tool that assists building designers with the concurrent consideration of multiple design solutions with respect to multiple design criteria. PowerDOE has a Windows-based Graphical User Interface (GUI) that makes it easier to use than DOE-2, while retaining DOE-2's calculation power and accuracy. BDA, with a similar GUI, is designed to link to multiple analytical models and databases. In its first release it is linked to PowerDOE and a Daylighting Analysis Module, as well as to a Case Studies Database and a Schematic Graphic Editor. These allow building designers to set performance goals and address key building envelope parameters from the initial, schematic phases of building design to the detailed specification of building components and systems required by PowerDOE. The consideration of the thermal performance of building envelopes through PowerDOE and BDA is integrated with non-thermal envelope performance aspects, such as daylighting, as well as with the performance of non-envelope building components and systems, such as electric lighting and HVAC. Future versions of BDA will support links to CAD and electronic product catalogs, as well as provide context-dependent design advice to improve performance.

In subdivisions, house orientations are largely determined by street layout. The resulting house orientations affect energy consumption (annual and on-peak) for heating and cooling, depending on window area distributions and shading from neighboring houses. House orientations also affect energy production (annual and on-peak) from solar thermal and photovoltaic systems, depending on available roof surfaces. Therefore, house orientations fundamentally influence both energy consumption and production, and an appropriate street layout is a prerequisite for taking full advantage of energy efficiency and renewable energy opportunities. The potential influence of street layout on solar performance is often acknowledged, but solar and energy issues must compete with many other criteria and constraints that influence subdivision street layout. When only general guidelines regarding energy are available, these factors may be ignored or have limited effect. Also, typical guidelines are often not site-specific and do not account for local parameters such as climate and the time value of energy. For energy to be given its due consideration in subdivision design, energy impacts need to be accurately quantified and displayed interactively to facilitate analysis of design alternatives. This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

Environmental sampling and radionuclide analysis of the resulting material can be utilized as a supplemental approach in safeguarding practices and particularly for detection of undeclared nuclear activities. The production of nuclear weapons could be pursued by uranium enrichment processes to produce highly enriched U-235 or by nuclear reactor operations followed by chemical separations to produce Pu-239. The application of either of these processes results in the production of signature materials, some of which will be released to the environs. Results from the operations of the Hanford production facilities are discussed and indicate the type of signatures that may be expected from plutonium production facilities. These include noble gas emissions from the reactors and chemical separations processes, the production of radionuclides in reactor cooling water followed by their subsequent release to the Columbia River, and the release of mildly contaminated process water from the chemical processing facilities. These signature materials are carried by both gaseous and liquid effluents and enter various compartments of the environment. The types of signature materials which are most likely to be accumulated are discussed, together with examples of the quantities which have been released during past separations. There are numerous processes by which natural uranium may be enriched to produce highly enriched U-235. The most definitive signature of such processes is always a modification in uranium isotope ratios, and materials showing either enriched or depleted uranium in gaseous and liquid effluents provide the best indication that uranium enrichment processes are taking place. Therefore, techniques for sampling and analysis of airborne, waterborne, or deposited uranium in environmental matrices provide a means of detecting uranium enrichment which may lead to proliferation products.

Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH.
An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically displaying pertinent results.
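The calibration objective that PEST minimizes can be illustrated with a toy parameter sweep. The layer flows, thicknesses, and gradient below are invented, and the Darcy expression is a gross simplification of the axisymmetric MODFLOW model:

```python
def sum_of_squares(k_values, thicknesses, gradient, measured_flows):
    """Objective in the spirit of the PEST calibration: squared misfit
    between simulated layer inflows (toy Darcy model: q = K * b * i)
    and measured flow changes across each screened interval."""
    return sum((k * b * gradient - q) ** 2
               for k, b, q in zip(k_values, thicknesses, measured_flows))

# Invented two-layer problem: grid-search the layer-1 conductivity
thicknesses, gradient = [5.0, 10.0], 0.01
measured = [0.6, 0.4]  # flow change per layer, m^3/d (invented)
best_k1 = min((k1 / 10 for k1 in range(1, 200)),
              key=lambda k1: sum_of_squares([k1, 4.0], thicknesses,
                                            gradient, measured))
print(best_k1)  # grid-search estimate of layer-1 conductivity (m/d)
```

PEST does the same misfit minimization with gradient-based search, regularization, and a full flow simulation in place of the one-line Darcy model.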

The objective of this project was to provide a robust fallout analysis and planning tool for the National Technical Nuclear Forensics interagency ground sample collection team. Their application called for a fast-running, portable mission-planning tool for use in response to emerging improvised nuclear device (IND) post-detonation situations. The project met those goals by research and development of models to predict the physical, chemical, and radiological properties of fallout debris. ORNL has developed new graphical user interfaces for two existing codes, the Oak Ridge Isotope Generation (ORIGEN) code and the Defense Land Fallout Interpretive Code (DELFIC). ORIGEN is a validated, radionuclide production and decay code that has been implemented into the Fallout Analysis Tool to predict the fallout source term nuclide inventory after the detonation of an IND. DELFIC is a validated, physics-based, research reference fallout prediction software package. It has been implemented into the Fallout Planning Tool and is used to predict the fractionated isotope concentrations in fallout, particle sizes, fractionation ratios, dose rate, and integrated dose over the planned collection routes - information vital to ensure quality samples for nuclear forensic analysis while predicting dose to the sample collectors. DELFIC contains a particle activity module, which models the radiochemical fractionation of the elements in a cooling fireball as they condense into and onto particles to predict the fractionated activity size distribution for a given scenario. This provides the most detailed physics-based characterization of the fallout source term phenomenology available in an operational fallout model.
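The production-and-decay bookkeeping that ORIGEN performs at scale reduces, for a single parent-daughter pair, to the Bateman solution. The half-lives below are arbitrary illustration values:

```python
import math

def daughter_atoms(n1_0, half_life_1, half_life_2, t):
    """Bateman solution for a two-member decay chain:
    N2(t) = N1(0) * l1/(l2 - l1) * (exp(-l1 t) - exp(-l2 t)),
    with l = ln(2)/half-life. Codes like ORIGEN solve thousands of
    coupled chains, with production terms, by matrix methods."""
    l1 = math.log(2) / half_life_1
    l2 = math.log(2) / half_life_2
    return n1_0 * l1 / (l2 - l1) * (math.exp(-l1 * t) - math.exp(-l2 * t))

# Arbitrary example: parent half-life 10 h, daughter 1 h, pure-parent start
n2 = daughter_atoms(1.0e6, 10.0, 1.0, t=5.0)
print(n2)  # daughter atoms present 5 h after detonation-time inventory
```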

To facilitate wider penetration of renewable resources without compromising system reliability, given the limited predictability of intermittent renewable generation, Pacific Northwest National Laboratory (PNNL), in conjunction with the California Independent System Operator (CAISO) and with funding from the California Energy Commission, developed a tool for use by CAISO power grid operators. This tool predicts and displays the additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.

To facilitate wider penetration of renewable resources without compromising system reliability, given the limited predictability of intermittent renewable generation, Pacific Northwest National Laboratory (PNNL), in conjunction with the California Independent System Operator (CAISO) and with funding from the California Energy Commission, developed a tool for use by CAISO power grid operators. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.

Our understanding of glycosaminoglycan (GAG) biology has been limited by a lack of sensitive and efficient analytical tools designed to deal with these complex molecules. GAGs are heterogeneous and often sulfated linear ...

Future planetary explorations will require surface traverses of unprecedented frequency, length, and duration. As a result, there is need for exploration support tools to maximize productivity, scientific return, and safety. ...

During a review of the Advanced Test Reactor safety basis at the Idaho National Laboratory, human factors engineers identified ergonomic and human reliability risks involving the inadvertent exposure of a fuel element to the air during manual fuel movement and inspection in the canal. There were clear indications that these risks increased the probability of human error and possible severe physical outcomes to the operator. In response to this concern, a detailed study was conducted to determine the probability of the inadvertent exposure of a fuel element. Due to practical and safety constraints, the task network analysis technique was employed to study the work procedures at the canal. Discrete-event simulation software was used to model the entire procedure as well as the salient physical attributes of the task environment, such as distances walked, the effect of dropped tools, the effect of hazardous body postures, and physical exertion due to strenuous tool handling. The model also allowed analysis of the effect of cognitive processes such as visual perception demands, auditory information and verbal communication. The model made it possible to obtain reliable predictions of operator performance and workload estimates. It was also found that operator workload as well as the probability of human error in the fuel inspection and transfer task were influenced by the concurrent nature of certain phases of the task and the associated demand on cognitive and physical resources. More importantly, it was possible to determine with reasonable accuracy the stages as well as physical locations in the fuel handling task where operators would be most at risk of losing their balance and falling into the canal. The model also provided sufficient information for a human reliability analysis that indicated that the postulated fuel exposure accident was less than credible.
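The flavor of such a task-network model can be conveyed by a small Monte Carlo sketch. The task durations and per-task error probabilities below are invented, not values from the ATR study:

```python
import random

def simulate_procedure(tasks, n_runs=10_000, seed=42):
    """Monte Carlo over a sequential task network: each task has an
    (invented) mean duration, spread, and per-task error probability.
    Returns mean completion time and the fraction of runs with at least
    one error. Real task-network tools add branching, concurrency, and
    cognitive/physical workload modifiers."""
    random.seed(seed)
    total_time, runs_with_error = 0.0, 0
    for _ in range(n_runs):
        t, errored = 0.0, False
        for mean, sd, p_err in tasks:
            t += max(0.0, random.gauss(mean, sd))
            errored = errored or (random.random() < p_err)
        total_time += t
        runs_with_error += errored
    return total_time / n_runs, runs_with_error / n_runs

tasks = [(4.0, 1.0, 0.01), (10.0, 2.0, 0.02), (6.0, 1.5, 0.005)]  # minutes
mean_time, p_any_error = simulate_procedure(tasks)
print(mean_time, p_any_error)
```

The commercial discrete-event packages used in the study layer walking distances, postures, and perceptual demands onto this same run-many-trials structure.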

On-Line Tools: Fuel Economy Website - This interactive website allows consumers to factor fuel efficiency into their car-buying decisions by allowing side-by-side fuel economy comparisons of new model year vehicles. The site also contains information on advanced technology vehicles, the environment, and a well-organized links section, making it a gateway for those interested in researching vehicles on the web. Contact: Bo Saulsbury. Transportation Energy Data Book: Edition 32 - Designed as a desk-top reference, the data book includes statistics and other information that characterize transportation activity and/or influence transportation energy use. The Transportation Energy Data Book: Edition 32 is available in pdf format by clicking on the above link and is also available in hard copy. To

Designers, implementers, and marketers of data analysis tools typically have different perspectives than users. Consequently, data analysts often find themselves using tools focused on graphics and programming concepts rather than concepts which reflect their own domain and the context of their work. Some user studies focus on usability tests late in development; others observe work activity, but fail to show how to apply that knowledge in design. This paper describes a methodology for applying observations of data analysis work activity in prototype tool design. The approach can be used both in designing improved data analysis tools and in customizing visualization environments to specific applications. We present an example of user-centered design for a prototype tool to cull large data sets. We revisit the typical graphical approach of animating a large data set from the point of view of an analyst who is culling data. Field evaluations using the prototype tool not only revealed valuable usability information, but initiated in-depth discussions about users' work, tools, technology, and requirements.

Benchmarks are heavily used in different areas of computer science to evaluate algorithms and tools. In program analysis and testing, open-source and commercial programs are routinely used as benchmarks to evaluate different aspects of algorithms ...

This paper presents a knowledge-centric and language independent framework and its application to develop safety analysistools for avionics systems. A knowledge-centric approach is important to address domain-specific needs, with respect to the types ...

This thesis discusses the Surface Exploration Traverse Analysis and Navigation Tool (SEXTANT), a system designed to help maximize productivity, scientific return, and safety on future lunar and planetary explorations. The ...

Analysis and Selection of Analytical Tools to Assess National-Interest Transmission Bottlenecks: Final Report. The work described in this report was coordinated by the Consortium for Electric Reliability Technology Solutions and funded by the U.S. Department of Energy, Office of Electric Transmission and Distribution, under Contract No. DE-AC03-76SF00098. More Documents & Publications: THE VALUE OF ECONOMIC DISPATCH: A REPORT TO CONGRESS PURSUANT TO SECTION 1234 OF THE ENERGY POLICY ACT OF 2005; 2006 National Electric Transmission Congestion Study and Related Materials.

Complex Adaptive Systems (CAS) can be applied to investigate complex infrastructure interdependencies such as those between the electric power and natural gas markets. These markets are undergoing fundamental transformations including major changes in electric generator fuel sources. Electric generators that use natural gas as a fuel source are rapidly gaining market share. These generators introduce direct interdependency between the electric power and natural gas markets. These interdependencies have been investigated using the emergent behavior of CAS model agents within the Spot Market Agent Research Tool Version 2.0 Plus Natural Gas (SMART II+).

GPA: a tool for fluid scalability analysis of massively parallel systems (Anton Stefanek, Richard A. Hayden, and J. T. Bradley; see also "Fluid Analysis of Energy Consumption using Rewards in Massively Parallel..."). For analysis, GPA first generates an abstract representation of the system of ODEs and then dynamically...

First Annual Conference on Intelligence Analysis Methods and Tools, May 2005 (PNNL-SA-44274), "Top Ten...". The Pacific Northwest National Laboratory (PNNL) has for some time been involved in both tool development and/or enhancement. Approach: we conducted a one-day workshop with analysts employed at PNNL.

This document provides a brief overview of HVAC analysis tools, focusing on those for chiller plants. A review of 14 tools, ranging from simplified to complex, finds a wide variation in calculation methods, HVAC system capabilities, available energy rate structures, and purchase cost. EPRI's ChillerCalc, which provides first-order analysis for chiller plants, compares favorably with other programs, offering critical features not included by some, such as time-of-use (TOU) energy rates, demand ratchets, l...
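The TOU and demand-ratchet features called out above amount to a billing calculation like the following sketch; the rates and loads are invented:

```python
def monthly_bill(kwh, rates, peak_kw, demand_rate, ratchet_kw=0.0):
    """Sketch of a TOU electric bill: energy charges summed per rate
    period, plus a demand charge on the greater of this month's peak
    and a ratcheted historical peak (both features the review flags
    as important for chiller-plant economics)."""
    energy = sum(kwh[p] * rates[p] for p in kwh)
    demand = max(peak_kw, ratchet_kw) * demand_rate
    return energy + demand

# Invented tariff: on-peak $0.15/kWh, off-peak $0.06/kWh, $12/kW demand
bill = monthly_bill({"on": 20_000, "off": 50_000},
                    {"on": 0.15, "off": 0.06},
                    peak_kw=180.0, demand_rate=12.0, ratchet_kw=220.0)
print(bill)  # → 8640.0: $6000 energy plus 220 kW * $12 ratcheted demand
```

Tools that omit the ratchet term can materially understate annual costs for plants whose summer peak sets the demand charge all year.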

Indices are an important tool used to increase the accuracy and efficiency of the energy audit process. This thesis describes methods for using annual, monthly, daily, and hourly indices to improve current energy auditing processes. Eleven schools in different regions of Texas were identified for the case studies. The results show that certain indices match what is recommended by on-site visits and actually provide additional information that is sometimes not identified by a site visit. The indices developed provide a useful means by which energy audit firms and building owners/administrators can identify those areas of a building that have the most potential for energy cost reduction measures and operation and maintenance measures prior to a site visit. These indices assist the energy auditor in performing more efficient energy analyses on buildings. Each school in this thesis was audited prior to this study as part of the Texas LoanSTAR program. The indices were then developed using data from the period between September 1991 and December 1993. Retrofits to the case study buildings were also completed during this period. The sites were then reaudited to confirm the results from the previous audits, the usefulness of the indices, and/or discover new areas for energy savings. Two important new findings from this thesis are: 1) that schools are better modeled by grouping data into separate occupancy profiles consisting of school-year months and summer months; and 2) that the school-year base-level electricity consumption can be calculated by taking the 25th percentile of all twelve months of data reported. This approximately matches the base level determined when running a 3-parameter cooling model on monthly energy consumption data and has the advantage that it does not require coincident weather data.
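The 25th-percentile base-level estimate described above is straightforward to compute. The monthly values below are invented illustration data, and the percentile definition (linear interpolation between order statistics) is one common convention, not necessarily the thesis's exact method:

```python
def base_level(monthly_kwh):
    """School-year base-level electricity use: the 25th percentile of
    twelve monthly consumption values, with linear interpolation
    between order statistics."""
    xs = sorted(monthly_kwh)
    rank = 0.25 * (len(xs) - 1)
    lo = int(rank)
    frac = rank - lo
    return xs[lo] if frac == 0 else xs[lo] + frac * (xs[lo + 1] - xs[lo])

# Invented monthly kWh (thousands), higher in the cooling season
months = [52, 50, 55, 60, 72, 90, 95, 93, 80, 62, 54, 51]
print(base_level(months))  # → 53.5
```

Because it uses only billing data, this estimate needs no coincident weather series, which is exactly the advantage the thesis claims over the 3-parameter cooling model.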


This paper outlines the application of a special Environmental Management Information System (EMIS) as combination of discrete event simulation with ecological material flow analysis for a selected production process. The software tool serves as decision ...

BTU Analysis Plus is a heat-load calculation program that performs comprehensive heat-load studies with hardcopy printouts of the results. The BTU Analysis Plus program is designed for general heating, air-conditioning, and commercial studies. Since 1987, the BTU Analysis family of programs has been commercially distributed and marketed through professional organizations, trade advertisements, and word of mouth. They are currently used in six foreign countries and the U.S., in temperate, tropic, arctic, and arid climates, and have proved themselves easy to use, accurate, and productive again and again. A version of BTU Analysis Plus was adopted for use in the revised HEATING VENTILATING AND AIR CONDITIONING FUNDAMENTALS by Raymond A. Havrella.

Report provides tables of present-value factors for use in the life-cycle cost analysis of capital investment projects for federal facilities. It also provides energy price indices based on the U.S. Department of Energy (DOE) forecasts from 2012 to 2042.
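The present-value factors tabulated in such reports come from standard discounting formulas, which can be sketched directly:

```python
def spv(d, n):
    """Single present-value factor: PV of $1 received n years out at
    real discount rate d."""
    return (1 + d) ** -n

def upv(d, n):
    """Uniform present-value factor: PV of $1 per year for n years at
    real discount rate d (annuity factor)."""
    return (1 - (1 + d) ** -n) / d

# E.g., a 3% real discount rate over a 30-year study period
print(round(spv(0.03, 30), 4), round(upv(0.03, 30), 4))
```

The published tables additionally fold DOE's energy price escalation forecasts into modified UPV factors per fuel and region; this sketch covers only the constant-dollar case.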

The last (5th) wave of EU enlargement ended on 1st January 2007 with the accession of Romania and Bulgaria. Many countries of the South-Eastern Europe aspire to join the EU. Croatia appears to be the next prospective member, so the aim of this paper ... Keywords: Ward's method, classification, cluster analysis, k-means method, multivariate method, structural economic indicators
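A minimal version of the k-means step used alongside Ward's method is easy to sketch. The two-dimensional indicator vectors below are invented, not real structural economic indicators:

```python
def kmeans(points, k, iters=25):
    """Plain k-means: assign each point to its nearest center, then move
    each center to its cluster mean. Deterministic spread-out seeding is
    used here for reproducibility; Ward's method, by contrast, builds
    clusters agglomeratively by minimizing within-cluster variance."""
    centers = [points[i * len(points) // k] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        centers = [tuple(sum(col) / len(cl) for col in zip(*cl)) if cl
                   else centers[i] for i, cl in enumerate(clusters)]
    return centers, clusters

# Invented indicator vectors for six "countries": two obvious groups
pts = [(1.0, 2.0), (1.2, 1.9), (0.9, 2.1), (6.0, 7.0), (6.2, 6.8), (5.9, 7.1)]
centers, clusters = kmeans(pts, 2)
print(sorted(len(c) for c in clusters))  # → [3, 3]
```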

BTU Analysis REG is a heat-load calculation program that performs comprehensive heat-load studies with hardcopy printouts of the results. The REG program is designed for general heating, air-conditioning, and light commercial studies. Since 1987, the BTU Analysis family of programs has been commercially distributed and marketed through professional organizations, trade advertisements, and word of mouth. They are currently used in six foreign countries and the U.S., in temperate, tropic, arctic, and arid climates, and have proved themselves easy to use, accurate, and productive again and again. A version of BTU Analysis was adopted for use in the revised HEATING VENTILATING AND AIR CONDITIONING FUNDAMENTALS by Raymond A. Havrella.

In this paper we describe a software package for developing heart rate variability analysis. This package, called RHRV, is a third party extension for the open source statistical environment R, and can be freely downloaded from the R-CRAN repository. ... Keywords: Apnea, Heart rate variability, Open source, Signal processing

Federal Energy Management Program The Federal Energy Management Program (FEMP) teamed with Geospatial Analysis staff at NREL to update the analysis for this project and created an interactive FEMP Screening Map application. The previous maps have been archived. If you have a need for one of the archived maps, please contact the Webmaster. This new application examines the viability of three solar technologies in the United States with a high-level annualized economic calculation, with and without potential savings from available renewable energy incentives at the state and federal level. This map allows users to access the results of those calculations, look in more detail at their areas of interest, and overlay other layers, such as renewable energy resource data for solar,

With the goals of reducing greenhouse gas emissions, oil imports, and energy costs, a wide variety of automotive technologies have been proposed to replace the traditional gasoline-powered internal combustion engine (g-ICE). A prototype model, the Analytica Transportation Energy Analysis Model (ATEAM), has been developed using the Analytica decision modeling environment, which visualizes model structure as a hierarchy of influence diagrams. This report summarizes the FY2010 ATEAM accomplishments.

The MetaBrowser design is based on the premise that scientists should not be forced to learn new languages or commands for finding the data they are interested in and for selecting subsets of the data for further analysis. Furthermore, there should be a single system that permits browsing, query, and analysis of the data, so that the scientist does not have to switch between systems. The current version of MetaBrowser was designed for the DOE CEDR (Comprehensive Epidemiological Data Resource) project, but the same principles can apply to other scientific disciplines. Browsing and query should be combined: it is quite natural for a user to explore the information in the database before deciding what subset of the data to select for further analysis. In general, if there is a large number of datasets (i.e., databases) in the system, then the user would want to find out information about the various datasets (called metadata) before choosing one or more datasets for further exploration. Thus, a metadatabase that holds information about the datasets in the system must exist.

There is a need in water sciences for computational tools to integrate large spatially distributed datasets to provide insight into the spatial and temporal domains of the data while allowing visualization, analysis in the spatial and temporal dimensions, ... Keywords: Data visualization, GIS, Geospatial software, Hydrological modeling, Integrated environmental modeling, Spatio-temporal analysis

A wide range of ARM developers, from architects to compiler writers to software developers, need tools to understand, analyze, and simulate program behavior. For developers to achieve high levels of system and program correctness, performance, reliability, and power efficiency, these tools must be fast and customizable to the problems at hand. BitRaker Anvil is a tool-building framework allowing developers to rapidly build tools to achieve these goals. BitRaker Anvil uses binary instrumentation to modify ARM binaries for the purpose of analyzing program behavior. BitRaker Anvil equips the developer with an easy-to-use API that allows the user to specify the particular program characteristics to analyze. Using this API, the developer can create custom tools to perform simulation or workload analysis several orders of magnitude faster than using a cycle-level simulator. Prior binary instrumentation technology requires that analysis code be merged into the same binary as the code to be analyzed. A key new feature of our binary instrumentation framework is ReHost analysis, which allows an instrumented ARM binary to make calls to analysis code that is written in the native format of the desktop machine. Using this for cross-platform ARM development results in analysis that runs orders of magnitude faster while simultaneously reducing the size of the ARM binary images.

This paper explores three significant software development requirements for making the transition from stand-alone lighting simulation/analysis tools to simulation-based design aid tools. These requirements include specialized lighting simulation engines, facilitated methods for creating detailed simulatable building descriptions, and automated techniques for providing lighting design guidance. Initial computer implementations meant to address each of these requirements are discussed to further elaborate these requirements and to illustrate work-in-progress.

PIXE analysis software has long been tuned mainly to the needs of Si(Li) detector based spectra analysis and to quantification methods based on Kα or Lα X-ray lines. Still, recent evidence related to the study of relative line intensities, and new developments in detection equipment, namely the emergence of commercial microcalorimeter-based X-ray detectors, have raised the possibility that in the near future PIXE will become more than just major-line quantification. A main issue that became evident as a consequence is the need to fit PIXE spectra without prior knowledge of relative line intensities. Considering these new developments, it may be necessary to generalize PIXE to a wider notion of ion beam induced X-ray (IBIX) emission, to include the quantification of processes such as Radiative Auger Emission. To answer this need, the IBIXFIT code was created, based largely on the Bayesian inference and simulated annealing routines implemented in the Datafurnace code [1]. In this presentation, IBIXFIT is used to fit a microcalorimeter spectrum of a BaxSr(1-x)TiO3 thin film sample, and the possibility of selecting between fixed and free line ratios, combined with other specificities of the IBIXFIT algorithm, is shown to be essential to overcoming the problems faced.
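The combination of simulated annealing with free line ratios can be sketched generically. The toy below fits two independent Gaussian line amplitudes to a synthetic spectrum by annealed random search; it is not the IBIXFIT algorithm, and every spectrum value, line position, and schedule constant is invented for illustration:

```python
# Toy simulated-annealing fit of two independent line amplitudes ("free line
# ratios": no fixed ratio constraint couples them). Spectrum, line positions,
# and the annealing schedule are all invented.
import math, random

def gauss(x, mu, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

def model(amps, xs):
    return [amps[0] * gauss(x, 10.0) + amps[1] * gauss(x, 15.0) for x in xs]

def chi2(amps, xs, ys):
    return sum((m - y) ** 2 for m, y in zip(model(amps, xs), ys))

xs = [0.5 * i for i in range(60)]
ys = model([100.0, 40.0], xs)          # synthetic "measured" spectrum

rng = random.Random(1)
amps = [1.0, 1.0]
cost = chi2(amps, xs, ys)
temp = 1000.0
for _ in range(20000):
    # Perturb one amplitude; the step size shrinks as the system cools.
    trial = list(amps)
    i = rng.randrange(2)
    trial[i] += rng.uniform(-1.0, 1.0) * max(temp / 100.0, 0.01)
    c = chi2(trial, xs, ys)
    # Metropolis rule: keep improvements, occasionally accept worse fits.
    if c < cost or rng.random() < math.exp((cost - c) / max(temp, 1e-12)):
        amps, cost = trial, c
    temp *= 0.999
print(round(amps[0], 1), round(amps[1], 1))
```

With both amplitudes free, the fit recovers the two intensities without assuming any Kα/Kβ-style ratio between them; a "fixed ratio" variant would perturb a single scale factor instead.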

This slide presentation focuses on the growing role and importance of imagery analysis for IAEA safeguards applications and how commercial satellite imagery, together with the newly available geospatial tools, can be used to promote 'all-source synergy.' As additional sources of openly available information, satellite imagery in conjunction with the geospatial tools can be used to significantly augment and enhance existing information gathering techniques, procedures, and analyses in the remote detection and assessment of nonproliferation relevant activities, facilities, and programs. Foremost of the geospatial tools are the 'Digital Virtual Globes' (i.e., GoogleEarth, Virtual Earth, etc.) that are far better than previously used simple 2-D plan-view line drawings for visualization of known and suspected facilities of interest, which can be critical to: (1) Site familiarization and true geospatial context awareness; (2) Pre-inspection planning; (3) Onsite orientation and navigation; (4) Post-inspection reporting; (5) Site monitoring over time for changes; (6) Verification of States' site declarations and for input to State Evaluation reports; and (7) A common basis for discussions among all interested parties (Member States). Additionally, as an 'open-source', such virtual globes can also provide a new, essentially free, means to conduct broad area search for undeclared nuclear sites and activities - either alleged through open source leads; identified on internet BLOGS and WIKI Layers, with input from a 'free' cadre of global browsers and/or by knowledgeable local citizens (a.k.a.: 'crowdsourcing'), that can include ground photos and maps; or by other initiatives based on existing information and in-house country knowledge.
They also provide a means to acquire ground photography taken by locals, hobbyists, and tourists of the surrounding locales that can be useful in identifying and discriminating between relevant and non-relevant facilities and their associated infrastructure. The digital globes also provide highly accurate terrain mapping for better geospatial context and allow detailed 3-D perspectives of all sites or areas of interest. 3-D modeling software (i.e., Google's SketchUp6 newly available in 2007) when used in conjunction with these digital globes can significantly enhance individual building characterization and visualization (including interiors), allowing for better assessments including walk-arounds or fly-arounds and perhaps better decision making on multiple levels (e.g., the best placement for International Atomic Energy Agency (IAEA) video monitoring cameras).

This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component viability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.

Climate Indices. Climate indices are diagnostic tools used to describe the state of the climate system and to monitor climate. They are most often represented as a time series, where each point in time corresponds to one index value. An index can be constructed to describe almost any atmospheric event; as such, they are myriad. Therefore, CDIAC provides links to other web sites to help guide users to the most widely used climate indices, which in many cases are updated monthly: NOAA's Climate Prediction Center Monitoring and Data Index page; NOAA's Earth Systems Research Laboratory Monthly Atmospheric and Ocean Time Series page (plot, analyze, and compare time series); and the Monthly Teleconnection Indices page from NOAA's National
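The "one index value per point in time" construction can be illustrated with a minimal standardized-anomaly index. The monthly values below are synthetic; real indices are built from observed fields with proper reference climatologies:

```python
# A minimal "climate index": the standardized anomaly of a monthly series,
# giving exactly one index value per point in time. Values are synthetic.
import statistics

monthly = [10.2, 10.8, 11.5, 12.0, 11.1, 10.5, 10.0, 10.3, 11.0, 11.8, 12.2, 11.4]
mean = sum(monthly) / len(monthly)
sd = statistics.stdev(monthly)
index = [(v - mean) / sd for v in monthly]   # one standardized value per month
print(len(index), "monthly index values")
```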

Test data will be used to validate advanced turbine design and analysis tools. NREL signed a Cooperative Research and Development Agreement with Alstom in 2010 to conduct certification testing in 2011. Tests to be conducted by NREL include a power quality test to finalize

This fact sheet from the National Renewable Energy Laboratory describes the Drive-Cycle Rapid Investigation, Visualization, and Evaluation (DRIVE) analysis tool, which uses GPS and controller area network data to characterize vehicle operation and produce custom vehicle drive cycles, analyzing thousands of hours of data in a matter of minutes.

Highlights:
- Sustainability and proximity principles have a key role in waste management.
- Core indicators are needed in order to quantify and evaluate them.
- A systematic, step-by-step approach is developed in this study for their development.
- Transport may play a significant role in terms of environmental and economic costs.
- Policy action is required in order to advance towards these principles.
Abstract: In this paper, the material and spatial characterization of the flows within a municipal solid waste (MSW) management system is combined through a Network-Based Spatial Material Flow Analysis. Using this information, two core indicators are developed for the bio-waste fraction, the Net Recovery Index (NRI) and the Transport Intensity Index (TII), which are aimed at assessing progress towards policy-related sustainable MSW management strategies and objectives. The NRI approaches the capacity of a MSW management system for converting waste into resources through a systematic metabolic approach, whereas the TII addresses efficiency in terms of the transport requirements to manage a specific waste flow throughout the entire MSW management life cycle. Therefore, both indicators could be useful in assessing key MSW management policy strategies, such as the achievement of higher recycling levels (sustainability principle) or the minimization of transport by locating treatment facilities closer to generation sources (proximity principle). To apply this methodological approach, the bio-waste management system of the region of Catalonia (Spain) has been chosen as a case study.
Results show the adequacy of both indicators for identifying those points within the system with higher capacity to compromise its environmental, economic and social performance, and therefore for establishing clear targets for policy prioritization. Moreover, this methodological approach permits scenario building, which could be useful in assessing the outcomes of hypothetical scenarios, thus proving its adequacy for strategic planning.
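As a rough sketch of how such indicators reduce to simple ratios over characterized flows, the snippet below computes plausible simplified versions of the two indices. The assumed forms (recovered mass over generated mass for the NRI, tonne-kilometres per tonne managed for the TII) and all flow numbers are illustrations, not the paper's exact definitions:

```python
# Simplified versions of the two indicators (assumed forms, not the paper's):
#   NRI ~ mass recovered as resources / mass generated
#   TII ~ transport effort (tonne-km) / mass managed
# All flow numbers are invented.

flows = [
    # (tonnes moved, km to treatment facility, tonnes recovered there)
    (1000.0, 20.0, 600.0),   # bio-waste sent to a nearby composting plant
    (400.0, 55.0, 100.0),    # residual fraction sent to a distant facility
]

generated = sum(t for t, _, _ in flows)
recovered = sum(r for _, _, r in flows)
tonne_km = sum(t * km for t, km, _ in flows)

nri = recovered / generated   # share of managed waste converted back to resources
tii = tonne_km / generated    # transport "embodied" in each tonne managed (km)
print(round(nri, 3), round(tii, 1))
```

A higher NRI signals progress on the sustainability principle; a lower TII signals progress on the proximity principle.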

TR-X is a multi-platform program that provides a graphical interface to Monte Carlo N-Particle transport (MCNP) and Monte Carlo N-Particle transport eXtended (MCNPX) codes. Included in this interface are tools to reduce the tedium of input file creation, provide standardization of model creation and analysis, and expedite the execution of the created models. TR-X provides tools to make the rapid testing of multiple permutations of these models easier, while also building in standardization that allows multiple solutions to be compared.

The analysis was performed on a system comprising a counterflow, concentric-pipe economizer, heat exchanger, flowmeter, plug, and connecting pipe. The system was assumed to be at some initial temperature equal to the inlet sodium temperature and suddenly loses heat to a medium in the heat exchanger. Design and operating data are presented. A cooling rate curve is given where the nitrogen flow rate is decreased when the plug temperature reaches 400 deg F. The time variation of minimum temperatures is given for various values of thermal capacitance with constant equilibrium temperature, and the economizer parameter with constant equilibrium temperatures and thermal capacitance. The variation in heat exchanger parameter with economizer parameter for a constant equilibrium minimum temperature of 250 deg F, and a constant inlet temperature of 750 deg F is indicated. (B.O.G.)
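A first-order feel for such a transient comes from a lumped-capacitance sketch: a single thermal mass relaxing exponentially toward an equilibrium temperature. The 750 deg F inlet, 400 deg F setpoint, and 250 deg F equilibrium are taken from the abstract; the 30-minute time constant is an invented placeholder:

```python
# Lumped-capacitance sketch of the cooldown: one thermal mass relaxing from the
# 750 deg F inlet temperature toward the 250 deg F equilibrium minimum quoted
# above. The 30-minute time constant is an assumed placeholder, not design data.
import math

def plug_temperature(t_min, T0=750.0, T_eq=250.0, tau=30.0):
    """Temperature (deg F) after t_min minutes: T_eq + (T0 - T_eq) * exp(-t/tau)."""
    return T_eq + (T0 - T_eq) * math.exp(-t_min / tau)

# The control action from the abstract: reduce nitrogen flow at 400 deg F.
t = 0.0
while plug_temperature(t) > 400.0:
    t += 0.1
print(round(t, 1), "minutes to reach the 400 deg F setpoint")
```

The real system, with its economizer and heat-exchanger coupling, has more than one time constant; this shows only the qualitative shape of the cooling-rate curve.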

In 2009, a National Academy of Sciences report called for investigation into the scientific basis behind tool mark comparisons (National Academy of Sciences, 2009). Answering this call, Chumbley et al. (2010) attempted to prove or disprove the hypothesis that tool marks are unique to a single tool. They developed a statistical algorithm that could, in most cases, discern matching and non-matching tool marks made at different angles by sequentially numbered screwdriver tips. Moreover, in the cases where the algorithm misinterpreted a pair of marks, an experienced forensics examiner could discern the correct outcome. While this research served to confirm the basic assumptions behind tool mark analysis, it also suggested that statistical analysis software could help to reduce the examiner's workload. This led to a new tool mark analysis approach, introduced in this thesis, that relies on 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. These scans are carefully cleaned to remove noise from the data acquisition process and assigned a coordinate system that mathematically defines angles and twists in a natural way. The marking process is then simulated by using a 3D graphics software package to impart rotations to the tip and take the projection of the tip's geometry in the direction of tool travel. The edge of this projection, retrieved from the 3D graphics software, becomes a virtual tool mark. Using this method, virtual marks are made at increments of 5° and compared to a scan of the evidence mark. The previously developed statistical package from Chumbley et al. (2010) performs the comparison, comparing the similarity of the geometry of both marks to the similarity that would occur due to random chance.
The resulting statistical measure of the likelihood of the match informs the examiner of the angle of the best matching virtual mark, allowing the examiner to focus his/her mark analysis on a smaller range of angles. Preliminary results are quite promising. In a study with both sides of 6 screwdriver tips and 34 corresponding marks, the method distinguished known matches from known non-matches with zero false positive matches and only two matches mistaken for non-matches. For matches, it could predict the correct marking angle within ±5-10°. Moreover, on a standard desktop computer, the virtual marking software is capable of cleaning 3D tip and plate scans in minutes and producing a virtual mark and comparing it to a real mark in seconds. These results support several of the professional conclusions of the tool mark analysis community, including the idea that marks produced by the same tool only match if they are made at similar angles. The method also displays the potential to automate part of the comparison process, freeing the examiner to focus on other tasks, which is important in busy, backlogged crime labs. Finally, the method offers the unique chance to directly link an evidence mark to the tool that produced it while reducing potential damage to the evidence.
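The angle-search loop lends itself to a compact sketch. Below, "marks" are 1-D profiles, the profile generator stands in for projecting scanned tip geometry, and plain Pearson correlation stands in for Chumbley et al.'s statistic; all of these substitutions are simplifications for illustration:

```python
# Sketch of the angle search: generate "virtual marks" at 5-degree increments
# and report the angle whose profile best matches the evidence mark.
import math

def virtual_mark(angle_deg, n=100):
    # Stand-in for projecting the scanned tip at this angle: the phase and
    # amplitude of the striation profile vary with the marking angle.
    a = math.radians(angle_deg)
    return [math.cos(a) * math.sin(0.3 * i + a) for i in range(n)]

def correlation(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys))
    return num / den

evidence = virtual_mark(35.0)        # pretend this came from the evidence scan
candidates = range(0, 90, 5)         # virtual marks at 5-degree increments
best = max(candidates, key=lambda a: correlation(virtual_mark(a), evidence))
print(best, "degrees")
```

The real method's strength is that its comparison statistic is calibrated against the similarity expected by random chance, which plain correlation does not capture.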

In this paper, the main economic indicators of the 54 national high-tech industrial parks and their cities' GDP are analyzed with clustering analysis. It is found that there is a certain degree of correlation between them. To reveal the inner ... Keywords: Cities' GDP, High-tech industrial parks' economic, Correlation Analysis, Clustering Analysis

Built Environment Energy Analysis Tool Overview Prepared by: Chris Porter Cambridge Systematics, Inc. Cambridge, Massachusetts NREL Technical Monitor: Laura Vimmerstedt March 2013 NREL/PR-6A20-58101 Built Environment Energy Analysis Tool Overview Subcontractor: Chris Porter Cambridge Systematics, Inc. 100 Cambridge Park Drive, Suite 400 Cambridge, MA 02140 Period of Performance: June 2011-February 2013 NREL Technical Monitor: Laura Vimmerstedt Prepared under Subcontract No. DGJ-1-11857-01 This publication was reproduced from the best available copy submitted by the subcontractor and received no editorial review at NREL. NOTICE This report was prepared as an account of work sponsored by an agency of the United States government. Neither the

We introduce NeedATool (Needlet Analysis Tool), software for data analysis based on needlets, a wavelet rendition that is powerful for the analysis of fields defined on a sphere. Needlets have been applied successfully to the treatment of astrophysical and cosmological observations, and in particular to the analysis of cosmic microwave background (CMB) data. Usually, such analyses are performed in real space as well as in its dual domain, the harmonic one. Both spaces have advantages and disadvantages: for example, in pixel space it is easier to deal with partial sky coverage and experimental noise; in the harmonic domain, beam treatment and comparison with theoretical predictions are more effective. During the last decade, however, wavelets have emerged as a useful tool for CMB data analysis, since they allow us to combine most of the advantages of the two spaces, one of the main reasons being their sharp localization. In this paper, we outline the analytical properties of needlets and discuss the main features of the numerical code, which should be a valuable addition to the CMB analyst's toolbox.

The Energy Policy Act of 1992 establishes a program to support development of renewable energy technologies including a production incentive to public power utilities. Because there is a wide range of possible policy actions that could be taken to increase electric market share for renewables, modeling tools are needed to help make informed decisions regarding future policy. Previous energy modeling tools did not contain the regional or infrastructure focus necessary to examine renewable technologies. As a result, the Department of Energy Office of Utility Technologies (OUT) supported the development of tools for renewable energy policy analysis. Three models were developed: the Renewable Energy Penetration (REP) model, which is a spreadsheet model for determining first-order estimates of policy effects for each of the ten federal regions; the Ten Federal Region Model (TFRM), which employs utility capacity expansion and dispatching decisions; and the Regional Electric Policy Analysis Model (REPAM), which was constructed to allow detailed insight into interactions between policy and technology within an individual region. These models were developed to provide a suite of fast, personal-computer based policy analysis tools; as one moves from the REP model to the TFRM to the REPAM, the level of detail (and complexity) increases. In 1993 a panel was formed to identify model strengths and weaknesses (including any potential biases) and to suggest potential improvements. The panel met in January 1994 to discuss model simulations and to deliberate regarding evaluation outcomes. This report is largely a result of this meeting. The report provides a description of the TFRM and summarizes the panel's findings; individual chapters examine various aspects of the model: demand and load, capacity expansion, dispatching and production costing, reliability, renewables, storage, financial and regulatory concerns, and environmental effects.

Abstract. Using intermediate-degree p-mode frequency data sets for solar cycle 22, we find that the frequency shifts and magnetic indices show a “hysteresis” phenomenon. It is observed that the magnetic indices follow different paths for the ascending and descending phases of the solar cycle; the descending path always seems to follow a higher track than the ascending one. However, for the radiative indices, the paths cross each other, indicating phase reversal.

Traditionally, building simulation models are used at the design phase of a building project. These models are used to optimize various design alternatives and to reduce energy consumption and cost. Building performance assessment for the operational phase of a building's life cycle is sporadic, typically working from historical metered data and focusing on bulk energy assessment. Building Management Systems (BMS) do not explicitly incorporate feedback to the design phase or account for any changes made to building layout or fabric during construction. This paper discusses a proposal to develop an Industry Foundation Classes (IFC) compliant data visualization tool, Building Performance Indicator (BuildingPI), for performance metric and performance effectiveness ratio evaluation.

Data analysis at the DIII-D National Fusion Facility is simplified by the use of two software packages in analysis codes. The first is GAPlotObj, an IDL-based object-oriented library used in visualization tools for dynamic plotting. GAPlotObj gives users the ability to manipulate graphs directly through mouse and keyboard-driven commands. The second software package is MDSplus, which is used at DIII-D as a central repository for analyzed data. GAPlotObj and MDSplus reduce the effort required for a collaborator to become familiar with the DIII-D analysis environment by providing uniform interfaces for data display and retrieval. Two visualization tools at DIII-D that benefit from them are ReviewPlus and EFITviewer. ReviewPlus is capable of displaying interactive 2D and 3D graphs of raw, analyzed, and simulation code data. EFITviewer is used to display results from the EFIT analysis code together with kinetic profiles and machine geometry. Both bring new possibilities for data exploration to the user, and are able to plot data from any fusion research site with an MDSplus data server.

PV Installation Labor Market Analysis and PV JEDI Tool Developments. Barry Friedman, NREL Strategic Energy Analysis Center, May 16, 2012, World Renewable Energy Forum, Denver, Colorado. NREL/PR-6A20-55130 NATIONAL RENEWABLE ENERGY LABORATORY Disclaimer DISCLAIMER AGREEMENT These information ("Data") are provided by the National Renewable Energy Laboratory ("NREL"), which is operated by the Alliance for Sustainable Energy LLC ("Alliance") for the U.S. Department of Energy (the "DOE"). It is recognized that disclosure of these Data is provided under the following conditions and warnings: (1) these Data have been prepared for reference purposes only; (2) these Data consist of forecasts, estimates or assumptions made on a best-

DOE JGI's Kostas Mavrommatis, chair of the Scalability of Comparative Analysis, Novel Algorithms and Tools panel, at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

Agricultural energy consumption is an important environmental and social issue. Several diagnoses have been proposed to define indicators for analyzing energy consumption at large scale of agricultural farm activities (year, farm, family of production, ... Keywords: energetic indicators, spatial OLAP, spatial data warehouses

A 3D simulation tool for modeling solid oxide fuel cells is described. The tool combines the versatility and efficiency of a commercial finite element analysis code, MARC®, with an in-house developed robust and flexible electrochemical (EC) module. Based upon characteristic parameters obtained experimentally and assigned by the user, the EC module calculates the current density distribution, heat generation, and fuel and oxidant species concentration, taking the temperature profile provided by MARC® and operating conditions such as the fuel and oxidant flow rate and the total stack output voltage or current as the input. MARC® performs flow and thermal analyses based on the initial and boundary thermal and flow conditions and the heat generation calculated by the EC module. The main coupling between MARC® and EC is for MARC® to supply the temperature field to EC and for EC to give the heat generation profile to MARC®. The loosely coupled, iterative scheme is advantageous in terms of memory requirement, numerical stability and computational efficiency. The coupling is iterated to self-consistency for a steady-state solution. Sample results for steady states as well as the startup process for stacks with different flow designs are presented to illustrate the modeling capability and numerical performance characteristics of the simulation tool.
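The loosely coupled, iterative scheme can be sketched as a fixed-point loop between two solvers: exchange temperature and heat generation until they stop changing. Both response functions and every coefficient below are invented placeholders, not MARC or the real EC module:

```python
# Fixed-point sketch of the loosely coupled scheme: a stand-in "thermal solver"
# maps heat generation to temperature, a stand-in "EC module" maps temperature
# to heat generation, and the pair is iterated to self-consistency.

def thermal_solver(q_gen):
    return 900.0 + 0.05 * q_gen          # K, fictitious linear thermal response

def ec_module(temperature):
    return 2000.0 - 1.0 * temperature    # W, fictitious electrochemical response

temperature, q_gen = 900.0, 0.0
for _ in range(200):
    q_new = ec_module(temperature)       # EC module takes the temperature field
    t_new = thermal_solver(q_new)        # thermal solver takes the heat profile
    converged = abs(t_new - temperature) < 1e-9
    temperature, q_gen = t_new, q_new
    if converged:                        # steady state: the fields stop changing
        break

print(round(temperature, 3), "K,", round(q_gen, 3), "W")
```

Because neither solver needs the other's internal state, only the exchanged fields, this structure keeps memory requirements low, which is the advantage the abstract attributes to loose coupling.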

Design improvements made for downhole thermal protection of systems based on results obtained from the analysis of the electronics, heat sink, and dewar packaged in a steel tubular body are described. Results include heat flux at the tool surface, temperature-time histories of each subsystem and isotherm contour plots during the simulation. The analysis showed that the thermal potential between the electronics and the heat sink was in the wrong direction and also was too small to remove heat entering the electronics section. Also, the conductance of the available heat transfer paths from electronics to heat sink was too small to remove that heat efficiently. Significant improvements in survival at high temperatures were achieved by increasing the available thermal capacity of the heat sink, increasing the thermal potential between the heat sink and electronics, and vastly increasing the conductance of the heat transfer paths.

Analysis & Tools to Spur Increased Deployment of "Waste Heat" Rejection/Recycling Hybrid GHP Systems in Hot, Arid or Semiarid Climates Like Texas (Geothermal Project). Last modified on July 22, 2011. Project Type / Topic 1: Recovery Act - Geothermal Technologies Program: Ground Source Heat Pumps. Project Type / Topic 2: Topic Area 2: Data Gathering and Analysis. Project Description: As GHP systems offer substantial energy efficiency by leveraging earth's intrinsic thermal capacitance, they could play a pivotal role in achieving the DOE Building Technologies Program's "zero energy" goal in heavily cooling-dominated climates. Moreover, SHR-augmented GHP systems, in particular, could play a vital role in reducing building energy consumption and limiting greenhouse gas (GHG) emissions in heavily cooling-dominated states, like Texas, which are experiencing large increases in population and, correspondingly, peak electricity demand. If only 0.1% of Texas', Arizona's, New Mexico's and Nevada's nearly 15 million homes (i.e., 15,000 homes) were to install a new full or hybrid GHP system (or convert their existing HVAC or heat pump system), it would result in between $400 and $800 million USD of new economic activity, most of which would be domestic. Moreover, these 15,000 homes would cut their annual energy consumption, and concomitant GHG emissions, by roughly 40-70%; on average they would save about $1,000 USD in annual operating costs, collectively saving about $15 million USD annually. A conservative GHP industry estimate is that at least 900 people would be directly employed for every 10,000 GHP units installed.

The Energy Policy Act of 1992 establishes a program to support development of renewable energy technologies including a production incentive to public power utilities. Because there is a wide range of possible policy actions that could be taken to increase electric market share for renewables, modeling tools are needed to help make informed decisions regarding future policy. Previous energy modeling tools did not contain the regional or infrastructure focus necessary to examine renewable technologies. As a result, the Department of Energy Office of Utility Technologies (OUT) supported the development of tools for renewable energy policy analysis. Three models were developed: the Renewable Energy Penetration (REP) model, which is a spreadsheet model for determining first-order estimates of policy effects for each of the ten federal regions; the Ten Federal Region Model (TFRM), which employs utility capacity expansion and dispatching decisions; and the Regional Electric Policy Analysis Model (REPAM), which was constructed to allow detailed insight into interactions between policy and technology within an individual region. In 1993, the OUT supported the Oak Ridge Institute of Science and Education (ORISE) to form an expert panel to provide an independent review of the REP model and TFRM. This report contains the panel's evaluation of the REP model; the TFRM is evaluated in a companion report. The panel did not review the REPAM. The panel met for a second time in January 1994 to discuss model simulations and deliberate regarding evaluation outcomes. This report is largely a result of this second meeting. The remainder of this chapter provides a description of the REP model and summarizes the panel's findings. Individual chapters examine various aspects of the model: demand and load, capacity expansion, dispatching and production costing, reliability, renewables, storage, transmission, financial and regulatory concerns, and environmental effects.

The use of lightweight and highly formable advanced materials in automobile and truck manufacturing has the potential to save fuel. Advances in tooling technology would promote the use of these materials. This report describes an energy savings analysis performed to approximate the potential fuel savings and consequential carbon-emission reductions that would be possible because of advances in tooling in the manufacturing of, in particular, non-powertrain components of passenger cars and heavy trucks. Separate energy analyses are performed for cars and heavy trucks. Heavy trucks are considered to be Class 7 and 8 trucks (trucks rated over 26,000 lbs gross vehicle weight). A critical input to the analysis is a set of estimates of the percentage reductions in weight and drag that could be achieved by the implementation of advanced materials, as a consequence of improved tooling technology, which were obtained by surveying tooling industry experts who attended a DOE Workshop, Tooling Technology for Low-Volume Vehicle Production, held in Seattle and Detroit in October and November 2003. The analysis is also based on 2001 fuel consumption totals and on energy-audit component proportions of fuel use due to drag, rolling resistance, and braking. The consumption proportions are assumed constant over time, but an allowance is made for fleet growth. The savings for a particular component is then the product of total fuel consumption, the percentage reduction of the component, and the energy audit component proportion. Fuel savings estimates for trucks also account for weight-limited versus volume-limited operations. Energy savings are assumed to be of two types: (1) direct energy savings incurred through reduced forces that must be overcome to move the vehicle or to slow it down in braking. 
and (2) indirect energy savings through reductions in the required engine power, the production and transmission of which incur thermodynamic losses, internal friction, and other inefficiencies. Total savings for an energy use component are estimated by scaling up the direct savings with an approximate total-to-direct savings ratio. Market penetration for new technology vehicles is estimated from projections about scrappage. Retrofit savings are assumed negligible, but savings are also assumed to accrue with increases in the fleet size, based on economic growth forecasts. It is assumed that as vehicles in the current fleet are scrapped, they are replaced with advanced-technology vehicles. Savings estimates are based on proportions of new vehicles, rather than new-vehicle mileages. In practice, of course, scrapped vehicles are often replaced with used vehicles, and used vehicles are replaced with new vehicles. Because new vehicles are typically driven more than old, savings estimates based on count rather than mileage proportions tend to be biased down (i.e., conservative). Savings are expressed in terms of gallons of fuel saved, metric tons of CO2 emissions reductions, and percentages relative to 2001 levels of fuel and CO2. The sensitivity of the savings projections to inputs such as energy-audit proportions of fuel consumed for rolling resistance, drag, braking, etc. is assessed by considering different scenarios. Though based on many approximations, the estimates indicate the potential energy savings achievable through improvements in tooling. For heavy trucks, annual diesel savings of 2.4-6.8 percent, and cumulative savings on the order of 54-154 percent, of 2001 consumption could accrue by 2050. By 2050, annual gasoline savings of 2.8-12 percent, and cumulative savings on the order of 83-350 percent of 2001 consumption could accrue for cars.
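The savings arithmetic just described, direct savings as the product of total consumption, component percentage reduction, and energy-audit proportion, scaled by a total-to-direct ratio to capture indirect engine losses, can be sketched as follows. The function name and the example figures are illustrative, not taken from the report:

```python
def component_fuel_savings(total_fuel_gal, pct_reduction, audit_proportion,
                           total_to_direct_ratio=1.0):
    """Estimated fuel savings for one energy-use component (e.g., drag).

    direct savings = total consumption x component reduction x audit share
    total savings  = direct savings x total-to-direct ratio (captures the
                     indirect savings from reduced required engine power)
    """
    direct = total_fuel_gal * pct_reduction * audit_proportion
    return direct * total_to_direct_ratio

# Hypothetical example: 25 billion gallons of diesel, a 10% drag reduction,
# drag responsible for 20% of fuel use, and a total-to-direct ratio of 1.5.
saved = component_fuel_savings(25e9, 0.10, 0.20, 1.5)
```

Summing such terms over components (drag, rolling resistance, braking) and scaling by fleet-penetration proportions gives the kind of annual savings estimate the report describes.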

The purpose of this report was to review pertinent literature and studies that might reveal models capable of optimizing the siting, sizing and economic value of energy storage in the future smart grid infrastructure. Energy storage technology and utility system deployment have been subjects of intense research and development for over three decades. During this time, many models have been developed that consider energy storage implementation in the electric power industry and other applications. Nevertheless, this review of literature discovered no actual models and only a few software tools that relate specifically to the application environment and expected requirements of the evolving smart grid infrastructure. This report indicates the existing need for such a model and describes a pathway for developing it.

The major challenges in implementing sustainability improvements are financial constraints and uncertainty. The traditional financial budgeting approach that is commonly used to evaluate sustainable projects normally neglects future decisions that might need to be made over the course of a project. The real options approach has been suggested as a tool for strategic decision making because it provides flexibility, which can increase project value. Researchers have been trying to identify the potential of the real options approach and to provide frameworks for real options evaluation and flexible strategy in sustainability improvement. However, some important variables are missing from existing models, and the financial impacts of real options are not well explained. Models can be improved to show the variation of possible project values and their behavior over time. This work aims to improve the real options model for sustainable projects, to provide understanding of the financial impacts of flexible strategies on sustainability improvement projects, and to serve as a tool to assist decision making. The results showed that real options can have a positive financial impact on a project. Extensions of this model can assist the analysis and development of decision policies.
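As an illustration of how a real options valuation quantifies the value of flexibility, the sketch below prices an American-style option to invest an amount K in a project whose value evolves with volatility sigma, using a standard Cox-Ross-Rubinstein binomial lattice. This is a generic textbook construction, not the specific model developed in the work described above:

```python
import math

def real_option_value(V0, K, r, sigma, T, steps):
    """Value of the flexibility to invest K in a project worth V0 today,
    via a Cox-Ross-Rubinstein binomial lattice (sigma must be > 0)."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))   # up factor per step
    d = 1.0 / u                           # down factor per step
    p = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
    disc = math.exp(-r * dt)
    # Payoffs at the final lattice layer: invest only if value exceeds cost.
    values = [max(V0 * u**j * d**(steps - j) - K, 0.0)
              for j in range(steps + 1)]
    # Backward induction, allowing early exercise at every node.
    for n in range(steps - 1, -1, -1):
        values = [max(disc * (p * values[j + 1] + (1 - p) * values[j]),
                      V0 * u**j * d**(n - j) - K)
                  for j in range(n + 1)]
    return values[0]
```

Because the option payoff is bounded below by zero, higher volatility raises the option value, which is the sense in which flexibility becomes more valuable under greater uncertainty.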

This report summarizes work carried out by the Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) Team for the period of January 1, 2011 through June 30, 2011. It discusses highlights, overall progress, period goals, and collaborations and lists papers and presentations. To learn more about our project, please visit our UV-CDAT website (URL: http://uv-cdat.org). This report will be forwarded to the program manager for the Department of Energy (DOE) Office of Biological and Environmental Research (BER), national and international collaborators and stakeholders, and to researchers working on a wide range of other climate model, reanalysis, and observation evaluation activities. The UV-CDAT executive committee consists of Dean N. Williams of Lawrence Livermore National Laboratory (LLNL); Dave Bader and Galen Shipman of Oak Ridge National Laboratory (ORNL); Phil Jones and James Ahrens of Los Alamos National Laboratory (LANL); Claudio Silva of Polytechnic Institute of New York University (NYU-Poly); and Berk Geveci of Kitware, Inc. The UV-CDAT team consists of researchers and scientists with diverse domain knowledge whose home institutions also include the National Aeronautics and Space Administration (NASA) and the University of Utah. All work is accomplished under DOE open-source guidelines and in close collaboration with the project's stakeholders, domain researchers, and scientists. Working directly with BER climate science analysis projects, this consortium will develop and deploy data and computational resources useful to a wide variety of stakeholders, including scientists, policymakers, and the general public. Members of this consortium already collaborate with other institutions and universities in researching data discovery, management, visualization, workflow analysis, and provenance. 
The UV-CDAT team will address the following high-level visualization requirements: (1) Alternative parallel streaming statistics and analysis pipelines - Data parallelism, Task parallelism, Visualization parallelism; (2) Optimized parallel input/output (I/O); (3) Remote interactive execution; (4) Advanced intercomparison visualization; (5) Data provenance processing and capture; and (6) Interfaces for scientists - Workflow data analysis and visualization construction tools, and Visualization interfaces.

A system to capture US Air Force financial expertise is currently under development. To accomplish its mission of costing Air Force programs, the Director of the Budget utilizes the services of budget analysts. Analysts are required to prepare three cost exercises during the year: the Program Objective Memorandum (POM), the Budget Estimation Submission (BES), and the President's Budget (PB). Additionally, during budget calls and budget execution, analysts must make decisions regarding funding obligations and outlays. Analysis and modification of outlay projection plans require the use of operational and procedural heuristics. These problem-solving steps followed by analysts are being used to develop a prototype. Information regarding the prototype, the tool, and its capabilities is presented. Additional features of the program are described, including integration with existing information management systems.

In development projects, designers should take into consideration the possibility of a vapor cloud explosion in the siting and design of a process plant from day one. The most important decisions pertinent to the location of different process areas, separation between different areas, location of occupied buildings and overall layout may be made at the conceptual stage of the project. During the detailed design engineering stage the final calculation of gas explosion loads is an important activity. However, decisions related to the layout and location of occupied buildings at this stage could be very costly. Therefore, at the conceptual phase of the development project for a hydrocarbon facility, it would be helpful to get a picture of possible vapor cloud explosion loads to be used in studying various options. This thesis presents the analytical parameters that are used in vapor cloud explosion risk analysis. It proposes a model structure for the analysis of vapor cloud explosion risks to buildings based on exceedance methodology. This methodology was developed in a computer program which is used to support this thesis. The proposed model considers all possible gas release scenarios through the use of Monte Carlo simulation. The risk of vapor cloud explosions can be displayed using exceedance curves. The resulting model provides a predictive tool for vapor cloud explosion problems at the early stages of development projects, particularly in siting occupied buildings in onshore hydrocarbon facilities. It can also be used as a quick analytical tool for investigating various aspects of vapor cloud explosions. This model has been applied to a case study, a debutanizer process unit. The model was used to explore the different alternatives of locating a building near the facility. The results from the model were compared to the results of other existing software to determine the model's validity. 
The results show that the model can effectively examine the risk of vapor cloud explosions.
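The exceedance methodology can be illustrated with a minimal Monte Carlo sketch: simulate many release scenarios, then report, for each load level, the fraction of scenarios whose overpressure exceeds it. The lognormal load distribution below is purely hypothetical and stands in for the thesis's physical consequence models:

```python
import random

def exceedance_curve(loads, thresholds):
    """Fraction of simulated scenarios whose load exceeds each threshold."""
    n = len(loads)
    return [sum(1 for x in loads if x > t) / n for t in thresholds]

# Hypothetical overpressures (bar) from 10,000 random release scenarios.
random.seed(1)
loads = [random.lognormvariate(-2.0, 0.8) for _ in range(10_000)]
curve = exceedance_curve(loads, [0.05, 0.10, 0.20, 0.50])
# The curve is non-increasing: larger loads are exceeded less frequently.
```

In a full analysis the sampled loads would come from release-rate, dispersion, and ignition sub-models, and the exceedance frequencies would be weighted by scenario likelihoods to yield an annualized risk curve for a candidate building location.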

This article presents an analysis of about 29,000 measurements of gamma radiation associated with the decay of radon in a sealed container at the Geological Survey of Israel (GSI) Laboratory in Jerusalem between 28 January 2007 and 10 May 2010. These measurements exhibit strong variations with time of year and time of day, which may be due in part to environmental influences. However, time-series analysis reveals a number of periodicities, including two at approximately 11.2 year⁻¹ and 12.5 year⁻¹. We have previously found these oscillations in nuclear-decay data acquired at the Brookhaven National Laboratory (BNL) and at the Physikalisch-Technische Bundesanstalt (PTB), and we have suggested that these oscillations are attributable to some form of solar radiation that has its origin in the deep solar interior. A curious property of the GSI data is that the annual oscillation is much stronger in daytime data than in nighttime data, but the opposite is true for all other oscillations. This may be a systematic effect but, if it is not, this property should help narrow the theoretical options for the mechanism responsible for decay-rate variability.
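A minimal sketch of the kind of periodicity search used in such time-series analyses: project the mean-subtracted series onto sine and cosine terms at a trial frequency, in the spirit of a Lomb-Scargle periodogram. The synthetic series below is illustrative, not the GSI data:

```python
import math

def periodogram_power(times, values, freq):
    """Power at trial frequency `freq` (cycles per unit time): squared
    projection of the mean-subtracted series onto sin/cos terms."""
    n = len(values)
    mean = sum(values) / n
    c = sum((v - mean) * math.cos(2 * math.pi * freq * t)
            for t, v in zip(times, values))
    s = sum((v - mean) * math.sin(2 * math.pi * freq * t)
            for t, v in zip(times, values))
    return (c * c + s * s) / n

# Synthetic 3-year daily record (time in years) with a weak annual cycle.
t = [i / 365.25 for i in range(3 * 365)]
y = [10.0 + 0.5 * math.sin(2 * math.pi * ti) for ti in t]
# The power at 1 cycle/yr stands far above off-resonance frequencies.
```

Scanning `freq` over a grid and comparing peak powers against a noise background is how periodicities such as the reported 11.2 yr⁻¹ and 12.5 yr⁻¹ signals are identified.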

Surface (0-40 cm) soil organic carbon (SOC) dynamics were studied beneath four switchgrass (Panicum virgatum L.) field trials in the southeastern US. Soil organic carbon was partitioned into particulate organic matter (POM) and mineral-associated organic matter (MOM). Most (75-90%) of the SOC at each study site was affiliated with MOM (<0.053 mm). Changes in stable carbon isotope ratios were used to derive carbon inputs to and losses from POM and MOM at each site. Inventories of existing SOC and new C4-derived SOC beneath switchgrass decreased with increasing soil depth. Approximately 5 yr after establishment, 19 to 31% of the existing SOC inventories beneath switchgrass had been derived from new C4-carbon inputs. Calculated turnover times of POM and MOM ranged from 2.4 to 4.3 yr and 26 to 40 yr, respectively. The turnover time of SOC in the POM fraction increased with decreasing mean annual temperature. A simple, two-compartment model was parameterized to predict the potential for soil carbon sequestration under switchgrass. An example calculation with the model indicated a measurable and verifiable recovery of soil carbon (≈12% increase) on degraded lands through one decade of switchgrass production. The potential to sequester carbon through switchgrass cultivation will depend on initial soil carbon inventories, prevailing climate, soil types, and site management.
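A two-compartment model of the general kind mentioned above can be sketched as follows. The pool structure, the transfer fraction, and the annual time step are simplifying assumptions for illustration, not the paper's actual parameterization; only the contrast between fast POM turnover (a few years) and slow MOM turnover (decades) follows the abstract:

```python
def soc_pools(pom0, mom0, annual_input, pom_tau, mom_tau, transfer_frac, years):
    """Annual-step, two-pool soil carbon model.

    Carbon inputs enter the fast POM pool; a fraction of the carbon
    leaving POM is transferred to the slow MOM pool, and each pool
    decays at a rate of 1/tau per year.
    """
    pom, mom = pom0, mom0
    for _ in range(years):
        pom_loss = pom / pom_tau
        mom_loss = mom / mom_tau
        pom += annual_input - pom_loss
        mom += transfer_frac * pom_loss - mom_loss
    return pom, mom

# With steady inputs, the pools approach input*pom_tau (POM) and
# transfer_frac*input*mom_tau (MOM), on very different timescales.
```

Fitting such a model to isotope-derived input and loss estimates is what allows projections like the roughly 12% carbon recovery over a decade cited above.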

MapSearch: Searching for maps has never been easier. NREL's Geographic Information System (GIS) Team has developed data visualization and geospatial tools that allow users to apply these data. These tools help determine things such as how much electricity can be produced from solar systems on a house or what renewable resources are available in a specific area. Please visit http://maps.nrel.gov/ for the most current list of available NREL GIS tools. If you have difficulty using these tools because of a disability, please contact the Webmaster. General interactive mapping tools: access RE Atlas, Solar Power Prospector, PVWatts, and other popular tools that dynamically generate maps of renewable energy resources.

Recurrence Plot (RP) and Recurrence Quantification Analysis (RQA) are numerical signal-analysis methodologies able to handle nonlinear dynamical systems and nonstationarity. Moreover, they clearly reveal changes in the states of a dynamical system. We recall their features and give practical recipes. It is shown that RP and RQA detect the critical regime in financial indices (in analogy with a phase transition) before a bubble bursts, thereby allowing estimation of the bubble's initial time. The analysis is made on DAX and NASDAQ daily closing prices between Jan. 1998 and Nov. 2003. DAX is studied in order to set up overall considerations and as a support for deducing technical rules. The NASDAQ bubble's initial time has been estimated to be Oct. 19, 1999.
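The core RP/RQA computation can be sketched in a few lines: threshold the pairwise distances between states to obtain a binary recurrence matrix, then summarize it, here with the simplest RQA measure, the recurrence rate. A 1-D embedding is used for brevity; a real analysis would use time-delay embedding of the price series:

```python
def recurrence_matrix(series, eps):
    """Binary recurrence matrix: R[i][j] = 1 when states i and j lie
    within distance eps of each other (1-D embedding for simplicity)."""
    n = len(series)
    return [[1 if abs(series[i] - series[j]) < eps else 0 for j in range(n)]
            for i in range(n)]

def recurrence_rate(matrix):
    """Simplest RQA measure: the density of recurrence points."""
    n = len(matrix)
    return sum(sum(row) for row in matrix) / (n * n)
```

Tracking RQA measures such as the recurrence rate or determinism in a sliding window over the index is the kind of procedure that can flag a change of regime before a bubble bursts.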

Purpose: Dose-volume response data for breast cancer radiotherapy (RT) are generally lacking. The purpose of this work is to develop a database and software tools to facilitate the analyses of short- and long-term radiation dose-volume responses of breast cancer RT. Method and Materials: As a part of the project aiming to develop the Research Analysis Platform and IGRT Databases (RAPID)

The U.S. Department of Energy (DOE) has an interest in large scale hydrogen geostorage, which could offer substantial buffer capacity to meet possible disruptions in supply or changing seasonal demands. The geostorage site options being considered are salt caverns, depleted oil/gas reservoirs, aquifers and hard rock caverns. The DOE has an interest in assessing the geological, geomechanical and economic viability for these types of geologic hydrogen storage options. This study has developed an economic analysis methodology and subsequent spreadsheet analysis to address costs entailed in developing and operating an underground geologic storage facility. This year the tool was updated specifically to (1) incorporate more site-specific model input assumptions for the wells and storage site modules, (2) develop a version that matches the general format of the HDSAM model developed and maintained by Argonne National Laboratory, and (3) incorporate specific demand scenarios illustrating the model's capability. Four general types of underground storage were analyzed: salt caverns, depleted oil/gas reservoirs, aquifers, and hard rock caverns/other custom sites. Due to the substantial lessons learned from the geological storage of natural gas already employed, these options present a potentially sizable storage option. Understanding and including these various geologic storage types in the physical and economic framework of the analysis will help identify which geologic option would be best suited for the storage of hydrogen. It is important to note, however, that existing natural gas options may not translate to a hydrogen system, where substantial engineering obstacles may be encountered. There are only three locations worldwide that currently store hydrogen underground and they are all in salt caverns. Two locations are in the U.S. (Texas), and are managed by ConocoPhillips and Praxair (Leighty, 2007). 
The third is in Teesside, U.K., managed by Sabic Petrochemicals (Crotogino et al., 2008; Panfilov et al., 2006). These existing H2 facilities are quite small by natural gas storage standards. The second stage of the analysis involved providing ANL with estimated geostorage costs of hydrogen within salt caverns for various market penetrations for four representative cities (Houston, Detroit, Pittsburgh and Los Angeles). Using these demand levels, the scale and cost of hydrogen storage necessary to meet 10%, 25% and 100% of vehicle summer demands was calculated.

MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor nuclear power plants, and is being developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories (SNL). Brookhaven National Laboratory (BNL) has a program with the NRC called "MELCOR Verification, Benchmarking, and Applications," whose aim is to provide independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool. The scope of this program is to perform quality control verification on all released versions of MELCOR, to benchmark MELCOR against more mechanistic codes and experimental data from severe fuel damage tests, and to evaluate the ability of MELCOR to simulate long-term severe accident transients in commercial LWRs, by applying the code to model both BWRs and PWRs. Under this program, BNL provided input to the NRC-sponsored MELCOR Peer Review, and is currently contributing to the MELCOR Cooperative Assessment Program (MCAP). This paper presents a summary of MELCOR assessment efforts at BNL and their contribution to NRC goals with respect to MELCOR.

There exist hundreds of building energy software tools, both web- and disk-based. These tools exhibit considerable range in approach and creativity, with some being highly specialized and others able to consider the building as a whole. However, users are faced with a dizzying array of choices and, often, conflicting results. The fragmentation of development and deployment efforts has hampered tool quality and market penetration. The purpose of this review is to provide information for defining the desired characteristics of residential energy tools, and to encourage future tool development that improves on current practice. This project entails (1) creating a framework for describing possible technical and functional characteristics of such tools, (2) mapping existing tools onto this framework, (3) exploring issues of tool accuracy, and (4) identifying "best practice" and strategic opportunities for tool design. We evaluated 50 web-based residential calculators, 21 of which we regard as "whole-house" tools (i.e., covering a range of end uses). Of the whole-house tools, 13 provide open-ended energy calculations, 5 normalize the results to actual costs (a.k.a. "bill-disaggregation tools"), and 3 provide both options. Across the whole-house tools, we found a range of 5 to 58 house-descriptive features (out of 68 identified in our framework) and 2 to 41 analytical and decision-support features (55 possible). We also evaluated 15 disk-based residential calculators, six of which are whole-house tools. Of these tools, 11 provide open-ended calculations, 1 normalizes the results to actual costs, and 3 provide both options. These tools offered ranges of 18 to 58 technical features (70 possible) and 10 to 40 user- and decision-support features (56 possible). The comparison shows that such tools can employ many approaches and levels of detail. 
Some tools require a relatively small number of well-considered inputs while others ask a myriad of questions and still miss key issues. The value of detail has a lot to do with the type of question(s) being asked by the user (e.g., the availability of dozens of miscellaneous appliances is immaterial for a user attempting to evaluate the potential for space-heating savings by installing a new furnace). More detail does not, according to our evaluation, automatically translate into a "better" or "more accurate" tool. Efforts to quantify and compare the "accuracy" of these tools are difficult at best, and prior tool-comparison studies have not undertaken this in a meaningful way. The ability to evaluate accuracy is inherently limited by the availability of measured data. Furthermore, certain tool outputs can only be measured against "actual" values that are themselves calculated (e.g., HVAC sizing), while others are rarely if ever available (e.g., measured energy use or savings for specific measures). Similarly challenging is to understand the sources of inaccuracies. There are many ways in which quantitative errors can occur in tools, ranging from programming errors to problems inherent in a tool's design. Due to hidden assumptions and non-variable "defaults", most tools cannot be fully tested across the desirable range of building configurations, operating conditions, weather locations, etc. Many factors conspire to confound performance comparisons among tools. Differences in inputs can range from weather city, to types of HVAC systems, to appliance characteristics, to occupant-driven effects such as thermostat management. Differences in results would thus no doubt emerge from an extensive comparative exercise, but the sources or implications of these differences for the purposes of accuracy evaluation or tool development would remain largely unidentifiable (especially given the paucity of technical documentation available for most tools). 
For the tools that we tested, the predicted energy bills for a single test building ranged widely (by nearly a factor of three), and far more so at the end-use level. Most tools over-predicted energy bills and all over-predicted consumption.

The purpose of this report was to review pertinent literature and studies to identify current state-of-the-art models and analytical tools that optimize the siting, sizing, and economic value of energy storage in a smart grid infrastructure. In recent decades, research and development has significantly improved the cost and reliability of energy storage systems. However, a relatively small percentage of that work has focused on engineering tools for integrating energy storage into existing or future electric grids. This literature review revealed that only a small number of software tools exist, and that those tools only partially address the needs for placement, sizing, and overall control strategies of stationary energy storage within a smart grid infrastructure. None of the tools comprehensively captures the benefits of energy storage, which would reveal all of the potential values. None of the tools or models provides optimization features that identify optimal placement and sizing options within a transmission or distribution system context. This review identifies a need for tool development to fill this gap in grid analytics and recommends guiding principles for advancing the analytical capabilities needed by the engineering and grid planning communities.

MapSearch: Use our MapSearch to easily search our collection of maps created by the Geographic Information System (GIS) team. Please use the search box and the filters on the left and right of the screen to limit results. Notice: The current tool works best in Firefox and may produce errors if opened in Microsoft Internet Explorer. June 2013 - The NREL GIS team has released a new beta version of the MapSearch tool. This new beta version should eliminate some of the browser issues experienced with the current tool. The beta version is designed to work with NREL's OpenEI, so users will have one site to search and view NREL-created maps. If you have any feedback or comments on this new beta site, contact the Webmaster. While testing is done on this new beta version, the current MapSearch tool is still available. The following instructions apply to the current tool.

The market for small wind systems in the United States, often defined as systems less than or equal to 100 kW that produce power on the customer side of the meter, is small but growing steadily. The installed capacity of domestic small wind systems in 2002 was reportedly 15-18 MW, though the market is estimated to be growing by as much as 40 percent annually (AWEA, 2002). This growth is driven in part by recent technology advancements and cost improvements and, perhaps more importantly, by favorable policy incentives targeted at small wind systems that are offered in several states. Currently, over half of all states have incentive policies for which residential small wind installations are eligible. These incentives range from low-interest loan programs and various forms of tax advantages to cash rebates that cover as much as 60 percent of the total system cost for turbines 10 kW or smaller installed in residential applications. Most of these incentives were developed to support a range of emerging renewable technologies (most notably photovoltaic systems), and were therefore not specifically designed with small wind systems in mind. As such, the question remains as to which incentive types provide the greatest benefit to small wind systems, and how states might appropriately set the level and type of incentives in the future. Furthermore, given differences in incentive types and levels across states, as well as variations in retail electricity rates and other relevant factors, it is not immediately obvious which states offer the most promising markets for small wind turbine manufacturers and installers, as well as potential residential system owners. This paper presents results from a Berkeley Lab analysis of the impact of existing and proposed state and federal incentives on the economics of grid-connected, residential small wind systems. 
Berkeley Lab has designed the Small Wind Analysis Tool (SWAT) to compare system economics under current incentive structures across all 50 states. SWAT reports three metrics to characterize residential wind economics in each state and wind resource class: (1) Break-Even Turnkey Cost (BTC): The BTC is defined as the aggregate installed system cost that would balance total customer payments and revenue over the life of the system, allowing the customer to "break even" while earning a specified rate of return on the small wind "investment." (2) Simple Payback (SP): The SP is the number of years it takes a customer to recoup a cash payment for a wind system and all associated costs, assuming zero discount on future revenue and payments (i.e., ignoring the time value of money). (3) Levelized Cost of Energy (LCOE): The LCOE is the levelized cost of generating a kWh of electricity over the lifetime of the system, and is calculated assuming a cash purchase for the small wind system and a 5.5 percent real discount rate. This paper presents SWAT results for a 10 kW wind turbine; turbine power production is based on a Bergey Excel system. These results are not directly applicable to turbines with different power curves and rated outputs, especially given the fact that many state incentives are set as a fixed dollar amount, and the dollar per Watt amount will vary based on the total rated turbine capacity.
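Two of the three SWAT metrics follow directly from their definitions above. The capital-recovery-factor form of the LCOE and the example figures are assumptions for illustration, not SWAT's internal implementation:

```python
def simple_payback(installed_cost, annual_kwh, retail_rate, annual_om=0.0):
    """Years to recoup a cash purchase, ignoring the time value of money."""
    net_annual_savings = annual_kwh * retail_rate - annual_om
    return installed_cost / net_annual_savings

def lcoe(installed_cost, annual_kwh, lifetime_yr, discount_rate, annual_om=0.0):
    """Levelized cost of energy via a capital recovery factor (CRF):
    annualized capital plus O&M, divided by annual generation."""
    crf = (discount_rate * (1 + discount_rate) ** lifetime_yr
           / ((1 + discount_rate) ** lifetime_yr - 1))
    return (installed_cost * crf + annual_om) / annual_kwh
```

For instance, with a hypothetical $40,000 installed cost, 12,000 kWh/yr of production, and a $0.10/kWh retail rate, the simple payback is about 33 years, which illustrates why incentives that reduce the installed cost matter so much for small wind economics.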

This report documents efforts to develop a computer tool for modeling the economic payback for comparative airport ground support equipment (GSE) that are propelled by either electric motors or gasoline and diesel engines. The types of GSE modeled are pushback tractors, baggage tractors, and belt loaders. The GSE modeling tool includes an emissions module that estimates the amount of tailpipe emissions saved by replacing internal combustion engine GSE with electric GSE. This report contains modeling assumptions, methodology, a user’s manual, and modeling results. The model was developed based on the operations of two airlines at four United States airports.
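The economic payback comparison such a tool performs can be reduced to a sketch: the electric unit's purchase-price premium is recovered through lower hourly energy and maintenance costs. All parameter names and figures here are illustrative, not drawn from the report's airline data:

```python
def gse_payback_years(price_premium, fuel_cost_per_hr, elec_cost_per_hr,
                      maint_savings_per_hr, hours_per_yr):
    """Years for an electric GSE unit's operating savings to offset its
    purchase-price premium over a comparable diesel or gasoline unit."""
    hourly_savings = (fuel_cost_per_hr - elec_cost_per_hr) + maint_savings_per_hr
    return price_premium / (hourly_savings * hours_per_yr)

# Hypothetical baggage tractor: $20,000 premium, $8/hr diesel vs. $2/hr
# electricity, $1/hr maintenance savings, 1,000 operating hours per year.
years = gse_payback_years(20_000, 8.0, 2.0, 1.0, 1_000)
```

A full model like the one described would additionally discount future savings, account for battery and charger replacement, and pair the cost result with the tailpipe emissions avoided.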

Nowadays, most of the main companies in the vertical transport industry are researching tools capable of providing support for the design process of elevator systems. Numerous decisions have to be taken to obtain an accurate, comfortable, and high-quality ... Keywords: Vertical transport, elevator, lift, simulation

This paper describes the implementation and evaluation of a mobile knowledge management and decision support system to assist archaeologists in dealing with soils. Our view is that provision of a mobile tool which provides access to expert knowledge ... Keywords: Geoarchaeology, Knowledge management, Mobile decision support

This paper presents the development of a new software tool IRA-WDS. This GIS-based software predicts the risks associated with contaminated water entering water distribution systems from surrounding foul water bodies such as sewers, drains and ditches. ... Keywords: Contaminant intrusion, Developing countries, GIS, Intermittent water supply, Risk assessment, Tight coupling, Water supply

Martin Schindewolf worked during his internship at Lawrence Livermore National Laboratory (LLNL) under the guidance of Martin Schulz in the Computer Science Group of the Center for Applied Scientific Computing. We studied the performance of the TM subsystem of BG/Q and researched the possibilities for tool support for TM. To study the performance, we ran CLOMP-TM, a benchmark designed to quantify the overhead of OpenMP and compare different synchronization primitives. To advance CLOMP-TM, we added Message Passing Interface (MPI) routines for a hybrid parallelization, which makes it possible to run multiple MPI tasks, each running OpenMP, on one node. With these enhancements, a beneficial ratio of MPI tasks to OpenMP threads is determined, and the synchronization primitives are ranked as a function of the application characteristics. To demonstrate the usefulness of these results, we investigated a real Monte Carlo simulation, the Monte Carlo Benchmark (MCB). Applying the lessons learned yields the best task-to-thread ratio, and we were able to tune the synchronization by transactifying the MCB. We also developed tools that capture the performance of the TM run time system and present it to the application developer. The performance of the TM run time system relies on its built-in statistics. These tools use the Blue Gene Performance Monitoring (BGPM) interface to correlate the statistics from the TM run time system with performance counter values. This combination provides detailed insight into the run time behavior of the application and makes it possible to track down the cause of degraded performance. One tool separates the performance counters into three categories: Successful Speculation, Unsuccessful Speculation, and No Speculation. All of the tools are crafted around IBM's xlc compiler for C and C++ and have been run and tested on a Q32 early access system.

This thesis shows the applicability and value of real options analysis in developing an oil field, and how its use along with decision analysis can maximize the returns on a given project and minimize the losses. It focuses ...

Subsidence analysis based on decompaction of the sedimentary record is a standard method for reconstructing the evolution of sedimentary basins. For such an analysis, data on strata thickness, lithology, age constraints, lithological properties, porosity, ... Keywords: Basin analysis, Error quantification, Monte Carlo simulation, Subsidence, Vienna Basin

This document reports the results of a study of cost tools to support the analysis of Operations Other Than War (OOTW). It recommends the continued development of the Department of Defense (DoD) Contingency Operational Support Tool (COST) as the basic cost analysis tool for OOTWs. It also recommends modifications to be included in future versions of COST and the development of an OOTW mission planning tool to supply valid input for costing.

In this contribution, the multi actor multi criteria analysis (MAMCA) to evaluate transport projects is presented. This evaluation methodology specifically focuses on the inclusion of the different actors that are involved in a project, the so-called ... Keywords: Multi actor multi criteria analysis, Stakeholders, Transport project appraisal

Environmental (green) policymaking is the process of designing and applying environmental policies and is interdisciplinary in nature. Environmental policy analysis and making may therefore draw on a wide range of information and data: scientific, ... Keywords: activity models, green policy design, multi-dimensional formalism, ontologies, policy analysis, policy support system

Over roughly the past two decades, an architecture has evolved for using Wide Area Measurement Systems (WAMS) to monitor power-system oscillatory dynamics. This includes, but is not limited to, the use of Phasor Measurement Units and advanced signal processing algorithms. Today, this architecture is further evolving toward real-time operation applications. This paper provides a perspective on these analysis approaches, and defines and clarifies analysis operations, interactions, and application conditions. An overview of WAMS analysis approaches is given along with several examples from the western North American Power System.


The NUclear EVacuation Analysis Code (NUEVAC) has been developed by Sandia National Laboratories to support the analysis of shelter-evacuate (S-E) strategies following an urban nuclear detonation. This tool can model a range of behaviors, including complex evacuation timing and path selection, as well as various sheltering or mixed evacuation and sheltering strategies. The calculations are based on externally generated, high resolution fallout deposition and plume data. Scenario setup and calculation outputs make extensive use of graphics and interactive features. This software is designed primarily to produce quantitative evaluations of nuclear detonation response options. However, the outputs have also proven useful in the communication of technical insights concerning shelter-evacuate tradeoffs to urban planning or response personnel.

The cost of design, construction, and maintenance of facilities is continually rising. The demand is for facilities designed by applying life cycle costing principles. These principles have already given strong decision-making power to the manufacturing industry. The need to satisfy environmental sustainability requirements, improve the operational effectiveness of buildings, and apply value engineering principles has increased the dependency on life cycle cost analysis (LCCA). The objective is to obtain economically viable solutions by analyzing the alternatives during the design of a building. Though the LCCA process is able to give the desired results, it has some problems that have hindered more widespread use of the LCCA concept and method. The literature study has highlighted that the problem areas are the lack of frameworks or mechanisms for collecting and storing data, and the complexity of the LCCA exercise, which involves the analysis of thousands of building elements and a number of construction-type options and maintenance activities for each building element at detailed design stages. Building Information Modeling (BIM) has been able to repeatedly answer the questions raised by the AEC industry. The aim of this study is to identify the areas where BIM can be effectively applied to the LCCA process and become a part of the workflow. In this study, four LCCA case studies are first read and evaluated to understand how life cycle costing principles have been applied. The purpose, the types of alternatives examined, the process of analysis, the type of software used, and the results are examined, and an attempt is made to understand the workflow of the LCCA process.
There is confidence that Building Information Modeling is capable of handling changes during the design, construction, and maintenance phases of a project. Since applying changes to any kind of building information forms the core of LCC analysis, it has become necessary to use computer building models for examining these changes. The building modeling software packages are enumerated. The case studies have highlighted that the evaluation of alternatives is primarily aimed at achieving energy-efficient solutions for buildings. Applying these solutions involves high initial costs; the return on investment is the means by which they become viable to the owners of the facilities, and this is where LCCA has been applied. Two of the important cost elements of LCC analysis are the initial costs and the operating costs of the building. The collaboration of these modeling tools with other estimating software, where the initial costs of the building can be generated, is studied. The functions of the quantity take-off tools and estimating tools, along with the interoperability between them, are analyzed. The operating costs are generated from software that focuses on sustainability, and the tools currently used for performing life cycle cost calculations are also observed. The objective is to identify whether the currently available BIM tools and software can help in obtaining LCCA results and can offset the hindrances of the process. The software is therefore studied from the point of view of ease of handling data and the type of data that can be generated. Possible BIM workflows are suggested depending on the functions of the software and the relationships between them. The study aims to take a snapshot of the current tools available that can aid the LCCA process.
The research is of significance to the construction industry as a precursor to the application of Building Information Modeling to the LCCA process, showing that BIM has the capacity to overcome the obstacles to life cycle costing. This opens a window to the possibility of applying BIM to LCCA and furthering this study.
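The core LCCA comparison described above, initial cost plus discounted operating costs over the study period, can be sketched as follows. The two alternatives, their costs, the study period, and the discount rate are all hypothetical, and real LCCA adds maintenance, replacement, and residual-value terms.

```python
# Hedged sketch of a life cycle cost comparison between two hypothetical
# design alternatives; all numbers are illustrative.

def life_cycle_cost(initial_cost, annual_operating_cost, years, discount_rate):
    """Present value of the initial cost plus discounted annual operating costs."""
    pv_operating = sum(annual_operating_cost / (1 + discount_rate) ** t
                       for t in range(1, years + 1))
    return initial_cost + pv_operating

# Alternative A: cheaper to build, more expensive to operate.
lcc_a = life_cycle_cost(1_000_000, 80_000, years=30, discount_rate=0.05)
# Alternative B: energy-efficient design with a higher first cost.
lcc_b = life_cycle_cost(1_150_000, 55_000, years=30, discount_rate=0.05)
better = "B" if lcc_b < lcc_a else "A"
```

In a BIM workflow, the initial and operating cost inputs would come from the quantity take-off, estimating, and sustainability tools discussed above rather than being entered by hand.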

The aim of this analysis is to explore the fundamental stability issues of a robotic vehicle carrying out localization, mapping, and feedback control in a perturbation-filled environment. Motivated by the application of ...

Malware -- a generic term that encompasses viruses, trojans, spywares and other intrusive code -- is widespread today. Malware analysis is a multi-step process providing insight into malware structure and functionality, facilitating the development of ... Keywords: instrumentation, malware, security

In this thesis, we present two computational platforms for future biological research. The first, FNAC, is a flexible programmatic Framework for Network Analysis and Comparison that simplifies many common operations on ...

Summary: The serial analysis of chromatin occupancy technique (SACO) promises to become a widely used method for the unbiased genome-wide experimental identification of loci bound by a transcription factor of interest. We describe the first web-based ...

This paper presents a technologically advanced analysis method to study the urban traffic problem. The integrated approach described here may also provide interesting results for studying particularly extensive and complex situations, while guaranteeing ...

Analyzing whole-building interval data is an inexpensive but effective way to identify and improve building operations, and ultimately save money. Using the Energy Charting and Metrics Tool (ECAM) add-in for Microsoft Excel, building operators and managers can begin implementing changes to their Building Automation System (BAS) after trending the interval data. The two data components needed for full analyses are whole-building electricity consumption (kW or kWh) and outdoor air temperature (OAT). Using these two pieces of information, a series of plots and charts can be created in ECAM to monitor the building's performance over time, gain knowledge of how the building is operating, and make adjustments to the BAS to improve efficiency and start saving money.
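The underlying analysis can be sketched as relating whole-building consumption to OAT by averaging kWh within temperature bins. The interval data and bin width below are invented for illustration; ECAM itself is an Excel add-in and produces its charts differently in detail.

```python
# Illustrative sketch of an ECAM-style load-vs-temperature profile
# from synthetic hourly interval data.

def bin_by_oat(records, bin_width=10):
    """Average hourly kWh within OAT bins, e.g. [40, 50), [50, 60), ..."""
    sums, counts = {}, {}
    for oat, kwh in records:
        b = int(oat // bin_width) * bin_width  # left edge of the bin
        sums[b] = sums.get(b, 0.0) + kwh
        counts[b] = counts.get(b, 0) + 1
    return {b: sums[b] / counts[b] for b in sums}

# Synthetic data: (OAT in deg F, whole-building kWh for that hour).
data = [(45, 210), (48, 205), (55, 190), (58, 195), (72, 260), (78, 300)]
profile = bin_by_oat(data)
```

Plotting such a profile over successive months is one way to spot drifting setpoints or schedule problems in the BAS.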

As Hanford Site contractors address future structural demands on nuclear waste tanks, built as early as 1943, it is necessary to address their current safety margins and ensure that safe margins are maintained. Although the current civil engineering practice guidelines for soil modeling are suitable as preliminary design tools, future demands may result in loads and modifications to the tanks that are outside the original design basis and current code-based structural capabilities. For example, waste removal may include cutting a large hole in a tank. This report addresses both spring modeling and finite-element modeling of site soils. Seismic dynamic modeling of Hanford Site soils is also included. Of special interest is Section 2.2, written by Professor Robert D. Holtz of the University of Washington, on plane strain soil testing versus triaxial testing, with application to large buried waste tanks at the Hanford Site.

In 2008 the German National Analysis Facility (NAF) at DESY was established. It is attached to and builds on top of the DESY Grid infrastructure. The facility was designed to provide the best possible analysis infrastructure for high energy particle physics of the ATLAS, CMS, LHCb and ILC experiments. The Grid and local infrastructure of the German NAF will be reviewed with a focus on the ATLAS part. Both parts include large scale storage and a batch system. The main emphasis of this presentation is the ATLAS-specific customisation and utilisation of the NAF. This refers not only to the NAF components but also to the different components of the ATLAS analysis framework. Experience from operating and supporting ATLAS users on the German NAF will be presented. The ATLAS usage of the different components will be shown, including some typical use cases of user analysis. Finally, the question will be addressed of whether the design of the NAF meets the ATLAS expectations for efficient data analysis in the era of LHC data taking.

In 2008 the German National Analysis Facility (NAF) at DESY was established. It is attached to and builds on top of the DESY Grid infrastructure. The facility is designed to provide the best possible analysis infrastructure for high energy particle physics of the ATLAS, CMS, LHCb and ILC experiments. The Grid and local infrastructure of the NAF is reviewed with a focus on the ATLAS part. Both parts include large scale storage and a batch system. Emphasis is put on ATLAS-specific customisation and utilisation of the NAF. This refers not only to the NAF components but also to the different components of the ATLAS analysis framework. Experience from operating and supporting ATLAS users on the NAF is presented in this paper. The ATLAS usage of the different components is shown, including some typical use cases of user analysis. Finally, the question is addressed of whether the design of the NAF meets the ATLAS expectations for efficient data analysis in the era of LHC data taking.

To support interactive visualization and analysis of complex, large-scale climate data sets, UV-CDAT integrates a powerful set of scientific computing libraries and applications to foster more efficient knowledge discovery. Connected through a provenance framework, the UV-CDAT components can be loosely coupled for fast integration or tightly coupled for greater functionality and communication with other components. This framework addresses many challenges in the interactive visual analysis of distributed large-scale data for the climate community.


This paper presents TOTBEM, a freeware application for the in-house computer-aided design and analysis of grounding grids and grounding potentials in large over-ground and compact underground electrical substations. ...

Anti-virus vendors are confronted with a multitude of potentially malicious samples today. Receiving thousands of new samples every day is not uncommon. The signatures that detect confirmed malicious threats are mainly still created manually, so it is ... Keywords: Dynamic analysis, malware

Motivation: The size and complex nature of LC-MS proteomics data sets motivates development of specialized software for statistical data analysis and exploration. We present DanteR, a graphical R package that features extensive statistical and diagnostic functions for quantitative proteomics data analysis, including normalization, imputation, hypothesis testing, interactive visualization and peptide-to-protein rollup. More importantly, users can easily extend the existing functionality by including their own algorithms under the Add-On tab. Availability: DanteR and its associated user guide are available for download at http://omics.pnl.gov/software/. For Windows, a single click automatically installs DanteR along with the R programming environment. For Linux and Mac OS X, users must first install R and then follow instructions on the DanteR web site for package installation.

In this paper we present a method for the spatial analysis of complex cellular systems based on a multiscale study of neighborhood relationships. A function to measure those relationships, M, is introduced. The refined Relative Neighborhood Graph is then presented as a method to establish vicinity relationships within layered cellular structures, and particularized to epithelial cell nuclei in the mammary gland. Finally, the method is illustrated with two examples that show interactions within one population of epithelial cells and between two different populations.


This study presents a simplified and accurate procedure for selecting the rheological model that best fits the rheological properties of a given non-Newtonian fluid and introduces five new approaches to correct for tool joint losses from expansion and contraction in hydraulics calculations. The new approaches are enlargement and contraction (E&C), equivalent diameter (ED), two different IDs (2IDs), enlargement and contraction plus equivalent diameter (E&C+ED), and enlargement and contraction plus two different IDs (E&C+2IDs). In addition to the Newtonian model, seven major non-Newtonian rheological models (Bingham plastic, Power law, API, Herschel-Bulkley, Unified, Robertson and Stiff, and Casson) provide alternatives for selecting the model that most accurately represents the shear-stress/shear-rate relationship for a given non-Newtonian fluid. The project assumes that the model which gives the lowest absolute average percent error (EAAP) between the measured and calculated shear stresses is the best one for a given non-Newtonian fluid. The results are important for obtaining correct pressure drop and hydraulics calculations: the API rheological model (RP 13D) provides, in general, the best prediction of rheological behavior for the mud samples considered (EAAP = 1.51), followed by the Herschel-Bulkley, Robertson and Stiff, and Unified models. Results also show that corrections with E&C+2IDs and API hydraulics calculations give a good approximation to measured pump pressure, with a 9% difference between measured and calculated data.
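The selection criterion can be sketched for one of the seven candidate models. The power-law fit via log-log regression, and the viscometer readings, are illustrative assumptions, not the study's actual procedure or data; a full implementation would fit all seven models and pick the lowest EAAP.

```python
# Sketch of the EAAP model-selection criterion with a single candidate model.
import math

def eaap(measured, predicted):
    """Absolute average percent error between measured and predicted stresses."""
    return 100 * sum(abs((m - p) / m) for m, p in zip(measured, predicted)) / len(measured)

def fit_power_law(rates, stresses):
    """Fit tau = K * gamma^n by linear regression in log-log space."""
    xs = [math.log(g) for g in rates]
    ys = [math.log(t) for t in stresses]
    n_pts = len(xs)
    mx, my = sum(xs) / n_pts, sum(ys) / n_pts
    n = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    k = math.exp(my - n * mx)
    return lambda g: k * g ** n

# Hypothetical viscometer readings: (shear rate 1/s, shear stress Pa).
rates = [5.1, 10.2, 170.0, 340.0, 511.0, 1022.0]
stresses = [4.0, 6.5, 30.0, 48.0, 63.0, 100.0]
model = fit_power_law(rates, stresses)
error = eaap(stresses, [model(g) for g in rates])
```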

Beginning January 1, 1987, the state of Kansas began collecting and recording data from residential radon tests. These data were collected entirely from voluntary home testing, performed by 1) the homeowner (using a store-purchased radon test kit), 2) a professional radon testing laboratory, or 3) technicians from the Kansas Department of Health and Environment (KDHE) state laboratory. The majority of test results are from tests conducted by homeowners. The radon database was analyzed using ArcInfo 8.2. Three primary geographic information system (GIS) analyses were performed: 1) a comparison of the Kansas database to the Environmental Protection Agency (EPA)/United States Geological Survey (USGS) radon threat map for Kansas, 2) a data density analysis of statewide testing patterns, and 3) an analysis of average radon values across clustered zip code districts in Sedgwick County, Shawnee County, and the Kansas City metropolitan area (including Johnson, Wyandotte, Leavenworth, and Douglas Counties). Comparison of the Kansas radon database to the EPA/USGS threat assessment map showed similar but not identical trends. The data density analysis identified the zip code districts for which no test results had been collected and identified the areas of

The array of alternative energy sources vying for the federal government's R and D dollar is formidable when compared to the politically acceptable amount which can be used to fund the research. To guide how these funds should be dispersed, a rational, defensible procedure is needed which can be applied repeatedly as new technologies and new information become available. The procedure advanced in this paper is a decision analysis technique known as multi-attribute decision analysis (MADA), and its use is illustrated in an evaluation and ranking of solar thermal electric power generating systems. Since the ultimate purchase decision is made in the marketplace, the preferences of potential users have been sampled and brought to bear on the ranking. The focus of this description is on the formulation of the problem structure and the decision model, the treatment of uncertainty, and how the results relate to the questions asked by and of the Department of Energy, which funded the study. A final note proposes how decision analysis can be used to address the broader questions of choice among competing technologies, with cautions concerning misuse of the procedure.
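A weighted additive score is the simplest form of multi-attribute ranking. The sketch below uses hypothetical attributes, weights, and solar thermal alternatives; the study's actual model also treats uncertainty and elicited user preferences, which this sketch omits.

```python
# Minimal multi-attribute decision analysis (MADA) sketch:
# rank alternatives by the weighted sum of normalized attribute scores.

def mada_rank(alternatives, weights):
    """Return (ranking, scores); higher weighted score is better."""
    scored = {name: sum(weights[a] * s for a, s in attrs.items())
              for name, attrs in alternatives.items()}
    return sorted(scored, key=scored.get, reverse=True), scored

# Hypothetical weights and 0-1 attribute scores for three solar thermal options.
weights = {"cost": 0.4, "efficiency": 0.35, "maturity": 0.25}
alternatives = {
    "central_receiver": {"cost": 0.6, "efficiency": 0.9,  "maturity": 0.5},
    "parabolic_trough": {"cost": 0.8, "efficiency": 0.7,  "maturity": 0.9},
    "dish_stirling":    {"cost": 0.4, "efficiency": 0.95, "maturity": 0.3},
}
ranking, scores = mada_rank(alternatives, weights)
```

Re-running the ranking with new weights or new alternatives is exactly the "repeatable procedure" the paragraph above calls for.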

Two data collection efforts are used to gauge the problem of crime in America. These two data series are intended to measure different aspects of crime, but this point is often lost in the front-page headline summaries of whether crime is "up," "down," or about the same. Confusion regarding the two crime indicators is exacerbated when the two data series show substantially different trends, as was the case with the most recent release of data in 2001. For this column, we asked Michael Rand and Callie Rennison of the U.S. Bureau of Justice Statistics to describe the two national crime indicators and their differences. Their article clearly describes the distinct approaches to measuring crime used by these two data series, and provides a strong justification for maintaining these two approaches to address very distinct policy needs. Just a thought... A previous Window on Washington column from summer 1998 discussed the U.S. Census Bureau’s Small-Area Income and Poverty Estimates Program, which used administrative records and census data in conjunction with Current Population Survey estimates to produce model-based small-area estimates that incorporated information from all these sources. It would be interesting to see if a similar approach might be used to provide small-area estimates of violent crime, possibly by fitting a regression model to the National Crime Victimization Survey data, using the Uniform Crime Reports and other data as explanatory variables. Such approaches to combine information from the two national crime indicators—continuing efforts to refine inferences from two frequently confused data series—are interesting possibilities for future research.

About NREL GIS NREL's Geographic Information System (GIS) team analyzes renewable energy resources and many other data sources to determine which energy technologies are viable solutions across the globe, and inputs the data into a geographic information system. A GIS is a computer-based system used to manipulate, manage, and analyze multidisciplinary geographic and related attribute data; it is composed of hardware, software, data, and expertise. A GIS allows the user to perform several tasks, including data capture, data management, data manipulation, data analysis, and presentation of results in graphic or report forms. All information in GIS is linked to a spatial reference used to store and access data. GIS data layers can be recombined or manipulated and analyzed

A virtual reality system was developed for computational and graphical modeling and simulation of radiation environments. This system, called Virtual Radiation Fields (VRF), demonstrates the usefulness of radiological analysis in simulation-based design for predicting radiation doses for robotic equipment and personnel working in a radiation environment. The system was developed for use in determining the radiation doses for robotic equipment to be used in tank-waste retrieval operations at the Hanford Site. As a reference case, specific application is made to simulate cleanup operations for Hanford tank C-106. A three-dimensional model representation of the tank and its predicted radiation levels are presented and analyzed. Tank cleanup operations were simulated to understand how radiation levels change during the cleanup phase and to predict cumulative radiation doses to robotic equipment to aid in the development of maintenance and replacement schedules.

Historical evidence has shown that incidents due to hazardous materials (HazMat) releases during transportation can lead to severe consequences. The public and some agencies such as the Department of Transportation (DOT) show an increasing concern with the hazard associated with HazMat transportation. Many hazards may be identified and controlled or eliminated through use of risk analysis. Transportation Risk Analysis (TRA) is a powerful tool in a HazMat transportation decision support system. It is helpful in choosing among alternate routes by providing information on the risks associated with each route, and in selecting appropriate risk reduction alternatives by demonstrating the effectiveness of various alternatives. Some methodologies have been developed to assess transportation risk; however, most of them are hard for decision or policy makers to employ directly. One major barrier is the lack of a match between available data/database analysis and the numerical methodologies for TRA. In this work, methodologies to assess transportation risk are developed based on the availability of data or databases, and the match between the two is pursued. Each risk component, including frequency, release scenario, and consequence, is assessed based on the available data/databases. The risk is measured by numerical algorithms step by step through the transportation network. Based on the TRA results, decisions on HazMat transportation can be made appropriately and reasonably. The combination of recent interest in expanding or building new facilities to receive liquefied natural gas (LNG) carriers, along with increased awareness and concern about potential terrorist action, has raised questions about the potential consequences of incidents involving LNG transportation. One of those consequences, rapid phase transition (RPT), is studied in this dissertation.
Incidents and experiments involving LNG-water RPT and theoretical analyses of the RPT mechanism are reviewed. Other consequences, such as pool spread and vapor cloud dispersion, are analyzed with the Federal Energy Regulatory Commission (FERC) model.
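The stepwise risk measurement described above can be sketched as a sum of per-segment expected consequences along a route. The segment frequencies, release probabilities, and consequence values below are invented for illustration; the actual methodologies derive these from incident databases.

```python
# Hedged sketch of route-level transportation risk: accumulate
# frequency x release probability x consequence over route segments.

def route_risk(segments):
    """Sum of per-segment expected consequence along a transport route."""
    return sum(freq * p_release * consequence
               for freq, p_release, consequence in segments)

# Each segment: (incident frequency per trip, release probability given
# an incident, consequence measure such as persons exposed). Invented values.
route_a = [(1e-6, 0.1, 5000), (2e-6, 0.1, 800)]   # shorter, passes near a city
route_b = [(3e-6, 0.1, 200),  (4e-6, 0.1, 150)]   # longer, rural
safer = "B" if route_risk(route_b) < route_risk(route_a) else "A"
```

Comparing such totals across candidate routes is the route-selection use of TRA described above; swapping in different consequence measures supports the risk-reduction comparison.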

A major challenge that has faced policy makers concerned with acid deposition is obtaining an integrated view of the underlying science related to acid deposition. In response to this challenge, the US Department of Energy is sponsoring the development of an integrated Tracking and Analysis Framework (TAF) which links together the key acid deposition components of emissions, air transport, atmospheric deposition, and aquatic effects in a single modeling structure. The goal of TAF is to integrate credible models of the scientific and technical issues into an assessment framework that can directly address key policy issues, and in doing so act as a bridge between science and policy. Key objectives of TAF are to support coordination and communication among scientific researchers; to support communications with policy makers and provide rapid response for analyzing newly emerging policy issues; and to provide guidance for prioritizing research programs. This paper briefly describes how TAF was formulated to meet those objectives and the underlying principles which form the basis for its development.

Nuclear forensic teams will be deployed to collect and evaluate fallout samples on the ground in the scenario of a low-yield nuclear detonation in a heavily populated area. Quick non-destructive methods of predicting the quality of a sample before it is analyzed in detail are essential for efficient post-event collections. In this work, the process of exporting Defense Land Fallout Interpretive Code (DELFIC) results into the Gamma Detector Response and Analysis Software (GADRAS) has been automated within the Fallout Analysis Tool. This coupling allows for the simulation of detector responses to fallout samples with varying degrees of fractionation. The degree to which the samples are fractionated depends on the location of the samples in the fallout field. In the following study, this phenomenon is examined, as its understanding is important to the investigation of debris distribution. The simulated detector spectra from GADRAS can be used to compare peak ratios of volatile-refractory isotope pairs in order to determine the degree of fractionation. Simulated fractionated fallout samples from DELFIC for a 10 kt, pure 235U fission surface burst were modeled for distances ranging out to 256 km from ground zero, and for times up to 1 week from detonation. The fractionation ratios, also known as r values, from isotope concentrations, photon lines, and peak areas of four volatile-refractory pairs were calculated and compared. Fractionation prediction via the peak areas method was evaluated for each pair by comparing the results with the simulated radionuclide inventory.
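The r value for a volatile-refractory pair can be sketched from its standard definition: the pair's activity ratio in the sample, normalized by the same ratio in the unfractionated inventory. The peak areas below are invented, and using raw peak areas as activity proxies (skipping efficiency and branching corrections) is a simplifying assumption.

```python
# Illustrative fractionation ratio (r value) from a volatile-refractory pair.

def r_value(sample_volatile, sample_refractory,
            inventory_volatile, inventory_refractory):
    """r < 1 indicates depletion of the volatile species in the sample."""
    return ((sample_volatile / sample_refractory)
            / (inventory_volatile / inventory_refractory))

# Hypothetical peak areas for one volatile/refractory pair in a close-in
# ground sample versus the whole-cloud (unfractionated) inventory.
r = r_value(sample_volatile=120.0, sample_refractory=900.0,
            inventory_volatile=400.0, inventory_refractory=1000.0)
fractionated = r < 1.0
```

Close-in fallout enriched in refractory species would show r well below 1, consistent with the distance dependence studied above.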

The present invention relates to a tool setting device for use with numerically controlled machine tools, such as lathes and milling machines. A reference position of the machine tool relative to the workpiece along both the X and Y axes is utilized by the control circuit for driving the tool through its program. This reference position is determined for both axes by displacing a single linear variable displacement transducer (LVDT) with the machine tool through a T-shaped pivotal bar. The use of the T-shaped bar allows the cutting tool to be moved sequentially in the X or Y direction for indicating the actual position of the machine tool relative to the predetermined desired position in the numerical control circuit by using a single LVDT.

This study describes the development of a computer program for analyzing the off-design performance of axial flow helium compressors, one of the major concerns for the power conversion system of a high temperature gas-cooled reactor (HTGR). The compressor performance has been predicted by aerodynamic analysis of the meridional flow with allowances for losses. The governing equations have been derived from the Euler turbomachine equation and the streamline curvature method, and then merged into linearized equations solved with the Newton-Raphson numerical method. The effect of viscosity is considered through empirical correlations that introduce entropy rises caused by the primary loss sources. Use of the method is illustrated by applying it to a 20-stage helium compressor of the GTHTR300 plant. As a result, the flow throughout the stages of the compressor has been predicted and the compressor characteristics have been investigated against the design specification. The program shows much better stability and convergence than other through-flow methods, and good agreement with the compressor performance map provided by JAEA. (authors)
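The Newton-Raphson iteration that the through-flow solver linearizes around can be sketched on a stand-in scalar residual; this is not the actual streamline-curvature system, which solves a coupled set of equations across streamlines and stations.

```python
# Minimal Newton-Raphson sketch: iterate x <- x - f(x)/f'(x) until the
# update is below tolerance, as a through-flow solver does per station.

def newton_raphson(residual, d_residual, x0, tol=1e-10, max_iter=50):
    x = x0
    for _ in range(max_iter):
        dx = -residual(x) / d_residual(x)
        x += dx
        if abs(dx) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

# Stand-in example: solve f(x) = x**3 - 2 = 0 (the cube root of 2).
root = newton_raphson(lambda x: x**3 - 2, lambda x: 3 * x**2, x0=1.0)
```

The quadratic convergence of this iteration is what gives the streamline curvature approach its reported stability advantage over simple relaxation of the through-flow equations.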

The Defense Threat Reduction Agency (DTRA) commissioned an assessment of the Consequence Management (CM) plans in place on military bases for response to a chemical attack. The effectiveness of the CM plans for recovering from chemical incidents was modeled using multiple Decision Support Tools (DSTs). First, a scenario was developed based on an aerial dispersion of a chemical agent over a wide area of land. The extent of contamination was modeled with the Hazard Prediction and Assessment Capability (HPAC) tool. Subsequently, the Analyzer for Wide Area Restoration Effectiveness (AWARE) tool was used to estimate the cost and time demands for remediation, based on input of contamination maps and on sampling and decontamination resources, strategies, rates, and costs. The sampling strategies incorporated in the calculation were designed using the Visual Sample Plan (VSP) tool. Based on a gaps assessment and the DST remediation analysis, an Enhanced Chemical Incident Response Plan (ECIRP) was developed.

Analysis and Reverse Engineering of Code Using Hierarchy and Yourdon (ARCHY) diagrams is a tool for the development and maintenance of FORTRAN programs. When FORTRAN source code is read by ARCHY, it automatically creates a database that includes a data dictionary, which lists each variable, its dimensions, type, category (set, referenced, passed), module calling structure, and common block information. The database exists in an ASCII file that can be edited directly or maintained with the ARCHY database editor. The database is used by ARCHY to produce structure charts and Yourdon data flow diagrams in PostScript format. ARCHY also transfers database information such as variable definitions, module descriptions, and technical references to and from module headers. ARCHY contains several utilities for making programs more readable. It can automatically indent the body of loops and conditionals and resequence statement labels. Various language extensions are translated into FORTRAN-77 to increase code portability. ARCHY frames comment statements and groups FORMAT statements at the end of modules. It can alphabetize modules within a program, add end-of-line labels, and change executable statements to upper or lower case. ARCHY runs under the VAX-VMS operating system and accepts VAX-FORTRAN, IBM-FORTRAN, and CRAY FORTRAN source files.
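As a rough illustration of the kind of data dictionary ARCHY builds, the sketch below scans FORTRAN-77 declaration lines for variable names, types, and dimensions. The regular expression and output format are simplifications invented for this example; the real tool also tracks variable categories, calling structure, and common blocks:

```python
import re

# Matches a FORTRAN-77 type keyword followed by a declaration list.
DECL = re.compile(r"^\s*(REAL|INTEGER|LOGICAL|CHARACTER|DOUBLE PRECISION)\s+(.*)$", re.I)

def data_dictionary(source_lines):
    """Build {VARIABLE: {'type': ..., 'dims': ...}} from declaration lines."""
    entries = {}
    for line in source_lines:
        m = DECL.match(line)
        if not m:
            continue
        ftype = m.group(1).upper()
        # Walk the declaration list; an optional parenthesized group
        # captures array dimensions and is consumed with the name.
        for name, _, dims in re.findall(r"(\w+)(\(([^)]*)\))?", m.group(2)):
            entries[name.upper()] = {"type": ftype, "dims": dims or None}
    return entries
```

For example, `data_dictionary(["      REAL X(10), Y"])` records X as a REAL array with dimension 10 and Y as a scalar REAL.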

This paper documents Fluor Hanford's use of leading indicators, management leadership, and statistical methodology to improve the safe performance of work. By applying these methods, Fluor Hanford achieved a significant reduction in injury rates in 2003 and 2004, and the improvement continues today. The integration of data, leadership, and teamwork pays off with improved safety performance and credibility with the customer. The use of Statistical Process Control, Pareto charts, and systems thinking, and their effect on management decisions and employee involvement, are discussed. Included are practical examples of choosing leading indicators. A statistically based, color-coded dashboard presentation methodology is provided. These tools, management theories, and methods, coupled with involved leadership and employee efforts, directly led to significant improvements in worker safety and health, and in environmental protection and restoration, at one of the nation's largest nuclear cleanup sites.
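One common Statistical Process Control technique for incident counts is the c-chart, whose control limits are the mean count plus or minus three times its square root. The sketch below is a generic illustration of that chart, not Fluor Hanford's actual methodology:

```python
import math

def c_chart(counts):
    """c-chart for incident counts per period: centerline is the mean count,
    control limits are mean +/- 3*sqrt(mean) (Poisson assumption)."""
    cbar = sum(counts) / len(counts)
    sigma = math.sqrt(cbar)
    ucl = cbar + 3 * sigma
    lcl = max(0.0, cbar - 3 * sigma)  # a count cannot go below zero
    # Periods whose count falls outside the limits warrant investigation.
    signals = [i for i, c in enumerate(counts) if c > ucl or c < lcl]
    return cbar, lcl, ucl, signals
```

With monthly counts of (4, 3, 5, 4, 2, 14, 3), the mean is 5, the upper limit is about 11.7, and the sixth month is flagged as a special-cause signal.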

Tools. Below are links to tools that assist users with project planning and analysis. Residential Lighting Usage Estimate Tool: The Residential Lighting Usage Estimate Tool is a companion to the report "Residential Lighting End-Use Consumption Study," which developed a regional estimation framework within a national sample design that allows for the estimation of lamp usage and energy consumption. The tool contains the full set of estimates produced by the methodology described in the study. Simple Modular LED Cost Model (LEDCOM): The LED Cost Model provides a simplified method for analyzing the manufacturing costs of an LED package, and enables those involved in the manufacture of LED packages to evaluate the relative impact of changes made at different points in the manufacturing process on the final package cost.

In the last two decades, simulation tools have made a significant contribution to the great progress in the development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed, and a wide selection of powerful simulation tools is available. Users have to choose the one best suited to their application. Here a simple rule applies: the best available simulation tool is the tool the user is already used to (provided it can solve the task). Capabilities, speed, user-friendliness, and other features are continuously being improved, even though they are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as properties of tools, several tools are presented. Starting with simplified models ...

This report documents refined requirements for tools to aid the process of force design in Operations Other Than War (OOTWs). It recommends actions for the creation of one tool and work on other tools relating to mission planning. It also identifies the governmental agencies and commands with interests in each tool, from whom should come the user advisory groups overseeing the respective tool development activities. The understanding of OOTWs and their analytical support requirements has matured to the point where action can be taken in three areas: force design, collaborative analysis, and impact analysis. While the nature of the action and the length of time before complete results can be expected depend on the area, in each case the action should begin immediately. Force design for OOTWs is not a technically difficult process. Like force design for combat operations, it is a process of matching the capabilities of forces against the specified and implied tasks of the operation, considering the constraints of logistics, transport, and force availability. However, there is a critical difference that restricts the usefulness of combat force design tools for OOTWs: the combat tools are built to infer non-combat capability requirements from combat capability requirements and cannot reverse the direction of the inference, as is required for OOTWs. Recently, OOTWs have played a larger role in force assessment, system effectiveness and tradeoff analysis, and concept and doctrine development and analysis. In the first Quadrennial Defense Review (QDR), each of the Services created its own OOTW force design tool. Unfortunately, the tools address different parts of the problem and do not coordinate the use of competing capabilities. These tools satisfied the immediate requirements of the QDR but do not provide a long-term, cost-effective solution.

Initial interest in Dechloromonas aromatica strain RCB arose from its ability to anaerobically degrade benzene. It is also able to reduce perchlorate and oxidize chlorobenzoate, toluene, and xylene, creating interest in using this organism for bioremediation. Little physiological data has been published for this microbe. It is considered to be a free-living organism. The a priori prediction that the D. aromatica genome would contain previously characterized 'central' enzymes involved in anaerobic aromatic degradation proved to be false, suggesting the presence of novel anaerobic aromatic degradation pathways in this species. These missing pathways include the benzylsuccinate synthase (bssABC) genes (responsible for fumarate addition to toluene) and the central benzoyl-CoA pathway for monoaromatics. In-depth analyses using existing TIGRfam, COG, and InterPro models, and the creation of de novo HMM models, indicate a highly complex lifestyle with a large number of environmental sensors and signaling pathways, including a relatively large number of GGDEF-domain signal receptors and multiple quorum sensors. A number of proteins suggest interactions with an as yet unknown host, as indicated by the presence of predicted cell host remodeling enzymes, effector enzymes, hemolysin-like proteins, adhesins, NO reductase, and both type III and type VI secretory complexes. Evidence of biofilm formation, including a proposed exopolysaccharide complex with the somewhat rare exosortase (epsH), is also present. The annotation described in this paper also reveals evidence for several metabolic pathways that have yet to be observed experimentally: a sulphur oxidation (soxFCDYZAXB) gene cluster, Calvin cycle enzymes (including RubisCO and ribulose-phosphate 3-epimerase), and nitrogen fixation (nif gene families). Analysis of the D. aromatica genome indicates there is much to be learned regarding the metabolic capabilities and lifestyle of this microbial species.
Examples of recent gene duplication events in signaling as well as dioxygenase clusters are present, indicating selective gene family expansion as a relatively recent event in D. aromatica's evolutionary history. Gene families that constitute metabolic cycles presumed to create D. aromatica's environmental 'footprint' indicate a high level of diversification between its predicted capabilities and those of its close relatives, A. aromaticum str. EbN1 and Azoarcus BH72.

Data Tools. Most ARM data is archived and made available as time-series data in netCDF format. NetCDF (designed by Unidata) is relatively compact, is appendable, is capable of storing descriptive "metadata" along with measurement data, and is platform-independent. However, the organization of data within netCDF files tends to follow conventions reflecting either the source or the end use of the data. Consequently, tools developed for use with netCDF files following one convention may not be optimal for use with those following another. For example, tools designed for examination of geographically gridded data may not be well suited for analysis of time-series data such as that generated by ARM. In addition, some of the ...

Water Assessment for Transportation Energy Resources (WATER) Tool Released. Argonne National Laboratory recently released an open-access online tool called WATER (Water Assessment for Transportation Energy Resources), which quantifies the water footprint of fuel production stages, from feedstock production to conversion, for biofuels at county, state, and regional spatial resolution. WATER provides analysis of water consumption and its impact on water quality. It contains biofuel pathways for corn grain ethanol, soybean biodiesel, and cellulosic ethanol produced from corn stover and wheat straw. Perennial grass (switchgrass and Miscanthus) and forest wood residue pathways are currently under development. The WATER tool enables users to conduct pathway comparisons, scenario development, and region-specific feedstock analysis in support of biofuel industry development and planning. It is available at http://water.es.anl.gov/.

Automated text processing systems represent a potentially powerful technology for deriving valuable information from text-based documents. This report describes a preliminary evaluation of advanced tools for extracting useful information from text-based incident and event reports typical of those generated in the energy industry. It also presents a review of background literature and current research related to automated text processing.

Motivation -- Distributed Cognition (DCog) and Actor-Network Theory (ANT) are two related perspectives which can be adopted when studying the relationship between humans and artefacts in collaborative environments. Although these perspectives ... Keywords: TITAN, actor-network theory, distributed cognition, information trajectories, research tool

Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities in the building retrofit market, which can be identified through building simulation software tools. For residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed ...

This report is a user's manual for GRAFLAB, a new database, analysis, and plotting package written entirely in the MATLAB programming language. GRAFLAB is currently used for data reduction, analysis, and archival. It was written to replace GRAFAID, a FORTRAN database, analysis, and plotting package that runs on VAX/VMS.

Energy companies can use better knowledge of existing power quality to target maintenance efforts, to establish a baseline for offering premium power services, and as a selling point for sites with high power quality. Unfortunately, most energy companies do not have the widespread, long-term records of power quality needed to provide this information. But energy companies often have good historical records of reliability indices for all circuits on their system (System Average Interruption Frequency...
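Reliability indices of the kind mentioned here are typically computed as customer-weighted averages over interruption events. A minimal sketch, assuming SAIFI, SAIDI, and CAIDI in their standard IEEE 1366 forms:

```python
def reliability_indices(events, total_customers):
    """events: list of (customers_interrupted, duration_minutes).
    SAIFI = interruptions per customer served over the period;
    SAIDI = interruption minutes per customer served;
    CAIDI = SAIDI / SAIFI, the average restoration time per interruption."""
    saifi = sum(n for n, _ in events) / total_customers
    saidi = sum(n * d for n, d in events) / total_customers
    caidi = saidi / saifi if saifi else 0.0
    return saifi, saidi, caidi
```

For a system of 1,000 customers with two events (100 customers out for 60 minutes, 50 for 30), SAIFI is 0.15, SAIDI is 7.5 minutes, and CAIDI is 50 minutes.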

Illustrative Scenarios Tool. Name: Illustrative Scenarios Tool. Agency/Company/Organization: European Commission. Focus Area: GHG Inventory Development. Topics: Analysis Tools. Website: www.eutransportghg2050.eu/cms/illustrative-scenarios-tool/ The SULTAN (SUstainabLe TrANsport) Illustrative Scenarios Tool is a high-level calculator that helps estimate the possible impacts of policy on transport in the European Union. The tool allows quick scoping of a range of transport policy options to help understand what scale of action might be required, and may also be used as part of the analysis for the final technical outputs of a project.

Researchers at Oak Ridge National Laboratory (ORNL) developed the Adaptable, Multiplatform, Real-Time Analysis Package (AMRAP) for the continuous measurement of environmental radionuclide decay. AMRAP is a completely open source visualization and analysis package capable of combining a variety of data streams into an array of real-time plots. Once acquired, data streams are analyzed to store static images and extract data based on previously defined thresholds. AMRAP is currently used at ORNL to combine data streams from an Ortec Detective high-purity germanium (HPGe) detector, a TSA Systems radiation portal monitor (RPM), and an Orion weather station. The combined data are used to study the rain-induced increase in RPM background radiation levels. RPMs experience an increase in background radiation during precipitation due to the deposition of atmospheric radionuclides on the ground. Using AMRAP results in a real-time analysis workstation specifically dedicated to the study of RPM background radiation levels. By means of an editable library of common inputs, AMRAP is adaptable to remote monitoring applications that would benefit from the real-time visualization and analysis of radiation measurements. To study rain-induced increases in background radiation levels observed in radiation portal monitors (RPMs), researchers at Oak Ridge National Laboratory (ORNL) developed a software package that allows data with different formats to be analyzed and plotted in near real time. The Adaptable, Multiplatform, Real-Time Analysis Package (AMRAP) was developed to operate in the background and capture plots of important data based on previously defined thresholds. After executing AMRAP, segments of a data stream can be captured without additional post-processing. AMRAP can also display previously recorded data to facilitate a detailed offline analysis. 
Without access to these capabilities in a single software package, analyzing multiple continuously recorded data streams with different formats is impractical. Commercially available acquisition software packages record and analyze radiation measurements but are not designed to perform real-time analysis in conjunction with data from other vendors. The lack of collaboration between vendors is problematic when research requires different data streams to be correlated in time and immediately analyzed. AMRAP was specifically developed to provide a solution to this problem. AMRAP is a completely open source visualization and analysis package capable of plotting and analyzing data from different vendors in near real time.

This project will develop a 3D, advanced coarse mesh transport method (COMET-Hex) for steady-state and transient analyses in advanced very high-temperature reactors (VHTRs). The project will lead to a coupled neutronics and thermal hydraulic (T/H) core simulation tool with fuel depletion capability. The computational tool will be developed in hexagonal geometry, based solely on transport theory without (spatial) homogenization in complicated 3D geometries. In addition to the hexagonal geometry extension, collaborators will concurrently develop three additional capabilities to increase the code's versatility as an advanced and robust core simulator for VHTRs. First, the project team will develop and implement a depletion method within the core simulator. Second, the team will develop an elementary (proof-of-concept) 1D time-dependent transport method for efficient transient analyses. The third capability will be a thermal hydraulic method coupled to the neutronics transport module for VHTRs. Current advancements in reactor core design are pushing VHTRs toward greater core and fuel heterogeneity to pursue higher burn-ups, efficiently transmute used fuel, maximize energy production, and improve plant economics and safety. As a result, an accurate and efficient neutron transport, with capabilities to treat heterogeneous burnable poison effects, is highly desirable for predicting VHTR neutronics performance. This research project's primary objective is to advance the state of the art for reactor analysis.

The Scalasca toolset has successfully demonstrated measurement and analysis scalability on the largest computer systems, however, applications have growing complexity and increasing demands on performance tools. One such application is the PFLOTRAN code ... Keywords: MPI communicators, performance measurement tools, scalability

The proliferation of dynamic program analysis tools has done much to ease the burden of developing complex software. However, creating such tools remains a challenge. Dynamic binary instrumentation frameworks such as ...

This Methodology Booklet provides a comprehensive review and guiding principles for constructing energy efficiency indicators, with illustrative examples of application to individual countries. It reviews work done by international agencies and national governments in constructing meaningful energy efficiency indicators that help policy makers assess changes in energy efficiency over time. Building on past OECD experience and best practices, and the knowledge of these countries' institutions, relevant sources of information to construct an energy indicator database are identified. A framework based on a hierarchy of indicators -- spanning from aggregate, macro-level metrics to disaggregated end-use-level metrics -- is presented to help shape the understanding of assessing energy efficiency. For each sector of activity -- industry, commercial, residential, agriculture, and transport -- indicators are presented and recommendations to distinguish the different factors affecting energy use are highlighted. The booklet specifically addresses issues relevant to developing indicators where activity is a major driver of energy demand. A companion spreadsheet tool is available upon request.
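The hierarchy of indicators described above bottoms out in simple energy-intensity ratios. A minimal sketch of computing per-sector intensities and a crude aggregate, assuming for illustration that the activity measures share a common unit (in practice each sector uses its own activity driver and the aggregate requires careful normalization):

```python
def sector_intensities(data):
    """data: {sector: (energy, activity)}. Returns per-sector energy
    intensity (energy / activity) and a naive aggregate intensity."""
    intensities = {s: e / a for s, (e, a) in data.items()}
    total_energy = sum(e for e, _ in data.values())
    total_activity = sum(a for _, a in data.values())
    return intensities, total_energy / total_activity
```

For instance, an industry sector using 100 PJ over 50 activity units and a transport sector using 50 PJ over 50 units yield intensities of 2.0 and 1.0, with a naive aggregate of 1.5.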

Synchrotron radiation has become an indispensable research tool for a growing number of scientists in a seemingly ever-expanding number of disciplines. We can thank the European Synchrotron Radiation Facility (ESRF) in Grenoble for taking an innovative step toward achieving the educational goal of explaining the nature and benefits of synchrotron radiation to audiences ranging from the general public (including students) to government officials to scientists who may be unfamiliar with x-ray techniques and synchrotron radiation. ESRF is the driving force behind a new CD-ROM, playable on both PCs and Macs, titled "Synchrotron light to explore matter." Published by Springer-Verlag, the CD contains both English and French versions of a comprehensive overview of the subject.

AE (Abstract Execution) is a tracing system developed by Larus and Ball which is incorporated as part of the Gnu C compiler [3]. Its goal is to generate very small traces which can be saved and then reused for multiple simulation runs. The modified compiler actually produces two executable programs. The first is the modified application. In addition to normal compilation, the compiler uses the notion of abstract execution to insert tracing code in the application code. Abstract execution is based upon control-flow tracing to reduce the amount of trace code necessary. The resulting trace produced by the modified application is only a tiny part of the full trace. This allows traces representing long execution runs to be saved on disk. The compiler also produces an application-specific trace regeneration program. The regeneration program is a post-processing tool which accepts the compacted trace and outputs the full execution trace. The tracing overhead, including the cost of saving the compacted trace...
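The core idea, recording only branch outcomes and regenerating the full trace by re-walking the control flow, can be sketched on a toy program. The block names and loop below are invented for illustration and do not correspond to AE's actual trace format:

```python
def instrumented_run(n):
    """'Instrumented application': classify 0..n-1 as even or odd, but
    record only one bit per conditional branch instead of full block IDs."""
    bits = []
    for i in range(n):
        bits.append(i % 2 == 0)  # branch outcome: taken if even
    return bits

def regenerate(bits):
    """'Regeneration program': re-walk the known control-flow graph using
    the recorded branch bits to recover the full basic-block trace."""
    trace = []
    for taken in bits:
        trace.append("loop_head")
        trace.append("even_block" if taken else "odd_block")
    trace.append("exit")
    return trace
```

The compact trace here is one boolean per iteration, while the regenerated trace lists every basic block executed, which is the asymmetry that lets long runs fit on disk.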

Software Tools:
- WINDOW: for analyzing window thermal and optical performance
- THERM: for analyzing two-dimensional heat transfer through building products
- Optics: for analyzing optical properties of glazing systems
- International Glazing Database: optical data for glazing products used by WINDOW 5.2 and Optics 5.1, including NFRC-approved products
- Complex Glazing Database: a database of shading materials and systems, such as roller shades and venetian blinds, that can be used by WINDOW 6 to calculate thermal and optical characteristics of window products with these shading systems
- COMFEN: a PC program for calculating the heating and cooling energy use, and the visual and thermal comfort, of commercial building facades
- RESFEN: a PC program for calculating the heating and cooling energy use of windows in residential buildings

This document describes the implementation of the topological vertex finding algorithm ZVTOP within the org.lcsim reconstruction and analysis framework. At present, the Java vertexing tools allow users to perform topological vertexing on tracks obtained from a Fast MC simulation. An implementation able to handle fully reconstructed events is being designed from the ground up for longevity and maintainability.

Sheet-metal forming involves a complex strain distribution over the part. The strains consist of tension, compression, and a mix of both. A geometry, the X-Die, has been developed in order to gain insight into the strain behavior of different materials. The X-Die enables strain paths far into the tension/compression region, creating the possibility to extend the experimental base both for definition and for further extrapolation of the Forming Limit Curve (FLC) in the tension/compression region, as well as to evaluate FE-simulation results for the same region. Today, evaluation of cracks is made using the FLC. In conventional test methods, the strains only reach 40% compression (true strain), and often much lower percentages; the FLC for any region beyond these levels is extrapolated from existing data. The experimental test proposed in this work uses the X-Die geometry, with which rates of 70% tension/compression can be reached (point 0.7/-0.7 in the FLC). This extends the region for crack prediction on the compression side of the Forming Limit Diagram (FLD). Furthermore, the strain paths are easy to follow and the limits at which cracks appear can be evaluated. The experimental results also show that the behavior depends on the material quality. Qualities such as Extreme High Strength Steel (EHSS) and aluminum have a limited tension/compression rate due to failure in plane-strain tension. Material qualities with high r-values, e.g. mild steel and High Strength Steel (HSS), reach high tension/compression rates before failure and have regions with clearly defined strain signatures. This is favorable for comparison with numerical simulations, especially for strain signatures in the tension/compression region.
Furthermore, the experiments did not indicate any limitation in the compression region besides the one defined in the normal procedure for creating an FLC. This geometry is well suited for calibrating simulation results in order to analyze the prediction of strains on the left side of an FLD.

Minute durable plate-like thermal indicators are employed for precision measuring static and dynamic temperatures of well drilling fluids. The indicators are small enough and sufficiently durable to be circulated in the well with drilling fluids during the drilling operation. The indicators include a heat resistant indicating layer, a coacting meltable solid component and a retainer body which serves to unitize each indicator and which may carry permanent indicator identifying indicia. The indicators are recovered from the drilling fluid at ground level by known techniques.

The worst scenario in a drilling operation is a blowout, the uncontrolled flow of formation fluid into the wellbore. Blowouts result in environmental damage with potential risk of injuries and fatalities. Although not all blowouts end in disaster, their outcomes are unknown and should be studied before an operation starts. Plans should be available to prevent blowouts or provide safe and secure ways of controlling the well before the drilling operation starts, including procedures to follow in case of a blowout incident as a proactive measure. A few commercial software packages are available in the industry for dynamic kill and transient modeling. All are proprietary and very complex, which reduces their flexibility for specific cases. The purpose of this study is to develop a pseudo-transient hydraulic simulator for dynamic kill operations. The concept is to treat the flow of each phase as a single-phase flow; the summation of the hydrostatic and frictional pressure of each phase determines the bottomhole pressure during the dynamic kill operation. The simulator should be versatile and capable of handling special cases that may be encountered during blowouts. Its main features include quick and robust simulation, fluid properties corrected for pressure and temperature, sensitivity analysis through slide bars, and the ability to handle a variety of wellbore trajectories. The results from the proposed simulator were compared to those of the commercial software OLGA ABC and were in agreement. It is recommended to apply the simulator to operations with required kill-fluid volumes of one to two wellbore volumes.
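The summation of hydrostatic and frictional pressure can be sketched for a single-phase column. The segment representation and constant friction gradient below are illustrative assumptions, not the simulator's actual model:

```python
G = 9.81  # gravitational acceleration, m/s^2

def bottomhole_pressure(segments):
    """segments: list of (density_kg_per_m3, length_m, friction_grad_pa_per_m)
    along the wellbore. Bottomhole pressure is the sum over segments of the
    hydrostatic head (rho * g * L) plus the frictional loss (gradient * L)."""
    return sum(rho * G * length + fgrad * length
               for rho, length, fgrad in segments)
```

A 1,000 m column of water-density fluid with no friction gives about 9.81 MPa; adding a 100 Pa/m friction gradient adds another 0.1 MPa.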

This indicator describes emissions of greenhouse gases in the United States and its territories between 1990 and 2010. This indicator reports emissions of greenhouse gases (GHGs) according to their global warming potential, a measure of how much a given amount of the GHG is estimated to contribute to global warming over a selected period of time. For the purposes of comparison, global warming potential values are given in relation to carbon dioxide (CO2) and are expressed in terms of CO2 equivalents. Components of this indicator include:
- U.S. GHG emissions by gas (Figure 1)
- U.S. GHG emissions and sinks by economic sector (Figure 2)
- U.S. GHG emissions per capita and per dollar of GDP (Figure 3)
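Converting per-gas emissions to CO2 equivalents is a weighted sum over global warming potentials. The GWP values below are 100-year values from IPCC AR4, used here only for illustration; the indicator itself may use a different set:

```python
# Assumed 100-year global warming potentials (IPCC AR4), for illustration.
GWP_100 = {"CO2": 1, "CH4": 25, "N2O": 298}

def co2_equivalent(emissions):
    """emissions: {gas: mass}. Returns the GWP-weighted CO2-equivalent total
    in the same mass unit as the inputs."""
    return sum(mass * GWP_100[gas] for gas, mass in emissions.items())
```

For example, 100 tonnes of CO2 plus 2 tonnes of CH4 total 150 tonnes CO2-equivalent under these weights.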

This indicator describes emissions of greenhouse gases (GHGs) in the United States and its territories between 1990 and 2011. This indicator reports emissions of GHGs according to their global warming potential, a measure of how much a given amount of the GHG is estimated to contribute to global warming over a selected period of time. For the purposes of comparison, global warming potential values are given in relation to carbon dioxide (CO2) and are expressed in terms of CO2 equivalents. Components of this indicator include:
- U.S. GHG emissions by gas (Figure 1)
- U.S. GHG emissions and sinks by economic sector (Figure 2)
- U.S. GHG emissions per capita and per dollar of GDP (Figure 3)

This article reports that a new tool has been designed and successfully tested that can determine in which direction from a borehole a particular fracture is located. Albuquerque-based Sandia National Laboratories tested the new tool; the prototype was built by Southwest Research Institute of San Antonio. During field tests, the tool detected simulated fractures more than 30 ft away from a test borehole. It determines fracture direction by transmitting highly directional and powerful radar pulses in a known direction. The pulses last eight billionths of a second, and their frequency spectrum ranges up to the VHF (very high frequency) band. Discontinuities in the rock interrupt and reflect radar signals, so that a signal's return to the tool indicates the presence of fractures. The return signal's time delay translates into distance from the borehole. The transmitter and receiver rotate in place, permitting the tool to scan for fractures in all directions.
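The conversion from a return signal's time delay to distance halves the two-way travel time and uses the pulse speed in rock, which is slower than in vacuum by the square root of the rock's relative permittivity. The permittivity value in the usage note is an illustrative assumption, not a figure from the article:

```python
import math

C_VACUUM = 299_792_458.0  # speed of light in vacuum, m/s

def fracture_distance(two_way_delay_s, relative_permittivity):
    """Radar pulse speed in rock is c / sqrt(relative permittivity);
    the one-way distance is half the two-way travel time times that speed."""
    v = C_VACUUM / math.sqrt(relative_permittivity)
    return v * two_way_delay_s / 2.0
```

With an assumed relative permittivity of 4, a 400 ns round trip corresponds to roughly 30 m.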

Energy use indices (EUIs) and associated coefficients of variation are computed for major industry categories for electricity and natural gas use in small and medium-sized plants in the U.S. Standard deviations often exceed the average EUI for an energy type, with coefficients of variation averaging 290% for 8,200 plants from all areas of the continental U.S. Data from milder climates appear more scattered than data from colder climates; for example, the ratio of the average coefficient of variation for all industry types in warm versus cold regions of the U.S. is generally greater than unity. The scatter may have several explanations, including climate, plant-area accounting, and the influence of low-cost energy and low-cost buildings in the southern U.S. This analysis uses the electricity and natural gas consumption and area data of manufacturing plants available in the U.S. Department of Energy's national Industrial Assessment Center database.
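An energy use index and its coefficient of variation can be computed as below. This is a generic sketch of the statistics named in the abstract, not the study's exact procedure:

```python
import math

def eui_stats(energies, areas):
    """Per-plant energy use indices (energy / floor area), their mean,
    and the coefficient of variation as a percentage of the mean."""
    euis = [e / a for e, a in zip(energies, areas)]
    mean = sum(euis) / len(euis)
    variance = sum((x - mean) ** 2 for x in euis) / len(euis)
    cv_percent = 100.0 * math.sqrt(variance) / mean
    return euis, mean, cv_percent
```

Two unit-area plants using 100 and 300 kWh give a mean EUI of 200 and a coefficient of variation of 50%; a CV near 290%, as the study reports, means the standard deviation dwarfs the mean.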

OnGrid Tool The OnGrid Tool is an Excel-based application that assists in PV system sizing, pricing, and quoting. It enables solar installers to gather a customer's utility and site information quickly and efficiently. It calculates optimum system size, electricity bill savings, system incentives, and system net costs. It then calculates various economic benefit factors and produces a variety of proposals and quotes. The OnGrid Tool includes rates, incentives, and tax calculations, and estimates system performance, including all performance factors and shading data. Keywords: solar, financial, payback, analysis, sales, tool, software, economics, proposal. Validation/Testing: The software runs through Microsoft Excel and results are available immediately. OnGrid uses the most credible sources for industry updates.

Possible alternative Solar Indices which could either be a perturbation from the currently defined Solar Index or possible indices based on current technologies for other media markets are discussed. An overview is given of the current project, including the logic that was utilized in defining its current structure and then alternative indices and definitions are presented and finally, recommendations are made for adopting alternative indices.

Abstract. Recently, progress indicators have been proposed for SQL queries in RDBMSs. All previously proposed progress indicators consider each query in isolation, ignoring the impact simultaneously running queries have on each other’s performance. In this paper, we explore a multi-query progress indicator, which explicitly considers concurrently running queries and even queries predicted to arrive in the future when producing its estimates. We demonstrate that multi-query progress indicators can provide more accurate estimates than single-query progress indicators. Moreover, we extend the use of progress indicators beyond being a GUI tool and show how to apply multi-query progress indicators to workload management. We report on an initial implementation of a multi-query progress indicator in PostgreSQL and experiments with its use both for estimating remaining query execution time and for workload management. 1
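
The core distinction the paper draws can be sketched in a few lines: a single-query indicator divides remaining work by the query's standalone processing speed, while a multi-query indicator discounts that speed by the load it expects to share the system with. The equal-sharing model and all names below are simplifying assumptions, not the paper's actual estimator.

```python
def remaining_time_single(work_left, standalone_speed):
    """Naive estimate: the query runs as if it had the system to itself."""
    return work_left / standalone_speed

def remaining_time_multi(work_left, standalone_speed, concurrent_queries):
    """Concurrency-aware estimate: throughput is split (here, evenly)
    among all queries expected to run over the remaining interval."""
    effective_speed = standalone_speed / concurrent_queries
    return work_left / effective_speed

alone = remaining_time_single(1000.0, 50.0)      # 1000 units at 50/s
shared = remaining_time_multi(1000.0, 50.0, 4)   # same query, 3 peers
```

Even this toy model shows why the single-query estimate can be badly optimistic on a loaded system, which is what makes the multi-query estimates useful for workload management.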

Abstract: Experimental measurements show that Internet traffic exhibits self-similarity and long-range dependence (LRD). A delicate issue is the estimation of traffic statistical quantities that characterize self-similarity and LRD, such as the Hurst parameter H. In this paper, we propose to use the Modified Allan Variance (MAVAR), a well-known time-domain tool originally studied for frequency stability characterization, for estimating the power-law spectrum and thus the H parameter of LRD traffic time series. This novel method is validated by comparison to one of the most widely adopted algorithms for analyzing LRD traffic: the log-scale diagram technique based on wavelet analysis. Both methods are applied to pseudo-random data series, generated with known values of H. MAVAR exhibits outstanding accuracy in estimating H, better than the classical log-scale method. Finally, both techniques are applied to a real IP traffic trace, providing a further example of the capabilities of MAVAR. Index Terms: Fractals, fractional noise, Internet, long-range dependence, self-similarity, traffic control (communication). Work partially supported by Ministero dell’Istruzione, dell’Università e della Ricerca (MIUR), Italy, under FIRB project TANGO.
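
To make the estimation task concrete, here is not MAVAR itself but the classic aggregated-variance method, one of the standard baselines for Hurst estimation: for a self-similar series, the variance of block means over blocks of size m scales as m^(2H-2), so H falls out of the slope of log(variance) versus log(m). This is a sketch of a different, simpler estimator than the paper's.

```python
import random, math

def hurst_aggregated_variance(x, block_sizes=(1, 2, 4, 8, 16, 32)):
    """Estimate H from the scaling of block-mean variances."""
    logs = []
    for m in block_sizes:
        n_blocks = len(x) // m
        means = [sum(x[i * m:(i + 1) * m]) / m for i in range(n_blocks)]
        mu = sum(means) / n_blocks
        var = sum((v - mu) ** 2 for v in means) / (n_blocks - 1)
        logs.append((math.log(m), math.log(var)))
    # Least-squares slope of log(variance) against log(m).
    n = len(logs)
    sx = sum(a for a, _ in logs); sy = sum(b for _, b in logs)
    sxx = sum(a * a for a, _ in logs); sxy = sum(a * b for a, b in logs)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return 1.0 + slope / 2.0  # variance ~ m^(2H-2), so H = 1 + slope/2

# White noise has no long-range dependence, so H should come out near 0.5:
random.seed(7)
h = hurst_aggregated_variance([random.gauss(0, 1) for _ in range(8192)])
```

The paper's point is that MAVAR estimates the same exponent with better accuracy than slope-fitting methods of this simple kind.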

Moisture-sensitive systems to measure and indicate the maximum level of humidity exposure are discussed. A chemical indicator utilizing deliquescent salts and water-soluble dyes provides an irreversible color change at discrete levels of relative humidity. To provide indication of the time at which the exposure occurs, a circuit employing a resistive-type sensor was developed. A small, commercially available sensor is used in a portable probe to detect humidity leaks into controlled areas.

The Federal Railroad Administration maintains a file of carrier-reported accidents and incidents that meet threshold criteria for damage cost and/or casualties. Using a five-year period from this data base, an investigation was conducted into the relationship between quantifiable risk factors and accident frequency and severity. Specific objectives were to identify key variables in accidents, formulate a model to predict future accidents, and assess the relative importance of these variables from the perspective of routing and shipping decision making. The temporal factors YEAR and MONTH were found to be significant predictors of risk; accident severity was greatest for accidents caused by track and roadbed defects. Train speed was an indicator of accident severity; track class and trailing tonnage were inversely proportional to accident severity. Investigation of the data base is continuing, with a final report expected by late summer. 15 refs., 1 fig., 10 tabs.

Both environmental indicators and multi-metric indices are useful for describing baseline conditions and qualitatively predicting the cumulative consequences of multiple actions. Several examples and case studies associated with indicators and/or indices are presented herein. They can be easily modified for usage in CEAM. Habitat suitability models reflect special indices related to habitat needs and quality for specific species or broad habitat types. Such models have been used to address direct and indirect effects, and with some modification, they can be also used to address cumulative effects of multiple actions. This review has indicated that there are numerous examples of such tools which have been or could be used in both EIA and CEAM. Some key lessons are: (1) in conducting CEAM studies, it is useful to think from the mindset that 'I am the VEC or indicator, and what is my historical and current condition and how have I, or will I, be affected by multiple past, present, and future actions?'; (2) due to the likely absence of detailed information on future actions, the described tools can still be used to 'predict' future conditions by focusing on qualitative up-or-down changes in individual indicators or indices with their aggregated displays; and (3) numerous regional and site-specific tools are currently available, with one example being indices of biological integrity for specific watersheds and water bodies. Such tools, even though they may not have been developed for CEAM usage, can certainly benefit CEAM studies and practice. Finally, usage of selected and appropriate tools as described herein can aid in conducting science-based, systematic, and documentable CEAM studies.

To summarize, Consortium for Electric Reliability Technology Solutions (CERTS) is engaged in a multi-year program of public interest R&D to develop and prototype software tools that will enhance system reliability during the transition to competitive markets. The core philosophy embedded in the design of these tools is the recognition that in the future reliability will be provided through market operations, not the decisions of central planners. Embracing this philosophy calls for tools that: (1) Recognize that the game has moved from modeling machine and engineering analysis to simulating markets to understand the impacts on reliability (and vice versa); (2) Provide real-time data and support information transparency toward enhancing the ability of operators and market participants to quickly grasp, analyze, and act effectively on information; (3) Allow operators, in particular, to measure, monitor, assess, and predict both system performance as well as the performance of market participants; and (4) Allow rapid incorporation of the latest sensing, data communication, computing, visualization, and algorithmic techniques and technologies.

With the extensive inclusion of documents, especially text, in business systems, data mining does not cover the full scope of Business Intelligence. Data mining cannot extract useful details from large collections of unstructured and semi-structured written materials based on natural language. The most pressing issue is to draw potential business intelligence from text. In order to gain competitive advantages for the business, it is necessary to develop a powerful new tool, text mining, to expand the scope of business intelligence. In this paper, we work out the strong points of text mining in extracting business intelligence from the huge amount of textual information sources within business systems. We apply text mining to each stage of Business Intelligence systems to show that text mining is a powerful tool for expanding the scope of BI. After reviewing basic definitions and some related technologies, we discuss their relationship to, and benefits for, text mining. Some examples and applications of text mining are also given. The motivation is to develop a new approach to effective and efficient textual information analysis, and thereby to expand the scope of Business Intelligence with text mining.

We describe the tools and interfaces created by the AGEDIS project, a European Commission sponsored project for the creation of a methodology and tools for automated model driven test generation and execution for distributed systems. The project includes ... Keywords: UML modeling, automated test generation, coverage analysis, defect analysis, test execution framework, validation

DOE Sponsored Tools The Department of Energy sponsors continued development of a variety of building energy software tools. See the following for more information about software tools now under development: Whole-Building Energy Performance Simulation EnergyPlus A new-generation building energy simulation program from the creators of BLAST and DOE-2. DOE-2 An hourly, whole-building energy analysis program which calculates energy performance and life-cycle cost of operation. The current version is DOE-2.1E. Building Design Advisor Provides building decision-makers with the energy-related information they need beginning in the initial, schematic phases of building design through the detailed specification of building components and systems. SPARK Models complex building envelopes and mechanical systems that are beyond

A photomultiplier tube saturation indicator is formed by supplying a supplemental light source, typically a light emitting diode (LED), adjacent to the photomultiplier tube. A switch allows the light source to be activated. The light is forwarded to the photomultiplier tube by an optical fiber. If the probe is properly light tight, then a meter attached to the indicator will register the light from the LED. If the probe is no longer light tight, and the saturation indicator is saturated, no signal will be registered when the LED is activated. This photomultiplier tube is used with alpha contamination probes.

This article presents a power-spectrum analysis of 2,350 measurements of the $^{90}$Sr/$^{90}$Y decay process acquired over the interval 4 August 2002 to 6 February 2009 at the Lomonosov Moscow State University (LMSU). As we have found for other long sequences of decay measurements, the power spectrum is dominated by a very strong annual oscillation. However, we also find a set of low-frequency peaks, ranging from 0.26 year$^{-1}$ to 3.98 year$^{-1}$, which are very similar to an array of peaks in a power spectrum formed from Mt Wilson solar diameter measurements. The Mt Wilson measurements have been interpreted in terms of r-mode oscillations in a region where the sidereal rotation frequency is 12.08 year$^{-1}$. We find that the LMSU measurements may also be attributed to the same type of r-mode oscillations in a solar region with the same sidereal rotation frequency. We propose that these oscillations occur in an inner tachocline that separates the radiative zone from a more slowly rotating solar core.
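
The first step of such an analysis, locating the dominant oscillation in a power spectrum, can be sketched with a plain discrete periodogram. The decay record here is synthetic (two years of evenly sampled daily values carrying a one-cycle-per-year component); the LMSU measurements are unevenly spaced in time and would normally call for a Lomb-Scargle periodogram instead, so this is illustrative only.

```python
import cmath, math

def periodogram(x):
    """Discrete power spectrum of a real series (mean removed).
    powers[k-1] is the power at k cycles per record length."""
    n = len(x)
    mean = sum(x) / n
    x = [v - mean for v in x]
    powers = []
    for k in range(1, n // 2):
        s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        powers.append(abs(s) ** 2 / n)
    return powers

days = 2 * 365  # two years of daily samples
series = [1.0 + 0.01 * math.sin(2 * math.pi * d / 365.0) for d in range(days)]
p = periodogram(series)
peak_cycles = p.index(max(p)) + 1        # cycles over the whole record
peak_freq_per_year = peak_cycles / 2.0   # convert to cycles per year
```

A strong annual oscillation shows up as exactly this kind of dominant peak at 1 year^-1, with the lower-frequency r-mode candidates appearing as smaller peaks below and around it.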

POWER Tool--Planning, Optimization, Waste Estimating and Resourcing tool, a hand-held field estimating unit and relational database software tool for optimizing disassembly and final waste form of contaminated systems and equipment.

A battery condition indicator is described for indicating both the charge used and the life remaining in a rechargeable battery comprising: rate multiplying and counting means for indirectly measuring the charge used by the battery between charges; means for supplying variable rate clock pulses to the rate multiplying and counting means, the rate of the clock pulses being a function of whether a high current consumption load is connected to the battery or not; timing means for measuring the total time in service of the battery; charge used display means responsive to the rate multiplying and counting means for providing an indication of the charge remaining in the battery; and age display means responsive to the timing means for providing an indication of the life or age of the battery.

Battery jumper cables provide an effective means to connect a charged battery to a discharged battery. However, the electrodes of the batteries must be properly connected for charging to occur and to avoid damage to the batteries. A battery polarity indicator is interposed between a set of battery jumper cables to provide a visual/aural indication of relative battery polarity as well as a safety circuit to prevent electrical connection where polarities are reversed.

This patent relates to a humidity indicator having particles of colored dye distributed over the surface of a dry, deliquescent salt of a neutral color. When exposed to a humidity level above that which causes deliquescence of the salt, the dye bleeds through and imparts its developed tincture to the resulting saturated salt solution. On dehydration, the dye remains infused throughout the dried salt to present an irreversible indication of the humidity exposure. (auth)

The Southern Energy Efficiency Center (SEEC) was established to substantially increase the deployment of high-performance “beyond-code” buildings across the southern region of the U.S. It is funded by the U.S. Department of Energy (DOE) Building Technologies Program and administered by the National Energy Technology Laboratory. During its first 18-month phase, SEEC worked to expand the use of existing methods, procedures, and tools for building energy efficiency in the marketplace; project efforts included identifying existing tools, ranging from very simple calculators for estimating energy savings to detailed methods for measurement and verification of commercial building energy savings, and defining their technical and practical characteristics. This work is defined under SEEC Subtask 2.4, Expand the Use of Existing Methods and Tools. This report presents preliminary deliverables of this subtask developed and documented by the Energy Systems Laboratory (ESL) for use by the SEEC member state region.
The primary goal of this subtask is to provide the state energy offices with the list of available tools and recommendations for use. By scrutinizing the information gathered, these recommendations have been developed to encourage the use of a number of existing tools that are not widely used, but provide valuable information and insight on the benefits of building energy efficiency in the SEEC member states. The resultant summary spreadsheet will also allow them to choose the appropriate tool, either simple calculators or detailed methods, according to the inquiry.

A temperature change indicator is described which is composed of an enzyme and a substrate for that enzyme suspended in a solid organic solvent or mixture of solvents as a support medium. The organic solvent or solvents are chosen so as to melt at a specific temperature or in a specific temperature range. When the temperature of the indicator is elevated above the chosen, or critical temperature, the solid organic solvent support will melt, and the enzymatic reaction will occur, producing a visually detectable product which is stable to further temperature variation.

Climate change indicators are developed for Vermont in recent decades based on the trends in freeze dates, the length of the growing season, the frozen period of small lakes, and the onset of spring. These trends, which show a consistent pattern ...

This patent describes a battery capacity indicator for providing a continuous indication of battery capacity for a battery powered device. It comprises means for periodically effecting a first and a second positive discharge rate of the battery; voltage measurement means, for measuring the battery terminal voltage at the first and second positive discharge rates during the operation of the device, and for generating a differential battery voltage value in response thereto; memory means for storing a set of predetermined differential battery voltage values and a set of predetermined battery capacity values, each of the set of predetermined differential battery voltage values defining one of the set of predetermined battery capacity values; comparison means, coupled to the memory means and to the voltage measurement means, for comparing the measured differential battery voltage values with the set of predetermined differential battery voltage values, and for selecting the predetermined battery capacity value corresponding thereto.
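
The comparison step the patent describes amounts to a table lookup: a measured differential voltage is matched against stored differential-voltage values to select the corresponding capacity. A sketch with linear interpolation between entries; the table values here are hypothetical, not taken from the patent.

```python
import bisect

# Hypothetical stored table: larger voltage sag under load
# corresponds to lower remaining capacity.
DIFF_MV = [10, 20, 40, 80, 160]      # differential voltage, millivolts
CAPACITY_PCT = [100, 80, 50, 20, 5]  # remaining capacity, percent

def capacity_from_differential(dv_mv):
    """Map a measured differential voltage to remaining capacity,
    interpolating linearly between stored table entries."""
    if dv_mv <= DIFF_MV[0]:
        return CAPACITY_PCT[0]
    if dv_mv >= DIFF_MV[-1]:
        return CAPACITY_PCT[-1]
    i = bisect.bisect_right(DIFF_MV, dv_mv)
    x0, x1 = DIFF_MV[i - 1], DIFF_MV[i]
    y0, y1 = CAPACITY_PCT[i - 1], CAPACITY_PCT[i]
    return y0 + (y1 - y0) * (dv_mv - x0) / (x1 - x0)

cap = capacity_from_differential(30)  # 30 mV sag, between table rows
```

Measuring at two discharge rates and differencing, as the patent does, cancels much of the absolute-voltage drift that makes a single terminal-voltage reading a poor capacity gauge.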

Jumpshot: Performance Visualization Tool Jumpshot is a Java-based visualization tool for postmortem performance analysis. Using Java instead of Tcl/Tk (used in some older visualization tools) improves the portability, maintainability, and functionality of the tool. Jumpshot-4 is the latest viewer and is a total redesign of the graphical tool for SLOG-2. The new scalable logfile format allows the viewer to provide functionality never made possible before. Level-of-detail support through preview drawables provides high-level abstraction of the details without reading huge amounts of data into the graphical display engine. The new Jumpshot allows seamless scrolling from the beginning to the end of the logfile at any zoom level. In addition, new functionalities like

Landscape indicators, when combined with information about environmental conditions (such as habitat potential, biodiversity, carbon and nutrient cycling, and erosion) and socioeconomic forces, can provide insights about changing ecosystem services. They also provide information about opportunities for improving natural resources management. Landscape indicators rely on data regarding land cover, land management and land functionality. Challenges in using landscape indicators to assess change and effects include (1) measures of land management and attributes that are reliable, robust and consistent for all areas on the Earth do not exist, and thus land cover is more frequently utilized; (2) multiple types of land cover and management are often found within a single landscape and are constantly changing, which complicates measurement and interpretation; and (3) while causal analysis is essential for understanding and interpreting changes in indicator values, the interactions among multiple causes and effects over time make accurate attribution among many drivers of change particularly difficult. Because of the complexity, sheer number of variables, and limitations of empirical data on land changes, models are often used to illustrate and estimate values for landscape indicators, and those models have several problems. Recommendations to improve our ability to assess the effects of changes in land management include refinement of questions to be more consistent with available information and the development of data sets based on systematic measurement over time of spatially explicit land qualities such as carbon and nutrient stocks, water and soil quality, net primary productivity, habitat and biodiversity. Well-defined and consistent land-classification systems that are capable of tracking changes in these and other qualities that matter to society need to be developed and deployed. 
Because landscapes are so dynamic, it is crucial to develop ways for the scientific community to work together to collect data and develop tools that will enable better analysis of causes and effects, and to develop robust management recommendations that will increase land's capacity to meet societal needs in a changing world.

There is disclosed a tamper-indicating seal that permits in-the-field inspection and detection of tampering. Said seal comprises a shrinkable tube having a visible pattern of markings which is shrunk over the item to be sealed, and a second transparent tube, having a second visible marking pattern, which is shrunk over the item and the first tube. The relationship between the first and second set of markings produces a pattern so that the seal may not be removed without detection. The seal is particularly applicable to UF/sub 6/ cylinder valves.

An automated system that uses an analytical approach to evaluate Computer-Aided Software Engineering (CASE) tools is currently being developed. This system is referred to as the CASE Tool Evaluation System. The following general criteria will be used: overall tool functionality; tool stability; cost; interfaces with other software; customization; ease of use; output produced; hardware and operating system needs; documentation; training and vendor support; repository interface; methodologies; and vendor stability. In Phase 1, CASE tools that do not meet certain "must-have" characteristics specified by the user will be eliminated. Phase 2 will further reduce the size of the tool list by retaining those tools that possess desirable, but not absolutely necessary, characteristics, also specified by the user. Phase 3 will employ the Analytic Hierarchy Process, developed by Dr. Thomas L. Saaty, to rank the tools. Users will be able to supply tools of their own choosing, in addition to tools that are generated via normal use of the system. All three phases will interact with a database that stores objective information about CASE tools. The use of the Analytic Hierarchy Process (AHP) distinguishes this method of CASE tool evaluation from others. As used in this system, the AHP is a method of breaking down the complex, unstructured problem of selecting a CASE tool into its component evaluation criteria and candidate tools. These criteria and tools are arranged into a hierarchical order. Each criterion and tool is assigned a subjective numerical value (by experts, users, or both). These values are then synthesized to determine which have the highest priority. The result will be a ranked list of CASE tools tailored to the needs and desires of the user.
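
The AHP synthesis step can be sketched for a single criterion: a pairwise-comparison matrix of candidate tools is reduced to a priority vector, its principal eigenvector, found here by power iteration. The comparison judgments below are hypothetical, and a full AHP run would repeat this per criterion, weight by criterion priorities, and check consistency ratios.

```python
def ahp_priorities(pairwise, iters=50):
    """Principal eigenvector of a positive pairwise-comparison matrix,
    normalized to sum to 1, via power iteration."""
    n = len(pairwise)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [v / s for v in w]
    return w

# Hypothetical judgments on one criterion, say "ease of use":
# tool A is rated 3x preferable to B and 5x preferable to C.
matrix = [
    [1.0,     3.0, 5.0],
    [1.0 / 3, 1.0, 2.0],
    [1.0 / 5, 1.0 / 2, 1.0],
]
weights = ahp_priorities(matrix)
```

The reciprocal structure of the matrix (entry [j][i] is 1 over entry [i][j]) is what lets subjective 1-to-9 judgments be synthesized into a defensible ranking.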

This document reports the results of the Office of the Secretary of Defense/Program Analysis & Evaluation (OSD/PA&E) sponsored project to identify how Operations Other Than War (OOTW) tool requirements relate to the Joint Warfare Simulation (JWARS) and, more generally, to joint analytical modeling and simulation (M&S) requirements. It includes recommendations about which OOTW tools (and functionality within tools) should be included in JWARS, which should be managed as joint analytical modeling and simulation (M&S) tools, and which should be left for independent development.

Monte Carlo simulations of simple neutron oil well logging tools in typical geological formations are presented. The simulated tools consist of both 14 MeV pulsed and continuous Am-Be neutron sources with time-gated and continuous gamma ray detectors, respectively. The geological formation consists of pure limestone with 15% absolute porosity over a wide range of oil saturation. The particle transport was performed with the Monte Carlo N-Particle Transport Code System, MCNP-4B. Several gamma ray spectra were obtained at the detector position that allow composition analysis of the formation. In particular, the C/O ratio was analyzed as an indicator of oil saturation. Further calculations are proposed to simulate actual detector responses in order to help understand the relation between the detector response and the formation composition.
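
The C/O indicator itself reduces to integrating counts in two energy windows of the gamma spectrum and taking their ratio. A sketch: the windows are centered on the carbon (4.44 MeV) and oxygen (6.13 MeV) inelastic lines, but the window limits and the spectrum values below are hypothetical, not from the MCNP runs described above.

```python
def window_counts(spectrum, e_lo, e_hi):
    """spectrum: list of (energy_MeV, counts) channel pairs.
    Sum counts whose energy falls inside [e_lo, e_hi]."""
    return sum(c for e, c in spectrum if e_lo <= e <= e_hi)

def c_over_o(spectrum):
    carbon = window_counts(spectrum, 4.2, 4.7)   # around the 4.44 MeV line
    oxygen = window_counts(spectrum, 5.9, 6.4)   # around the 6.13 MeV line
    return carbon / oxygen

# Hypothetical channel data (energy in MeV, counts):
spectrum = [(4.3, 120), (4.44, 400), (4.6, 150), (5.0, 60),
            (6.0, 90), (6.13, 300), (6.3, 110)]
ratio = c_over_o(spectrum)
```

A higher ratio means more carbon relative to oxygen in the formation, which is why it tracks oil saturation in a water/oil-filled limestone.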

Software demonstration: Demand Response Quick Assessment Tool Speaker(s): Peng Xu Date: February 4, 2008 - 12:00pm Location: 90-3122 The potential for utilizing building thermal mass for load shifting and peak demand reduction has been demonstrated in a number of simulation, laboratory, and field studies. The Demand Response Quick Assessment Tools developed at LBNL will be demonstrated. The tool is built on EnergyPlus simulation and is able to evaluate and compare different DR strategies, such as global temperature reset, chiller cycling, supply air temperature reset, etc. A separate EnergyPlus plotting tool will also be demonstrated during this seminar. Users can use the tool to test EnergyPlus models, conduct parametric analysis, or compare multiple EnergyPlus simulation

Software Tools Several software programs are available, either for free or for a nominal charge, that can assist fleet managers and technology developers in assessing the potential impacts of implementing new technologies. Autonomie Autonomie is a Plug-and-Play Powertrain and Vehicle Model Architecture and Development Environment to support the rapid evaluation of new powertrain/propulsion technologies for improving fuel economy through virtual design and analysis in a math-based simulation environment. Developed in partnership with General Motors, Autonomie is an open architecture to support the rapid integration and analysis of powertrain/propulsion systems and technologies for rapid technology sorting and evaluation of fuel economy improvement under dynamic/transient testing conditions. The capability to sort technologies rapidly in a virtual design environment results in faster improvements in real-world fuel consumption by reducing the time necessary to develop and bring new technologies onto our roads.

Solar Tool Makes the process of accurately sizing and positioning overhangs, shading devices, and louvers easy. This software is a must for architects, planners, and building services engineers: anyone who needs to quickly determine the extent of solar penetration into buildings, overshadowing, or the most appropriate means of shading a window. The program uses a flexible, parametric model on which can be placed any number of horizontal, vertical, and detached shades. You can select any date, time, or location, seeing the resulting shadows immediately while interactively manipulating the geometry. You can also choose to automatically optimise the size and shape of any shading device over any range of dates and times you require.

This is a progress report describing the needs of EPRI members, based on a survey, and the availability of Risk Analysis tools. Included in this report is the superposition of the tools available on the needs of the users, using a resolution category scale of Qualitative, Semi-Quantitative, and Fully-Quantitative Risk Analysis. In addition, a brief description is given of the Risk Analysis and supporting tools available.

The MIT Research Reactor (MITR-II) is currently undergoing analysis for the planned conversion from high enriched uranium (HEU) to low enriched uranium (LEU), as part of a global effort to minimize the availability of ...

A tool and a method for attaching a strain gauge to a test specimen by maintaining alignment of, and applying pressure to, the strain gauge during the bonding of the gauge to the specimen. The tool comprises rigid and compliant pads attached to a spring-loaded clamp. The pads are shaped to conform to the specimen surface to which the gauge is to be bonded. The shape of the pads permits the tool to align itself to the specimen and to maintain alignment of the gauge to the specimen during the bond curing process. A simplified method of attaching a strain gauge is provided by use of the tool.

A boring tool and a method of operation are provided for boring two concentric holes of precision diameters and depths in a single operation. The boring tool includes an elongated tool body, a shank for attachment to a standard adjustable boring head which is used on a manual or numerical control milling machine and first and second diametrically opposed cutting edges formed for cutting in opposite directions. The diameter of the elongated tool body is substantially equal to the distance from the first cutting edge tip to the axis of rotation plus the distance from the second cutting edge tip to the axis of rotation. The axis of rotation of the tool is spaced from the tool centerline a distance substantially equal to one-half the distance from the second cutting edge tip to the axis of rotation minus one-half the distance from the first cutting edge tip to the axis of rotation. The method includes the step of inserting the boring tool into the boring head, adjusting the distance between the tool centerline and the tool axis of rotation as described above and boring the two concentric holes.

A boring tool and a method of operation are provided for boring two concentric holes of precision diameters and depths in a single operation. The boring tool includes an elongated tool body, a shank for attachment to a standard adjustable boring head which is used on a manual or numerical control milling machine and first and second diametrically opposed cutting flutes formed for cutting in opposite directions. The diameter of the elongated tool body is substantially equal to the distance from the first flute tip to the axis of rotation plus the distance from the second flute tip to the axis of rotation. The axis of rotation of the tool is spaced from the tool centerline a distance substantially equal to one-half the distance from the second flute tip to the axis of rotation minus one-half the distance from the first flute tip to the axis of rotation. The method includes the step of inserting the boring tool into the boring head, adjusting the distance between the tool centerline and the tool axis of rotation as described above and boring the two concentric holes.

Netcat is one of the most commonly used anti-hacking tools in the world. It reads and writes data across network connections, using the TCP/IP protocol. It is designed to be a reliable "back-end" tool that can be used directly or easily driven by other ... Keywords: Internet, Security
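
Netcat itself is a C command-line program, but its core behaviour, reading and writing raw bytes across a TCP connection, can be sketched with Python's standard socket module (an illustrative stand-in, not netcat's implementation):

```python
import socket
import threading

def echo_server(sock):
    """Accept one connection and echo whatever it receives (a netcat-style back end)."""
    conn, _ = sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

def netcat_send(host, port, payload):
    """Open a TCP connection, write the payload, and read the reply --
    the read/write-across-a-network behaviour netcat provides from the shell."""
    with socket.create_connection((host, port)) as s:
        s.sendall(payload)
        return s.recv(1024)

# Minimal loopback demonstration
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))    # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
t = threading.Thread(target=echo_server, args=(server,))
t.start()
reply = netcat_send("127.0.0.1", port, b"hello")
t.join()
server.close()
```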

Improves reliability and availability via reduced reliance on time-based maintenance by using analytics based on asset health and condition analysis to determine maintenance actions. Enables more effective use of existing infrastructure and data as well as efficient use of maintenance personnel to manage operational ...

The authors describe here the different tools used for top physics analysis in the CDF Collaboration. In particular, they discuss how the jet energy scale, lepton identification, b-tagging algorithms and neural networks help to improve the signal-to-background ratio of the top sample in some cases and to reduce the dominant uncertainties in others. Results using each one of these tools are also presented.

Graphitic packing removal tools are described for removal of the seal rings in one piece from valves and pumps. The packing removal tool has a cylindrical base ring the same size as the packing ring with a surface finish, perforations, knurling or threads for adhesion to the seal ring. Elongated leg shanks are mounted axially along the circumferential center. A slit or slits permit insertion around shafts. A removal tool follower stabilizes the upper portion of the legs to allow a spanner wrench to be used for insertion and removal.

Weather significantly increases variability of reliability indices. This project focuses on exploring statistical correlations between weather parameters and system performance indices using historical utility reliability data and weather data. Using this information, various approaches for normalizing utility performance indices for variability in weather can be developed.
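
A minimal sketch of the kind of analysis described, correlating a weather parameter with a reliability index and removing the linear weather effect, might look like the following (the project's actual normalization method is not stated, so this regression-residual approach is an assumption):

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def weather_normalized(index_values, weather_values):
    """Remove the linear weather effect from a reliability index (e.g. annual
    SAIDI) by regressing the index on a weather parameter (e.g. storm count)
    and keeping the residual plus the long-run mean."""
    n = len(index_values)
    mw = sum(weather_values) / n
    mi = sum(index_values) / n
    slope = (sum((w - mw) * (i - mi) for w, i in zip(weather_values, index_values))
             / sum((w - mw) ** 2 for w in weather_values))
    return [i - slope * (w - mw) for i, w in zip(index_values, weather_values)]
```

For an index driven entirely by weather, the normalized series is flat: the remaining variation is what utility performance comparisons should be based on.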

A tool and a method are disclosed for attaching a strain gauge to a test specimen by maintaining alignment of, and applying pressure to, the strain gauge during the bonding of the gauge to the specimen. The tool comprises rigid and compliant pads attached to a spring-loaded clamp. The pads are shaped to conform to the specimen surface to which the gauge is to be bonded. The shape of the pads permits the tool to align itself to the specimen and to maintain alignment of the gauge to the specimen during the bond curing process. A simplified method of attaching a strain gauge is provided by use of the tool. 6 figs.

There is provided an apparatus for machining surfaces to accuracies within the nanometer range by use of electrical current flow through the contact of the cutting tool with the workpiece as a feedback signal to control depth of cut.

A friction stir welding tool is described and which includes a shank portion; a shoulder portion which is releasably engageable with the shank portion; and a pin which is releasably engageable with the shoulder portion.

Python Tools: Description and Overview. Python is an interpreted, general-purpose, high-level programming language. Various versions of Python are installed on most of the NERSC systems, usually accompanied by computational tools such as numpy and scipy. Using Python on NERSC Systems: to use the Python installed as a module on NERSC systems, run: module load python. To run a Python script on the Hopper and Edison compute nodes, you must set an environment variable, setenv CRAY_ROOTFS DSL, which is set automatically when the python module is loaded. To execute a script on the Hopper or Edison compute nodes dedicated to your job, you need to use aprun: aprun -n 1 python ./hello_world.py or, if the script is executable and contains '#!/usr/bin/env python' as the

Package Development in Windows, from August 13, 2008 (updated November 23, 2012). Slide outline: Why packages?; The Windows tools (the main tools, missing pieces, installing the tools); A sample package; Going further.

Two different experiments were performed in the 8 MWth MELUSINE experimental power pool reactor, aimed at analyzing 1 GWd/t spent fuel pellets doped with several actinides. The goal was to measure the averaged neutron-induced capture cross section in two very different neutron spectra (a PWR-like one and an under-moderated one). This paper summarizes the combined deterministic APOLLO2-stochastic TRIPOLI4 analysis using the JEFF-3.1.1 European nuclear data library. A very good agreement is observed for most of the neutron-induced capture cross sections of actinides, while a clear underestimation of the {sup 241}Am(n,{gamma}) cross section and an accurate validation of its associated isomeric ratio are emphasized. Finally, a possible large resonant fluctuation (a factor of 2.7 with respect to the l=0 resonance total orbital momenta) is suggested for the isomeric ratio. (authors)

Exascale class machines will exhibit a new level of complexity: they will feature an unprecedented number of cores and threads, will most likely be heterogeneous and deeply hierarchical, and offer a range of new hardware techniques (such as speculative threading, transactional memory, programmable prefetching, and programmable accelerators), which all have to be utilized for an application to realize the full potential of the machine. Additionally, users will be faced with less memory per core, fixed total power budgets, and sharply reduced MTBFs. At the same time, it is expected that the complexity of applications will rise sharply for exascale systems, both to implement new science possible at exascale and to exploit the new hardware features necessary to achieve exascale performance. This is particularly true for many of the NNSA codes, which are large and often highly complex integrated simulation codes that push the limits of everything in the system including language features. To overcome these limitations and to enable users to reach exascale performance, users will expect a new generation of tools that address the bottlenecks of exascale machines, that work seamlessly with the (set of) programming models on the target machines, that scale with the machine, that provide automatic analysis capabilities, and that are flexible and modular enough to overcome the complexities and changing demands of the exascale architectures. Further, any tool must be robust enough to handle the complexity of large integrated codes while keeping the user's learning curve low. With the ASC program, in particular the CSSE (Computational Systems and Software Engineering) and CCE (Common Compute Environment) projects, we are working towards a new generation of tools that fulfill these requirements and that provide our users as well as the larger HPC community with the necessary tools, techniques, and methodologies required to make exascale performance a reality.

The chemical indicators evaluated in this study were based on chemical-specific sediment quality guidelines (SQGs) obtained from several sources. SQGs are numeric values intended to help in the interpretation of sediment chemistry data. SQGs are not intended to be a final assessment of environmental condition at a site, but rather to assist in the determination of the potential for biological effects. Numerical SQGs have been developed using both mechanistic and empirical relationships between chemistry and biological effect. Both types of approaches were evaluated in the early phases of the SQO project, but the mechanistic approaches (i.e., equilibrium partitioning) were not included in the final statistical evaluations based on the results of preliminary analyses and the recommendation of the SSC. Three types of empirical chemical indicators were compared and evaluated: established indicators that were based on existing published SQGs that were developed for application on a national level, regional indicators that represent established indicator approaches calibrated to California data, and new indicators developed specifically for this project. All of the chemical indicators were based on chemical mixtures in order to represent the joint effects of multiple chemicals present in a sample. The individual chemical SQGs were integrated using a method specific to each approach to describe mixture effects. The chemicals included in each candidate indicator are shown in Table

Indicators System. What are the goals for the NCA indicators? The vision for the National Climate Assessment (NCA) is to create a system of indicators that will help policy-makers and citizens understand key aspects of our changing climate. Scientific information about physical climate conditions, climate impacts, vulnerabilities, and preparedness will be tracked and compiled. These measures are called indicators. The goals of the Indicators System are to: provide meaningful, authoritative climate-relevant measures about the status, rates, and trends of key physical, ecological, and societal variables and values; inform decisions on management, research, and education at regional to national scales; and identify climate-related conditions and impacts to help develop effective mitigation and adaptation measures.

Criteria and Indicators for Sustainable Woodfuels. Tool Summary: Name: Criteria and Indicators for Sustainable Woodfuels; Agency/Company/Organization: Food and Agriculture Organization of the United Nations; Sector: Land; Focus Area: Biomass; Resource Type: Guide/manual; Website: www.fao.org/docrep/012/i1673e/i1673e00.pdf; References: Sustainable Woodfuels[1]. Overview: "This publication assesses the environmental, social and economic issues as well as the legal and institutional frameworks that can ensure the sustainable production of woodfuels from forests, trees outside forests and other sources. The study continues FAO's long interest in wood energy issues and complements the many other FAO reports on wood energy and

This document presents the work conducted as part of Subtask A.1, High-Performance Glazing, of Task 12 of the IEA Solar Heating and Cooling Program. At the start of the task, the participants agreed that chromogenic technology (switchable glazing) held considerable promise, and that algorithms to accurately model their dynamic behavior were needed. The purpose of this subtask was to develop algorithms that could be incorporated into building energy analysis programs for predicting the thermal and optical performance of switchable windows. The work entailed a review of current techniques for modelling switchable glazing in windows and switchable windows in buildings and methods for improving upon existing modeling approaches. The proposed approaches correct some of the shortcomings in the existing techniques, and could be adapted for use in other similar programs. The proposed approaches generally provide more detailed calculations needed for evaluating the short-term (hourly and daily) impact of switchable windows on the energy and daylighting performance of a building. Examples of the proposed algorithms are included.

The goal of this study was to examine two different software tools designed to account for the environmental impacts of remediation projects. Three case studies from the Savannah River Site (SRS) near Aiken, SC were used to exercise SiteWise (SW) and Sustainable Remediation Tool (SRT) by including both traditional and novel remediation techniques, contaminants, and contaminated media. This study combined retrospective analysis of implemented projects with prospective analysis of options that were not implemented. Input data were derived from engineering plans, project reports, and planning documents with a few factors supplied from calculations based on Life Cycle Assessment (LCA). Conclusions drawn from software output were generally consistent within a tool; both tools identified the same remediation options as the 'best' for a given site. Magnitudes of impacts varied between the two tools, and it was not always possible to identify the source of the disagreement. The tools differed in their quantitative approaches: SRT based impacts on specific contaminants, media, and site geometry and modeled contaminant removal. SW based impacts on processes and equipment instead of chemical modeling. While SW was able to handle greater variety in remediation scenarios, it did not include a measure of the effectiveness of the scenario.

Hand Tool and Portable Power Tool Usage. Introduction: CAT/XSD recognizes that the misuse and improper maintenance of hand tools and portable power tools cause a significant number of injuries to even "experienced" workers. Consequently, CAT/XSD has adopted the following policies and procedures to minimize the hazards associated with the use of such equipment at the APS. These guidelines apply to all use of hand tools and portable power tools by CAT/XSD personnel while performing maintenance or installation activities at the APS. Although CAT/XSD feels that most of the guidelines also apply to tool usage during experimental activities, CAT/XSD will not require that short-term users complete the training described below. Using Tools Safely: If you have not had formal training in the use of common tools, either view

A jar tool is described comprising: a mandrel adapted for connection to a piece of drill string at one end of the tool; a housing axially movable relative to the mandrel and adapted for connection to the drill string at the opposite end of the tool; first and second pairs of abutment faces between the mandrel and the housing defining jar and bump positions of the tool, respectively, the first and second pairs of abutment faces located in an abutment chamber; a piston slidably located between walls of the mandrel and the housing and having valve means operable to regulate the passage of hydraulic fluid from a first chamber on one side of the piston between the mandrel and the housing to a second chamber on the other side of the piston between the mandrel and the housing, the valve means including passages for permitting the hydraulic fluid to flow through the piston and between the piston and the walls of the mandrel and housing; and sealing means between the housing and mandrel operable to seal the first pair of abutment faces from outside contamination in the abutment chamber and seal the abutment chamber from the first and second chambers, the first and second pairs of abutment faces being enclosed within the jar tool.

Sandia National Laboratories has developed several models to analyze potential consequences of homeland security incidents. Two of these models (the National Infrastructure Simulation and Analysis Center Agent-Based Laboratory for Economics, N-ABLE{trademark}, and Loki) simulate detailed facility- and product-level consequences of simulated disruptions to supply chains. Disruptions in supply chains are likely to reduce production of some commodities, which may reduce economic activity across many other types of supply chains throughout the national economy. The detailed nature of Sandia's models means that simulations are limited to specific supply chains in which detailed facility-level data has been collected, but policymakers are often concerned with the national-level economic impacts of supply-chain disruptions. A preliminary input-output methodology has been developed to estimate national-level economic impacts based upon the results of supply-chain-level simulations. This methodology overcomes two primary challenges. First, the methodology must be relatively simple to integrate successfully with existing models; it must be easily understood, easily applied to the supply-chain models without user intervention, and run quickly. The second challenge is more fundamental: the methodology must account for both upstream and downstream impacts that result from supply-chain disruptions. Input-output modeling typically estimates only upstream impacts, but shortages resulting from disruptions in many supply chains (for example, energy, communications, and chemicals) are likely to have large downstream impacts. In overcoming these challenges, the input-output methodology makes strong assumptions about technology and substitution. This paper concludes by applying the methodology to chemical supply chains.
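
The upstream side of a conventional input-output methodology is the Leontief calculation x = (I - A)^-1 * delta_d, where A is the technical-coefficient matrix and delta_d the final-demand shock. A small sketch of that standard calculation follows (the matrix values are invented; Sandia's methodology additionally estimates downstream impacts, which this standard form does not capture):

```python
def leontief_impact(A, delta_demand):
    """Total output change x satisfying x = A x + delta_demand, i.e.
    x = (I - A)^-1 * delta_demand, computed by summing the geometric
    series I + A + A^2 + ... (valid when A's spectral radius is < 1)."""
    n = len(delta_demand)
    x = list(delta_demand)
    term = list(delta_demand)
    for _ in range(200):  # enough iterations to converge for a well-behaved A
        term = [sum(A[i][j] * term[j] for j in range(n)) for i in range(n)]
        x = [x[i] + term[i] for i in range(n)]
    return x

A = [[0.2, 0.1],   # dollars of input from sector i per dollar of output of sector j
     [0.3, 0.4]]
impact = leontief_impact(A, [1.0, 0.0])   # $1 shock to sector 0's final demand
```

For this made-up A, a $1 demand shock to sector 0 propagates upstream into about $1.33 of sector-0 output and $0.67 of sector-1 output.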

Gas condensate reservoirs often exhibit a rapid decline in production with depletion. During early production, liquid dropout accumulates in the near-wellbore area, and this liquid dropout reduces the effective permeability to gas and thereby the well and field productivity. Our primary goal in this research is to understand the dynamics of condensate banking in the near-well region of retrograde gases. We propose a relationship that can be used in determining gas-oil ratios and near-wellbore saturation. The tasks accomplished in this study of gas condensate reservoir behavior include: development of a generalized relationship that allows us to estimate the gas-oil ratio (GOR) and the effect of condensate banking close to production wells. This simple relationship allows us to estimate GOR and condensate banking at any time by using basic data such as saturation pressure, field pressure, gas injection rates, and gas production rates. We recognize and acknowledge that further work is required in testing and improving this relation. We suggest the addition of molecular weights (or specific gravity) of the reservoir fluid to improve the correlative relationship. Comparison of field performance under a variety of production scenarios including natural depletion, gas cycling, water injection, and the injection of different gases (methane, nitrogen and carbon dioxide). We provide a discussion of the effects of different production schemes upon saturation profiles and saturation histories, as well as the influence of various production-injection schemes on well and field productivity. We also include an analysis of the compositional changes driven by injection and the influence of these changes on reservoir performance.

We develop and calibrate a characteristic waveform extraction tool whose major improvements and corrections of prior versions allow satisfaction of the accuracy standards required for advanced LIGO data analysis. The extraction tool uses a characteristic evolution code to propagate numerical data on an inner worldtube supplied by a 3+1 Cauchy evolution to obtain the gravitational waveform at null infinity. With the new extraction tool, high accuracy and convergence of the numerical error can be demonstrated for an inspiral and merger of mass M binary black holes even for an extraction worldtube radius as small as R = 20M. The tool provides a means for unambiguous comparison between waveforms generated by evolution codes based upon different formulations of the Einstein equations and based upon different numerical approximations.

ZEBO is a decision support tool for the study and design of net zero energy buildings (NZEBs) in hot climates during early design phases. The aim of this tool is to facilitate and integrate the use of building energy performance simulation for architects. The tool embraces a graphical user interface for EnergyPlus. It allows for sensitivity analysis of possible variations of NZEB design parameters and elements during the early design phases in hot climates. Its added value resides in its ability to inform NZEB design decisions before they are made. The tool is contextual and is based on an embedded benchmark model and database for Egyptian residential buildings, which includes local materials and construction and allows the generation of code-compliant design

A bumping and jarring tool for a well drill string is provided having a telescoping housing and mandrel with an annular hydraulic chamber therebetween divided into two portions. The lower portion is exposed to the ambient hydrostatic well pressure and the upper portion varies in volume as the tool opens and closes. When the tool is opened a jar is produced by a valve between the chamber portions which restricts hydraulic fluid flow between the chamber portions resulting in a pressure drop in the upper portion and by a pressure increasing element in the lower portion which increases the pressure therein while the pressure drops in the upper portion. When the valve permits unrestricted flow again, the sudden return to pressure equilibrium produces the jar stroke.

Provide Data and Tools. With the advance of observational capabilities, computational power, and scientific research, there is both an opportunity for scientific progress in the study of the Earth system and a need to manage the data and information generated about it. As computational capabilities improve, data volume will continue to grow at an accelerating rate. It is crucial to continue to collect and store these records, but this increased output will also present data management challenges. Addressing this data-volume challenge will require advanced technology to link users to the various data providers and cloud-based tools to facilitate collaboration. In the coming decade, USGCRP will take a leadership role in coordinating these networks, by providing shared data access, analytic capabilities, and modeling frameworks to support integrated research and decision support.

Issues surrounding the development, application and interpretation of energy intensity indicators are a continuing source of debate in the field of energy policy analysis. Although economic energy intensity indicators still dominate intensity/efficiency studies, the use of physical energy intensity indicators is on the rise. In the past, physical energy intensity indicators were not employed since it was often impossible to develop aggregate (sector-level or nation-wide) measures of physical energy intensity due to the difficulties associated with adding diverse physical products. This paper presents the results of research conducted specifically to address this "aggregation" problem. The research focused on the development of the Composite Indicator Approach, a simple, practical, alternative method for calculating aggregate physical energy intensity indicators. In this paper, the Composite Indicator Approach is used to develop physical energy intensity indicators for the Canadian industrial and manufacturing sectors, and is then compared to other existing methods of aggregation. The physical composite indicators developed using this approach are also evaluated in terms of their reliability and overall usefulness. Both comparisons suggest that the Composite Indicator Approach can be a useful, and ultimately suitable, way of addressing the aggregation problem typically associated with heterogeneous sectors of the economy.
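
As an illustration of the aggregation problem only: one generic way to make heterogeneous physical outputs addable is to weight each product's output by a fixed base-year energy intensity. Whether this matches the paper's Composite Indicator Approach is an assumption; the sketch below shows only the generic weighted form:

```python
def composite_intensity(outputs, energies, base_intensities):
    """Aggregate physical energy intensity for a heterogeneous sector.

    outputs:          physical output of each product (e.g. tonnes)
    energies:         energy used by each product line (e.g. GJ)
    base_intensities: fixed base-year GJ-per-unit weights that make the
                      diverse physical outputs addable
    Returns total energy divided by the composite (weighted) output, giving
    a unitless intensity index relative to the base year.
    """
    composite_output = sum(q * w for q, w in zip(outputs, base_intensities))
    return sum(energies) / composite_output
```

A value below 1.0 indicates the sector now uses less energy per weighted unit of physical output than in the base year.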

In large companies, Online Analytical Processing (OLAP) technologies are widely used by business analysts as a decision-support tool. The exploration of the data is performed using operators such as drill-down, roll-up or slice. While ... Keywords: OLAP mining, Oracle 10g, data cubes, data mining, discovery-driven exploration, embedded indicators, online analytical processing, statistical associations
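
The roll-up and slice operators mentioned can be illustrated on a toy fact table (the table and function names are invented for illustration):

```python
from collections import defaultdict

# A tiny fact table: (region, city, product, sales)
facts = [
    ("EU", "Paris",  "A", 10),
    ("EU", "Paris",  "B",  5),
    ("EU", "Berlin", "A",  7),
    ("US", "Boston", "A",  3),
]

def roll_up(facts, level):
    """Aggregate sales upward along the geography dimension.
    level 0 = region totals, level 1 = city totals (drill-down restores detail)."""
    totals = defaultdict(int)
    for region, city, product, sales in facts:
        key = (region,) if level == 0 else (region, city)
        totals[key] += sales
    return dict(totals)

def slice_cube(facts, product):
    """Fix one dimension (product) -- the OLAP 'slice' operator."""
    return [row for row in facts if row[2] == product]
```

Rolling up to region level sums the EU cities into a single figure; slicing on product "A" keeps only the rows for that product across all locations.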

Sensitivity analysis is a powerful technique used to determine the robustness, reliability and efficiency of a model. The main problem in this procedure is evaluating the total sensitivity indices that measure a parameter's main effect and all the interactions ... Keywords: Adaptive Monte Carlo algorithm, Global sensitivity indices, Multidimensional numerical integration, Sensitivity analysis
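
A common Monte Carlo route to such indices is the "pick-freeze" estimator for the first-order Sobol index; whether this matches the paper's adaptive algorithm is an assumption. A small sketch on a linear test model with known indices (S_0 = 0.8, S_1 = 0.2):

```python
import random

def first_order_index(f, dim, i, n=20000, seed=1):
    """Monte Carlo 'pick-freeze' estimate of the first-order Sobol index S_i:
    correlate f(X) with f(X') where X' shares only coordinate i with X."""
    rng = random.Random(seed)
    ys, ys_frozen = [], []
    for _ in range(n):
        x = [rng.random() for _ in range(dim)]
        x2 = [rng.random() for _ in range(dim)]
        x2[i] = x[i]  # freeze coordinate i, resample all others
        ys.append(f(x))
        ys_frozen.append(f(x2))
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / n
    cov = sum(a * b for a, b in zip(ys, ys_frozen)) / n - mean ** 2
    return cov / var

# Linear test model with independent uniform inputs: variance splits 4:1,
# so the analytic first-order indices are S_0 = 0.8 and S_1 = 0.2.
model = lambda x: 2 * x[0] + x[1]
```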

The United States Council for Automotive Research (USCAR) has formed a partnership with the Idaho National Engineering Laboratory (INEL) to develop a process for the rapid production of low-cost tooling based on spray forming technology developed at the INEL. Phase 1 of the program will involve bench-scale system development, materials characterization, and process optimization. In Phase 2, prototype systems will be designed, constructed, evaluated, and optimized. Process control and other issues that influence commercialization will be addressed during this phase of the project. Technology transfer to USCAR, or a tooling vendor selected by USCAR, will be accomplished during Phase 3. The approach INEL is using to produce tooling, such as plastic injection molds and stamping dies, combines rapid solidification processing and net-shape materials processing into a single step. A bulk liquid metal is pressure-fed into a de Laval spray nozzle transporting a high velocity, high temperature inert gas. The gas jet disintegrates the metal into fine droplets and deposits them onto a tool pattern made from materials such as plastic, wax, clay, ceramics, and metals. The approach is compatible with solid freeform fabrication techniques such as stereolithography, selective laser sintering, and laminated object manufacturing. Heat is extracted rapidly, in-flight, by convection as the spray jet entrains cool inert gas to produce undercooled and semi-solid droplets. At the pattern, the droplets weld together while replicating the shape and surface features of the pattern. Tool formation is rapid; deposition rates in excess of 1 ton/h have been demonstrated for bench-scale nozzles.

Policy Applications of Energy Indicators. Tool Summary: Name: Policy Applications of Energy Indicators; Agency/Company/Organization: International Atomic Energy Agency; Sector: Energy; Topics: Co-benefits assessment, Policies/deployment programs; Resource Type: Publications; Website: www.blackwell-synergy.com/toc/narf/29/4; References: Policy Applications of Energy Indicators[1]. "In November 2005, a special issue of the Natural Resources Forum was published on 'Policy Applications of Energy Indicators' based on the ISED programme. The special issue features articles highlighting how energy indicators were developed, refined and tested by international and regional agencies and national experts to

A soil removal tool is provided for removing radioactive soil, rock and other debris from the bottom of an excavation, while permitting the operator to be located outside of a containment for that excavation. The tool includes a fixed jaw, secured to one end of an elongate pipe, which cooperates with a movable jaw pivotably mounted on the pipe. Movement of the movable jaw is controlled by a pneumatic cylinder mounted on the pipe. The actuator rod of the pneumatic cylinder is connected to a collar which is slidably mounted on the pipe and forms part of the pivotable mounting assembly for the movable jaw. Air is supplied to the pneumatic cylinder through a handle connected to the pipe, under the control of an actuator valve mounted on the handle, to provide movement of the movable jaw.

A soil removal tool is provided for removing radioactive soil, rock and other debris from the bottom of an excavation, while permitting the operator to be located outside of a containment for that excavation. The tool includes a fixed jaw, secured to one end of an elongate pipe, which cooperates with a movable jaw pivotably mounted on the pipe. Movement of the movable jaw is controlled by a pneumatic cylinder mounted on the pipe. The actuator rod of the pneumatic cylinder is connected to a collar which is slidably mounted on the pipe and forms part of the pivotable mounting assembly for the movable jaw. Air is supplied to the pneumatic cylinder through a handle connected to the pipe, under the control of an actuator valve mounted on the handle, to provide movement of the movable jaw. 3 figs.

Are you a passionate learner? Continual Learning Program: Employee Development provides tools and resources for promoting continual learning, individual development, and strategic workforce development at the Department of Energy. It includes three learning sessions described below. Do you work in a learning organization? Will you invest in your own development, and are you looking for a way to get started? Let us help. Keep Learning Even with Tight Budgets: when you attend this session, you will be able to discuss the importance of continual learning, explain an easy way to approach self-development, create blended learning strategies for development, and use a tool for planning your IDP. Let's Work on Your IDP: when you attend this session, you will be able to:

A retrieving tool is described to securely grasp an object for emplacement in, or withdrawal from, an elongated tube. The object is grasped by hooks actuated by a wedge and cam mechanism. The mechanism, on the end of a long rod-like structure, is controlled by levers or bars at the access end of the tube. This device is particularly useful for positioning fuel elements within a reactor core.

Analysts seeking evidence of rising inflation often focus on the movements of a single indicator—an increase in the price of gold, for example, or a decline in the unemployment rate. But simple statistical tests reveal that such indicators, used in isolation, have very limited predictive power. Controlling inflation—perhaps the most vital responsibility of the Federal Reserve—requires a high degree of foresight. Because policy actions to curb inflation typically take effect only after a long lag, the Federal Reserve needs to know in advance when inflation is likely to rise. Consequently, to understand where prices are headed and what policy steps are appropriate, policymakers turn to forecasts of inflation. In this edition of Current Issues, we consider the usefulness of certain “indicator variables” in forecasting inflation. These variables—which include commodity
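
One of the simple statistical tests alluded to is regressing future inflation on the lagged indicator and checking the R-squared: an indicator with no standalone predictive power yields a value near zero. A toy sketch (the data are invented):

```python
def lagged_r2(indicator, inflation, lag=1):
    """R^2 from regressing inflation at time t+lag on the indicator at time t --
    a crude measure of the indicator's standalone predictive power."""
    x = indicator[:-lag]   # indicator values observed 'lag' periods earlier
    y = inflation[lag:]    # the inflation outcomes they are meant to predict
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return 0.0 if sxx == 0 or syy == 0 else (sxy * sxy) / (sxx * syy)
```

A perfectly informative indicator scores 1.0; pure noise relative to the inflation series scores near 0, which is the pattern the article reports for single indicators used in isolation.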

An indicator is presented for measuring the lowspeed velocity of an object in one direction where the object returns in the opposite direction at a high speed. The indicator comprises a drum having its axis of rotation transverse to the linear movement of the object and a tape wound upon the drum with its free end extending therefrom and adapted to be connected to the object. A constant torque is applied to the drum in a direction to wind the tape on the drum. The speed of the tape in the unwinding direction is indicated on a tachometer which is coupled through a shaft and clutch means to the drum only when the tape is unwinding.