Sample records for integrated modeling program

NASA Space Radiation Program Element scientists have been actively involved in developing an integrated risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit, with opportunities for hands-on demonstrations. Brief descriptions of each tool follow: ARRBOD, for organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR, for projection of cancer risk from exposure to space radiation; HemoDose, for retrospective dose estimation using multi-type blood cell counts; GERMcode, for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS, for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure, and DNA damage at the molecular scale; NASARTI, for modeling the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, the cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

An existing computer model for dynamic hygrothermal analysis of buildings has been extended with a multizone airflow model based on loop equations to account for the coupled thermal and airflow behavior in natural and hybrid ventilated buildings. In water distribution networks and related fields, loop... a methodology adopted from water distribution network analysis that automatically sets up the independent loops and is easy to implement in a computer program. Finally, an example of verification of the model is given, which demonstrates the ability of the model to accurately predict the airflow of a simple multizone...

Sensitivity analysis estimates the relative contribution of the uncertainty in input values to the uncertainty of model outputs. Partial Rank Correlation Coefficient (PRCC) and Standardized Rank Regression Coefficient (SRRC) are methods for conducting sensitivity analysis on nonlinear simulation models like the Integrated Medical Model (IMM). The PRCC method estimates sensitivity using partial correlation of the ranks of the generated input values with each generated output value. It is termed "partial" because adjustments are made for the linear effects of all the other input values when calculating the correlation between a particular input and each output. In SRRC, standardized regression-based coefficients measure the sensitivity of each input, adjusted for all the other inputs, on each output. Because the relative ranking of the inputs and outputs is used, rather than the values themselves, both methods accommodate the nonlinear relationships of the underlying model. As part of the IMM v4.0 validation study, simulations are available that predict 33 person-missions on ISS and 111 person-missions on STS. These simulated data predictions feed the sensitivity analysis procedures. The inputs to the sensitivity procedures include the number of occurrences of each of the one hundred IMM medical conditions generated over the simulations and the associated IMM outputs: total quality time lost (QTL), number of evacuations (EVAC), and number of loss of crew lives (LOCL). The IMM team will report the results of using PRCC and SRRC on IMM v4.0 predictions of the ISS and STS missions created as part of the external validation study. Tornado plots will assist in visualizing the condition-related input sensitivities for each of the main outcomes. The outcomes of this sensitivity analysis will drive review focus by identifying conditions where changes in uncertainty could drive changes in overall model output uncertainty. These efforts are an integral
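The PRCC mechanism can be sketched in a few lines; this is a minimal illustration with an invented two-input toy model (one influential input, one inert one), not the IMM team's actual code:

```python
import math
import random

def ranks(values):
    """Return 1-based ranks of a list (ties assumed absent)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def residuals(y, x):
    """Residuals of y after removing a least-squares linear fit on x."""
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return [yi - (a + b * xi) for xi, yi in zip(x, y)]

def pearson(u, v):
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = math.sqrt(sum((a - mu) ** 2 for a in u) *
                    sum((b - mv) ** 2 for b in v))
    return num / den

def prcc(x, y, other):
    """Partial rank correlation of x with y, adjusting for one other input."""
    rx, ry, ro = ranks(x), ranks(y), ranks(other)
    return pearson(residuals(rx, ro), residuals(ry, ro))

random.seed(42)
n = 200
x1 = [random.random() for _ in range(n)]              # influential input
x2 = [random.random() for _ in range(n)]              # inert input
y = [a ** 3 + 0.01 * random.gauss(0, 1) for a in x1]  # nonlinear but monotone

print(prcc(x1, y, x2))  # near 1: strong monotone influence survives ranking
print(prcc(x2, y, x1))  # near 0: no influence
```

Because the correlation is computed on ranks, the cubic (nonlinear but monotone) relationship still yields a coefficient near one, which is exactly why PRCC suits models like the IMM.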

The US Postal Service maintains the largest civilian fleet in the United States, totaling approximately 180,000 vehicles. To support the fleet's daily energy requirements, the Postal Service also operates one of the largest networks of underground storage tanks, nearly 7,500 nationwide. A program to apply risk assessment to planning, budget development, and other management actions was implemented during September 1989. Working closely with a consultant, the Postal Service developed regulatory and environmental risk criteria and weighting factors for a ranking model. The primary objective was to identify relative risks for each underground tank at individual facilities. Determining relative risks at each facility was central to prioritizing scheduled improvements to the tank network. The survey was conducted on 302 underground tanks in the Northeast Region of the US. An environmental and regulatory risk score was computed for each UST. By ranking the tanks according to their risk score, tanks were classified into management action categories including, but not limited to, underground tank testing, retrofit, repair, replacement, and closure.
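A weighted-criteria ranking model of the kind described can be sketched as follows; the criteria, weights, thresholds, and tank data here are invented for illustration, not the consultant-developed factors actually used:

```python
# Hypothetical weighted-criteria ranking model for underground storage tanks.
# Each criterion is scored 0-10; weights sum to 1.
WEIGHTS = {"age": 0.3, "leak_history": 0.4,
           "soil_permeability": 0.2, "groundwater_proximity": 0.1}

def risk_score(tank):
    """Weighted sum of the tank's criterion scores."""
    return sum(WEIGHTS[c] * tank[c] for c in WEIGHTS)

def action_category(score):
    """Map a risk score to a management action (thresholds hypothetical)."""
    if score >= 7.0:
        return "replacement/closure"
    if score >= 4.0:
        return "retrofit/repair"
    return "testing"

tanks = {
    "UST-001": {"age": 9, "leak_history": 8, "soil_permeability": 6,
                "groundwater_proximity": 7},
    "UST-002": {"age": 3, "leak_history": 0, "soil_permeability": 2,
                "groundwater_proximity": 1},
    "UST-003": {"age": 6, "leak_history": 4, "soil_permeability": 5,
                "groundwater_proximity": 4},
}

# rank tanks by descending risk, then assign an action to each
ranked = sorted(tanks, key=lambda t: risk_score(tanks[t]), reverse=True)
for t in ranked:
    print(t, round(risk_score(tanks[t]), 2),
          action_category(risk_score(tanks[t])))
```

The ranked list directly supports the prioritization described: the highest-scoring tanks surface first and map onto the most aggressive management actions.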

This study considers a distinct case of a college outreach program that integrates student affairs staff, academic administrators, and faculty across campus. The authors find that social networks and critical agency help explain the integration of these various professionals, and offer a critical agency network model of enacting change.…

for the required simulation allowed the MK6LE project to avoid the risk of having lower-level model components not integrating together. The initial...that programs that applied MBSE at the lower levels, in particular the MK54 Torpedo program, expressed regret at limiting the re-architecture to the

This article describes a strategy to integrate information literacy into the curriculum of a nursing program in a community college. The model is articulated in four phases: preparatory, planning, implementation, and evaluation. It describes a collaborative process encouraging librarians to work with nursing faculty, driving students to…

This report describes the theory and capabilities of RIP (Repository Integration Program). RIP is a powerful and flexible computational tool for carrying out probabilistic integrated total system performance assessments for geologic repositories. The primary purpose of RIP is to provide a management tool for guiding system design and site characterization. In addition, the performance assessment model (and the process of eliciting model input) can act as a mechanism for integrating the large amount of available information into a meaningful whole (in a sense, allowing one to keep the "big picture" and the ultimate aims of the project clearly in focus). Such an integration is useful both for project managers and project scientists. RIP is based on a "top down" approach to performance assessment that concentrates on the integration of the entire system, and utilizes relatively high-level descriptive models and parameters. The key point in the application of such a "top down" approach is that the simplified models and associated high-level parameters must incorporate an accurate representation of their uncertainty. RIP is designed in a very flexible manner such that details can be readily added to various components of the model without modifying the computer code. Uncertainty is also handled in a very flexible manner, and both parameter and model (process) uncertainty can be explicitly considered. Uncertainty is propagated through the integrated PA model using an enhanced Monte Carlo method. RIP must rely heavily on subjective assessment (expert opinion) for much of its input. The process of eliciting the high-level input parameters required for RIP is critical to its successful application. As a result, in order for any project to successfully apply a tool such as RIP, an enormous amount of communication and cooperation must exist between the data collectors, the process modelers, and the performance assessment modelers.

This effort demonstrates business process modeling to describe the integration of particular planning and programming activities of a state highway agency. The motivations to document planning and programming activities are that: (i) resources for co...


The VIS-AD data model integrates metadata about the precision of values, including missing data indicators and the way that arrays sample continuous functions, with the data objects of a scientific programming language. The data objects of this data model form a lattice, ordered by the precision with which they approximate mathematical objects. We define a similar lattice of displays and study visualization processes as functions from data lattices to display lattices. Such functions can be applied to visualize data objects of all data types and are thus polymorphic.

Catholic healthcare should establish comprehensive compliance strategies, beyond following Medicare reimbursement laws, that reflect mission and ethics. A covenant model of business ethics--rather than a self-interest emphasis on contracts--can help organizations develop a creed to focus on obligations and trust in their relationships. The corporate integrity program (CIP) of Mercy Health System Oklahoma promotes its mission and interests, educates and motivates its employees, provides assurance of systemwide commitment, and enforces CIP policies and procedures. Mercy's creed, based on its mission statement and core values, articulates responsibilities regarding patients and providers, business partners, society and the environment, and internal relationships. The CIP is carried out through an integrated network of committees, advocacy teams, and an expanded institutional review board. Two documents set standards for how Mercy conducts external affairs and clarify employee codes of conduct.

U.S. Department of Health & Human Services — State program integrity reviews play a critical role in how CMS provides effective support and assistance to states in their efforts to combat provider fraud and...

IMP is a simulation language that is used to model missions around the Earth, Moon, Mars, or other planets. It has been used to model missions for the Saturn Program, Apollo Program, Space Transportation System, Space Exploration Initiative, and Space Station Freedom. IMP allows a user to control the mission being simulated through a large event/maneuver menu. Up to three spacecraft may be used: a main, a target, and an observer. The simulation may begin at liftoff, suborbital, or orbital. IMP incorporates a Fehlberg seventh-order, thirteen-evaluation Runge-Kutta integrator with error and step-size control to numerically integrate the equations of motion. The user may choose oblate or spherical gravity for the central body (Earth, Mars, Moon, or other), while a spherical model is used for the gravity of an additional perturbing body. Sun gravity and pressure and Moon gravity effects are user-selectable. Earth/Mars atmospheric effects can be included. The optimum thrust guidance parameters are calculated automatically. Events/maneuvers may involve many velocity changes, and these velocity changes may be impulsive or of finite duration. Aerobraking to orbit is also an option. Other simulation options include line-of-sight communication guidelines, a choice of propulsion systems, a soft landing on the Earth or Mars, and rendezvous with a target vehicle. The input/output is in metric units, with the exception of thrust and weight, which are in English units. Input is read from the user's input file to minimize real-time keyboard input. Output includes vehicle state, orbital and guide parameters, event and total velocity changes, and propellant usage. The main output is to the user-defined print file, but during execution, part of the input/output is also displayed on the screen. An included FORTRAN program, TEKPLOT, will display plots on the VDT as well as generate a graphic file suitable for output on most laser printers. The code is double precision. IMP is written in
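IMP's Fehlberg integrator works by comparing two embedded Runge-Kutta estimates of different order: their difference estimates the local error, which drives step acceptance and step-size adjustment. The same control logic can be sketched with a much simpler embedded Euler/Heun 1(2) pair (the coefficients below are not IMP's seventh-order scheme; only the error/step-size mechanism is illustrated):

```python
import math

def integrate(f, t, y, t_end, tol=1e-7):
    """Adaptive integration of dy/dt = f(t, y) using an embedded
    Euler (order 1) / Heun (order 2) pair."""
    h = (t_end - t) / 10.0
    while t < t_end:
        h = min(h, t_end - t)              # don't overshoot the end time
        k1 = f(t, y)
        k2 = f(t + h, y + h * k1)
        y_low = y + h * k1                 # Euler estimate (order 1)
        y_high = y + h * (k1 + k2) / 2.0   # Heun estimate (order 2)
        err = abs(y_high - y_low)          # local error estimate
        if err <= tol:                     # accept step, keep higher order
            t += h
            y = y_high
        # grow or shrink h; factors clamped for stability
        h *= min(2.0, max(0.2, 0.9 * math.sqrt(tol / max(err, 1e-15))))
    return y

# exponential decay: exact solution y(1) = e^-1
print(integrate(lambda t, y: -y, 0.0, 1.0, 1.0))
```

A rejected step simply shrinks h and retries; an accepted step advances with the higher-order estimate, which is the same accept/adjust loop a Fehlberg 7(8) scheme performs with more stages.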

This report describes the MICA (Mentally Ill Chemically Abusing) Program at the Tewksbury Hospital campus in Tewksbury, Massachusetts. Several campus facilities collaborate in the MICA Program. Through Expert Case Conferences, principles of integrated psychosocial treatment with dual diagnosis patients are demonstrated. An expert clinician focuses on the interplay between psychological pain, characterological traits, defenses, and the patient's drug of choice. Patients who have participated in the program have reported positive experiences. The staff reported that the program has resulted in facility improvement in assessment and treatment of complex dual diagnosis patients.

demand, and the production capacity have been considered as mutative variables, then an improved model in which some parameters are not constant has been developed and a new method to solve the grey linear programming has been proposed. In the grey programming model, the value of credibility can...

Money, banking, and macroeconomic textbooks traditionally present the topics of money, the creation of demand deposits by depository institutions, and the Hicksian-Keynesian Theory of Income and Interest separately, as if they were unrelated. This paper presents an integrated approach to those subjects using computer programs written in BASIC, the…
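The deposit-creation process such programs illustrate can be sketched as a simple loop (in Python here rather than the article's BASIC): each round, banks hold back required reserves and lend out the remainder, which is redeposited elsewhere in the system.

```python
def total_deposits(initial_deposit, reserve_ratio, rounds):
    """Simulate multiple-deposit creation: each round the bank keeps
    required reserves and lends the rest, which is redeposited."""
    deposit = initial_deposit
    total = 0.0
    for _ in range(rounds):
        total += deposit
        deposit *= (1.0 - reserve_ratio)  # loanable (and redeposited) portion
    return total

# with a 10% reserve requirement, $1,000 of new reserves supports total
# deposits approaching $1,000 / 0.10 = $10,000 (the simple money multiplier)
print(total_deposits(1000.0, 0.10, 200))
```

The loop converges on the textbook geometric-series result, initial deposit divided by the reserve ratio, which is the bridge between the deposit-creation chapter and the algebraic multiplier.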

A new research program on steam generator tubing degradation is being sponsored by the U.S. Nuclear Regulatory Commission (NRC) at Argonne National Laboratory. This program is intended to support a performance-based steam generator tube integrity rule. Critical areas addressed by the program include evaluation of the processes used for the in-service inspection of steam generator tubes and recommendations for improving the reliability and accuracy of inspections; validation and improvement of correlations for evaluating integrity and leakage of degraded steam generator tubes; and validation and improvement of correlations and models for predicting degradation in steam generator tubes as aging occurs. The studies will focus on mill-annealed Alloy 600 tubing; however, tests will also be performed on replacement materials such as thermally treated Alloy 600 or 690. An overview of the technical work planned for the program is given.

The IDB Program provides direct support to the DOE Nuclear Waste Management and Fuel Cycle Programs and their lead sites and support contractors by providing and maintaining a current, integrated data base of spent fuel and radioactive waste inventories and projections. All major waste types (HLW, TRU, and LLW) and sources (government, commercial fuel cycle, and I/I) are included. A major data compilation was issued in September 1981: Spent Fuel and Radioactive Waste Inventories and Projections as of December 31, 1980, DOE/NE-0017. This report includes chapters on Spent Fuel, HLW, TRU Waste, LLW, Remedial Action Waste, Active Uranium Mill Tailings, and Airborne Waste, plus appendices with more detailed data in selected areas such as isotopics, radioactivity, thermal power, projections, and land usage. The LLW sections include volumes, radioactivity, thermal power, current inventories, projected inventories and characteristics, source terms, land requirements, and a breakdown in terms of government/commercial and defense/fuel cycle/I/I.

Full Text Available The Hospitality Study Program, Politeknik Negeri Bali (PNB), had not implemented integrated learning practice optimally. The aim of this research was to improve the learning process through an integrated practice learning model involving three courses (Food Production, FB Service, and English for Restaurant) on the same topic. This study was conducted in the fourth semester of the Hotel Study Program. Through random sampling, two classes were selected as research samples: class IVA as the experiment group and class IVB as the control, giving 26 students per class. The application of integrated practice learning had an effect on the achievement of student competency in the waiter/waitress occupation in the Hotel Study Program. The result of the statistical test showed a significant difference in competency achievement between the integrated practice learning group and the partial practice learning group. It is suggested that the management of the Hospitality Study Program encourage and facilitate lecturers, especially of core subjects, to apply integrated learning practices in order to achieve the competency.

Approaches to the maintenance of nuclear power plants have undergone significant change in the past several decades. The traditional breakdown approach has been displaced by preventive (calendar-based) maintenance and, more recently, by condition-based maintenance (CBM). This is largely driven by the fact that traditional maintenance programs, derived primarily from equipment vendor recommendations, are generally unsuccessful in controlling maintenance costs or equipment failures. Many advances in the maintenance field have taken place since the maintenance plans for Ontario Hydro's nuclear plants were initially established. Ontario Hydro nuclear plant operating costs can be substantially reduced and the Incapability Factor improved with the application of modern maintenance processes and tools. Pickering is designated as the lead station for IMP. Of immediate concern is the fact that Pickering Nuclear Division has been experiencing a significant backlog of Operating Preventive Maintenance Callups. This backlog, over 2000, is unacceptable to both station management and the nuclear regulator, the Atomic Energy Control Board. In addition, there are over 500 callups in various stages of revision (in hyperspace) without an adequate control or reporting system to manage their completion. There is also considerable confusion about the classification of licensing callups, e.g. callups which are mandatory as a result of legal requirements. Furthermore, the ineffectiveness of the Preventive Maintenance (PM) program has been the subject of peer audits and Atomic Energy Control Board (AECB) findings over the past several years. The current preventive maintenance ratio PM/(PM+CM) at Pickering ND is less than 20%, due to the current high load of equipment breakdown. This past summer, an Independent Integrated Performance Assessment (IIPA) review at Ontario Hydro confirmed these concerns. Over the past several years, Ontario Hydro nuclear staff have evaluated several programs to improve


Having worked in the Employees and Commercial Payments Branch of the Financial Management Division for the past 3 summers, I have seen the many changes that have occurred within the NASA organization. As I return each summer, I find that new programs and systems have been adapted to better serve the needs of the Center and of the Agency. NASA has transformed itself over the past couple of years with the implementation of the Integrated Financial Management Program (IFMP). IFMP is designed to allow the Agency to improve its management of its financial, physical, and human resources through the use of multiple enterprise module applications. With my mentor, Joseph Kan, being the branch chief of the Employees and Commercial Payments Branch, I have been exposed to several modules, such as Travel Manager, WebTads, and Core Financial/SAP, which were implemented in the last couple of years under the IFMP. The implementation of these agency-wide systems has sometimes proven to be troublesome. Prior to IFMP, each NASA Center utilized its own systems for payroll, travel, accounts payable, etc. But with the implementation of the Integrated Financial Management Program, all the "legacy" systems had to be eliminated. As a result, a great deal of enhancement and preparation work is necessary to ease the transformation from the old systems to the new. All this work occurs simultaneously; for example, e-Payroll will "go live" in several months, but a system like Travel Manager will need to have information upgraded within the system to meet the requirements set by Headquarters. My assignments this summer have given me the opportunity to become involved with such work. So far, I have been given the opportunity to participate in projects resulting from a congressional request, several bankcard reconciliations, updating routing lists for Travel Manager, updating the majordomo list for Travel Manager approvers and points of contact, and a NASA Headquarters project involving

Full Text Available The Program of Research to Integrate the Services for the Maintenance of Autonomy (PRISMA) began in Quebec in 1999. Evaluation results indicated that the PRISMA Project improved the system of care for the frail elderly at no additional cost. In 2001, the Quebec Ministry of Health and Social Services made implementing the six features of the PRISMA approach a province-wide goal in the programme now known as RSIPA (its French acronym). Extensive province-wide progress has been made since then, but ongoing challenges include reducing unmet need for case management and home care services, creating incentives for increased physician participation in care planning, and improving the computerized client chart, among others. PRISMA is the only evaluated international model of a coordination approach to integration and one of the few, if not the only, integration models to have been adopted at the system level by policy-makers.

U.S. Department of Health & Human Services — The State Program Integrity Assessment (SPIA) is the Centers for Medicare and Medicaid Services' (CMS) first national data collection on state Medicaid program...

The data reduction program used to analyze the performance of the Aerothermodynamic Integration Model is described. Routines to acquire, calibrate, and interpolate the test data, to calculate the axial components of the pressure area integrals and the skin function coefficients, and to report the raw data in engineering units are included along with routines to calculate flow conditions in the wind tunnel, inlet, combustor, and nozzle, and the overall engine performance. Various subroutines were modified and used to obtain species concentrations and transport properties in chemical equilibrium at each of the internal and external engine stations. It is recommended that future test plans include the configuration, calibration, and channel assignment data on a magnetic tape generated at the test site immediately before or after a test, and that the data reduction program be designed to operate in a batch environment.

The development of the computer code ATHLET-CD is a contribution to reactor safety research. ATHLET-CD is an extension of the system code ATHLET with core degradation models, especially from the modular software package KESS. The aim of the ATHLET-CD development is the simulation of severe accident sequences from their initiation to severe core degradation in a continuous manner. In the framework of this project, the ATHLET-CD development has focused on the integration of KESS models such as the control rod model, as well as the models describing chemical interactions, material relocation along a rod, and fission product release. The present ATHLET-CD version is able to describe severe accidents in a PWR up to early core degradation (relocation of material along a rod surface in the axial direction). Contributions to the verification of ATHLET-CD comprised calculations of the experiments PHEBUS AIC and PBF SFD 1-4. The PHEBUS AIC calculation focused on the examination of the control rod model, whereas the PBF SFD 1-4 calculation served to check the models describing melting, material relocation, and fission product release. (orig.)

The mission of the Structural Integrity Program is to ensure continued safe management and operation of the waste tanks for whatever period of time these tanks are required. Matthew Maryak provides an overview of the Structural Integrity Program to open Session 5 (Waste Storage and Tank Inspection) of the 2010 EM Waste Processing Technical Exchange.

electrification simulator; a national CLEW tool allows for the optimization of national-level integrated resource use, and Macro-CLEW presents the same while allowing for detailed economic-biophysical interactions. Finally, the open Model Management Infrastructure (MoManI) is presented, which allows for the rapid prototyping of new additions to, or entirely new, resource optimization tools. Collectively these tools provide insights into some fifteen of the SDGs and are made publicly available with support to governments and academic institutions.

This article introduces a curricular innovation, the Integrated Health Scholars Program (IHSP), developed to prepare master's-level social work students for practice in integrated health care settings, and presents preliminary findings related to students' self-reported program competencies and perceptions. IHSP, implemented in a…

Full Text Available Background: End-of-life care financing and delivery in the United States is fragmented and uncoordinated, with little integration of acute and long-term care services. Objective: To assess policy issues involving end-of-life care, especially the hospice benefit, and to analyse model programs of integrated care for people who are dying. Methods: The study conducted structured interviews with stakeholders and experts in end-of-life care and with administrators of model programs in the United States, which were nominated by the experts. Results: The two major public insurance programs—Medicare and Medicaid—finance the vast majority of end-of-life care. Both programs offer a hospice benefit, which has several shortcomings, including requiring physicians to make a prognosis of a six-month life expectancy and insisting that patients give up curative treatment—two steps which are difficult for doctors and patients to make—and payment levels that may be too low. In addition, quality of care initiatives for nursing homes and hospice sometimes conflict. Four innovative health systems have overcome these barriers to provide palliative services to beneficiaries in their last year of life. Three of these health systems are managed care plans which receive capitated payments. These providers integrate health, long-term, and palliative care using an interdisciplinary team approach to management of services. The fourth provider is a hospice that provides palliative services to beneficiaries of all ages, including those who have not elected hospice care. Conclusions: End-of-life care is deficient in the United States. Public payers could use their market power to improve care through a number of strategies.

from a catalog of courses is difficult because of the many factors being considered. To assist this process, the multi-objective model and the curriculum requirements were incorporated in a linear program to select the "optimum" curriculum. The application of this tool was also beneficial in identifying the active constraints that limit curriculum development and content.
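The selection problem described can be illustrated with a toy version: choose courses that maximize a value score subject to a credit-hour cap. The course names, credits, scores, and cap below are invented, and exhaustive search over subsets stands in for the linear-programming solver on this small instance:

```python
from itertools import combinations

# hypothetical course catalog: (name, credit_hours, value_score)
COURSES = [("A", 3, 9), ("B", 4, 10), ("C", 2, 5), ("D", 3, 7)]
CREDIT_CAP = 7

def best_curriculum(courses, cap):
    """Exhaustively search subsets for the maximum total value within the cap."""
    best, best_value = (), -1
    for r in range(len(courses) + 1):
        for subset in combinations(courses, r):
            credits = sum(c[1] for c in subset)
            value = sum(c[2] for c in subset)
            if credits <= cap and value > best_value:
                best, best_value = subset, value
    return {c[0] for c in best}, best_value

chosen, value = best_curriculum(COURSES, CREDIT_CAP)
print(chosen, value)
```

As in the abstract, inspecting which constraints are tight at the optimum (here, the credit cap is fully used) identifies the active constraints limiting curriculum content.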

National Aeronautics and Space Administration — The Exploration Medical Capability (ExMC) Element of NASA's Human Research Program (HRP) developed the Integrated Medical Model (IMM) to forecast the resources...

Full Text Available Technical analysis has been proved capable of exploiting short-term fluctuations in financial markets. Recent results indicate that the market timing approach beats many traditional buy-and-hold approaches in most short-term trading periods. Genetic programming (GP) was used to generate short-term trade rules on the stock markets during the last few decades. However, few of the related studies on the analysis of financial time series with genetic programming considered the non-stationary and noisy characteristics of the time series. In this paper, to de-noise the original financial time series and to search for profitable trading rules, an integrated method is proposed based on the Wavelet Threshold (WT) method and GP. Since relevant information that affects the movement of the time series is assumed to be fully digested during the market closed periods, to avoid the jumping points of the daily or monthly data, in this paper, intra-day high-frequency time series are used to fully exploit the short-term forecasting advantage of technical analysis. To validate the proposed integrated approach, an empirical study is conducted based on the China Securities Index (CSI) 300 futures in the emerging China Financial Futures Exchange (CFFEX) market. The analysis outcomes show that the wavelet de-noise approach outperforms many comparative models.
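The wavelet-threshold de-noising step can be illustrated with a single-level Haar transform and soft thresholding; the paper's actual wavelet basis, decomposition depth, and threshold rule are not specified here, so this is only a minimal sketch of the idea:

```python
import math

SQRT2 = math.sqrt(2.0)

def haar_forward(signal):
    """One-level Haar transform: pairwise sums (trend) and differences (detail)."""
    approx = [(a + b) / SQRT2 for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / SQRT2 for a, b in zip(signal[::2], signal[1::2])]
    return approx, detail

def haar_inverse(approx, detail):
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / SQRT2, (a - d) / SQRT2])
    return out

def soft_threshold(coeffs, t):
    """Shrink coefficients toward zero; small (noise-dominated) details vanish."""
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

def denoise(signal, threshold):
    approx, detail = haar_forward(signal)
    return haar_inverse(approx, soft_threshold(detail, threshold))

# a level-5 signal corrupted by small alternating noise
noisy = [5.1, 4.9, 5.1, 4.9, 5.1, 4.9, 5.1, 4.9]
print(denoise(noisy, 0.5))  # recovers a flat signal near 5.0
```

The trend coefficients pass through untouched while the small detail coefficients are shrunk to zero, which is the mechanism the paper combines with GP-evolved trading rules on the cleaned series.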

An existing computer model for dynamic hygrothermal analysis of buildings has been extended with a multizone airflow model based on loop equations to account for the coupled thermal and airflow in natural and hybrid ventilated buildings.

Objective The Doorway program is a 3-year pilot integrated housing and recovery support program aimed at people with a severe and persistent mental illness who are 'at risk' or actually homeless. Participants source and choose properties through the open rental market, with appropriate rental subsidy and brokerage support. This arrangement is highly innovative, differing from widely favoured arrangements internationally involving congregate and scattered-site housing owned or managed by the support program. The aim of the present study was to determine the effects of the Doorway program on participants' health, housing, service utilisation and costs. Methods A pre-post study design was used, with outcome measures consisting of a number of question inventories and their costs (where relevant). The principal inventories were the Behaviour and Symptom Identification Scale 32 (BASIS-32), a consumer-oriented, self-report measure of behavioural symptoms and distress; the Health of the Nation Outcome Scale (HoNOS), an interviewer-administered measurement tool designed to assess general health and social functioning of mentally ill people; and the Outcomes Star (Homelessness) system, which measures various aspects of the homelessness experience. Baseline measurements were performed routinely by staff at entry to the program and then at 6-monthly intervals across the evaluation period. Results For 55 of 59 participants, total mean BASIS-32 scores (including as well three of five subscale scores) improved significantly and with moderate effect size. Four of the 10 domain scores on the Outcome Star (Homelessness) inventory also improved significantly, with effect sizes ranging from small-medium (three domains) to large (one domain). Mean usage of bed-based mental health clinical services and general hospital admissions both significantly decreased (with overall net savings of A$3096 per participant per annum). Overall cost savings (including housing) to government ranged from A

The high abstraction level of equation-based object-oriented (EOO) languages such as Modelica has the drawback that programming and modeling errors are often hard to find. In this paper we present integrated static and dynamic debugging methods for Modelica models and a debugger prototype that addresses several of these problems. The goal is an integrated debugging framework that combines classical debugging techniques with special techniques for equation-based languages, partly based on graph visualization and interaction. To our knowledge, this is the first Modelica debugger that supports both equation-based transformational debugging and algorithmic code debugging in an integrated fashion.

Integrated logic circuits were described as a means of formally representing genetic-geologic models for estimating undiscovered uranium resources. The logic circuits are logical combinations of selected geologic characteristics judged to be associated with particular types of uranium deposits. Each combination takes on a value which corresponds to the combined presence, absence, or 'don't know' states of the selected characteristics within a specified geographic cell. Within each cell, the output of the logic circuit is taken as a measure of the favorability of occurrence of an undiscovered deposit of the type being considered. In this way, geological, geochemical, and geophysical data are incorporated explicitly into potential uranium resource estimates. The present report describes how integrated logic circuits are constructed by use of a computer graphics program. A user's guide is also included.
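The combination of presence, absence, and 'don't know' states described above behaves like Kleene three-valued logic, in which uncertainty propagates through the circuit. The sketch below evaluates a hypothetical favorability circuit per cell; the characteristic names and the circuit itself are invented for illustration:

```python
# Sketch of a three-valued "integrated logic circuit" for favorability scoring.
# States: True (present), False (absent), None (don't know) -- Kleene logic.

def k_and(a, b):
    """Kleene AND: a definite False dominates; otherwise None propagates."""
    if a is False or b is False:
        return False
    if a is None or b is None:
        return None
    return True

def k_or(a, b):
    """Kleene OR: a definite True dominates; otherwise None propagates."""
    if a is True or b is True:
        return True
    if a is None or b is None:
        return None
    return False

def favorability(cell):
    """Hypothetical circuit: (host_rock AND reductant) OR uranium_anomaly."""
    return k_or(k_and(cell["host_rock"], cell["reductant"]),
                cell["uranium_anomaly"])

cells = {
    "A1": {"host_rock": True,  "reductant": True,  "uranium_anomaly": False},
    "A2": {"host_rock": True,  "reductant": None,  "uranium_anomaly": False},
    "B1": {"host_rock": False, "reductant": False, "uranium_anomaly": False},
}
for name, cell in cells.items():
    print(name, favorability(cell))
```

Cell A2 illustrates the point of the three-valued treatment: a missing observation yields an indeterminate favorability rather than a spurious "unfavorable" verdict.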

The Integrated Medical Model (IMM) Project represents one aspect of NASA's Human Research Program (HRP) to quantitatively assess medical risks to astronauts for existing operational missions as well as missions associated with future exploration and commercial space flight ventures. The IMM takes a probabilistic approach to assessing the likelihood and specific outcomes of one hundred medical conditions within the envelope of accepted space flight standards of care over a selectable range of mission capabilities. A specially developed Integrated Medical Evidence Database (iMED) maintains evidence-based, organizational knowledge across a variety of data sources. Since becoming operational in 2011, the IMM (currently version 3.0), the supporting iMED, and the expertise of the IMM project team have contributed to a wide range of decision and informational processes for the space medical and human research community. This presentation provides an overview of the IMM conceptual architecture and range of application through examples of actual space flight community questions posed to the IMM project.
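A probabilistic risk model of this kind can be sketched as a Monte Carlo simulation over per-condition occurrence probabilities. The condition list and probabilities below are invented placeholders, not IMM data:

```python
# Illustrative Monte Carlo sketch of a probabilistic medical-risk model
# (toy condition list and probabilities, not actual IMM parameters).
import random

CONDITIONS = {            # condition: per-mission occurrence probability (assumed)
    "back_pain": 0.30,
    "skin_rash": 0.15,
    "kidney_stone": 0.02,
}

def simulate_mission(rng):
    """Return the set of conditions occurring on one simulated mission."""
    return {c for c, p in CONDITIONS.items() if rng.random() < p}

def risk_estimates(n_trials=10_000, seed=42):
    """Estimate per-condition occurrence probability over many mission trials."""
    rng = random.Random(seed)
    counts = {c: 0 for c in CONDITIONS}
    for _ in range(n_trials):
        for c in simulate_mission(rng):
            counts[c] += 1
    return {c: counts[c] / n_trials for c in CONDITIONS}

est = risk_estimates()
for c, p in est.items():
    print(f"{c}: estimated P = {p:.3f}")
```

A real model of this family would additionally sample outcome severity and resource use per condition; the seeded generator keeps runs reproducible.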

This document serves as a detailed companion to the PowerPoint slides presented as part of the ASC L2 milestone review for Integrated Codes milestone #4782, titled “Assess Newly Emerging Programming and Memory Models for Advanced Architectures on Integrated Codes”, due on 9/30/2014 and presented for formal program review on 9/12/2014. The program review committee is represented by Mike Zika (A Program Project Lead for Kull), Brian Pudliner (B Program Project Lead for Ares), Scott Futral (DEG Group Lead in LC), and Mike Glass (Sierra Project Lead at Sandia). This document, along with the presentation materials and a letter of completion signed by the review committee, will act as proof of completion for this milestone.

While there has been extensive research in defining project organizational structures for traditional projects, little research exists to support organizational structure definition for high-technology government projects. High-technology government projects differ from traditional projects in that they are non-profit, span government-industry organizations, typically require significant integration effort, and are strongly susceptible to a volatile external environment. Systems integration implementation has been identified as a major contributor to both project success and failure. The literature research bridges program management organizational planning, systems integration, organizational theory, and independent project reports in order to assess Systems Integration (SI) organizational structure selection for improving a high-technology government project's probability of success. This paper will describe the methodology used to 1) identify and assess SI organizational structures and their success rates, and 2) identify key factors to be used in the selection of these SI organizational structures during the acquisition strategy process.

The promotion of energy economy and efficiency is recognized as the single most cost-effective and least controversial component of any strategy of matching energy demand and supply with resource and environmental constraints. Historically such efficiency gains are not out of reach for the industrialized market-economy countries, but are unlikely to be reached under present conditions by developing countries and economies in transition. The aim of the work was to analyze the main characteristics of the integrated energy conservation programs of the United Kingdom, France, Japan, Canada, Australia and Denmark.

Model integration extends the scope of model management to include the dimension of manipulation as well. This invariably leads to comparisons with database theory. Model integration is viewed from four perspectives: organizational, definitional, procedural, and implementational. Strategic modeling is discussed as the organizational motivation for model integration. Schema and process integration are examined as the logical and manipulation counterparts of model integr...

The aim of the work entitled "Integral Quality Programs for Radiodiagnostics Services" is to present the experience accumulated over the past 10 years by the Radiodiagnostics Service of C.M.E. Ramon y Cajal in Zaragoza. The term "integral quality" is defined conceptually in order to differentiate it from classical quality control, which refers exclusively to the control of radiology equipment. The problem is reviewed from a historical point of view, and a basic, standardized model, validated through the work of these 10 years, is proposed, mainly to serve as the backbone of the working system in a Radiodiagnostics Service. (Author) 46 refs

Corporations are at present operating in demanding and highly uncertain times, facing a mixture of increased macroeconomic pressure, competitive and capital-market risks, and, in many cases, the prospect of significant technological and regulatory change. In such demanding and highly uncertain times, corporations must pay particular attention to corporate strategy. Today, corporate strategy must be perceived and used as a function of various fields and actors, and as a highly interactive system. For a corporation's strategy to become a competitive advantage, it is necessary to understand it and to integrate it into a holistic model that ensures sustainable progress of the corporation's activities under optimum conditions of profitability. The model proposed in this paper aims at integrating two strategic models, Hoshin Kanri and the Integrated Strategy Model, and at consolidating them with the principles of sound corporate governance set out by the OECD.

This report describes the work of developing an integrated model used to predict the thermal history, deformation, roll forces, microstructural evolution and mechanical properties of steel strip in a hot-strip mill. This achievement results from a joint research effort that is part of the American Iron and Steel Institute's (AISI) Advanced Process Control Program, a collaboration between the U.S. DOE and fifteen North American steelmakers.

The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge for future research.

Highlights: • MILP model developed for integration of waste heat recovery technologies in process sites. • Five thermodynamic cycles considered for exploitation of industrial waste heat. • Temperature and quantity of multiple waste heat sources considered. • Interactions with the site utility system considered. • Industrial case study presented to illustrate application of the proposed methodology. - Abstract: Thermodynamic cycles such as organic Rankine cycles, absorption chillers, absorption heat pumps, absorption heat transformers, and mechanical heat pumps are able to utilize wasted thermal energy in process sites for the generation of electrical power, chilling and heat at a higher temperature. In this work, a novel systematic framework is presented for optimal integration of these technologies in process sites. The framework is also used to assess the best design approach for integrating waste heat recovery technologies in process sites, i.e. stand-alone integration or systems-oriented integration. The developed framework allows for: (1) selection of one or more waste heat sources (taking into account the temperatures and thermal energy content), (2) selection of one or more technology options and working fluids, (3) selection of end-uses of recovered energy, (4) exploitation of interactions with the existing site utility system and (5) exploration of the potential for heat recovery via heat exchange. The methodology is applied to an industrial case study. Results indicate a systems-oriented design approach reduces waste heat by 24%, fuel consumption by 54% and CO2 emissions by 53% with a 2-year payback, while the stand-alone design approach reduces waste heat by 12%, fuel consumption by 29% and CO2 emissions by 20.5% with a 4-year payback. Therefore, benefits from waste heat utilization increase when interactions between the existing site utility system and the waste heat recovery technologies are explored simultaneously. The case study also shows
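As a toy stand-in for the MILP (which is not reproduced here), the core selection decision can be illustrated by exhaustively assigning each waste-heat source to at most one recovery technology, subject to a minimum source-temperature constraint; all source, technology and value figures below are assumed:

```python
# Toy stand-in for the MILP: exhaustively assign each waste-heat source to at
# most one recovery technology, maximizing recovered value. Numbers assumed.
from itertools import product

sources = {"S1": {"T": 150, "Q": 10.0},   # source temperature (degC), heat (MW)
           "S2": {"T": 90,  "Q": 6.0}}
techs = {   # minimum usable source temperature and value per MW recovered
    "ORC":         {"T_min": 120, "value": 0.18},
    "abs_chiller": {"T_min": 80,  "value": 0.10},
}

def best_assignment():
    """Enumerate every source-to-technology assignment; keep the best."""
    options = [None] + list(techs)          # each source: unused, or one tech
    best, best_val = None, -1.0
    for combo in product(options, repeat=len(sources)):
        val = 0.0
        for (sname, s), t in zip(sources.items(), combo):
            # A pairing only contributes if the temperature constraint holds.
            if t is not None and s["T"] >= techs[t]["T_min"]:
                val += s["Q"] * techs[t]["value"]
        if val > best_val:
            best, best_val = dict(zip(sources, combo)), val
    return best, best_val

assignment, value = best_assignment()
print(assignment, round(value, 2))
```

A real MILP formulation replaces the enumeration with binary assignment variables and adds the utility-system interactions described in the abstract; the exhaustive search is only tractable here because the toy instance is tiny.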

The Mixed Waste Integrated Program Logic Diagram was developed to provide technical alternatives for mixed waste projects for the Office of Technology Development's Mixed Waste Integrated Program (MWIP). Technical solutions in the areas of characterization, treatment, and disposal were matched to a select number of US Department of Energy (DOE) treatability groups represented by waste streams found in the Mixed Waste Inventory Report (MWIR).

This paper presents a supplier selection and order allocation (SSOA) model to solve the problem of multiperiod supplier selection and subsequent order allocation in environments with short product life cycles and frequent material purchasing, for example, the fast-fashion segment of the apparel industry. At the first stage, in consideration of multiple decision criteria and the fuzziness of the data involved in deciding the preferences of multiple decision variables in supplier selection, the fuzzy extent analytic hierarchy process (FEAHP) is adopted. In the second stage, supplier ranks are input into an order allocation model that aims at minimizing the risk of material purchasing and minimizing the total material purchasing costs using a dynamic programming approach, subject to constraints on deterministic customer demand and deterministic supplier capacity. Numerical examples are presented, and computational results are reported.
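The second-stage allocation can be illustrated with a much simpler deterministic sketch: fill demand from the cheapest supplier first, respecting capacities. The supplier data are invented, and the paper's actual model additionally minimizes purchasing risk and runs over multiple periods:

```python
# Single-period, cost-only sketch of the order-allocation stage. For a linear
# cost with hard capacities, cheapest-first (greedy) allocation is optimal.

def allocate(demand, suppliers):
    """suppliers: list of (name, unit_cost, capacity) tuples.

    Returns {supplier_name: quantity} covering `demand` at minimum cost.
    """
    plan, remaining = {}, demand
    for name, cost, cap in sorted(suppliers, key=lambda s: s[1]):
        qty = min(cap, remaining)       # take as much as possible from cheapest
        if qty > 0:
            plan[name] = qty
            remaining -= qty
    if remaining > 0:
        raise ValueError("demand exceeds total supplier capacity")
    return plan

suppliers = [("S1", 4.0, 600), ("S2", 3.5, 400), ("S3", 5.0, 800)]
print(allocate(1000, suppliers))
```

Once risk terms or period-linking constraints are added, greedy optimality no longer holds, which is why the paper resorts to dynamic programming.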

This paper discusses applications and implementation approaches used for integrated modeling of structural systems with optics over the past 30 years. While much of the development work focused on control system design, significant contributions were made in system modeling and computer-aided design (CAD) environments. Early work appended handmade line-of-sight models to traditional finite element models, such as the optical spacecraft concept from the ACOSS program. The IDEAS2 computational environment built in support of Space Station collected a wider variety of existing tools around a parametric database. Later, IMOS supported interferometer and large telescope mission studies at JPL with MATLAB modeling of structural dynamics, thermal analysis, and geometric optics. IMOS's predecessor was a simple FORTRAN command line interpreter for LQG controller design with additional functions that built state-space finite element models. Specialized language systems such as CAESY were formulated and prototyped to provide more complex object-oriented functions suited to control-structure interaction. A more recent example of optical modeling directly in mechanical CAD is used to illustrate possible future directions. While the value of directly posing the optical metric in system dynamics terms is well understood today, the potential payoff is illustrated briefly via project-based examples. It is quite likely that integrated structure thermal optical performance (STOP) modeling could be accomplished in a commercial off-the-shelf (COTS) tool set. The work flow could be adopted, for example, by a team developing a small high-performance optical or radio frequency (RF) instrument.

The coupled nature of the various processes in the near field requires that integrated models be employed to assess long-term performance of the waste package and repository. The nature of the integrated near-field models being compiled under the SCEPTER program is discussed. The interfaces between these near-field models and far-field models are described. Finally, near-field data requirements are outlined in sufficient detail to indicate overall programmatic guidance for data-gathering activities.

This report compiles information and conclusions gathered as part of the “Modeling EERE Deployment Programs” project. The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge where future research is needed.

This is the PDF of a PowerPoint presentation from a teleconference on Los Alamos programming models. It starts by listing their assumptions for the programming models and then details a hierarchical programming model at the system level and node level. Then it details how to map this to their internal nomenclature. Finally, a list is given of what they are currently doing in this regard.

Integrated assessment models of climate change (IAMs) are widely used to provide insights into the dynamics of the coupled human and socio-economic system, including emission mitigation analysis and the generation of future emission scenarios. Similar to the climate modeling community, the integrated assessment community has a two-decade history of model inter-comparison, which has served as one of the primary venues for model evaluation and confirmation. While analysis of historical trends in the socio-economic system has long played a key role in diagnostics of future scenarios from IAMs, formal hindcast experiments are just now being contemplated as evaluation exercises. Some initial thoughts on setting up such IAM evaluation experiments are discussed. Socio-economic systems do not follow strict physical laws, which means that evaluation needs to take place in a context, unlike that of physical system models, in which there are few fixed, unchanging relationships. Of course strict validation of even earth system models is not possible (Oreskes et al 2004), a fact borne out by the inability of models to constrain the climate sensitivity. Energy-system models have also been grappling with some of the same questions over the last quarter century. For example, one of "the many questions in the energy field that are waiting for answers in the next 20 years" identified by Hans Landsberg in 1985 was "Will the price of oil resume its upward movement?" Of course we are still asking this question today. While, arguably, even fewer constraints apply to socio-economic systems, numerous historical trends and patterns have been identified, although often only in broad terms, that are used to guide the development of model components, parameter ranges, and scenario assumptions. IAM evaluation exercises are expected to provide useful information for interpreting model results and improving model behavior. A key step is the recognition of model boundaries, that is, what is inside

This thesis is centered around three topics sharing integrability as a common theme, and explores different methods in the field of integrable models. The first two chapters concern integrable lattice models in statistical physics; the last chapter describes an integrable quantum chain.

This report describes the results of the Spanish participation in the project Coupling CORINAIR data to cost-effect emission reduction strategies based on critical threshold (EU/LIFE97/ENV/FIN/336). The subproject focused on three tasks: developing tools to improve knowledge of the spatial and temporal details of emissions of air pollutants in Spain; exploiting existing experimental information on plant response to air pollutants in temperate ecosystems; and integrating these findings in a modelling framework that can assess with more accuracy the impact of air pollutants on temperate ecosystems. The results obtained during the execution of this project have significantly improved the models of the impact of alternative emission control strategies on ecosystems and crops in the Iberian Peninsula. (Author) 375 refs.

The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The primary function of MAR-D, one of these programs, is to create a data repository for completed PRAs and Individual Plant Examinations (IPEs) by providing input, conversion, and output capabilities for data used by the IRRAS, SARA, SETS, and FRANTIC software. As probabilistic risk assessments and individual plant examinations are submitted to the NRC for review, MAR-D can be used to convert the models and results from the study for use with IRRAS and SARA. These data can then be easily accessed by future studies and will be in a form that enhances the analysis process. This reference manual provides an overview of the functions available within MAR-D and step-by-step operating instructions.

The addition of robot task planning in off-line programming systems aims at improving the capability of current state-of-the-art commercially available off-line programming systems by integrating modeling, task planning, programming and simulation together under one platform. This article proposes a system architecture for integrated robot task planning. It identifies and describes the components considered necessary for implementation. The focus is on the functionality of these elements as well as on the information flow. A pilot implementation of such an integrated system architecture for a robot assembly task is discussed.

This increase in complexity has provided an impetus for the investigation into integrated asset- and liability-management frameworks that could realistically address dynamic portfolio allocation in a risk-controlled way. In this paper the authors propose a multi-stage dynamic stochastic-programming model for the integrated ...

The advances by the Integral Fast Reactor Program at Argonne National Laboratory are the subject of this paper. The Integral Fast Reactor (IFR) is an advanced liquid-metal-cooled reactor concept being developed at Argonne National Laboratory. The advances stressed in the paper include fuel irradiation performance, improved passive safety, and the development of a prototype fuel cycle facility. 14 refs

This study examined technology in post-graduate teacher training programs in the Netherlands. A questionnaire was completed by 111 teacher educators from 12 Dutch universities with a post-graduate teacher training program. The general view of the use of technology in Dutch post-graduate teacher education was quite conventional. Basic technology…

The purpose of this report is to compile information and conclusions gathered as part of three separate tasks undertaken as part of the overall project, “Modeling EERE Deployment Programs,” sponsored by the Planning, Analysis, and Evaluation office within the Department of Energy’s Office of Energy Efficiency and Renewable Energy (EERE). The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address improvements to modeling in the near term, and note gaps in knowledge where future research is needed.

The objective of this study was to determine the effect of reproductive performance on dairy cattle herd value. Herd value was defined as the herd's average retention payoff (RPO). Individual cow RPO is the expected profit from keeping the cow compared with immediate replacement. First, a daily dynamic programming model was developed to calculate the RPO of all cow states in a herd. Second, a daily Markov chain model was applied to estimate the herd demographics. Finally, the herd value was calculated by aggregating the RPO of all cows in the herd. Cow states were described by 5 milk yield classes (76, 88, 100, 112, and 124% with respect to the average), 9 lactations, 750 d in milk, and 282 d in pregnancy. Five different reproductive programs were studied (RP1 to RP5). Reproductive program 1 used 100% timed artificial insemination (TAI; 42% conception rate for first TAI and 30% for second and later services) and the other programs combined TAI with estrus detection. The proportion of cows receiving artificial insemination after estrus detection ranged from 30 to 80%, and conception rate ranged from 25 to 35%. These 5 reproductive programs were categorized according to their 21-d pregnancy rate (21-d PR), which is an indication of the rate that eligible cows become pregnant every 21 d. The 21-d PR was 17% for RP1, 14% for RP2, 16% for RP3, 18% for RP4, and 20% for RP5. Results showed a positive relationship between 21-d PR and herd value. The most extreme herd value difference between 2 reproductive programs was $77/cow per yr for average milk yield (RP5 - RP2), $13/cow per yr for lowest milk yield (RP5 - RP1), and $160/cow per yr for highest milk yield (RP5 - RP2). Reproductive programs were ranked based on their calculated herd value. With the exception of the best reproductive program (RP5), all other programs showed some level of ranking change according to milk yield. The most dramatic ranking change was observed in RP1, which moved from being the worst ranked
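The retention-payoff idea can be sketched, far more coarsely than the daily model described above, as a small finite-horizon dynamic program. Here the state is reduced to months in milk, the replacement animal is assumed to earn a flat average profit, and all economic parameters are invented:

```python
# Toy dynamic program for retention payoff (RPO). State = months in milk (mim);
# all numbers are illustrative, not the paper's daily 5-class model.
from functools import lru_cache

HORIZON = 24                               # planning horizon in months
HEIFER_COST, SALVAGE, AVG_PROFIT = 1300.0, 700.0, 100.0

def cow_profit(mim):
    """Assumed declining monthly profit over the lactation."""
    return 160.0 - 5.0 * mim

@lru_cache(maxsize=None)
def value(mim, t):
    """Max expected profit from month t onward for a cow at `mim` months in milk."""
    if t == HORIZON:
        return 0.0
    keep = cow_profit(mim) + value(mim + 1, t + 1)
    # Replacement: pay heifer cost, receive salvage, then (simplification)
    # the replacement earns the flat average profit for the remaining months.
    replace = SALVAGE - HEIFER_COST + AVG_PROFIT * (HORIZON - t)
    return max(keep, replace)

def rpo(mim, t=0):
    """Retention payoff: value of keeping now minus value of replacing now."""
    keep_now = cow_profit(mim) + value(mim + 1, t + 1)
    replace_now = SALVAGE - HEIFER_COST + AVG_PROFIT * (HORIZON - t)
    return keep_now - replace_now

print(round(rpo(0), 1), round(rpo(10), 1))
```

As in the paper, a positive RPO favors retention; the RPO shrinks as lactation progresses because the cow's expected profit stream declines while the replacement option stays fixed.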

This report presents a plan for research on the question of containment performance in postulated severe accident scenarios. It focuses on the research being performed by the Structural and Seismic Engineering Branch, Division of Engineering, Office of Nuclear Regulatory Research. Summaries of the plans for this work have previously been published in the "Nuclear Power Plant Severe Accident Research Plan" (NUREG-0900). This report provides an update to reflect current status. The plan provides a summary of results to date as well as an outline of planned activities and milestones to the contemplated completion of the program in FY 1989.

This booklet contains summary sheets that describe FY 1993 characterization, monitoring, and sensor technology (CMST) development projects. Currently, 32 projects are funded: 22 through the OTD Characterization, Monitoring, and Sensor Technology Integrated Program (CMST-IP), 8 through the OTD Program Research and Development Announcement (PRDA) activity managed by the Morgantown Energy Technology Center (METC), and 2 through Interagency Agreements (IAGs). This booklet does not include those CMST projects funded through Integrated Demonstrations (IDs) and other Integrated Programs (IPs). The projects are in six areas: Expedited Site Characterization; Contaminants in Soils and Groundwater; Geophysical and Hydrogeological Measurements; Mixed Wastes in Drums, Burial Grounds, and USTs; Remediation, D&D, and Waste Process Monitoring; and Performance Specifications and Program Support. A task description, technology needs, accomplishments, and technology transfer information are given for each project.

There is a growing interest in business modeling and architecture in the areas of management and information systems. One of the issues in the area is the lack of integration between the modeling techniques that are employed to support business development and those used for technology modeling. This paper proposes a modeling approach that is capable of integrating the modeling of the business and of the technology. By depicting the business model, the organization structure and the technolog...

Must a master's of social work (MSW) program's orientation be either advanced generalist or some form of specialist? Or is there the possibility of a hybrid curriculum that provides enough breadth to prepare MSW graduates for a wide range of social work jobs, but that also addresses students' and community agencies' demands for student…

.... This MURI program took an integrated approach towards modeling, design and control of crystal growth processes and in conjunction with growth and characterization experiments developed much better...

Program and project evaluation models can be extremely useful in project planning and management. The aim is to pose the right questions as early as possible in order to foresee and deal with unwanted program effects in time, as well as to encourage the positive elements of the project's impact. In short, different evaluation models are used in order to minimize losses and maximize the benefits of interventions upon small or large social groups. This article introduces some of the most recently used evaluation models.

In the last decade, integrated modelling has become a very popular topic in environmental modelling, since it helps solve problems that are difficult to address with a single model. However, managing the complexity of integrated models and minimizing the time required for their setup remain challenging tasks. The integrated modelling environment Delta Shell simplifies these tasks. The software components of Delta Shell are easy to reuse separately from each other as well as part of an integrated environment that can run in a command-line or a graphical user interface mode. Most components of Delta Shell are developed using the C# programming language and include libraries used to define, save and visualize various scientific data structures as well as coupled model configurations. Here we present two examples showing how Delta Shell simplifies the process of setting up integrated models from the end-user and developer perspectives. The first example shows the coupling of a rainfall-runoff model, a river flow model and a run-time control model. The second example shows how a coastal morphological database integrates with the coastal morphological model (XBeach) and a custom nourishment designer. Delta Shell is also available as open-source software released under the LGPL license and accessible via http://oss.deltares.nl.
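The kind of component coupling described, one model's output feeding another's input each time step, can be sketched in a few lines. The sketch is in Python rather than C#, and the class names, interface and numbers are illustrative, not the actual Delta Shell API:

```python
# Minimal sketch of coupled model execution: components expose an update()
# call and exchange data every time step, in the spirit of (but not using)
# Delta Shell's component interfaces. All parameters are assumed.

class RainfallRunoff:
    """Converts rainfall forcing into runoff with a fixed runoff coefficient."""
    def __init__(self, runoff_coeff=0.4):
        self.c = runoff_coeff
    def update(self, rainfall_mm):
        return self.c * rainfall_mm          # runoff delivered to the river

class RiverFlow:
    """Routes lateral inflow through a crude linear-reservoir river reach."""
    def __init__(self, base_flow=5.0):
        self.q = base_flow
    def update(self, lateral_inflow):
        self.q = 0.8 * self.q + lateral_inflow
        return self.q

rain = [0.0, 10.0, 20.0, 5.0, 0.0]           # forcing time series (mm per step)
rr, river = RainfallRunoff(), RiverFlow()
hydrograph = [river.update(rr.update(p)) for p in rain]
print([round(q, 2) for q in hydrograph])
```

An integration framework's contribution is precisely to standardize the update/exchange contract shown here, so that a third component (e.g. run-time control) can be slotted into the same loop without modifying the other two.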

A six-segment management model is presented, each segment of which represents a major area in a new product development program. The first segment of the model covers integration of specialist engineers into 'systems requirement definition' or the system engineering documentation process. The second covers preparation of five basic types of 'development program plans.' The third segment covers integration of system requirements, scheduling, and funding of specialist engineering activities into 'work breakdown structures,' 'cost accounts,' and 'work packages.' The fourth covers 'requirement communication' by line organizations. The fifth covers 'performance measurement' based on work package data. The sixth covers 'baseline requirements achievement tracking.'

In examining integrated rural development programs the question that arises is why is it possible to identify several relatively successful small-scale or pilot rural development projects yet so difficult to find examples of successful rural development programs. 3 bodies of literature offer some insight into the morphology of rural development projects, programs, and processes: the urban-industrial impact hypothesis; the theory of induced technical change; and the new models of institutional change that deal with institution building and the economics of bureaucratic behavior. The urban-industrial impact hypothesis helps in the clarification of the relationships between the development of rural areas and the development of the total society of which rural areas are a part. It is useful in understanding the spatial dimensions of rural development where rural development efforts are likely to be most successful. Formulation of the hypothesis generated a series of empirical studies designed to test its validity. The effect of these studies has been the development of a rural development model in which the rural community is linked to the urban-industrial economy through a series of market relationships. Both the urban economy's rate of growth and the efficiency of the intersector product and factor markets place significant constraints on the possibilities of rural area development. It is not possible to isolate development processes in the contemporary rural community in a developing society from development processes in the larger society. The induced technical change theory provides a guide as to what must be done to gain access to efficient sources of economic growth, the new resources and incomes that are necessary to sustain rural development. Design of a successful rural development strategy involves a combination of technical and institutional change. The ability of rural areas to respond to the opportunities for economic growth generated by local urban

The Rabi model is a paradigm for interacting quantum systems. It couples a bosonic mode to the smallest possible quantum model, a two-level system. I present the analytical solution which allows us to consider the question of integrability for quantum systems that do not possess a classical limit. A criterion for quantum integrability is proposed which shows that the Rabi model is integrable due to the presence of a discrete symmetry. Moreover, I introduce a generalization with no symmetries; the generalized Rabi model is the first example of a nonintegrable but exactly solvable system.

Java Pathfinder (JPF) is a verification and testing environment for Java that integrates model checking, program analysis, and testing. JPF consists of a custom-made Java Virtual Machine (JVM) that interprets bytecode, combined with a search interface to allow the complete behavior of a Java program to be analyzed, including interleavings of concurrent programs. JPF is implemented in Java, and its architecture is highly modular to support rapid prototyping of new features. JPF is an explicit-state model checker, because it enumerates all visited states and, therefore, suffers from the state-explosion problem inherent in analyzing large programs. It is suited to analyzing programs of less than 10kLOC, but has been successfully applied to finding errors in concurrent programs up to 100kLOC. When an error is found, a trace from the initial state to the error is produced to guide the debugging. JPF works at the bytecode level, meaning that all of Java can be model-checked. By default, the software checks for all runtime errors (uncaught exceptions), assertion violations (supports Java's assert), and deadlocks. JPF uses garbage collection and symmetry reductions of the heap during model checking to reduce state explosion, as well as dynamic partial-order reductions to lower the number of interleavings analyzed. JPF is capable of symbolic execution of Java programs, including symbolic execution of complex data such as linked lists and trees. JPF is extensible, as it allows for the creation of listeners that can subscribe to events during searches. The creation of dedicated code to be executed in place of regular classes is supported and allows users to easily handle native calls and to improve the efficiency of the analysis.
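The explicit-state search that JPF performs can be illustrated with a miniature model checker over a hand-built transition system: breadth-first search through reachable states, with a visited set to contain state explosion and a trace returned as the counterexample. The example system encodes the classic lost-update race between two threads; everything here is a toy sketch, not JPF's API:

```python
# Miniature explicit-state model checker: BFS over a state space, returning a
# counterexample trace to the first error state found (or None if none exists).
from collections import deque

def model_check(initial, successors, is_error):
    """BFS over states; returns the shortest trace to an error state, or None."""
    frontier = deque([(initial, [initial])])
    visited = {initial}                   # state storage avoids re-exploration
    while frontier:
        state, trace = frontier.popleft()
        if is_error(state):
            return trace                  # counterexample for debugging
        for nxt in successors(state):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append((nxt, trace + [nxt]))
    return None

# Two threads incrementing a shared counter non-atomically (read, then write).
# State = (counter, t1_read, t2_read, pc1, pc2); pc 0=read, 1=write, 2=done.
def successors(s):
    counter, r1, r2, pc1, pc2 = s
    out = []
    if pc1 == 0: out.append((counter, counter, r2, 1, pc2))   # t1 reads
    if pc1 == 1: out.append((r1 + 1, r1, r2, 2, pc2))         # t1 writes
    if pc2 == 0: out.append((counter, r1, counter, pc1, 1))   # t2 reads
    if pc2 == 1: out.append((r2 + 1, r1, r2, pc1, 2))         # t2 writes
    return out

# Error: both threads done, yet an increment was lost (counter < 2).
trace = model_check((0, 0, 0, 0, 0), successors,
                    lambda s: s[3] == 2 and s[4] == 2 and s[0] < 2)
print("lost update found, trace length:", len(trace))
```

JPF does the same thing at vastly larger scale, with program states derived from real JVM bytecode execution and heap symmetry/partial-order reductions standing in for this sketch's simple visited set.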

The program GIGMF computes the differential and integrated statistical-model cross sections for reactions proceeding through a compound-nuclear stage. The computational method is based on the Hauser-Feshbach-Wolfenstein theory, modified to include the modern version of Tepel et al. Although the program was written for a PDP-15 computer with 16K of high-speed memory, many reaction channels can be taken into account, with the following restrictions: the projectile spin must be less than 2, and the maximum spin momenta of the compound nucleus cannot be greater than 10. These restrictions are due solely to the storage allotments and may be easily relaxed. The energy of the impinging particle, the target and projectile masses, the spins and parities of the projectile, target, emergent and residual nuclei, the maximum orbital momentum, and the transmission coefficients for each reaction channel are the input parameters of the program. (author)
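The core Hauser-Feshbach factorization the abstract refers to can be sketched as follows. The channel names and transmission coefficients below are made-up illustration values, and width-fluctuation corrections of the Tepel et al. type are omitted.

```python
# Illustrative sketch (not the GIGMF code): the compound-nucleus cross section
# into exit channel b is the formation cross section times a branching ratio
# built from transmission coefficients,
#     sigma(a -> b) = sigma_form(a) * T_b / sum_c T_c.

def hauser_feshbach(sigma_formation, transmission, exit_channel):
    """Branching-ratio form of the Hauser-Feshbach cross section (no
    width-fluctuation correction)."""
    total = sum(transmission.values())
    return sigma_formation * transmission[exit_channel] / total

T = {"n": 0.8, "p": 0.3, "alpha": 0.1}     # hypothetical transmission coefficients
sigma_np = hauser_feshbach(100.0, T, "p")  # for a 100 mb formation cross section
```

By construction, summing over all exit channels recovers the full formation cross section, which is a quick sanity check on any implementation of this factorization.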

In the context of the European population aging trend, and while the birth rate is still at a low level, immigrants may contribute to supporting the EU economy and to financing the national social protection systems. But this is possible only if they are fully integrated in the host countries, integration policies being a task of the national governments. The European Union may still offer support and stimulation through financing, policy coordination and the facilitation of good-practice exchange. The new measures should encourage local-level actions, including cooperation between local authorities, employers, migrants' organizations, service providers and the local population. Within the EU live 20.1 million immigrants (approximately 4% of the entire population) coming from outside the European area. An important element of the common EU policy on immigration is the development of a policy on immigrants' integration, which should provide fair treatment within the member states and guarantee rights and obligations comparable with those of Union citizens.

This study examines trends and features of the integration of student research into an educational program during international cooperation between Østfold University College in Norway and Southern Federal University in Russia. Following a research-and-education approach, the international project aims to use four education models, which link student…

... and labour allocation of quota based integrated fisheries. We demonstrate the workability of our model with a numerical example and sensitivity analysis based on data obtained from one of the major fisheries in New Zealand. Keywords: mixed integer linear program, fishing, trawler scheduling, processing, quotas ORiON: ...
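The kind of quota-constrained allocation the snippet describes can be sketched as a toy integer program. All numbers here are hypothetical, and the optimum is found by brute-force enumeration; a realistic model of this kind would be handed to a MILP solver.

```python
from itertools import product

# Toy quota-based fishing plan: choose integer numbers of inshore and offshore
# trips to maximize profit without exceeding either species quota.
profit = {"inshore": 3.0, "offshore": 5.0}        # profit per trip (hypothetical)
catch = {"inshore": (2, 1), "offshore": (1, 3)}   # (species A, species B) caught per trip
quota = (10, 12)                                  # quotas for species A and B

best = None
for n_in, n_off in product(range(11), repeat=2):  # enumerate trip counts
    use_a = catch["inshore"][0] * n_in + catch["offshore"][0] * n_off
    use_b = catch["inshore"][1] * n_in + catch["offshore"][1] * n_off
    if use_a <= quota[0] and use_b <= quota[1]:   # feasibility: quotas respected
        value = profit["inshore"] * n_in + profit["offshore"] * n_off
        if best is None or value > best[0]:
            best = (value, n_in, n_off)
```

For these numbers the optimum is 3 inshore and 3 offshore trips; in a real formulation, scheduling and processing constraints would add many more variables and the integrality would be handled by branch-and-bound.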

The Integrated Data Base (IDB) Program provides official Department of Energy (DOE) data on spent fuel and radioactive waste inventories, projections, and characteristics. The accomplishments of FY 1983 are summarized for three broad areas: (1) upgrading and issuing of the annual report on spent fuel and radioactive waste inventories, projections, and characteristics, including ORIGEN2 applications and a quality assurance plan; (2) creation of a summary data file in user-friendly format for use on a personal computer and enhancement of user access to program data; and (3) optimization and documentation of the data-handling methodology used by the IDB Program and provision of direct support to other DOE programs and sites in data handling. Plans for future work in these three areas are outlined. 23 references, 11 figures

The mathematical background for a multiport-network-solving program is described. A method for accurately numerically modeling an arbitrary, continuous, multiport transmission line is discussed. A modification to the transmission-line equations to accommodate multiple rf drives is presented. An improved model for the radio-frequency quadrupole (RFQ) accelerator that corrects previous errors is given. This model permits treating the RFQ as a true eight-port network for simplicity in interpreting the field distribution and ensures that all modes propagate at the same velocity in the high-frequency limit. The flexibility of the multiport model is illustrated by simple modifications to otherwise two-dimensional systems that permit modeling them as linear chains of multiport networks
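The cascading idea behind such multiport-network solvers can be sketched with ABCD (chain) matrices: a continuous transmission line is approximated as a cascade of short two-port sections, and ports or drives are attached at the junctions. The impedance, load, and section count below are hypothetical illustration values, not from the program described.

```python
import math

def line_section(beta_l, z0):
    """ABCD matrix of a short lossless line of electrical length beta_l."""
    c, s = math.cos(beta_l), math.sin(beta_l)
    return [[c, 1j * z0 * s],
            [1j * s / z0, c]]

def cascade(m1, m2):
    """Chain two two-ports: the matrix product of their ABCD matrices."""
    return [[m1[0][0] * m2[0][0] + m1[0][1] * m2[1][0],
             m1[0][0] * m2[0][1] + m1[0][1] * m2[1][1]],
            [m1[1][0] * m2[0][0] + m1[1][1] * m2[1][0],
             m1[1][0] * m2[0][1] + m1[1][1] * m2[1][1]]]

def input_impedance(abcd, z_load):
    """Zin = (A*ZL + B) / (C*ZL + D) for a terminated two-port."""
    (a, b), (c, d) = abcd
    return (a * z_load + b) / (c * z_load + d)

# A quarter-wave line built from 100 short sections transforms a 100-ohm load
# to Z0**2 / Zload = 25 ohms for Z0 = 50 ohms.
z0, z_load, n = 50.0, 100.0, 100
total = line_section((math.pi / 2) / n, z0)
for _ in range(n - 1):
    total = cascade(total, line_section((math.pi / 2) / n, z0))
zin = input_impedance(total, z_load)
```

Because lossless line sections compose exactly, the cascade of 100 short sections reproduces the closed-form quarter-wave transformer result to floating-point accuracy, which is a convenient check when discretizing a continuous line.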

By modifying some of the local L operators of the algebraic form of the Bethe Ansatz, inhomogeneous one-dimensional quantum lattice models can be constructed. This fact has recently attracted new attention, the inhomogeneities being interpreted as local impurities. The Hamiltonians of the models so constructed have a nearest-neighbour structure except in the vicinity of the local impurities, which involve three-site interactions. The pertinent feature of these models is the absence of backscattering at the impurities: the impurities are transparent. (Copyright (1998) World Scientific Publishing Co. Pte. Ltd)

An effort has been made to demonstrate that dam safety is an integral part of asset management which, when properly done, ensures that all objectives relating to safety and compliance, profitability, stakeholders' expectations and customer satisfaction, are achieved. The means to achieving this integration of the dam safety program and the level of effort required for each core function have been identified using the risk management approach to pinpoint vulnerabilities, and subsequently to focus priorities. The process is considered appropriate for any combination of numbers, sizes and uses of dams, and is designed to prevent exposure to unacceptable risks. 5 refs., 1 tab

We couple non-linear σ-models to Liouville gravity, showing that integrability properties of symmetric space models still hold for the matter sector. Using similar arguments for the fermionic counterpart, namely Gross-Neveu-type models, we verify that such conclusions must also hold for them, as recently suggested. (author). 18 refs

Provides examples of best practices in technology integration from five Technology Innovation Challenge Grant (TICG) programs, funded through the Department of Education to meet the No Child Left Behind technology goals. Highlights include professional development activities in Louisiana and New Mexico; collaborative learning applications; and…

A QUEST model and associated detailed IGRIP models were developed and used to simulate several workcells in a proposed Plutonium Storage Facility (PSF). The models are being used by team members assigned to the program to improve communication and to assist in evaluating concepts and in performing trade-off studies which will result in recommendations and a final design. The model was designed so that it could be changed easily. The techniques used to provide this flexibility are described in this paper, in addition to techniques for integrating the QUEST and IGRIP products. Many of these techniques are generic in nature and can be applied to any modeling endeavor

maintains a long-term level of the stress hormone cortisol, which is also anti-inflammatory. A new integrated model of the interaction between these two subsystems of the inflammatory system is proposed and coined the integrated inflammatory stress (ITIS) model. The coupling mechanisms describing...... A constant activation results in elevated levels of the variables in the model, while a prolonged change of the oscillations in ACTH and cortisol concentrations is the most pronounced result of different LPS doses predicted by the model....

The setting up of the European energy market has triggered a radical change in the context within which the energy players operated. The natural markets of the incumbent operators, formerly demarcated by national and even regional borders, have extended to at least the scale of the European Union. In addition to their geographical development strategy, gas undertakings are diversifying their portfolios towards both upstream and downstream activities of the gas chain, and/or extending their offers to other energies and services. Energy players' strategies are rather complex and sometimes give the impression of being based on contradictory decisions. Some operators widen their field of operations, whereas others specialize in a limited number of activities. This Round Table provides an opportunity to compare the business models adopted by the major gas undertakings in response to the structural changes observed in various countries over recent years

Background: The research work on entrepreneurship, enterprise policy and management, which started in 1992, continued successfully in the following years. Between 1992 and 2011, more than 400 academics and other researchers participated in the research work (the MER research program), whose main orientation has been the creation of their own model of integral management. Results: In past years, academics (researchers and authors of published papers) from Austria, Belgium, Bosnia and Herzegovina, Bulgaria, Byelorussia, Canada, the Czech Republic, Croatia, Estonia, France, Germany, Hungary, Italy, Poland, Romania, Russia, the Slovak Republic, Slovenia, Switzerland, Ukraine, and the US, coming from more than fifty institutions, have cooperated in MER programs. Thus, the scientific doctrines of different universities influenced the development of the MER model, which is based on both horizontal and vertical integration of enterprises' governance and management processes, instruments and institutions into a consistently operating unit. Conclusions: The presented MER model is based on the multi-layer integration of governance and management with an enterprise and its environment, considering the fundamental desires for the enterprise's existence and, thus, its quantitative as well as qualitative changes. The process, instrumental, and institutional integrity of governance and management is also the initial condition for the implementation of all other integration factors.

Research and development of advanced reprocessing plant designs can greatly benefit from the development of a reprocessing plant model capable of transient solvent extraction chemistry. This type of model can be used to optimize the operations of a plant as well as the designs for safeguards, security, and safety. Previous work has integrated a transient solvent extraction simulation module, based on the Solvent Extraction Process Having Interaction Solutes (SEPHIS) code developed at Oak Ridge National Laboratory, with the Separations and Safeguards Performance Model (SSPM) developed at Sandia National Laboratory, as a first step toward creating a more versatile design and evaluation tool. The goal of this work was to strengthen the integration by linking more variables between the two codes. The results from this integrated model show the expected operational performance through plant transients. Additionally, ORIGEN source-term files were integrated into the SSPM to provide concentrations, radioactivity, neutron emission rate, and thermal power data for various spent fuels. These data were used to generate measurement blocks that can determine the radioactivity, neutron emission rate, or thermal power of any stream or vessel in the plant model. This work examined how the code could be expanded to integrate other separation steps and how to benchmark the results against other data. Recommendations for future work will be presented.

Until fairly recently, the content and the presentation of a dynamic model were treated synonymously. For example, if one were to take a data-flow network, which captures the dynamics of a target system in terms of the flow of data through nodal operators, then one would often standardize on rectangles and arrows for the model display. The increasing web emphasis on XML, however, suggests that the network model can have its content specified in an XML language, and the model can then be represented in a number of ways depending on the chosen style. We have developed a formal method, based on styles, that permits a model to be specified in XML and presented in 1D (text), 2D, and 3D. This method allows customization and personalization to exert their benefits beyond e-commerce, in the area of model structures used in computer simulation. This customization leads naturally to solving the bigger problem of model integration: the act of taking models of a scene and integrating them with that scene so that there is only one unified modeling interface. This work focuses mostly on customization, but we address the integration issue in the future work section.
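The content/presentation split described above can be sketched concretely: the data-flow network is specified in a small XML vocabulary (hypothetical, not the authors' actual language), and a "style" function renders the same content as 1D text; other style functions could emit 2D diagrams or 3D scenes from the identical tree.

```python
import xml.etree.ElementTree as ET

# Model content: a tiny data-flow network in an illustrative XML vocabulary.
MODEL_XML = """
<network>
  <node id="src" op="source"/>
  <node id="f"   op="filter"/>
  <node id="out" op="sink"/>
  <edge from="src" to="f"/>
  <edge from="f" to="out"/>
</network>
"""

def render_text(xml_text):
    """A 1D (text) presentation style: one line per data-flow edge."""
    root = ET.fromstring(xml_text)
    ops = {n.get("id"): n.get("op") for n in root.findall("node")}
    lines = [f"{e.get('from')} ({ops[e.get('from')]}) --> {e.get('to')} ({ops[e.get('to')]})"
             for e in root.findall("edge")]
    return "\n".join(lines)

text = render_text(MODEL_XML)
```

The point of the design is that `MODEL_XML` never changes when the presentation does; only the style function is swapped.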

The thesis addresses the phenomenon of integrated care. The implementation of integrated care for patients with a stroke or dementia is studied. Because a generic quality management model for integrated care is lacking, the study works towards building a development model for integrated…

A programming model and architecture developed for the design and implementation of complex, heterogeneous measurement and control systems is described. The Multigraph Architecture integrates artificial intelligence techniques with conventional software technologies, offers a unified framework for distributed and shared-memory-based parallel computational models, and supports multiple programming paradigms. The system can be implemented on different hardware architectures and can be adapted to strongly different applications.

The US Department of Energy (DOE) established the Office of Technology Development (EM-50) as an element of the Office of Environmental Management (EM) in November 1989. In an effort to focus resources and address priority needs, EM-50 introduced the concept of integrated programs (IPs) and integrated demonstrations (IDs). The In Situ Remediation Integrated Program (ISR IP) focuses research and development on the in-place treatment of contaminated environmental media, such as soil and groundwater, and on the containment of contaminants to prevent them from spreading through the environment. Using in situ remediation technologies to clean up DOE sites minimizes adverse health effects on workers and the public by reducing contact exposure. The technologies also reduce cleanup costs by orders of magnitude. This report summarizes project work conducted in FY 1994 under the ISR IP in three major areas: treatment (bioremediation), treatment (physical/chemical), and containment technologies. Buried waste, contaminated soils and groundwater, and containerized waste are all candidates for in situ remediation. Contaminants include radioactive waste, volatile and nonvolatile organics, heavy metals, nitrates, and explosive materials.

The Efficient Separations and Processing Integrated Program (ESPIP) was created in 1991 to identify, develop, and perfect separations technologies and processes to treat wastes and address environmental problems throughout the US Department of Energy (DOE) complex. The ESPIP funds several multiyear tasks that address high-priority waste remediation problems involving high-level, low-level, transuranic, hazardous, and mixed (radioactive and hazardous) wastes. The ESPIP supports applied R&D leading to demonstration or use of these separations technologies by other organizations within DOE's Office of Environmental Restoration and Waste Management. Examples of current ESPIP-funded separations technologies are described here

Many in the nuclear power plant business believe that the catastrophic failure mode for reactor containment structures is unrealistic. One of the goals of the EPRI containment integrity program is to demonstrate that this is true. The objective of the program is to provide the utility industry with an experimental data base and a test-validated analytical method for realistically evaluating the actual over-pressure capability of concrete containment buildings and for predicting leakage behavior if higher pressures were to occur. The ultimate goal of this research effort is to characterize the containment leakage mode and rate as a function of internal pressure and time so that the risk of hypothetical degraded-core accidents can be realistically assessed. Progress in the first and second phases of the three-phase analytical and testing effort is discussed

STEFINS (STEel Freezing INtegral Simulation) is a computer program for calculating the rate of solidification of molten steel on solid steel. Such computations arise when investigating core-melt accidents in fast reactors. In principle this problem involves a coupled two-dimensional thermal and hydraulic approach; however, by physically reasonable assumptions a decoupled approach has been developed. The transient solidification of molten steel on a cold wall is solved in the direction normal to the molten-steel flow, independently of the solution for the molten-steel temperature and Nusselt number along the direction of flow. The solutions to the applicable energy equations have been programmed in cylindrical and slab geometries. Internal gamma heating of the steel is included
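The decoupled, one-dimensional treatment of crust growth can be illustrated with the classical Stefan solution for solidification on a cold wall (a sketch, not the STEFINS formulation; material numbers are hypothetical and internal heating is neglected): the crust thickness grows as s(t) = 2·lam·sqrt(alpha·t), with lam fixed by a transcendental condition.

```python
import math

def stefan_lambda(stefan_number):
    """Solve lam * exp(lam**2) * erf(lam) = Ste / sqrt(pi) by bisection.
    The left-hand side is monotonically increasing in lam >= 0."""
    target = stefan_number / math.sqrt(math.pi)
    lo, hi = 1e-9, 5.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mid * math.exp(mid**2) * math.erf(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def crust_thickness(t, alpha, stefan_number):
    """One-phase Stefan solution: frozen-layer thickness at time t [s],
    with thermal diffusivity alpha [m^2/s]."""
    return 2.0 * stefan_lambda(stefan_number) * math.sqrt(alpha * t)

# Hypothetical values: Ste = c * (T_melt - T_wall) / L = 0.1, alpha = 4e-6 m^2/s.
s1 = crust_thickness(1.0, alpha=4e-6, stefan_number=0.1)
s4 = crust_thickness(4.0, alpha=4e-6, stefan_number=0.1)
```

The sqrt(t) growth law (doubling the thickness when the time quadruples) is the signature of this diffusion-controlled regime; a code like STEFINS refines this picture with flow-dependent heat transfer along the channel.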

The In Situ Remediation Integrated Program (ISRP) supports and manages a balanced portfolio of applied research and development activities in support of DOE environmental restoration and waste management needs. ISRP technologies are being developed in four areas: containment, chemical and physical treatment, in situ bioremediation, and in situ manipulation (including electrokinetics). The focus of containment is to provide mechanisms to stop contaminant migration through the subsurface. In situ bioremediation and chemical and physical treatment both aim to destroy or eliminate contaminants in groundwater and soils. In situ manipulation (ISM) provides mechanisms to access contaminants or introduce treatment agents into the soil, and includes other technologies necessary to support the implementation of ISR methods. Descriptions of each major program area are provided to set the technical context of the ISM subprogram. Typical ISM needs for the major areas of in situ remediation research and development are identified

Systems Biology has motivated dynamic models of important intracellular processes at the pathway level, for example, in signal transduction and cell cycle control. To answer important biomedical questions, however, one has to go beyond the study of isolated pathways towards the joint study of interacting signaling pathways or the joint study of signal transduction and cell cycle control. Here, the reuse of established models is preferable, as it generally reduces the modeling effort and increases the acceptance of the combined model in the field. Obtaining a combined model can nevertheless be challenging, especially if the submodels are large and/or come from different working groups (as is generally the case when models stored in established repositories are used). To support this task, we describe a semi-automatic workflow based on established software tools. In particular, two frequent challenges are addressed: identification of the overlap and subsequent (re)parameterization of the integrated model. The reparameterization step is crucial if the goal is to obtain a model that can reproduce the data explained by the individual models. For demonstration purposes we apply our workflow to integrate two signaling pathways (EGF and NGF) from the BioModels Database.
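The two steps named in the abstract, overlap identification and reparameterization, can be sketched in miniature. The species and parameter names below are hypothetical stand-ins, not entries from the BioModels Database.

```python
# Toy submodels: each declares its species and its fitted parameter values.
egf = {"species": {"Raf", "MEK", "ERK"}, "params": {"k_raf": 0.3, "k_mek": 1.1}}
ngf = {"species": {"Ras", "Raf", "MEK"}, "params": {"k_raf": 0.5, "k_ras": 0.8}}

def merge_models(m1, m2):
    """Step 1: identify the overlap (shared species). Step 2: merge, flagging
    parameters that both submodels estimated differently; these must be
    re-estimated jointly against the combined data."""
    overlap = m1["species"] & m2["species"]
    merged = {"species": m1["species"] | m2["species"], "params": {}}
    refit = set()
    for name in m1["params"].keys() | m2["params"].keys():
        merged["params"][name] = m1["params"].get(name, m2["params"].get(name))
        if name in m1["params"] and name in m2["params"] \
                and m1["params"][name] != m2["params"][name]:
            refit.add(name)   # conflicting estimates -> reparameterize
    return merged, overlap, refit

merged, overlap, refit = merge_models(egf, ngf)
```

In a real workflow the overlap is matched by annotations rather than bare names, and the flagged parameters are refit by optimizing the integrated model against the union of the original datasets.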

The article considers an integrated approach to teaching programming with the use of computer modeling and 3D-graphics technologies, which can improve the quality of education. It is shown that this method allows students to systematize knowledge, raises motivation through the inclusion of relevant technologies, develops project-activity skills, strengthens interdisciplinary connections, and promotes the professional and personal self-determination of secondary-school students.

This work was prepared for the Swedish Power Inspectorate (SKI). From the Atomic Energy Research Establishment (AERE) at Harwell, U.K., the SKI has acquired the computer model NAMMU for groundwater hydrology calculations. The code was first implemented on an AMDAHL 470, an IBM-compatible computer, and then modified in order to integrate it with HYPAC, a program package for pre- and post-processing finite-element data developed by KEMAKTA AB. This report describes the modifications made to both NAMMU and HYPAC, and the verification of the coupled program system NAMMU-HYPAC. (author)

The aim of this paper is to determine the weakest points of Serbia's competitiveness as a tourist destination in comparison with its main competitors. The paper is organized as follows. A short introduction to previous research on destination competitiveness is followed by a description of the Integrated model of destination competitiveness (Dwyer et al., 2003), which was used as the main reference framework. Section three is devoted to previous studies on the competitiveness of Serbian tourism, while section four outlines the statistical methodology employed in this study and presents and interprets the empirical results. The results showed that Serbia is more competitive in its natural, cultural and created resources than in destination management, while, according to the Integrated model, Serbia is less competitive in the demand conditions that refer to the image and awareness of the destination itself.

The definition of exclusion statistics, as given by Haldane, allows for a statistical interaction between distinguishable particles (multi-species statistics). The thermodynamic quantities for such statistics can be evaluated exactly. Explicit expressions for the cluster coefficients are presented. Furthermore, single-species exclusion statistics is realized in one-dimensional integrable models. The interesting questions of generalizing this correspondence to the higher-dimensional and the multi-species cases remain essentially open

The definition of exclusion statistics that was given by Haldane admits a 'statistical interaction' between distinguishable particles (multispecies statistics). For such statistics, thermodynamic quantities can be evaluated exactly; explicit expressions are presented here for cluster coefficients. Furthermore, single-species exclusion statistics is realized in one-dimensional integrable models of the Calogero-Sutherland type. The interesting questions of generalizing this correspondence to the higher-dimensional and the multispecies cases remain essentially open; however, our results provide some hints as to searches for the models in question
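For reference, Haldane's defining relation and the generalized state counting it leads to can be written, in standard notation (assumed here), as

```latex
\Delta d_i \;=\; -\sum_{j} g_{ij}\,\Delta N_j ,
\qquad
W \;=\; \prod_{i} \binom{d_i + N_i - 1}{N_i},
\qquad
d_i \;=\; d_i^{0} \;-\; \sum_{j} g_{ij}\left(N_j - \delta_{ij}\right),
```

where $g_{ij}$ is the statistical-interaction matrix: $g_{ij}=0$ recovers bosons, $g_{ij}=\delta_{ij}$ recovers fermions, and off-diagonal entries encode the multispecies interaction discussed above.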

, repair works and strengthening methods for structures. A very significant part of the infrastructure consists of reinforced concrete structures. Even though reinforced concrete structures typically are very competitive, certain concrete structures suffer from various types of degradation. A framework...... should define a framework in which materials research results eventually should fit in and on the other side the materials research should define needs and capabilities in structural modelling. Integrated materials-structural models of a general nature are almost non-existent in the field of cement based...

The capability to estimate the performance and cost of emission control systems is critical to a variety of planning and analysis requirements faced by utilities, regulators, researchers and analysts in the public and private sectors. The computer model described in this paper has been developed for DOE to provide an up-to-date capability for analyzing a variety of pre-combustion, combustion, and post-combustion options in an integrated framework. A unique capability allows performance and costs to be modeled probabilistically, which permits explicit characterization of uncertainties and risks.
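The probabilistic-modeling idea can be sketched with a small Monte Carlo example: instead of single point estimates, uncertain inputs are drawn from distributions and propagated to a cost distribution, from which risk measures such as a 90th-percentile cost can be read off. The cost relation and all numbers below are hypothetical, not the model's actual equations.

```python
import random
import statistics

random.seed(0)  # reproducible illustration

def annual_cost_sample():
    """One Monte Carlo draw of annual reagent cost for a toy SO2 scrubber."""
    efficiency = random.uniform(0.90, 0.99)       # removal fraction achieved
    reagent_price = random.gauss(100.0, 15.0)     # $/tonne of reagent
    tonnes_removed = 1000.0 * efficiency          # from 1000 t/yr of SO2 inflow
    reagent_use = 1.05 * tonnes_removed           # toy stoichiometric ratio
    return reagent_use * reagent_price

samples = [annual_cost_sample() for _ in range(10_000)]
mean_cost = statistics.mean(samples)
p90 = sorted(samples)[int(0.9 * len(samples))]    # 90th-percentile cost
```

Reporting the spread (here via the 90th percentile) rather than only the mean is precisely what allows uncertainties and risks to be characterized explicitly.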

The French health and home-care services system is characterized by its fragmentation, whereas people's need for intervention is generally global. This fragmentation has consequences: delays in service delivery, inadequate transmission of information, redundant evaluations, services conditioned by the entry point solicited rather than by the need of the person, and inappropriate use of expensive resources through ignorance of, or difficulty of access to, less expensive resources. The purpose of integration is to improve the continuity of interventions for people experiencing a loss of autonomy. It consists in setting up a set of common organisational, managerial and clinical tools. The organisational model "Projet et Recherches sur l'Intégration des Services pour le Maintien de l'Autonomie" (Prisma), tested in Quebec, showed a strong impact on the prevention of loss of autonomy in public-health terms at the population level. This model rests on six principal elements: partnership, a single entry point, case management, a multidimensional standardized evaluation tool, an individualized services plan, and a system for information transmission. Thus, it was decided to try to implement this organisational model in France. The project, entitled Prisma France, is presented here. The analysis of the context of implementation of the innovation that integration represents in the field of health and services for the frail elderly reveals obstacles (in particular the diversity of the professionals concerned and a presentiment of the complexity of implementing the model) and favourable conditions (in particular the great tension towards change in this field). Current conditions in France appear mainly favourable to the implementation of integration. The establishment of the Prisma model in France requires partnership work to define a common language, on diagnoses as well as on solutions. The strategic and operational dialogue is thus a key element of the…

The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D), computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM

We associate cotangent models to a neighbourhood of a Liouville torus in symplectic and Poisson manifolds, focusing on b-Poisson/b-symplectic manifolds. The semilocal equivalence with such models uses the corresponding action-angle theorems in these settings: the theorem of Liouville-Mineur-Arnold for symplectic manifolds and an action-angle theorem for regular Liouville tori in Poisson manifolds (Laurent-Gengoux et al., Int. Math. Res. Notices IMRN 8: 1839-1869, 2011). Our models comprise regular Liouville tori of Poisson manifolds but also consider the Liouville tori on the singular locus of a b-Poisson manifold. For this latter class of Poisson structures we define a twisted cotangent model. The equivalence with this twisted cotangent model is given by an action-angle theorem recently proved by the authors and Scott (Math. Pures Appl. (9) 105(1):66-85, 2016). This viewpoint of cotangent models provides a new machinery to construct examples of integrable systems, which are especially valuable in the b-symplectic case where not many sources of examples are known. At the end of the paper we introduce non-degenerate singularities as lifted cotangent models on b-symplectic manifolds and discuss some generalizations of these models to general Poisson manifolds.

This case study covers the process of successfully integrating photovoltaic (PV) systems into a low-income housing development in northeast Denver, Colorado, focusing specifically on a new financing model and job training. The Northeast Denver Housing Center (NDHC), working in cooperation with Del Norte Neighborhood Development Corporation, Groundwork Denver, and the National Renewable Energy Laboratory (NREL), was able to finance the PV system installations by blending private equity funding with utility rebates, federal tax credits, and public sector funding. A grant provided by the Governor's Energy Office allowed for the creation of the new financing model. In addition, the program incorporated an innovative low-income job training program and an energy conservation incentive program.

This paper discusses the systems integration modeling system (SIMS), an analysis tool for the detailed evaluation of the structure and related performance of the Federal Waste Management System (FWMS) and its interface with waste generators. Its use for evaluations in support of system-level decisions on FWMS configurations; the allocation, sizing, balancing and integration of functions among elements; and the establishment of system-preferred waste selection and sequencing methods and other operating strategies is presented. SIMS includes major analysis submodels that quantify the detailed characteristics of individual waste items, loaded casks and waste packages, simulate the detailed logistics of handling and processing discrete waste items and packages, and perform detailed cost evaluations

Integrating multidisciplinary models requires linking models that may operate at different temporal and spatial scales; that were developed using different methodologies, tools, and techniques; that have different levels of complexity; that are calibrated for different ranges of inputs and outputs; etc. On the other hand......, Enterprise Application Integration, and Integration Design Patterns. We developed an architecture of a multidisciplinary model integration framework that brings these three aspects of integration together. A service-oriented, platform-independent architecture enables establishing loosely coupled...

The In Situ Remediation Integrated Program (ISR IP) was instituted out of recognition that in situ remediation could fulfill three important criteria: significant cost reduction of cleanup by eliminating or minimizing excavation, transportation, and disposal of wastes; reduced health impacts on workers and the public by minimizing exposure to wastes during excavation and processing; and remediation of inaccessible sites, including deep subsurfaces and areas in, under, and around buildings. Buried waste, contaminated soils and groundwater, and containerized wastes are all candidates for in situ remediation. Contaminants include radioactive wastes, volatile and non-volatile organics, heavy metals, nitrates, and explosive materials. The ISR IP intends to facilitate development of in situ remediation technologies for hazardous, radioactive, and mixed wastes in soils, groundwater, and storage tanks. The near-term focus is on containment of the wastes, with treatment receiving greater effort in future years. The ISR IP is an applied research and development program broadly addressing known DOE environmental restoration needs. Analysis of a sample of 334 representative sites by the Office of Environmental Restoration has shown how many sites are amenable to in situ remediation: containment, 243 sites; manipulation, 244 sites; bioremediation, 154 sites; and physical/chemical methods, 236 sites. This needs assessment is focused on near-term restoration problems (FY93--FY99). Many other remediations will be required in the next century. The major focus of the ISR IP is on the long-term development of permanent solutions to these problems. Current needs for interim actions to protect human health and the environment are also being addressed.

In the late 1980s and early 1990s, Lawrence Livermore National Laboratory was deeply engrossed in determining the next-generation programming model for the Integrated Design Codes (IDC) beyond vectorization for the Cray 1s series of computers. The vector model, developed in the mid-1970s first for the CDC 7600 and later extended from stack-based vector operation to memory-to-memory operations for the Cray 1s, lasted approximately 20 years (see Slide 5). The Cray vector era was deemed an extremely long-lived era, as it allowed vector codes to be developed over time (the Cray 1s were faster in scalar mode than the CDC 7600), with vector unit utilization increasing incrementally. The other attributes of the Cray vector era at LLNL were that we developed, supported, and maintained the operating system (LTSS and later NLTSS), communications protocols (LINCS), compilers (Civic Fortran77 and Model), operating system tools (e.g., batch system, job control scripting, loaders, debuggers, editors, graphics utilities, you name it), and math and highly machine-optimized libraries (e.g., SLATEC and STACKLIB). Although LTSS was adopted by Cray for early system generations, they later developed the COS and UNICOS operating systems and environments on their own. In the late 1970s and early 1980s, two trends appeared that made the Cray vector programming model (described above, including both the hardware and system software aspects) seem potentially dated and slated for major revision. These trends were the appearance of low-cost CMOS microprocessors and their attendant departmental and mini-computers, and later workstations and personal computers. With the widespread adoption of Unix in the early 1980s, it appeared that LLNL (and the other DOE labs) would be left out of the mainstream of computing without a rapid transition to these 'Killer Micros' and modern OS and tools environments. The other interesting advance in the period is that systems were being

Many programming models for massively parallel machines exist, and each has its advantages and disadvantages. In this article we present a programming model that combines features from other programming models that (1) can be efficiently implemented on present and future Cray Research massively parallel processor (MPP) systems and (2) are useful in constructing highly parallel programs. The model supports several styles of programming: message-passing, data parallel, global address (shared) data, and work-sharing. These styles may be combined within the same program. The model includes features that allow a user to define a program in terms of the behavior of the system as a whole, where the behavior of individual tasks is implicit from this systemic definition. (In general, features marked as shared are designed to support this perspective.) It also supports an opposite perspective, where a program may be defined in terms of the behaviors of individual tasks, and a program is implicitly the sum of the behaviors of all tasks. (Features marked as private are designed to support this perspective.) Users can exploit any combination of either set of features without ambiguity and thus are free to define a program from whatever perspective is most appropriate to the problem at hand.
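
The coexistence of a systemic ("shared") and a per-task ("private") perspective can be sketched with ordinary Python threads; this is an illustrative analogy with invented names, not the Cray MPP model itself:

```python
import threading

def run_tasks(data, n_tasks=4):
    """Sum `data` two ways: via a shared accumulator (systemic view)
    and via private per-task partial sums (per-task view)."""
    shared = {"total": 0}              # 'shared' state: one object for the whole system
    lock = threading.Lock()
    privates = [0] * n_tasks           # 'private' state: one slot per task

    def task(tid):
        chunk = data[tid::n_tasks]     # each task works on its own slice
        local = sum(chunk)             # private computation, no locking needed
        privates[tid] = local          # program = sum of individual task behaviors
        with lock:                     # systemic view: update shared state atomically
            shared["total"] += local

    threads = [threading.Thread(target=task, args=(t,)) for t in range(n_tasks)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return shared["total"], sum(privates)

# Both perspectives agree on the result:
print(run_tasks(list(range(100))))     # (4950, 4950)
```

As in the abstract's model, the two styles mix freely in one program: the lock-guarded update is the "shared" feature, the per-slot writes the "private" one.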

Integrated modelling of entire process sequences and the subsequent in-service conditions, and multiphysics modelling of the single process steps, are areas that increasingly support optimisation of manufactured parts. In the present paper, three different examples of modelling manufacturing...... processes from the viewpoint of combined materials and process modelling are presented: solidification of thin-walled ductile cast iron, integrated modelling of spray forming, and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis...

The present work shows the procedure carried out to integrate the code TNXYZ as a calculation tool in the graphical simulation platform Salome. The TNXYZ code proposes a numerical solution of the neutron transport equation in several energy groups, in steady state and three-dimensional geometry. In order to discretize the variables of the transport equation, the code uses the method of discrete ordinates for the angular variable and a nodal method for the spatial dependence. The Salome platform is a graphical environment designed for building, editing, and simulating mechanical models, mainly focused on industry; unlike other software, it can integrate and control an external source code so as to form a complete scheme of pre- and post-processing of information. Before the integration into the Salome platform, the TNXYZ code was upgraded. TNXYZ was programmed in the 1990s using a Fortran 77 compiler; for this reason the code was adapted to the characteristics of current Fortran compilers. In addition, with the intention of extracting partial results over the process sequence, the original structure of the program underwent a modularization process, i.e., the main program was divided into sections where the code performs major operations. This procedure is controlled by the information module (YACS) on the Salome platform, and it could be useful for a subsequent coupling with thermal-hydraulics codes. Finally, with the help of the Monte Carlo code Serpent, several study cases were defined in order to check the process of integration; the verification consisted in comparing the results obtained with the code executed stand-alone against those obtained after it was modernized, integrated, and controlled by the Salome platform. (Author)
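
The modularization idea, splitting the monolithic main program into externally driven stages so that a supervisor (the role YACS plays on Salome) can extract partial results between them, can be sketched as follows. The solver, its update rule, and all names here are invented placeholders, not the actual TNXYZ routines:

```python
class ModularSolver:
    """Toy stand-in for a modularized transport code: each major operation
    is a separately callable stage so an external supervisor can inspect
    partial results between stages."""

    def __init__(self, source, absorption):
        self.flux = source            # crude initial guess
        self.absorption = absorption
        self.history = []             # partial results, visible to the supervisor

    def sweep(self):
        # hypothetical 'major operation': one fixed-point update of the flux
        self.flux = (self.flux + 1.0) / (1.0 + self.absorption)
        return self.flux

    def iterate(self, tol=1e-10, max_iter=200):
        # the supervisor drives the stages and records each partial result
        for _ in range(max_iter):
            old = self.flux
            self.history.append(self.sweep())
            if abs(self.flux - old) < tol:
                break
        return self.flux

s = ModularSolver(source=1.0, absorption=0.5)
print(round(s.iterate(), 6))   # fixed point is 1/absorption = 2.0
```

In the real setting, each stage would be a section of the Fortran main program, and the supervisor loop would live in the platform rather than in the code itself, which is what makes later coupling with thermal-hydraulics codes possible.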

Organizational buying behavior is the decision-making process by which formal organizations establish the need for purchased products and services and identify, evaluate, and choose among alternative brands and suppliers. Understanding the buying decision processes is essential to developing the marketing programs of companies that sell to organizations, or to 'industrial customers'. In business (industrial) marketing, exchange relationships between the organizational selling center and the organizational buying center are crucial. An integrative model of organizational buying behavior offers a systematic framework for analyzing these complementary factors and the effect they have on the behavior of those involved in making buying decisions.

The mission of the Mixed Waste Integrated Program (MWIP) is to develop and demonstrate innovative and emerging technologies for the treatment and management of DOE's mixed low-level wastes (MLLW) for use by its customers, the Office of Waste Operations (EM-30) and the Office of Environmental Restoration (EM-40). The primary goal of MWIP is to develop and demonstrate the treatment and disposal of actual mixed waste (MLLW and MTRU). The vitrification process and the plasma hearth process are scheduled for demonstration on actual radioactive waste in FY95 and FY96, respectively. This will be accomplished by sequential studies of lab-scale non-radioactive testing, followed by bench-scale radioactive testing, followed by field-scale radioactive testing. Both processes create a highly durable final waste form that passes leachability requirements while destroying organics. Material handling technology and off-gas requirements and capabilities for the plasma hearth process and the vitrification process will be established in parallel.

The US Department of Energy (DOE) is responsible for the management and treatment of its mixed low-level wastes (MLLW). MLLW are regulated under both the Resource Conservation and Recovery Act and various DOE orders. Over the next 5 years, DOE will manage over 1.2 m³ of MLLW and mixed transuranic (MTRU) wastes. In order to successfully manage and treat these mixed wastes, DOE must adapt and develop characterization, treatment, and disposal technologies which will meet performance criteria, regulatory approvals, and public acceptance. Although technology to treat MLLW is not currently available without modification, DOE is committed to developing such treatment technologies and demonstrating them at the field scale by FY 1997. The Office of Research and Development's Mixed Waste Integrated Program (MWIP), within the DOE Office of Environmental Management (EM), Office of Technology Development, is responsible for the development and demonstration of such technologies for MLLW and MTRU wastes. MWIP advocates and sponsors expedited technology development and demonstrations for the treatment of MLLW.

This paper reviews various issues in the integration of applications with a building model... (Truncated.)

Numerous studies have analysed farm planning decisions focusing on producer risk preferences. Few studies have focused on farm planning decisions in an integrated crop-livestock farm context. Income variability and means of managing risk continue to receive much attention in farm planning research. Different risk programming models have attempted to minimise the income variability of farm activities. This study attempts to identify the optimal mix of crops and the number of animals the farm needs to keep in the presence of crop production risk, for a range of risk levels. A mixed integer linear programming model was developed to model the decision environment faced by an integrated crop-livestock farmer. The deviation of income from its expected value was used as the measure of risk. A case study is presented with representative data from a farm in the Swartland area. An investigation of the results of the model under different constraints shows that, in general, strategies that depend on crop rotation principles are preferred to strategies that follow mono-crop production practices.
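
A minimal sketch of this kind of decision problem, with invented toy data and brute-force enumeration standing in for the mixed integer linear programming solver used in the study; the risk measure is the total absolute deviation of income from its mean, as in the text:

```python
from itertools import product

# Hypothetical data: income per unit of each activity in three equally
# likely crop-outcome states (toy numbers, not from the Swartland study).
activities = {                 # income in states (good, average, bad)
    "wheat":  (30, 20, 5),
    "canola": (40, 15, -5),
    "sheep":  (18, 18, 16),    # livestock smooths income across states
}
LAND = 10                      # total land/capacity units available

def plan(max_deviation):
    """Pick integer unit counts maximizing expected income, subject to a
    cap on total absolute deviation from the mean (the risk constraint)."""
    best = None
    for mix in product(range(LAND + 1), repeat=len(activities)):
        if sum(mix) > LAND:
            continue
        incomes = [sum(n * a[s] for n, a in zip(mix, activities.values()))
                   for s in range(3)]
        mean = sum(incomes) / 3
        deviation = sum(abs(x - mean) for x in incomes)
        if deviation <= max_deviation and (best is None or mean > best[0]):
            best = (mean, mix)
    return best

print(plan(max_deviation=30))    # risk-averse: all livestock (0, 0, 10)
print(plan(max_deviation=1000))  # risk-neutral: all wheat (10, 0, 0)
```

Tightening the deviation cap shifts the optimum from the highest-margin crop toward the stable livestock activity, which is the qualitative trade-off the study's MILP explores over a range of risk levels.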

Integration adapters are a fundamental part of an integration system, since they provide (business) applications with access to its messaging channel. However, their modeling and configuration remain under-represented. In previous work, the integration control and data flow syntax and semantics were expressed in the Business Process Model and Notation (BPMN) as a semantic model for message-based integration, while adapter and the related quality-of-service modeling were left for further studi...

Recognizing that over half of STEM Ph.D. graduates find work outside of academia, a new, NSF-funded program at Syracuse University, EMPOWER (Education Model Program on Water-Energy Research), is encouraging its graduate students to take ownership of their graduate program and design it to meet their anticipated needs. Launched in 2016, EMPOWER's goal is to prepare graduate students for careers in the water-energy field by offering targeted workshops, professional training coursework, a career capstone experience, a professional development mini-grant program, and an interdisciplinary "foundations" seminar. Through regular student feedback and program evaluation, EMPOWER has learned some important lessons in this first year: career options and graduate students' interests are diverse, requiring individualized programs designed to meet the needs of prospective employers and employees; students need exposure to the range of careers in their field to provide a roadmap for designing their own graduate school experience; effective programs nurture a culture that values professional development, thereby giving students permission to pursue career paths and professional development opportunities that meet their own needs and interests; and existing university resources support the effective and efficient integration of professional development activities into graduate programs. Many of the positive outcomes experienced by EMPOWER students may be achieved in departmental graduate programs with small changes to their graduate curricula.

Dr. Kevin A. Fenton, Director of CDC's National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention, discusses program collaboration and service integration, a strategy that promotes better collaboration between public health programs and supports appropriate service integration at the point-of-care.

Since 25 June 1986, when the CSN (Consejo de Seguridad Nuclear, the Spanish Nuclear Safety Council) approved the Integrated Program of Probabilistic Safety Analysis, this program has articulated the main activities of the CSN. This document summarizes the activities developed during these years and reviews the Integrated Program.

The Steam Generator Tube Integrity Program (SGTIP) was a three-phase program conducted for the US Nuclear Regulatory Commission (NRC) by Pacific Northwest Laboratory (PNL). The first phase involved burst and collapse testing of typical steam generator tubing with machined defects. The second phase of the SGTIP continued the integrity testing work of Phase I, but tube specimens were degraded by chemical means rather than machining methods. The third phase of the program used a removed-from-service steam generator as a test bed for investigating the reliability and effectiveness of in-service nondestructive eddy-current inspection methods and as a source of service-degraded tubes for validating the Phase I and Phase II data on tube integrity. This report describes the results of Phase II of the SGTIP. The objective of this effort included burst and collapse testing of chemically defected pressurized water reactor (PWR) steam generator tubing to validate empirical equations of remaining tube integrity developed during Phase I. Three types of defect geometries were investigated: stress corrosion cracking (SCC), uniform thinning, and elliptical wastage. In addition, a review of the publicly available leak rate data for steam generator tubes with axial and circumferential SCC and a comparison with an analytical leak rate model is presented. Lastly, nondestructive eddy-current (EC) measurements to determine the accuracy of defect depth sizing using conventional and alternate standards are described. To supplement the laboratory EC data and obtain an estimate of EC capability to detect and size SCC, a mini round-robin test utilizing several firms that routinely perform in-service inspections was conducted.

A present-worth generating cost model has been developed and used to evaluate the economic value of integrated plant upgrading and life extension projects in nuclear power plants. This paper shows that integrated plant upgrading programs can be developed in which a mix of near-term availability, power rating, and heat rate improvements can be obtained in combination with life extension. All significant benefits and costs are evaluated from the viewpoint of the utility, as measured in discounted revenue requirement differentials between alternative plans which are equivalent in system generating capacity. The near-term upgrading benefits are shown to enhance the benefit picture substantially. In some cases the net benefit is positive even if the actual life extension proves to be less than expected.
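
The comparison logic can be sketched as a discounted revenue-requirement differential between two equivalent-capacity plans; the cash flows and the 8% discount rate below are illustrative assumptions, not figures from the paper:

```python
def present_worth(cash_flows, rate):
    """Discounted sum of year-by-year revenue requirements."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

def net_benefit(base_plan, upgrade_plan, rate=0.08):
    """Positive result favors the upgrade/life-extension plan: the benefit
    is the revenue-requirement differential between the two plans."""
    return present_worth(base_plan, rate) - present_worth(upgrade_plan, rate)

# Toy numbers (illustrative only): replacement capacity vs. upgraded plant.
base    = [100, 100, 100, 100, 100]   # annual revenue requirements, plan A
upgrade = [120,  90,  90,  90,  90]   # higher year-1 outlay, lower thereafter
print(round(net_benefit(base, upgrade), 2))   # → 12.15, upgrade favored
```

The structure mirrors the paper's criterion: near-term improvements (lower out-year requirements) can keep the differential positive even when the life-extension horizon is shortened, simply because fewer far-future years are being discounted.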

We give an overview of exactly solvable many-body models of quantum optics. Among them is a system of two-level atoms which interact with photons propagating in a one-dimensional (1D) chiral waveguide; exact eigenstates of this system can be explicitly constructed. This approach is also used for a system of closely located atoms in the usual (non-chiral) waveguide or in 3D space. Moreover, it is shown that for an arbitrary atomic system with a cascade spontaneous radiative decay, the fluorescence spectrum can be described by an exact analytic expression which accounts for interference of emitted photons. Open questions related to broken integrability are discussed.

The path-integral generalization of the Duistermaat-Heckman integration formula is investigated for integrable models. It is shown that for models with periodic classical trajectories the path integral reduces to a form similar to the finite-dimensional Duistermaat-Heckman integration formula. This provides a relation between exactness of the stationary-phase approximation and Morse theory. It is also argued that certain integrable models can be related to topological quantum theories. Finally, it is found that in general the stationary-phase approximation presumes that the initial and final configurations are in different polarizations. This is exemplified by the quantization of the SU(2) coadjoint orbit.

... for reasons bearing on professional competence, professional conduct, or financial integrity; who has surrendered such a license while formal disciplinary proceedings involving professional conduct were pending...

Introduction to sequential decision processes covers the use of dynamic programming in studying models of resource allocation, methods for approximating solutions of control problems in continuous time, production control, and more. 1982 edition.
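
A classic resource-allocation model of the kind such texts cover can be sketched with dynamic programming over activities and remaining budget; the payoff table is an assumed toy example:

```python
from functools import lru_cache

# returns[i][u] = payoff from giving activity i a budget of u units (toy data)
returns = [
    [0, 5, 8, 10, 11],
    [0, 4, 9, 11, 12],
    [0, 6, 7,  8,  9],
]
BUDGET = 4

@lru_cache(maxsize=None)
def best(i, remaining):
    """Max total payoff from allocating `remaining` units among
    activities i..end; the Bellman recursion of the allocation model."""
    if i == len(returns):
        return 0
    return max(returns[i][u] + best(i + 1, remaining - u)
               for u in range(remaining + 1))

print(best(0, BUDGET))   # → 20 (achieved by the split 1 + 2 + 1)
```

Memoization turns the exponential enumeration of splits into a table of (activity, remaining-budget) states, which is exactly the sequential-decision view the book develops.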

and terrorism. The Light Water Reactor Sustainability (LWRS) Program is the primary programmatic activity that addresses Objective 1. This document summarizes the LWRS Program's plans. For the LWRS Program, sustainability is defined as the ability to maintain safe and economic operation of the existing fleet of nuclear power plants for a longer-than-initially-licensed lifetime. It has two facets with respect to long-term operations: (1) manage the aging of plant systems, structures, and components so that nuclear power plant lifetimes can be extended and the plants can continue to operate safely, efficiently, and economically; and (2) provide science-based solutions to the industry to implement technology to exceed the performance of the current labor-intensive business model.

Understanding abandoned mine land (AML) changes during land reclamation is crucial for reusing damaged land resources and formulating sound ecological restoration policies. This study combines a linear programming (LP) model and the CLUE-S model to simulate land-use dynamics in the Mentougou District (Beijing, China) from 2007 to 2020 under three reclamation scenarios: the planning scenario based on the general land-use plan of the study area (scenario 1), maximal comprehensive benefits (scenario 2), and maximal ecosystem service value (scenario 3). Nine landscape-scale graph metrics were then selected to describe the landscape characteristics. The results show that the coupled model can simulate the dynamics of AML effectively and that the spatially explicit transformations of AML differed across scenarios. New cultivated land dominates in scenario 1, while construction land and forest land account for major percentages in scenarios 2 and 3, respectively. Scenario 3 has an advantage in most of the selected indices, as its patches combined most closely. To conclude, reclaiming AML by transforming it into more forest can reduce the variability and maintain the stability of the landscape ecological system in the study area. These findings contribute to better mapping of AML dynamics and provide policy support for the management of AML.
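
A toy stand-in for the scenario optimization, with assumed per-hectare scores and a grid search in place of the LP/CLUE-S machinery used in the study:

```python
from itertools import product

AREA = 100   # hectares of abandoned mine land to reclaim (toy figure)

# Assumed per-hectare scores for each target land use (illustrative only):
benefit = {"cultivated": 3.0, "construction": 5.0, "forest": 2.0}  # economic
esv     = {"cultivated": 1.0, "construction": 0.2, "forest": 4.0}  # ecosystem

def allocate(weights, step=10):
    """Grid-search stand-in for the LP: maximize the weighted score
    subject to the allocations summing to the available area."""
    uses = list(weights)
    best = max(
        (mix for mix in product(range(0, AREA + 1, step), repeat=len(uses))
         if sum(mix) == AREA),
        key=lambda mix: sum(a * weights[u] for a, u in zip(mix, uses)),
    )
    return dict(zip(uses, best))

print(allocate(benefit))  # scenario 2 analogue: construction dominates
print(allocate(esv))      # scenario 3 analogue: forest dominates
```

Swapping the objective weights reproduces the qualitative scenario contrast reported above: a benefit-maximizing objective favors construction land, an ecosystem-service objective favors forest.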

Periodically integrated time series require a periodic differencing filter to remove the stochastic trend. A non-periodically integrated time series needs the first-difference filter for similar reasons. When the changing seasonal fluctuations for the non-periodically integrated series can be
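
The two filters can be contrasted on a toy quarterly series. Note that this sketch uses plain lag-4 (seasonal) differencing as a simple stand-in; a genuinely periodic filter applies season-varying coefficients x_t − α_s·x_{t−1}, which is not attempted here:

```python
def first_difference(x):
    """Non-periodic integration: remove the trend with y_t = x_t - x_{t-1}."""
    return [b - a for a, b in zip(x, x[1:])]

def seasonal_difference(x, period=4):
    """Difference each observation against the same season one period
    earlier, y_t = x_t - x_{t-period} (removes trend and fixed seasonality)."""
    return [x[t] - x[t - period] for t in range(period, len(x))]

# Quarterly series with a unit trend and a fixed seasonal pattern:
season = [3, -1, -2, 0]
x = [t + season[t % 4] for t in range(12)]

print(first_difference(x))     # the seasonal pattern survives the differencing
print(seasonal_difference(x))  # constant output: lag-4 filter removes it
```

Running the first-difference filter leaves the repeating seasonal signature in the output, while the lag-period filter reduces the series to a constant, which is the sense in which the choice of filter must match the form of integration.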

Science, technology, engineering, arts and math (STEAM) education integrates science with art, presenting a unique and interesting opportunity to increase accessibility in science for learners. This case study examines an afterschool program grounded in art and science integration. Specifically, I studied the goals of the program, its implementation, and the student experience (thinking, feeling and doing) as students participated in the program. My findings suggest that these programs can be powerful methods to nurture scientific literacy, creativity and emotional development in learners. To do so, this program made connections between disciplines and beyond, integrated holistic teaching and learning practices, and continually adapted programming while also responding to challenges. The program is therefore specially suited to engage the heads, hands and hearts of learners, and can make an important contribution to their learning and development. To conclude, I provide some recommendations for STEAM implementation in both formal and informal learning settings.

SLIDE is a FORTRAN IV program for producing 35 mm color slides on the Control Data CYBER-74. SLIDE interfaces with the graphics package, DISSPLA, on the CYBER-74. It was designed so that persons with no previous computer experience can easily and quickly generate their own textual 35 mm color slides for verbal presentations. SLIDE's features include seven different colors, five text sizes, ten tab positions, and two page sizes. As many slides as desired may be produced during any one run of the program. Each slide is designed to represent an 8 1/2 in. x 11 in. or an 11 in. x 8 1/2 in. page. The input data cards required to run the SLIDE program and the program output are described. Appendixes contain a sample program run showing input, output, and the resulting slides produced, and a FORTRAN listing of the SLIDE program. (U.S.)

In 1993, I tested a radio-controlled airplane designed by Jim Walker of Brigham Young University for low-elevation aerial photography. Model-air photography retains most of the advantages of standard aerial photography --- the photographs can be used to detect lineaments, to map roads and buildings, and to construct stereo pairs to measure topography --- and it is far less expensive. Proven applications on the Oak Ridge Reservation include: updating older aerial records to document new construction; using repeated overflights of the same area to capture seasonal changes in vegetation and the effects of major storms; and detecting waste trench boundaries from the color and character of the overlying grass. Aerial photography is only one of many possible applications of radio-controlled aircraft. Currently, I am funded by the Department of Energy's Office of Technology Development to review the state of the art in microavionics, both military and civilian, to determine ways this emerging technology can be used for environmental site characterization. Being particularly interested in geophysical applications, I am also collaborating with electrical engineers at Oak Ridge National Laboratory to design a model plane that will carry a 3-component flux-gate magnetometer and a global positioning system, which I hope to test in the spring of 1994

Sandia National Laboratories (SNL), Albuquerque, New Mexico, supports the International Technology Exchange Division (ITED) through the integration of all international activities conducted within the DOE's Office of Environmental Management (EM).

... Code of Federal Regulations is available via the Federal Digital System at: http://www.gpo.gov/fdsys... educational programs or those that provide marketing, advertising, recruiting, or admissions services. We have... the institution to provide services, such as food service, other than educational programs, marketing...

... College and Higher Education (TEACH) Grant Program, the Federal Pell Grant Program, and the Academic Competitiveness Grant (ACG) and National Science and Mathematics Access to Retain Talent Grant (National Smart... is most likely to be obtained. As the primary function of admissions representatives is to serve as...

A FORTRAN program NLOM for nonlocal optical model calculations is described. It is based on a method recently developed by Kim and Udagawa, which utilizes the Lanczos technique for solving integral equations derived from the nonlocal Schroedinger equation. (orig.)
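The Lanczos technique mentioned above reduces a symmetric operator to a small tridiagonal matrix whose spectrum approximates the original. A minimal illustrative sketch in Python follows (this is not the NLOM FORTRAN code; the matrix and step count are invented for demonstration):

```python
# Minimal Lanczos tridiagonalization sketch for a small symmetric matrix.
# Illustrative only -- NLOM applies the technique to integral-equation
# kernels derived from the nonlocal Schroedinger equation.

def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def lanczos(A, v0, m):
    """Return alpha (diagonal) and beta (off-diagonal) entries of the
    m-step tridiagonal reduction of symmetric A, starting from v0."""
    n = len(v0)
    norm = sum(x * x for x in v0) ** 0.5
    v = [x / norm for x in v0]
    v_prev = [0.0] * n
    alpha, beta = [], []
    b = 0.0
    for _ in range(m):
        w = matvec(A, v)
        a = sum(wi * vi for wi, vi in zip(w, v))
        alpha.append(a)
        w = [wi - a * vi - b * pi for wi, vi, pi in zip(w, v, v_prev)]
        b = sum(x * x for x in w) ** 0.5
        beta.append(b)
        if b < 1e-12:          # invariant subspace found; reduction is exact
            break
        v_prev, v = v, [x / b for x in w]
    return alpha, beta

A = [[2.0, 1.0, 0.0],
     [1.0, 2.0, 1.0],
     [0.0, 1.0, 2.0]]
alpha, beta = lanczos(A, [1.0, 0.0, 0.0], 3)
```

Because this small example is already tridiagonal, the reduction reproduces it exactly; for large dense kernels the payoff is that only matrix-vector products are needed.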

The need for an effective long term Plant Configuration Management Program (CMP) has been demonstrated in response to Plant Design Modification and Plant Life Extension activities. Having particular need are those Utilities operating early vintage nuclear plants, where numerous modifications have been made without the benefit of an accurate, complete, properly maintained and controlled Design Basis. This paper presents a model for a long term, cost effective CMP which is based on and driven by the development, maintenance and control of accurate plant Design Basis Information. The model also provides a systematic approach for devising and implementing an integrated Plant CMP based on the essential attributes of the Plant Configuration Management, including Design Basis

A brief description is given of software for the automated development of models: an integrating modular programming system, a program module generator, and a program module library providing thermal-hydraulic calculation of process dynamics in power unit equipment components and on-line simulation of control system operation. Technical recommendations for model development are based on experience in the creation of concrete models of NPP power units. 8 refs., 1 tab., 4 figs

This work presents the procedure used to integrate the TNXYZ code as a processing tool into the SALOME graphic simulation platform. The TNXYZ code solves the steady-state neutron transport equation for several energy groups, discretizing the angular variable by the discrete ordinates method and the spatial variable by nodal methods. The SALOME platform is a graphical environment designed for the construction, editing and simulation of mechanical models aimed at industry; unlike other software, it allows external source codes to be integrated into the environment to form a complete scheme of execution, supervision, and pre- and post-processing of information. The TNXYZ code was written in the 1990s in Fortran, so to be usable today it had to be updated for current compilers; in addition, the original scheme underwent a modularization process, that is, the main program was divided into sections where the code carries out important operations, with the aim of making data extraction along the processing sequence more flexible, which may be useful in a later development of coupling. Finally, to verify the integration, a BWR fuel assembly was modeled, as well as a control cell. The cross sections were obtained with the Serpent Monte Carlo code. Some results obtained with Serpent were used to verify and begin the validation of the code, an acceptable agreement being obtained in the infinite multiplication factor. The validation process will be extended and is planned to be presented in a future work. This work is part of the development of the research group formed between the Escuela Superior de Fisica y Matematicas del Instituto Politecnico Nacional (IPN) and the Instituto Nacional de Investigaciones Nucleares (ININ), in which a Mexican simulation platform for nuclear reactors is being developed. (Author)

Integrable systems are investigated, especially the rational and trigonometric Gaudin models. The Gaudin models are diagonalized for the case of classical Lie algebras. Their relation to the other integrable models and to the quantum inverse scattering method is investigated. Applications in quantum optics and plasma physics are discussed. (author). 94 refs

Full Text Available In recent years, many tools have been proposed to reduce the programming learning difficulties felt by many students. Our group has contributed to this effort through the development of several tools, such as VIP, SICAS, OOP-Anim, SICAS-COL and H-SICAS. Even though we had some positive results, the utilization of these tools doesn't seem to significantly reduce weaker students' difficulties. These students need stronger support to motivate them to get engaged in learning activities, inside and outside the classroom. Nowadays, many technologies are available to create contexts that may help to accomplish this goal. We consider that a promising path goes through the integration of solutions. In this paper we analyze the features, strengths and weaknesses of the tools developed by our group. Based on these considerations we present a new environment, integrating different types of pedagogical approaches, resources, tools and technologies for programming learning support. With this environment, currently under development, it will be possible to review contents and lessons, based on video and screen captures. The support for collaborative tasks is another key point to improve and stimulate different models of teamwork. The platform will also allow the creation of various alternative models (learning objects) for the same subject, enabling personalized learning paths adapted to each student's knowledge level, needs and preferential learning styles. The learning sequences will work as a study organizer, following a suitable taxonomy, according to students' cognitive skills. Although the main goal of this environment is to support students with more difficulties, it will provide a set of resources supporting the learning of more advanced topics. Software engineering techniques and representations, object orientation and event programming are features that will be available in order to promote the learning progress of students.

A large number of mathematical models have been developed for supporting optimization of land-use allocation; however, few of them simultaneously consider land suitability (e.g., physical features and spatial information) and various uncertainties existing in many factors (e.g., land availabilities, land demands, land-use patterns, and ecological requirements). This paper incorporates geographic information system (GIS) technology into interval-probabilistic programming (IPP) for land-use planning management (IPP-LUPM). GIS is utilized to assemble data for the aggregated land-use alternatives, and IPP is developed for tackling uncertainties presented as discrete intervals and probability distributions. Based on GIS, the suitability maps of different land uses are provided by the outcomes of land suitability assessment and spatial analysis. The maximum area of every type of land use obtained from the suitability maps, as well as various objectives/constraints (i.e., land supply, land demand of socioeconomic development, future development strategies, and environmental capacity), is used as input data for the optimization of land-use areas with the IPP-LUPM model. The proposed model not only considers the outcomes of land suitability evaluation (i.e., topography, ground conditions, hydrology, and spatial location) but also involves economic factors, food security, and eco-environmental constraints, which can effectively reflect various interrelations among different aspects in a land-use planning management system. The case study results at Suzhou, China, demonstrate that the model can help to examine the reliability of satisfying (or risk of violating) system constraints under uncertainty. Moreover, it may identify the quantitative relationship between land suitability and system benefits. Willingness to arrange the land areas based on the condition of highly suitable land will not only reduce the potential conflicts on the environmental system but also lead to a lower

Biomass gasification is an approach to producing energy and/or biofuels that could be integrated into existing forest product production facilities, particularly at pulp mills. Existing process heat and power loads tend to favor integration at existing pulp mills. This paper describes a generic modeling system for evaluating integrated biomass gasification business...

Discusses the role that information resource management (IRM) plays in educational program-oriented budgeting (POB), and presents a theoretical IRM model. Highlights include design considerations for integrated data systems; database management systems (DBMS); and how POB data can be integrated to enhance its value and use within an educational…

Describes the Integrated Special Education-English Project (ISEP) which facilitated the gradual integration of special education and English teacher preparation programs. A description of the ISEP model and a case study are included. The case study indicated student teachers who participated in the ISEP improved special education and English…

Dr. Kevin A. Fenton, Director of CDC's National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention, discusses program collaboration and service integration, a strategy that promotes better collaboration between public health programs and supports appropriate service integration at the point-of-care. Created: 9/15/2010 by National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention. Date Released: 9/15/2010.

Argues that two-year college business programs need to provide moral guidance and leadership to students to help stem the proliferation of fraudulent and questionable financial reporting practices. Reviews amoral and moral unity theories of business ethics. Discusses barriers to ethical instruction in business curricula, and ways to overcome them.…

Experiments designed to investigate the physics of particle transport and heating of dense plasmas have been carried out in a number of facilities around the world since the publication of the fast ignition concept in 1997. To date a number of integrated experiments, examining the capsule implosion and subsequent heating, have been carried out on the Gekko facility at the Institute of Laser Engineering (ILE) Osaka, Japan. The coupling of energy by the short pulse into the pre-compressed core in these experiments was very encouraging. More facilities capable of carrying out integrated experiments are currently under construction: FIREX at ILE, the Omega EP facility at the University of Rochester, Z PW at Sandia National Lab, LIL in France and eventually high energy PW beams on the NIF. This presentation will review the current status of experiments in this area and discuss the capabilities of integrated fast ignition research that will be required to design the proof-of-principle and scaling experiments for fast ignition to be carried out on the NIF. (Author)

One of the recent challenges for the hydrologic research community is the need for the development of coupled systems that involve the integration of hydrologic, atmospheric and socio-economic relationships. This poses a requirement for novel modelling frameworks that can accurately represent complex systems, given the limited understanding of underlying processes, increasing volume of data and high levels of uncertainty. Each of the existing hydrological models varies in terms of conceptualization and process representation and is best suited to capture the environmental dynamics of a particular hydrological system. Data-driven approaches can be used in the integration of alternative process hypotheses in order to achieve a unified theory at catchment scale. The key steps in the implementation of an integrated modelling framework that is informed by prior understanding and data include: choice of the technique for the induction of knowledge from data; identification of alternative structural hypotheses; definition of rules and constraints for meaningful, intelligent combination of model component hypotheses; and definition of evaluation metrics. This study aims at defining a Genetic Programming based modelling framework that tests different conceptual model constructs against a wide range of objective functions and evolves accurate and parsimonious models that capture dominant hydrological processes at catchment scale. In this paper, GP initializes the evolutionary process using modelling decisions inspired by the Superflex framework [Fenicia et al., 2011] and automatically combines them into model structures that are scrutinized against observed data using statistical, hydrological and flow-duration-curve based performance metrics. The collaboration between data-driven and physical, conceptual modelling paradigms improves the ability to model and manage hydrologic systems. Fenicia, F., D. Kavetski, and H. H. Savenije (2011), Elements of a flexible approach

Pathological gambling was classified under impulse control disorders within the International Classification of Diseases (ICD-10) (WHO 1992), but the most recent Diagnostic and Statistical Manual, 5th edition (DSM-5) (APA 2013), has recognized pathological gambling as the first disorder within a new diagnostic category of behavioral addictions - Gambling disorder. Pathological gambling is a disorder in progression, and we hope that our experience in the treatment of pathological gambling in the Daily Hospital for Addictions at The Institute of Mental Health, through the original "Integrative-systemic model", would be of use to colleagues dealing with this pathology. This model of treatment of pathological gambling is based on a multi-systemic approach and primarily represents an integration of family and cognitive-behavioral therapy, with elements of psychodynamic, existential and pharmacological therapy. The model is based on the book "Pathological gambling - with self-help manual" by Dr Mladenovic and Dr Lazetic, and has been designed in the form of a program that lasts 10 weeks in the intensive phase, and then continues for two years in the form of "extended treatment" ("after care"). The intensive phase is divided into three segments: education, insight with initial changes, and analysis of the achieved changes with the definition of plans and areas that need to be addressed in the extended treatment. "Extended treatment" lasts for two years in the form of group therapy, during which there is a second-order change of the identified patient, but also of other family members. Pathological gambling has been treated in the form of systemic family therapy for more than 10 years at the Institute of Mental Health (IMH) in Belgrade. For the second year in a row the treatment is carried out by the modern "Integrative-systemic model". If abstinence from gambling within the period of one year after completion of the intensive phase of treatment is taken as the main criterion of

Among the barrage of agreements faced by federal facilities are the State Oversight Agreements (known as Agreements in Principle in many states). These agreements between the Department of Energy (DOE) and the states fund the states to conduct independent environmental monitoring and oversight, which requires plans, studies, inventories, models, and reports from DOE and its management and operating contractors. Many states have signed such agreements, including Tennessee, Kentucky, Washington, Idaho, Colorado, California, and Florida. This type of oversight agreement originated in Colorado as a result of environmental concerns at the Rocky Flats Plant. The 5-year State Oversight Agreements for Tennessee and Kentucky became effective on May 13, 1991, and fund these states nearly $21 million and $7 million, respectively. Implementation of these "comprehensive and integrated" agreements is particularly complex in Tennessee, where the DOE Oak Ridge Reservation houses three installations with distinctly different missions. The program development and strategic planning required for coordinating and integrating a program of this magnitude are discussed. Included are the organizational structure and interfaces required to define and coordinate program elements across plants and to effectively negotiate scope and schedules with the state. The planned Program Management Plan, which will contain implementation and procedural guidelines, and the management control system for detailed tracking of activities and costs are outlined. Additionally, issues inherent in the nature of the agreements and implementation of a program of this magnitude are discussed. Finally, a comparison of the agreements for Tennessee, Kentucky, Colorado, and Idaho is made to gain a better understanding of the similarities and differences in State Oversight Agreements to aid in implementation of these agreements.

The purpose of this single case study was to examine a grant-funded program of professional development (PD) at a small rural high school in Ohio. Evidence has shown that the current model of technology professional development in-service sessions has had little impact on classroom technology integration. This PD program focused on 21st Century…

Using technology with children in play-based early learning programs creates questions for some within the Early Childhood Education (ECE) community. This paper presents how two faculty who teach in ECE-related degree programs integrated educational technology into their teaching pedagogy as a way to model to their students how it can be used to…

Due to the recent emphasis on mathematical modeling, many ecologists are using mathematics and computers more than ever, and engineers, mathematicians and physical scientists are now included in ecological projects. However, the individual ecologist, with intuitive knowledge of the system, still requires the means to critically examine and adjust system models. An interactive program was developed with the primary goal of allowing an ecologist with minimal experience in either mathematics or computers to develop a system model. It has also been used successfully by systems ecologists, engineers, and mathematicians. This program was written in FORTRAN for the DEC PDP-10, a remote terminal system at Oak Ridge National Laboratory. However, with relatively minor modifications, it can be implemented on any remote terminal system with a FORTRAN IV compiler, or equivalent. This program may be used to simulate any phenomenon which can be described as a system of ordinary differential equations. The program allows the user to interactively change system parameters and/or initial conditions, to interactively select a set of variables to be plotted, and to model discontinuities in the state variables and/or their derivatives. One of the most useful features to the non-computer specialist is the ability to interactively address the system parameters by name and to interactively adjust their values between simulations. These and other features are described in greater detail
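The core idea described above, a general ODE integrator whose parameters the user can address by name and adjust between simulations, can be sketched roughly as follows. This is a hypothetical Python illustration, not the original FORTRAN program; the RK4 integrator and the logistic-growth example are stand-ins:

```python
# Sketch of the abstract's idea: simulate a system of ODEs while letting the
# user address parameters by name between runs. Names and the logistic-growth
# example are illustrative, not the original code.

def rk4_step(f, t, y, h, params):
    """One classical 4th-order Runge-Kutta step for dy/dt = f(t, y, params)."""
    k1 = f(t, y, params)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)], params)
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)], params)
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)], params)
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def simulate(f, y0, t_end, h, params):
    t, y = 0.0, list(y0)
    while t < t_end - 1e-12:
        y = rk4_step(f, t, y, h, params)
        t += h
    return y

# Example system: logistic growth dN/dt = r*N*(1 - N/K)
def logistic(t, y, p):
    return [p["r"] * y[0] * (1 - y[0] / p["K"])]

params = {"r": 0.5, "K": 100.0}          # parameters addressed by name
n1 = simulate(logistic, [10.0], 40.0, 0.1, params)[0]
params["r"] = 1.0                        # adjust a named parameter between runs
n2 = simulate(logistic, [10.0], 40.0, 0.1, params)[0]
```

Both runs approach the carrying capacity K from below; raising the named growth rate `r` brings the trajectory closer to equilibrium at the same end time.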

The long-term economic outcome of energy-related industrial investment projects is difficult to evaluate because of uncertain energy market conditions. In this article, a general, multistage, stochastic programming model for the optimization of investments in process integration and industrial energy technologies is proposed. The problem is formulated as a mixed-binary linear programming model where uncertainties are modelled using a scenario-based approach. The objective is to maximize the expected net present value of the investments, which enable heat savings and decreased energy imports or increased energy exports at an industrial plant. The proposed modelling approach enables long-term planning of industrial, energy-related investments through the simultaneous optimization of immediate and later decisions. The stochastic programming approach is also suitable for modelling possibly complex process integration constraints. The general model formulation presented here is a suitable basis for more specialized case studies dealing with optimization of investments in energy efficiency. -- Highlights: → Stochastic programming approach to long-term planning of process integration investments. → Extensive mathematical model formulation. → Multi-stage investment decisions and scenario-based modelling of uncertain energy prices. → Results illustrate how investments made now affect later investment and operation opportunities. → Approach for evaluation of robustness with respect to variations in the probability distribution.
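The scenario-based idea can be illustrated with a toy two-stage sketch: binary investment decisions are fixed now, and the expected net present value of the resulting energy savings is evaluated over price scenarios. All names and numbers below are invented for illustration; the actual model is a mixed-binary linear program, which this brute-force sketch only approximates:

```python
# Toy sketch of scenario-based investment planning: choose binary investments
# now, then value the energy savings under uncertain future energy prices.
# All investment options, costs, and scenario data are invented.

from itertools import product

investments = {                      # name: (capital cost, annual MWh saved)
    "heat_exchanger": (400.0, 120.0),
    "new_boiler":     (900.0, 260.0),
}
scenarios = [                        # (probability, energy price per MWh)
    (0.5, 2.0),
    (0.3, 4.0),
    (0.2, 6.0),
]
years, discount = 10, 0.08
annuity = sum(1 / (1 + discount) ** t for t in range(1, years + 1))

def expected_npv(choice):
    """Expected NPV of a 0/1 investment vector over all price scenarios."""
    capex = sum(investments[n][0] for n, on in zip(investments, choice) if on)
    saved = sum(investments[n][1] for n, on in zip(investments, choice) if on)
    value = sum(p * price * saved * annuity for p, price in scenarios)
    return value - capex

# Enumerate all binary first-stage decisions (feasible only at toy scale;
# the real model solves this with MILP techniques).
best = max(product([0, 1], repeat=len(investments)), key=expected_npv)
```

With these invented numbers both options pay off in expectation, so the maximizer selects both; shrinking the savings or raising the discount rate flips individual decisions off.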

..., requires the GAO to conduct a study and report on issues pertaining to the oral health of children... response to, an initiative by a governmental entity, such as the oral health program with the Federal... already understand the employment demands in their field. The commenters also believed that because...

With the passage of the Workforce Innovation and Opportunity Act (WIOA) of 2014, Northampton Community College began the creation of Integrated Education and Training (IE&T) programs in October 2015. After a needs assessment was conducted with the partners, programs were created to address the needs in the hospitality and healthcare sectors.…

The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

The Integrated Main Propulsion System (MPS) Performance Reconstruction process provides the MPS post-flight data files needed for postflight reporting to the project integration management and key customers to verify flight performance. This process/model was used as the baseline for the currently ongoing Space Launch System (SLS) work. The process utilizes several methodologies, including multiple software programs, to model integrated propulsion system performance through space shuttle ascent. It is used to evaluate integrated propulsion systems, including propellant tanks, feed systems, rocket engine, and pressurization systems performance throughout ascent based on flight pressure and temperature data. The latest revision incorporates new methods based on main engine power balance model updates to model higher mixture ratio operation at lower engine power levels.

A CAD/CMM workpiece modeling system based on the IGES file format is proposed. The modeling system is implemented using a new method for labelling the tolerance items of a 3D workpiece. The concept of a "feature face" is used in the method. First, the CAD data of the workpiece are extracted and recognized automatically. Then a workpiece model is generated, which integrates the pure 3D geometric form with its corresponding inspection items. The principle of workpiece modeling is also presented. Finally, the experimental results are shown and the correctness of the model is verified.

Integrated hydrological models are useful tools for water resource management and research, and advances in computational power and the advent of new observation types have resulted in the models generally becoming more complex and distributed. However, the models are often characterized by a high degree of parameterization, which results in significant model uncertainty that cannot be reduced much, because observations are often scarce and often take the form of point measurements. Data assimilation shows great promise for use in integrated hydrological models, as it allows observations to be efficiently combined with models to improve model predictions, reduce uncertainty and estimate model parameters. In this thesis, a framework for assimilating multiple observation types and updating multiple components and parameters of a catchment-scale integrated hydrological model is developed and tested...
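The principle behind data assimilation here, weighting a model forecast against an observation by their respective uncertainties, can be shown with a minimal scalar Kalman-style update. This is illustrative only; the thesis framework assimilates multiple observation types into a distributed model:

```python
# Minimal scalar Kalman-style update: combine a model forecast of, say, a
# groundwater head with a point observation, weighting by their variances.
# Purely illustrative of the principle, not the thesis framework.

def kalman_update(forecast, var_f, obs, var_o):
    """Return the analysis state and its (reduced) variance."""
    gain = var_f / (var_f + var_o)          # Kalman gain
    analysis = forecast + gain * (obs - forecast)
    var_a = (1 - gain) * var_f              # analysis variance
    return analysis, var_a

# Uncertain forecast (variance 4.0) meets a precise observation (variance 1.0):
head, var = kalman_update(forecast=10.0, var_f=4.0, obs=11.0, var_o=1.0)
```

The analysis lies between forecast and observation, pulled toward whichever is more certain, and its variance is smaller than either input variance; this is the mechanism by which assimilation reduces model uncertainty.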

We present a model of the eye movement system in which the programming of an eye movement is the result of the competitive integration of information in the superior colliculi (SC). This brain area receives input from occipital cortex, the frontal eye fields, and the dorsolateral prefrontal cortex,

Nuclear power has safely, reliably, and economically contributed almost 20% of electrical generation in the United States over the past two decades. It remains the single largest contributor (more than 70%) of non-greenhouse-gas-emitting electric power generation in the United States. Domestic demand for electrical energy is expected to experience a 31% growth from 2009 to 2035. At the same time, most of the currently operating nuclear power plants will begin reaching the end of their initial 20-year extension to their original 40-year operating license for a total of 60 years of operation. Figure E-1 shows projected nuclear energy contribution to the domestic generating capacity. If current operating nuclear power plants do not operate beyond 60 years, the total fraction of generated electrical energy from nuclear power will begin to decline - even with the expected addition of new nuclear generating capacity. The oldest commercial plants in the United States reached their 40th anniversary in 2009. The U.S. Department of Energy Office of Nuclear Energy's Research and Development Roadmap (Nuclear Energy Roadmap) organizes its activities around four objectives that ensure nuclear energy remains a compelling and viable energy option for the United States. The four objectives are as follows: (1) develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the life of the current reactors; (2) develop improvements in the affordability of new reactors to enable nuclear energy to help meet the Administration's energy security and climate change goals; (3) develop sustainable nuclear fuel cycles; and (4) understand and minimize the risks of nuclear proliferation and terrorism. The Light Water Reactor Sustainability (LWRS) Program is the primary programmatic activity that addresses Objective 1. This document summarizes the LWRS Program's plans.


An on-line accelerator modeling facility is currently under development at CEBAF. The model server, which is integrated with the EPICS control system, provides coupled and 2nd-order matrices for the entire accelerator, and forms the foundation for automated model-based control and diagnostic applications. Four types of machine models are provided: design, golden (certified), live, and scratch (simulated). Provisions are also made for the use of multiple lattice modeling programs such as DIMAD, PARMELA, and TLIE. Design and implementation details are discussed. 2 refs., 4 figs

The World Integrated Nuclear Evaluation System (WINES) is an aggregate demand-based partial equilibrium model used by the Energy Information Administration (EIA) to project long-term domestic and international nuclear energy requirements. WINES follows a top-down approach in which economic growth rates, delivered energy demand growth rates, and electricity demand are projected successively to ultimately forecast total nuclear generation and nuclear capacity. WINES could be potentially used to produce forecasts for any country or region in the world. Presently, WINES is being used to generate long-term forecasts for the United States, and for all countries with commercial nuclear programs in the world, excluding countries located in centrally planned economic areas. Projections for the United States are developed for the period from 2010 through 2030, and for other countries for the period starting in 2000 or 2005 (depending on the country) through 2010. EIA uses a pipeline approach to project nuclear capacity for the period between 1990 and the starting year for which the WINES model is used. This approach involves a detailed accounting of existing nuclear generating units and units under construction, their capacities, their actual or estimated time of completion, and the estimated date of retirements. Further detail on this approach can be found in Appendix B of Commercial Nuclear Power 1991: Prospects for the United States and the World
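The top-down chain described above, growth rates propagating down through electricity demand to nuclear generation and capacity, amounts to successive compounding and share calculations. A hypothetical sketch, with all input numbers invented:

```python
# Sketch of a WINES-style top-down projection: growth assumptions propagate
# from electricity demand down to required nuclear capacity.
# All input values are invented for illustration.

def project_nuclear_capacity(elec_twh0, elec_growth, nuclear_share,
                             capacity_factor, years):
    elec = elec_twh0 * (1 + elec_growth) ** years         # electricity, TWh
    nuclear_twh = elec * nuclear_share                    # nuclear share, TWh
    hours = 8760.0                                        # hours per year
    # convert TWh -> MWh, divide by effective full-power hours -> MWe
    return nuclear_twh * 1e6 / (hours * capacity_factor)

cap = project_nuclear_capacity(elec_twh0=3000.0, elec_growth=0.015,
                               nuclear_share=0.20, capacity_factor=0.90,
                               years=20)
```

With these invented inputs the chain yields roughly 100 GWe of required capacity; the real model layers economic growth and delivered-energy demand above the electricity step, and a pipeline accounting of existing units below it.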

The aim of this paper is to highlight the significance of integrated governance in bringing about community participation, improved service delivery, accountability of public systems and human resource rationalisation. It discusses the strategies of innovative institutional structures in translating such integration in the areas of public health and nutrition for poor communities. The paper draws on experience of initiating integrated governance through innovations in health and nutrition programming in the resource-poor state of Chhattisgarh, India, at different levels of governance structures--hamlets, villages, clusters, blocks, districts and at the state. The study uses mixed methods--i.e. document analysis, interviews, discussions and quantitative data from facilities surveys--to present a case study analyzing the process and outcome of integration. The data indicate that integrated governance initiatives improved convergence between health and nutrition departments of the state at all levels. Also, innovative structures are important to implement the idea of integration, especially in contexts that do not have historical experience of such partnerships. Integration also contributed towards improved participation of communities in self-governance, community monitoring of government programs, and therefore, better services. As governments across the world, especially in developing countries, struggle towards achieving better governance, integration can serve as a desirable process to address this. Integration can affect the decentralisation of power, inclusion, efficiency, accountability and improved service quality in government programs. The institutional structures detailed in this paper can provide models for replication in other similar contexts for translating and sustaining the idea of integrated governance. This paper is one of the few to investigate innovative public institutions and community mobilisation to explore this important, and under

The Australian Teaching Teachers for the Future (TTF) project undertook ICT integration in teacher preparation programmes across all 39 Australian teacher education institutions, and highlighted the need for guidelines to inform systemic ICT integration approaches. A Social Ecological Model (SEM) was used to positively inform integration…

Mixed Waste Integrated Program (MWIP) is sponsored by the US Department of Energy (DOE), Office of Technology Development, Waste Management Division. The strategic objectives of MWIP are defined in the Mixed Waste Integrated Program Strategic Plan, and expanded upon in the MWIP Program Management Plan. This MWIP Quality Assurance Requirement Plan (QARP) applies to mixed waste treatment technologies involving both hazardous and radioactive constituents. As a DOE organization, MWIP is required to develop, implement, and maintain a written Quality Assurance Program in accordance with DOE Order 4700.1, Project Management System; DOE Order 5700.6C, Quality Assurance; DOE Order 5820.2A, Radioactive Waste Management; ASME NQA-1, Quality Assurance Program Requirements for Nuclear Facilities; and ANSI/ASQC E4-19xx, Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs. The purpose of the MWIP QA program is to establish controls which address the requirements in 5700.6C, with the intent to minimize risks and potential environmental impacts, and to maximize environmental protection, health, safety, reliability, and performance in all program activities. QA program controls are established to assure that each participating organization conducts its activities in a manner consistent with the risks posed by those activities.

The computer program DITTY (Dose Integrated Over Ten Thousand Years) was developed to determine the collective dose from long-term nuclear waste disposal sites resulting from groundwater pathways. DITTY estimates the time integral of collective dose over a ten-thousand-year period for time-variant radionuclide releases to surface waters, wells, or the atmosphere. This document includes the following information on DITTY: a description of the mathematical models, program design, data file requirements, input preparation, output interpretation, sample problems, and program-generated diagnostic messages.
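The quantity DITTY estimates is a time integral of collective dose. As an illustration of the idea only (not the DITTY implementation), the sketch below integrates an assumed exponentially decaying release term by the trapezoidal rule over a ten-thousand-year horizon; the release rate, dose factor, and half-life are hypothetical inputs.

```python
# Illustrative sketch: time integral of a time-variant collective dose
# rate over 10,000 years, for an exponentially decaying release.
import math

def integrated_collective_dose(release_rate, dose_factor, half_life,
                               horizon_years=10_000, step_years=10.0):
    """Integrated collective dose (person-Sv) by trapezoidal rule."""
    lam = math.log(2) / half_life           # decay constant (1/yr)
    t, total = 0.0, 0.0
    prev = release_rate * dose_factor        # dose rate at t = 0
    while t < horizon_years:
        t += step_years
        cur = release_rate * math.exp(-lam * t) * dose_factor
        total += 0.5 * (prev + cur) * step_years   # trapezoid panel
        prev = cur
    return total
```

For this simple release term the integral has the closed form R·D·(1 − e^(−λT))/λ, which makes the numerical sketch easy to check.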

The MCHF-MLTPOL program performs the angular integrations necessary for expressing the matrix elements of transition operators, E1, E2, ..., or M1, M2, ..., as linear combinations of radial integrals. All matrix elements for transitions between two lists of configuration states will be evaluated. A limited amount of non-orthogonality is allowed between orbitals of the initial and final state. (orig.)
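Schematically, the output of such an angular-integration step can be written as follows; the notation here is generic, not MLTPOL's own:

```latex
\langle \gamma J \,\|\, O^{(k)} \,\|\, \gamma' J' \rangle
  \;=\; \sum_{a,b} d^{(k)}_{ab}\,
        \langle n_a \ell_a \,|\, r^{k} \,|\, n_b \ell_b \rangle
```

where the coefficients \(d^{(k)}_{ab}\) come from the angular integrations performed by the program, and the one-electron radial integrals \(\langle n_a\ell_a | r^{k} | n_b\ell_b \rangle\) are left to be evaluated by a subsequent radial code.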

Nuclear power has safely, reliably, and economically contributed almost 20% of electrical generation in the United States over the past two decades. It remains the single largest contributor (more than 70%) of non-greenhouse-gas-emitting electric power generation in the United States. Domestic demand for electrical energy is expected to experience a 31% growth from 2009 to 2035. At the same time, most of the currently operating nuclear power plants will begin reaching the end of their initial 20-year extension to their original 40-year operating license for a total of 60 years of operation. Figure E-1 shows projected nuclear energy contribution to the domestic generating capacity. If current operating nuclear power plants do not operate beyond 60 years, the total fraction of generated electrical energy from nuclear power will begin to decline—even with the expected addition of new nuclear generating capacity. The oldest commercial plants in the United States reached their 40th anniversary in 2009. The U.S. Department of Energy Office of Nuclear Energy’s Research and Development Roadmap (Nuclear Energy Roadmap) organizes its activities around four objectives that ensure nuclear energy remains a compelling and viable energy option for the United States. The four objectives are as follows: (1) develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the life of the current reactors; (2) develop improvements in the affordability of new reactors to enable nuclear energy to help meet the Administration’s energy security and climate change goals; (3) develop sustainable nuclear fuel cycles; and (4) understand and minimize the risks of nuclear proliferation and terrorism. The Light Water Reactor Sustainability (LWRS) Program is the primary programmatic activity that addresses Objective 1. This document summarizes the LWRS Program’s plans.

Continuum plasma models often use a finite element (FE) formulation. Another approach is simulation models based on particle-in-cell (PIC) formulation. The model equations generally include four nonlinear differential equations specifying the plasma parameters. In simulation a large number of equations must be integrated iteratively to determine the plasma evolution from an initial state. The complexity of the resulting programs is a combination of the physics involved and the numerical method used. The data structure requirements of plasma programs are stated by defining suitable abstract data types. These abstractions are then reduced to data structures and a group of associated algorithms. These are implemented in an object oriented language (C++) as object classes. Base classes encapsulate data management into a group of common functions such as input-output management, instance variable updating and selection of objects by Boolean operations on their instance variables. Operations are thereby isolated from specific element types and uniformity of treatment is guaranteed. Creation of the data structures and associated functions for a particular plasma model is reduced merely to defining the finite element matrices for each equation, or the equations of motion for PIC models. Changes in numerical method or equation alterations are readily accommodated through the mechanism of inheritance, without modification of the data management software. The central data type is an n-relation implemented as a tuple of variable internal structure. Any finite element program may be described in terms of five relational tables: nodes, boundary conditions, sources, material/particle descriptions, and elements. Equivalently, plasma simulation programs may be described using four relational tables: cells, boundary conditions, sources, and particle descriptions
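The n-relation abstraction described above can be made concrete with a small sketch. Python is used here for brevity instead of the paper's C++, and the table and field names are hypothetical:

```python
# Minimal illustration of the n-relation abstraction: a table of tuples
# (here dicts) with uniform selection by Boolean predicates on instance
# variables, independent of the element type stored.
class Relation:
    def __init__(self, rows=None):
        self.rows = list(rows or [])     # each row is one tuple

    def insert(self, **fields):
        self.rows.append(fields)

    def select(self, predicate):
        """New Relation holding the rows that satisfy the predicate."""
        return Relation(r for r in self.rows if predicate(r))

# One of the five FE tables from the text: the node table.
nodes = Relation()
nodes.insert(id=1, x=0.0, y=0.0, boundary=True)
nodes.insert(id=2, x=1.0, y=0.0, boundary=False)
boundary_nodes = nodes.select(lambda r: r["boundary"])
```

A PIC variant would differ only in the table names (cells, particles, sources) while reusing the same selection machinery, which is the uniformity-of-treatment point the text makes.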

The Hazardous Waste Remedial Actions Program was established to integrate Defense Programs' activities in hazardous and mixed waste management. The Program currently provides centralized planning and technical support to the Office of the Assistant Secretary for Defense Programs. More direct project management responsibilities may be assumed in the future. The Program, under the direction of the ASDP's Office of Defense Waste and Transportation Management, interacts with numerous organizational entities of the Department. The Oak Ridge Operations Office has been designated as the Lead Field Office. The Program's four current components cover remedial action project identification and prioritization; technology adaptation; an information system; and a strategy study for long-term, "corporate" project and facility planning.

This study examined whether a "program integrity booster" could improve the low to moderate program integrity and effectiveness of the EQUIP program for incarcerated youth as practiced in The Netherlands. Program integrity was assessed in EQUIP groups before and after the booster. Youth residing in…

Describes the purpose of the Master of Education (M. Ed.) Program in Integrated Mathematics, Science, and Technology Education (MSAT Program) at The Ohio State University and discusses preservice teachers' attitudes and perceptions toward integrated curriculum. (Contains 35 references.) (YDS)

Quality pressure boundary maintenance and an excellent loss prevention record at Bruce Heavy Water Plant are the results of the Material and Inspection Unit's five inspection programs. Experienced inspectors are responsible for the integrity of the pressure boundary in their own operating area. Inspectors are part of the Technical Section, and along with unit engineering staff, they provide technical input before, during, and after the job. How these programs are completed, and the results achieved, are discussed. 5 figs., 1 appendix

This article is a review of the IAEA integrated safeguards instrumentation program. The historical development of the program is outlined, and current activities are also noted. Brief technical descriptions of certain features are given. It is concluded that the results of this year's efforts in this area will provide significant input and be used to assess the viability of the proposed concepts and to decide on the directions to pursue in the future

To ensure optimal management and sustainable strategies for water resources, infrastructures, food production and ecosystems there is a need for an improved understanding of feedback and interaction mechanisms between the atmosphere and the land surface. This is especially true in light of expected global warming and increased frequency of extreme events. The skill in developing projections of both the present and future climate depends essentially on the ability to numerically simulate the processes of atmospheric circulation, hydrology, energy and ecology. Previous modelling efforts of climate… and hydrology models to more directly include the interaction between the atmosphere and the land surface. The present PhD study is motivated by an ambition of developing and applying a modelling tool capable of including the interaction and feedback mechanisms between the atmosphere and the land surface…

This book presents cutting-edge applications of, and up-to-date research on, ontology engineering techniques in the physical asset integrity domain. Through a survey of state-of-the-art theory and methods on ontology engineering, the authors emphasize essential topics including data integration modeling, knowledge representation, and semantic interpretation. The book also reflects novel topics dealing with the advanced problems of physical asset integrity applications such as heterogeneity, data inconsistency, and interoperability existing in design and utilization. With a distinctive focus on applications relevant in heavy industry, Ontology Modeling in Physical Asset Integrity Management is ideal for practicing industrial and mechanical engineers working in the field, as well as researchers and graduate students concerned with ontology engineering in physical systems life cycles. This book also: Introduces practicing engineers, research scientists, and graduate students to ontology engineering as a modeling technique…

Resistance to antimicrobial agents is an emerging problem worldwide. Awareness of the undesirable consequences of its widespread occurrence has led to the initiation of antimicrobial agent resistance monitoring programs in several countries. In 1995, Denmark was the first country to establish a systematic and continuous monitoring program of antimicrobial drug consumption and antimicrobial agent resistance in animals, food, and humans, the Danish Integrated Antimicrobial Resistance Monitoring and Research Program (DANMAP). Monitoring of antimicrobial drug resistance and a range of research activities related to DANMAP have contributed to restrictions or bans on the use of several antimicrobial agents in food animals in Denmark and other European Union countries.

Systems biology is an approach to biology that emphasizes the structure and dynamic behavior of biological systems and the interactions that occur within them. To succeed, systems biology crucially depends on the accessibility and integration of data across domains and levels of granularity. Biomedical ontologies were developed to facilitate such an integration of data and are often used to annotate biosimulation models in systems biology. We provide a framework to integrate representations of in silico systems biology with those of in vivo biology as described by biomedical ontologies and demonstrate this framework using the Systems Biology Markup Language. We developed the SBML Harvester software that automatically converts annotated SBML models into OWL and we apply our software to those biosimulation models that are contained in the BioModels Database. We utilize the resulting knowledge base for complex biological queries that can bridge levels of granularity, verify models based on the biological phenomenon they represent and provide a means to establish a basic qualitative layer on which to express the semantics of biosimulation models. We establish an information flow between biomedical ontologies and biosimulation models and we demonstrate that the integration of annotated biosimulation models and biomedical ontologies enables the verification of models as well as expressive queries. Establishing a bi-directional information flow between systems biology and biomedical ontologies has the potential to enable large-scale analyses of biological systems that span levels of granularity from molecules to organisms.
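The core of the harvesting step, pulling ontology term URIs out of a model's annotations and re-expressing them as OWL axioms, can be sketched with plain string processing. The snippet, the inline model fragment, and the output axiom format are illustrative only; the real SBML Harvester operates on full SBML with RDF annotations.

```python
# Hedged sketch of annotation harvesting: map annotated model entities to
# ontology term URIs and emit OWL-style class assertions.
import re

SBML_SNIPPET = """
<species id="glc">
  <annotation>
    <rdf:li rdf:resource="http://identifiers.org/chebi/CHEBI:17234"/>
  </annotation>
</species>
"""

def harvest_terms(xml_text):
    """Map each species id to the ontology URIs it is annotated with."""
    species = re.findall(r'<species id="([^"]+)">(.*?)</species>',
                         xml_text, re.S)
    return {sid: re.findall(r'rdf:resource="([^"]+)"', body)
            for sid, body in species}

def to_owl(mapping):
    """Render each (entity, term) pair as an OWL class assertion string."""
    return [f"ClassAssertion(<{uri}> :{sid})"
            for sid, uris in mapping.items() for uri in uris]
```

Once model entities are expressed as instances of ontology classes, the cross-granularity queries described in the abstract become ordinary ontology queries.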

This Multi-Year Program Plan (MYPP) for the Planning Integration Program, Work Breakdown Structure (WBS) Element 1.8.2, is the primary management tool to document the technical, schedule, and cost baseline for work directed by the US Department of Energy (DOE), Richland Operations Office (RL). As an approved document, it establishes an agreement between RL and the performing contractors for the work to be performed. It was prepared by Westinghouse Hanford Company (WHC) and Pacific Northwest Laboratory (PNL). The MYPPs for the Hanford Site programs are to provide a picture from fiscal year (FY) 1996 through FY 2002. At RL Planning and Integration Division (PID) direction, only the FY 1996 Planning Integration Program work scope has been planned and presented in this MYPP. Only those known significant activities which occur after FY 1996 are portrayed in this MYPP. This is due to the uncertainty of who will be accomplishing which work scope, and when, following the award of the Management and Integration (M&I) contract.

This report is a compilation of studies done to develop an integrated set of strategies for the production of energy from renewable resources in Hawaii. Because of the close coordination between this program and other ongoing DOE research, the work will have broad-based applicability to the entire United States.

This document discloses the comments provided by a review panel at the U.S. Department of Energy Office of the Biomass Program Peer Review held on November 15-16, 2007 in Baltimore, MD and the Integrated Biorefinery Platform Review held on August 13-15, 2007 in Golden, Colorado.

This podcast provides a description of Program Collaboration and Service Integration (PCSI). Created: 12/7/2009 by National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention (NCHHSTP). Date Released: 12/7/2009.

In 2006, Honeywell Federal Manufacturing & Technologies (FM&T) announced an updated vision statement for the organization. The vision is "To be the most admired team within the NNSA [National Nuclear Security Administration] for our relentless drive to convert ideas into the highest quality products and services for National Security by applying the right technology, outstanding program management and best commercial practices." The challenge to provide outstanding program management was taken up by the Program Management division and the Program Integration Office (PIO) of the company. This article describes how Honeywell developed and deployed a program management maturity model to drive toward excellence.

25 CFR § 39.132: Can a school integrate Language Development programs into its regular instructional program? A school may offer Language Development programs to students as part of its…

This paper considers the overall training program undertaken by a newly appointed Operations Engineer at one of the Central Electricity Generating Board's (CEGB) Advanced Gas Cooled Reactor (AGR) nuclear power stations. The training program is designed to equip him with the skills and knowledge necessary for him to discharge his duties safely and effectively. In order to assist the learning process and achieve an integrated program, aspects of reactor technology and operation, initially the subject of theoretical presentations at the CEGB's Nuclear Power Training Center (NPTC), are reinforced by either simulation and/or practical experience on site. In the later stages, plant-specific simulators, operated by trained tutors, are incorporated into the training program to provide the trainee with practical experience of plant operation. The trainee's performance is assessed throughout the program to provide feedback to the trainee, the trainers and station management.

Integrated models are defined as economic energy models that consist of several submodels, either coupled by an interface module, or embedded in one large model. These models can be used for energy policy analysis. Using integrated models yields the following benefits. They provide a framework in which energy-economy interactions can be better analyzed than in stand-alone models. Integrated models can represent both energy sector technological details, as well as the behaviour of the market and the role of prices. Furthermore, the combination of modeling methodologies in one model can compensate weaknesses of one approach with strengths of another. These advantages motivated this survey of the class of integrated models. The purpose of this literature survey therefore was to collect and to present information on integrated models. To carry out this task, several goals were identified. The first goal was to give an overview of what is reported on these models in general. The second one was to find and describe examples of such models. Other goals were to find out what kinds of models were used as component models, and to examine the linkage methodology. Solution methods and their convergence properties were also a subject of interest. The report has the following structure. In chapter 2, a 'conceptual framework' is given. In chapter 3 a number of integrated models is described. In a table, a complete overview is presented of all described models. Finally, in chapter 4, the report is summarized, and conclusions are drawn regarding the advantages and drawbacks of integrated models. 8 figs., 29 refs
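The "interface module" linkage between submodels amounts to a fixed-point iteration. The sketch below is a hypothetical two-submodel example (the demand and price functions are invented), iterated until the submodels agree:

```python
# Hypothetical linkage of two submodels via an interface module: an
# energy-market submodel maps price to demand, an economic submodel maps
# demand back to price, and the interface iterates to an equilibrium.
def energy_submodel(price):
    return 100.0 / (1.0 + 0.5 * price)      # demand falls with price

def economy_submodel(demand):
    return 0.02 * demand + 1.0              # price rises with demand

def solve_integrated(tol=1e-8, max_iter=200):
    price = 1.0                             # initial guess
    for _ in range(max_iter):
        demand = energy_submodel(price)
        new_price = economy_submodel(demand)
        if abs(new_price - price) < tol:    # submodels now consistent
            return new_price, demand
        price = new_price
    raise RuntimeError("linkage did not converge")
```

Whether such an iteration converges depends on the coupled submodels being contractive near the equilibrium, which is exactly the convergence-property question the survey raises.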

Over the last 20 years the Idaho National Laboratory (INL) has adopted a number of operations and safety-related programs, each of which has periodically taken its turn in the limelight. As new programs have come along, there has been natural competition for resources, focus and commitment. In the last few years, the INL has made real progress in integrating all these programs and is starting to realize important synergies. Contributing to this integration are both collaborative individuals and an emerging shared vision and goal of the INL fully maturing in its high reliability operations. This goal is so powerful because the concept of high reliability operations (and the resulting organizations) is a masterful amalgam and orchestrator of the best of all the participating programs (i.e. conduct of operations, behavior-based safety, human performance, voluntary protection, quality assurance, and integrated safety management). This paper is a brief recounting of the lessons learned, thus far, at the INL in bringing previously competing programs into harmony under the goal (umbrella) of seeking to perform regularly as a high reliability organization. In addition to a brief diagram-illustrated historical review, the authors share the INL's primary successes (things already effectively stopped or started) and the gaps yet to be bridged.

This paper summarizes the process and results of human health risk assessments of the US Department of Energy (DOE) complex-wide programs for high-level waste, transuranic waste, low-level waste, mixed low-level waste, and spent nuclear fuel. The DOE baseline programs and alternatives for these five material types were characterized by disposition maps (material flow diagrams) and supporting information in the May 1997 report 'A Contractor Report to the Department of Energy on Environmental Baseline Programs and Integration Opportunities' (Discussion Draft). Risk analyses were performed using the Simplified Risk Model (SRM), developed to support DOE Environmental Management Integration studies. The SRM risk analyses consistently and comprehensively cover the life-cycle programs for the five material types, from initial storage through final disposition. Risk results are presented at several levels: DOE complex-wide, material type program, individual DOE sites, and DOE site activities. The detailed risk results are documented in the February 1998 report 'Human Health Risk Comparisons for Environmental Management Baseline Programs and Integration Opportunities' (Discussion Draft).

The Steam Generator Integrity Program (SGIP) is a comprehensive effort addressing issues of nondestructive test (NDT) reliability, inservice inspection (ISI) requirements, and tube plugging criteria for PWR steam generators. In addition, the program has interactive research tasks relating primary side decontamination, secondary side cleaning, and proposed repair techniques to nondestructive inspectability and primary system integrity. The program has acquired a service-degraded PWR steam generator for research purposes. This past year a research facility, the Steam Generator Examination Facility (SGEF), specifically designed for the nondestructive and destructive examination tasks of the SGIP, was completed. The Surry generator previously transported to the Hanford Reservation was then inserted into the SGEF. Nondestructive characterization of the generator from both primary and secondary sides has been initiated. Decontamination of the channelhead cold leg side was conducted. Radioactive field maps were established in the steam generator, at the generator surface, and in the SGEF.

The paper presents a recently developed Heat Air & Moisture Laboratory in SimuLink. The simulation laboratory facilitates the integration of the following models: (1) a whole building model; (2) Heating Venting and Air-Conditioning and primary systems; (3) 2D indoor airflow, 3D Heat Air & Moisture

Integral-type models to describe stationary plumes and jets in cross-flows (wind) have been developed since about 1970. These models are widely used for risk analysis, to describe the consequences of many different scenarios. Alternatively, CFD codes are being applied, but computational requirements still limit the number of scenarios that can be dealt with using CFD only. The integral models, however, are not suited to handle transient releases, such as releases from pressurized equipment, where the initially high release rate decreases rapidly with time. Further, on gas ignition, a second model is needed to describe the rapid combustion of the flammable part of the plume (flash fire), and a third model has to be applied for the remaining jet fire. The objective of this paper is to describe the first steps of the development of an integral-type model describing the transient development…
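The transient character that defeats stationary models can be seen in a minimal blowdown sketch; the exponential rate law and the numbers used below are assumptions for illustration, not taken from the paper:

```python
# Simple illustration of a transient release: a pressurized-vessel
# blowdown whose release rate is assumed to decay exponentially.
import math

def blowdown_rate(t, m0_kg_s, inventory_kg):
    """Release rate (kg/s) at time t for an exponential blowdown."""
    tau = inventory_kg / m0_kg_s            # time constant (s)
    return m0_kg_s * math.exp(-t / tau)

def mass_released(t, m0_kg_s, inventory_kg):
    """Cumulative mass released (kg) by time t (integral of the rate)."""
    tau = inventory_kg / m0_kg_s
    return inventory_kg * (1.0 - math.exp(-t / tau))
```

Because the source strength changes on the same timescale as the plume develops, a single stationary-plume calculation cannot represent the event, which is the motivation for the transient model the paper describes.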

Integrated hydrologic modeling encompasses a vast number of processes and specifications, variable in time and space, and development of model datasets can be arduous. Model input construction techniques have not been formalized or made easily reproducible. Creating the input files for integrated hydrologic models (IHM) requires complex GIS processing of raster and vector datasets from various sources. Developing stream network topology that is consistent with the model resolution digital elevation model is important for robust simulation of surface water and groundwater exchanges. Distribution of meteorologic parameters over the model domain is difficult in complex terrain at the model resolution scale, but is necessary to drive realistic simulations. Historically, development of input data for IHM models has required extensive GIS and computer programming expertise which has restricted the use of IHMs to research groups with available financial, human, and technical resources. Here we present a series of Python scripts that provide a formalized technique for the parameterization and development of integrated hydrologic model inputs for GSFLOW. With some modifications, this process could be applied to any regular grid hydrologic model. This Python toolkit automates many of the necessary and laborious processes of parameterization, including stream network development and cascade routing, land coverages, and meteorological distribution over the model domain.
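One of the parameterization steps mentioned, distributing meteorological values over the model domain, can be sketched as inverse-distance weighting. This is a generic technique shown for illustration; it is not necessarily the interpolation the toolkit itself uses, and the station data are made up.

```python
# Sketch of distributing station meteorology over model cells by
# inverse-distance weighting (IDW).
def idw(cell_xy, stations, power=2.0):
    """IDW estimate at a cell from (x, y, value) station tuples.
    Returns the station value exactly when the cell sits on a station."""
    num = den = 0.0
    for sx, sy, val in stations:
        d2 = (cell_xy[0] - sx) ** 2 + (cell_xy[1] - sy) ** 2
        if d2 == 0.0:
            return val                      # cell coincides with station
        w = d2 ** (-power / 2.0)            # weight = 1 / distance**power
        num += w * val
        den += w
    return num / den
```

Applied over every cell of the model grid, a routine like this turns a handful of station records into the spatially distributed forcing an integrated hydrologic model needs.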

The need for a software toolkit that integrates space weather models and data is one of the many challenges we face when applying the models to space weather forecasting. To meet this challenge, we have developed Space Weather Integrated Modeling (SWIM), which is capable of analysis and visualization of the results from a diverse set of space weather models. SWIM has a modular design and is written in Python, using NumPy, matplotlib, and the Visualization ToolKit (VTK). SWIM provides a data management module to read a variety of spacecraft data products and the specific data format of the Solar-Interplanetary Conservation Element/Solution Element MHD model (SIP-CESE MHD model) for the study of solar-terrestrial phenomena. Data analysis, visualization and graphical user interface modules are also presented in a user-friendly way to run the integrated models and visualize the 2-D and 3-D data sets interactively. With these tools we can analyze model results rapidly, either locally or remotely: extracting data at specific locations in time-sequence data sets, plotting interplanetary magnetic field lines, multi-slicing of solar wind speed, volume rendering of solar wind density, animating time-sequence data sets, and comparing model results with observational data. To speed up the analysis, an in-situ visualization interface is used to support visualizing the data 'on-the-fly'. We also accelerated some critical time-consuming analysis and visualization methods with the aid of GPUs and multi-core CPUs. We have used this tool to visualize the data of the SIP-CESE MHD model in real time, and integrated the shock-arrival database model, the Shock Propagation Model, the Dst forecasting model and the SIP-CESE MHD model developed by the SIGMA Weather Group at the State Key Laboratory of Space Weather/CAS.
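Extraction of data at a specific location from time-sequence data sets, one of the analysis tasks listed, reduces to indexing each frame at the nearest grid point. The sketch below uses plain nested lists for self-containment (SWIM itself works on NumPy/VTK data) and assumes a uniform grid; all names are illustrative.

```python
# Illustrative time-series extraction from a sequence of gridded model
# outputs: pick the nearest grid point to (x, y) in every frame.
def extract_time_series(frames, x, y, x0=0.0, y0=0.0, dx=1.0, dy=1.0):
    """frames: list of 2-D grids (rows indexed by y, columns by x).
    Returns the nearest-grid-point value from each frame in order."""
    j = round((x - x0) / dx)                # column index of nearest point
    i = round((y - y0) / dy)                # row index of nearest point
    return [frame[i][j] for frame in frames]
```

The same indexing, done once per output time step, yields the virtual "probe" time series that is then plotted or compared against spacecraft observations.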

Socio-economic field research has indicated the necessity of realizing an integrated consultancy service for beekeepers that will supply technical-economic solutions of a practical character for ensuring the lucrativeness and viability of the apiaries. Consequently, an integrated apiarian consultancy model has been built with the following features: it realizes the diagnosis of the melliferous resources and supplies solutions for their optimal administration; it realizes the technica…

This is the final report of the International Piping Integrity Research Group (IPIRG) Program. The IPIRG Program was an international group program managed by the U.S. Nuclear Regulatory Commission and funded by a consortium of organizations from nine nations: Canada, France, Italy, Japan, Sweden, Switzerland, Taiwan, the United Kingdom, and the United States. The program objective was to develop data needed to verify engineering methods for assessing the integrity of circumferentially-cracked nuclear power plant piping. The primary focus was an experimental task that investigated the behavior of circumferentially flawed piping systems subjected to high-rate loadings typical of seismic events. To accomplish these objectives a pipe system fabricated as an expansion loop with over 30 meters of 16-inch diameter pipe and five long radius elbows was constructed. Five dynamic, cyclic, flawed piping experiments were conducted using this facility. This report: (1) provides background information on leak-before-break and flaw evaluation procedures for piping, (2) summarizes technical results of the program, (3) gives a relatively detailed assessment of the results from the pipe fracture experiments and complementary analyses, and (4) summarizes advances in the state-of-the-art of pipe fracture technology resulting from the IPIRG program

The Integrated Research Plan (IRP) describes the portfolio of Human Research Program (HRP) research and technology tasks. The IRP is the HRP strategic and tactical plan for research necessary to meet HRP requirements. The need to produce an IRP is established in HRP-47052, Human Research Program - Program Plan, and is under configuration management control of the Human Research Program Control Board (HRPCB). Crew health and performance is critical to successful human exploration beyond low Earth orbit. The Human Research Program (HRP) is essential to enabling extended periods of space exploration because it provides knowledge and tools to mitigate risks to human health and performance. Risks include physiological and behavioral effects from radiation and hypogravity environments, as well as unique challenges in medical support, human factors, and behavioral or psychological factors. The HRP delivers human health and performance countermeasures, knowledge, technologies and tools to enable safe, reliable, and productive human space exploration. Without HRP results, NASA will face unknown and unacceptable risks for mission success and post-mission crew health. This Integrated Research Plan describes HRP's approach and research activities that are intended to address the needs of human space exploration and serve HRP customers, and how they are integrated to provide a risk mitigation tool. The scope of the IRP is limited to the activities that can be conducted with the resources available to the HRP; it does not contain activities that would be performed if additional resources were available. The timescale of human space exploration is envisioned to take many decades. The IRP illustrates the program's research plan through the timescale of early lunar missions of extended duration.

This research investigates the combination of task and data parallel language constructs within a single programming language. There are a number of applications that exhibit properties which would be well served by such an integrated language. Examples include global climate models, aircraft design problems, and multidisciplinary design optimization problems. Our approach incorporates data parallel language constructs into an existing, object-oriented, task parallel language. The language will support creation and manipulation of parallel classes and objects of both types (task parallel and data parallel). Ultimately, the language will allow data parallel and task parallel classes to be used either as building blocks or managers of parallel objects of either type, thus allowing the development of single- and multi-paradigm parallel applications. 1995 Research Accomplishments: In February I presented a paper at Frontiers '95 describing the design of the data parallel language subset. During the spring I wrote and defended my dissertation proposal. Since that time I have developed a runtime model for the language subset. I have begun implementing the model and hand-coding simple examples which demonstrate the language subset. I have identified an astrophysical fluid flow application which will validate the data parallel language subset. 1996 Research Agenda: Milestones for the coming year include implementing a significant portion of the data parallel language subset over the Legion system. Using simple hand-coded methods, I plan to demonstrate (1) concurrent task and data parallel objects and (2) task parallel objects managing both task and data parallel objects. My next steps will focus on constructing a compiler and implementing the fluid flow application with the language. Concurrently, I will conduct a search for a real-world application exhibiting both task and data parallelism within the same program. Additional 1995 Activities: During the fall I collaborated

Integrated modelling (IM) has made considerable advances over the past decade, but it has not yet been taken up as an operational tool in the way that its proponents had hoped. The reasons why will be discussed in Session U17. This talk will propose topics for a research and development programme and suggest an institutional structure which, together, could overcome the present obstacles. Their combined aim would be first to make IM into an operational tool useable by competent public authorities and commercial companies and, in time, to see it evolve into the modelling equivalent of Google Maps: something accessible and useable by anyone with a PC or an iPhone and an internet connection. In a recent study, a number of government agencies, water authorities and utilities applied integrated modelling to operational problems. While the project demonstrated that IM could be used in an operational setting and had benefit, it also highlighted the advances that would be required for its widespread uptake. These were: greatly improving the ease with which models could be a) made linkable, b) linked and c) run; developing a methodology for applying integrated modelling; developing practical options for calibrating and validating linked models; addressing the science issues that arise when models are linked; extending the range of modelling concepts that can be linked; enabling interface standards to pass uncertainty information; making the interface standards platform independent; extending the range of platforms to include those for high-performance computing; developing the concept of modelling components as web services; separating simulation code from the model's GUI, so that all the results from the linked models can be viewed through a single GUI; and developing scenario management systems so that there is an audit trail of the version of each model and dataset used in each linked-model run. In addition to the above, there is a need to build a set of integrated

Since 2004, the Gemini Observatory's week-long Journey Through the Universe (JTtU) program has successfully shared the excitement of scientific research with teachers, students and the public on Hawaii's Big Island. Based on the national JTtU program started in 1999, the Hawai‘i version reaches an average of 7,000 students annually and each year features a different theme shared with a diverse set of learners. In 2010, the theme includes the integration of the GalileoScope, produced as a keystone project for the International Year of Astronomy. In preparation, a pilot teacher workshop (held in October 2009) introduced local island teachers to the GalileoScope and a 128-page educator's activity resource book coordinated by the University of Wyoming. Response from this initial teacher workshop has been strong, and evaluations, plus follow-up actions by participating teachers, illustrate that the integration of the GalileoScope has been successful based upon this diverse sample. Integrating GalileoScopes into Chilean schools in 2010 is also underway at Gemini South. This program will solicit informal proposals from educators who wish to use the telescopes in classrooms, and a Spanish version of the teacher resource book is planned. The authors conclude that integration of the GalileoScope into an existing outreach program is an effective way to keep content fresh, relevant and engaging for both educators and students. This initiative is funded by the Gemini Observatory outreach program. The Gemini Observatory is operated by the Association of Universities for Research in Astronomy, Inc., under a cooperative agreement with the NSF on behalf of the Gemini partnership: the National Science Foundation (US), the Science and Technology Facilities Council (UK), the National Research Council (Canada), CONICYT (Chile), the Australian Research Council (Australia), Ministério da Ciência e Tecnologia (Brazil), and Ministerio de Ciencia, Tecnología e Innovación Productiva

Various aspects of the generation of macromodels of digital integrated circuits are examined, and their effective application in program packages of circuit engineering design is considered. Three levels of macromodels are identified, and the application of such models to the simulation of circuit outputs is discussed.

This paper presents a dynamic mathematical model that describes the fate and transport of two selected xenobiotic organic compounds (XOCs) in a simplified representation of an integrated urban wastewater system. A simulation study, where the xenobiotics bisphenol A and pyrene are used as reference compounds, is carried out. Sorption and specific biological degradation processes are integrated with standardised water process models to model the fate of both compounds. Simulated mass flows of the two compounds during one dry weather day and one wet weather day are compared for realistic influent flow rate and concentration profiles. The wet weather day induces resuspension of stored sediments, which increases the pollutant load on the downstream system. The potential of the model to elucidate important phenomena related to the origin and fate of the model compounds is demonstrated.
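The combination of sorption and specific biodegradation described above can be sketched for a single well-mixed tank as follows. All parameter names and values are illustrative assumptions, not the paper's calibrated model: linear sorption partitions the compound between dissolved and sorbed phases, and only the dissolved fraction degrades.

```python
def simulate_fate(c0, kd, solids, k_bio, dt, steps):
    """Dissolved-compound fate in one tank: linear sorption equilibrium
    plus first-order biodegradation of the dissolved fraction.
    Returns total concentration at each time step (explicit Euler)."""
    fd = 1.0 / (1.0 + kd * solids)   # dissolved fraction under sorption
    c = c0
    out = [c]
    for _ in range(steps):
        c += -k_bio * fd * c * dt    # only the dissolved phase degrades
        out.append(c)
    return out

# Illustrative run: sorption slows net removal because part of the
# compound is shielded on the solids.
conc = simulate_fate(c0=1.0, kd=0.5, solids=2.0, k_bio=0.1, dt=1.0, steps=10)
```

A real fate model would couple many such tanks along the sewer, treatment plant and river, as the abstract's integrated system does.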

The main purpose of this paper is to present a modern model of integrated corporate communication. Besides this, the authors describe the changes occurring in the corporate environment and the importance of changing the model of corporate communication. This paper also discusses the importance of implementation of the suggested model, the use of new media, and the effects of these changes on corporations. The approach used in this paper is a literature review. The authors explore the importance of implementation of the suggested model and the new media in corporate communication, both internal and external, addressing all the stakeholders and communication contents. The paper recommends implementation of a modern model of integrated corporate communication as a response to the constant development of the new media and the generational changes taking place. Practical implications: the modern model of integrated corporate communication can be used as an upgrade of conventional communication models. This modern model empowers companies to sustain and build up existing relationships with stakeholders, and to discover and create new relationships with stakeholders who were previously inaccessible and invisible.

This report of the Integrated Program Planning Activity (IPPA) has been prepared in response to a recommendation by the Secretary of Energy Advisory Board that, "Given the complex nature of the fusion effort, an integrated program planning process is an absolute necessity." We therefore undertook this activity in order to integrate the various elements of the program, to improve communication and performance accountability across the program, and to show the inter-connectedness and inter-dependency of the diverse parts of the national fusion energy sciences program. This report is based on the September 1999 Fusion Energy Sciences Advisory Committee (FESAC) report "Priorities and Balance within the Fusion Energy Sciences Program". In its December 5, 2000, letter to the Director of the Office of Science, FESAC reaffirmed the validity of the September 1999 report and stated that the IPPA presents a framework and process to guide the achievement of the 5-year goals listed in the 1999 report. The National Research Council's (NRC) Fusion Assessment Committee draft final report, "An Assessment of the Department of Energy's Office of Fusion Energy Sciences Program", reviewing the quality of the science in the program, was made available after the IPPA report had been completed. The IPPA report is, nevertheless, consistent with the recommendations in the NRC report. In addition to program goals and the related 5-year, 10-year, and 15-year objectives, this report elaborates on the scientific issues associated with each of these objectives. The report also makes clear the relationships among the various program elements, and cites these relationships as the reason why integrated program planning is essential. In particular, while focusing on the science conducted by the program, the report addresses the important balances between the science and energy goals of the program, between the MFE and IFE approaches, and between the domestic and international aspects.

One of the most important tendencies in child psychotherapy is the integration of various psychotherapeutic approaches and technical interventions belonging to different orientations. Based on the Harry Potter stories, the "Wizarding School" structured group therapy program is a 12-step, integratively oriented program applicable in personal development and in individual and group therapy for children aged 6 to 13 (at present being adapted for adult psychotherapy). The program takes place within a fairy tale and is therefore a type of informal hypnotic trance. The interventions are drawn from the lessons described in Harry Potter's story at Hogwarts, are based on the fundamental principles of child psychotherapy, and include elements of play therapy, art therapy, hypnotherapy, cognitive-behavioural therapy, transactional analysis, supportive therapy, family therapy and person-centred therapy. From a theoretical point of view, the program is based on elements from a number of psychotherapeutic approaches, the main concept being that we need to create a therapeutic myth that is acceptable to a child. The program is not suitable for children with structural deficits, who have difficulties in distinguishing fantasy from reality.

Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
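The integration of detection results across sensor technologies can be sketched with the standard independent-subsystems formula; this is a top-level approximation in the spirit of the tool described above, not IVSEM's actual algorithm, and the numeric probabilities are invented.

```python
def integrated_detection(p_subsystems):
    """Probability that at least one technology detects the event,
    assuming the subsystems detect independently:
    P = 1 - prod(1 - p_i)."""
    q = 1.0
    for p in p_subsystems:
        q *= (1.0 - p)
    return 1.0 - q

# Illustrative per-technology detection probabilities for
# seismic, infrasound, radionuclide and hydroacoustic subsystems.
p = integrated_detection([0.6, 0.3, 0.5, 0.2])
# p is 1 - 0.4*0.7*0.5*0.8 = 0.888
```

The formula also shows the "synergy" the abstract mentions: the combined probability exceeds that of any single subsystem.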

As global older adult populations increase, university programs are well-positioned to produce an effective, gerontology-trained workforce (Morgan, 2012; Silverstein & Fitzgerald, 2017). A comprehensive gerontology curriculum can offer students an aligned career development track that encourages them to: (a) learn more about themselves as a foundation for negotiating career paths; (b) develop and refine career skills; (c) participate in experiential learning experiences; and (d) complete competency-focused opportunities. In this article, we discuss a programmatic effort to help undergraduate gerontology students integrate development-based career planning and decision-making into their academic programs and achieve postgraduation goals.

Many challenges are introduced to interdisciplinary research in and around the hydrologic science community by advances in computing technology and modeling capabilities in different programming languages, across different platforms and frameworks, by researchers in a variety of fields with varying experience in computer programming. Many new hydrologic models as well as optimization, parameter estimation, and uncertainty characterization techniques are developed in scripting languages such as Matlab, R, and Python, or in newer languages such as Java and the .NET languages, whereas many legacy models have been written in FORTRAN and C, which complicates inter-model communication for two-way feedbacks. However, most hydrologic researchers and industry personnel have little knowledge of the computing technologies that are available to address the model integration process. Therefore, the goal of this study is to address these new challenges by utilizing a novel approach based on a publish-subscribe-type system to enhance the modeling capabilities of legacy socio-economic, hydrologic, and ecologic software. Enhancements include massive parallelization of executions and access to legacy model variables at any point during the simulation process by another program, without having to compile all the models together into an inseparable 'super-model'. Thus, this study provides two-way feedback mechanisms between multiple process models that can be written in various programming languages and can run on different machines and operating systems. Additionally, a level of abstraction is given to the model integration process that allows researchers and other technical personnel to perform more detailed and interactive modeling, visualization, optimization, calibration, and uncertainty analysis without requiring a deep understanding of inter-process communication. To be compatible, a program must be written in a programming language with bindings to a common
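The publish-subscribe idea can be sketched with a toy in-process broker: one model publishes its state variables at each timestep and other models subscribe to them, so no "super-model" compilation is needed. This is a minimal illustration of the pattern, not the study's actual system; the topic names are invented.

```python
from collections import defaultdict

class Broker:
    """Minimal publish-subscribe broker: models exchange variables by
    topic name instead of being compiled together."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, value):
        for cb in self.subscribers[topic]:
            cb(value)

broker = Broker()
received = []
# A downstream (e.g. ecologic) model subscribes to the hydrologic
# model's streamflow variable.
broker.subscribe("hydrology/streamflow", received.append)
# The hydrologic model publishes the variable at each timestep.
for q in (10.0, 12.5, 9.8):
    broker.publish("hydrology/streamflow", q)
```

In a real deployment the broker would sit on the network (so models on different machines and in different languages can participate), which is where the language bindings the abstract mentions come in.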

This paper discusses the utilization of a PC-based program to facilitate the management of Entergy Operations' Arkansas Nuclear One (ANO) fire barrier penetration seal program. The computer program was developed as part of a streamlining process to consolidate all aspects of the ANO Penetration Seal Program under one system. The program tracks historical information related to each seal, such as maintenance activities, design modifications and evaluations. The program is integrated with approved penetration seal design details which have been substantiated by full-scale fire tests. This control feature is intended to prevent the inadvertent utilization of an unacceptable penetration detail in a field application which may exceed the parameters tested. The system is also capable of controlling the scope of the periodic surveillance of penetration seals by randomly selecting the inspection population and generating the associated inspection forms. Inputs to the database are required throughout the modification and maintenance process to ensure configuration control and maintain accurate database information. These inputs are verified and procedurally controlled by Fire Protection Engineering (FPE) personnel. The implementation of this system has resulted in significant cost savings and has minimized the allocation of resources necessary to ensure long-term program viability.
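The random surveillance-selection feature might be sketched as a seeded sample over the seal database; the seal IDs, sampling fraction and seed here are invented for illustration and are not ANO's actual procedure.

```python
import random

def select_inspection_population(seal_ids, fraction, seed=0):
    """Randomly select a surveillance sample of seals.
    A fixed seed makes the selection reproducible for audit purposes."""
    rng = random.Random(seed)
    n = max(1, round(len(seal_ids) * fraction))
    return sorted(rng.sample(seal_ids, n))

# Hypothetical database of 100 seals; inspect a random 10%.
seals = [f"SEAL-{i:03d}" for i in range(1, 101)]
sample = select_inspection_population(seals, fraction=0.10, seed=42)
```

Each selected ID would then drive generation of the corresponding inspection form, as the abstract describes.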

Describes the spiral interactive program evaluation model, which is designed to evaluate vocational-technical education programs in secondary schools in Nigeria. Program evaluation is defined; utility-oriented and process-oriented models for evaluation are described; and internal and external evaluative factors and variables that define each…

SIMULA was a language for modeling and programming that provided a unified approach to modeling and programming, in contrast to methodologies based on structured analysis and design. The current development seems to be going in the direction of a separation of modeling and programming. The goal of this paper is to go back to the future and get inspiration from SIMULA and propose a unified approach. In addition to reintroducing the contributions of SIMULA and the Scandinavian approach to object-oriented programming, we do this by discussing a number of issues in modeling and programming and argue why we...
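As a loose illustration of SIMULA's unified view, the same class can serve both as a model (a description of a process) and as the program that runs it. The sketch below uses Python generators to stand in for SIMULA's coroutines and a heap for its event list; all names are invented for the example.

```python
import heapq

class Simulation:
    """Tiny process-oriented simulation kernel in the SIMULA style."""
    def __init__(self):
        self.now = 0.0
        self.queue = []   # event list: (time, tie-break, process)
        self.seq = 0

    def activate(self, proc, delay=0.0):
        heapq.heappush(self.queue, (self.now + delay, self.seq, proc))
        self.seq += 1

    def run(self):
        while self.queue:
            self.now, _, proc = heapq.heappop(self.queue)
            try:
                hold = next(proc)        # process yields its hold time
                self.activate(proc, hold)
            except StopIteration:
                pass                     # process finished

log = []

def customer(sim, name, service_time):
    """Both a model of a customer's life cycle and executable code."""
    log.append((sim.now, name, "arrive"))
    yield service_time                   # hold for the service time
    log.append((sim.now, name, "leave"))

sim = Simulation()
sim.activate(customer(sim, "A", 2.0))
sim.activate(customer(sim, "B", 1.0), delay=0.5)
sim.run()
# log: A arrives at 0.0, B at 0.5, B leaves at 1.5, A at 2.0
```

The point of the sketch is the unification: there is no separate modeling notation to keep in sync with the program, which is the SIMULA property the paper advocates recovering.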

The purpose of this dissertation is to develop a simulation-based accident sequence analysis program (ADS) for large-scale dynamic accident sequence simulation. Human operators, front-line and support systems, as well as plant thermal-hydraulic behavior, are explicitly modeled as integrated active parts in the development of accident scenarios. To manage model size, the proposed methodology employs several techniques, including the use of an 'initial state vector', which decouples time-dependent and time-independent factors, and a depth-first integration method in which the computation memory demand increases only linearly. The computer implementation of the method is capable of simulating up to 500 branch points in sequence development, models system failure during operation, allows for recovery from operator errors and hardware failures, and implements a simple model for operator-system interactions.
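The depth-first development of accident sequences can be sketched as a traversal of success/failure branch points: only the current path is held in memory (linear in depth), rather than all sequences at once. The two-outcome branching and the failure probability below are illustrative, not the dissertation's actual branching model.

```python
def enumerate_sequences(n_branch_points, p_fail=0.1):
    """Yield (sequence, probability) for every success/failure path,
    developed depth-first so memory grows linearly with depth."""
    def dfs(depth, path, prob):
        if depth == n_branch_points:
            yield tuple(path), prob
            return
        for outcome, p in (("ok", 1.0 - p_fail), ("fail", p_fail)):
            path.append(outcome)
            yield from dfs(depth + 1, path, prob * p)
            path.pop()               # backtrack: reuse the same path list
    yield from dfs(0, [], 1.0)

seqs = list(enumerate_sequences(3))
total = sum(p for _, p in seqs)      # probabilities over all paths sum to 1
```

With realistic branch counts the full enumeration is pruned by probability cutoffs; the depth-first order is what keeps the memory demand linear, as the abstract states.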

This paper addresses improving the safety and reliability of power plants in a cost-effective manner by integrating the recently developed reliability-centered maintenance techniques with the traditional predictive maintenance techniques of nuclear power plants. The topics of the paper include a description of reliability-centered maintenance (RCM), enhancing RCM with predictive maintenance, predictive maintenance programs, condition monitoring techniques, performance test techniques, the mid-Atlantic Reliability Centered Maintenance Users Group, test guides, and the benefits of shared guide development.

Interdisciplinary cognitive rehabilitation is emerging as the expected standard of care for individuals with mild to moderate degrees of cognitive impairment for a variety of etiologies. There is a growing body of evidence in cognitive rehabilitation literature supporting the involvement of multiple disciplines, with the use of cognitive support technologies (CSTs), in delivering cognitive therapy to individuals who require cognitive rehabilitative therapies. This article provides an overview of the guiding theories related to traditional approaches of cognitive rehabilitation and the positive impact of current theoretical models of an interdisciplinary approach in clinical service delivery of this rehabilitation. A theoretical model of the Integrative Cognitive Rehabilitation Program (ICRP) will be described in detail along with the practical substrates of delivering specific interventions to individuals and caregivers who are living with mild to moderate cognitive impairment. The ultimate goal of this article is to provide a clinically useful resource for direct service providers. It will serve to further clinical knowledge and understanding of the evolution from traditional silo based treatment paradigms to the current implementation of multiple perspectives and disciplines in the pursuit of patient centered care. The article will discuss the theories that contributed to the development of the interdisciplinary team and the ICRP model, implemented with individuals with mild to moderate cognitive deficits, regardless of etiology. The development and implementation of specific assessment and intervention strategies in this cognitive rehabilitation program will also be discussed. The assessment and intervention strategies utilized as part of ICRP are applicable to multiple clinical settings in which individuals with cognitive impairment are served. This article has specific implications for rehabilitation which include: (a) An Interdisciplinary Approach is an

Due to the increased burden on the environment caused by human activities, industrial ecology designs are gaining more attention. From that perspective, an environmentally effective integration of bioenergy and agriculture systems has significant potential. This work introduces a modeling approach that builds on Life Cycle Inventory and carries out Life Cycle Impact Assessment for a consequential Life Cycle Assessment of integrated bioenergy and agriculture systems. The model framework is built in Python, which connects various freely available software packages that handle different aspects of the overall model: C-TOOL and Yasso07 are used in the carbon balance of agriculture, Dynamic Network Analysis is used for the energy simulation, and Brightway2 is used to build a Life Cycle Inventory-compatible database and process it for various impact assessment methods. The model is successfully...
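The framework's structure, Python glue code chaining a soil-carbon step, an energy-simulation step and an inventory build, can be sketched as below. The function bodies and numbers are placeholders standing in for calls into C-TOOL/Yasso07, Dynamic Network Analysis and Brightway2; none of them are those tools' real APIs.

```python
def soil_carbon_balance(residue_input_t):
    """Placeholder for the C-TOOL/Yasso07 soil-carbon step."""
    return {"soil_C_change_t": 0.2 * residue_input_t}

def energy_simulation(biomass_t):
    """Placeholder for the Dynamic Network Analysis energy step."""
    return {"energy_GJ": 15.0 * biomass_t}

def build_inventory(carbon, energy):
    """Placeholder for assembling a Brightway2-style LCI data set."""
    return {**carbon, **energy}

# The Python framework's role: pass each tool's outputs downstream.
inventory = build_inventory(soil_carbon_balance(10.0), energy_simulation(10.0))
```

The point is the architecture: Python acts as the integration layer between otherwise unconnected tools, which is what the abstract describes.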

Demographers have yet to develop a suitable integrated model of international migration and consequently have been very poor at forecasting immigration. This paper outlines the basic elements of an integrated model and surveys recent history to suggest the key challenges to model construction. A comprehensive theory must explain the structural forces that create a supply of people prone to migrate internationally, the structural origins of labour demand in receiving countries, the motivations of those who respond to these forces by choosing to migrate internationally, the growth and structure of transnational networks that arise to support international movement, the behaviour of states in response to immigrant flows, and the influence of state actions on the behaviour of migrants. Recent history suggests that a good model needs to respect the salience of markets, recognize the circularity of migrant flows, appreciate the power of feedback effects, and be alert to unanticipated consequences of policy actions.

We discuss connections between certain classes of supersymmetric quiver gauge theories and integrable lattice models from the point of view of topological quantum field theories (TQFTs). The relevant classes include 4d N=1 theories known as brane box and brane tiling models, 3d N=2 and 2d N=(2,2) theories obtained from them by compactification, and 2d N=(0,2) theories closely related to these theories. We argue that their supersymmetric indices carry structures of TQFTs equipped with line operators, and as a consequence, are equal to the partition functions of lattice models. The integrability of these models follows from the existence of an extra dimension in the TQFTs, which emerges after the theories are embedded in M-theory. The Yang-Baxter equation expresses the invariance of supersymmetric indices under Seiberg duality and its lower-dimensional analogs.

A voltage pickup coil, inductively coupled to the magnetic field of the superconducting coil under test, is connected so that its output may be compared with the terminal voltage of the coil under test. The integrated voltage difference is indicative of the resistive volt-seconds. When multiplied by the main coil current, the volt-seconds yield the loss. In other words, a hysteresis loop is obtained if the integrated voltage difference phi = ∫ΔV dt is plotted as a function of the coil current, i. First, the time functions of the two signals phi(t) and i(t) are recorded on a dual-trace digital oscilloscope, and these signals are then recorded on magnetic tape. On a CDC-6600, the recorded information is decoded and plotted, and the hysteresis loops are integrated by the set of FORTRAN programs NICOLET described in this report.
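The loop integration step, computing the energy loss as the area enclosed by the (i, phi) hysteresis loop, can be sketched with the shoelace formula on sampled points. This is a generic illustration of the calculation, not the NICOLET code (which is FORTRAN and not reproduced here); the sample loop is a unit square used only to exercise the formula.

```python
def loop_area(i, phi):
    """Area enclosed by the sampled (i, phi) hysteresis loop,
    via the shoelace formula over the closed polygon of samples."""
    n = len(i)
    s = 0.0
    for k in range(n):
        j = (k + 1) % n                 # wrap around to close the loop
        s += i[k] * phi[j] - i[j] * phi[k]
    return abs(s) / 2.0

# Degenerate test loop: a unit square in the (i, phi) plane, area 1.
i_samples = [0.0, 1.0, 1.0, 0.0]
phi_samples = [0.0, 0.0, 1.0, 1.0]
energy = loop_area(i_samples, phi_samples)   # 1.0 in these units
```

With real data the samples would be the decoded phi(t) and i(t) traces, and the area carries units of volt-seconds times amperes, i.e. joules per cycle.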

Path integration is a navigation strategy widely observed in nature where an animal maintains a running estimate, called the home vector, of its location during an excursion. Evidence suggests it is both ancient and ubiquitous in nature, and has been studied for over a century. In that time, canonical and neural network models have flourished, based on a wide range of assumptions, justifications and supporting data. Despite the importance of the phenomenon, consensus and unifying principles appear lacking. A fundamental issue is the neural representation of space needed for biological path integration. This paper presents a scheme to classify path integration systems on the basis of the way the home vector records and updates the spatial relationship between the animal and its home location. Four extended classes of coordinate systems are used to unify and review both canonical and neural network models of path integration, from the arthropod and mammalian literature. This scheme demonstrates analytical equivalence between models which may otherwise appear unrelated, and distinguishes between models which may superficially appear similar. A thorough analysis is carried out of the equational forms of important facets of path integration including updating, steering, searching and systematic errors, using each of the four coordinate systems. The type of available directional cue, namely allothetic or idiothetic, is also considered. It is shown that on balance, the class of home vectors which includes the geocentric Cartesian coordinate system, appears to be the most robust for biological systems. A key conclusion is that deducing computational structure from behavioural data alone will be difficult or impossible, at least in the absence of an analysis of random errors. Consequently it is likely that further theoretical insights into path integration will require an in-depth study of the effect of noise on the four classes of home vectors.
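The geocentric Cartesian class of home vector singled out above can be sketched as a running vector sum of step displacements, assuming an allothetic compass so each step's heading is known in world coordinates; the step data are invented for illustration.

```python
import math

def integrate_path(steps):
    """Path integration with a geocentric Cartesian home vector.
    steps: iterable of (distance, heading_in_radians).
    Returns the home vector, pointing from the animal back to home."""
    x = y = 0.0
    for d, theta in steps:
        x += d * math.cos(theta)   # update running position estimate
        y += d * math.sin(theta)
    return (-x, -y)               # home is the negated displacement

# Walk 3 units east then 4 units north; the home vector points (-3, -4),
# i.e. magnitude 5 back toward the start.
hx, hy = integrate_path([(3.0, 0.0), (4.0, math.pi / 2)])
```

With an idiothetic cue the heading would itself be integrated from turn signals, which is one place the paper's four coordinate classes diverge in their noise behaviour.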

We show how topological G_k/G_k models can be embedded into the topological matter models that are obtained by perturbing the twisted N = 2 supersymmetric, hermitian symmetric coset models. In particular, this leads to an embedding of the fusion ring of G as a sub-ring of the perturbed chiral primary ring. The perturbation of the twisted N = 2 model that leads to the fusion ring is also shown to lead to an integrable N = 2 supersymmetric field theory when the untwisted N = 2 superconformal field theory is perturbed by the same operator and its hermitian conjugate. (orig.)

This paper aims to contribute towards the advancement of an efficient architecture of a single market for knowledge through the development of an integrative model of knowledge transfer. Within this aim, several points of departure can be singled out. One, the article builds on the call of the Eu... business and academia, and implementing the respective legislature are enduring. The research objectives were to explore (i) the process of knowledge transfer in universities, including the nature of tensions, obstacles and incentives, (ii) the relationships between key stakeholders in the KT market... of the emergent integrative model of knowledge transfer. In an attempt to bring it to a higher level of generalizability, the integrative model of KT is further conceptualized from a ‘sociology of markets’ perspective, resulting in an emergent architecture of a single market for knowledge. Future research...

This report describes the International Summit on Integrated Environmental Modeling (IEM), held in Washington, DC 7th-9th December 2010. The meeting brought together 57 scientists and managers from leading US and European government and non-governmental organizations, universitie...

The present development of modern integrated circuits (ICs) is characterized by a number of critical factors that make their design and verification considerably more difficult than before. This dissertation addresses the important questions of modeling all electromagnetic behavior of features on

What would make anti-bullying initiatives more successful? This book offers a new approach to the problem of school bullying. The question of what constitutes a useful theory of bullying is considered and suggestions are made as to how priorities for future research might be identified. The integrated, systemic model of school bullying introduced…

Presented and discussed is a model which can be used by educators who want to develop an interdisciplinary map skills program in geography and mathematics. The model assumes that most children in elementary schools perform cognitively at Piaget's concrete operational stage, that readiness for map skills can be assessed with Piagetian or…

The sausage model, first proposed by Fateev, Onofri, and Zamolodchikov, is a deformation of the O(3) sigma model preserving integrability. The target space is deformed from the sphere to a ‘sausage’ shape by a deformation parameter ν. This model is defined by a factorizable S-matrix which is obtained by deforming that of the O(3) sigma model by a parameter λ. Clues about the deformed sigma model are provided by various pieces of UV and IR information through the thermodynamic Bethe ansatz (TBA) analysis based on the S-matrix. Application of the TBA to the sausage model is, however, limited to the case where 1/λ is an integer, for which the coupled integral equations can be truncated to a finite number. In this paper, we propose a finite set of nonlinear integral equations (NLIEs) which are applicable to generic values of λ. Our derivation is based on T-Q relations extracted from the truncated TBA equations. As a consistency check, we compute next-to-leading-order corrections to the vacuum energy and extract the S-matrix information in the IR limit. We also solve the NLIEs both analytically and numerically in the UV limit to obtain the effective central charge, and compare it with that of the zero-mode dynamics to obtain the exact relation between ν and λ. Dedicated to the memory of Petr Petrovich Kulish.

The integration of human resources management (HRM) strategies with long-term program-planning strategies in hospital pharmacy departments is described. HRM is a behaviorally based, comprehensive strategy for the effective management and use of people that seeks to achieve coordination and integration with overall planning strategies and other managerial functions. It encompasses forecasting of staffing requirements; determining work-related factors that are strong "motivators" and thus contribute to employee productivity and job satisfaction; conducting a departmental personnel and skills inventory; employee career planning and development, including training and education programs; strategies for promotion and succession, including routes of advancement that provide alternatives to the managerial route; and recruitment and selection of new personnel to meet changing departmental needs. Increased competitiveness among hospitals and a shortage of pharmacists make it imperative that hospital pharmacy managers create strategies to attract, develop, and retain the right individuals to enable the department--and the hospital as a whole--to grow and change in response to the changing health-care environment in the United States. Pharmacy managers would be greatly aided in this mission by the establishment of a well-defined, national strategic plan for pharmacy programs and services that includes an analysis of what education and training are necessary for their successful accomplishment. Creation of links between overall program objectives and people-planning strategies will aid hospital pharmacy departments in maximizing the long-term effectiveness of their practice.

The Glory program is an Earth and Solar science mission designed to broaden science community knowledge of the environment. The causes and effects of global warming have become a concern in recent years, and Glory aims to contribute to the knowledge base of the science community. Glory is designed for two functions: one is solar viewing to monitor the total solar irradiance, and the other is observing the Earth's atmosphere for aerosol composition. The former is done with an active cavity radiometer, while the latter is accomplished with an aerosol polarimeter sensor to discern atmospheric particles. The Glory program is managed by NASA Goddard Space Flight Center (GSFC), with Orbital Sciences in Dulles, VA as the prime contractor for the spacecraft bus, mission operations, and ground system. This paper will describe some of the more unique features of the Glory program, including the integration and testing of the satellite and instruments as well as the science data processing. The spacecraft integration and test approach requires extensive analysis and additional planning to ensure existing components are successfully functioning with the new Glory components. The science mission data analysis requires development of mission-unique processing systems and algorithms. Science data analysis and distribution will utilize our national assets at the Goddard Institute for Space Studies (GISS) and the University of Colorado's Laboratory for Atmospheric and Space Physics (LASP). The satellite was originally designed and built for the Vegetation Canopy Lidar (VCL) mission, which was terminated in the middle of integration and testing due to payload development issues. The bus was then placed in secure storage in 2001 and removed from an environmentally controlled container in late 2003 to be refurbished to meet the Glory program requirements. Functional testing of all the components was done as a system at the start of the program, very different from a traditional program.

The recent development and implementation of a revised Program Approach for the Civilian Radioactive Waste Management System (CRWMS) was accomplished in response to significant changes in the environment in which the program was being executed. The lack of an interim storage site, growing costs and schedule delays in accomplishing the full Yucca Mountain site characterization plan, and the development and incorporation of a multi-purpose (storage, transport, and disposal) canister (MPC) into the CRWMS required a reexamination of Program plans and priorities. Dr. Daniel A. Dreyfus, the Director of the Office of Civilian Radioactive Waste Management (OCRWM), established top-level schedule targets and cost goals and commissioned a Program-wide task force of DOE and contractor personnel to identify and evaluate alternatives to meet them. The evaluation of the suitability of the Yucca Mountain site by 1998 and the repository license application date of 2001 were maintained, and a target date of January 1998 for MPC availability was established. An increased multi-year funding profile was baselined and agreed to by Congress. A $1.3 billion reduction in Yucca Mountain site characterization costs was mandated to hold the cost to $5 billion. The replanning process superseded all previous budget allocations and focused on program requirements and their relative priorities within the cost profiles. This paper discusses the process for defining alternative scenarios to achieve the top-level program goals in an integrated fashion

In order to prepare students for the workforce, academic programs incorporate a variety of tools that students are likely to use in their future careers. One of these tools employed by business and technology programs is the integration of live software applications such as SAP through the SAP University Alliance (SAP UA) program. Since the SAP UA program has been around for only about 10 years and the available literature on the topic is limited, research is needed to determine the strengths and weaknesses of the SAP UA program. A collaborative study of SAP UA faculty perceptions of their SAP UAs was conducted in the fall of 2011. Of the faculty invited to participate in the study, 31% completed the online survey. The results indicate that most faculty experienced difficulty implementing SAP into their programs and report that a need exists for more standardized curriculum and training, while a large percentage indicated that they are receiving the support they need from their schools and SAP.

Climate change and growing water needs have resulted, in many parts of the world, in water scarcity problems that must be managed by public authorities. Hence, policy-makers are more and more often asked to define and to implement water allocation rules between competitive users. This requires developing new tools for designing those rules under various scenarios of context (climatic, agronomic, economic). While models have been developed for each type of water use, very few integrated frameworks link these different uses, even though such an integrated approach is essential for designing regional water and land policies. The lack of such integrated models can be explained by the difficulty of integrating models developed by very different disciplines and by the problem of scale change (collecting data over large areas, arbitrating between the computational tractability of models and their level of aggregation). However, modelers are more and more often asked to deal with large basin scales while analyzing some policy impacts at very high levels of detail. These conflicting objectives require the development of new modeling tools. The CALVIN economically driven optimization model developed for managing water in California is a good example of this type of framework, Draper et al. (2003). Recent reviews of the literature on integrated water management at the basin level include Letcher et al. (2007) and Cai (2008). We present here an original framework for integrated water management at the river basin scale called MoGIRE ("Modèle pour la Gestion Intégrée de la Ressource en Eau"). It is intended to optimize water use at the river basin level and to evaluate scenarios (agronomic, climatic or economic) for better planning of agricultural and non-agricultural water use. MoGIRE includes a nodal representation of the water network. Agricultural, urban and environmental water uses are also represented using mathematical programming and econometric approaches. The model then
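As a toy illustration of the mathematical-programming side of such a framework, the sketch below allocates a scarce supply across competing uses. With constant marginal values and a single supply constraint, the linear program's optimum reduces to serving users in decreasing order of value, so a greedy rule suffices here. All names and numbers are invented and are not taken from MoGIRE or CALVIN:

```python
def allocate_water(supply, demands):
    """Greedy allocation of a scarce supply across competing uses.

    `demands` maps user -> (max_demand, marginal_value). With constant
    marginal values and one supply constraint, the LP optimum is to
    serve users in decreasing order of marginal value until the supply
    is exhausted. (Toy stand-in for a nodal network optimisation.)
    """
    allocation = {}
    remaining = supply
    for user, (cap, _value) in sorted(
            demands.items(), key=lambda kv: kv[1][1], reverse=True):
        allocation[user] = min(cap, remaining)
        remaining -= allocation[user]
    return allocation

alloc = allocate_water(
    supply=100.0,
    demands={"urban": (40.0, 3.0),        # (max demand, value per unit)
             "agriculture": (80.0, 1.0),
             "environment": (30.0, 2.0)})
# urban is served first, then environment; agriculture gets the rest
```

A real basin model replaces the greedy rule with a full linear or nonlinear program over the nodal network, with conveyance constraints between nodes and time-varying hydrology.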

We extend form-factor perturbation theory to non-integrable deformations of massless integrable models, in order to address the problem of mass generation in such systems. With respect to the standard renormalisation group analysis, this approach is more suitable for studying the particle content of the perturbed theory. Analogously to the massive case, interesting information can be obtained already at first order, such as the identification of the operators which create a mass gap and those which induce the confinement of the massless particles in the perturbed theory

The paradoxical aspect of the integration of a social group was highlighted by Blau (1964). During the integration process, group members simultaneously compete for social status and play the role of the audience. Here we show that when the competition prevails over the desire for approval, a sharp transition breaks all friendly relations. However, as described by Blau, people with high status are inclined to attend more to the acceptance of others; this is achieved by praising others and revealing one's own weak points. In our model, this action smooths the transition and improves interpersonal relations.

The authors review recent work in the integrated assessment modeling of global climate change. This field has grown rapidly since 1990. Integrated assessment models seek to combine knowledge from multiple disciplines in formal integrated representations; inform policy-making, structure knowledge, and prioritize key uncertainties; and advance knowledge of broad system linkages and feedbacks, particularly between socio-economic and bio-physical processes. They may combine simplified representations of the socio-economic determinants of greenhouse gas emissions, the atmosphere and oceans, impacts on human activities and ecosystems, and potential policies and responses. The authors summarize current projects, grouping them according to whether they emphasize the dynamics of emissions control and optimal policy-making, uncertainty, or spatial detail. They review the few significant insights that have been claimed from work to date and identify important challenges for integrated assessment modeling in its relationships to disciplinary knowledge and to broader assessment seeking to inform policy- and decision-making. 192 refs., 2 figs

In an effort to improve the psychological health of the athlete who has sustained an injury, the Performance Enhancement Group program for injured athletes was created. This paper will offer a model for the Performance Enhancement Group program as a way to: 1) support the athlete, both mentally and physically; 2) deal with the demands of rehabilitation; and 3) facilitate the adjustments the athlete has to make while being out of the competitive arena. The program consists of responsibilities for professionals in sport psychology (ie, assessment/orientation, support, education, individual counseling, and evaluation) and athletic training (ie, organization/administration, recruitment and screening, support, application of techniques, and program compliance). The paper will emphasize that the success of the program is dependent on collaboration between professionals at all levels. PMID:16558357

In this report we present a type graph that models all executable constructs of the Java programming language. Such a model is useful for any graph-based technique that relies on a representation of Java programs as graphs. The model can be regarded as a common representation to which all Java

In this work we present a type graph that models all executable constructs of the Java programming language. Such a model is useful for any graph-based technique that relies on a representation of Java programs as graphs. The model can be regarded as a common representation to which all Java syntax
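A minimal sketch of what such a graph representation of program constructs might look like; the node kinds and edge labels below are invented for illustration and are not the report's actual type graph:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One node of a program graph, typed by the construct it represents."""
    kind: str                                   # e.g. "ClassDecl", "MethodDecl"
    edges: dict = field(default_factory=dict)   # edge label -> list of Nodes

def add_edge(src, label, dst):
    src.edges.setdefault(label, []).append(dst)

# A tiny graph for `class C { void m() { m(); } }`
cls = Node("ClassDecl")
mth = Node("MethodDecl")
call = Node("InvokeExpr")
add_edge(cls, "member", mth)
add_edge(mth, "body", call)
add_edge(call, "target", mth)   # the recursive self-call closes a cycle
```

The point of a type graph is to constrain which `kind`/`label` combinations are legal, so that every well-formed Java program maps to a graph conforming to the same schema; graph-transformation tools can then operate on any Java program through this one representation.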

This dissertation explores some aspects of knowledge integration, namely, the accumulation of scientific knowledge and the performance of analogical reasoning on the acquired knowledge. Knowledge to be integrated is conveyed by paragraph-like pieces referred to as documents. By incorporating some results from cognitive science, the Deutsch-Kraft model of information retrieval is extended to a model for knowledge engineering, which integrates acquired knowledge and performs intelligent retrieval. The resulting computer model is termed COGMIR, which stands for COGnitive Model for Intelligent Retrieval. A scheme named query-invoked memory reorganization is used in COGMIR for knowledge integration. Unlike some other schemes, which realize knowledge integration through subjective understanding by representing new knowledge in terms of existing knowledge, the proposed scheme records at storage time only the possible connections among knowledge acquired from different documents. The actual binding of the knowledge acquired from different documents is deferred to query time. There is only one way to store knowledge and numerous ways to utilize it. Each document is represented both as a whole and by its meaning. In addition, since facts are constructed from the documents, document retrieval and fact retrieval are treated in a unified way. When the requested knowledge is not available, query-invoked memory reorganization can generate suggestions based on available knowledge through analogical reasoning. This is done by revising the algorithms developed for document retrieval and fact retrieval, and by incorporating Gentner's structure-mapping theory. Analogical reasoning is treated as a natural extension of intelligent retrieval, so that two previously separate research areas are combined. A case study is provided. All the components are implemented as list structures similar to relational databases.
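The deferred-binding idea can be sketched as follows: storage only records which documents might connect (here, crudely, via shared terms), while the actual binding across documents happens when a query arrives. The class and method names are invented for illustration and are not COGMIR's actual interfaces:

```python
class DeferredBindingStore:
    """Toy sketch of query-invoked memory reorganization: at storage
    time we only *record* possible connections between documents;
    binding knowledge across documents is deferred to query time."""

    def __init__(self):
        self.docs = {}    # doc_id -> set of terms
        self.links = {}   # term -> set of doc_ids (possible connections)

    def store(self, doc_id, terms):
        self.docs[doc_id] = set(terms)
        for t in terms:
            self.links.setdefault(t, set()).add(doc_id)

    def query(self, terms):
        # Binding happens now: every document that can contribute is
        # pulled together, instead of being merged at storage time.
        hits = set()
        for t in terms:
            hits |= self.links.get(t, set())
        return sorted(hits)

m = DeferredBindingStore()
m.store("d1", ["bird", "wings", "fly"])
m.store("d2", ["plane", "wings", "fly"])
m.store("d3", ["fish", "swim"])
result = m.query(["wings"])   # d1 and d2 are bound only at query time
```

One way to read the design choice: storage stays cheap and never commits to an interpretation, so the same stored knowledge can support document retrieval, fact retrieval, and (via structure mapping over the bound set) analogical suggestion.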

The research conducted for this dissertation examined the learning environment of a specific high school program that delivered the explicit curriculum through an integrated experiential manner, which utilized field and outdoor experiences. The program ran over one semester (five months) and it integrated the grade 10 British Columbian curriculum in five subjects. A mixed methods approach was employed to identify the students' perceptions and provide richer descriptions of their experiences related to their unique learning environment. Quantitative instruments were used to assess changes in students' perspectives of their learning environment, as well as other supporting factors including students' mindfulness, and behaviours towards the environment. Qualitative data collection included observations, open-ended questions, and impromptu interviews with the teacher. The qualitative data describe the factors and processes that influenced the learning environment and give a richer, deeper interpretation which complements the quantitative findings. The research results showed positive scores on all the quantitative measures conducted, and the qualitative data provided further insight into descriptions of learning environment constructs that the students perceived as most important. A major finding was that the group cohesion measure was perceived by students as the most important attribute of their preferred learning environment. A flow chart was developed to help the researcher conceptualize how the learning environment, learning process, and outcomes relate to one another in the studied program. This research attempts to explain through the consideration of this case study: how learning environments can influence behavioural change and how an interconnectedness among several factors in the learning process is influenced by the type of learning environment facilitated. Considerably more research is needed in this area to understand fully the complexity learning

In today's highly competitive manufacturing environment, the supplier selection process has become one of the crucial activities in supply chain management. In order to select the best supplier(s), it is necessary not only to continuously track and benchmark the performance of suppliers but also to make a tradeoff between tangible and intangible factors, some of which may conflict. In this paper, an integration of case-based reasoning (CBR), the analytical network process (ANP) and linear programming (LP) is proposed to solve the supplier selection problem.
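A toy sketch of how two of the three ingredients might fit together: fixed weights stand in for the ANP-derived criterion priorities (a full ANP would obtain them from pairwise-comparison supermatrices), and the order allocation shown is a special case in which the linear program's optimum is greedy. All weights, ratings, and capacities are invented:

```python
def supplier_scores(weights, ratings):
    """Aggregate per-criterion ratings into one score per supplier.

    `weights` plays the role of ANP-derived criterion priorities.
    """
    return {s: sum(weights[c] * r[c] for c in weights)
            for s, r in ratings.items()}

def allocate_order(quantity, capacities, scores):
    """LP-style allocation: fill the order from the best-scoring
    suppliers first, respecting each supplier's capacity (with a single
    demand constraint and linear objective, the LP optimum is greedy).
    """
    order = {}
    for s in sorted(scores, key=scores.get, reverse=True):
        order[s] = min(capacities[s], quantity)
        quantity -= order[s]
    return order

weights = {"cost": 0.5, "quality": 0.3, "delivery": 0.2}   # assumed priorities
ratings = {"A": {"cost": 7, "quality": 9, "delivery": 8},
           "B": {"cost": 9, "quality": 6, "delivery": 7}}
scores = supplier_scores(weights, ratings)
order = allocate_order(100, {"A": 60, "B": 80}, scores)
```

The CBR component, not shown, would retrieve similar past selection cases to seed the ratings and constraints before the ANP/LP stages run.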

The Buried Waste Integrated Demonstration (BWID) is a program funded by the US Department of Energy (DOE) Office of Technology Development. BWID supports the applied research, development, demonstration, and evaluation of a suite of advanced technologies that together form a comprehensive remediation system for the effective and efficient remediation of buried waste. Stakeholder participation in the DOE Environmental Management decision-making process is critical to remediation efforts. Appropriate mechanisms for communication with the public, private sector, regulators, elected officials, and others are being aggressively pursued by BWID to permit informed participation. This document summarizes public outreach efforts during FY-93 and presents a strategy for expanded stakeholder involvement during FY-94

...fragmentation-integration-fragmentation-integration upward spiral. In response to the call for an integrative approach to strategic management research, we propose an integrative model of global business strategy that aims at integrating not only strategy and IB but also the different paradigms within the strategy... field. We also discuss the merits and limitations of our model.

Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.
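One way to picture the integration of the four technology subsystems: if the subsystems were statistically independent, the system-level probability of detection would follow directly from the per-technology miss probabilities. That independence assumption is a simplification for illustration (IVSEM itself models synergy, interfaces, and evasion in more detail), and the probabilities below are invented:

```python
def integrated_detection(p_by_tech):
    """Probability that at least one technology detects the event,
    under the simplifying assumption of independent subsystems:
    P(detect) = 1 - product over technologies of (1 - p_i)."""
    p_miss = 1.0
    for p in p_by_tech.values():
        p_miss *= (1.0 - p)
    return 1.0 - p_miss

p = integrated_detection({"seismic": 0.90, "infrasound": 0.40,
                          "radionuclide": 0.60, "hydroacoustic": 0.20})
# the integrated system outperforms every subsystem on its own
```

This is also why evasive methods such as seismic decoupling matter so much in such a model: suppressing the strongest single subsystem sharply raises the joint miss probability.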

This paper describes the drop testing of a one-third scale model transport cask system. Two casks were supplied by Transnuclear, Inc. (TN) to demonstrate dual-purpose shipping/storage casks. These casks will be used to ship spent fuel from DOE's West Valley demonstration project in New York to the Idaho National Engineering Laboratory (INEL) for a long-term spent fuel dry storage demonstration. As part of the certification process, one-third scale model tests were performed to obtain experimental data. Two 9-m (30-ft) drop tests were conducted on a mass model of the cask body and scaled balsa- and redwood-filled impact limiters. In the first test, the cask system was tested in an end-on configuration. In the second test, the system was tested in a slap-down configuration where the axis of the cask was oriented at a 10 degree angle with the horizontal. Slap-down occurs for shallow angle drops where the primary impact at one end of the cask is followed by a secondary impact at the other end. The objectives of the testing program were to (1) obtain deceleration and displacement information for the cask and impact limiter system, (2) obtain dynamic force-displacement data for the impact limiters, (3) verify the integrity of the impact limiter retention system, and (4) examine the crush behavior of the limiters. This paper describes both test results in terms of measured deceleration, post-test deformation measurements, and the general structural response of the system

Software programs are proliferating throughout modern life, to a point where even the simplest appliances such as lightbulbs contain software, in addition to the software embedded in cars and airplanes. The correct functioning of these programs is therefore of the utmost importance, for the quality...

The project goal was to decrease new graduate nurse (NGN) attrition during the first year of employment by improving communication skills and providing additional mentoring for NGNs employed in a community hospital located in a rural area. All NGNs participate in the Versant Residency Program. Even with this standardized residency program, exit interviews of NGNs who resigned during their first year of employment revealed 2 major issues: communication problems with patients and staff, and a perceived lack of support/mentoring from unit staff. A clinical nurse specialist-led nursing team developed an innovative program integrating retired nurses, Volunteer Nurse Ambassadors (VNAs), into the Versant Residency Program to address both of those issues. All NGNs mentored by a retired nurse remain employed in the hospital (100% retention). Before the VNA program, the retention rate was 37.5%. Both the NGNs and VNAs saw value in their mentor-mentee relationship. There have been no critical incidents or failure-to-rescue events involving NGNs mentored by a VNA. Use of VNAs to support NGNs as they adjust to the staff nurse role can prevent attrition during their first year of nursing practice by providing additional support to the NGN.

The integration of a spatial process model into an environmental modeling framework can enhance the model's capabilities. This paper describes a general methodology for integrating environmental models into the Object Modeling System (OMS) regardless of the model's complexity, the programming language, and the operating system used. We present the integration of the GEOtop model into OMS version 3.0 and illustrate its application in a small watershed. OMS is an environmental modeling framework that facilitates model development, calibration, evaluation, and maintenance. It provides innovative techniques in software design such as multithreading, implicit parallelism, calibration and sensitivity analysis algorithms, and cloud services. GEOtop is a physically based, spatially distributed rainfall-runoff model that performs three-dimensional finite volume calculations of water and energy budgets. Executing GEOtop as an OMS model component allows it to: (1) interact directly with the open-source geographical information system (GIS) uDig-JGrass to access geo-processing, visualization, and other modeling components; and (2) use OMS components for automatic calibration, sensitivity analysis, or meteorological data interpolation. A case study of the model in a semi-arid agricultural catchment is presented for illustration and proof of concept. Simulated soil water content and soil temperature results are compared with measured data, and model performance is evaluated using goodness-of-fit indices. This study serves as a template for future integration of process models into OMS.
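As an illustration of the goodness-of-fit evaluation mentioned above, the Nash–Sutcliffe efficiency is one index commonly used for rainfall-runoff models; the observed and simulated series below are invented, not the study's data:

```python
def nash_sutcliffe(observed, simulated):
    """Nash–Sutcliffe efficiency: 1 is a perfect fit, 0 means the
    model predicts no better than the mean of the observations, and
    negative values mean it does worse than that mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

obs = [0.21, 0.24, 0.30, 0.28, 0.22]   # e.g. measured soil water content
sim = [0.20, 0.25, 0.29, 0.27, 0.23]   # model output at the same times
nse = nash_sutcliffe(obs, sim)
```

Within a framework like OMS, an index of this kind is exactly what an automatic calibration component minimizes or maximizes while it re-runs the wrapped process model with candidate parameter sets.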

A disease-management model must be integrated, comprehensive, individual patient focused and outcome driven. In addition to high quality care, the successful model must reduce variations in care and costs. MS specialists need to be intimately involved in the long-term care of MS patients, while not neglecting primary care issues. A nurse care manager is the "glue" between the managed care company, health care providers and the patient/family. Disease management focuses on education and prevention, and can be cost effective as well as patient specific. To implement a successful program, managed care companies and health care providers must work together.

It is the objective of this paper to present a model reduction technique developed for the integrated controls-structures design of flexible structures. Integrated controls-structures design problems are typically posed as nonlinear mathematical programming problems, where the design variables consist of both structural and control parameters. In the solution process, both structural and control design variables are constantly changing; therefore, the dynamic characteristics of the structure are also changing. This presents a problem in obtaining a reduced-order model for active control design and analysis which will be valid for all design points within the design space. In other words, the frequency and number of the significant modes of the structure (modes that should be included) may vary considerably throughout the design process. This is also true as the locations and/or masses of the sensors and actuators change. Moreover, since the number of design evaluations in the integrated design process could easily run into thousands, any feasible order-reduction method should not require model reduction analysis at every design iteration. In this paper a novel and efficient technique for model reduction in the integrated controls-structures design process, which addresses these issues, is presented.
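The core difficulty can be illustrated with a hypothetical mode-selection rule: the set of structural modes worth retaining depends on design variables that change at every iteration, so it must be recomputed cheaply rather than by a full reduction analysis each time. The frequency-band criterion below is invented for illustration and is not the paper's technique:

```python
def select_modes(frequencies, control_bandwidth, margin=2.0):
    """Pick the structural modes to retain in a reduced-order model:
    keep every mode whose natural frequency (Hz) lies within `margin`
    times the control bandwidth. Because the integrated design keeps
    changing the structure, the retained set must be recomputed at each
    design iteration. (Hypothetical criterion for illustration only.)"""
    return [i for i, f in enumerate(frequencies)
            if f <= margin * control_bandwidth]

# Two design iterations of the evolving structure: as structural
# parameters change, the mode frequencies shift and the retained
# mode set changes with them.
iter1 = select_modes([1.2, 3.5, 8.0, 15.0], control_bandwidth=3.0)
iter2 = select_modes([0.9, 2.8, 5.5, 12.0], control_bandwidth=3.0)
```

Any fixed truncation chosen at the initial design point would miss exactly this effect, which is why the paper argues for a reduction technique valid across the whole design space.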

The Campbellian validity model and the traditional top-down approach to validity have had a profound influence on research and evaluation. That model includes the concepts of internal and external validity and within that model, the preeminence of internal validity as demonstrated in the top-down approach. Evaluators and researchers have, however, increasingly recognized that in an evaluation, the over-emphasis on internal validity reduces that evaluation's usefulness and contributes to the gulf between academic and practical communities regarding interventions. This article examines the limitations of the Campbellian validity model and the top-down approach and provides a comprehensive, alternative model, known as the integrative validity model for program evaluation. The integrative validity model includes the concept of viable validity, which is predicated on a bottom-up approach to validity. This approach better reflects stakeholders' evaluation views and concerns, makes external validity workable, and becomes therefore a preferable alternative for evaluation of health promotion/social betterment programs. The integrative validity model and the bottom-up approach enable evaluators to meet scientific and practical requirements, facilitate in advancing external validity, and gain a new perspective on methods. The new perspective also furnishes a balanced view of credible evidence, and offers an alternative perspective for funding. Copyright (c) 2009 Elsevier Ltd. All rights reserved.

The paper deals with some current problems of modeling the dynamics of the development of the subject features of the individual. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self-education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects the logic and regularity of the development of the process; discreteness (stageability) indicates qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Two steps for singling out a particular stage are offered, together with an algorithm for developing an integrative model of it. The suggested conclusions might be of use for further theoretical research, analysis of educational practices, and realistic prediction of pedagogical phenomena.

Predicting the success of students participating in introductory programming courses has been an active research area for more than 25 years. Until recently, no variables or tests have had any significant predictive power. However, Dehnadi and Bornat claim to have found a simple test for programming aptitude that cleanly separates programming sheep from non-programming goats. We briefly present their theory and test instrument. We have repeated their test in our local context in order to verify and perhaps generalise their findings, but we could not show that the test predicts students' success in our introductory programming course. Based on this failure of the test instrument, we discuss various explanations for our differing results and suggest a research method from which it may be possible to generalise local results in this area. Furthermore, we discuss and criticize Dehnadi and Bornat...

The current document establishes the strategy for achieving sufficient integration between disciplines in producing Site Descriptive Models during the Site Investigation stage. The Site Descriptive Model should be a multidisciplinary interpretation of geology, rock mechanics, thermal properties, hydrogeology, hydrogeochemistry, transport properties and ecosystems, using site investigation data from deep boreholes and from the surface as input. The modelling comprises the following iterative steps: evaluation of primary data, descriptive and quantitative modelling (in 3D), and overall confidence evaluation. Data are first evaluated within each discipline and the evaluations are then checked between the disciplines. Three-dimensional modelling (i.e. estimating the distribution of parameter values in space and its uncertainty) is carried out in a sequence, where the geometrical framework is taken from the geological model and in turn used by the rock mechanics, thermal and hydrogeological modelling, etc. The three-dimensional description should present the parameters with their spatial variability over a relevant and specified scale, with the uncertainty included in the description. Different alternative descriptions may be required. After the individual discipline modelling and uncertainty assessment, a phase of overall confidence evaluation follows. Members of the different modelling teams assess the suggested uncertainties and evaluate the feedback. These discussions should assess overall confidence by checking that all relevant data are used, that information in past model versions is considered, that the different kinds of uncertainty are addressed, and that suggested alternatives make sense and whether there is potential for additional alternatives, and by discussing, where appropriate, how additional measurements (i.e. more data) would affect confidence. The findings as well as the modelling results are to be documented in a Site Description.

The Space Asset Management Database (SAM-D) was implemented to effectively track known objects in space by ingesting information from a variety of databases and performing calculations to determine the expected position of an object at a specified time. While SAM-D performs this task very well, it is limited by technology and is not available outside of the local user base. Modeling and simulation can be powerful tools to exploit the information contained in SAM-D. However, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. A more capable data management infrastructure would extend SAM-D to support the larger data sets to be generated by the COI. A service-oriented architecture model will allow it to expand easily to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and user interfaces for visualization. Based on a web-centric approach, the entire COI will be able to access the data and related analytics. In addition, tight control of information-sharing policy will increase confidence in the system, which would encourage industry partners to provide commercial data. SIMON is a Government-off-the-Shelf information-sharing platform in use throughout DoD and DHS information sharing and situation awareness communities. SIMON provides fine-grained control to data owners, allowing them to determine exactly how and when their data are shared. SIMON supports a micro-service approach to system development, meaning M&S and analytic services can be easily built or adapted. It is uniquely positioned to fill this need as an information-sharing platform with a proven track record of successful situational awareness system deployments. Combined with the integration of new and legacy M&S tools, a SIMON-based architecture will provide a robust SA environment for the NASA SA COI that can be extended and expanded indefinitely.
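
The owner-controlled, fine-grained sharing described above can be sketched as a policy check performed before any data access. This is a minimal illustration of the design idea only; the class names, dataset names, and groups below are hypothetical and are not SIMON's actual API.

```python
from datetime import datetime, timezone

class SharingPolicy:
    """Owner-defined rule for exactly how and when a dataset may be read
    (the fine-grained control the platform gives data owners)."""
    def __init__(self, allowed_groups, not_before=None):
        self.allowed_groups = set(allowed_groups)
        self.not_before = not_before  # optional embargo timestamp

    def permits(self, requester_group, when=None):
        when = when or datetime.now(timezone.utc)
        if self.not_before and when < self.not_before:
            return False  # embargoed: owner has not released it yet
        return requester_group in self.allowed_groups

# Hypothetical catalog: dataset name -> owner's sharing policy
catalog = {"ephemeris-2024": SharingPolicy({"NASA-SA-COI", "DoD"})}

def fetch(dataset, requester_group):
    """Every read goes through the owner's policy before data is returned."""
    policy = catalog.get(dataset)
    if policy is None or not policy.permits(requester_group):
        raise PermissionError(f"{requester_group} may not read {dataset}")
    return {"dataset": dataset, "granted_to": requester_group}

print(fetch("ephemeris-2024", "DoD")["granted_to"])  # → DoD
```

In a micro-service deployment, a check like `permits` would sit behind each data endpoint, so new analytic services inherit the owner's policy automatically.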

As a result of the migration wave that appeared in summer 2015, the issue of immigrant integration has become increasingly conspicuous. Although a significant decline has been recorded in the number of immigrants, social, economic and labor-market integration is still a challenge for experts and a task to be resolved. In our opinion, the key to the success of migration strategies and integration-aimed programs depends on the attitude and awareness of society (public opinion) and, on the organizational level, of the manager and future colleagues, as well as on the organizational culture and the approach of a competent human resource expert. Besides adequate information, the recognition of international 'best practices' and the adaptation of operational diversity management, one of the possible methods of facilitating integration is the use of sensitization trainings. The article introduces partial results of a questionnaire survey involving 220 employees with respect to attributes associated with migrants, emphasizing the peculiarity and significance of sensitization trainings.

• The library can, together with the Schools, create and offer IL modules adapted to the educational programs. Today IL education at BTH is quite extensive, but also irregular and highly dependent on contacts with individual teachers, which makes IL education vulnerable. In order to bring this problem to light, and inspired by the Borås model (presented at Creating Knowledge VI) as well as Sydostmodellen, the library at BTH contacted the Board of Education during the winter of 2012 and presented a plan for how the library and Schools at BTH could cooperate in order to integrate IL education within all educational programs. Suggestions regarding the content, extent, progression, timing, assessment and learning outcomes of IL education are the focal point of the presented plan. As a first result of the proposal, the library has been commissioned by the BTH Quality Assurance Council to review the situation regarding IL education at BTH together with the educational program directors. In cooperation with the programs, the library should also make a plan for each program on how to integrate IL education as part of generic skills. At the conference, the following themes were addressed and discussed during our presentation: the sustainability of IL education, collaboration within the academy regarding IL education, and how the integration of IL education in university educational programs is reflected in research on IL in general.

Academic medical centers in North America are expanding their missions from the traditional triad of patient care, research, and education to include the broader issue of healthcare delivery improvement. In recent years, integrated Critical Care Organizations have developed within academic centers to better meet the challenges of this broadening mission. The goal of this article is to provide interested administrators and intensivists with the resources, lines of communication, and organizational approach needed to accomplish integration and Critical Care Organization formation effectively. The Society of Critical Care Medicine convened a taskforce entitled "Academic Leaders in Critical Care Medicine" on February 22, 2016 at the 45th Critical Care Congress, drawing on the expertise of successful leaders of advanced-governance Critical Care Organizations in North America to develop a toolkit for advancing Critical Care Organizations. The Academic Critical Care Organization Building section workgroup of the taskforce held regular monthly conference calls to reach consensus on the development of the toolkit, using methods proven to advance the development of their own academic Critical Care Organizations. Relevant medical literature was reviewed by literature search, and materials from federal agencies and other national organizations were accessed through the Internet. Key elements of an academic Critical Care Organization are outlined. The vital missions of multidisciplinary patient care, safety, and quality are linked to the research, education, and professional development missions that enhance the value of such organizations. Core features, benefits, barriers, and recommendations for the integration of academic programs within Critical Care Organizations are described. Selected readings and resources for successfully implementing the recommendations are provided. Communication with medical school and hospital leadership is discussed. We present the rationale for critical...

The NordForsk CRUCIAL project (2016-2017), "Critical steps in understanding land surface - atmosphere interactions: from improved knowledge to socioeconomic solutions", part of the Pan-Eurasian EXperiment (PEEX; https://www.atm.helsinki.fi/peex) programme activities, seeks deeper collaboration between Nordic and Russian science communities. In particular, following collaboration between Danish and Russian partners, several topics were selected for joint research, focused on evaluating: (1) the impact of urbanization processes on changes in urban weather and climate at urban, subregional and regional scales, contributing to assessment studies for population and environment; (2) the effects of various feedback mechanisms on aerosol and cloud formation and radiative forcing at urban-regional scales, for better prediction of extreme weather events and as a contribution to early warning systems; (3) environmental contamination from continuous emissions and industrial accidents, for better assessment and decision making for sustainable social and economic development; and (4) the climatology of the atmospheric boundary layer at northern latitudes, to improve understanding of processes, revise parameterizations, and improve weather forecasting. These research topics are addressed using the online integrated Enviro-HIRLAM (Environment - High Resolution Limited Area Model) model within students' research projects: (1) "Online integrated high-resolution modelling of Saint-Petersburg metropolitan area influence on weather and air pollution forecasting"; (2) "Modeling of aerosol impact on regional-urban scales: case study of Saint-Petersburg metropolitan area"; (3) "Regional modeling and GIS evaluation of environmental pollution from Kola Peninsula sources"; and (4) "Climatology of the High-Latitude Planetary Boundary Layer". The results achieved by the students' projects and the planned young-scientist research training on online integrated modelling (June 2017) will be presented and...

The In Situ Remediation Integrated Program (ISR IP), managed under the US Department of Energy's (DOE) Office of Technology Development, focuses research and development efforts on the in-place treatment of contaminated environmental media, such as soil and groundwater, and on the containment of contaminants to prevent them from spreading through the environment. As described here, specific ISR IP projects are advancing the application of in situ technologies to the demonstration point, providing developed technologies to customers within DOE. The ISR IP has also taken a lead role in assessing and supporting innovative technologies that may have application to DOE...

The Integral Fast Reactor (IFR) and metallic fuel have emerged as the US Department of Energy reference reactor concept and fuel system for the development of an advanced liquid-metal reactor. This article addresses the basic elements of the IFR reactor concept and focuses on the safety advances achieved by the IFR Program in the areas of (1) fuel performance, (2) superior local fault tolerance, (3) transient fuel performance, (4) fuel-failure mechanisms, (5) performance in anticipated transients without scram, (6) core-melt mitigation, and (7) actinide recycle.

Within the past twenty years, new techniques and methods have emerged in response to new technologies that are based upon the performance of high-purity and well-characterized materials. The National Bureau of Standards, through its Standard Reference Materials (SRMs) Program, provides standards in the form of many of these materials to ensure the accuracy and compatibility of measurements throughout the US and the world. These SRMs are developed using state-of-the-art methods and procedures for both preparation and analysis. Nuclear methods, in particular activation analysis, constitute an integral part of that analysis process.

William Creelman, National Marine Service, St. Louis, Missouri; William H. Silcox, Standard Oil Company of California, San Francisco, California. ...develop physical models and generic tools for analyzing the effects of redundancy, reserve strength, and residual strength on the system behavior of marine... probabilistic analyses to be applicable to real-world problems, this program needs to provide the deterministic physical models and generic tools upon...

The model SWIM (Soil and Water Integrated Model) was developed to provide a comprehensive GIS-based tool for hydrological and water quality modelling in mesoscale and large river basins (from 100 to 10,000 km^2) that can be parameterised using regionally available information. The model was developed mainly for use in Europe and the temperate zone, though its application in other regions is possible as well. SWIM is based on two previously developed tools, SWAT and MATSALU (see more explanations in section 1.1). The model integrates hydrology, vegetation, erosion, and nutrient dynamics at the watershed scale. SWIM has a three-level disaggregation scheme 'basin - sub-basins - hydrotopes' and is coupled to the Geographic Information System GRASS (GRASS, 1993). A robust approach is suggested for nitrogen and phosphorus modelling in mesoscale watersheds. SWIM runs under the UNIX environment. Model testing and validation were performed sequentially for hydrology, crop growth, nitrogen and erosion in a number of mesoscale watersheds in the German part of the Elbe drainage basin. A comprehensive scheme of spatial disaggregation into sub-basins and hydrotopes, combined with a reasonable restriction on sub-basin area, makes it possible to assess water resources and water quality with SWIM in mesoscale river basins. The modest data requirements represent an important advantage of the model. Direct connection to land use and climate data makes it possible to use the model for analysing the impacts of climate change and land use change on hydrology, agricultural production, and water quality. (orig.)
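
The 'basin - sub-basins - hydrotopes' disaggregation scheme can be sketched as a simple aggregation hierarchy. The classes and numbers below are illustrative only and greatly simplify what SWIM actually computes (no routing, vegetation, or nutrient dynamics):

```python
from dataclasses import dataclass, field

@dataclass
class Hydrotope:
    """Smallest unit: a patch of uniform land use and soil within a sub-basin."""
    area_km2: float
    runoff_mm: float  # runoff depth generated on this unit

@dataclass
class SubBasin:
    hydrotopes: list = field(default_factory=list)

    def runoff_m3(self):
        # Area-weighted aggregation of hydrotope runoff (1 mm over 1 km2 = 1000 m3)
        return sum(h.area_km2 * h.runoff_mm * 1000.0 for h in self.hydrotopes)

@dataclass
class Basin:
    sub_basins: list = field(default_factory=list)

    def outlet_runoff_m3(self):
        # Sub-basin contributions summed at the basin outlet
        # (real routing would lag and attenuate each contribution).
        return sum(sb.runoff_m3() for sb in self.sub_basins)

basin = Basin([SubBasin([Hydrotope(2.0, 5.0), Hydrotope(1.0, 8.0)]),
               SubBasin([Hydrotope(4.0, 3.0)])])
print(basin.outlet_runoff_m3())  # → 30000.0
```

The point of the three-level scheme is that processes are computed on hydrotopes, where parameters are homogeneous, and only aggregated results move up the hierarchy.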

A new boiling water reactor safety test facility (FIST, Full Integral Simulation Test) is described. It will be used to investigate small breaks and operational transients and to tie results from such tests to earlier large-break test results obtained in the TLTA. The new facility's full height and prototypical components constitute a major scaling improvement over earlier test facilities. A heated feedwater system, permitting steady-state operation, and a large increase in the number of measurements are other significant improvements. The program background is outlined and the program objectives are defined. The design basis is presented together with a detailed, complete description of the facility and the measurements to be made. An extensive component scaling analysis and a prediction of performance are also presented.

An accessible treatment of the modeling and solution of integer programming problems, featuring modern applications and software. In order to fully comprehend the algorithms associated with integer programming, it is important to understand not only how algorithms work, but also why they work. Applied Integer Programming features a unique emphasis on this point, focusing on problem modeling and solution using commercial software. Taking an application-oriented approach, this book addresses the art and science of mathematical modeling related to the mixed integer programming (MIP) framework and...
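
Since the emphasis above is on why integer-programming algorithms work, not just how, a minimal branch-and-bound sketch for a 0-1 knapsack (a simple MIP) may help: the LP-relaxation bound (a greedy fractional fill) is used to prune branches that cannot beat the best solution found so far. The function and the instance are illustrative, not taken from the book.

```python
def solve_knapsack(values, weights, capacity):
    """Branch and bound for the 0-1 knapsack: branch on each item,
    prune with the fractional (LP-relaxation) upper bound."""
    # Sort items by value density so the relaxation bound is greedy.
    items = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)

    def bound(k, cap, val):
        # Greedy fractional fill of the remaining items: an upper bound
        # on any integer completion from this node.
        for i in items[k:]:
            if weights[i] <= cap:
                cap -= weights[i]
                val += values[i]
            else:
                return val + values[i] * cap / weights[i]
        return val

    best = 0

    def branch(k, cap, val):
        nonlocal best
        if k == len(items):
            best = max(best, val)
            return
        if bound(k, cap, val) <= best:
            return  # prune: the relaxation cannot beat the incumbent
        i = items[k]
        if weights[i] <= cap:              # branch 1: take item i
            branch(k + 1, cap - weights[i], val + values[i])
        branch(k + 1, cap, val)            # branch 2: skip item i

    branch(0, capacity, 0)
    return best

print(solve_knapsack([60, 100, 120], [10, 20, 30], 50))  # → 220
```

Commercial MIP solvers apply the same bound-and-prune logic, with far stronger relaxations and cutting planes, which is why understanding the relaxation is key to understanding solver behavior.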

The Systems Analysis Research Unit at the Civil Aeromedical Institute (CAMI) has developed a generic model for Federal Aviation Administration (FAA) Academy training program evaluation. The model will serve as a basis for integrating the total data b...

To deal with the complexity of operating and supervising large-scale industrial installations at CERN, Programmable Logic Controllers (PLCs) are often used. A failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. The requirements on software quality are therefore very high. To provide PLC developers with a way to verify proper functionality against requirements, a Java tool named PLCverif has been developed which encapsulates, and thus simplifies, the use of third-party model checkers. One of our goals in this project is to integrate PLCverif into the development process of PLC programs. When the developer changes the program, all the requirements should be verified again, as a change to the code can produce collateral effects and violate one or more requirements. For that reason, PLCverif has been extended to work with Jenkins CI in order to automatically trigger the verification cases when the developer changes the PLC program. This prototype has been...
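
The change-triggered re-verification idea can be sketched as a loop that re-runs every verification case whenever the source digest changes. The `plcverif` command-line invocation shown is hypothetical; the real tool and its Jenkins integration work differently, but the principle (any edit invalidates all previously verified requirements) is the same.

```python
import hashlib
import subprocess
import time

def file_digest(path):
    """Content hash of the PLC source, used to detect edits."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def watch_and_verify(plc_source, requirements, poll_seconds=5):
    """Re-run ALL verification cases whenever the PLC program changes,
    since any edit can have collateral effects on other requirements."""
    last = None
    while True:
        digest = file_digest(plc_source)
        if digest != last:
            last = digest
            for req in requirements:
                # Hypothetical CLI; the real PLCverif invocation differs.
                subprocess.run(
                    ["plcverif", "--source", plc_source, "--case", req],
                    check=False)
        time.sleep(poll_seconds)
```

In a CI setup the polling loop is replaced by the SCM hook: the commit itself is the change event that triggers the full verification run.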

KEPCO E and C participated in the NAPS (Nuclear Application Programs) development project for the BNPP (Barakah Nuclear Power Plant) simulator. The 3KEY MASTER™ platform was adopted for this project; it is comprehensive simulation platform software developed by WSC (Western Services Corporation) for the development and control of simulation software. The NAPS, based on the actual BNPP project, was modified in order to meet specific requirements for nuclear power plant simulators. Considerations regarding software design for the BNPP simulator and the interfaces between the 3KM platform and application programs are discussed. Repeatability is one of the functional requirements for nuclear power plant simulators. In order to migrate software from actual plants to simulators, software functions for storing and retrieving plant conditions and program variables must be implemented. In addition, software structures need to be redesigned to meet the repeatability requirement, and source code developed for actual plants has to be optimized to reflect the simulator's characteristics as well. Synchronization is an important consideration in integrating external application programs into the 3KM simulator.
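
The store/retrieve capability behind repeatability can be sketched as a snapshot of all program variables that can be restored to replay a scenario identically. The model, variable names, and snapshot API below are invented for illustration and are not the 3KEY MASTER™ interface.

```python
import copy

class SimModel:
    """Minimal sketch of what a plant model needs before it can run on a
    simulator platform: all state in one place, so a named snapshot
    (initial condition) can be stored and restored for repeatable runs."""
    def __init__(self):
        self.state = {"pressure": 15.5, "level": 0.72, "pump_on": True}
        self.snapshots = {}

    def step(self):
        # Stand-in for one integration step of the plant model.
        if self.state["pump_on"]:
            self.state["level"] = round(self.state["level"] + 0.01, 4)

    def store(self, name):
        self.snapshots[name] = copy.deepcopy(self.state)

    def retrieve(self, name):
        self.state = copy.deepcopy(self.snapshots[name])

sim = SimModel()
sim.store("IC-1")            # save an initial condition
for _ in range(3):
    sim.step()
first_run = sim.state["level"]
sim.retrieve("IC-1")         # rewind to the stored condition
for _ in range(3):
    sim.step()
assert sim.state["level"] == first_run   # identical replay: repeatability
print(first_run)  # → 0.75
```

Code written for an actual plant usually scatters state across modules, which is why restructuring it around an explicit, snapshot-able state is a prerequisite for simulator migration.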

The overall goal of this "Integrated Research Training Program of Excellence in Radiochemistry" is to provide a rich and deep research experience in state-of-the-art radiochemistry and in the fundamentals of radioisotopic labeling and tracer methodology, in order to develop researchers who are capable of meeting the challenges of designing and preparing radiotracers of broad applicability for monitoring and imaging diverse biological systems and environmental processes. The program was based in the Departments of Radiology and Radiation Oncology at Washington University Medical School and the Department of Chemistry at the University of Illinois at Urbana-Champaign, and it was initially directed by Professor Michael J. Welch as Principal Investigator. After his passing in 2012, the program was led by Professor Suzanne E. Lapi. Programmatic content and participant progress were overseen by an Internal Advisory Committee of senior investigators consisting of the PIs, Professor Mach of the Department of Radiology at Washington University and Professor John A. Katzenellenbogen of the Department of Chemistry at the University of Illinois. A small External Advisory Committee of experts in radiolabeled compounds and in their environmental and plant-science applications was also constituted to give overall program guidance.

E-learning is not going to work if the system is not used in accordance with user needs. The user interface is very important in encouraging use of the application. Many theories discuss user interface usability evaluation and technology acceptance separately; correlating interface usability evaluation with user acceptance could enhance the e-learning process. Therefore, an evaluation model for e-learning interface acceptance is considered important to investigate. The aim of this study is to propose an integrated e-learning user interface acceptance evaluation model. The model combines several theories of e-learning interface measurement, such as user learning style, usability evaluation, and user benefit. These were formulated into structured questionnaires, which were distributed to 125 English Language School (ELS) students. The analysis used Structural Equation Modeling with LISREL v8.80 and MANOVA.

The interdisciplinary study of information technology adoption has developed rapidly over the last 30 years. Various theoretical models have been developed and applied, such as the Technology Acceptance Model (TAM), Innovation Diffusion Theory (IDT), the Theory of Planned Behavior (TPB), etc. The result of these many years of research is thousands of contributions to the field, which, however, remain highly fragmented. This paper develops a theoretical model of technology adoption by integrating major theories in the field: primarily IDT, TAM, and TPB. To do so while avoiding further fragmentation, an approach is proposed that goes back to basics in developing independent variable types, emphasizing (1) the logic of classification and (2) the psychological mechanisms behind variable types. Once developed, these types are then populated with variables originating in empirical research. Conclusions are drawn on which types are underpopulated and present potential for future research. I end with a set of methodological recommendations for future application of the model.

Purpose – This paper aims to critically review the ownership, location and internalization (OLI) model and the Uppsala internationalization process (UIP) framework. The OLI and the UIP models fail to include corporate entrepreneurship and managerial psychology in their analyses. We suggest that regulatory focus theory unifies the managerial strategic choice between position logic and opportunity logic, that host country institutions affect this managerial choice, and that the inclusion of concepts such as corporate entrepreneurship, host country institutions and regulatory focus in an integrated framework helps to explain firm internationalization. Design/methodology/approach – This paper is based on a review of the literature on the OLI and UIP models. In addition, it presents a conceptual model that encompasses corporate entrepreneurship, regulatory focus and institutions. Findings...

Many physiological, environmental, and operational risks exist for crewmembers during spaceflight. An understanding of these risks from an integrated perspective is required to provide effective and efficient mitigations during future exploration missions that typically have stringent limitations on resources available, such as mass, power, and crew time. The Human Research Program (HRP) is in the early stages of developing collaborative modeling approaches for the purposes of managing its science portfolio in an integrated manner to support cross-disciplinary risk mitigation strategies and to enable resilient human and engineered systems in the spaceflight environment. In this talk, we will share ideas being explored from fields such as network science, complexity theory, and system-of-systems modeling. Initial work on tools to support these explorations will be discussed briefly, along with ideas for future efforts.

As NASA Project Risk Management activities continue to evolve, the need to successfully integrate risk management processes across the life cycle, between functional disciplines, among stakeholders, across various management policies, and within cost, schedule and performance requirements/constraints becomes more evident and important. Today's programs and projects are complex undertakings that include a myriad of processes, tools, techniques, management arrangements and other variables, all of which must function together in order to achieve mission success. The perception and impact of risk may vary significantly among stakeholders and may influence decisions that have unintended consequences on the project during a future phase of the life cycle. In these cases, risks may be unintentionally and/or arbitrarily transferred to others without the benefit of a comprehensive systemic risk assessment. Integrating risk across people, processes, and project requirements/constraints serves to enhance decisions, strengthen communication pathways, and reinforce the ability of the project team to identify and manage risks across the broad spectrum of project management responsibilities. The ability to identify risks in all areas of project management increases the likelihood that a project will identify significant issues before they become problems, and allows projects to make effective and efficient use of shrinking resources. A fully integrated team risk effort, a disciplined and rigorous process, and an understanding of project requirements/constraints together provide the opportunity for more effective risk management. Applying an integrated approach to risk management makes it possible to do a better job of balancing safety, cost, schedule, operational performance and other elements of risk. This paper examines how people, processes, and project requirements/constraints can be integrated across the project life cycle for better risk management and, ultimately, improved...

A conceptual model of the life cycle of a program is proposed. The model is based on the value approach and uses, as its resulting index, the category of complex structural value. It renders the life-cycle process of the program in terms of time versus result and assumes four basic phases of the life cycle: initiation, planning, executing and closing. The model also formalizes the interconnection between the management processes of program integration and the management of the program's community and subprocesses. The choice of a value approach for forming the program's resulting index is driven by the variety of program outcomes: their diversity and complexity make it difficult to find a single evaluation criterion. A mechanism for assessing the value of the program is worked out. It consists of four steps and uses conventional methods (decomposition and expert estimates). Points are used as the unit of measurement, on a rating scale with a maximum score of one hundred points. The complex value, evaluated out of one hundred points, is the result of the program and is critically important in the current and final evaluation of the program.
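
The assessment mechanism (decomposition into components plus expert estimates on a 100-point scale) can be sketched as a weighted aggregation. The component names and weights below are invented for illustration; the paper does not specify them.

```python
def program_value(components):
    """Aggregate expert scores (0-100 points each) into one complex
    structural value, weighting each component by its share of the
    program (decomposition step). Returns a score out of 100."""
    total_weight = sum(w for w, _ in components.values())
    assert abs(total_weight - 1.0) < 1e-9, "weights must sum to 1"
    return sum(w * score for w, score in components.values())

# component -> (weight from decomposition, mean expert estimate in points)
value = program_value({
    "stakeholder benefit":        (0.4, 80),
    "financial result":           (0.3, 70),
    "organisational capability":  (0.3, 90),
})
print(value)  # → 80.0
```

Running the same aggregation at each life-cycle phase gives the current evaluation; running it at closing gives the final evaluation against the 100-point maximum.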

Integrated carbon-to-liquids technology (ICTL) incorporates three basic processes for the conversion of a wide range of feedstocks to distillate liquid fuels: (1) Direct Microcatalytic Coal Liquefaction (MCL) is coupled with biomass liquefaction via (2) Catalytic Hydrodeoxygenation and Isomerization (CHI) of fatty acid methyl esters (FAME) or triglyceride fatty acids (TGFA) to produce liquid fuels, with process-derived (3) CO2 Capture and Utilization (CCU) via algae production and use in BioFertilizer for added terrestrial sequestration of CO2, or as a feedstock for MCL and/or CHI. This novel approach enables synthetic fuels production while simultaneously meeting EISA 2007 Section 526 targets, minimizing land use and water consumption, and providing cost-competitive fuels at current-day petroleum prices. ICTL was demonstrated with Montana Crow sub-bituminous coal in MCL pilot-scale operations at the Energy and Environmental Research Center at the University of North Dakota (EERC), with related pilot-scale CHI studies conducted at the University of Pittsburgh Applied Research Center (PARC). Coal-Biomass to Liquid (CBTL) fuel samples were evaluated at the US Air Force Research Labs (AFRL) in Dayton, and greenhouse tests of algae-based BioFertilizer were conducted at Montana State University (MSU). Econometric modeling studies were also conducted on the use of algae-based BioFertilizer in a wheat-camelina crop rotation cycle. We find that the combined operation is able not only to help boost crop yields, but also to provide added profits from TGFA (from crop production) for use as an ICTL plant feedstock. This program demonstrated the overall viability of ICTL in pilot-scale operations. Related work on the Life Cycle Assessment (LCA) of a Montana project indicated that CCU could be employed very effectively to reduce the overall carbon footprint of the MCL/CHI process. Plans are currently being made to conduct larger...

The principal objective of the Underground Storage Tank Integrated Demonstration Program is the demonstration and continued development of technologies suitable for the remediation of waste stored in underground storage tanks. The Underground Storage Tank Integrated Demonstration Program is the most complex of the integrated demonstration programs established under the management of the Office of Technology Development. The Program has five participating sites: Oak Ridge, Idaho, Fernald, Savannah River, and Hanford. Activities included within the Underground Storage Tank Integrated Demonstration are (1) characterizing radioactive and hazardous waste constituents, (2) determining the need and methodology for improving the stability of the waste form, (3) determining the performance requirements, (4) demonstrating barrier performance by instrumented field tests, natural analog studies, and modeling, (5) determining the need and method for destroying and stabilizing hazardous waste constituents, (6) developing and evaluating methods for retrieving, processing (pretreatment and treatment), and storing the waste on an interim basis, and (7) defining and evaluating waste packages, transportation options, and ultimate closure techniques, including site restoration. The eventual objective is the transfer of new technologies as a system to full-scale remediation at US Department of Energy complexes and at sites in the private sector.

This article outlines the development of the Australian Gold Coast Integrated Care Model, based on the elements identified in the contemporary research literature as essential for successful integration of care between primary care and acute hospital services. The objectives of the model are to proactively manage high-risk patients with complex and chronic conditions in collaboration with General Practitioners, to ultimately reduce presentations to the health service emergency department, improve the capacity of specialist outpatients, and decrease planned and unplanned admission rates. Central to the model is a shared care record which is maintained and accessed by staff in the Coordination Centre. We provide a process map outlining the care protocols from initial assessment to care of the patient presenting for emergency care. The model is being evaluated over a pilot three-year proof-of-concept phase to determine economic and process perspectives. If found to be cost-effective, acceptable to patients and professionals, and as good as or better than usual care in terms of outcomes, the strategic intent is to scale the programme beyond the local health service.

We discuss the basic typological and integrative theoretical models that explain the occurrence of child sexual abuse and the differences detected among the perpetrators of crimes against the sexual integrity of minors. A comprehensive review of the theoretical concepts of sexual abuse in our country has not, in fact, previously been carried out, and in this paper we make such an attempt for the first time. It is shown that the existing notions of sexual abuse largely overlap each other, but each of the models takes into account factors not explicitly addressed in the other concepts. Systematic consideration of the theoretical models of sexual abuse makes it possible to generalize and systematize the available data on the mechanisms of pedophile behavior. This review provides an opportunity to develop a new benchmark in the study of sexual abuse and to get closer to building the most accurate and comprehensive model. In turn, this may contribute to answering questions about the factors, dynamics, and prevention of criminal sexual conduct against children.

Advances in technology over the last decade have resulted in increased opportunities for educators to become more innovative in classroom and clinical teaching. These innovations have allowed students and faculty to access essential clinical information at the point of care/need. By capitalizing on technologies such as personal digital assistants and course delivery shells, faculty and students have both portable and remote access to information that can guide practice and learning activities in clinical, classroom, and distance settings. For instance, a student can use a personal digital assistant to research a patient's new medication at the bedside, study course information, access references during class in response to a question, or download clinical materials from home. Although the benefits of having ready access to information seem obvious, there are costs and strategic planning activities associated with implementing these projects. Clearly, the objective of any academic nursing program is to develop skills among students so they can efficiently access information and use that information to guide their nursing practice. To do so, academic nursing administrators must have the forethought to envision how new technologies can support achieving this goal as well as the ability to put in place the infrastructure supports needed for success. This article presents a case study of how one institution developed the necessary infrastructure and garnered the appropriate resources to implement an ambitious technology initiative integrated throughout a large undergraduate nursing program. In addition, how the integration of technology, online and mobile, can enhance clinical learning will be discussed.

Two major global environmental problems are dealt with: climate change and stratospheric ozone depletion (and their mutual interactions), briefly surveyed in part 1. In Part 2 a brief description of the integrated modelling framework IMAGE 1.6 is given. Some specific parts of the model are described in more detail in other Chapters, e.g. the carbon cycle model, the atmospheric chemistry model, the halocarbon model, and the UV-B impact model. In Part 3 an uncertainty analysis of climate change and stratospheric ozone depletion is presented (Chapter 4). Chapter 5 briefly reviews the social and economic uncertainties implied by future greenhouse gas emissions. Chapters 6 and 7 describe a model and sensitivity analysis pertaining to the scientific uncertainties and/or lacunae in the sources and sinks of methane and carbon dioxide, and their biogeochemical feedback processes. Chapter 8 presents an uncertainty and sensitivity analysis of the carbon cycle model, the halocarbon model, and the IMAGE model 1.6 as a whole. Part 4 presents the risk assessment methodology as applied to the problems of climate change and stratospheric ozone depletion more specifically. In Chapter 10, this methodology is used as a means with which to assess current ozone policy and a wide range of halocarbon policies. Chapter 11 presents and evaluates the simulated globally-averaged temperature and sea level rise (indicators) for the IPCC-1990 and 1992 scenarios, concluding with a Low Risk scenario, which would meet the climate targets. Chapter 12 discusses the impact of sea level rise on the frequency of the Dutch coastal defence system (indicator) for the IPCC-1990 scenarios. Chapter 13 presents projections of mortality rates due to stratospheric ozone depletion based on model simulations employing the UV-B chain model for a number of halocarbon policies. Chapter 14 presents an approach for allocating future emissions of CO2 among regions. (Abstract Truncated)

This thesis deals with the integration of system design, identification, modeling and control. In particular, six interdisciplinary engineering problems are addressed and investigated. Theoretical results are established and applied to structural vibration reduction and engine control problems. First, the data-based LQG control problem is formulated and solved. It is shown that a state space model is not necessary to solve this problem; rather a finite sequence from the impulse response is the only model data required to synthesize an optimal controller. The new theory avoids unnecessary reliance on a model, required in the conventional design procedure. The infinite horizon model predictive control problem is addressed for multivariable systems. The basic properties of the receding horizon implementation strategy are investigated and the complete framework for solving the problem is established. The new theory allows the accommodation of hard input constraints and time delays. The developed control algorithms guarantee closed-loop stability. A closed loop identification and infinite horizon model predictive control design procedure is established for engine speed regulation. The developed algorithms are tested on the Cummins Engine Simulator and the desired results are obtained. A finite signal-to-noise ratio model is considered for noise signals. An information quality index is introduced which measures the essential information precision required for stabilization. The problems of minimum variance control and covariance control are formulated and investigated. Convergent algorithms are developed for solving the problems of interest. The problem of the integrated passive and active control design is addressed in order to improve the overall system performance. A design algorithm is developed, which simultaneously finds: (i) the optimal values of the stiffness and damping ratios for the structure, and (ii) an optimal output variance constrained stabilizing
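The receding-horizon implementation strategy discussed above can be caricatured for a scalar system; the thesis treats the multivariable, constrained case, and all numbers below are illustrative, not taken from it:

```python
# Receding-horizon (model predictive) control sketch for a scalar
# linear system x[k+1] = a*x[k] + b*u[k], minimizing a finite-horizon
# quadratic cost sum(q*x^2 + r*u^2) via a backward Riccati recursion.
# At each step only the first input of the horizon is applied.

def horizon_gain(a, b, q, r, N):
    """Backward Riccati recursion over an N-step horizon;
    returns the first-step feedback gain."""
    p = q  # terminal cost weight
    for _ in range(N):
        k = (b * p * a) / (r + b * p * b)  # feedback gain
        p = q + a * p * (a - b * k)        # Riccati update
    return k

a, b, q, r, N = 1.2, 1.0, 1.0, 0.1, 10  # open-loop unstable plant
x = 5.0
traj = [x]
for _ in range(30):
    u = -horizon_gain(a, b, q, r, N) * x  # apply first input only
    x = a * x + b * u
    traj.append(x)
```

Even though the plant is open-loop unstable (a = 1.2), the receding-horizon feedback drives the state toward zero, which is the closed-loop stability property the abstract refers to.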

Numerical modeling is an essential tool for studying the impacts of geologic carbon storage (GCS). Injection of carbon dioxide (CO2) into deep saline aquifers leads to multi-phase flow (injected CO2 and resident brine), which can be described by a set of three-dimensional governing equations, including mass-balance equations, volumetric flux equations (modified Darcy), and constitutive equations. This is the modeling approach on which commonly used reservoir simulators such as TOUGH2 are based. Due to the large density difference between CO2 and brine, GCS models can often be simplified by assuming buoyant segregation and integrating the three-dimensional governing equations in the vertical direction. The integration leads to a set of two-dimensional equations coupled with reconstruction operators for vertical profiles of saturation and pressure. Vertically-integrated approaches have been shown to give results of quality comparable to three-dimensional reservoir simulators when applied to realistic CO2 injection sites such as the upper sand wedge at the Sleipner site. However, vertically-integrated approaches usually rely on homogeneous properties over the thickness of a geologic layer. Here, we investigate the impact of general (vertical and horizontal) heterogeneity in intrinsic permeability, relative permeability functions, and capillary pressure functions. We consider formations involving complex fluvial deposition environments and compare the performance of vertically-integrated models to full three-dimensional models for a set of hypothetical test cases consisting of high permeability channels (streams) embedded in a low permeability background (floodplains). The domains are randomly generated assuming that stream channels can be represented by sinusoidal waves in plan view and by parabolas for the streams' cross-sections. Stream parameters such as width, thickness and wavelength are based on values found at the Ketzin site in Germany. Results from the
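The reconstruction-operator idea can be illustrated with a sharp-interface, vertical-equilibrium sketch; all parameter values below are invented for illustration, not taken from the Sleipner or Ketzin models:

```python
# Sharp-interface vertical-equilibrium reconstruction sketch:
# from a vertically-integrated CO2 volume per unit area, recover
# the plume thickness and the vertical saturation and pressure
# profiles. All parameter values are illustrative.

G = 9.81           # gravity, m/s^2
PHI = 0.2          # porosity
S_BRINE_RES = 0.3  # residual brine saturation inside the plume
RHO_CO2 = 700.0    # kg/m^3 (supercritical CO2, assumed)
RHO_BRINE = 1050.0

def plume_thickness(v_co2_per_area):
    """Thickness of the buoyant CO2 layer under the caprock."""
    return v_co2_per_area / (PHI * (1.0 - S_BRINE_RES))

def saturation(z_below_caprock, h):
    """Reconstructed CO2 saturation at depth z below the caprock."""
    return (1.0 - S_BRINE_RES) if z_below_caprock < h else 0.0

def pressure(z_below_caprock, h, p_caprock):
    """Hydrostatic reconstruction: CO2 column above the interface,
    brine below it."""
    if z_below_caprock <= h:
        return p_caprock + RHO_CO2 * G * z_below_caprock
    return (p_caprock + RHO_CO2 * G * h
            + RHO_BRINE * G * (z_below_caprock - h))

h = plume_thickness(1.4)  # m^3 of CO2 per m^2 of plan-view area
# Consistency check: re-integrate the reconstructed saturation
# over a 50 m column; it should return the integrated volume.
dz = 0.01
v_back = sum(PHI * saturation(i * dz, h) * dz for i in range(5000))
```

Re-integrating the reconstructed profile recovers the vertically-integrated quantity, which is the consistency requirement that makes the two-dimensional formulation well posed.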

This paper describes a translator called JAVA PATHFINDER from JAVA to PROMELA, the "programming language" of the SPIN model checker. The purpose is to establish a framework for verification and debugging of JAVA programs based on model checking. This work should be seen as part of a broader attempt to make formal methods applicable "in the loop" of programming within NASA's areas such as space, aviation, and robotics. Our main goal is to create automated formal methods such that programmers themselves can apply these in their daily work (in the loop) without the need for specialists to manually reformulate a program into a different notation in order to analyze the program. This work is a continuation of an effort to formally verify, using SPIN, a multi-threaded operating system programmed in Lisp for the Deep-Space 1 spacecraft, and of previous work in applying existing model checkers and theorem provers to real applications.

This article elucidates an integrative model of hypnosis that integrates social, cultural, cognitive, and neurophysiological variables at play both in and out of hypnosis and considers their dynamic interaction as determinants of the multifaceted experience of hypnosis. The roles of these variables are examined in the induction and suggestion stages of hypnosis, including how they are related to the experience of involuntariness, one of the hallmarks of hypnosis. It is suggested that studies of the modification of hypnotic suggestibility; cognitive flexibility; response sets and expectancies; the default-mode network; and the search for the neurophysiological correlates of hypnosis, more broadly, in conjunction with research on social psychological variables, hold much promise to further understanding of hypnosis.

MOHID LAND is an open-source watershed model developed by MARETEC and is part of the MOHID Framework. It integrates four mediums (or compartments): porous media, surface, rivers, and atmosphere. The movement of water between these mediums is based on mass and momentum balance equations. The atmosphere medium is not explicitly simulated. Instead, it is used as a boundary condition to the model through meteorological properties: precipitation, solar radiation, wind speed/direction, relative humidity, and air temperature. The surface medium includes the overland runoff and vegetation growth processes and is simulated using a 2D grid. The porous media includes both the unsaturated (soil) and saturated (aquifer) zones and is simulated using a 3D grid. The river flow is simulated through a 1D drainage network. All these mediums are linked through evapotranspiration and flow exchanges (infiltration, river-soil-groundwater flow, surface-river overland flow). Besides the water movement, it is also possible to simulate water quality processes and solute/sediment transport. Model setup includes the definition of the geometry and the properties of each of its compartments. After the setup of the model, the only continuous input data that MOHID LAND requires are the atmosphere properties (boundary conditions), which can be provided as time series or spatial data. MOHID LAND has been adapted over the last 4 years under FP7 and ESA projects to integrate Earth Observation (EO) data, variable both in time and in space. EO data can be used to calibrate/validate the model or as input/assimilation data. The EO data currently used include LULC (Land Use Land Cover) maps, LAI (Leaf Area Index) maps, EVTP (Evapotranspiration) maps, and SWC (Soil Water Content) maps. Model results are improved by the EO data, but the advantage of this integration is that the model can still run without the EO data. This means that the model does not stop due to unavailability of EO data and can run in a forecast mode.
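The surface/soil mass-balance coupling can be caricatured with a single-bucket water balance. This is a toy sketch of the bookkeeping only, not MOHID LAND's actual numerics; all values are illustrative:

```python
# Toy bucket water balance: precipitation infiltrates into a soil
# store until it reaches capacity; the excess becomes overland
# runoff, and the store loses water to evapotranspiration (ET).

def step(storage, precip, et, capacity):
    """One time step of the bucket balance; returns (storage, runoff).
    All quantities in mm of water per unit area."""
    storage = max(storage - et, 0.0)      # evapotranspiration first
    storage += precip                     # infiltration
    runoff = max(storage - capacity, 0.0) # saturation-excess runoff
    return min(storage, capacity), runoff

storage, total_runoff = 50.0, 0.0
for precip in [0.0, 30.0, 80.0, 0.0, 10.0]:  # mm per step
    storage, runoff = step(storage, precip, et=5.0, capacity=100.0)
    total_runoff += runoff
```

Mass closes by construction: initial storage plus precipitation equals final storage plus ET plus runoff, which is the property the governing equations enforce at every exchange between compartments.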

This paper considers the process of program development aiming at technology integration for teachers. For this consideration, the paper focuses on an integration program which was recently developed as part of a larger project. The participants of this program were 45 in-service teachers. The program ran for four weeks and the conduct of the…

An integrated delivery system discovered questionable practices when it undertook a process-improvement initiative for its revenue-to-cash cycle. These discoveries served as a wake-up call to the organization that it needed to develop a comprehensive corporate compliance program. The organization engaged legal counsel to help it establish such a program. A corporate compliance officer was hired, and a compliance committee was set up. They worked with counsel to develop the structure and substance of the program and establish a corporate code of conduct that became a part of the organization's policies and procedures. Teams were formed in various areas of the organization to review compliance-related activities and suggest improvements. Clinical and nonclinical staff attended mandatory educational sessions about the program. By approaching compliance systematically, the organization has put itself in an excellent position to avoid fraudulent and abusive activities and the government scrutiny they invite.

The implementation of an Integrity Management Program (IMP) in a crude oil pipeline system is focused on the accomplishment of two primary corporate objectives: to increase operational safety margins and to optimize available resources. A proactive work philosophy ensures the safe and reliable operation of the pipeline in accordance with current legislation. The Integrity Management Program is accomplished by means of an interdisciplinary team that defines the strategic objectives that complement and are compatible with the corporate strategic business plan. The implementation of the program is based on the analysis of the risks due to external corrosion, third party damage, design and operations, and the definition of appropriate mitigation, inspection and monitoring actions, which will ensure long-term integrity of the assets. By means of a statistical propagation model of the external defects reported by a high-resolution magnetic flux leakage (MFL) inspection tool, together with the information provided by corrosion sensors, field repair interventions, close internal surveys and operation data, projected defect depth, remaining strength, and failure probability distributions were obtained. From the analysis, feasible courses of action were established, including the inspection and repair plan, the internal inspection program and both corrosion monitoring and mitigation programs. (author)
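The statistical propagation of defect depths can be sketched as a Monte Carlo projection. The growth-rate distribution and the 80%-of-wall-thickness failure criterion below are assumptions for illustration, not the program's actual models:

```python
# Monte Carlo sketch of external-defect depth propagation: each
# defect grows at an uncertain linear corrosion rate; the failure
# probability at a horizon is the fraction of sampled futures whose
# projected depth exceeds an allowable fraction of wall thickness.
import random

random.seed(0)  # reproducible sampling

WALL = 10.0          # wall thickness, mm (illustrative)
LIMIT = 0.8 * WALL   # allowable depth: 80% of wall (assumed criterion)

def failure_probability(depth_now, years, rate_mean, rate_sd, n=20000):
    """Fraction of sampled growth paths exceeding LIMIT after `years`."""
    fails = 0
    for _ in range(n):
        rate = max(random.gauss(rate_mean, rate_sd), 0.0)  # mm/year
        if depth_now + rate * years > LIMIT:
            fails += 1
    return fails / n

# A 3 mm defect growing at 0.3 +/- 0.1 mm/year (assumed):
p5 = failure_probability(3.0, years=5, rate_mean=0.3, rate_sd=0.1)
p20 = failure_probability(3.0, years=20, rate_mean=0.3, rate_sd=0.1)
```

Comparing failure probabilities at several horizons is what lets the program rank defects and schedule inspections and repairs before risk becomes unacceptable.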

Previous approaches for scheduling a league with round-robin and divisional tournaments involved decomposing the problem into easier subproblems. This approach, used to schedule the top Swedish handball league Elitserien, reduces the problem complexity but can result in suboptimal schedules. This paper presents an integrated constraint programming model that performs the scheduling in a single step. Particular attention is given to identifying implied and symmetry-breaking constraints that reduce the computational complexity significantly. The experimental evaluation of the integrated approach takes considerably less computational effort than the previous approach.
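The round-robin core that such models schedule can be generated with the classic circle method. This sketch covers only the single round-robin structure, not Elitserien's divisional tournaments or the paper's constraint-programming formulation:

```python
# Single round-robin schedule via the circle method: fix one team in
# place and rotate the rest; n teams yield n-1 rounds of n/2 games.

def round_robin(teams):
    teams = list(teams)
    if len(teams) % 2:
        teams.append(None)  # odd team count: add a bye marker
    n = len(teams)
    rounds = []
    for _ in range(n - 1):
        pairs = [(teams[i], teams[n - 1 - i]) for i in range(n // 2)
                 if teams[i] is not None and teams[n - 1 - i] is not None]
        rounds.append(pairs)
        # rotate all positions except the first
        teams = [teams[0]] + [teams[-1]] + teams[1:-1]
    return rounds

schedule = round_robin(range(8))  # 8 teams -> 7 rounds of 4 games
```

Every pair meets exactly once and every team plays in every round; a constraint model adds venue, break, and divisional constraints on top of this combinatorial skeleton.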

The Department of Energy (DOE) established the Office of Technology Development (EM-50) as an element of Environmental Restoration and Waste Management (EM) in November 1989. EM manages remediation of all DOE sites as well as wastes from current operations. The goal of the EM program is to minimize risks to human health, safety and the environment, and to bring all DOE sites into compliance with Federal, state, and local regulations by 2019. EM-50 is charged with developing new technologies that are safer, more effective and less expensive than current methods. The In Situ Remediation Integrated Program (the subject of this report) is part of EM-541, the Environmental Restoration Research and Development Division of EM-54. The In Situ Remediation Integrated Program (ISR IP) was instituted out of recognition that in situ remediation could fulfill three important criteria: significant cost reduction of cleanup by eliminating or minimizing excavation, transportation, and disposal of wastes; reduced health impacts on workers and the public by minimizing exposure to wastes during excavation and processing; and remediation of inaccessible sites, including deep subsurfaces and areas in, under, and around buildings. Buried waste, contaminated soils and groundwater, and containerized wastes are all candidates for in situ remediation. Contaminants include radioactive wastes, volatile and non-volatile organics, heavy metals, nitrates, and explosive materials. The ISR IP is intended to facilitate development of in situ remediation technologies for hazardous, radioactive, and mixed wastes in soils, groundwater, and storage tanks. Near-term focus is on containment of the wastes, with treatment receiving greater effort in future years.

This mixed-methods study sought to evaluate the outcomes of an integrative Reiki volunteer program in an academic medical oncology center setting. We used de-identified program evaluation data to perform both quantitative and qualitative analyses of participants' experiences of Reiki sessions. The quantitative data were collected pre- and postsession using a modified version of the distress thermometer. The pre- and postsession data from the distress assessment were analyzed using a paired Student's t-test. The qualitative data were derived from written responses to open-ended questions asked after each Reiki session and were analyzed for key words and recurring themes. Of the 213 pre-post surveys of first-time sessions in the evaluation period, we observed a more than 50% decrease in self-reported distress (from 3.80 to 1.55), anxiety (from 4.05 to 1.44), depression (from 2.54 to 1.10), pain (from 2.58 to 1.21), and fatigue (from 4.80 to 2.30), with P < … . Regarding Reiki, we found 176 (82.6%) of participants liked the Reiki session, 176 (82.6%) found the Reiki session helpful, 157 (73.7%) plan to continue using Reiki, and 175 (82.2%) would recommend Reiki to others. Qualitative analyses found that individuals reported that Reiki induced relaxation and enhanced spiritual well-being. An integrative Reiki volunteer program shows promise as a component of supportive care for cancer patients. More research is needed to evaluate and understand the impact that Reiki may have for patients, caregivers, and staff whose lives have been affected by cancer.
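The paired analysis described above can be sketched as follows; the pre/post scores here are invented for illustration, not the study's data:

```python
# Paired Student's t-test sketch for pre/post distress scores, using
# the standard formula t = mean(d) / (sd(d) / sqrt(n)) applied to the
# per-person differences d, with n - 1 degrees of freedom.
from math import sqrt
from statistics import mean, stdev

pre  = [5, 4, 6, 3, 7, 5, 4, 6, 5, 3]  # invented pre-session scores
post = [2, 1, 3, 1, 3, 2, 2, 3, 2, 1]  # invented post-session scores

diffs = [a - b for a, b in zip(pre, post)]  # per-person change
n = len(diffs)
t_stat = mean(diffs) / (stdev(diffs) / sqrt(n))  # df = n - 1 = 9
```

The pairing matters: each person serves as their own control, so between-person variability in baseline distress does not inflate the error term.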

This paper presents an overview of the activities developed by the Structural Integrity Group at the Institute of Aeronautics and Space - IAE, Brazil, as well as the status of ongoing work related to the life extension program for aircraft operated by the Brazilian Air Force (BAF). The first BAF-operated airplane to undergo a DTA-based life extension was the F-5 fighter, in the mid 1990s. From 1998 to 2001, BAF worked on a life extension project for the BAF AT-26 Xavante trainer. All analysis and tests were performed at IAE. The fatigue critical locations (FCLs) were presumed based upon structural design and maintenance data and also from exchange of technical information with other users of the airplane around the world. Following that work, BAF started in 2002 the extension of the operational life of the BAF T-25 “Universal”. The T-25 is the basic training airplane used by AFA - The Brazilian Air Force Academy. This airplane was also designed under the “safe-life” concept. As the T-25 fleet approached its service life limit, the Brazilian Air Force was questioning whether it could be kept in flight safely. The answer came through an extensive Damage Tolerance Analysis (DTA) program, briefly described in this paper. The current work on aircraft structural integrity is being performed for the BAF F-5 E/F that underwent an avionics and weapons system upgrade. Along with the increase in weight, new configurations and mission profiles were established. Again, a DTA program was proposed to be carried out in order to establish the reliability of the upgraded F-5 fleet. As a result of all the work described, the BAF has not reported any accident due to structural failure on aircraft submitted to Damage Tolerance Analysis.

The purpose of this thesis is the study of 2-D physics. The main tool is conformal field theory with Kac-Moody and W algebras. This theory describes 2-D models that have translation, rotation, and dilatation symmetries at their critical point. The extended conformal theories describe models that have a larger symmetry than conformal symmetry. After a review of conformal theory methods, the author carries out a detailed study of the singular vector form in the sl(2) affine algebra. With this important form, correlation functions can be calculated. The classical W algebra is studied and the relations between the classical and quantum W algebras are specified. The bosonization method is presented and the sl(2)/sl(2) topological model is studied. Partition function bosonization of different models is described. A program of rational theory classification is described, linking rational conformal theories and spin integrable models, and interesting relations between the Boltzmann weights of different models have been found. With these relations, the integrability of models is proved by a direct calculation of their Boltzmann weights.
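For reference, the Kac-Moody (affine) current algebra invoked above has the standard commutation relations, quoted here from the general theory rather than reconstructed from the thesis:

```latex
[J^a_m, J^b_n] \;=\; i f^{ab}{}_{c}\, J^c_{m+n} \;+\; k\, m\, \delta^{ab}\, \delta_{m+n,0},
\qquad f^{abc} = \epsilon^{abc} \ \text{for } sl(2),
```

where $k$ is the level and $m, n$ index the current modes; singular vectors and correlation functions of the affine sl(2) theory are organized by representations of this algebra.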

Faced with the numerous concerns about soil carbon dynamics, a large number of carbon dynamic models have been developed during the last century. These models are mainly deterministic compartment models with carbon fluxes between compartments represented by ordinary differential equations. Nowadays, many of them consider the microbial biomass as a compartment of the soil organic matter (carbon quantity). But the amount of microbial carbon is rarely used in the differential equations of the models as a limiting factor. Additionally, microbial diversity and community composition are mostly missing, although advances in soil microbial analytical methods during the two past decades have shown that these characteristics also play a significant role in soil carbon dynamics. As soil microorganisms are essential drivers of soil carbon dynamics, the question of explicitly integrating their role has become a key issue in the development of soil carbon dynamic models. Some interesting attempts can be found and are dominated by the incorporation of several compartments of different groups of microbial biomass, in terms of functional traits and/or biogeochemical compositions, to integrate microbial diversity. However, these models are basically heuristic models in the sense that they are used to test hypotheses through simulations. They have rarely been confronted with real data and thus cannot be used to predict realistic situations. The objective of this work was to empirically integrate microbial diversity in a simple model of carbon dynamics through statistical modelling of the model parameters. This work is based on available experimental results coming from a French National Research Agency program called DIMIMOS. Briefly, 13C-labelled wheat residue has been incorporated into soils with different pedological characteristics and land use history. Then, the soils have been incubated during 104 days and labelled and non-labelled CO2 fluxes have been measured at ten
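A minimal two-pool decay skeleton of the kind whose parameters would be statistically modeled might look like this; the pool fractions and rate constants are invented for illustration, not DIMIMOS estimates:

```python
# Two-pool first-order decomposition sketch: labelled residue carbon
# is split between a fast and a slow pool, each decaying
# exponentially; cumulative labelled CO2 is what incubation flux
# measurements integrate to. All parameter values are illustrative.
from math import exp

C_FAST, K_FAST = 0.6, 0.08   # fraction of residue C, rate in 1/day
C_SLOW, K_SLOW = 0.4, 0.005

def remaining(t_days):
    """Labelled C remaining in soil at time t (fraction of input)."""
    return C_FAST * exp(-K_FAST * t_days) + C_SLOW * exp(-K_SLOW * t_days)

def cumulative_co2(t_days):
    """Labelled C respired as CO2 by time t (fraction of input)."""
    return 1.0 - remaining(t_days)

mineralized_104 = cumulative_co2(104)  # end of the 104-day incubation
```

In the empirical approach described above, the rate constants would not be fixed numbers but functions of measured microbial diversity and community descriptors, fitted statistically across the DIMIMOS soils.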

The objective was to identify definitions and/or explanations of the term self-management in educative programs that aim at its development. The authors also aimed to describe the educative plans and results of the educative programs analyzed. The methodology used was an integrative review of 15 published articles (2002 to 2007). The inclusion criteria were: the occurrence of the term self-management; the existence of an educative program for the development of self-management; and relevance to the area of adult health. Self-management means the improvement or acquisition of abilities to solve problems in the biological, social, and affective scopes. The review pointed to different educational methodologies. However, it also showed the predominance of traditional methods, with conceptual contents of a physiopathological nature. The learning was evaluated as favorable, with caveats regarding application in different populations and contexts and the increased costs of the educative intervention. It was concluded that research has evidenced the importance of education for self-management, but lacked strength for not relating the biopsychosocial demands of the chronic patient and for not describing in detail the teaching and evaluation methodologies employed.

Biologically plausible strategies for visual scene integration across spatial and temporal domains continue to be a challenging topic. The fundamental question we address is whether classical problems in motion integration, such as the aperture problem, can be solved in a model that samples the visual scene at multiple spatial and temporal scales in parallel. We hypothesize that fast interareal connections that allow feedback of information between cortical layers are the key processes that disambiguate motion direction. We developed a neural model showing how the aperture problem can be solved using different spatial sampling scales between LGN, V1 layer 4, V1 layer 6, and area MT. Our results suggest that multiscale sampling, rather than feedback explicitly, is the key process that gives rise to end-stopped cells in V1 and enables area MT to solve the aperture problem without the need for calculating intersecting constraints or crafting intricate patterns of spatiotemporal receptive fields. Furthermore, the model explains why end-stopped cells no longer emerge in the absence of V1 layer 6 activity (Bolz & Gilbert, 1986), why V1 layer 4 cells are significantly more end-stopped than V1 layer 6 cells (Pack, Livingstone, Duffy, & Born, 2003), and how it is possible to have a solution to the aperture problem in area MT with no solution in V1 in the presence of driving feedback. In summary, while much research in the field focuses on how a laminar architecture can give rise to complicated spatiotemporal receptive fields to solve problems in the motion domain, we show that one can reframe motion integration as an emergent property of multiscale sampling achieved concurrently within lamina and across multiple visual areas.

Agent-oriented conceptual modeling notations are highly effective in representing requirements from an intentional stance and answering questions such as what goals exist, how key actors depend on each other, and what alternatives must be considered. In this chapter, we review an approach to executing i* models by translating these into a set of interacting agents implemented in the CASO language and suggest how we can perform reasoning with requirements modeled (both functional and non-functional) using i* models. In this chapter we particularly incorporate deliberation into the agent design. This allows us to benefit from the complementary representational capabilities of the two frameworks.

Modeling of 3D electromagnetic phenomena in a TOKAMAK with typically distributed main and additional coils is not an easy business. Not only the distribution of the magnetic field must be evaluated, but also the forces acting on particular coils. Use of differential methods (such as FDM or FEM) for this purpose may be complicated because of the geometrical incommensurability of particular subregions in the investigated area or problems with the boundary conditions. That is why an integral formulation of the problem may sometimes be an advantage. The theoretical analysis is illustrated on an example processed by both methods, whose results are compared and discussed.
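As a minimal illustration of the integral approach, the Biot-Savart law for a single circular coil can be integrated numerically and checked against the known on-axis closed form. This is a generic sketch, not the example computed in the paper:

```python
# Biot-Savart integral for the axial magnetic field of a circular
# current loop: the loop is discretized into segments whose
# contributions are summed, then compared with the analytic on-axis
# result B = mu0*I*R^2 / (2*(R^2 + z^2)^(3/2)).
from math import cos, sin, pi

MU0 = 4e-7 * pi  # vacuum permeability, T*m/A

def b_axial_numeric(current, radius, z, n=2000):
    """Axial field at height z above the loop center, by quadrature."""
    total = 0.0
    for i in range(n):
        phi = 2 * pi * (i + 0.5) / n      # midpoint of segment i
        dlx, dly = -sin(phi), cos(phi)    # unit tangent of the loop
        dl = 2 * pi * radius / n          # segment length
        # vector from the segment to the on-axis field point (0, 0, z)
        rx, ry, rz = -radius * cos(phi), -radius * sin(phi), z
        r3 = (rx * rx + ry * ry + rz * rz) ** 1.5
        cross_z = dlx * ry - dly * rx     # z-component of dl x r
        total += cross_z * dl / r3
    return MU0 * current / (4 * pi) * total

def b_axial_analytic(current, radius, z):
    return MU0 * current * radius ** 2 / (2 * (radius ** 2 + z ** 2) ** 1.5)

num = b_axial_numeric(100.0, 0.5, 0.2)
ana = b_axial_analytic(100.0, 0.5, 0.2)
```

Because the field point lies on the axis of symmetry, the integrand is constant around the loop and the quadrature reproduces the closed form almost exactly; off-axis points and coil forces require the same integral without that simplification, which is where the integral method's freedom from meshing the whole domain pays off.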

In this paper, an integrated speed-estimation model is developed based on empirical analyses for the basic sections of intercity multilane expressways under the uncongested condition. This model enables a speed estimation for each lane at any site under arbitrary highway-alignment, traffic (traffic flow and truck percentage), and rainfall conditions. By combining this model and a lane-use model which estimates traffic distribution on the lanes by each vehicle type, it is also possible to estimate an average speed across all the lanes of one direction from a traffic demand by vehicle type under specific highway-alignment and rainfall conditions. This model is expected to be a tool for the evaluation of traffic performance for expressways when the performance measure is travel speed, which is necessary for Performance-Oriented Highway Planning and Design. Regarding the highway-alignment condition, two new estimators, called effective horizontal curvature and effective vertical grade, are proposed in this paper which take into account the influence of upstream and downstream alignment conditions. They are applied to the speed-estimation model, and this shows increased accuracy of the estimation.
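The idea of an effective curvature that blends upstream and downstream alignment can be sketched as a distance-weighted average. The exponential kernel and the speed coefficients below are assumptions chosen for illustration, not the paper's calibrated estimators:

```python
# Sketch of an "effective" alignment estimator: curvature (or grade)
# at surrounding stations is averaged with weights that decay with
# distance from the evaluation site, then fed into a linear speed
# model. Kernel shape and all coefficients are illustrative.
from math import exp

def effective_value(stations, site, decay=0.002):
    """Weighted average of (position_m, value) pairs around `site`."""
    w_sum = v_sum = 0.0
    for pos, val in stations:
        w = exp(-decay * abs(pos - site))  # assumed distance kernel
        w_sum += w
        v_sum += w * val
    return v_sum / w_sum

def speed_kmh(curv_eff, grade_eff, rain_mm_h, flow_ratio):
    """Illustrative linear speed model (coefficients are made up)."""
    return (100.0
            - 4000.0 * curv_eff   # effective curvature penalty (1/m)
            - 150.0 * grade_eff   # effective grade penalty (fraction)
            - 0.8 * rain_mm_h     # rainfall penalty
            - 15.0 * flow_ratio)  # traffic-level penalty

# curvature (1/m) at stations 0 m, 300 m, 600 m; evaluate at 300 m
curv = effective_value([(0, 0.002), (300, 0.004), (600, 0.001)], site=300)
v = speed_kmh(curv, grade_eff=0.02, rain_mm_h=5.0, flow_ratio=0.5)
```

The weighting makes the estimator sensitive to a sharp curve just upstream of a site even when the local curvature is mild, which is the behavior the effective estimators are designed to capture.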

Mathematical modeling and performance simulation are playing an increasing role in large, high-technology projects. There are two reasons; first, projects are now larger than they were before, and the high cost calls for detailed performance prediction before construction. Second, in particular for space-related designs, it is often difficult to test systems under realistic conditions beforehand, and mathematical modeling is then needed to verify in advance that a system will work as planned. Computers have become much more powerful, permitting calculations that were not possible before. At the same time mathematical tools have been further developed and found acceptance in the community. Particular progress has been made in the fields of structural mechanics, optics and control engineering, where new methods have gained importance over the last few decades. Also, methods for combining optical, structural and control system models into global models have found widespread use. Such combined models are usually called integrated models and were the subject of this symposium. The objective was to bring together people working in the fields of ground-based optical telescopes, ground-based radio telescopes, and space telescopes. We succeeded in doing so and had 39 interesting presentations and many fruitful discussions during coffee and lunch breaks and social arrangements. We are grateful that so many top ranked specialists found their way to Kiruna and we believe that these proceedings will prove valuable during much future work.

… this book may be the first book on geometric modelling that also covers computer graphics. In addition, it may be the first book on computer graphics that integrates a thorough introduction to 'free-form' curves and surfaces and to the mathematical foundations for computer graphics. … the book is well suited for an undergraduate course. … The entire book is very well presented and obviously written by a distinguished and creative researcher and educator. It certainly is a textbook I would recommend. …-Computer-Aided Design, 42, 2010… Many books concentrate on computer programming and soon beco

The semiconductor industry holds a very important position in the computer industry, the ICT field, and the development of new electronic technologies. IC design service is one of the key factors in semiconductor industry development. More than 365 IC design service firms have been established around Hsinchu Science Park in Taiwan. Building an efficient planning model for integrating the resources of IC design service firms is therefore an interesting issue. This study aims to construct a planning model for IC design service firms to implement resource integration. It uses De Novo programming as a multi-criteria approach to achieve optimal resource allocation in an IC design firm. Results show that the IC design service firm should adopt the open innovation concept and utilize design outsourcing to reduce costs and enhance IC design service business performance. This De Novo programming planning model is not only for IC design service firms; it can also be applied to other industries implementing strategic alliances or integrating resources, serving as a universal model for other industry fields.
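The De Novo idea of designing the resource portfolio rather than optimizing within fixed stocks can be sketched as follows; the service lines, unit requirements, and prices are invented for illustration, not taken from the study:

```python
# De Novo programming sketch: instead of optimizing activity levels
# subject to fixed resource stocks, first fix target (aspiration)
# activity levels, then purchase exactly the resources those levels
# consume. This yields the minimal budget with no idle resources.
# All numbers are illustrative.

# unit resource requirements: rows = resources, cols = service lines
REQUIRE = [
    [4.0, 2.0],  # engineer-hours per unit of each IC design service
    [1.0, 3.0],  # EDA-tool licenses per unit of each service
]
PRICE = [60.0, 200.0]  # cost per unit of each resource

def denovo_budget(targets):
    """Minimal budget: buy exactly what the target levels consume."""
    budget = 0.0
    for req_row, price in zip(REQUIRE, PRICE):
        needed = sum(r * x for r, x in zip(req_row, targets))
        budget += price * needed
    return budget

budget = denovo_budget([10.0, 5.0])  # aspiration levels per period
```

Because the resource stocks are designed around the targets rather than given in advance, no constraint is slack at the optimum; this elimination of idle resources is the distinguishing feature of the De Novo formulation compared with classical linear programming.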

SGRAPH is one of the seismological programs used to manipulate seismic data. It consists of integrated tools for performing advanced seismological techniques. SGRAPH is a system for maintaining and analyzing seismic waveform data in a stand-alone, Windows-based application that handles a wide range of data formats. SGRAPH was described in detail in the first part of this paper. In this part, I discuss the advanced techniques included in the program and their applications in seismology. Because of the numerous tools included, SGRAPH alone is sufficient to perform basic waveform analysis and to solve advanced seismological problems. The first part of this paper presented the application of source parameter estimation and hypocentral location. Here, I discuss the SGRAPH waveform modeling tools and give examples of how to apply them to estimate the focal mechanism and crustal structure of local earthquakes.

Because of the high development costs of IC (Integrated Circuit) test programs, recycling existing test programs from one kind of ATE (Automatic Test Equipment) to another, or generating them directly from CAD simulation modules, is increasingly valuable. In this paper, a new approach to migrating test programs is presented. A virtual ATE model based on the object-oriented paradigm is developed; it runs Test C++ (an intermediate test control language) programs and TeIF (Test Intermediate Format, an intermediate pattern format), migrates test programs among three kinds of ATE (Ando DIC8032, Schlumberger S15 and GenRad 1732), and generates test patterns automatically from two kinds of CAD (Daisy and Panda).

A measurement control program for the model plant is described. The discussion includes the technical basis for such a program, the application of measurement control principles to each measurement, and the use of special experiments to estimate measurement error parameters for difficult-to-measure materials. The discussion also describes the statistical aspects of the program, and the documentation procedures used to record, maintain, and process the basic data.

Because of the growing number of information sources available through the internet there are many cases in which information needed to solve a problem or answer a question is spread across several information sources. For example, when given two sources, one about comic books and the other about super heroes, you might want to ask the question "Is Spiderman a Marvel Super Hero?" This query accesses both sources; therefore, it is necessary to have information about the relationships of the data within each source and between sources to properly access and integrate the data retrieved. The SIMS information broker captures this type of information in the form of a model. All the information sources map into the model providing the user a single interface to multiple sources.

Investigations of new knot polynomials discovered in the last few years have shown them to be intimately connected with soluble models of two dimensional lattice statistical mechanics. In this paper, these results, which in time may illuminate the whole question of why integrable lattice models exist, are reconsidered from the point of view of three dimensional gauge theory. Expectation values of Wilson lines in three dimensional Chern-Simons gauge theories can be computed by evaluating the partition functions of certain lattice models on finite graphs obtained by projecting the Wilson lines to the plane. The models in question - previously considered in both the knot theory and statistical mechanics literature - are IRF models in which the local Boltzmann weights are the matrix elements of braiding matrices in rational conformal field theories. These matrix elements, in turn, can be represented in three dimensional gauge theory in terms of the expectation value of a certain tetrahedral configuration of Wilson lines. This representation makes manifest a surprising symmetry of the braiding matrix elements in conformal field theory. (orig.)

The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of PETN Class 4. The PETN was found to have: 1) an impact sensitivity (DH50) range of 6 to 12 cm; 2) a BAM friction sensitivity (F50) range of 7 to 11 kg, with a TIL (0/10) of 3.7 to 7.2 kg; 3) an ABL friction sensitivity threshold of 5 psig or less at 8 fps; 4) an ABL ESD sensitivity threshold of 0.031 to 0.326 J/g; and 5) a thermal sensitivity characterized by an endothermic feature with Tmin = ~141 °C and an exothermic feature with Tmax = ~205 °C.

The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of ammonium nitrate (AN). AN was tested, in most cases, both as received from the manufacturer and dried/sieved. The participants found the AN to be: 1) insensitive in Type 12A impact testing (although with a wide range of values), 2) completely insensitive in BAM friction testing, 3) less sensitive than the RDX standard in ABL friction testing, 4) less sensitive than RDX in ABL ESD testing, and 5) less sensitive than RDX and PETN in DSC thermal analyses.

The Characterization, Monitoring, and Sensor Technology Integrated Program (CMST-IP) seeks to deliver needed technologies, timely and cost-effectively, to the Office of Waste Management (EM-30), the Office of Environmental Restoration (EM-40), and the Office of Facility Transition and Management (EM-60). The characterization, monitoring, and sensor technology needs of those organizations encompass: (1) initial location and characterization of wastes and waste environments prior to treatment; (2) monitoring of waste retrieval, remediation and treatment processes; (3) characterization of the composition of final waste treatment forms to evaluate the performance of waste treatment processes; and (4) site closure and compliance monitoring. Wherever possible, the CMST-IP fosters technology transfer and commercialization of the technologies that it sponsors.

The results are presented of the pressure tests performed as part of Phase I of the Steam Generator Tube Integrity (SGTI) program at Battelle Pacific Northwest Laboratory. These tests were performed to establish margin-to-failure predictions for mechanically defected Pressurized Water Reactor (PWR) steam generator tubing under operating and accident conditions. Defect geometries tested were selected because they simulate known or expected defects in PWR steam generators. These defect geometries are Electric Discharge Machining (EDM) slots, elliptical wastage, elliptical wastage plus through-wall slot, uniform thinning, denting, denting plus uniform thinning, and denting plus elliptical wastage. All defects were placed in tubing representative of that currently used in PWR steam generators

This study tested an integrated process model of travel behavior modification. We used a model that combined the theory of planned behavior (TPB), norm activation theory (NAT), a theory of implementation intention, and theories of habit. To test the integrated model, we used panel data (n = 208) obtained before and after travel feedback programs (TFPs); the TFP is a communication program aimed at voluntary travel behavior modification, from automobile use to non-auto means of travel such as p...

The Integrated Landscape Modeling (ILM) partnership is an effort by the U.S. Geological Survey (USGS) and U.S. Department of Agriculture (USDA) to identify, evaluate, and develop models to quantify services derived from ecosystems, with a focus on wetland ecosystems and conservation effects. The ILM partnership uses the Integrated Valuation of Ecosystem Services and Tradeoffs (InVEST) modeling platform to facilitate regional quantifications of ecosystem services under various scenarios of land-cover change that are representative of differing conservation program and practice implementation scenarios. To date, the ILM InVEST partnership has resulted in capabilities to quantify carbon stores, amphibian habitat, plant-community diversity, and pollination services. Work to include waterfowl and grassland bird habitat quality is in progress. Initial InVEST modeling has been focused on the Prairie Pothole Region (PPR) of the United States; future efforts might encompass other regions as data availability and knowledge increase as to how functions affecting ecosystem services differ among regions. The ILM partnership is also developing the capability for field-scale process-based modeling of depressional wetland ecosystems using the Agricultural Policy/Environmental Extender (APEX) model. Progress was made towards the development of techniques to use the APEX model for closed-basin depressional wetlands of the PPR, in addition to the open systems that the model was originally designed to simulate. The ILM partnership has matured to the stage where effects of conservation programs and practices on multiple ecosystem services can now be simulated in selected areas. Future work might include the continued development of modeling capabilities, as well as development and evaluation of differing conservation program and practice scenarios of interest to partner agencies including the USDA’s Farm Service Agency (FSA) and Natural Resources Conservation Service (NRCS). When

This is the final report of the International Piping Integrity Research Group (IPIRG) Programme. The IPIRG Programme was an international group programme managed by the U.S. Nuclear Regulatory Commission and funded by a consortium of organizations from nine nations: Canada, France, Italy, Japan, Sweden, Switzerland, Taiwan, the United Kingdom, and the United States. The objective of the programme was to develop data needed to verify engineering methods for assessing the integrity of nuclear power plant piping that contains circumferential defects. The primary focus was an experimental task that investigated the behaviour of circumferentially flawed piping and piping systems under high-rate loading typical of seismic events. To accomplish these objectives a unique pipe loop test facility was designed and constructed. The pipe system was an expansion loop with over 30 m of 406-mm diameter pipe and five long radius elbows. Five experiments on flawed piping were conducted to failure in this facility with dynamic excitation. The report: provides background information on leak-before-break and flaw evaluation procedures in piping; summarizes the technical results of the programme; gives a relatively detailed assessment of the results from the various pipe fracture experiments and complementary analyses; and summarizes the advances in the state-of-the-art of pipe fracture technology resulting from the IPIRG Programme.

Eleven major Department of Energy (DOE) site contractors were chartered by the Assistant Secretary to use a systems engineering approach to develop and evaluate technically defensible cost savings opportunities across the complex. Known as the complex-wide Environmental Management Integration (EMI), this process evaluated all the major DOE waste streams including high level waste (HLW). Across the DOE complex, this waste stream has the highest life cycle cost and is scheduled to take until at least 2035 before all HLW is processed for disposal. Technical contract experts from the four DOE sites that manage high level waste participated in the integration analysis: Hanford, Savannah River Site (SRS), Idaho National Engineering and Environmental Laboratory (INEEL), and West Valley Demonstration Project (WVDP). In addition, subject matter experts from the Yucca Mountain Project and the Tanks Focus Area participated in the analysis. Also, departmental representatives from the US Department of Energy Headquarters (DOE-HQ) monitored the analysis and results. Workouts were held throughout the year to develop recommendations to achieve a complex-wide integrated program. From this effort, the HLW Environmental Management (EM) Team identified a set of programmatic and technical opportunities that could result in potential cost savings and avoidance in excess of $18 billion and an accelerated completion of the HLW mission by seven years. The cost savings, schedule improvements, and volume reduction are attributed to a multifaceted HLW treatment disposal strategy which involves waste pretreatment, standardized waste matrices, risk-based retrieval, early development and deployment of a shipping system for glass canisters, and reasonable, low cost tank closure.

The last few years have seen significant progress in constructing the atomic models required for non-local thermodynamic equilibrium (NLTE) simulations. Along with this has come an increased understanding of the requirements for accurately modeling the ionization balance, energy content and radiative properties of different elements for a wide range of densities and temperatures. Much of this progress is the result of a series of workshops dedicated to comparing the results from different codes and computational approaches applied to a series of test problems. The results of these workshops emphasized the importance of atomic model completeness, especially in doubly excited states and autoionization transitions, to calculating ionization balance, and the importance of accurate, detailed atomic data to producing reliable spectra. We describe a simple screened-hydrogenic model that calculates NLTE ionization balance with surprising accuracy, at a low enough computational cost for routine use in radiation-hydrodynamics codes. The model incorporates term splitting, {Delta}n = 0 transitions, and approximate UTA widths for spectral calculations, with results comparable to those of much more detailed codes. Simulations done with this model have been increasingly successful at matching experimental data for laser-driven systems and hohlraums. Accurate and efficient atomic models are just one requirement for integrated NLTE simulations. Coupling the atomic kinetics to hydrodynamics and radiation transport constrains both discretizations and algorithms to retain energy conservation, accuracy and stability. In particular, the strong coupling between radiation and populations can require either very short timesteps or significantly modified radiation transport algorithms to account for NLTE material response. Considerations such as these continue to provide challenges for NLTE simulations.

A new safety culture model is constructed and is applied to analyze the correlations between safety culture and SMS. On the basis of previous typical definitions, models and theories of safety culture, an in-depth analysis of safety culture's structure, composing elements and their correlations was conducted. A new definition of safety culture was proposed from the perspective of sub-culture. Seven types of safety sub-culture were then defined: safety priority culture, standardizing culture, flexible culture, learning culture, teamwork culture, reporting culture and justice culture. An integrated safety culture model (ISCM) was then put forward based on this definition. The model divides safety culture into an intrinsic latency level and an extrinsic indication level and explains the potential relationship between the safety sub-cultures and all safety culture dimensions. Finally, in analyzing safety culture and SMS, it is concluded that a positive safety culture is the basis of implementing SMS effectively, and that an advanced SMS will in turn improve safety culture in all respects.

Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.
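A pointing error budget of the kind mentioned above typically combines statistically independent disturbance contributions as a root-sum-square. The sketch below shows the arithmetic; the contributor names and magnitudes are hypothetical and are not taken from the abstract.

```python
import math

# Illustrative line-of-sight (LOS) pointing-error budget (hypothetical values).
# Independent zero-mean disturbance contributions combine as a root-sum-square.
contributors = {
    "reaction-wheel jitter": 0.30,   # all values in arcseconds, 1-sigma
    "sensor noise":          0.25,
    "structural thermal":    0.15,
    "control residual":      0.20,
}

rss = math.sqrt(sum(v**2 for v in contributors.values()))
print(f"total LOS error: {rss:.3f} arcsec (1-sigma)")
```

In practice each entry would itself come from a disturbance-to-LOS transfer function evaluated in the integrated model; the RSS step is only the final roll-up, valid when the contributors are uncorrelated.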

The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small- Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the methods used for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis during the IDCA program. These methods changed throughout the Proficiency Test and the reasons for these changes are documented in this report. The most significant modifications in standard testing methods are: 1) including one specified sandpaper in impact testing among all the participants, 2) diversifying liquid test methods for selected participants, and 3) including sealed sample holders for thermal testing by at least one participant. This effort, funded by the Department of Homeland Security (DHS), is putting the issues of safe handling of these materials in perspective with standard military explosives. The study is adding SSST testing results for a broad suite of different HMEs to the literature. Ultimately the study will suggest new guidelines and methods and possibly establish the SSST testing accuracies needed to develop safe handling practices for HMEs. Each participating testing laboratory uses identical test materials and preparation methods wherever possible. The testing performers involved are Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Indian Head Division, Naval Surface Warfare Center, (NSWC IHD), Sandia National Laboratories (SNL), and Air Force Research Laboratory (AFRL/RXQL). These tests are conducted as a proficiency study in order to establish some consistency in test protocols, procedures, and experiments and to compare results when these testing variables cannot be made consistent.

Under the German EU Council presidency, the European Union adopted an ambitious climate protection program in spring this year which has consequences for the entire energy sector. A fair system of burden sharing is currently being sought on the level of the European Union. However, the German federal government does not wait for that agreement to be reached, but has added to the clearcut EU plans in order to achieve more climate protection. At the closed meeting of the federal cabinet in Meseberg on August 23-24, 2007, the key points of an integrated energy and climate program were adopted. The unprecedented set of measures comprises 30 points. In many cases, legal measures are required for implementation, which implies a heavy workload facing the federal government and parliament. A major step forward is seen in the federal government's intention to preserve the international competitiveness of the producing sector and energy-intensive industries also under changed framework conditions. The imperative guiding principle must be that care should take precedence over speed. European or worldwide solutions must be found for all measures, be it energy efficiency or climate protection, and all countries must be involved because, otherwise, specific measures taken by individual states will be ineffective. (orig.)

1 - Description of problem or function: GENP-2 is a system of programs that use 'generalized perturbation theory' to calculate the perturbations of reactor integral characteristics which can be expressed as ratios of linear or bilinear functionals of the real and/or adjoint fluxes (e.g. reaction rate ratios), due to cross section perturbations. 2 - Method of solution: GENP-2 consists of the following codes: DDV, SORCI, CIAP-PMN and GLOBP-2D. DDV calculates the real or adjoint fluxes and the power distribution using multigroup diffusion theory in 2 dimensions. SORCI uses the fluxes from DDV to calculate the real and/or adjoint generalized perturbation sources. CIAP-PMN reads the sources from SORCI and uses them in the real or adjoint generalized importance calculations (2 dimensions, multigroup diffusion). GLOBP-2D uses the importance calculated by CIAP-PMN, and the fluxes calculated by DDV, in generalized perturbation expressions to calculate the perturbation in the quantity of interest. 3 - Restrictions on the complexity of the problem: DDV, although variably dimensioned, has the following restrictions: - max. number of mesh points 6400; - max. number of mesh points in one dimension 81; - max. number of regions 6400; - max. number of energy groups 100; - if the power distribution is calculated, the product of the number of groups and the number of regions must not exceed 2500. The other programs have the same restrictions where applicable.
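The quantities GENP-2 perturbs are ratios of flux functionals, e.g. a reaction rate ratio R = <s1, phi> / <s2, phi>. The toy two-group sketch below (hypothetical numbers, not GENP-2 itself) compares a first-order estimate of the change in R for a cross-section perturbation against a direct recalculation, which is the kind of check a generalized-perturbation result is validated against.

```python
# Sensitivity of a reaction-rate ratio R = <s1,phi> / <s2,phi> to a
# cross-section change. Two-group data below are purely illustrative.

def ratio(s1, s2, phi):
    num = sum(a * f for a, f in zip(s1, phi))
    den = sum(a * f for a, f in zip(s2, phi))
    return num / den

phi = [1.0, 0.4]      # group fluxes
s1  = [2.0, 10.0]     # numerator cross sections
s2  = [1.0, 3.0]      # denominator cross sections

R0 = ratio(s1, s2, phi)
den0 = sum(a * f for a, f in zip(s2, phi))

ds = 0.1  # perturb the first-group denominator cross section
dR_first = -R0 * (phi[0] * ds) / den0                 # first-order estimate
dR_exact = ratio(s1, [s2[0] + ds, s2[1]], phi) - R0   # direct recalculation
print(R0, dR_first, dR_exact)
```

The first-order estimate slightly overstates the (negative) change because the exact ratio response to a denominator perturbation is nonlinear; in a real code the fluxes themselves also shift, which is exactly what the generalized importance functions computed by CIAP-PMN account for.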

Molecular electronics, in which single organic molecules are designed to perform the functions of transistors, diodes, switches and other circuit elements used in current silicon-based microelectronics, is drawing wide interest as a potential replacement technology for conventional silicon-based lithographically etched microelectronic devices. In addition to their nanoscopic scale, the additional advantage of molecular electronics devices compared to silicon-based lithographically etched devices is the promise of being able to produce them cheaply on an industrial scale using wet chemistry methods (i.e., self-assembly from solution). The design of molecular electronics devices, and the processes to make them on an industrial scale, will require a thorough theoretical understanding of the molecular and higher level processes involved. Hence, the development of modeling techniques for molecular electronics devices is a high priority from both a basic science point of view (to understand the experimental studies in this field) and from an applied nanotechnology (manufacturing) point of view. Modeling molecular electronics devices requires computational methods at all length scales - electronic structure methods for calculating electron transport through organic molecules bonded to inorganic surfaces, molecular simulation methods for determining the structure of self-assembled films of organic molecules on inorganic surfaces, mesoscale methods to understand and predict the formation of mesoscale patterns on surfaces (including interconnect architecture), and macroscopic scale methods (including finite element methods) for simulating the behavior of molecular electronic circuit elements in a larger integrated device. Here we describe a large Department of Energy project involving six universities and one national laboratory aimed at developing integrated multiscale methods for modeling molecular electronics devices. The project is funded equally by the Office of Basic

An integrated approach to enhancing rice production in Indonesia has good prospects through the implementation of an adapted and viable integrated program. One of the challenges in the rice crop sub-sector is the stagnation of production due to the limited availability of organic matter. This provides an opportunity for livestock development to overcome problems of land fertility through the use of manure as a source of organic fertilizer. The Ministry of Agriculture has implemented a program on Increasing Integrated Rice Productivity, with an integrated crop-livestock system as one of its potential components, since 2002. An integrated crop-livestock system program, with special reference to rice fields and beef cattle, is an alternative for enhancing the development potential of the agriculture sector in Indonesia. The aim of this integrated program is to enhance rice production and productivity through a system involving beef cattle, with the goal of increasing farmers' income. A household economic model can be used as one of the analyses to evaluate the success of an implemented crop-livestock system program. The specificity of farm households lies in the rationality of their dual role in production and consumption decision-making: farmers produce partly to meet home consumption from the resources used directly in production. The economic analysis of farm households described by this model can be used to anticipate policy options, covering the factors influencing farmers' decisions and the direct interrelations of production and consumption aspects, which have complex implications for farmers' welfare under the integrated crop-livestock system program.

The Critical Issues Forum (CIF) funded by the US Department of Energy is a collaborative effort between the Science Education Team of Los Alamos National Laboratory (LANL) and New Mexico high schools to improve science education throughout the state of New Mexico as well as nationally. By creating an education relationship between the LANL with its unique scientific resources and New Mexico high schools, students and teachers participate in programs that increase not only their science content knowledge but also their critical thinking and problem-solving skills. The CIF program focuses on current, globally oriented topics crucial to the security of not only the US but to that of all nations. The CIF is an academic-year program that involves both teachers and students in the process of seeking solutions for real world concerns. Built around issues tied to LANL's mission, participating students and teachers are asked to critically investigate and examine the interactions among the political, social, economic, and scientific domains while considering diversity issues that include geopolitical entities and cultural and ethnic groupings. Participants are expected to collaborate through telecommunications during the research phase and participate in a culminating multimedia activity, where they produce and deliver recommendations for the current issues being studied. The CIF was evaluated and found to be an effective approach for teacher professional training, especially in the development of skills for critical thinking and questioning. The CIF contributed to students' ability to integrate diverse disciplinary content about science-related topics and supported teachers in facilitating the understanding of their students using the CIF approach. Networking technology in CIF has been used as an information repository, resource delivery mechanism, and communication medium.

Background: The Integrated Child Development Services (ICDS) scheme is the largest program for the promotion of maternal and child health and nutrition. Aims: The present study aimed to evaluate the ICDS program in terms of the infrastructure of anganwadi centers (AWCs), characteristics of anganwadi workers (AWWs), coverage of supplementary nutrition (SN), and preschool education (PSE) provided to the beneficiaries. Methods: A total of 39 AWCs from a rural area and 15 from an urban area were surveyed. AWWs were interviewed, and records were reviewed. Information was collected using a predesigned and pretested questionnaire. Results: Of the selected AWCs, 88.9% were running in pucca buildings, 38.9% had electricity, 35.1% had a separate kitchen, 1.8% had cooking gas, and toilets were available in 59.3%. All the AWWs had received job training, 83.3% had received refresher training, 38.8% had received orientation training, 37% had received skill training in World Health Organization growth standards, and 18.5% had received skill training in mother and child health. Among registered beneficiaries, 86.9% of pregnant women, 90.7% of lactating women, and 72.6% of adolescent girls were availing SN, as were 95.4% of registered children 6 months to 3 years and 92.4% of registered children 3-6 years of age. Interruption in SN in the last 6 months was seen in 22.2% of AWCs. Appropriate and adequate PSE material was available in 59.2% of AWCs. Conclusion: There are program gaps in the infrastructure of AWCs, training of AWWs, coverage of SN, and interruption in the supply of SN.

the Integrated Management System. CM ensures that during the entire operational life of the plant the following requirements are met: · The basic design requirements of the plant are established, documented and maintained; · The physical structures, systems and components (SSCs) of the plant are in conformity with the design requirements; · The physical and functional characteristics of the plant are correctly incorporated in the operational and maintenance documentation, as well as in the documents for testing and training; · The changes in the design documentation are incorporated in the physical configuration and the operative documentation; · The changes in the design are minimized by a management review process according to approved criteria. The purpose of this report is to try to clarify the place of the configuration management program within the Integrated Management System of Kozloduy NPP and to present the computerized information system for organization of the operational activities (IS OOA) as a tool for effective management of the facility. (authors)

While some evidence supports the beneficial effects of integrating neglected tropical disease (NTD) programs to optimize coverage and reduce costs, there is minimal information regarding when or how to effectively operationalize program integration. The lack of systematic analyses of integration experiences and of integration processes may act as an impediment to achieving more effective NTD programming. We aimed to learn about the experiences of NTD stakeholders and their perceptions of integration. We evaluated differences in the definitions, roles, perceived effectiveness, and implementation experiences of integrated NTD programs among a variety of NTD stakeholder groups, including multilateral organizations, funding partners, implementation partners, national Ministry of Health (MOH) teams, district MOH teams, volunteer rural health workers, and community members participating in NTD campaigns. Semi-structured key informant interviews were conducted. Coding of themes involved a mix of in-vivo open coding and a priori thematic coding from a start list. In total, 41 interviews were conducted. Salient themes varied by stakeholder; dominant themes on integration included significant variations in definitions, differential effectiveness of specific integrated NTD activities, community member perceptions of NTD programs, the influence of funders, perceived facilitators, perceived barriers, and the effects of integration on health system strength. In general, stakeholder groups provided unique perspectives, rather than contrarian points of view, on the same topics. The stakeholders identified more advantages to integration than disadvantages; however, each stakeholder group perceived a number of unique facilitators of, and challenges to, integration. Qualitative data suggest several structural, process, and technical opportunities that could be addressed to promote more effective and efficient integrated NTD

An integrated calculation model for simulating the interaction of physics phenomena taking place in the plasma core, in the plasma edge and in the SOL and divertor of tokamaks has been developed and applied to study such interactions. The model synthesises a combination of numerical calculations: (1) the power and particle balances for the core plasma, using empirical confinement scaling laws and taking into account radiation losses; (2) the particle, momentum and power balances in the SOL and divertor, taking into account the effects of radiation and recycling neutrals; (3) the transport of fuelling and recycling neutrals, explicitly representing divertor and pumping geometry; and (4) edge pedestal gradient scale lengths and widths; evaluation of theoretical predictions: (5) confinement degradation due to thermal instabilities in the edge pedestals, (6) detachment and divertor MARFE onset, (7) core MARFE onsets leading to a H-L transition, and (8) radiative collapse leading to a disruption; and evaluation of empirical fits: (9) power thresholds for the L-H and H-L transitions and (10) the width of the edge pedestals. The various components of the calculation model are coupled and must be iterated to a self-consistent convergence. The model was developed over several years for the purpose of interpreting various edge phenomena observed in DIII-D experiments and thereby, to some extent, has been benchmarked against experiment. Because the model treats the interactions of various phenomena in the core, edge and divertor, yet is computationally efficient, it lends itself to the investigation of the effects of different choices of various edge plasma operating conditions on overall divertor and core plasma performance. Studies of the effect of fuelling location and rate, divertor geometry, plasma shape, pumping and other 'edge parameters' on core plasma properties (line average density, confinement, density limit, etc.) have been performed for DIII-D model problems. A
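The abstract notes that the core, SOL and divertor sub-models are coupled and "must be iterated to a self-consistent convergence." A toy fixed-point loop (entirely schematic; the two functions stand in for the sub-models and bear no relation to the actual physics) shows the pattern:

```python
# Schematic self-consistency iteration between two coupled sub-models.
# Both model functions are hypothetical placeholders, not tokamak physics.

def core_model(t_sep):
    """Placeholder: power crossing into the SOL as a function of a
    separatrix temperature supplied by the edge model."""
    return 1.0 + 0.5 * t_sep

def sol_model(p_sol):
    """Placeholder: separatrix temperature implied by the power
    entering the SOL."""
    return 0.4 * p_sol

t_sep = 1.0  # initial guess
for _ in range(50):
    t_new = sol_model(core_model(t_sep))
    if abs(t_new - t_sep) < 1e-10:   # self-consistent: both models agree
        break
    t_sep = t_new
print(round(t_sep, 6))  # converges to the fixed point 0.5
```

Because the composite map here is a contraction (slope 0.2), simple substitution converges quickly; real core-edge couplings can be stiffer and may need relaxation or Newton-type acceleration.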

This study examines the effects of integration on the performance ratings of the top 100 integrated healthcare networks (IHNs) in the United States. A strategic-contingency theory is used to identify the relationship of IHNs' performance to their structural and operational characteristics and integration strategies. To create a database for the panel study, the top 100 IHNs selected by the SMG Marketing Group in 1998 were followed up in 1999 and 2000. The data were merged with the Dorenfest data on information system integration. A growth curve model was developed and validated by the Mplus statistical program. Factors influencing the top 100 IHNs' performance in 1998 and their subsequent rankings in the consecutive years were analyzed. IHNs' initial performance scores were positively influenced by network size, number of affiliated physicians and profit margin, and were negatively associated with average length of stay and technical efficiency. The continuing high performance, judged by maintaining higher performance scores, tended to be enhanced by the use of more managerial or executive decision-support systems. Future studies should include time-varying operational indicators to serve as predictors of network performance.

Experimental results of the aerodynamic performance of seven candidate diffusers are presented to assist in determining their suitability for joining an MHD channel to a steam generator at minimum spacing. The three dimensional diffusers varied in area ratio from 2 to 3.8 and wall half angle from 2 to 5 degrees. The program consisted of five phases: (1) tailoring a diffuser inlet nozzle to a 15 percent blockage; (2) comparison of isolated diffusers at enthalpy ratios 0.5 to 1.0 with respect to separation characteristics and pressure recovery coefficients; (3) recording the optimum diffuser exit flow distribution; (4) recording the internal flow distribution within the steam generator when attached to the diffuser; and (5) observing isolated diffuser exhaust dynamic characteristics. The 2 and 2-1/3 degree half angle rectangular diffusers showed recovery coefficients equal to 0.48 with no evidence of flow separation or instability. Diffusion at angles greater than these produced flow instabilities and with angles greater than 3 degrees random flow separation and reattachment.
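The recovery coefficient reported above is conventionally defined as the static pressure rise across the diffuser divided by the inlet dynamic pressure. A minimal sketch of that definition, together with the loss-free ideal for a given area ratio, is shown below; the pressure and velocity values are illustrative, not experimental:

```python
def pressure_recovery_coefficient(p_exit, p_inlet, rho, v_inlet):
    """Diffuser static pressure recovery coefficient:
    Cp = (p_exit - p_inlet) / (0.5 * rho * v_inlet**2)."""
    q_inlet = 0.5 * rho * v_inlet ** 2  # inlet dynamic pressure, Pa
    return (p_exit - p_inlet) / q_inlet

def ideal_cp(area_ratio):
    """Ideal (incompressible, loss-free) recovery for an area ratio AR:
    Cp_ideal = 1 - 1/AR**2, from continuity plus Bernoulli."""
    return 1.0 - 1.0 / area_ratio ** 2

# Illustrative values chosen to reproduce the reported Cp of 0.48:
cp = pressure_recovery_coefficient(p_exit=100_400.0, p_inlet=98_000.0,
                                   rho=1.0, v_inlet=100.0)
```

Under this incompressible, loss-free idealization, an area ratio of 2 gives an ideal Cp of 0.75, so the measured 0.48 would correspond to a diffuser effectiveness of roughly 64 percent.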

This "Integrated Pest Management Toolkit for Early Care and Education Programs" presents practical information about using integrated pest management (IPM) to prevent and manage pest problems in early care and education programs. This curriculum will help people in early care and education programs learn how to keep pests out of early…

Integrative medicine (IM) refers to the combination of conventional and "complementary" medical services (e.g., chiropractic, acupuncture, massage, mindfulness training). More than half of all medical schools in the United States and Canada have programs in IM, and more than 30 academic health centers currently deliver multidisciplinary IM care. What remains unclear, however, is the ideal delivery model (or models) whereby individuals can responsibly access IM care safely, effectively, and reproducibly in a coordinated and cost-effective way. Current models of IM across existing clinical centers vary tremendously in their organizational settings, principal clinical focus, and services provided; practitioner team composition and training; incorporation of research activities and educational programs; and administrative organization (e.g., reporting structure, use of medical records, scope of clinical practice) and financial strategies (i.e., specific business plans and models for sustainability). In this article, the authors address these important strategic issues by sharing lessons learned from the design and implementation of an IM facility within an academic teaching hospital, the Brigham and Women's Hospital at Harvard Medical School; and review alternative options based on information about IM centers across the United States. The authors conclude that there is currently no consensus as to how integrative care models should be optimally organized, implemented, replicated, assessed, and funded. The time may be right for prospective research in "best practices" across emerging models of IM care nationally in an effort to standardize, refine, and replicate them in preparation for rigorous cost-effectiveness evaluations.

Online coupled meteorology–atmospheric chemistry models have greatly evolved in recent years. Although mainly developed by the air quality modeling community, these integrated models are also of interest for numerical weather prediction and climate modeling, as they can con...

The study critically explored how a PASCO-designed technology (the SPARK Science Learning System) is meaningfully integrated into the teaching of selected topics in Earth and Environmental Science. It focuses on modelling the effectiveness of using the SPARK Science Learning System as a primary tool in learning science that leads to learning and achievement of the students. Gathered data and observations, together with the correlation between the technology's ability to develop high intrinsic motivation and student achievement, were used to design a framework for meaningfully integrating the SPARK Science Learning System into the teaching of Earth and Environmental Science. Research instruments used in this study were adopted from standardized questionnaires available in the literature. An achievement test and an evaluation form were developed and validated for the purpose of deducing the data needed for the study. Interviews were done to delve into the deeper thoughts and emotions of the respondents. Data from the interviews served to validate all numerical data culled from this study. Cross-case analysis of the data was done to reveal recurring themes, problems, and benefits derived by the students in using the SPARK Science Learning System, to further establish its effectiveness in the curriculum as a forerunner to the shift towards 21st Century Learning.

Schizophrenia remains a major burden [1]. The dopamine (DA) and neurodevelopmental hypotheses attempt to explain the pathogenic mechanisms and origins of the disorder, respectively [2-4]. Recently an alternative, the cognitive model, has gained popularity [5]. However, the first two theories have not been satisfactorily integrated, and the most influential iteration of the cognitive model makes no mention of DA, neurodevelopment, or indeed the brain [5]. Here we show that developmental alterations secondary to variant genes, early hazards to the brain and childhood adversity, sensitise the DA system, and result in excessive presynaptic DA synthesis and DA release. Social adversity biases the cognitive schema that the individual uses to interpret experiences towards paranoid interpretations. Subsequent stress results in dysregulated DA release, causing the misattribution of salience to stimuli, which are then misinterpreted by the biased cognitive processes. The resulting paranoia and hallucinations in turn cause further stress, and eventually repeated DA dysregulation hard-wires the psychotic beliefs. Finally, we consider the implications of this model for understanding and treating schizophrenia. PMID:24315522

This paper proposes a hybrid programming framework for modeling and solving of constraint satisfaction problems (CSPs) and constraint optimization problems (COPs). Two paradigms, CLP (constraint logic programming) and MP (mathematical programming), are integrated in the framework. The integration is supplemented with the original method of problem transformation, used in the framework as a presolving method. The transformation substantially reduces the feasible solution space. The framework a...

Images of Earth from space popularized the view of our planet as a single, fragile entity against the vastness and darkness of space. In the 1980s, the International Geosphere-Biosphere Program (IGBP) was set up to produce a predictive understanding of this fragile entity as the ‘Earth System.’ In order to do so, the program sought to create a common research framework for the different disciplines involved. It suggested that integrated numerical models could provide such a framework. The pap...

EPA’s Sustainable and Healthy Communities Research Program (SHC) is conducting transdisciplinary research to inform and empower decision-makers. EPA tools and approaches are being developed to enable communities to effectively weigh and integrate human health, socioeconomic, environmental, and ecological factors into their decisions to promote community sustainability. To help achieve this goal, EPA researchers have developed systems approaches to account for the linkages among resources, assets, and outcomes managed by a community. System dynamics (SD) is a member of the family of systems approaches and provides a framework for dynamic modeling that can assist with assessing and understanding complex issues across multiple dimensions. To test the utility of such tools when applied to a real-world situation, the EPA has developed a prototype SD model for community sustainability using the proposed Durham-Orange Light Rail Project (D-O LRP) as a case study. The EPA D-O LRP SD modeling team chose the proposed D-O LRP to demonstrate that an integrated modeling approach could represent the multitude of related cross-sectoral decisions that would be made and the cascading impacts that could result from a light rail transit system connecting Durham and Chapel Hill, NC. In keeping with the SHC vision described above, the proposal for the light rail is a starting point solution for the more intractable problems of population growth, unsustainable land use, environmenta

Integrating science into resource management activities is a goal of the CALFED Bay-Delta Program, a multi-agency effort to address water supply reliability, ecological condition, drinking water quality, and levees in the Sacramento-San Joaquin Delta of northern California. Under CALFED, many different strategies were used to integrate science, including interaction between the research and management communities, public dialogues about scientific work, and peer review. This paper explores ways science was (and was not) integrated into CALFED's management actions and decision systems through three narratives describing different patterns of scientific integration and application in CALFED. Though a collaborative process and certain organizational conditions may be necessary for developing new understandings of the system of interest, we find that those factors are not sufficient for translating that knowledge into management actions and decision systems. We suggest that the application of knowledge may be facilitated or hindered by (1) differences in the objectives, approaches, and cultures of scientists operating in the research community and those operating in the management community and (2) other factors external to the collaborative process and organization.

Mathematical equations are fundamental to modeling biological networks, but as networks get large and revisions frequent, it becomes difficult to manage equations directly or to combine previously developed models. Multiple simultaneous efforts to create graphical standards, rule-based languages, and integrated software workbenches aim to simplify biological modeling but none fully meets the need for transparent, extensible, and reusable models. In this paper we describe PySB, an approach in which models are not only created using programs, they are programs. PySB draws on programmatic modeling concepts from little b and ProMot, the rule-based languages BioNetGen and Kappa and the growing library of Python numerical tools. Central to PySB is a library of macros encoding familiar biochemical actions such as binding, catalysis, and polymerization, making it possible to use a high-level, action-oriented vocabulary to construct detailed models. As Python programs, PySB models leverage tools and practices from the open-source software community, substantially advancing our ability to distribute and manage the work of testing biochemical hypotheses. We illustrate these ideas using new and previously published models of apoptosis.
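PySB's central idea, models that are not only created using programs but are programs, can be illustrated in plain Python. The sketch below is a conceptual illustration of macro-based model construction, not PySB's actual API; the species names and rate constants are hypothetical:

```python
# Conceptual sketch (not PySB's actual API): high-level "macros" expand
# into elementary reactions and compose like ordinary functions, so a
# model is an executable program rather than a static equation list.

def bind(a, b, kf, kr):
    """Macro: reversible binding A + B <-> A:B expands to two reactions."""
    return [(f"{a} + {b} -> {a}:{b}", kf),
            (f"{a}:{b} -> {a} + {b}", kr)]

def catalyze(enzyme, substrate, product, kf, kr, kc):
    """Macro: enzymatic catalysis, built by reusing the bind macro and
    appending the product-forming step."""
    rxns = bind(enzyme, substrate, kf, kr)
    rxns.append((f"{enzyme}:{substrate} -> {enzyme} + {product}", kc))
    return rxns

# Composing macros is just function calls (hypothetical apoptosis step:
# caspase-8 cleaving Bid to tBid):
model = catalyze("C8", "Bid", "tBid", kf=1e-6, kr=1e-3, kc=1.0)
```

Because macros are functions, they can be tested, versioned, and shared like any other code, which is the property the paper highlights for distributing and managing hypothesis testing.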

The MULTEQ computer program has become an industry-wide tool which can be used to calculate the chemical composition in a flow occluded region as the solution within concentrates due to a local boiling process. These results can be used to assess corrosion concerns in plant equipment such as steam generators. Corrosion modeling attempts to quantify corrosion assessments by accounting for the mass transport processes involved in the corrosion mechanism. MULTEQ has played an ever increasing role in defining the local chemistry for such corrosion models. This paper will outline how the integration of corrosion modeling with the analysis of corrosion films and deposits can lead to the development of a useful modeling tool, wherein MULTEQ is interactively linked to a diffusion and migration transport process. This would provide a capability to make detailed inferences of the local crack chemistry based on the analyses of the local corrosion films and deposits inside a crack and thus provide guidance for chemical fixes to avoid cracking. This methodology is demonstrated for a simple example of a cracked tube. This application points out the utility of coupling MULTEQ with a mass transport process and the feasibility of an option in a future version of MULTEQ that would permit relating film and deposit analyses to the local chemical environment. This would increase the amount of information obtained from removed tube analyses and laboratory testing that can contribute to an overall program for mitigating tubing and crevice corrosion.

Tighter discharge permits often require wastewater treatment plants to maximize utilization of available facilities in order to cost-effectively reach these goals. Important aspects are minimizing internal disturbances and using available information in a smart way to improve plant performance. In this study, flow control throughout a large highly automated wastewater treatment plant (WWTP) was implemented in order to reduce internal disturbances and to provide a firm foundation for more advanced process control. A modular flow control system was constructed based on existing instrumentation and soft sensor flow models. Modules were constructed for every unit process in water treatment and integrated into a plant-wide model. The flow control system is used to automatically control recirculation flows and bypass flows at the plant. The system was also successful in making accurate flow estimations at points in the plant where it is not possible to have conventional flow meter instrumentation. The system provides fault detection for physical flow measuring devices. The module construction allows easy adaptation for new unit processes added to the treatment plant.
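The soft-sensor idea described above can be sketched as a steady-state mass balance: any one unmeasured flow around a unit process is recoverable from the measured flows, and the same balance yields a fault check for physical meters. The process names, flow values, and tolerance below are hypothetical, not taken from the plant in the study:

```python
def estimate_unmeasured_flow(inflows, outflows):
    """Soft-sensor estimate of a single unmeasured flow around a unit
    process, from the steady-state mass balance
    sum(inflows) = sum(outflows) + unmeasured.  Flows in m3/h."""
    return sum(inflows) - sum(outflows)

def flow_fault(measured, estimated, tolerance=0.10):
    """Fault detection: flag a physical flow meter whose reading deviates
    from the soft-sensor estimate by more than the relative tolerance."""
    return abs(measured - estimated) > tolerance * max(estimated, 1e-9)

# Hypothetical primary-settler module: influent plus recirculation in,
# settled effluent out; the sludge draw-off flow is unmetered.
sludge_flow = estimate_unmeasured_flow(inflows=[1200.0, 300.0],
                                       outflows=[1450.0])
```

Chaining such modules, each exporting its estimated flows to the next, is one plausible way the plant-wide model described above could be assembled.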

Integrated Environmental Modelling (IEM) is an invaluable tool for understanding the complex, dynamic ecosystems that house our natural resources and control our environments. Human behaviour affects the ways in which the science of IEM is assembled and used for meaningful societal applications. In particular, human biases and heuristics reflect adaptation and experiential learning to issues with frequent, sharply distinguished, feedbacks. Unfortunately, human behaviour is not adapted to the more diffusely experienced problems that IEM typically seeks to address. Twelve biases are identified that affect IEM (and science in general). These biases are supported by personal observations and by the findings of behavioural scientists. A process for critical analysis is proposed that addresses some human challenges of IEM and solicits explicit description of (1) represented processes and information, (2) unrepresented processes and information, and (3) accounting for, and cognizance of, potential human biases. Several other suggestions are also made that generally complement maintaining attitudes of watchful humility, open-mindedness, honesty and transparent accountability. These suggestions include (1) creating a new area of study in the behavioural biogeosciences, (2) using structured processes for engaging the modelling and stakeholder communities in IEM, and (3) using ‘red teams’ to increase resilience of IEM constructs and use.

The Yucca Mountain Integrating Model (YMIM), an integrated model of the Engineered Barrier System, has been developed to assist project managers at LLNL in identifying areas where research emphasis should be placed. The model was designed to be highly modular so that a model of an individual process could be easily modified or replaced without interfering with the models of other processes. The modules modelling container failure and the dissolution of nuclides include particularly detailed, temperature-dependent models of their corresponding processes.

A computer program, PharmK, was developed for pharmacokinetic modeling of experimental data. The program was written in the C language for the Macintosh operating system, using its high-level user interface. The intention was to provide a user-friendly tool for users of Macintosh computers. An interactive algorithm based on the exponential stripping method is used for the initial parameter estimation. Nonlinear pharmacokinetic model fitting is based on the maximum likelihood estimation method and is performed by the Levenberg-Marquardt method using the χ² criterion. Several methods are available to aid the evaluation of the fitting results. Pharmacokinetic data sets have been examined with the PharmK program, and the results are comparable with those obtained with other programs that are currently available for IBM PC-compatible and other types of computers.
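The exponential stripping method used for initial parameter estimation can be sketched for a two-exponential model C(t) = A·e^(−a·t) + B·e^(−b·t) with a > b: fit the terminal points log-linearly to get the slow phase, subtract it, then fit the residuals for the fast phase. This is a minimal illustration of the classic method, not PharmK's code:

```python
import math

def loglinear_fit(times, conc):
    """Least-squares fit of ln(C) = ln(coef) - rate*t; returns (coef, rate)."""
    n = len(times)
    xbar = sum(times) / n
    ybar = sum(math.log(c) for c in conc) / n
    sxy = sum((t - xbar) * (math.log(c) - ybar) for t, c in zip(times, conc))
    sxx = sum((t - xbar) ** 2 for t in times)
    slope = sxy / sxx
    return math.exp(ybar - slope * xbar), -slope

def strip_biexponential(times, conc, n_terminal=3):
    """Initial estimates for C(t) = A*exp(-a*t) + B*exp(-b*t), a > b."""
    # 1) Terminal phase: fit the last points, where the fast term has decayed.
    B, b = loglinear_fit(times[-n_terminal:], conc[-n_terminal:])
    # 2) Strip: subtract the slow exponential from the remaining early points.
    early_t = times[:-n_terminal]
    pairs = [(t, c - B * math.exp(-b * t)) for t, c in zip(early_t, conc)]
    pairs = [(t, r) for t, r in pairs if r > 0]  # guard against noise
    A, a = loglinear_fit([t for t, _ in pairs], [r for _, r in pairs])
    return A, a, B, b

# Synthetic data: C(t) = 10*exp(-t) + 2*exp(-0.1*t)
ts = [0.25, 0.5, 1.0, 2.0, 4.0, 8.0, 12.0, 16.0]
cs = [10 * math.exp(-t) + 2 * math.exp(-0.1 * t) for t in ts]
A, a, B, b = strip_biexponential(ts, cs)
```

The recovered estimates are close to, but not exactly, the generating parameters, which is why programs like PharmK use them only to seed the subsequent nonlinear (Levenberg-Marquardt) fit.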

This paper presents a model for developing an interdisciplinary principal preparation program, an MBA in Education Leadership, which integrates best practices in both education and business within an educational context. The paper addresses gaps that exist in many traditional principal preparation programs and provides an alternative model, which…

Purpose: The point of departure of this exploratory study is the gap between the increasing importance of business model innovation (BMI) in science and management and the limited conceptual assistance available. Therefore, the study identifies and explores scattered BMI insights and deduces them into an integrative framework to enhance our understanding about this phenomenon and to present a helpful guidance for researchers and practitioners. Design/Methodology/Approach: The study identifies BMI insights through a literature-based investigation and consolidates them into an integrative BMI framework that presents the key elements and dimensions of BMI as well as their presumed relationships. Findings: The study enhances our understanding about the key elements and dimensions of BMI, presents further conceptual insights into the BMI phenomenon, supplies implications for science and management, and may serve as a helpful guidance for future research. Practical Implications: The presented framework provides managers with a tool to identify critical BMI issues and can serve as a conceptual BMI guideline. Research limitations: Given the vast amount of academic journals, it is unlikely that every applicable scientific publication is included in the analysis. The illustrative examples are descriptive in nature, and thus do not provide empirical validity. Several implications for future research are provided. Originality/Value: The study's main contribution lies in the unifying approach of the dispersed BMI knowledge. Since our understanding of BMI is still limited, this study should provide the necessary insights and conceptual assistance to further develop the concept and guide its practical application.

An integrated business structure is presented as a complementary pool of its participants' skills. A methodical approach to modeling the life cycle of an integrated business structure is proposed, and recommendations for correlating the life-cycle stages of the participating enterprises are submitted.

This paper analyzes the integration of two combinatorial problems that frequently arise in production and distribution systems. One is the Bin Packing Problem (BPP), which involves finding an ordering of some objects of different volumes to be packed into the minimal number of containers of the same or different size. An optimal solution to this NP-Hard problem can be approximated by means of meta-heuristic methods. On the other hand, we consider the Capacitated Vehicle Routing Problem with Time Windows (CVRPTW), which is a variant of the Travelling Salesman Problem (again an NP-Hard problem) with extra constraints. Here we model these two problems in a single framework and use an evolutionary meta-heuristic to solve them jointly. Furthermore, we use data from a real-world company as a test-bed for the method introduced here.
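The paper solves the integrated problem with an evolutionary meta-heuristic; as a point of reference, the classic first-fit decreasing (FFD) constructive heuristic for the BPP, often used to seed such meta-heuristics, can be sketched as follows (the item volumes are illustrative):

```python
def first_fit_decreasing(volumes, capacity):
    """Greedy FFD heuristic for the Bin Packing Problem: place each item,
    largest first, into the first open bin with enough remaining capacity,
    opening a new bin only when none fits.  FFD is not optimal but is
    guaranteed to use at most 11/9 * OPT + 1 bins."""
    bins = []  # each bin is a list of packed item volumes
    for v in sorted(volumes, reverse=True):
        for b in bins:
            if sum(b) + v <= capacity:
                b.append(v)
                break
        else:
            bins.append([v])  # no existing bin fits: open a new one
    return bins

packing = first_fit_decreasing([5, 7, 5, 2, 4, 2, 5], capacity=10)
```

In an evolutionary setting, a chromosome typically encodes an item ordering and a decoder such as first-fit evaluates it, so the heuristic above doubles as the fitness-evaluation step.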

The Mixed Waste Integrated Program (MWIP) is responding to the need for DOE mixed waste treatment technologies that meet these dual regulatory requirements. MWIP is developing emerging and innovative treatment technologies to determine process feasibility. Technology demonstrations will be used to determine whether processes are superior to existing technologies in reducing risk, minimizing life-cycle cost, and improving process performance. Technology development is ongoing in technical areas required to process mixed waste: materials handling, chemical/physical treatment, waste destruction, off-gas treatment, final forms, and process monitoring/control. MWIP is currently developing a suite of technologies to process heterogeneous waste. One robust process is the fixed-hearth plasma-arc process that is being developed to treat a wide variety of contaminated materials with minimal characterization. Additional processes encompass steam reforming, including treatment of waste under the debris rule. Advanced off-gas systems are also being developed. Vitrification technologies are being demonstrated for the treatment of homogeneous wastes such as incinerator ash and sludge. An alternative to conventional evaporation for liquid removal--freeze crystallization--is being investigated. Since mercury is present in numerous waste streams, mercury removal technologies are being developed.

This SpringerBrief explores the internal workings of service systems. The authors propose a lightweight semantic model for an effective representation to capture the essence of service systems. Key topics include modeling frameworks, service descriptions and linked data, creating service instances, tool support, and applications in enterprises. Previous books on service system modeling and various streams of scientific developments used an external perspective to describe how systems can be integrated. This brief introduces the concept of white-box service system modeling as an approach to mo

In his study, Mahdi Derakhshanmanesh builds on the state of the art in modeling by proposing to integrate models into running software at the component level without translating them to code. Such so-called model-integrating software exploits all the advantages of models: models implicitly support a good separation of concerns, they are self-documenting and thus improve understandability and maintainability, and, in contrast to model-driven approaches, there is no longer a synchronization problem between the models and the code generated from them. Using model-integrating components, software will be

Background: The investigation of gene regulatory networks is an important issue in molecular systems biology, and significant progress has been made by combining different types of biological data. The purpose of this study was to characterize the transcriptional program induced by etanercept therapy in patients with rheumatoid arthritis (RA). Etanercept is known to reduce disease symptoms and progression in RA, but the underlying molecular mechanisms have not been fully elucidated. Results: Using a DNA microarray dataset providing genome-wide expression profiles of 19 RA patients within the first week of therapy, we identified significant transcriptional changes in 83 genes. Most of these genes are known to control the human body's immune response. A novel algorithm called TILAR was then applied to construct a linear network model of the genes' regulatory interactions. The inference method derives a model from the data based on Least Angle Regression while incorporating DNA-binding site information. As a result we obtained a scale-free network that exhibits a self-regulating and highly parallel architecture, and reflects the pleiotropic immunological role of the therapeutic target TNF-alpha. Moreover, we could show that our integrative modeling strategy performs much better than algorithms using gene expression data alone. Conclusion: We present TILAR, a method to deduce gene regulatory interactions from gene expression data by integrating information on transcription factor binding sites. The inferred network uncovers gene regulatory effects in response to etanercept and thus provides useful hypotheses about the drug's mechanisms of action.
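The core of such a linear network model, regressing each gene's expression change on the activities of only those transcription factors (TFs) with binding sites in its promoter, can be sketched as follows. This is an illustration of the constraint idea, not the TILAR implementation, and the TF names and values are synthetic:

```python
# Sketch: binding-site information restricts which TFs may regulate a
# gene, and an ordinary least-squares fit over only those TFs estimates
# the edge weights of the linear network (here two TFs, via the 2x2
# normal equations, so no external solver is needed).

def fit_gene(tf_activity, expression, allowed_tfs):
    """OLS weights for y = w1*t1 + w2*t2 over the two allowed TFs."""
    t1, t2 = (tf_activity[tf] for tf in allowed_tfs)
    a11 = sum(x * x for x in t1)
    a12 = sum(x * y for x, y in zip(t1, t2))
    a22 = sum(x * x for x in t2)
    b1 = sum(x * y for x, y in zip(t1, expression))
    b2 = sum(x * y for x, y in zip(t2, expression))
    det = a11 * a22 - a12 * a12
    w1 = (b1 * a22 - b2 * a12) / det
    w2 = (a11 * b2 - a12 * b1) / det
    return dict(zip(allowed_tfs, (w1, w2)))

# Synthetic example: a gene whose expression change across four samples
# is exactly 2x the NFKB1 activity minus 1x the JUN activity.
acts = {"NFKB1": [1.0, 0.0, 1.0, 2.0], "JUN": [0.0, 1.0, 1.0, 0.5]}
expr = [2.0, -1.0, 1.0, 3.5]
weights = fit_gene(acts, expr, ("NFKB1", "JUN"))
```

TILAR itself uses Least Angle Regression, which additionally selects a sparse subset of edges rather than fitting all allowed ones; the binding-site restriction shown here is the part that makes the inferred network biologically constrained.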

Here, we take a historical approach to our presentation of self-scheduled task parallelism, a programming model with its origins in early irregular and nondeterministic computations encountered in automated theorem proving and logic programming. We show how an extremely simple task model has evolved into a system, asynchronous dynamic load balancing (ADLB), and a scalable implementation capable of supporting sophisticated applications on today’s (and tomorrow’s) largest supercomputers; and we illustrate the use of ADLB with a Green’s function Monte Carlo application, a modern, mature nuclear physics code in production use. Our lesson is that by surrendering a certain amount of generality and thus applicability, a minimal programming model (in terms of its basic concepts and the size of its application programmer interface) can achieve extreme scalability without introducing complexity.
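The self-scheduled task model can be sketched with a shared work pool from which idle workers pull the next available task, so load balances dynamically even when task runtimes vary wildly. The sketch below mirrors the spirit of ADLB, not its actual MPI-based API:

```python
# Minimal self-scheduled task pool: no task is pre-assigned to a worker;
# each worker grabs the next task when it becomes idle.
import queue
import threading

def run_self_scheduled(tasks, n_workers=4):
    work = queue.Queue()
    for t in tasks:
        work.put(t)
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            try:
                task = work.get_nowait()  # self-scheduling: pull next task
            except queue.Empty:
                return  # pool drained: worker terminates
            r = task()  # irregular work: runtimes may differ arbitrarily
            with lock:
                results.append(r)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results

squares = run_self_scheduled([lambda i=i: i * i for i in range(10)])
```

ADLB generalizes this picture across distributed memory, with dedicated server processes brokering puts and gets of typed work units among MPI ranks; the programming interface remains essentially put-task/get-task, which is the minimality the abstract emphasizes.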

The Integrated Dynamic Landscape Analysis and Modeling System (IDLAMS) is a prototype, integrated land management technology developed through a joint effort between Argonne National Laboratory (ANL) and the US Army Corps of Engineers Construction Engineering Research Laboratories (USACERL). Dr. Ronald C. Sundell, Ms. Pamela J. Sydelko, and Ms. Kimberly A. Majerus were the principal investigators (PIs) for this project. Dr. Zhian Li was the primary software developer. Dr. Jeffrey M. Keisler, Mr. Christopher M. Klaus, and Mr. Michael C. Vogt developed the decision analysis component of this project. It was developed with funding support from the Strategic Environmental Research and Development Program (SERDP), a land/environmental stewardship research program with participation from the US Department of Defense (DoD), the US Department of Energy (DOE), and the US Environmental Protection Agency (EPA). IDLAMS predicts land conditions (e.g., vegetation, wildlife habitats, and erosion status) by simulating changes in military land ecosystems for given training intensities and land management practices. It can be used by military land managers to help predict the future ecological condition for a given land use based on land management scenarios of various levels of training intensity. It also can be used as a tool to help land managers compare different land management practices and further determine a set of land management activities and prescriptions that best suit the needs of a specific military installation.

Effective Space Asset Management is one key to addressing the ever-growing issue of space congestion. It is imperative that agencies around the world have access to data regarding the numerous active assets and pieces of space junk currently tracked in orbit around the Earth. At the center of this issue is the effective management of data of many types related to orbiting objects. As the population of tracked objects grows, so too should the data management structure used to catalog technical specifications, orbital information, and metadata related to those populations. Marshall Space Flight Center's Space Asset Management Database (SAM-D) was implemented in order to effectively catalog a broad set of data related to known objects in space by ingesting information from a variety of databases and processing that data into useful technical information. Using the universal NORAD number as a unique identifier, SAM-D processes two-line element data into orbital characteristics and cross-references this technical data with metadata related to functional status, country of ownership, and application category. SAM-D began as an Excel spreadsheet and was later upgraded to an Access database. While SAM-D performs its task very well, it is limited by its current platform and is not available outside of the local user base. Further, while modeling and simulation can be powerful tools to exploit the information contained in SAM-D, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. This paper provides a summary of SAM-D development efforts to date and outlines a proposed data management infrastructure that extends SAM-D to support the larger data sets to be generated. A service-oriented architecture model using an information sharing platform named SIMON will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and user interface for
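As a sketch of the kind of processing described, turning two-line element (TLE) data into orbital characteristics, the mean motion field of TLE line 2 yields the orbital period and semi-major axis via Kepler's third law. The field columns follow the standard TLE format; the element values below are illustrative, not taken from SAM-D:

```python
import math

MU_EARTH = 398600.4418  # km^3/s^2, Earth's standard gravitational parameter

def orbit_from_mean_motion(revs_per_day):
    """Return (semi-major axis [km], period [min]) for a TLE mean motion."""
    n = revs_per_day * 2.0 * math.pi / 86400.0   # mean motion in rad/s
    a = (MU_EARTH / n ** 2) ** (1.0 / 3.0)       # Kepler's third law
    period_min = 2.0 * math.pi / n / 60.0
    return a, period_min

# Mean motion occupies columns 53-63 (1-indexed) of TLE line 2; the line
# below uses illustrative ISS-like elements (NORAD 25544):
line2 = "2 25544  51.6416 247.4627 0006703 130.5360 325.0288 15.49560538"
mean_motion = float(line2[52:63])
a_km, period = orbit_from_mean_motion(mean_motion)
```

Cross-referencing the NORAD number parsed from the same line against metadata tables (functional status, ownership, application category) is then an ordinary database join, which is essentially the pipeline the abstract attributes to SAM-D.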

A multidisciplinary, finite element-based, highly graphics-oriented, linear and nonlinear analysis capability that includes such disciplines as structures, heat transfer, linear aerodynamics, computational fluid dynamics, and controls engineering has been achieved by integrating several new modules in the original STARS (STructural Analysis RoutineS) computer program. Each individual analysis module is general-purpose in nature and is effectively integrated to yield aeroelastic and aeroservoelastic solutions of complex engineering problems. Examples of advanced NASA Dryden Flight Research Center projects analyzed by the code in recent years include the X-29A, F-18 High Alpha Research Vehicle/Thrust Vectoring Control System, B-52/Pegasus Generic Hypersonics, National AeroSpace Plane (NASP), SR-71/Hypersonic Launch Vehicle, and High Speed Civil Transport (HSCT) projects. Extensive graphics capabilities exist for convenient model development and postprocessing of analysis results. The program is written in modular form in standard FORTRAN language to run on a variety of computers, such as the IBM RISC/6000, SGI, DEC, Cray, and personal computer; associated graphics codes use OpenGL and IBM/graPHIGS language for color depiction. This program is available from COSMIC, the NASA agency for distribution of computer programs.

Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics of abstract system descriptions. Usually, in mechatronic systems, design proceeds by iterating model construction, model analysis, and model transformation. In constructing a MATLAB/Simulink model, plant and controller behavior is simulated using graphical blocks to represent mathematical and logical constructs and process flow; software code is then generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behavior as a means of computation, communication and constraints, using computational blocks and aggregates for both discrete and continuous behavior, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to meet the demands for more functionality, at ever lower prices, and under opposing constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of the Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has to be analyzed. One way of doing that is to integrate the model, via wrapper files, back into Simulink S-functions, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behavior, and the transformation of the software system into the S

Preparing an effective workforce in high technology is the goal of both academic and industry training, and has been the engine that drives innovation and product development in the United States for over a century. During the last 50 years, technician training has comprised a combination of two-year academic programs, internships and apprentice training, and extensive On-the-Job Training (OJT). Recently, and especially in Silicon Valley, technicians have four-year college degrees, as well as relevant hands-on training. Characterization in general, and microscopy in particular, is an essential tool in process development, manufacturing and QA/QC, and failure analysis. Training for a broad range of skills and practice is challenging, especially for community colleges. Workforce studies (SRI/Boeing) suggest that even four year colleges often do not provide the relevant training and experience in laboratory skills, especially design of experiments and analysis of data. Companies in high-tech further report difficulty in finding skilled labor, especially with industry specific experience. Foothill College, in partnership with UCSC, SJSU, and NASA-Ames, has developed a microscopy training program embedded in a research laboratory, itself a partnership between university and government, providing hands-on experience in advanced instrumentation, experimental design and problem solving, with real-world context from small business innovators, in an environment called `the collaboratory'. The program builds on AFM-SEM training at Foothill, and provides affordable training in FE-SEM and TEM through a cost recovery model. In addition to instrument and engineering training, the collaboratory also supports academic and personal growth through a multiplayer social network of students, faculty, researchers, and innovators.

There is a proliferation of medical devices across the globe for the diagnosis and therapy of diseases. Biomedical engineering (BME) plays a significant role in healthcare and advancing medical technologies, thus creating a substantial demand for biomedical engineers at undergraduate and graduate levels. There has been a surge in undergraduate programs due to increasing demands from the biomedical industries to cover many of their segments from bench to bedside. With the requirement of multidisciplinary training within an allottable duration, it is indeed a challenge to design a comprehensive standardized undergraduate BME program to suit the needs of educators across the globe. This paper's objective is to describe three major models of undergraduate BME programs and their curricular requirements, with relevant recommendations to be applicable in institutions of higher education located in varied resource settings. Model 1 is based on programs to be offered in large research-intensive universities with multiple focus areas. The focus areas depend on the institution's research expertise and training mission. Model 2 has basic segments similar to those of Model 1, but the focus areas are limited due to resource constraints. In this model, a co-op/internship in hospitals or medical companies is included which prepares the graduates for the work place. In Model 3, students are trained to earn an Associate Degree in the initial two years and they are trained for two more years to become BMEs or BME Technologists. This model is well suited for resource-poor countries. All three models must be designed to meet applicable accreditation requirements. The challenges in designing undergraduate BME programs include manpower, facility and funding resource requirements and time constraints. Each academic institution has to carefully analyze its short term and long term requirements. In conclusion, three models for BME programs are described based on large universities, colleges, and

The Integrated Tokamak Modeling Task Force (ITM-TF) was set up in 2004. Its main target is to coordinate the European fusion modeling effort and to provide a complete European modeling structure for the International Thermonuclear Experimental Reactor (ITER), with the highest degree of flexibility. For the accurate simulation of the processes in an active fusion reactor within the ITM-TF, numerous atomic, molecular, nuclear and surface related data are required. In this work we present total-, single- and multiple-ionization and charge exchange cross sections in close connection to the ITM-TF. Interpretation of these cross sections in multi-electron ion-atom collisions is a challenging task for theories. The main difficulty is caused by the many-body feature of the collision, involving the projectile, projectile electron(s), target nucleus, and target electron(s). The classical trajectory Monte Carlo (CTMC) method has been quite successful in dealing with the atomic processes in ion-atom collisions. One of the advantages of the CTMC method is that many-body interactions are exactly taken into account. Related CTMC simulations for various collision systems are presented. To highlight the efficiency of the method we present electron emission cross sections for collisions between dressed Al^q+ ions and a He target. The theory delivers separate spectra for electrons emitted from the target and the projectile. By summing these two components in the rest frame of the target we may make a comparison with available experimental data. For the collision system in question, a significant contribution from Fermi-shuttle ionization has to be expected in the spectra at energies higher than E = 0.5 m_e (nV)^2, where m_e is the mass of the electron, V the projectile velocity and n an integer greater than 1. We found enhanced electron yields compared to first order theory in this region of the CTMC spectra, which can be directly attributed to the contribution of Fermi-shuttle type multiple
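The quoted threshold is straightforward to evaluate. A sketch in atomic units (m_e = 1, energies converted to eV), with an arbitrary example velocity rather than any value from the work:

```python
# Fermi-shuttle energy scale E_n = 0.5 * m_e * (n V)^2 for electrons shuttled
# n times between projectile and target, in atomic units (m_e = 1), then
# converted to eV. The velocity v_au = 2.0 below is purely illustrative.
HARTREE_EV = 27.211386  # 1 hartree in eV

def shuttle_energy_ev(v_au, n):
    return 0.5 * (n * v_au) ** 2 * HARTREE_EV

for n in (1, 2, 3):  # shuttle enhancements are expected for n > 1
    print(n, round(shuttle_energy_ev(2.0, n), 2))
```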

The UK Government is formally committed to reducing carbon emissions and protecting and improving natural capital and the environment. However, actually delivering on these objectives requires an integrated approach to addressing two parallel challenges: de-carbonising future energy system pathways; and safeguarding natural capital to ensure the continued flow of ecosystem services. Although both emphasise benefiting from natural resources, efforts to connect natural capital and energy systems research have been limited, meaning opportunities to improve management of natural resources and meet society's energy needs could be missed. The ecosystem services paradigm provides a consistent conceptual framework that applies in multiple disciplines across the natural and economic sciences, and facilitates collaboration between them. At the forefront of the field, integrated ecosystem service-economy models have guided public- and private-sector decision making at all levels. Models vary in sophistication from simple spreadsheet tools to complex software packages integrating biophysical, GIS and economic models, and draw upon many fields, including ecology, hydrology, geography, systems theory, economics and the social sciences. They also differ in their ability to value changes in natural capital and ecosystem services at various spatial and temporal scales. Despite these differences, current models share a common feature: their treatment of energy systems is superficial at best. In contrast, energy systems research has no widely adopted, unifying conceptual framework that organises thinking about key system components and interactions. Instead, the literature is organised around modelling approaches, including life cycle analyses, econometric investigations, linear programming and computable general equilibrium models. However, some consistencies do emerge. First, analyses often contain a linear set of steps, from exploration to resource supply, fuel processing, conversion

This paper is the second of two that describe the Predictive Maintenance Program for rotating machinery at the Palo Verde Nuclear Generating Station. The Predictive Maintenance program has been enhanced through organizational changes and improved interdisciplinary usage of technology. This paper will discuss current program strategies that have improved the interaction between the Vibration and Lube Oil programs. The "Lube Oil" view of the combined program along with case studies will then be presented.

This research project studies the feasibility of developing and applying an integrated field simulator to simulate the production performance of an entire oil or gas field. It integrates the performance of the reservoir, the wells, the chokes, the gathering system, the surface processing facilities and whenever applicable, gas and water injection systems. The approach adopted for developing the integrated simulator is to couple existing commercial reservoir and process simulators using available linking technologies. The simulators are dynamically linked and customised into a single hybrid application that benefits from the concept of open software architecture. The integrated field simulator is linked to an optimisation routine developed based on the genetic algorithm search strategies. This enables optimisation of the system at field level, from the reservoir to the process. Modelling the wells and the gathering network is achieved by customising the process simulator. This study demonstrated that the integrated simulation improves current capabilities to simulate the performance of the entire field and optimise its design. This is achieved by evaluating design options including spread and layout of the wells and gathering system, processing alternatives, reservoir development schemes and production strategies. Effectiveness of the integrated simulator is demonstrated and tested through several field-level case studies that discuss and investigate technical problems relevant to offshore field development. The case studies cover topics such as process optimisation, optimum tie-in of satellite wells into existing process facilities, optimal well location and field layout assessment of a high pressure high temperature deepwater oil field. Case study results confirm the viability of the total field simulator by demonstrating that the field performance simulation and optimal design were obtained in an automated process with reasonable computation time. No significant
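A genetic-algorithm search of this kind can be sketched at toy scale. The objective function, decision variables, and all numbers below are invented stand-ins (not a reservoir or process model), but the selection/crossover/mutation loop is the standard pattern:

```python
import random
random.seed(1)

# Toy stand-in for "net value of a field design" as a function of two decision
# variables (number of wells, separator pressure). Purely illustrative.
def fitness(design):
    wells, pressure = design
    revenue = 10 * wells - 0.5 * wells ** 2    # diminishing returns per well
    cost = 0.1 * (pressure - 30.0) ** 2        # penalty for off-optimum pressure
    return revenue - cost

def ga(pop_size=30, gens=60, mut=0.3):
    """Elitist GA: keep the top half, refill by uniform crossover, mutate."""
    pop = [(random.randint(1, 20), random.uniform(10, 60)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [random.choice(pair) for pair in zip(a, b)]  # uniform crossover
            if random.random() < mut:
                child[0] = max(1, child[0] + random.choice([-1, 1]))
                child[1] += random.gauss(0, 2)
            children.append(tuple(child))
        pop = parents + children
    return max(pop, key=fitness)

best = ga()
print(best, round(fitness(best), 2))
```

For this toy objective the optimum is 10 wells at pressure 30 (fitness 50); the GA should land close to it.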

Constructing inclusive societies, leaving no one behind, is an ethical obligation. Developing inclusive educational programs allows ensuring equal opportunities in one of the most critical stages of development. The aim of this study is to describe the implementation of the School Integration Program (SIP) in its different dimensions and in different zones of Chile. A descriptive and cross-sectional study of the perception of SIP coordinators was performed in public and subsidized schools across the country through a web-based survey. A simple random convenience sampling of schools was performed, obtaining 1742 answers from educational establishments with the SIP. A higher level of implementation of the program was identified in areas related to interdisciplinary work, comprehensive training, and curricular and institutional aspects. On the other hand, deficiencies were identified in the implementation of accessibility, the development of reasonable adjustments, and the participation of the educational community. Likewise, there are differences between the zones of Chile, with the North zone having made the least progress. Although there are results in team work and institutional development, the development of objective conditions and participation is still a pending task in the implementation of the SIP.

This article examines theoretical frameworks and models that focus on the pedagogical aspects of online education. After a review of learning theory as applied to online education, a proposal for an integrated "Multimodal Model for Online Education" is provided based on pedagogical purpose. The model attempts to integrate the work of…

We propose a static and a dynamic approach to model biological signaling networks, and show how each can be used to answer relevant biological questions. For this, we use the two different mathematical tools of Propositional Logic and Integer Programming. The power of discrete mathematics for handling qualitative as well as quantitative data has so far not been exploited in molecular biology, which is mostly driven by experimental research, relying on first-order or statistical models. The arising logic statements and integer programs are analyzed and can be solved with standard software. For a restricted class of problems the logic models reduce to a polynomial-time solvable satisfiability algorithm. Additionally, a more dynamic model enables enumeration of possible time resolutions in poly-logarithmic time. Computational experiments are included.
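The propositional-logic side of such a static model can be illustrated in miniature: encode each node's Boolean rule and enumerate steady states, i.e., assignments consistent with every rule. The network wiring below is hypothetical, chosen only to show the mechanics:

```python
from itertools import product

# Toy signaling network (hypothetical wiring): receptor R activates kinase K;
# K activates transcription factor T unless inhibitor I is active; I is
# induced by T (negative feedback).
rules = {
    "R": lambda s: s["R"],   # external input: any held value is self-consistent
    "K": lambda s: s["R"],
    "I": lambda s: s["T"],
    "T": lambda s: s["K"] and not s["I"],
}

def fixed_points(rules):
    """Enumerate steady states: assignments where every node already equals
    the value its logic rule prescribes (a brute-force SAT-style check)."""
    nodes = sorted(rules)
    hits = []
    for bits in product([False, True], repeat=len(nodes)):
        state = dict(zip(nodes, bits))
        if all(rules[n](state) == state[n] for n in nodes):
            hits.append(state)
    return hits

for state in fixed_points(rules):
    print({n: int(v) for n, v in state.items()})
```

Here the negative feedback loop leaves no steady state with R active (the network oscillates), so only the all-off state survives; a real integer-programming formulation replaces the brute-force enumeration with 0-1 constraints solved by standard software.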

The Artificial Intelligence field of Integrating Multiple Learned Models (IMLM) explores ways to combine results from sets of trained programs. Aroclor Interpretation is an ill-conditioned problem in which trained programs must operate in scenarios outside their training ranges because it is intractable to train them completely. Consequently, they fail in ways related to the scenarios. We developed a general-purpose IMLM solution, the Combiner, and applied it to Aroclor Interpretation. The Combiner's first step, Scenario Identification (SI), learns rules from very sparse, synthetic training data consisting of results from a suite of trained programs called Methods. SI produces fuzzy belief weights for each scenario by approximately matching the rules. The Combiner's second step, Aroclor Presence Detection (AP), classifies each of three Aroclors as present or absent in a sample. The third step, Aroclor Quantification (AQ), produces quantitative values for the concentration of each Aroclor in a sample. AP and AQ use automatically learned empirical biases for each of the Methods in each scenario. Through fuzzy logic, AP and AQ combine scenario weights, automatically learned biases for each of the Methods in each scenario, and Methods' results to determine results for a sample.
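The fusion idea can be sketched as a weighted, bias-corrected consensus. All names and numbers below are illustrative assumptions, not the published Combiner algorithm:

```python
# Hedged sketch: combine per-Method estimates using fuzzy scenario weights
# and per-(scenario, Method) multiplicative biases learned from training data.
def combine(method_results, scenario_weights, biases):
    """Weighted, bias-corrected consensus across scenarios and Methods."""
    total_w = sum(scenario_weights.values())
    estimate = 0.0
    for scen, w in scenario_weights.items():
        corrected = [biases[(scen, m)] * v for m, v in method_results.items()]
        estimate += (w / total_w) * sum(corrected) / len(corrected)
    return estimate

# Two Methods report concentrations; scenario s1 is believed more likely.
value = combine({"A": 10.0, "B": 14.0},
                {"s1": 0.7, "s2": 0.3},
                {("s1", "A"): 1.0, ("s1", "B"): 0.9,
                 ("s2", "A"): 1.2, ("s2", "B"): 1.0})
print(value)  # 0.7 * mean(10.0, 12.6) + 0.3 * mean(12.0, 14.0) = 11.81
```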

The paper gives a description of mathematical models and computer programs for analysing possible strategies for spent fuel management, with emphasis on economic analysis. The computer programs developed, describe the material flows, facility construction schedules, capital investment schedules and operating costs for the facilities used in managing the spent fuel. The computer programs use a combination of simulation and optimization procedures for the economic analyses. Many of the fuel cycle steps (such as spent fuel discharges, storage at the reactor, and transport to the RFCC) are described in physical and economic terms through simulation modeling, while others (such as reprocessing plant size and commissioning schedules, interim storage facility commissioning schedules etc.) are subjected to economic optimization procedures to determine the approximate lowest-cost plans from among the available feasible alternatives
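The simulation-plus-optimization idea can be sketched at toy scale: simulate the discounted material flow and costs for each candidate reprocessing-plant size, then pick the cheapest. Every cost, rate, and quantity below is invented for illustration, not from the paper:

```python
def discounted_cost(discharge_t_per_yr, plant_size_t, process_cost_per_t,
                    capital, storage_cost_per_t_yr=0.5, rate=0.05, years=30):
    """Discounted cost of one spent-fuel strategy: capital outlay at t = 0
    plus discounted reprocessing and interim-storage charges (toy numbers)."""
    cost = capital
    backlog = 0.0  # spent fuel awaiting reprocessing, tonnes
    for t in range(1, years + 1):
        backlog += discharge_t_per_yr
        throughput = min(backlog, plant_size_t)
        backlog -= throughput
        cost += (throughput * process_cost_per_t
                 + backlog * storage_cost_per_t_yr) / (1 + rate) ** t
    return cost

# Compare candidate plant sizes for a 100 t/yr discharge stream; capital is
# assumed to scale linearly with plant size (illustrative).
candidates = {s: discounted_cost(100, s, 1.0, capital=3 * s)
              for s in (50, 100, 150, 200)}
best_size = min(candidates, key=candidates.get)
print(best_size, round(candidates[best_size], 1))
```

An undersized plant accumulates a costly storage backlog, while an oversized one wastes capital, so the minimum falls at the size matching the discharge rate.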

Due to the large number of chemical species and the three space dimensions, off-the-shelf stiff ODE integrators are not feasible for the numerical time integration of stiff systems of advection-diffusion-reaction equations ∂c/∂t + ∇·(u c) = ∇·(K ∇c) + R(c)
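One common remedy, shown here as an assumed illustration rather than the method of the work, is operator splitting: treat advection and diffusion explicitly on the grid, and integrate the stiff reaction term separately so it imposes no step-size restriction. A one-step 1-D periodic sketch:

```python
import math

def step(u, dx, dt, a=1.0, D=0.01, k=1000.0):
    """One operator-split time step for du/dt + a du/dx = D d2u/dx2 + r(u)
    on a periodic 1-D grid, with the stiff linear decay r(u) = -k u
    integrated exactly (illustrative scheme, toy coefficients)."""
    n = len(u)
    # 1) advection: first-order upwind for a > 0 (u[i-1] wraps periodically
    #    via Python's negative indexing)
    u = [u[i] - a * dt / dx * (u[i] - u[i - 1]) for i in range(n)]
    # 2) diffusion: explicit central differences
    u = [u[i] + D * dt / dx ** 2 * (u[(i + 1) % n] - 2 * u[i] + u[i - 1])
         for i in range(n)]
    # 3) stiff reaction: exact pointwise integration, unconditionally stable
    decay = math.exp(-k * dt)
    return [v * decay for v in u]

# Advect and damp a square pulse; dt satisfies the advective and diffusive
# stability limits but would be far too large for explicit reaction stepping.
u = [1.0 if 20 <= i < 30 else 0.0 for i in range(50)]
for _ in range(20):
    u = step(u, dx=0.02, dt=0.005)
print(max(u))
```

With k = 1000 an explicit treatment of the reaction would need dt < 0.002; the split step runs stably at dt = 0.005 because the decay is applied exactly.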

The first phase of the development of MEASURE, an integrated data analysis and model identification facility, is described. The facility takes system activity data as input and produces as output representative behavioral models of the system in near real time. In addition, a wide range of statistical characteristics of the measured system are also available. The usage of the system is illustrated on data collected via software instrumentation of a network of SUN workstations at the University of Illinois. Initially, statistical clustering is used to identify high density regions of resource-usage in a given environment. The identified regions form the states for building a state-transition model to evaluate system and program performance in real time. The model is then solved to obtain useful parameters such as the response-time distribution and the mean waiting time in each state. A graphical interface which displays the identified models and their characteristics (with real time updates) was also developed. The results provide an understanding of the resource-usage in the system under various workload conditions. This work is targeted for a testbed of UNIX workstations with the initial phase ported to SUN workstations at the NASA Ames Research Center Advanced Automation Testbed.
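The clustering-then-state-transition pipeline can be sketched in miniature. The trace, the tiny 1-D k-means, and the geometric holding-time formula below are illustrative assumptions, not the MEASURE implementation:

```python
import random
random.seed(0)

def kmeans_1d(xs, k=2, iters=100):
    """Tiny 1-D k-means: turns raw utilization samples into discrete states."""
    centers = sorted(random.sample(sorted(set(xs)), k))  # distinct initial centers
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in xs:
            groups[min(range(k), key=lambda i: abs(x - centers[i]))].append(x)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return sorted(centers)

def label(x, centers):
    return min(range(len(centers)), key=lambda i: abs(x - centers[i]))

# Synthetic utilization trace: idle stretches punctuated by busy bursts
trace = [0.05, 0.07, 0.06, 0.80, 0.85, 0.90, 0.08, 0.04, 0.82, 0.88, 0.05, 0.06]
centers = kmeans_1d(trace)
states = [label(x, centers) for x in trace]

# Empirical state-transition matrix, then mean holding time per state
# (geometric: 1 / (1 - P[i][i]) sampling intervals).
k = len(centers)
counts = [[0] * k for _ in range(k)]
for a, b in zip(states, states[1:]):
    counts[a][b] += 1
P = [[c / sum(row) for c in row] if sum(row) else row for row in counts]
hold = [1.0 / (1.0 - P[i][i]) if P[i][i] < 1 else float("inf") for i in range(k)]
print(centers, P, hold)
```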

Methods of modeling complex power networks with short circuits in the networks are described. The methods are implemented in integrated computation programs for short circuit currents and equivalents in electrical networks with a large number of branch points (up to 1000) on a computer with limited online memory capacity (M = 4030).

This paper describes the importance of foreign languages and cultures and their integration into U.S. international business programs. The author juxtaposes globalization strategies of European and American business schools and highlights pre-university foreign language study in Europe and the U.S. The paper goes on to describe model U.S.…

The purpose of the study was to develop an MST Integrated Program for making a Maglev hands-on activity for higher elementary school students in Korea. In this MST Integrated Program, students will apply Mathematics, Science, and Technology principles and concepts to the design, construction, and evaluation of a magnetically levitated vehicle. The…

The results of scientific research aimed at developing theoretical and methodological mechanisms for building effective models of vertically integrated structures are presented. The presence of vertically integrated structures in natural-monopoly markets in the private and governmental sectors of the economy, and the priority directions of integration, are discussed.

The Integrated Human Futures Project provides a set of analytical and quantitative modeling and simulation tools that help explore the links among human social, economic, and ecological conditions, human resilience, conflict, and peace, and allows users to simulate tradeoffs and consequences associated with different future development and mitigation scenarios. In the current study, we integrate five distinct modeling platforms to simulate the potential risk of social unrest in Egypt resulting from the Grand Ethiopian Renaissance Dam (GERD) on the Blue Nile in Ethiopia. The five platforms simulate hydrology, agriculture, economy, human ecology, and human psychology/behavior, and show how impacts derived from development initiatives in one sector (e.g., hydrology) might ripple through to affect other sectors and how development and security concerns may be triggered across the region. This approach evaluates potential consequences, intended and unintended, associated with strategic policy actions that span the development-security nexus at the national, regional, and international levels. Model results are not intended to provide explicit predictions, but rather to provide system-level insight for policy makers into the dynamics among these interacting sectors, and to demonstrate an approach to evaluating short- and long-term policy trade-offs across different policy domains and stakeholders. The GERD project is critical to government-planned development efforts in Ethiopia but is expected to reduce downstream freshwater availability in the Nile Basin, fueling fears of negative social and economic impacts that could threaten stability and security in Egypt. We tested these hypotheses and came to the following preliminary conclusions. First, the GERD will have an important short-term impact on water availability, food production, and hydropower production in Egypt, depending on the short-term reservoir fill rate. Second, the GERD will have a very small impact on

Describes the bijural program of McGill University Faculty of Law. The program educates all first-degree law students in both the common law and civil law traditions, preparing them for the increasing globalization of legal practice. (EV)

The Korean nuclear community also recognizes the importance of outreach from its experience with rad waste and nuclear power programs. Accordingly, nationwide programs dealing with public information, support for local community development, and HRD are implemented continuously involving a number of organizations concerned. The Nuclear Training and Education Center (NTC) of the Korea Atomic Energy Research Institute (KAERI), with its unique function and capability as a national research organization, has needs for the enhancement of public acceptance for KAERI programs, a better contribution to the national effort, and addressing the emerging needs for international education/training on nuclear outreach. This paper presents an integrated education/training based nuclear outreach model with a set of reference program, which is developed for NTC. An integrated education/training based nuclear outreach model for NTC is developed addressing the increasing needs for public acceptance on the peaceful use of nuclear energy, in terms of supporting KAERI activities, contributing to the national nuclear outreach efforts, and promoting international education and training on nuclear outreach. The model, harmonized with the national nuclear outreach system, consists of objectives, target audiences, a set of reference program supported by infrastructure and networking, and an evaluation system. The program is further specified into sub-programs with detailed design for the respective audiences. The developed model with a reference program is characterized by its integrity in terms of encompassing the whole outreach process cycle, and setting up of a target audience based total program structure with existing and new sub-programs. Also, it intends to be sustainable by addressing future generations' needs as well as innovative in the program delivery. The model will be continuously upgraded and applied addressing respective needs of the audiences

A program was developed to integrate the wave equation through a plane stratified plasma with a general density distribution. The reflection and transmission of a plane wave are computed as a function of the angle of incidence. The polarization of the electric vector is assumed to be perpendicular to the plane of incidence. The model for absorption by classical inverse bremsstrahlung avoids the improper extrapolation of underdense formulae that are singular at the plasma critical surface. Surprisingly good agreement with the geometric-optics analysis of a linear layer was found. The system of ordinary differential equations is integrated by the variable-step, variable-order Adams method in the Lawrence Livermore Laboratory Gear package. Parametric studies of the absorption are summarized, and some possibilities for further development of the code are discussed. (auth)
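A minimal pure-Python analogue of such an integration (not the Gear/Adams production code, omitting the absorption model, and restricted to normal incidence) solves E'' + k0² ε(x) E = 0 through a linear density ramp ε(x) = 1 − x/L. For a lossless layer the computed reflectivity must be 1, which serves as a consistency check on the scheme:

```python
import math

def reflectivity(k0=20.0, L=1.0, x_start_factor=1.6, steps=4000):
    """Integrate E'' + k0^2 eps(x) E = 0 backwards from the overdense side of
    a linear ramp eps(x) = 1 - x/L and return |R|^2 at the vacuum boundary.
    Fixed-step RK4 stands in for the variable-step Adams integrator."""
    def eps(x):
        return 1.0 - x / L  # negative (overdense) beyond the critical surface

    # Start deep in the overdense region with the locally decaying solution
    x = x_start_factor * L
    kappa = k0 * math.sqrt(-eps(x))
    y = (1.0 + 0j, -kappa + 0j)         # (E, E') with E ~ exp(-kappa x)

    def rhs(x, y):
        E, dE = y
        return (dE, -k0 ** 2 * eps(x) * E)

    h = -x / steps                       # integrate backwards toward x = 0
    for _ in range(steps):
        k1 = rhs(x, y)
        k2 = rhs(x + h / 2, (y[0] + h / 2 * k1[0], y[1] + h / 2 * k1[1]))
        k3 = rhs(x + h / 2, (y[0] + h / 2 * k2[0], y[1] + h / 2 * k2[1]))
        k4 = rhs(x + h, (y[0] + h * k3[0], y[1] + h * k3[1]))
        y = (y[0] + h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
             y[1] + h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))
        x += h

    # Match E = A exp(ikx) + B exp(-ikx) in vacuum at x = 0
    E0, dE0 = y
    ik = 1j * k0
    A = 0.5 * (E0 + dE0 / ik)            # incident amplitude
    B = 0.5 * (E0 - dE0 / ik)            # reflected amplitude
    return abs(B / A) ** 2

r = reflectivity()
print(round(r, 6))  # 1.0: a lossless layer reflects all incident energy
```

Adding a collisional (inverse-bremsstrahlung) term would make ε complex and pull |R|² below 1, which is where the careful treatment near the critical surface matters.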

We showcase demonstrations of “program & compile” styled optical networking as well as open platforms & standards based NFV service provisioning using a proof-of-concept implementation of the Software-Programmed Networking Operating System (SPN OS).

The program assessment process combines assessments from individual courses to generate a final program assessment that matches accreditation benchmarks. In developing countries, the industrial environment is not diversified enough to allow graduating engineers to seek jobs in all disciplines or specializations of an engineering program. Hence, it seems necessary to evolve engineering program assessment toward the specialized requirements of the industry. This paper describes how specialization-specifi...

... bachelor's and master's degree programs, and 20 years for programs that lead to a doctoral or first...-risk and underserved populations of students; and limit the growth of, and innovation in, new programs... and to society in general, nor that they would represent a poor financial risk. Sen. Rep. No. 758...

Accurate integral cross-section reaction rates in representative spectra for the actinides are discussed in the OSMOSE program. The first step in obtaining better nuclear data consists of measuring accurate integral data and comparing them to integrated energy-dependent data: this comparison provides a direct assessment of the effect of deficiencies in the differential data. The OSMOSE program includes a complete analytical program associated with an experimental measurement program, and aims at understanding and resolving discrepancies between calculated and measured values. The measurements cover a wide range of neutron spectra, from over-moderated thermal spectra to fast spectra. (authors)

The essential features of the programme of the Anne Frank Haven are the complete integration of children from low SES and different cultural backgrounds with Kibbutz children; a holistic approach to education; and the involvement of the whole community in an "open" residential school. After 33 years, it is argued that the experiment has proved successful in absorbing city-born youth in the Kibbutz, enabling at-risk populations to reach significant academic achievements, and ensuring their continued participation in the dominant culture. The basic integration model consists of "layers" of concentric circles, in dynamic interaction. The innermost circle is the class, the learning community. The Kibbutz community and the foster parents form a supportive, enveloping circle, which enables students to become part of the outer community and to intervene in it. A kind of meta-environment, the inter-Kibbutz partnership and the Israeli educational system, influence the program through decision making and guidance. Some of the principles of the Haven — integration, community involvement, a year's induction for all new students, and open residential settings — could be useful for cultures and societies outside the Kibbutz. The real "secret" of success of an alternative educational program is the dedicated, motivated and highly trained staff.

The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The Graphical Evaluation Module (GEM) is a special application tool designed for evaluation of operational occurrences using the Accident Sequence Precursor (ASP) program methods. GEM provides the capability for an analyst to quickly and easily perform conditional core damage probability (CCDP) calculations. The analyst can then use the CCDP calculations to determine whether the occurrence of an initiating event or a condition adversely impacts safety. It uses models and data developed in SAPHIRE specifically for the ASP program. GEM requires more data than are normally provided in SAPHIRE and will not perform properly with other models or databases. This is the first release of GEM, and its developers welcome user comments and feedback that will generate ideas for improvements to future versions. GEM is designated version 5.0 so that the GEM codes can be tracked along with the other SAPHIRE codes, as GEM relies on the same shared database structure.
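
As a rough illustration of the CCDP idea described above (this is not SAPHIRE's or GEM's actual algorithm, and the sequence probabilities are invented), a conditional core damage probability given an observed initiating event can be combined from the conditional failure probabilities of the core-damage sequences, treated here as independent:

```python
# Illustrative sketch only: combine hypothetical conditional accident-sequence
# probabilities into a single conditional core damage probability (CCDP).

def ccdp(sequence_probs):
    """CCDP = 1 - prod(1 - p_i), assuming independent core-damage sequences."""
    survive = 1.0
    for p in sequence_probs:
        survive *= (1.0 - p)
    return 1.0 - survive

# Hypothetical conditional sequence probabilities given the initiating event:
sequences = [2.0e-4, 5.0e-5, 1.0e-5]
print(f"CCDP = {ccdp(sequences):.3e}")
```

An analyst would compare such a value against a screening threshold to decide whether the occurrence adversely impacts safety.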

Health service organizations and professionals are under increasing pressure to work together to deliver integrated patient care. A common understanding of integration strategies may facilitate the delivery of integrated care across inter-organizational and inter-professional boundaries. This paper aims to build a framework for exploring and potentially aligning multiple stakeholder perspectives of systems integration. The authors draw from the literature on shared mental models, strategic management and change, framing, stakeholder management, and systems theory to develop a new construct, Mental Models of Integrated Care (MMIC), which consists of three types of mental models, i.e. integration-task, system-role, and integration-belief. The MMIC construct encompasses many of the known barriers and enablers to integrating care while also providing a comprehensive, theory-based framework of psychological factors that may influence inter-organizational and inter-professional relations. While the existing literature on integration focuses on optimizing structures and processes, the MMIC construct emphasizes the convergence and divergence of stakeholders' knowledge and beliefs, and how these underlying cognitions influence interactions (or lack thereof) across the continuum of care. MMIC may help to: explain what differentiates effective from ineffective integration initiatives; determine system readiness to integrate; diagnose integration problems; and develop interventions for enhancing integrative processes and ultimately the delivery of integrated care. Global interest and ongoing challenges in integrating care underline the need for research on the mental models that characterize the behaviors of actors within health systems; the proposed framework offers a starting point for applying a cognitive perspective to health systems integration.

An integrated programming environment represents a robust approach to building a valid model for landfill site selection. One of the main challenges in the integrated model is the complicated processing and modelling due to the programming stages and several limitations. An automation process helps avoid the limitations and improve the interoperability between integrated programming environments. This work targets the automation of a spatial data-mining model for landfill site selection by integrating a spatial programming environment (Python-ArcGIS) with a non-spatial environment (MATLAB). The model was constructed using neural networks and is divided into nine stages distributed between MATLAB and Python-ArcGIS. A case study was taken from the northern part of Peninsular Malaysia. Twenty-two criteria were selected as input data and used to build the training and testing datasets. The outcomes show a high accuracy of 98.2% on the testing dataset using 10-fold cross validation. The automated spatial data-mining model provides a solid platform for decision makers to perform landfill site selection and planning operations on a regional scale.
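
The 10-fold cross-validation used to report the testing accuracy can be sketched as follows. This is a minimal, self-contained illustration: the toy dataset and the majority-class "model" are stand-ins for the paper's 22-criteria dataset and neural network.

```python
# Sketch of k-fold cross-validation (k=10): shuffle indices, partition into
# k folds, train on k-1 folds, score on the held-out fold, average.
import random

def k_fold_accuracy(X, y, k=10, seed=0):
    idx = list(range(len(X)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]   # k disjoint folds
    accs = []
    for i in range(k):
        test = set(folds[i])
        train_y = [y[j] for j in idx if j not in test]
        # Stand-in "model": predict the majority class seen in training.
        majority = max(set(train_y), key=train_y.count)
        correct = sum(1 for j in folds[i] if y[j] == majority)
        accs.append(correct / len(folds[i]))
    return sum(accs) / k

# Toy data: 80 suitable (1) and 20 unsuitable (0) candidate sites.
X = [[0.0]] * 100
y = [1] * 80 + [0] * 20
print(f"mean 10-fold accuracy: {k_fold_accuracy(X, y):.2f}")
```

With a real classifier in place of the majority-class stand-in, the same fold loop yields the reported cross-validated accuracy.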

SuperMC is a Computer-Aided-Design (CAD) based Monte Carlo (MC) program for integrated simulation of nuclear systems developed by the FDS Team (China), making use of a hybrid MC-deterministic method and advanced computer technologies. The design aim, architecture and main methodology of SuperMC are presented in this paper. The inclusion of multi-physics processes and the use of advanced computer technologies such as automatic geometry modeling, intelligent data analysis and visualization, high-performance parallel computing and cloud computing contribute to the efficiency of the code. SuperMC2.1, the latest version of the code for neutron, photon and coupled neutron-photon transport calculation, has been developed and validated by using a series of benchmarking cases such as the fusion reactor ITER model and the fast reactor BN-600 model.

PDDP, the parallel data distribution preprocessor, is a data-parallel programming model for distributed memory parallel computers. PDDP implements high-performance Fortran-compatible data distribution directives and parallelism expressed by the use of Fortran 90 array syntax, the FORALL statement, and the WHERE construct. Distributed data objects belong to a global name space; other data objects are treated as local and replicated on each processor. PDDP allows the user to program in a shared memory style and generates codes that are portable to a variety of parallel machines. For interprocessor communication, PDDP uses the fastest communication primitives on each platform.
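
PDDP itself targets Fortran 90, but the whole-array expressions and the WHERE construct it parallelizes have a close analogue in NumPy array syntax, shown here purely as an illustration of the programming style (this is not PDDP code):

```python
# NumPy analogue of the Fortran 90 array syntax PDDP parallelizes.
import numpy as np

a = np.arange(8, dtype=float)

# Fortran 90 whole-array expression:   b = 2.0 * a
b = 2.0 * a

# Fortran 90 masked assignment:
#   WHERE (a > 3.0)
#     c = a
#   ELSEWHERE
#     c = 0.0
#   END WHERE
c = np.where(a > 3.0, a, 0.0)

print(b.tolist())
print(c.tolist())
```

In PDDP the corresponding Fortran expressions would be distributed across processors according to the data distribution directives.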

This article details the evaluation of a clinical services program for teen mothers in the District of Columbia. The program's primary objectives are to prevent unintended subsequent pregnancy and to promote contraceptive utilization. We calculated contraceptive utilization at 6, 12, 18, and 24 months after delivery, as well as occurrence of subsequent pregnancy and birth. Nearly seven in ten (69.5%) teen mothers used contraception at 24 months after delivery, and 57.1% of contraceptive users elected long-acting reversible contraception. In the 24-month follow-up period, 19.3% experienced at least one subsequent pregnancy and 8.0% experienced a subsequent birth. These results suggest that an integrated clinical services model may contribute to sustained contraceptive use and may prove beneficial in preventing subsequent teen pregnancy and birth.

As an inherently interdisciplinary endeavor, quantitative reasoning (QR) risks falling through the cracks between the traditional "silos" of higher education. This article describes one strategy for developing a truly cross-campus QR initiative: leverage the existing structures of campus writing programs by placing QR in the context of argument. We first describe the integration of Carleton College's Quantitative Inquiry, Reasoning, and Knowledge initiative with the Writing Program. Based on our experience, we argue that such an approach leads to four benefits: it reflects important aspects of QR often overlooked by other approaches; it defuses the commonly raised objection that QR is merely remedial math; it sidesteps challenges of institutional culture (idiosyncratic campus history, ownership, and inertia); and it improves writing instruction. We then explore the implications of our approach for QR graduation standards. Our experience suggests that once we engaged faculty from across the curriculum in our work, it would have been difficult to adopt a narrowly defined requirement of skills-based courses. The article concludes by providing resources for those who would like to implement this approach at the course and institutional levels.

The article introduces a new methodology for measuring temporal influences (seasonal oscillations, temporal patterns) for behavioural scoring development purposes. The paper shows how significant temporal variables can be recognised and then integrated into behavioural scoring models in order to improve model performance. Behavioural scoring models are integral parts of the Basel II standard on Internal Ratings-Based Approaches (IRB). The IRB approach much more precisely reflects a bank's individual risk profile. A solution to the problem of how to analyze and integrate macroeconomic and microeconomic factors, represented as time series, into behavioural scorecard models is shown in the paper using the REF II model.
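
One simple way to turn a seasonal oscillation into a scorecard input, sketched below with invented data, is a seasonal index per calendar month (month mean divided by overall mean) that can then be attached as an extra characteristic in the behavioural model; the REF II transformation itself is not reproduced here.

```python
# Hedged sketch: derive monthly seasonal indices from a time series so a
# "seasonal risk" characteristic can be added to a behavioural scorecard.

def seasonal_indices(values, months):
    """values[i] observed in calendar month months[i] (1..12).
    Returns {month: month_mean / overall_mean}."""
    overall = sum(values) / len(values)
    by_month = {}
    for v, m in zip(values, months):
        by_month.setdefault(m, []).append(v)
    return {m: (sum(vs) / len(vs)) / overall for m, vs in by_month.items()}

# Hypothetical monthly delinquency rates over two years:
months = list(range(1, 13)) * 2
values = [0.02, 0.02, 0.03, 0.03, 0.04, 0.05,
          0.05, 0.04, 0.03, 0.03, 0.02, 0.02] * 2
idx = seasonal_indices(values, months)
# An index above 1 (e.g. June here) flags a seasonally high-risk month.
```

A significant index would justify including the month-of-observation variable in the scoring model.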

This article aims at developing an operational tool for integrated modelling of product assortments and engineering processes in companies making customer-specific products. Integrating a product model in the design of engineering processes will provide a deeper understanding of the engineering...... activities as well as insight into how product features affect the engineering processes. The article suggests possible ways of integrating models of products with models of engineering processes. The models have been tested and further developed in an action research study carried out in collaboration...... with a major international engineering company....

Drinking water treatment plants automation becomes more sophisticated, more on-line monitoring systems become available and integration of modeling environments with control systems becomes easier. This gives possibilities for model-based optimization. In operation of drinking water treatment

National Aeronautics and Space Administration — The Integrated Biosphere Simulator (or IBIS) is designed to be a comprehensive model of the terrestrial biosphere. The model represents a wide range of processes,...

This report summarizes work performed by Argonne National Laboratory on the Steam Generator Tube Integrity Program from the inception of the program in August 1995 through September 1996. The program is divided into five tasks: (1) assessment of inspection reliability, (2) research on ISI (in-service inspection) technology, (3) research on degradation modes and integrity, (4) tube removals from steam generators, and (5) program management. Under Task 1, progress is reported on the preparation of facilities and evaluation of nondestructive evaluation techniques for inspecting a mock-up steam generator for round-robin testing, the development of better ways to correlate failure pressure and leak rate with eddy current (EC) signals, the inspection of sleeved tubes, workshop and training activities, and the evaluation of emerging NDE technology. Results are reported in Task 2 on closed-form solutions and finite-element electromagnetic modeling of EC probe responses for various probe designs and flaw characteristics. In Task 3, facilities are being designed and built for the production of cracked tubes under aggressive and near-prototypical conditions and for the testing of flawed and unflawed tubes under normal operating, accident, and severe-accident conditions. Crack behavior and stability are also being modeled to provide guidance for test facility design, develop an improved understanding of the expected rupture behavior of tubes with circumferential cracks, and predict the behavior of flawed and unflawed tubes under severe accident conditions. Task 4 is concerned with the acquisition of tubes and tube sections from retired steam generators for use in the other research tasks. Progress on the acquisition of tubes from the Salem and McGuire 1 nuclear plants is reported.

This report summarizes work performed by Argonne National Laboratory on the Steam Generator Tube Integrity Program from the inception of that program in August 1995 through March 1996. The program is divided into five tasks, namely (1) Assessment of Inspection Reliability, (2) Research on ISI (in-service inspection) Technology, (3) Research on Degradation Modes and Integrity, (4) Development of Methodology and Technical Requirements for Current and Emerging Regulatory Issues, and (5) Program Management. Under Task 1, progress is reported on the preparation and evaluation of nondestructive evaluation (NDE) techniques for inspecting a mock-up steam generator for round-robin testing, the development of better ways to correlate burst pressure and leak rate with eddy current (EC) signals, the inspection of sleeved tubes, workshop and training activities, and the evaluation of emerging NDE technology. Under Task 2, results are reported on closed-form solutions and finite-element electromagnetic modeling of EC probe response for various probe designs and flaw characteristics. Under Task 3, facilities are being designed and built for the production of cracked tubes under aggressive and near-prototypical conditions and for the testing of flawed and unflawed tubes under normal operating, accident, and severe accident conditions. In addition, crack behavior and stability are being modeled to provide guidance on test facility design, to develop an improved understanding of the expected rupture behavior of tubes with circumferential cracks, and to predict the behavior of flawed and unflawed tubes under severe accident conditions. Task 4 is concerned with the cracking and failure of tubes that have been repaired by sleeving, and with a review of the literature on this subject.

By using Toda field theories we show that there are perturbations of direct products of conformal theories that lead to irreducible integrable field theories. The same affine Toda theory can be truncated to different quantum integrable models for different choices of the charge at infinity and the coupling. The classification of integrable models that can be obtained in this fashion follows the classification of symmetric spaces of type G/H with rank H = rank G. (orig.)

, communication and constraints, using computational blocks and aggregates for both discrete and continuous behaviour, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to encompass the demands to more functionality, at even lower prices, and with opposite...... to be analyzed. One way of doing that is to integrate in wrapper files the model back into Simulink S-functions, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set...... of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behaviour, and the transformation of the software system into the S-functions. The general aim of this work is the improvement of multi-disciplinary development of embedded systems with the focus on the relation...

Neurobiology and cognitive psychology have provided us with a dual process model of addiction. According to this model, behavior is considered to be the dynamic result of a combination of automatic and controlling processes. In cases of addiction the balance between these two processes is severely disturbed. Automated processes will continue to produce impulses that ensure the continuance of addictive behavior. Weak, reflective or controlling processes are both the reason for and the result of the inability to forgo addiction. The aim is to identify features that are common to current neurocognitive insights into addiction and to psychodynamic views on addiction. The picture that emerges from research is not clear. There is some evidence that attentional bias has a causal effect on addiction. There is no evidence that automatic associations have a causal effect, but there is some evidence that automatic action-tendencies do have a causal effect. Current neurocognitive views on the dual process model of addiction can be integrated with an evidence-based approach to addiction and with psychodynamic views on addiction.

Successful simulations of the global circulation and climate require accurate representation of the properties of shallow and deep convective clouds, stable-layer clouds, and the interactions between various cloud types, the boundary layer, and the radiative fluxes. Each of these phenomena plays an important role in the global energy balance, and each must be parameterized in a global climate model. These processes are highly interactive. One major problem limiting the accuracy of parameterizations of clouds and other processes in general circulation models (GCMs) is that most of the parameterization packages are not linked with a common physical basis. Further, these schemes have not, in general, been rigorously verified against observations adequate to the task of resolving subgrid-scale effects. To address these problems, we are designing a new Integrated Cumulus Ensemble and Turbulence (ICET) parameterization scheme, installing it in a climate model (CCM2), and evaluating the performance of the new scheme using data from Atmospheric Radiation Measurement (ARM) Program Cloud and Radiation Testbed (CART) sites.

Conducting transnational programs can be a very rewarding activity for a School, Faculty or University. Apart from raising the profile of the university, the conduct of transnational programs can also provide the university with openings for business opportunities, consultative activities, and collaborative research. It can also be a costly exercise, placing an enormous strain on limited resources with little reward for the provider, and transnational ventures can become nonviable entities in a very short period of time due to unanticipated global economic trends. Transnational courses offered by Faculties of Business and Computing are commonplace; however, there is a growing number of health science programs, particularly in nursing, being offered transnationally. This paper presents an overview of several models employed for the delivery of transnational nursing courses and discusses several key issues pertaining to conducting courses outside the host university's country.

Policy makers have a growing interest in integrated assessments of policies. The Integrated Assessment Modelling (IAM) community is reacting to this interest by extending the application of model development from pure scientific analysis towards application in decision making or policy context by

The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool that addresses key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. The program provides functions that range from graphical fault tree construction to cut set generation and quantification to report generation. Version 1.0 of the IRRAS program was released in February 1987. Since then, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version has been designated IRRAS 5.0 and is the subject of this Reference Manual. Version 5.0 of IRRAS provides the same capabilities as earlier versions, adds the ability to perform location transformations and seismic analysis, and provides enhancements to the user interface as well as improved algorithm performance. Additionally, version 5.0 contains new alphanumeric fault tree and event tree features used for event tree rules, recovery rules, and end state partitioning.

An integrative suicide prevention program was implemented to tackle an outbreak of visitor charcoal-burning suicides in Cheung Chau, an island in Hong Kong, in 2002. This study evaluated the effectiveness of the program. The number of visitor suicides fell from 37 deaths in the 51 months prior to program implementation to 6 deaths in the 42…

... effectiveness of their integrity management programs. Program evaluation is one of the key required program... activities that are in place to control risk. These measures indicate how well an operator is implementing... outcome is being achieved or not, despite the risk control activities in place. Failure Measures that...

With the strong link between programming and the underlying technology, the incorporation of computer technology into the teaching of a programming language course should be a natural progression. However, the abstract nature of programming can make such integration a difficult prospect to achieve. As a result, the main development tool, the…

Summary form only given, as follows. The development of a data model for a project on the test and certification of computer-based information systems required a more expressive data model than that supplied by either the network, hierarchical or relational models. A data model was developed to describe the work environment and the work itself. This model is based on the entity-relationship data model of Chen and on heuristic principles of knowledge organisation used in artificial intelligence. The ER data model is reviewed and the extensions to the model are discussed.
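
The extended entity-relationship idea can be sketched in a few lines: entities with attributes, typed relationships between them, and a generalisation ("is-a") link standing in for the AI-style heuristics of knowledge organisation. The entity and relationship names below are illustrative, not taken from the paper.

```python
# Toy sketch of an ER model extended with an is-a generalisation link.
from dataclasses import dataclass, field

@dataclass
class Entity:
    name: str
    attributes: dict = field(default_factory=dict)
    is_a: str = ""          # generalisation link (the extension beyond plain ER)

@dataclass
class Relationship:
    name: str
    source: str
    target: str

entities = {
    "Artifact":  Entity("Artifact"),
    "TestSuite": Entity("TestSuite", {"standard": "str"}, is_a="Artifact"),
    "System":    Entity("System", {"vendor": "str"}),
}
rels = [Relationship("certifies", "TestSuite", "System")]

def inherited_chain(name):
    """Walk is-a links to list an entity's generalisations."""
    chain, e = [], entities.get(name)
    while e and e.is_a:
        chain.append(e.is_a)
        e = entities.get(e.is_a)
    return chain
```

Chen's original ER model supplies the entities and relationships; the is-a chain is the kind of extension that lets a work-environment schema share structure across entity types.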

Continued behavior observation is mandated by ANSI/ANS 3.3. This paper presents a model for behavior observation training that is in accordance with this standard and the recommendations contained in US NRC publications. The model includes seventeen major topics or activities. Ten of these are discussed: Pretesting of supervisor's knowledge of behavior observation requirements, explanation of the goals of behavior observation programs, why behavior observation training programs are needed (legal and psychological issues), early indicators of emotional instability, use of videotaped interviews to demonstrate significant psychopathology, practice recording behaviors, what to do when unusual behaviors are observed, supervisor rationalizations for noncompliance, when to be especially vigilant, and prevention of emotional instability

Global change science is ideal for NGSS-informed teaching, but presents a serious challenge to K-12 educators because it is complex and interdisciplinary, combining earth science, biology, chemistry, and physics. Global systems are themselves complex. Adding anthropogenic influences on those systems creates a formidable list of topics - greenhouse effect, climate change, nitrogen enrichment, introduced species, land-use change among them - which are often presented as a disconnected "laundry list" of "facts." This complexity, combined with public and mass-media scientific illiteracy, leaves global change science vulnerable to misrepresentation and politicization, creating additional challenges to teachers in public schools. Ample stand-alone, one-off, online resources, many of them excellent, are (to date) underutilized by teachers in the high school science course taken by most students: biology. The Understanding Global Change project (UGC) from the UC Berkeley Museum of Paleontology has created a conceptual framework that organizes, connects, and explains global systems, human and non-human drivers of change in those systems, and measurable changes in those systems. This organization and framework employ core ideas, crosscutting concepts, structure/function relationships, and system models in a unique format that facilitates authentic understanding, rather than memorization. This system serves as an organizing framework for the entire ecology unit of a forthcoming mainstream high school biology program. The UGC system model is introduced up front with its core informational graphic. The model is elaborated, step by step, by adding concepts and processes as they are introduced and explained in each chapter. The informational graphic is thus used in several ways: to organize material as it is presented, to summarize topics in each chapter and put them in perspective, and for review and critical thinking exercises that supplement the usual end-of-chapter lists of

Integration of readily available resources on care of older adults increased student and faculty interest and knowledge of gerontological nursing. The authors describe their use of these practical and easy-to-implement resources.

Introduction: The Integrated Medical Model (IMM) Project represents one aspect of NASA's Human Research Program (HRP) to quantitatively assess medical risks to astronauts for existing operational missions as well as missions associated with future exploration and commercial space flight ventures. The IMM takes a probabilistic approach to assessing the likelihood and specific outcomes of one hundred medical conditions within the envelope of accepted space flight standards of care over a selectable range of mission capabilities. A specially developed Integrated Medical Evidence Database (iMED) maintains evidence-based, organizational knowledge across a variety of data sources. Since becoming operational in 2011, version 3.0 of the IMM, the supporting iMED, and the expertise of the IMM project team have contributed to a wide range of decision and informational processes for the space medical and human research community. This presentation provides an overview of the IMM conceptual architecture and range of application through examples of actual space flight community questions posed to the IMM project. Methods: Figure 1 [see document] illustrates the IMM modeling system and scenario process. As illustrated, the IMM computational architecture is based on Probabilistic Risk Assessment techniques. Nineteen assumptions and limitations define the IMM application domain. Scenario definitions include crew medical attributes and mission specific details. The IMM forecasts probabilities of loss of crew life (LOCL), evacuation (EVAC), quality time lost during the mission, number of medical resources utilized and the number and type of medical events by combining scenario information with in-flight, analog, and terrestrial medical information stored in the iMED. In addition, the metrics provide the integrated information necessary to estimate optimized in-flight medical kit contents under constraints of mass and volume or acceptable level of mission risk. Results and Conclusions
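
A highly simplified Monte Carlo sketch in the spirit of the IMM's probabilistic approach (this is not the IMM itself, and every rate below is invented): sample whether each of a few hypothetical medical conditions occurs on a mission, then estimate the probability of at least one evacuation-level event.

```python
# Illustrative mission-level Monte Carlo: per-condition occurrence
# probabilities and conditional evacuation probabilities are hypothetical.
import random

CONDITIONS = {                     # (P(occurs per mission), P(evac | occurs))
    "kidney stone":   (0.01, 0.30),
    "dental abscess": (0.02, 0.05),
    "minor injury":   (0.20, 0.01),
}

def estimate_evac_prob(n_trials=100_000, seed=42):
    rng = random.Random(seed)
    evacs = 0
    for _ in range(n_trials):
        # A trial is an "EVAC mission" if any condition occurs AND escalates.
        if any(rng.random() < p_occ and rng.random() < p_evac
               for p_occ, p_evac in CONDITIONS.values()):
            evacs += 1
    return evacs / n_trials

p = estimate_evac_prob()
```

The real IMM performs this kind of forecast over roughly one hundred conditions and also tracks LOCL, quality time lost, and resource utilization, which is what enables the medical-kit optimization described above.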

Group decision making with preference information on alternatives is an interesting and important research topic which has been receiving more and more attention in recent years. The purpose of this paper is to investigate multiple-attribute group decision-making (MAGDM) problems with distinct uncertain preference structures. We develop some linear-programming models for dealing with the MAGDM problems, where the information about attribute weights is incomplete, and the decision makers have their preferences on alternatives. The provided preference information can be represented in the following three distinct uncertain preference structures: 1) interval utility values; 2) interval fuzzy preference relations; and 3) interval multiplicative preference relations. We first establish some linear-programming models based on the decision matrix and each of the distinct uncertain preference structures and, then, develop some linear-programming models to integrate all three structures of subjective uncertain preference information provided by the decision makers and the objective information depicted in the decision matrix. Furthermore, we propose a simple and straightforward approach to ranking and selecting the given alternatives. It is worth pointing out that the developed models can also be used to deal with the situations where the three distinct uncertain preference structures are reduced to the traditional ones, i.e., utility values, fuzzy preference relations, and multiplicative preference relations. Finally, we use a practical example to illustrate in detail the calculation process of the developed approach.
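
To illustrate the linear-programming idea (not the paper's exact models): when attribute weights are known only as intervals [l_i, h_i] that must sum to 1, the best and worst overall value of an alternative are linear programs over a box intersected with a simplex, and for that feasible set they can be solved greedily. The utilities and intervals below are hypothetical.

```python
# Sketch: extreme weighted values of one alternative under interval weights
# l_i <= w_i <= h_i with sum(w_i) = 1, solved by greedy mass allocation
# (valid because the objective is linear in the weights).

def extreme_value(utils, lows, highs, maximize=True):
    slack = 1.0 - sum(lows)                   # mass still to distribute
    assert 0.0 <= slack <= sum(h - l for l, h in zip(lows, highs))
    w = list(lows)                            # start every weight at its floor
    order = sorted(range(len(utils)), key=lambda i: utils[i], reverse=maximize)
    for i in order:                           # pour slack onto the best (or
        add = min(highs[i] - lows[i], slack)  # worst) utilities first
        w[i] += add
        slack -= add
    return sum(wi * ui for wi, ui in zip(w, utils))

utils = [0.9, 0.6, 0.4]                       # attribute utilities (made up)
lows, highs = [0.2, 0.2, 0.1], [0.5, 0.5, 0.4]
best = extreme_value(utils, lows, highs, True)
worst = extreme_value(utils, lows, highs, False)
```

Ranking alternatives by such value intervals is one simple stand-in for the integrated models the paper develops.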

The Department of Energy's (DOE's) Environmental Management Program is the country's largest and most sophisticated environmental program to date. The rapid expansion of the DOE's environmental restoration efforts has led to increased scrutiny of its management processes and systems. As the program continues to grow and mature, maintaining adequate accountability for resources and clearly communicating progress will be essential to sustaining public confidence. The Office of Environmental Management must ensure that adequate processes and systems are in place at Headquarters, Operations Offices, and contractor organizations. These systems must provide the basis for sound management, cost control, and reporting. To meet this challenge, the Office of Environmental Restoration introduced the Management Action Plan process. This process was designed to serve three primary functions: (1) define the program's management capabilities at Headquarters and Operations Offices; (2) describe how management initiatives address identified program deficiencies; and (3) identify any duplication of efforts or program deficiencies. The Environmental Restoration Management Action Plan is a tracking, reporting, and statusing tool, used primarily at the Headquarters level, for assessing performance in key areas of project management and control, and is used by DOE to communicate to oversight agencies and stakeholders a clearer picture of the current status of the environmental restoration project management system. This paper will discuss how Management Action Plans are used to provide a program-wide assessment of management capabilities.

The K Basins Materials Accounting (MAC) and Material Balance (MBA) database systems were set up to run under one common applications program. This Acceptance Test Plan (ATP) describes how the code was to be tested to verify its correctness. The scope of the tests is minimal, since both MAC and MBA have already been tested in detail as stand-alone programs.

Synopsis The development of persistent symptoms following whiplash injury from a motor vehicle collision is common and contributes substantially to societal and personal costs. The popular Quebec Task Force classification system of whiplash-associated disorders (WADs) was meant to function as a prognostic and intervention decision aid, but its usefulness has been questioned. Emerging evidence highlights the heterogeneity of WAD by demonstrating physical and psychological impairments that are unique to those who develop persistent symptoms. These impairments are not recognized in the Quebec Task Force classification system. The purpose of this clinical commentary is to describe an integrated model that focuses on how psychological and neurobiological factors interact with, and are influenced by, existing personal and environmental factors to contribute to the development of chronic WAD. The model has been developed through more than 20 years of work in the field, consultation with experts, in-depth synthesis of existing evidence, and new evidence from the authors' own research programs. A subtheme is that a point of convergence currently exists between the psychological, physiological, and social determinants of health literature that can further explain the complex presentation of WAD. The new model is proposed to orient future research toward more interdisciplinary efforts across nontraditional fields, including data scientists and consumers, to clarify the WAD condition. J Orthop Sports Phys Ther 2017;47(7):462-471. Epub 16 Jun 2017. doi:10.2519/jospt.2017.7455.

The US Department of Energy's Hanford Site, north of Richland, Washington, has a mission of defense production, waste management, environmental restoration, advanced reactor design, and research and development. Environmental programs at Hanford are conducted by Pacific Northwest Laboratory (PNL) and the Westinghouse Hanford Company (WHC). The WHC environmental programs include the compliance and surveillance activities associated with site operations and waste management. The PNL environmental programs address the site-wide and off-site areas. They include environmental surveillance and the associated support activities, such as dose calculations, as well as the monitoring of environmental conditions to comply with federal and state environmental regulations on wildlife and cultural resources. These are called "independent environmental programs" in that they are conducted completely separately from site operations. The Environmental Surveillance and Oversight Program consists of the following projects: surface environmental surveillance; ground-water surveillance; wildlife resources monitoring; cultural resources; dose overview; radiation standards and calibrations; meteorological and climatological services; and emergency preparedness.

The cycles of growth of the nursing profession depict subordination of nursing to hospital administration and medicine. Nursing is ready to move into an integrative, collaborative stage of development that places nurses directly responsible to patients, and this would facilitate nursing's response to clients' health concerns wherever they occur.…

Every teacher expects an optimum level of cognitive processing from his or her students. The level of processing depends mainly on memory processes, and most students have retrieval difficulties with past learning. Memory difficulties are directly related to sensory integration. In these circumstances the investigator made an attempt to construct a Multisensory…

A goal of the ACCESS program (Absolute Color Calibration Experiment for Standard Stars) is to enable greater discrimination between theoretical astrophysical models and observations, where the comparison is limited by systematic errors associated with the relative flux calibration of the targets. To achieve this goal, ACCESS has been designed as a sub-orbital rocket-borne payload and ground calibration program, to establish absolute flux calibration of stellar targets identified as flight candidates, as well as a selection of A and G stars from the CALSPEC database. Stellar atmosphere models were generated using the Atlas 9 and Atlas 12 Kurucz stellar atmosphere software. The effective temperature, log(g), metallicity, and reddening were varied and the chi-squared statistic was minimized to obtain a best-fit model. A comparison of these models and the results from interpolation between grids of existing models will be presented. The impact of the flexibility of the Atlas 12 input parameters (e.g., solar metallicity fraction, abundances, microturbulent velocity) is being explored.
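The fitting procedure described, varying parameters over a grid of models and minimizing the chi-squared statistic, can be sketched as follows. This is an illustrative stand-in, not the ACCESS pipeline; the parameter tuples, model fluxes, and uncertainties are hypothetical.

```python
def chi_squared(observed, model, sigma):
    """Chi-squared goodness of fit: sum of squared residuals
    weighted by the measurement uncertainties."""
    return sum((o - m) ** 2 / s ** 2 for o, m, s in zip(observed, model, sigma))

def best_fit(observed, sigma, model_grid):
    """Return the (parameters, model fluxes) pair from a grid of
    precomputed models that minimizes chi-squared."""
    return min(model_grid.items(),
               key=lambda kv: chi_squared(observed, kv[1], sigma))

# Hypothetical grid keyed by (T_eff, log g); fluxes are made-up numbers.
grid = {(5000, 4.0): [1.0, 2.0, 3.0],
        (6000, 4.5): [2.0, 2.0, 2.0]}
params, fluxes = best_fit([1.0, 2.0, 3.0], [1.0, 1.0, 1.0], grid)
```

In practice the grid search is refined by interpolating between grid points, which is exactly the comparison the abstract describes.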

A new spectral problem and the associated integrable hierarchy of nonlinear evolution equations are presented in this paper. It is shown that the hierarchy is completely integrable in the Liouville sense and possesses a bi-Hamiltonian structure. An explicit symmetry constraint is proposed for the Lax pairs and the adjoint Lax pairs of the hierarchy. Moreover, the corresponding Lax pairs and adjoint Lax pairs are nonlinearized into a hierarchy of commutative, new finite-dimensional completely integrable Hamiltonian systems in the Liouville sense. Further, an involutive representation of the solution of each equation in the hierarchy is given. Finally, expanding integrable models of the hierarchy are constructed by using a new loop algebra.

ROSE TM is an object-oriented, visual programming environment used for many applications, including the development of power plant simulators. ROSE provides an integrated suite of tools for the creation, calibration, test, integration, configuration management, and documentation of process, electrical, and I and C models. CAE recently undertook an ambitious project to integrate its two-phase thermal hydraulic model ANTHEM TM into the ROSE environment. ANTHEM is a non-equilibrium, non-homogeneous model based on the drift flux formalism. CAE has used the model in numerous two-phase applications for nuclear and fossil power plant simulators. The integration of ANTHEM into ROSE brings the full power of visual-based programming to two-phase modeling applications. Features include graphical model building, calibration tools, a superior test environment, and process visualisation. In addition, the integration of ANTHEM into ROSE makes it possible to easily apply the fidelity of ANTHEM to BOP applications. This paper describes the implementation of the ANTHEM model within the ROSE environment and gives examples of its use. (author)

Including children with emotional and behavioral needs in mainstream school systems leads to growing concern about the increasing number of violent and nonviolent conflicts. Schools must adapt to this evolution and adopt a more therapeutic dimension. This paper explores the possibility of integrating school-based and therapeutic conflict management models and compares two such models: a school-based conflict management program, Teaching Students To Be Peacemakers, and a therapeutic conflict management program, Life Space Crisis Intervention. The authors conclude that integration might be possible, but depends on establishing a positive school atmosphere, the central position of the teacher, and collaborative and social learning for pupils. Further implementation of integrated conflict management models can be considered but must be underpinned by appropriate scientific research.

Nuclear thermal propulsion (NTP) has long been identified as a key enabling technology for space exploration beyond LEO. From Wernher Von Braun's early concepts for crewed missions to the Moon and Mars to the current Mars Design Reference Architecture (DRA) 5.0 and recent lunar and asteroid mission studies, the high thrust and specific impulse of NTP opens up possibilities such as reusability that are just not feasible with competing approaches. Although NTP technology was proven in the Rover / NERVA projects in the early days of the space program, an integrated spacecraft using NTP has never been developed. Such a spacecraft presents a challenging multidisciplinary systems integration problem. The disciplines that must come together include not only nuclear propulsion and power, but also thermal management, power, structures, orbital dynamics, etc. Some of this integration logic was incorporated into a vehicle sizing code developed at NASA's Glenn Research Center (GRC) in the early 1990s called MOMMA, and later into an Excel-based tool called SIZER. Recently, a team at GRC has developed an open source framework for solving Multidisciplinary Design, Analysis and Optimization (MDAO) problems called OpenMDAO. A modeling approach is presented that builds on previous work in NTP vehicle sizing and mission analysis by making use of the OpenMDAO framework to enable modular and reconfigurable representations of various NTP vehicle configurations and mission scenarios. This approach is currently applied to vehicle sizing, but is extensible to optimization of vehicle and mission designs. The key features of the code will be discussed and examples of NTP transfer vehicles and candidate missions will be presented.
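The payoff of NTP's higher specific impulse in a sizing code comes directly from the Tsiolkovsky rocket equation. The sketch below is illustrative only (not drawn from MOMMA, SIZER, or OpenMDAO); it compares the propellant mass fraction of a NERVA-class nuclear thermal engine (Isp on the order of 900 s) against a chemical stage (Isp on the order of 450 s) for the same delta-v, an assumed 4 km/s burn.

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def propellant_mass_fraction(delta_v, isp):
    """Fraction of initial vehicle mass that must be propellant to achieve
    delta_v (m/s) at specific impulse isp (s), from the Tsiolkovsky
    rocket equation: m_prop / m_0 = 1 - exp(-delta_v / (isp * g0))."""
    ve = isp * G0  # effective exhaust velocity, m/s
    return 1.0 - math.exp(-delta_v / ve)

dv = 4000.0  # assumed burn, m/s (illustrative figure)
ntp = propellant_mass_fraction(dv, 900.0)   # NERVA-class NTP
chem = propellant_mass_fraction(dv, 450.0)  # chemical stage
```

Roughly a third of the vehicle is propellant in the NTP case versus about 60% for the chemical case; that gap is what makes reusable NTP transfer vehicles plausible in the mission studies cited above.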

Under health care reform, a series of new financing and delivery models are being piloted to integrate health and long-term care services for older adults. To date, these programs have not encompassed residential care facilities, with most programs focusing on long-term care recipients in the community or the nursing home. Our analyses indicate that individuals living in residential care facilities have similarly high rates of chronic illness and Medicare utilization when compared with simila...

It is well-known that dynamic programming algorithms can utilize tree decompositions to provide a way to solve some NP-hard problems on graphs where the complexity is polynomial in the number of nodes and edges in the graph, but exponential in the width of the underlying tree decomposition. However, there has been relatively little computational work done to determine the practical utility of such dynamic programming algorithms. We have developed software to construct tree decompositions using various heuristics and have created a fast, memory-efficient dynamic programming implementation for solving maximum weighted independent set. We describe our software and the algorithms we have implemented, focusing on memory saving techniques for the dynamic programming. We compare the running time and memory usage of our implementation with other techniques for solving maximum weighted independent set, including a commercial integer programming solver and a semi-definite programming solver. Our results indicate that it is possible to solve some instances where the underlying decomposition has width much larger than suggested by the literature. For certain types of problems, our dynamic programming code runs several times faster than these other methods.
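The two-table structure of such a dynamic program is easiest to see in the width-1 special case, where the tree decomposition is simply a tree: for each vertex, track the best weight with the vertex included versus excluded. The sketch below is an illustration of that idea, not the authors' implementation.

```python
def mwis_tree(adj, weight, root=0):
    """Maximum weighted independent set value on a tree, via DP.
    adj: dict vertex -> list of neighbors; weight: dict vertex -> weight."""
    # Iterative DFS to get a parent map and a top-down visit order.
    parent = {root: None}
    order, stack, seen = [], [root], {root}
    while stack:
        u = stack.pop()
        order.append(u)
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                parent[v] = u
                stack.append(v)
    # incl[u]: best value for u's subtree with u in the set;
    # excl[u]: best value with u excluded.
    incl = {u: weight[u] for u in order}
    excl = {u: 0 for u in order}
    for u in reversed(order):  # children are processed before parents
        p = parent[u]
        if p is not None:
            incl[p] += excl[u]               # if p is in, u must be out
            excl[p] += max(incl[u], excl[u])  # if p is out, take the better
    return max(incl[root], excl[root])

# Path 0-1-2 with weights 3, 2, 3: optimum picks the two endpoints.
path = {0: [1], 1: [0, 2], 2: [1]}
```

General tree-decomposition DP replaces these two scalars with a table indexed by every independent subset of a bag, which is where the exponential dependence on width comes from.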

The Department of Energy's (DOE's) Environmental Management Program is the country's largest and most sophisticated environmental program to date. The rapid expansion of the DOE's environmental restoration efforts has led to increased scrutiny of its management processes and systems. As the program continues to grow and mature, maintaining adequate accountability for resources and clearly communicating progress will be essential to sustaining public confidence. The Office of Environmental Management must ensure that adequate processes and systems are in place at Headquarters, Operations Offices, and contractor organizations. These systems must provide the basis for sound management, cost control, and reporting. To meet this challenge, the Office of Environmental Restoration introduced the Management Action Plan process. This process was designed to serve three primary functions: (1) define the program's management capabilities at Headquarters and Operations Offices; (2) describe how management initiatives address identified program deficiencies; and (3) identify any duplication of efforts or program deficiencies. The Environmental Restoration Management Action Plan is a tracking, reporting, and statusing tool, used primarily at the Headquarters level, for assessing performance in key areas of project management and control, and is used by DOE to communicate to oversight agencies and stakeholders a clearer picture of the current status of the environmental restoration project management system. This paper will discuss how Management Action Plans are used to provide a program-wide assessment of management capabilities.

(ii) experience with methods of protein purification; (iii) incorporation of appropriate controls into experiments; (iv) use of basic statistics in data analysis; (v) writing papers and grant proposals in accepted scientific style; (vi) peer review; (vii) oral presentation of results and proposals; and (viii) introduction to molecular modeling. Figure 1 illustrates the modular nature of the lab curriculum. Elements from each of the exercises can be separated and treated as stand-alone exercises, or combined into short or long projects. We have been able to offer the opportunity to use sophisticated molecular modeling in the final module through funding from an NSF-ILI grant. However, many of the benefits of the research proposal can be achieved with other computer programs, or even by literature survey alone. Figure 1. Design of project-based biochemistry laboratory. Modules (projects, or portions of projects) are indicated as boxes. Each of these can be treated independently, or used as part of a larger project. Solid lines indicate some suggested paths from one module to the next. The skills and knowledge required for protein purification and design are developed in three units: (i) an introduction to critical assays needed to monitor degree of purification, including an evaluation of assay parameters; (ii) partial purification by ion-exchange techniques; and (iii) preparation of a grant proposal on protein design by mutagenesis. Brief descriptions of each of these units follow, with experimental details of each project at the end of this paper. Assays for Lysozyme Activity and Protein Concentration (4 weeks) The assays mastered during the first unit are a necessary tool for determining the purity of the enzyme during the second unit on purification by ion exchange. These assays allow an introduction to the concept of specific activity (units of enzyme activity per milligram of total protein) as a measure of purity. In this first sequence, students learn a turbidimetric assay

One of the main problems in model-based software engineering is modelling behaviour in such a way that the behaviour models can be easily integrated with each other, with the structural software models and with pre-existing software. In this paper, we propose an event coordination notation (ECNO)...

The Inozemtsev model is considered to be a multivariable generalization of Heun's equation. We review results on Heun's equation, the elliptic Calogero-Moser-Sutherland model, and the Inozemtsev model, and discuss some approaches to the finite-gap integration for multivariable models.

We combine extant theories of evidence accumulation and multi-modal integration to develop an integrated framework for modeling multimodal integration as a process that unfolds in real time. Many studies have formulated sensory processing as a dynamic process where noisy samples of evidence are accumulated until a decision is made. However, these studies are often limited to a single sensory modality. Studies of multimodal stimulus integration have focused on how best to combine different sources of information to elicit a judgment. These studies are often limited to a single time point, typically after the integration process has occurred. We address these limitations by combining the two approaches. Experimentally, we present data that allow us to study the time course of evidence accumulation within each of the visual and auditory domains as well as in a bimodal condition. Theoretically, we develop a new Averaging Diffusion Model in which the decision variable is the mean rather than the sum of evidence samples and use it as a base for comparing three alternative models of multimodal integration, allowing us to assess the optimality of this integration. The outcome reveals rich individual differences in multimodal integration: while some subjects' data are consistent with adaptive optimal integration, reweighting sources of evidence as their relative reliability changes during evidence integration, others exhibit patterns inconsistent with optimality.

This paper critically reviews the ownership, location, and internalization (OLI) model and the Uppsala internationalization process (UIP) framework. Both the OLI model and the UIP framework fail to incorporate each other's insights and to include corporate entrepreneurship in their analy...