Sample records for model program pnl

Pacific Northwest Laboratory (PNL) conducts fundamental and applied research in support of the US Department of Energy's (DOE) core missions in science and technology, environmental quality, energy resources, and national security. Much of this research is funded by the program offices of DOE's Office of Energy Research (DOE-ER), primarily the Office of Basic Energy Sciences (BES) and the Office of Health and Environmental Research (OHER), and by PNL's Laboratory Directed Research and Development (LDRD) Program. This document is a collection of research highlights that describe PNL's accomplishments in DOE-ER funded programs during Fiscal Year 1995. Included are accomplishments in research funded by OHER's Analytical Technologies, Environmental Research, Health Effects, General Life Sciences, and Carbon Dioxide Research programs; BES's Materials Science, Chemical Sciences, Engineering and Geoscience, and Applied Mathematical Sciences programs; and PNL's LDRD Program. Summaries are given for 70 projects.

This document provides an overview of the administration of the In Vivo Monitoring Program (IVMP) for Hanford, including the organizational structure and program responsibilities; coordination of in vivo measurements; scheduling measurements; performing measurements; reporting results; and quality assurance. Overall responsibility for the management of the IVMP rests with the Program Manager (PM). The PM is responsible for providing the required in vivo counting services for Hanford Site contractor employees in accordance with Department of Energy (DOE) requirements and the specific statements of work.

This manual is a guide to the services provided by the Hanford Internal Dosimetry Program (IDP), which is operated by the Pacific Northwest National Laboratory for the U.S. Department of Energy Richland Operations Office, the Office of River Protection, and their Hanford Site contractors. The manual describes the roles of and relationships between the IDP and the radiation protection programs of the Hanford Site contractors. Recommendations and guidance are also provided for consideration in implementing the bioassay monitoring and internal dosimetry elements of radiation protection programs.

The following sections provide an overview of the administration of the In Vivo Monitoring Program (IVMP) for Hanford. This includes the organizational structure and program responsibilities; coordination of in vivo measurements; scheduling measurements; performing measurements; reporting results; and quality assurance.

Pacific Northwest Laboratory (PNL) is examining the analysis of large data sets (ALDS). After one year's work, a panel was convened to evaluate the project. This document is the permanent record of that panel review. It consists of edited transcripts of presentations made to the panel by the PNL staff, a summary of the responses of the panel to these presentations, and PNL's plans for the development of the ALDS project. The presentations of the PNL staff described various aspects of the project and the philosophy surrounding it. Supporting materials appear in appendixes. 20 figures, 4 tables. (RWR)

PNL1 and PNL2 are the closest Arabidopsis relatives of maize pan1. pan1 and the PNL family of 11 genes encode leucine-rich repeat receptor-like kinases; however, none of these putative kinases is predicted to have actual kinase function, due to one or more amino acid substitutions in residues necessary for kinase activity. Because PAN1 plays a role in subsidiary cell formation in maize, it is hypothesized that PNL1 and PNL2 are involved in stomatal formation in Arabidopsis. YFP fusions of the...

Asymmetric cell division is a vital component of plant development. It enables cell differentiation and cell diversity. A key component of asymmetric cell division is cell signaling. Signals are believed to control polarization and orientation of asymmetric divisions during stomatal development. The findings of this report suggest that PNL1 and PNL2, two LRR-RLKs found in Arabidopsis and closely related to maize PAN1 LRR-RLK, are possibly involved in the signaling events occurring during the ...

The Hanford External Dosimetry Technical Basis Manual PNL-MA-842 documents the design and implementation of the external dosimetry system used at Hanford. The manual describes the dosimeter design, processing protocols, dose calculation methodology, radiation fields encountered, dosimeter response characteristics, and limitations of dosimeter design under field conditions, and makes recommendations for effective use of the dosimeters in the field. The manual describes the technical basis for the dosimetry system in a manner intended to help ensure defensibility of the dose of record at Hanford and to demonstrate compliance with 10 CFR 835, DOELAP, DOE-RL, ORP, PNSO, and Hanford contractor requirements. The dosimetry system is operated by PNNL's Hanford External Dosimetry Program, which provides dosimetry services to all Hanford contractors. The primary users of this manual are DOE and DOE contractors at Hanford using the dosimetry services of PNNL. Development and maintenance of this manual is funded directly by DOE and DOE contractors. Its contents have been reviewed and approved by DOE and DOE contractors at Hanford through the Hanford Personnel Dosimetry Advisory Committee, which is chartered and chaired by DOE-RL and serves as a means of coordinating dosimetry practices across contractors at Hanford. This manual was established in 1996. Since its inception, it has been revised many times and maintained by PNNL as a controlled document with controlled distribution. Rev. 0 marks the first revision to be released through PNNL's Electronic Records & Information Capture Architecture (ERICA) database.

The Pacific Northwest Laboratory (PNL), operated by Battelle Memorial Institute under contract to the U.S. Department of Energy, operates tank systems for the U.S. Department of Energy, Richland Operations Office (DOE-RL), that contain dangerous waste constituents as defined by Washington State Department of Ecology (WDOE) Dangerous Waste Regulations, Washington Administrative Code (WAC) 173-303-040(18). Chapter 173-303-640(2) of the WAC requires the performance of integrity assessments for each existing tank system that treats or stores dangerous waste, except those operating under interim status with compliant secondary containment. This Integrity Assessment Plan (IAP) identifies all tasks that will be performed during the integrity assessment of the PNL-operated Radioactive Liquid Waste Systems (RLWS) associated with the 324 and 325 Buildings located in the 300 Area of the Hanford Site. It describes the inspections, tests, and analyses required to assess the integrity of the PNL RLWS (tanks, ancillary equipment, and secondary containment) and provides sufficient information for adequate budgeting and control of the assessment program. It also provides necessary information to permit the Independent, Qualified, Registered Professional Engineer (IQRPE) to approve the integrity assessment program.

We apply a kinematic finite-fault inversion scheme to Pnl displacement waveforms of the 2004 Parkfield earthquake recorded at 14 regional stations, using both synthetic Green's functions (SGF) and empirical Green's functions (EGF) derived from a single Mw 5.0 aftershock. Slip is modeled on a rectangular fault subdivided into 2×2 km subfaults assuming a constant rupture velocity and a 0.5 sec rise time. A bandpass filter of 0.1–0.5 Hz is applied to both data and subfault responses prior to waveform inversion. The SGF inversions are performed such that the final seismic moment is consistent with the known magnitude (Mw 6.0) of the earthquake. For these runs, it is difficult to reproduce the entire Pnl waveform due to inaccuracies in the assumed crustal structure. Also, the misfit between observed and predicted vertical waveforms is similar in character for different rupture velocities, indicating that neither the rupture velocity nor the exact position of slip sources along the fault can be uniquely identified. The pattern of coseismic slip, however, compares well with independent source models derived using other data types, indicating that the SGF inversion procedure provides a general first-order estimate of the 2004 Parkfield rupture using the vertical Pnl records. The best-constrained slip model is obtained using the single-aftershock EGF approach. In this case, the waveforms are very well reproduced for both vertical and horizontal components, suggesting that the method provides a powerful tool for estimating the distribution of coseismic slip using the regional Pnl waveforms. The inferred slip model shows a localized patch of high slip (55 cm peak) near the hypocenter and a larger slip area (~50 cm peak) extending between 6 and 20 km to the northwest.
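The core of the inversion step can be sketched compactly. The following Python fragment is an illustrative sketch only, not the authors' code: the station data, subfault responses, and dimensions are placeholders. It bandpass filters data and subfault responses at 0.1–0.5 Hz and solves the resulting linear system for subfault slip; non-negative least squares is one common choice in kinematic inversions.

```python
# Schematic kinematic slip inversion (all array contents are placeholders).
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.optimize import nnls

fs = 10.0                                          # samples per second
b, a = butter(4, [0.1, 0.5], btype="band", fs=fs)  # the 0.1-0.5 Hz passband

n_sub, n_samp = 40, 1200                   # subfaults, samples per trace
rng = np.random.default_rng(0)
G = rng.standard_normal((n_samp, n_sub))   # placeholder subfault responses
d = rng.standard_normal(n_samp)            # placeholder Pnl data

G_f = filtfilt(b, a, G, axis=0)            # filter each subfault response
d_f = filtfilt(b, a, d)                    # filter the data identically

slip, residual = nnls(G_f, d_f)            # non-negative slip per subfault
# In an SGF run, slip would then be rescaled so the summed moment
# matches the known magnitude (Mw 6.0).
print(slip.shape, residual)
```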

The Hanford External Dosimetry Technical Basis Manual PNL-MA-842 documents the design and implementation of the external dosimetry system used at Hanford. The manual describes the dosimeter design, processing protocols, dose calculation methodology, radiation fields encountered, dosimeter response characteristics, and limitations of dosimeter design under field conditions, and makes recommendations for effective use of the dosimeters in the field. The manual describes the technical basis for the dosimetry system in a manner intended to help ensure defensibility of the dose of record at Hanford and to demonstrate compliance with 10 CFR 835, DOELAP, DOE-RL, ORP, PNSO, and Hanford contractor requirements. The dosimetry system is operated by PNNL's Hanford External Dosimetry Program (HEDP), which provides dosimetry services to all Hanford contractors. The primary users of this manual are DOE and DOE contractors at Hanford using the dosimetry services of PNNL. Development and maintenance of this manual is funded directly by DOE and DOE contractors. Its contents have been reviewed and approved by DOE and DOE contractors at Hanford through the Hanford Personnel Dosimetry Advisory Committee (HPDAC), which is chartered and chaired by DOE-RL and serves as a means of coordinating dosimetry practices across contractors at Hanford. This manual was established in 1996. Since its inception, it has been revised many times and maintained by PNNL as a controlled document with controlled distribution. The first revision to be released through PNNL's Electronic Records & Information Capture Architecture (ERICA) database was designated Revision 0. Revision numbers that are whole numbers reflect major revisions, typically involving changes to all chapters in the document. Revision numbers that include a decimal fraction reflect minor revisions, usually restricted to selected chapters or selected pages in the document.

Nonlinear Programming (NLP; PNL in Spanish) is a widely applicable tool for modeling real-life problems in business, economics, and engineering. The aim is to maximize or minimize a scalar field whose domain is given by a set of constraints expressed as equalities and/or inequalities that are not necessarily linear. In this paper we present a virtual laboratory for studying NLP graphically and numerically in the case of two variables.
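As a flavor of the kind of two-variable NLP such a laboratory would solve numerically, here is a minimal sketch (our own example problem, not one from the paper) using scipy:

```python
# Minimize f(x, y) = (x - 1)^2 + (y - 2)^2
# subject to x + y <= 2 and x, y >= 0 (a two-variable NLP).
import numpy as np
from scipy.optimize import minimize

def f(v):
    x, y = v
    return (x - 1.0) ** 2 + (y - 2.0) ** 2

constraints = [{"type": "ineq", "fun": lambda v: 2.0 - v[0] - v[1]}]  # x + y <= 2
bounds = [(0, None), (0, None)]                                       # x, y >= 0

result = minimize(f, x0=np.array([0.5, 0.5]), bounds=bounds,
                  constraints=constraints)
print(result.x, result.fun)  # optimum lands on the boundary x + y = 2
```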

The Hanford External Dosimetry Technical Basis Manual PNL-MA-842 documents the design and implementation of the external dosimetry system used at the U.S. Department of Energy (DOE) Hanford Site. The manual describes the dosimeter design, processing protocols, dose calculation methodology, radiation fields encountered, dosimeter response characteristics, and limitations of dosimeter design under field conditions, and makes recommendations for effective use of the dosimeters in the field. The manual describes the technical basis for the dosimetry system in a manner intended to help ensure defensibility of the dose of record at Hanford and to demonstrate compliance with requirements of 10 CFR 835, the DOE Laboratory Accreditation Program, the DOE Richland Operations Office, the DOE Office of River Protection, the DOE Pacific Northwest Office of Science, and Hanford's DOE contractors. The dosimetry system is operated by the Pacific Northwest National Laboratory (PNNL) Hanford External Dosimetry Program, which provides dosimetry services to PNNL and all Hanford contractors. The primary users of this manual are DOE and DOE contractors at Hanford using the dosimetry services of PNNL. Development and maintenance of this manual is funded directly by DOE and DOE contractors. Its contents have been reviewed and approved by DOE and DOE contractors at Hanford through the Hanford Personnel Dosimetry Advisory Committee, which is chartered and chaired by DOE-RL and serves as a means of coordinating dosimetry practices across contractors at Hanford. This manual was established in 1996. Since its inception, it has been revised many times and maintained by PNNL as a controlled document with controlled distribution. The first revision to be released through PNNL's Electronic Records & Information Capture Architecture database was designated Revision 0. Revision numbers that are whole numbers reflect major revisions, typically involving significant changes to all chapters in the document. Revision numbers that include a decimal fraction reflect minor revisions, usually restricted to selected chapters or selected pages in the document.

The Hanford External Dosimetry Technical Basis Manual PNL-MA-842 documents the design and implementation of the external dosimetry system used at Hanford. The manual describes the dosimeter design, processing protocols, dose calculation methodology, radiation fields encountered, dosimeter response characteristics, and limitations of dosimeter design under field conditions, and makes recommendations for effective use of the dosimeters in the field. The manual describes the technical basis for the dosimetry system in a manner intended to help ensure defensibility of the dose of record at Hanford and to demonstrate compliance with 10 CFR 835, DOELAP, DOE-RL, ORP, PNSO, and Hanford contractor requirements. The dosimetry system is operated by PNNL's Hanford External Dosimetry Program (HEDP), which provides dosimetry services to all Hanford contractors. The primary users of this manual are DOE and DOE contractors at Hanford using the dosimetry services of PNNL. Development and maintenance of this manual is funded directly by DOE and DOE contractors. Its contents have been reviewed and approved by DOE and DOE contractors at Hanford through the Hanford Personnel Dosimetry Advisory Committee (HPDAC), which is chartered and chaired by DOE-RL and serves as a means of coordinating dosimetry practices across contractors at Hanford. This manual was established in 1996. Since its inception, it has been revised many times and maintained by PNNL as a controlled document with controlled distribution. Rev. 0 marks the first revision to be released through PNNL's Electronic Records & Information Capture Architecture (ERICA) database. Revision numbers that are whole numbers reflect major revisions, typically involving changes to all chapters in the document. Revision numbers that include a decimal fraction reflect minor revisions, usually restricted to selected chapters or selected pages in the document. Revision Log: Rev. 0 (2/25/2005) Major revision and expansion. Rev. 0.1 (3/12/2007) Minor

Currently, PNL is the treatment of choice for large and/or otherwise complex urolithiasis. PNL was initially performed with the patient in a supine-oblique position, but the prone position later became the conventional one out of habit and handiness. The prone position provides a larger area for percutaneous renal access, a wider space for instrument manipulation, and a claimed lower risk of splanchnic injury. Nonetheless, it entails important anaesthesiological risks, including circulatory, haemodynamic, and ventilatory difficulties; the need for several nurses to be present for intraoperative changes of decubitus in case of simultaneous retrograde instrumentation of the ureter, with evident risks related to pressure points; an increased radiological hazard to the urologist's hands; and patient discomfort. To overcome these drawbacks, various safe and effective changes in patient positioning for PNL have been proposed over the years, including the reverse lithotomy position, the prone split-leg position, the lateral decubitus, the supine position, and the Galdakao-modified supine Valdivia (GMSV) position. Among these, the GMSV position is safe and effective, and seems profitable and ergonomic. It allows optimal cardiopulmonary control during general anaesthesia; easy puncture of the kidney; a reduced risk of colonic injury; a simultaneous antero-retrograde approach to the renal cavities (PNL and retrograde ureteroscopy = ECIRS, Endoscopic Combined IntraRenal Surgery), with no need for intraoperative repositioning of the anaesthetized patient, less need for nurses in the operating room, less occupational risk due to shifting of heavy loads, less risk of pressure injuries related to inaccurate repositioning, and reduced duration of the procedure; facilitated spontaneous evacuation of stone fragments; and a comfortable sitting position and restrained X-ray exposure of the hands for the urologist. Above all, the GMSV position fully supports a new comprehensive

Introduction: To compare the outcomes of tubeless day-care PNL using a hemostatic seal in the access tract versus standard PNL. Material and methods: This was a prospective randomized controlled study. Cases were randomized to either the day-care group with hemostatic seal (DCS) or the control group, in which patients were admitted and a nephrostomy tube was placed at the conclusion of surgery. Results: A total of 180 cases were screened and, of these, 113 were included in the final analysis. The stone clearance rates were comparable in both groups. The mean drop in hemoglobin was significantly lower in the DCS group than in the control group (1.05 ±0.68 vs. 1.30 ±0.58 gm/dl, p = 0.038). Mean postoperative pain score, analgesic requirement (paracetamol), and duration of hospital stay were also significantly lower in the DCS group (3.79 ±1.23 vs. 6.12 ±0.96, 1.48 ±0.50 vs. 4.09 ±1.11 grams, and 0.48 ±0.26 vs. 4.74 ±1.53 days, respectively). Conclusions: PNL with a composite hemostatic tract seal is considered safe. It resulted in a significant reduction of blood loss and analgesic requirement, with significantly reduced hospital stay, nephrostomy tube site morbidity, and time required to resume normal activity when compared to standard PNL. However, patients must be compliant with the given instructions and should have access to a health care facility, as a few of them may need re-admission. PMID:27551557

The Hanford External Dosimetry Technical Basis Manual PNL-MA-842 documents the design and implementation of the external dosimetry system used at Hanford. The manual describes the dosimeter design, processing protocols, dose calculation methodology, radiation fields encountered, dosimeter response characteristics, and limitations of dosimeter design under field conditions, and makes recommendations for effective use of the dosimeters in the field. The manual describes the technical basis for the dosimetry system in a manner intended to help ensure defensibility of the dose of record at Hanford and to demonstrate compliance with 10 CFR 835, DOELAP, DOE-RL, ORP, PNSO, and Hanford contractor requirements. The dosimetry system is operated by PNNL's Hanford External Dosimetry Program (HEDP), which provides dosimetry services to all Hanford contractors. The primary users of this manual are DOE and DOE contractors at Hanford using the dosimetry services of PNNL. Development and maintenance of this manual is funded directly by DOE and DOE contractors. Its contents have been reviewed and approved by DOE and DOE contractors at Hanford through the Hanford Personnel Dosimetry Advisory Committee (HPDAC), which is chartered and chaired by DOE-RL and serves as a means of coordinating dosimetry practices across contractors at Hanford. This manual was established in 1996. Since its inception, it has been revised many times and maintained by PNNL as a controlled document with controlled distribution. The first revision to be released through PNNL's Electronic Records & Information Capture Architecture (ERICA) database was designated Revision 0. Revision numbers that are whole numbers reflect major revisions, typically involving significant changes to all chapters in the document. Revision numbers that include a decimal fraction reflect minor revisions, usually restricted to selected chapters or selected pages in the document. Maintenance and distribution of controlled hard copies of the

Pacific Northwest Laboratory (PNL) was asked to develop and recommend a regulatory position that the Nuclear Regulatory Commission (NRC) should adopt regarding the ability of reactor pressure vessels to withstand the effects of pressurized thermal shock (PTS). Licensees of eight pressurized water reactors provided NRC with estimates of remaining effective full power years before corrective actions would be required to prevent an unsafe operating condition. PNL reviewed these responses and the results of supporting research and concluded that none of the eight reactors would undergo vessel failure from a PTS event before several more years of operation. Operator actions, however, were often required to terminate a PTS event before it deteriorated to the point where failure could occur. Therefore, the near-term (less than one year) recommendation is to upgrade, on a site-specific basis, operational procedures, training, and control room instrumentation. Also, uniform criteria should be developed by NRC for use during future licensee analyses. Finally, it was recommended that NRC upgrade nondestructive inspection techniques used during vessel examinations and become more involved in the evaluation of annealing requirements.

The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge for future research.

This report compiles information and conclusions gathered as part of the "Modeling EERE Deployment Programs" project. The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge where future research is needed.

This is the PDF of a PowerPoint presentation from a teleconference on Los Alamos programming models. It starts by listing their assumptions for the programming models and then details a hierarchical programming model at the System Level and Node Level. Then it details how to map this to their internal nomenclature. Finally, a list is given of what they are currently doing in this regard.

The purpose of this report is to compile information and conclusions gathered as part of three separate tasks undertaken as part of the overall project, “Modeling EERE Deployment Programs,” sponsored by the Planning, Analysis, and Evaluation office within the Department of Energy’s Office of Energy Efficiency and Renewable Energy (EERE). The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address improvements to modeling in the near term, and note gaps in knowledge where future research is needed.

Program and project evaluation models can be extremely useful in project planning and management. The aim is to pose the right questions as early as possible in order to identify and deal with unwanted program effects in time, as well as to encourage the positive elements of the project's impact. In short, different evaluation models are used in order to minimize losses and maximize the benefits of interventions upon small or large social groups. This article introduces some of the most recently used evaluation models.

Two pectin lyase genes, designated pnl-1 and pnl-2, were cloned from Colletotrichum gloeosporioides f. sp. malvae, a pathogen of round-leaved mallow (Malva pusilla). pnl-1 was isolated using cDNA from infected plant material; pnl-2 was isolated using cDNA from 3-day-old mycelia grown in mallow-cell-wall extract (MCWE) broth. pnl-1 is the first pectinase gene described thus far to encode a cellulose-binding domain (CBD), which is common in cellulases and xylanases, whereas pnl-2 encodes a pectin lyase that lacks a CBD. In pure culture, pnl-1 expression could be detected when purified pectin or glucose was the sole carbon source, but not when MCWE was the sole carbon source. The lack of pnl-1 expression appeared to be due to gene repression by some unknown factor(s) in the cell-wall extract. In contrast, expression of pnl-2 was detected in cultures when MCWE, but not when purified pectin or glucose, was the sole carbon source. In infected tissue, detection of pnl-1 expression by Northern-blot hybridization and by RT-PCR began with the onset of the necrotrophic phase of infection. Expression of pnl-2 was not detectable by Northern-blot hybridization, but was observed by RT-PCR in both the biotrophic and necrotrophic phases of infection. The differences between pnl-1 and pnl-2 (i.e. pnl-1 encoding a CBD and differences in the expression patterns of both genes) may be related to the requirements of C. gloeosporioides f. sp. malvae to be able to grow in host tissue under the different conditions present during the biotrophic and necrotrophic phases of infection.

The MELCOR code was used to simulate PNL's Ice Condenser Experiments 11-6 and 16-11. In these experiments, ZnS was injected into a mixing chamber, and the combined steam/air/aerosol mixture flowed into an ice condenser which was 14.7 m tall. Experiment 11-6 was a low flow test; Experiment 16-11 was a high flow test. Temperatures in the ice condenser region and particle retention were measured in these tests. MELCOR predictions compared very well to the experimental data. The MELCOR calculations were also compared to CONTAIN code calculations for the same tests. A number of sensitivity studies were performed. It was found that the simulation time step, aerosol parameters such as the number of MAEROS components and sections used and the particle density, and ice condenser parameters such as the energy capacity of the ice, the ice heat transfer coefficient multiplier, and the ice heat structure characteristic length could all affect the results. Thermal/hydraulic parameters such as control volume equilibrium assumptions, flow loss coefficients, and the bubble rise model were found to affect the results less significantly. MELCOR results were not machine dependent for this problem.

Lawrence Livermore National Laboratory (LLNL) has conducted a long-term single-pass continuous-flow (SPCF) leaching test of the glass waste form PNL 76-68. Leaching rates of Np, Pu, and various stable elements were measured at 25°C and 75°C with three different solutions and three different flow rates. The purposes of the study were: (1) to compare SPCF leaching results with the results of a modified IAEA leach test performed by Pacific Northwest Laboratories (PNL); (2) to establish elemental leach rates and their variation with temperature, flow rate, and solution composition; and (3) to gain insight into the leaching mechanisms. The LLNL and PNL leach tests yielded results which appear to agree within experimental uncertainties. The magnitude of the leach rates determined for Np and the glass matrix elements is 10⁻⁵ grams of glass per cm² of geometric solid surface area per day. The rates increase with temperature and with solution flow rate, and are similar in brine and distilled water but higher in a bicarbonate solution. Other cations exhibit somewhat different behavior, and Pu in particular yields a much lower apparent leach rate, probably because of sorption or precipitation effects after release from the glass matrix. After the initial few days, most elements are leached at a constant rate. Matrix dissolution appears to be the most probable rate-controlling step for the leaching of most elements.

A long-term single-pass continuous-flow (SPCF) leaching test was conducted on the glass waste form PNL 76-68. Leaching rates of Np, Pu, and various stable elements were measured at 25°C and 75°C with three different solutions and three different flow rates. The SPCF leaching results were compared with results of a modified IAEA leach test performed by Pacific Northwest Laboratories (PNL). Elemental leach rates and their variation with temperature, flow rate, and solution composition were established. The LLNL and PNL leach test results appear to agree within experimental uncertainties. The magnitude of the leach rates determined for Np and the glass matrix elements is 10⁻⁵ grams of glass per cm² of geometric solid surface area per day. The rates increase with temperature and with solution flow rate, and are similar in brine and distilled water but higher in a bicarbonate solution. Other cations exhibit somewhat different behavior, and Pu in particular yields a much lower apparent leach rate, probably because of sorption or precipitation effects after release from the glass matrix. After the initial few days, most elements are leached at a constant rate. Matrix dissolution appears to be the most probable rate-controlling step for the leaching of most elements. 23 figures, 12 tables.
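For reference, leach rates in these units are typically computed as an element's normalized mass loss per unit area per unit time. A minimal sketch, with assumed sample values (not data from these tests):

```python
# Normalized elemental leach rate in the units quoted above
# (g glass / cm^2 / day); variable names are ours, not the report's.
def normalized_leach_rate(mass_released_g, mass_fraction_in_glass,
                          surface_area_cm2, days):
    """Grams of glass dissolved per cm^2 of geometric surface per day,
    inferred from the measured release of one element."""
    return mass_released_g / (mass_fraction_in_glass * surface_area_cm2 * days)

# Example: 2 micrograms of an element making up 10 wt% of the glass,
# released from a 5 cm^2 sample over 4 days -> 1e-6 g glass/cm^2/day.
print(normalized_leach_rate(2e-6, 0.10, 5.0, 4.0))
```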

A task of evaluating program performance often occurs in the process of designing computer systems or during iterative compilation. A traditional way to solve this problem is emulation of program execution on the target system. A modern alternative approach to evaluating program performance is based on statistical modeling of program performance on the computer under investigation. This statistical method of modeling program performance, called Velocitas, is introduced in this work. The method and its implementation in the Adaptor framework are presented. An investigation of the method's effectiveness showed high adequacy of program performance prediction.
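The statistical idea can be illustrated generically (this is not the Velocitas method itself, whose details are not reproduced here): fit a model mapping simple program features to measured runtimes, then use it to predict the runtime of a new program on the target machine.

```python
# Generic statistical performance model: linear least-squares fit from
# program features to measured runtimes (all numbers are hypothetical).
import numpy as np

# Training data: [instruction count, memory accesses, branches] per program.
features = np.array([[1e9, 2e8, 5e7],
                     [5e8, 1e8, 2e7],
                     [2e9, 6e8, 9e7]])
runtimes = np.array([1.2, 0.55, 2.9])  # seconds, measured on the target

coeffs, *_ = np.linalg.lstsq(features, runtimes, rcond=None)

# Predict the runtime of an unseen program from its feature vector.
new_program = np.array([8e8, 3e8, 4e7])
print("predicted runtime:", new_program @ coeffs, "s")
```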

The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. Hydrologic and transport models are available at several levels of complexity or sophistication. Model selection and use are determined by the quantity and quality of input data. Model development under AEGIS and related programs provides three levels of hydrologic models, two levels of transport models, and one level of dose models (with several separate models). This is the third of three volumes describing the VTT (Variable Thickness Transient) Groundwater Hydrologic Model, a second-level (intermediate complexity), two-dimensional model of saturated groundwater flow.

Constraint programming can definitely be seen as a model-driven paradigm. Users write programs for modeling problems. These programs are mapped to executable models to calculate the solutions. This paper focuses on efficient model management (definition and transformation). From this point of view, we propose to revisit the design of constraint-programming systems. A model-driven architecture is introduced to map solving-independent constraint models to solving-dependent decision models. Several important questions are examined, such as the need for a visual high-level modeling language and the quality of metamodeling techniques to implement the transformations. A main result is the s-COMMA platform, which efficiently implements the chain from modeling to solving constraint problems.
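A toy example of the "executable model" end of this chain (purely illustrative; s-COMMA maps high-level models to real solvers rather than to brute-force search):

```python
# A declarative constraint model made executable by exhaustive search:
# find all x, y, z in 1..5 with x + y == z and x < y.
from itertools import product

domain = range(1, 6)
constraints = [lambda x, y, z: x + y == z,
               lambda x, y, z: x < y]

solutions = [(x, y, z) for x, y, z in product(domain, repeat=3)
             if all(c(x, y, z) for c in constraints)]
print(solutions)  # e.g. (1, 2, 3), (1, 3, 4), ...
```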

Introduction to sequential decision processes covers the use of dynamic programming in studying models of resource allocation, methods for approximating solutions of control problems in continuous time, production control, and more. 1982 edition.
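A small dynamic-programming sketch in the spirit of the resource-allocation models mentioned above (our own toy example, not taken from the book): allocate a budget across projects to maximize total reward.

```python
# DP over (project index, remaining budget): reward[i][k] is the payoff
# of giving k units (0..3) to project i.
from functools import lru_cache

reward = [
    [0, 3, 5, 6],   # project 0 payoff for 0..3 units
    [0, 2, 4, 7],   # project 1
    [0, 4, 6, 6],   # project 2
]
B = 5  # total budget

@lru_cache(maxsize=None)
def best(i, budget):
    """Max total reward using projects i.. with the given remaining budget."""
    if i == len(reward):
        return 0
    return max(reward[i][k] + best(i + 1, budget - k)
               for k in range(min(budget, 3) + 1))

print(best(0, B))  # optimal value for budget B
```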

According to current guideline recommendations extracorporeal shock wave lithotripsy (SWL) remains the first choice treatment for small and mid-sized renal calculi. However, the results of SWL treatment for lower pole stones can be disappointing whilst more invasive endoscopic modalities, such as flexible ureterorenoscopy (fURS) and percutaneous nephrolithotomy (PNL) are often considered more effective. This article summarizes a point-counterpoint discussion at the 9th eULIS symposium in Como, Italy, and discusses the potential advantages and disadvantages of the different therapeutic approaches.

Purpose: The aim of this study was to evaluate the clinical outcomes of PNL in comparison with laparoscopic ureterolithotomy (LUL) for proximal ureteral stones larger than 1 cm. Materials and Methods: A total of 80 patients who were candidates for treatment of large ureteral stones in our urology center were enrolled in the study between September 2004 and September 2008. By patient randomization, they were assigned into two forty-patient groups (PNL and LUL). After evaluating the patients with laboratory tests and IVP, PNL was performed under sonography guidance in the prone position, or the patients underwent classic transperitoneal laparoscopic ureterolithotomy (LUL). All patients underwent postoperative assessments including KUB and ultrasonography. Results: A hundred-percent success rate was achieved in both groups. The mean ages of the patients were 39.4 (16-63) and 35.2 (18-57) years in the PNL and LUL groups, respectively. The mean stone size in the PNL group was 14.2 (10-25) mm and in the LUL group was 13.5 (10-28) mm. The durations of the operations were 54.35 (50-82) minutes and 82.15 (73-180) minutes (p < 0.0001), and the average hospital stays were 2.6 (2-5) and 3.5 (3-8) days (p = 0.011) in the PNL and LUL groups, respectively. The mean Hb decrease in the PNL group was 0.9 mg/dL and in the LUL group was 0.4 mg/dL (p = 0.001). No statistically significant differences in terms of blood transfusion, fever, ICU admission, and prolonged urinary leakage were detected between the two groups. Conclusion: According to our study, percutaneous nephrolithotomy under ultrasonography guidance is comparable with laparoscopic ureterolithotomy for the treatment of proximal ureteral stones larger than 1 cm.

This study aimed to compare the success and complication rates of the mini-PNL (MPNL) procedure in the supine and prone positions. In this retrospective study, data from 180 patients treated with MPNL in either the supine (n = 54) or prone (n = 126) position between May 2009 and August 2014 were investigated. Success was defined as no visible stones >2 mm. Perioperative complications were classified using the modified Clavien system. Groups were compared with the Chi-square test or Student's t test, and a p value of 0.05 was accepted for statistical significance. The mean age of the population was 42.5 ± 8.2 years and the mean stone size was 23.9 ± 4.1 mm. The two groups were similar with regard to demographic and stone-related characteristics except for ASA status. Success rates of the supine and prone groups were 85.1 and 87.3%, respectively (p = 0.701). No statistically significant differences in terms of complications were observed. Mean operative time was the only parameter that differed between the two groups (55 vs 82 min, p = 0.001). The supine position for PNL seems promising, and the complication and success rates are shown to be similar to those of the prone position with the MPNL technique. The only significant benefit of this technique is shorter operative time.

Java Pathfinder (JPF) is a verification and testing environment for Java that integrates model checking, program analysis, and testing. JPF consists of a custom-made Java Virtual Machine (JVM) that interprets bytecode, combined with a search interface to allow the complete behavior of a Java program to be analyzed, including interleavings of concurrent programs. JPF is implemented in Java, and its architecture is highly modular to support rapid prototyping of new features. JPF is an explicit-state model checker, because it enumerates all visited states and, therefore, suffers from the state-explosion problem inherent in analyzing large programs. It is suited to analyzing programs less than 10 kLOC, but has been successfully applied to finding errors in concurrent programs up to 100 kLOC. When an error is found, a trace from the initial state to the error is produced to guide the debugging. JPF works at the bytecode level, meaning that all of Java can be model-checked. By default, the software checks for all runtime errors (uncaught exceptions), assertion violations (it supports Java's assert), and deadlocks. JPF uses garbage collection and symmetry reductions of the heap during model checking to reduce state-explosion, as well as dynamic partial order reductions to lower the number of interleavings analyzed. JPF is capable of symbolic execution of Java programs, including symbolic execution of complex data such as linked lists and trees. JPF is extensible as it allows for the creation of listeners that can subscribe to events during searches. The creation of dedicated code to be executed in place of regular classes is supported, allowing users to easily handle native calls and to improve the efficiency of the analysis.

This thesis concerns two approaches to program analysis: model checking and abstract interpretation. Model checking views the program as a finite automaton and tries to prove logical properties over the automaton model, or to present a counter-example if this is not possible, with a focus on precision. Abstract interpretation translates the program semantics into abstract semantics; each approach can solve the same problems as the other by a reformulation. This thesis argues that there is even a convergence on the practical level, and that a generalisation of the formalism of timed automata into lattice automata captures key aspects of both methods; indeed, model checking of timed automata can be formulated in terms of an abstract interpretation. For the generalisation to lattice automata to have benefit, it is important that efficient tools exist. This thesis presents multi-core tools for efficient and scalable reachability and Büchi emptiness checking of timed/lattice automata. Finally, a number of case studies are presented.

The Inert Electrodes Program, being conducted by Pacific Northwest Laboratory (PNL), involves improving the Hall-Heroult cells used by the Aluminum Industry for the electrochemical production of aluminum. The PNL research centers on developing more energy efficient, longer-lasting anodes and cathodes and ancillary equipment. Major accomplishments for Fiscal Year 1988 are summarized below. 14 refs., 56 figs., 9 tabs.

MATRA-S, a subchannel analysis code, has been used for the thermal-hydraulic design of the SMART core. As safety enhancement becomes more and more important, some features of the MATRA-S code need to be validated so that it can be applied to non-nominal operating conditions in addition to reactor design under normal operating conditions. The MATRA-S code has two numerical schemes: SCHEME for implicit application and XSCHEM for explicit application. The implicit scheme was developed under the assumption that the axial flow is much larger than the crossflow. Under certain conditions, especially low flow and low pressure operating conditions, the implicit SCHEME oscillates or becomes numerically unstable, and MATRA-S then fails to obtain a good solution. These demerits are known to be common to the implicit schemes of many COBRA families. Efforts have been made to resolve these limitations in SCHEME of MATRA-S, such as a once-through marching scheme in place of the multi-pass marching scheme and an adaptive multi-grid method. These remedies can reduce the numerically unstable range for SCHEME, but some unstable regions still remain. XSCHEM, the explicit scheme of MATRA-S, was validated using the PNL 2x6 rod bundle flow transient test. The explicit scheme agreed with the implicit scheme for steady-state calculations, and it showed its capability to predict low flow conditions such as negative flow and recirculation flow.
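The stability behavior that motivates such remedies can be demonstrated on a toy problem (a generic illustration, unrelated to MATRA-S internals): for 1-D diffusion, an explicit forward-Euler step blows up above its time-step limit, while an implicit backward-Euler step stays bounded at the same step size.

```python
# 1-D diffusion u_t = a*u_xx on a periodic grid, explicit vs implicit.
import numpy as np

n, a, dx = 50, 1.0, 1.0 / 50
dt = 2.0 * dx * dx / a          # deliberately above the explicit limit dx^2/(2a)
r = a * dt / dx**2              # r = 2 here; explicit scheme needs r <= 1/2

u0 = np.zeros(n); u0[n // 2] = 1.0

# Explicit (forward Euler): grows without bound for this dt.
u = u0.copy()
for _ in range(200):
    u = u + r * (np.roll(u, 1) - 2 * u + np.roll(u, -1))
print("explicit max |u|:", np.abs(u).max())

# Implicit (backward Euler): solve (I - r*L) u_new = u each step.
L = -2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
L[0, -1] = L[-1, 0] = 1.0       # periodic boundary
A = np.eye(n) - r * L
u = u0.copy()
for _ in range(200):
    u = np.linalg.solve(A, u)
print("implicit max |u|:", np.abs(u).max())  # stays bounded
```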

Julia is a young high-performance dynamic programming language for scientific computations. It provides an extensive mathematical function library, a clean syntax, and its own parallel execution model. We developed 2D wave equation modeling programs using the Julia and C programming languages and compared their performance, using the same modeling algorithm for both programs. We used Julia version 0.3.9 in this comparison. We declared the data types of function arguments and used the inbounds macro in the Julia program. Numerical results showed that the C programs compiled with the Intel and GNU compilers were faster than the Julia program, by about 18% and 7%, respectively. Taking the simplicity of a dynamic programming language into consideration, Julia can be a novel alternative to existing statically typed programming languages.
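The modeling algorithm in question is a standard second-order finite-difference time step; a compact Python rendering of the same idea (our own parameter choices, not the paper's Julia or C code) looks like this:

```python
# 2-D acoustic wave equation, second-order finite differences in
# space and time, on a uniform grid.
import numpy as np

nx = nz = 200
c, dx, dt = 1500.0, 5.0, 0.0015   # velocity (m/s), grid step (m), time step (s)
r2 = (c * dt / dx) ** 2           # squared Courant number; stable if < 0.5 in 2-D

u_prev = np.zeros((nz, nx))
u = np.zeros((nz, nx))
u[nz // 2, nx // 2] = 1.0         # point-source initial condition

for _ in range(500):
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
    u_next = 2.0 * u - u_prev + r2 * lap   # leapfrog time update
    u_prev, u = u, u_next

print("wavefield rms:", np.sqrt((u ** 2).mean()))
```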

This report was prepared by the Applied Research Corporation (ARC), College Station, Texas, under subcontract to Pacific Northwest Laboratory (PNL) as part of a global climate studies task. The task supports site characterization work required for the selection of a potential high-level nuclear waste repository and is part of the Performance Assessment Scientific Support (PASS) Program at PNL. The work is under the overall direction of the Office of Civilian Radioactive Waste Management (OCRWM), US Department of Energy Headquarters, Washington, DC. The scope of the report is to present the results of the third year's work on the atmospheric modeling part of the global climate studies task. The development testing of computer models and initial results are discussed. The appendices contain several studies that provide supporting information and guidance to the modeling work and further details on computer model development. Complete documentation of the models, including user information, will be prepared under separate reports and manuals.

This paper deals with the construction and use of simple synthetic programs that model the behavior of more complex, real parallel programs. Synthetic programs can be used in many ways: to construct an easily ported suite of benchmark programs, to experiment with alternate parallel implementations of a program without actually writing them, and to predict the behavior and performance of an algorithm on a new or hypothetical machine. Synthetic programs are constructed easily from scratch or from existing programs, and can even be constructed using nothing but information obtained from traces of the real program's execution.
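A toy generator of such a synthetic program might look as follows (our own sketch; the per-phase numbers stand in for statistics extracted from a trace of the real program):

```python
# Reproduce a real program's phase behavior from trace-derived parameters
# (compute time and wait time per phase), without the real code.
import time

# Hypothetical numbers extracted from a trace:
# (seconds of CPU work, seconds of I/O or communication wait) per phase.
trace_profile = [(0.10, 0.02), (0.30, 0.00), (0.05, 0.10)]

def busy(seconds):
    """Burn CPU for roughly the requested wall time."""
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        pass

def synthetic_program(profile):
    for compute_s, wait_s in profile:
        busy(compute_s)       # stands in for the phase's computation
        time.sleep(wait_s)    # stands in for its I/O or communication

start = time.perf_counter()
synthetic_program(trace_profile)
print("synthetic runtime:", time.perf_counter() - start, "s")
```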

SIMULA was a language for modeling and programming that provided a unified approach to the two, in contrast to methodologies based on structured analysis and design. The current development seems to be going in the direction of a separation of modeling and programming. The goal of this paper is to go back to the future, get inspiration from SIMULA, and propose a unified approach. In addition to reintroducing the contributions of SIMULA and the Scandinavian approach to object-oriented programming, we do this by discussing a number of issues in modeling and programming and arguing why we should have a unified approach.

Autographa californica nucleopolyhedrovirus (AcMNPV) ORF 86, located within the HindIII C fragment, potentially encodes a protein which shares sequence similarity with two T4 bacteriophage gene products, RNA ligase and polynucleotide kinase. This AcMNPV gene has been designated pnk/pnl but has yet to be assigned a function in virus replication. It has been classified as an immediate early virus gene, since the promoter was active in uninfected insect cells and mRNA transcripts were detectable from 4 to 48 h post-infection and in the presence of cycloheximide or aphidicolin in virus-infected cells. The extremities of the transcript have been mapped by primer extension and 3' RACE-PCR to positions -18 from the translational start codon and +15 downstream of the stop codon. The function of pnk/pnl was investigated by producing a recombinant virus (Acdel86lacZ) with the coding region replaced with that of lacZ. This virus replicated normally in Spodoptera frugiperda (Sf 21) cells, indicating that pnk/pnl is not essential for propagation in these cells. Virus protein production in Acdel86lacZ-infected Sf 21 cells also appeared to be unaffected, with normal synthesis of the IE-1, GP64, VP39 and polyhedrin proteins. Shut-down of host protein synthesis was not abolished in recombinant infection. When other baculovirus genomes were examined for the presence of pnk/pnl by restriction enzyme digestion and PCR, a deletion was found in AcMNPV 1.2, Galleria mellonella NPV (GmMNPV) and Bombyx mori NPV (BmNPV), suggesting that in many isolates this gene has either never been acquired or has been lost during genome evolution. This is one of the first baculovirus immediate early genes that appears to be nonessential for virus survival.

In this report we present a type graph that models all executable constructs of the Java programming language. Such a model is useful for any graph-based technique that relies on a representation of Java programs as graphs. The model can be regarded as a common representation to which all Java syntax

In this work we present a type graph that models all executable constructs of the Java programming language. Such a model is useful for any graph-based technique that relies on a representation of Java programs as graphs. The model can be regarded as a common representation to which all Java syntax

In this report we present a type graph that models all executable constructs of the Java programming language. Such a model is useful for any graph-based technique that relies on a representation of Java programs as graphs. The model can be regarded as a common representation to which all Java syntax

Predicting the success of students participating in introductory programming courses has been an active research area for more than 25 years. Until recently, no variables or tests had any significant predictive power. However, Dehnadi and Bornat claim to have found a simple test for programming aptitude that cleanly separates programming sheep from non-programming goats. We briefly present their theory and test instrument. We have repeated their test in our local context in order to verify and perhaps generalise their findings, but we could not show that the test predicts students' success in our introductory programming course. Based on this failure of the test instrument, we discuss various explanations for our differing results and suggest a research method from which it may be possible to generalise local results in this area. Furthermore, we discuss and criticize Dehnadi and Bornat...

The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. Hydrologic and transport models are available at several levels of complexity or sophistication. Model selection and use are determined by the quantity and quality of input data. Model development under AEGIS and related programs provides three levels of hydrologic models, two levels of transport models, and one level of dose models (with several separate models). This document describes the FE3DGW (Finite Element, Three-Dimensional Groundwater) Hydrologic Model, a third-level (high complexity), three-dimensional finite element approach (Galerkin formulation) to saturated groundwater flow.

Increasing attention has been focused on the degree to which social programs deliver services effectively and efficiently. Using the differential program evaluation model of Tripodi, Fellin, and Epstein (1978) and of Bielawski and Epstein (1984), this paper describes the application of this model to evaluating a multidisciplinary clinical consultation practice in child protection. The paper discusses the uses of the model by demonstrating them through the four stages of program initiation, contact, implementation, and stabilization. This organizational case study contributes to the model by introducing essential and interrelated elements of a "practical evaluation" methodology for evaluating social programs, such as a participatory evaluation approach; learning, empowerment, and sustainability; and a flexible, individualized approach to evaluation. The study results demonstrated that by applying the program development model, child-protective administrators and practitioners were able to evaluate existing practices and recognize areas for program improvement.

This document begins by defining and discussing educational planning. A brief overview of mathematical programming, with an explanation of the general linear programming model, is then provided. Some recent applications of mathematical programming techniques to educational planning problems are reviewed, and their implications for educational research…
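As a concrete flavor of the general linear programming model in an educational-planning setting, here is a minimal sketch (our own illustrative model and numbers, not one from the document):

```python
# Choose how many sections of two course types to offer, maximizing
# enrollment served subject to budget and classroom limits.
from scipy.optimize import linprog

# maximize 30*x1 + 45*x2  ->  linprog minimizes, so negate the objective.
c = [-30, -45]                      # enrollment served per section
A_ub = [[2000, 3500],               # cost per section ($)  <= 20000
        [1, 1]]                     # classrooms used       <= 8
b_ub = [20000, 8]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)              # optimal sections and enrollment served
```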

Parallel programming is more difficult than sequential programming in part because of the complexity of reasoning, testing, and debugging in the context of concurrency. In this thesis, we present and investigate a parallel programming model that provides direct control of parallelism in a notation

An accessible treatment of the modeling and solution of integer programming problems, featuring modern applications and software. In order to fully comprehend the algorithms associated with integer programming, it is important to understand not only how algorithms work, but also why they work. Applied Integer Programming features a unique emphasis on this point, focusing on problem modeling and solution using commercial software. Taking an application-oriented approach, this book addresses the art and science of mathematical modeling related to the mixed integer programming (MIP) framework and
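A tiny instance of the MIP framework is the 0/1 knapsack; the sketch below (our own example, solved by brute-force enumeration rather than the commercial software the book uses) shows the modeling pattern:

```python
# 0/1 knapsack as an integer program:
#   maximize sum(v[i]*x[i])  s.t.  sum(w[i]*x[i]) <= W,  x[i] in {0, 1}.
from itertools import product

values  = [10, 13, 7, 8]
weights = [4, 6, 3, 5]
W = 10

best_x, best_val = None, -1
for x in product((0, 1), repeat=len(values)):
    if sum(w * xi for w, xi in zip(weights, x)) <= W:   # feasibility
        val = sum(v * xi for v, xi in zip(values, x))   # objective
        if val > best_val:
            best_x, best_val = x, val

print(best_x, best_val)  # optimal selection and objective value
```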

This paper describes a translator called JAVA PATHFINDER from JAVA to PROMELA, the "programming language" of the SPIN model checker. The purpose is to establish a framework for verification and debugging of JAVA programs based on model checking. This work should be seen as part of a broader attempt to make formal methods applicable "in the loop" of programming within NASA's areas such as space, aviation, and robotics. Our main goal is to create automated formal methods such that programmers themselves can apply them in their daily work (in the loop) without the need for specialists to manually reformulate a program into a different notation in order to analyze it. This work is a continuation of an effort to formally verify, using SPIN, a multi-threaded operating system programmed in Lisp for the Deep Space 1 spacecraft, and of previous work in applying existing model checkers and theorem provers to real applications.

We observed the effect of fault finiteness in the Pnl waveforms recorded at regional distances (4° to 12°) for the Mw 6.5 San Simeon earthquake of 22 December 2003. We aimed to include more of the high frequencies (2 seconds and longer periods) than studies that use regional data for focal solutions (5 to 8 seconds and longer periods). We calculated 1-D synthetic seismograms for the Pnl portion for both a point source and a finite fault solution. Comparison of the point source and finite fault waveforms with the data shows that the first several seconds of the point source synthetics have considerably higher amplitude than the data, while the finite fault does not have a similar problem. This can be explained by reversely polarized depth phases overlapping with the P waves from the later portion of the fault, causing smaller amplitudes at the beginning of the seismogram. This is clearly a finite fault phenomenon and therefore cannot be explained by point source calculations. Moreover, the point source synthetics, which are calculated with a focal solution from a long period regional inversion, overestimate the amplitude by three to four times relative to the data, while the finite fault waveforms have amplitudes similar to the data. Hence, a moment estimation based only on the point source solution of the regional data could have been wrong by half a magnitude unit. We have also calculated the shifts of the synthetics relative to the data to fit the seismograms. Our results reveal that the paths from Central California to the south are faster than the paths to the east and north. The P wave arrival at the TUC station in Arizona is 4 seconds earlier than predicted by the Southern California model, while most stations to the east are delayed by around 1 second. The observed higher uppermost mantle velocities to the south are consistent with some recent tomographic models. Synthetics generated with these models significantly improve the fits and the

Program slicing can be effectively used to debug, test, analyze, understand, and maintain object-oriented software. In this paper, a new slicing model is proposed to slice Java programs based on their inherent hierarchical structure. The main idea of hierarchical slicing is to slice programs in a stepwise way, from package level to class level, method level, and finally statement level. The stepwise slicing algorithm and the related graph reachability algorithms are presented, the architecture of the Java program Analyzing Tool (JATO) based on the hierarchical slicing model is provided, and the applications and a small case study are discussed.
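
A sketch of the graph-reachability core of slicing, under the usual definition that a backward slice contains everything that can reach the criterion in a dependence graph. Hierarchical slicing applies the same reachability first to a coarse package/class graph and then only expands the parts that survive; the toy graph below is invented, and JATO's real graphs are much richer.

```python
from collections import defaultdict, deque

def backward_slice(edges, criterion):
    """edges: iterable of (src, dst) dependence edges; returns the set of
    nodes from which `criterion` is reachable, plus the criterion itself."""
    preds = defaultdict(set)
    for src, dst in edges:
        preds[dst].add(src)
    seen, work = {criterion}, deque([criterion])
    while work:                      # plain BFS over reversed edges
        node = work.popleft()
        for p in preds[node] - seen:
            seen.add(p)
            work.append(p)
    return seen

# Statement-level toy example: 1 defines x, 2 defines y from x, 3 prints y;
# an unrelated statement has no path to the criterion and is excluded.
deps = [(1, 2), (2, 3)]
print(sorted(backward_slice(deps, criterion=3)))   # -> [1, 2, 3]
```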

99mTc-DMSA renal scintigraphy was used to investigate the influence of ESWL on renal function in comparison with that of PNL. First, the reproducibility of the renal uptake rate measured by scintigraphy was examined in eleven healthy volunteers under both non-diuretic and diuretic states. The renal uptake rate was shown to be sufficiently reproducible in the same person across the two trials; the differences and standard deviations amounted to only a few percent and were not statistically significant. Changes in the repeated renal uptake rate seem to reflect not only changes of renal function with treatment but also some technical error. Therefore, to investigate changes in renal function on the treated side, the uptake ratio (uptake rate on the treated side divided by the uptake rate on the contralateral side) was used instead of the uptake rate. Renal scintigraphy was carried out in 48 patients with unilateral renal stones before and after ESWL monotherapy, PNL monotherapy, or combined ESWL and PNL therapy. Within one week of treatment, the uptake ratio significantly decreased in patients treated with PNL or with combined ESWL and PNL, although the DMSA uptake rate on the treated side did not change significantly. Neither the renal uptake rate nor the uptake ratio changed significantly after ESWL treatment. There was no significant difference in the changes of the uptake ratio between the Siemens Lithostar Plus and the improved Dornier HM-3 lithotriptors. This study indicated that ESWL monotherapy did not affect the uptake ratio, whereas PNL monotherapy and combined ESWL and PNL therapy may affect it to some extent.

Extreme programming is one of the most commonly used agile methodologies in software development. It is very responsive to changing requirements, even in the late phases of a project. However, quality activities in the extreme programming phases are executed sequentially alongside the activities that address functional requirements. This reduces the agility of delivering increments continuously and creates an inverse relationship between quality and agility. Because of this relationship, extreme programming does not devote enough time to extensive documentation and robust design. To overcome these issues, an enhanced extreme programming model is proposed. Enhanced extreme programming introduces parallelism by moving quality activities into a separate execution line. In this way, the focus on delivering increments quickly is maintained without affecting the quality of the final output. In enhanced extreme programming, the quality concept is extended to include refinement of all phases of classical extreme programming and the creation of an architectural design based on the refined design documents.

Recent years have seen an impressive increase in the use of Integer Programming models for the solution of optimization problems originating in Molecular Biology. In this survey, some of the most successful Integer Programming approaches are described, and a broad overview of application areas in modern Computational Molecular Biology is given.

This report describes the history of the Pacific Northwest Laboratory`s (PNL`s) work in development of energy standards for commercial and residential construction in the United States. PNL`s standards development efforts are concentrated in the Building Energy Standards Program (the Program), which PNL conducts for the U.S. Department of Energy (DOE) Office of Codes and Standards. The Program has worked with DOE, the American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc. (ASHRAE), and other building codes and standards organizations to develop, evaluate, and promulgate energy standards in all sectors of the building industry. This report describes the recent history of U.S. code development and PNL`s contributions through the 1980s and early 1990s, up to the passage of the Energy Policy Act of 1992. Impacts to standards development resulting from the passage of this act will be described in other reports.

Carrier plate assemblies of NASA Space Shuttle thermal protection system provided for easy access to protected vital parts of Shuttle. Each assembly mounted on substructure with fasteners through holes in protective tiles. Automatic System of Kinematic Analysis (ASKA) finite-element program evaluates these assemblies. PLATEFORT computer program developed as data generator for ASKA modeling. PLATEFORT greatly reduces amount of time and data required for building ASKA model of these assemblies.

This report was prepared at the Lamont-Doherty Geological Observatory of Columbia University at Palisades, New York, under subcontract to Pacific Northwest Laboratory. It is part of a larger project of global climate studies which supports site characterization work required for the selection of a potential high-level nuclear waste repository and forms part of the Performance Assessment Scientific Support (PASS) Program at PNL. The work under the PASS Program is currently focusing on the proposed site at Yucca Mountain, Nevada, and is under the overall direction of the Yucca Mountain Project Office, US Department of Energy, Las Vegas, Nevada. The final results of the PNL project will provide input to global atmospheric models designed to test specific climate scenarios, which will be used in the site-specific modeling work of others. The primary purpose of the data bases compiled and of the astronomic predictive models is to aid in the estimation of the probabilities of future climate states. The results will be used by two other teams working on the global climate study under contract to PNL, located at the University of Maine in Orono, Maine, and the Applied Research Corporation in College Station, Texas. This report presents the results of the third year`s work on the global climate change models and the data bases describing past climates.

To evaluate the true necessity of open-end ureteral catheter insertion in patients with moderate to severe pelvicalyceal system dilation treated with percutaneous nephrolithotomy (PNL) under sonographic guidance, 50 cases treated with PNL under sonographic guidance in the prone position for solitary obstructing renal stones were evaluated. Patients were randomly divided into two groups. Group 1: patients in whom an open-end ureteral catheter was inserted prior to the procedure; Group 2: patients receiving no catheter before PNL. In addition to the duration of the procedure as a whole and of all its relevant stages, radiation exposure time, hospitalization period, mean nephrostomy tube duration, mean drop in Hb levels, and all intra- and postoperative complications were evaluated. Mean stone size was 308.5 ± 133.2 mm2. Mean total duration of the PNL procedure in cases with an open-end ureteral catheter was significantly longer than in the other cases (p < 0.001). Evaluation of the outcomes revealed no statistically significant difference between the two groups in stone-free rates (86% vs 84%). Additionally, there was no significant difference with respect to nephrostomy tube duration, hospitalization period, need for secondary procedures, complication rates, or postoperative Hb drop (p = 0.6830). Our results indicate that placement of an open-end ureteral catheter prior to a PNL procedure performed under sonographic guidance may not be indicated in selected cases presenting with solitary obstructing renal pelvic and/or calyceal stones.

Multiphysics and multiscale simulation systems share a common software requirement: infrastructure to implement data exchanges between their constituent parts, often called the coupling problem. On distributed-memory parallel platforms, the coupling problem is complicated by the need to describe, transfer, and transform distributed data, known as the parallel coupling problem. Parallel coupling is emerging as a new grand challenge in computational science as scientists attempt to build multiscale and multiphysics systems on parallel platforms. An additional coupling problem in these systems is language interoperability between their constituent codes. We have created a multilingual parallel coupling programming model based on a successful open-source parallel coupling library, the Model Coupling Toolkit (MCT). This programming model's capabilities reach beyond MCT's native Fortran implementation to include bindings for the C++ and Python programming languages. We describe the method used to generate the interlanguage bindings. This approach enables an object-based programming model for implementing parallel couplings in non-Fortran coupled systems and in systems with language heterogeneity. We describe the C++ and Python versions of the MCT programming model and provide short examples. We report preliminary performance results for the MCT interpolation benchmark. We describe a major Python application that uses the MCT Python bindings: a Python implementation of the control and coupling infrastructure for the Community Climate System Model. We conclude with a discussion of the significance of this work to productivity computing in multidisciplinary computational science.
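
A conceptual sketch of the coupling problem MCT addresses, not MCT's actual API: two components discretize the same field on different grids, and a coupler transforms one representation into the other, typically via a sparse interpolation matrix built once and reused every time step. All names and grids below are invented for illustration.

```python
import numpy as np

src_grid = np.linspace(0.0, 1.0, 11)        # e.g., an atmosphere grid
dst_grid = np.linspace(0.0, 1.0, 7)         # e.g., an ocean grid
field_src = np.sin(2 * np.pi * src_grid)    # field owned by component A

# Build a (dst x src) linear-interpolation matrix once; the coupler
# reapplies it at every exchange (the "data transform" step).
interp = np.zeros((dst_grid.size, src_grid.size))
for i, x in enumerate(dst_grid):
    j = min(np.searchsorted(src_grid, x), src_grid.size - 1)
    if j == 0 or src_grid[j] == x:
        interp[i, j] = 1.0                   # grid points coincide
    else:
        w = (x - src_grid[j - 1]) / (src_grid[j] - src_grid[j - 1])
        interp[i, j - 1], interp[i, j] = 1.0 - w, w

field_dst = interp @ field_src               # what component B receives
print(np.max(np.abs(field_dst - np.sin(2 * np.pi * dst_grid))))  # small
```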

The problem of evaluating an econometric model using values outside the range used in the model's estimation is illustrated by the evaluations of a residential load management program in each of two successive years. The analysis reveals that this extrapolation problem demands explicit attention. (Author/TJH)

-based scenarios. Language-based technologies have been suggested to support developers of those applications: the Decentralized Label Model and Secure Program Partitioning allow one to annotate programs with security specifications, and to partition the annotated program... across a set of hosts, obeying both the annotations and the trust relation between the principals. The resulting applications guarantee by construction that safety and confidentiality of both data and computations are ensured. In this work, we develop a generalised version...

Using a three-step laser saturation excitation technique, the saturation effects on the Ba 6pns (J = 1) and 6pnd (J = 1, 3) autoionization spectra are observed systematically in zero field. These saturation spectra are used to determine the high-n members of the 6pnl (l = 0, 2) autoionizing series and to analyse the channel interactions among the autoionizing series in zero field. Furthermore, the saturation excitation technique is applied in the electric-field case, in which the saturation spectra of Ba 6pnk (|M| = 0, 1) autoionizing Stark states are measured. Most of these saturation spectra are observed for the first time, to our knowledge; they indicate the mixing of the autoionizing states in electric fields.

The models proposed by many authors for predicting retention times and temperatures, peak widths, retention indices, and separation numbers in programmed-temperature and programmed-pressure gas chromatography, starting from preliminary measurements of retention under isothermal and isobaric conditions, are reviewed. Several articles showing the correlation between retention data and thermodynamic parameters, and the determination of the optimum programming rate, are reported. The columns of different polarity used for the experimental measurements and the main equations, mathematical models, and calculation procedures are listed. An empirical approach was used in the early models, followed by the application of thermodynamic considerations, iterative calculation procedures, and statistical methods based on the increased computing power now available. Multiple column arrangements, simultaneous temperature and pressure programming, and applications of two-dimensional and fast chromatography are summarised.
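
A sketch of the classic prediction scheme such reviews describe: isothermal retention factors k(T) are measured first and fitted to ln k = A + B/T; the retention time under a temperature program T(t) then follows from integrating dt / t_r(T(t)) = 1, where t_r(T) = t_m·(1 + k(T)) is the isothermal retention time at temperature T. The fit constants, hold-up time, and ramp below are invented for illustration.

```python
import math

A, B = -10.0, 4500.0      # hypothetical fit of ln k = A + B/T (T in K)
t_m = 60.0                # hold-up time in seconds (assumed constant)

def k(T):                 # isothermal retention factor at temperature T
    return math.exp(A + B / T)

def temp_program(t):      # linear ramp: 40 C start, 10 C/min
    return 313.15 + (10.0 / 60.0) * t

def predict_retention(dt=0.01):
    # Numerically accumulate the migration integral dt / t_r(T(t)) = 1.
    t, progress = 0.0, 0.0
    while progress < 1.0:
        progress += dt / (t_m * (1.0 + k(temp_program(t))))
        t += dt
    return t

print(f"predicted programmed-temperature retention: {predict_retention():.0f} s")
```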

We present a formulation of the problem of probabilistic model checking as one of query evaluation over probabilistic logic programs. To the best of our knowledge, our formulation is the first of its kind, and it covers a rich class of probabilistic models and probabilistic temporal logics. The inference algorithms of existing probabilistic logic-programming systems are well defined only for queries with a finite number of explanations. This restriction prohibits the encoding of probabilistic model checkers, where explanations correspond to executions of the system being model checked. To overcome this restriction, we propose a more general inference algorithm that uses finite generative structures (similar to automata) to represent families of explanations. The inference algorithm computes the probability of a possibly infinite set of explanations directly from the finite generative structure. We have implemented our inference algorithm in XSB Prolog, and use this implementation to encode probabilistic model...
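
The core computation behind a probabilistic model-checking query such as "eventually reach state G" on a discrete-time Markov chain: the reachability probabilities satisfy x_G = 1, x_trap = 0, and x_s = Σ_t P[s,t]·x_t for the remaining states, a linear system solved directly below instead of summing a possibly infinite set of path "explanations". The toy chain is invented.

```python
import numpy as np

P = np.array([                 # 4-state chain; rows sum to 1
    [0.5, 0.3, 0.2, 0.0],      # state 0
    [0.0, 0.2, 0.4, 0.4],      # state 1
    [0.0, 0.0, 1.0, 0.0],      # state 2: target G (absorbing)
    [0.0, 0.0, 0.0, 1.0],      # state 3: trap (never reaches G)
])
target = 2
unknown = [0, 1]               # states whose probability we solve for

# x[u] = sum_v P[u,v] x[v]  rearranges to  (I - P_uu) x = P[:, target].
A = np.eye(len(unknown)) - P[np.ix_(unknown, unknown)]
b = P[unknown, target]
x = np.linalg.solve(A, b)
print(dict(zip(unknown, np.round(x, 3))))   # -> {0: 0.7, 1: 0.5}
```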

There is a proliferation of medical devices across the globe for the diagnosis and therapy of diseases. Biomedical engineering (BME) plays a significant role in healthcare and advancing medical technologies, thus creating a substantial demand for biomedical engineers at undergraduate and graduate levels. There has been a surge in undergraduate programs due to increasing demands from the biomedical industries to cover many of their segments from bench to bedside. With the requirement of multidisciplinary training within an allottable duration, it is indeed a challenge to design a comprehensive standardized undergraduate BME program to suit the needs of educators across the globe. This paper's objective is to describe three major models of undergraduate BME programs and their curricular requirements, with relevant recommendations to be applicable in institutions of higher education located in varied resource settings. Model 1 is based on programs to be offered in large research-intensive universities with multiple focus areas. The focus areas depend on the institution's research expertise and training mission. Model 2 has basic segments similar to those of Model 1, but the focus areas are limited due to resource constraints. In this model, a co-op/internship in hospitals or medical companies is included, which prepares the graduates for the workplace. In Model 3, students earn an Associate Degree in the initial two years and are then trained for two more years to become BMEs or BME Technologists. This model is well suited for resource-poor countries. All three models must be designed to meet applicable accreditation requirements. The challenges in designing undergraduate BME programs include manpower, facility, and funding resource requirements and time constraints. Each academic institution has to carefully analyze its short-term and long-term requirements. In conclusion, three models for BME programs are described based on large universities, colleges, and

The theory of developmental programming suggests that diseases such as the metabolic syndrome may be 'programmed' by exposure to adverse stimuli during early development. The developmental programming literature encompasses the study of a wide range of suboptimal intrauterine environments in a variety of species and correlates these with diverse phenotypic outcomes in the offspring. At a molecular level, a large number of variables have been measured and suggested as the basis of the programmed phenotype. The range of both dependent and independent variables studied often makes the developmental programming literature complex to interpret and the drawing of definitive conclusions difficult. A common, though under-explored, theme of many developmental programming models is a sex difference in offspring outcomes. This holds true across a range of interventions, including dietary, hypoxic, and surgical models. The molecular and phenotypic outcomes of adverse in utero conditions are often more prominent in male than female offspring, although there is little consideration given to the basis for this observation in most studies. We review the evidence that maternal energy investment in male and female conceptuses may not be equal and may be environment dependent. It is suggested that male and female development could be viewed as separate processes from the time of conception, with differences in both timing and outcomes.

A nonlinear data modeling computer program, STEW, employing the Levenberg-Marquardt algorithm, has been developed to model the experimental ²³⁹Pu(n,f) and ²³⁵U(n,f) cross sections. This report presents results of the modeling of the ²³⁹Pu(n,f) and ²³⁵U(n,f) cross-section data. The calculation of the fission transmission coefficient is based on the double-humped fission barrier model of Bjornholm and Lynn. Incident neutron energies of up to 5 MeV are considered.
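
An illustrative Levenberg-Marquardt fit in the spirit of STEW, with no claim to its actual physics: adjust the parameters of a smooth model until it matches noisy "cross-section" data. The model below (a constant plus a broad resonance bump) is a stand-in; STEW's real calculation uses double-humped fission barriers.

```python
import numpy as np
from scipy.optimize import least_squares

def model(params, energy):
    base, height, center, width = params
    return base + height / (1.0 + ((energy - center) / width) ** 2)

rng = np.random.default_rng(0)
energy = np.linspace(0.1, 5.0, 60)                  # MeV
true = np.array([1.6, 0.6, 2.0, 0.8])               # invented "truth"
data = model(true, energy) + 0.02 * rng.standard_normal(energy.size)

def residuals(params):
    return model(params, energy) - data

# method="lm" selects the Levenberg-Marquardt algorithm (MINPACK).
fit = least_squares(residuals, x0=[1.0, 1.0, 1.5, 1.0], method="lm")
print("fitted parameters:", np.round(fit.x, 3))     # close to `true`
```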

We propose a static and a dynamic approach to model biological signaling networks and show how each can be used to answer relevant biological questions. For this, we use two different mathematical tools: Propositional Logic and Integer Programming. The power of discrete mathematics for handling qualitative as well as quantitative data has so far not been fully exploited in molecular biology, which is mostly driven by experimental research relying on first-order or statistical models. The resulting logic statements and integer programs are analyzed and can be solved with standard software. For a restricted class of problems, the logic models reduce to a polynomial-time-solvable satisfiability problem. Additionally, the dynamic model enables enumeration of possible time resolutions in poly-logarithmic time. Computational experiments are included.
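
A tiny version of the propositional-logic encoding idea: each species is a Boolean variable, each interaction a rule, and a steady state is an assignment satisfying every rule. Brute force over assignments stands in for a SAT or ILP solver; the network itself is a hypothetical toy.

```python
from itertools import product

species = ["ligand", "receptor", "kinase", "output"]

def steady_state(s):
    # Rules: receptor active iff ligand present; kinase follows receptor;
    # output follows kinase.
    return (s["receptor"] == s["ligand"]
            and s["kinase"] == s["receptor"]
            and s["output"] == s["kinase"])

for bits in product([False, True], repeat=len(species)):
    s = dict(zip(species, bits))
    if steady_state(s):
        print({name: int(v) for name, v in s.items()})
# Prints the two consistent states: everything off, or everything on.
```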

The term complex mixture has been recently applied to energy-related process streams, products and wastes that typically contain hundreds or thousands of individual organic compounds, like petroleum or synthetic fuel oils; but it is more generally applicable. A six-year program of ecological research has focused on four areas important to understanding the environmental behavior of complex mixtures: physicochemical variables, individual organism responses, ecosystems-level determinations, and metabolism. Of these areas, physicochemical variables and organism responses were intensively studied; system-level determinations and metabolism represent more recent directions. Chemical characterization was integrated throughout all areas of the program, and state-of-the-art methods were applied. 155 references, 35 figures, 4 tables.

How do we advance the environmental literacy of young people, support the next generation of environmental stewards and increase the diversity of the leadership of zoos and aquariums? We believe it is through ongoing evaluation of zoo and aquarium teen programming and have founded a consortium to pursue those goals. The Zoo and Aquarium Teen Program Assessment Consortium (ZATPAC) is an initiative by six of the nation's leading zoos and aquariums to strengthen institutional evaluation capacity, model a collaborative approach toward assessing the impact of youth programs, and bring additional rigor to evaluation efforts within the field of informal science education. Since its beginning in 2004, ZATPAC has researched, developed, pilot-tested and implemented a pre-post program survey instrument designed to assess teens' knowledge of environmental issues, skills and abilities to take conservation actions, self-efficacy in environmental actions, and engagement in environmentally responsible behaviors. Findings from this survey indicate that teens who join zoo/aquarium programs are already actively engaged in many conservation behaviors. After participating in the programs, teens showed a statistically significant increase in their reported knowledge of conservation and environmental issues and their abilities to research, explain, and find resources to take action on conservation issues of personal concern. Teens also showed statistically significant increases pre-program to post-program for various conservation behaviors, including "I talk with my family and/or friends about things they can do to help the animals or the environment," "I save water...," "I save energy...," "When I am shopping I look for recycled products," and "I help with projects that restore wildlife habitat."

PDDP, the parallel data distribution preprocessor, is a data parallel programming model for distributed memory parallel computers. PDDP implements High Performance Fortran-compatible data distribution directives and parallelism expressed by the use of Fortran 90 array syntax, the FORALL statement, and the WHERE construct. Distributed data objects belong to a global name space; other data objects are treated as local and replicated on each processor. PDDP allows the user to program in a shared memory style and generates code that is portable to a variety of parallel machines. For interprocessor communication, PDDP uses the fastest communication primitives on each platform.

It is a fact that component-oriented programming, well organized, can bring a large increase in efficiency in the development of large software systems. This paper proposes a model for building software systems by assembling components that can operate independently of each other. The model is based on a computing environment that runs parallel and distributed applications, and it introduces concepts such as the abstract aggregation scheme and the aggregation application. Basically, an aggregation application is an application obtained by combining corresponding components; in our model, an aggregation application is a word in a language.

The National Institute of Allergy and Infectious Diseases (NIAID) Radiation/Nuclear Medical Countermeasures Development Program has developed an integrated approach to providing the resources and expertise required for the research, discovery, and development of radiation/nuclear medical countermeasures (MCMs). These resources and services lower the opportunity costs and reduce the barriers to entry for companies interested in working in this area and accelerate translational progress by providing goal-oriented stewardship of promising projects. In many ways, the radiation countermeasures program functions as a "virtual pharmaceutical firm," coordinating the early and mid-stage development of a wide array of radiation/nuclear MCMs. This commentary describes the radiation countermeasures program and discusses a novel business model that has facilitated product development partnerships between the federal government and academic investigators and biopharmaceutical companies.

A single-pass continuous-flow leach test of PNL 76-68 glass beads (7 mm dia) was concluded after 420 days of uninterrupted operation. Variables included in the experimental matrix were flow rate, leachant composition, and temperature. All leachate samples were analyzed for ²³⁷Np and ²³⁹Pu as well as a number of nonradioactive elements. Results indicated that flow rate and leachant systematically affected the leach rate, but only slightly. Temperature effects were significant. The plutonium leach rate was lower at the higher temperature, suggesting that Pu sorption onto the beads was enhanced at the higher temperature. The range of leach rates for all analyzed elements (except Pu), at both temperatures, at all three flow rates, and with all three leachant compositions varied over only three orders of magnitude. The range of variables used in this experiment covered those expected in many proposed repository environments. The preliminary interpretation of the results also indicated that matrix dissolution may be the dominant leaching mechanism, at least for Np in bicarbonate leachant. Regardless of the leaching mechanism, the importance of this study is that it bounds the effects of repository environments when the ground water is oxidizing and when it does not reach the waste form until the waste has cooled to ambient rock temperature.

One of the recent challenges for the hydrologic research community is the need for the development of coupled systems that involve the integration of hydrologic, atmospheric, and socio-economic relationships. This poses a requirement for novel modelling frameworks that can accurately represent complex systems, given the limited understanding of underlying processes, increasing volumes of data, and high levels of uncertainty. Each existing hydrological model varies in conceptualization and process representation and is best suited to capture the environmental dynamics of a particular hydrological system. Data-driven approaches can be used to integrate alternative process hypotheses in order to achieve a unified theory at catchment scale. The key steps in implementing an integrated modelling framework informed by prior understanding and data include the choice of the technique for inducing knowledge from data, identification of alternative structural hypotheses, definition of rules and constraints for meaningful, intelligent combination of model component hypotheses, and definition of evaluation metrics. This study aims at defining a Genetic Programming based modelling framework that tests different conceptual model constructs against a wide range of objective functions and evolves accurate and parsimonious models that capture dominant hydrological processes at catchment scale. In this paper, GP initializes the evolutionary process using the modelling decisions inspired from the Superflex framework [Fenicia et al., 2011] and automatically combines them into model structures that are scrutinized against observed data using statistical, hydrological, and flow duration curve based performance metrics. The collaboration between data-driven and physical, conceptual modelling paradigms improves the ability to model and manage hydrologic systems. Fenicia, F., D. Kavetski, and H. H. Savenije (2011), Elements of a flexible approach
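
A minimal illustration of data-driven model-structure selection, well short of a real GP framework: candidate structures (one vs. two linear reservoirs) are calibrated against observations and judged on both fit and parsimony. A GP would evolve such structures rather than enumerate them, and the forcing and "observed" flow below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
rain = rng.gamma(0.3, 4.0, size=200)                 # synthetic forcing

def simulate(ks, rain):
    """Cascade of linear reservoirs with outflow coefficients ks."""
    stores = np.zeros(len(ks))
    flow = np.empty(len(rain))
    for t, p in enumerate(rain):
        inflow = p
        for i, k in enumerate(ks):
            stores[i] += inflow
            out = k * stores[i]
            stores[i] -= out
            inflow = out
        flow[t] = inflow
    return flow

observed = simulate([0.5, 0.2], rain) + 0.05 * rng.standard_normal(200)

candidates = {
    "one reservoir":  [(k,) for k in np.linspace(0.05, 0.95, 19)],
    "two reservoirs": [(a, b) for a in np.linspace(0.1, 0.9, 9)
                              for b in np.linspace(0.1, 0.9, 9)],
}

for name, param_grid in candidates.items():
    rmse, best = min((np.sqrt(np.mean((simulate(ks, rain) - observed) ** 2)), ks)
                     for ks in param_grid)
    print(f"{name}: best RMSE {rmse:.3f} with k = {best}, "
          f"{len(best)} parameter(s)")
```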

Wet deposition of airborne chemical pollutants occurs primarily through precipitation. Precipitation rate, amount, duration, and location are important meteorological factors to consider when attempting to understand the relationship of precipitation to pollutant deposition. The Pacific Northwest Laboratory (PNL) has conducted studies and experiments in numerous locations to collect data that can be incorporated into theories and models that attempt to describe the complex relationship between precipitation occurrence and chemical wet deposition. Model development often requires the use of average rather than random conditions as input. To provide mean values of storm parameters, the task, Climatological Analysis of Mesoscale Storms, was created as a facet of the Environmental Protection Agency's related-service project, Precipitation Scavenging Module Development. Within this task, computer programs have been developed at PNL which use hourly precipitation data from National Weather Service stations to calculate mean values and frequency distributions of precipitation periods and of the interspersed dry periods. These programs have been written with a degree of flexibility that allows user modification for application to different, but similar, analyses. This report describes in detail the rationale and operation of the two computer programs which produce the tables of average and frequency distributions of storm and dry-period parameters from the precipitation data. A listing of the programs and examples of the generated output are included in the appendices. 3 references, 3 figures, 6 tables.
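
A sketch of the kind of computation those programs perform: from an hourly precipitation record, find wet and dry spells (runs of consecutive wet or dry hours) and report their mean durations and frequency distributions. The 24-hour series is made up for illustration.

```python
import itertools
import statistics

hourly_precip = [0, 0, 0, 1.2, 0.4, 0, 0, 0, 0, 2.1, 0.8, 0.3,
                 0, 0, 0, 0, 0, 0.1, 0, 0, 0, 0, 0, 0]   # mm/h

# Group consecutive hours by wet/dry and record each run's length.
spells = {"wet": [], "dry": []}
for is_wet, run in itertools.groupby(hourly_precip, key=lambda p: p > 0):
    spells["wet" if is_wet else "dry"].append(len(list(run)))

for kind, lengths in spells.items():
    dist = {d: lengths.count(d) for d in sorted(set(lengths))}
    print(f"{kind}: mean {statistics.mean(lengths):.1f} h, "
          f"duration -> count {dist}")
```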

HII regions are areas of singly ionized hydrogen formed by the ionizing radiation of upper main sequence stars. The infrared fine-structure line emissions, particularly those of oxygen, nitrogen, and neon, can give important information about HII regions, including gas temperature and density, elemental abundances, and the effective temperature of the stars that form them. The processes involved in calculating this information from observational data are complex. Models, such as those provided in Rubin 1984 and those produced by Cloudy (Ferland et al., 2013), enable one to extract physical parameters from observational data. However, the multitude of search parameters can make sifting through models tedious. I digitized Rubin's models and wrote a Python program that takes observed line ratios and their uncertainties and finds the Rubin or Cloudy model that best matches the observational data. By creating a Python script that is user friendly and able to quickly sort through models with a high level of accuracy, this work increases efficiency and reduces human error in matching HII region models to observational data.
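
A sketch of the matching step such a script performs: score every model in a grid by chi-squared against the observed ratios and their uncertainties, and return the best match. The grid entries and ratio names below are invented placeholders, not Rubin's or Cloudy's actual outputs.

```python
import numpy as np

observed = np.array([0.85, 2.10])          # hypothetical observed line ratios
sigma = np.array([0.10, 0.25])             # their observational uncertainties

model_grid = {                             # invented model predictions
    "Teff=35kK, n=100":  np.array([0.60, 1.60]),
    "Teff=40kK, n=300":  np.array([0.90, 2.00]),
    "Teff=45kK, n=1000": np.array([1.40, 2.60]),
}

def chi2(model):
    # Standard chi-squared: squared deviations weighted by uncertainty.
    return float(np.sum(((observed - model) / sigma) ** 2))

best = min(model_grid, key=lambda name: chi2(model_grid[name]))
for name, ratios in model_grid.items():
    print(f"{name}: chi2 = {chi2(ratios):.2f}")
print("best match:", best)
```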

Demonstrates how a Wellness model can be an effective vehicle for promoting developmental programs in residence halls. The Wellness model is examined in terms of marketing, student development theory, and balanced programming. (BL)

The purpose of this study was to formulate a linear programming model to simulate a foundation-type support program and to apply this model to a state support program for the public elementary and secondary school districts in the State of Iowa. The model was successful in producing optimal solutions to five objective functions proposed for…
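
A hedged sketch of what a foundation-type support LP can look like: each district levies a uniform rate r on its per-pupil valuation, and state aid must top every district up to a foundation level F while minimizing total state cost. All figures are invented, and the Iowa study's actual objective functions and constraints differ.

```python
from scipy.optimize import linprog

F = 6000.0                                # foundation guarantee per pupil
r = 0.01                                  # required local levy rate
valuation = [450_000, 800_000, 250_000]   # taxable valuation per pupil

# Decision variables: aid per pupil a_i >= 0.
# minimize sum(a_i)  subject to  r*v_i + a_i >= F  for each district,
# rewritten as -a_i <= r*v_i - F for linprog's A_ub x <= b_ub form.
c = [1.0, 1.0, 1.0]
A_ub = [[-1.0, 0.0, 0.0], [0.0, -1.0, 0.0], [0.0, 0.0, -1.0]]
b_ub = [r * v - F for v in valuation]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print("aid per pupil by district:", [round(float(a)) for a in res.x])
# -> [1500, 0, 3500]: wealthier districts draw less state aid.
```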

This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...

Radioactive waste exists at the US Department of Energy's (DOE's) Hanford Site in a variety of locations, including subsurface grout and tank farms, solid waste burial grounds, and contaminated soil sites. Some of these waste sites may need to be isolated from percolating water to minimize the potential for transport of the waste to the ground water, which eventually discharges to the Columbia River. Multilayer protective barriers have been proposed as a means of limiting the flow of water through the waste sites (DOE 1987). A multiyear research program (managed jointly by Pacific Northwest Laboratory (PNL) and Westinghouse Hanford Company for the DOE) is aimed at assessing the performance of these barriers. One aspect of this program involves the use of computer models to predict barrier performance. Three modeling studies have already been conducted and a test plan was produced. The simulation work reported here was conducted by PNL and extends the previous modeling work. The purposes of this report are to understand phenomena that have been observed in the field and to provide information that can be used to improve hydrologic modeling of the protective barrier. An improved modeling capability results in better estimates of barrier performance. Better estimates can be used to improve the design of barriers and the assessment of their long-term performance.

The staff of the Nuclear Regulatory Commission is performing nuclear power plant design certification reviews based on a design process plan that describes the human factors engineering (HFE) program elements that are necessary and sufficient to develop an acceptable detailed design specification and an acceptable implemented design. There are two principal reasons for this approach. First, the initial design certification applications submitted for staff review did not include detailed design information. Second, since human performance literature and industry experience have shown that many significant human factors issues arise early in the design process, review of the design process activities and results is important to the evaluation of an overall design. However, current regulations and guidance documents do not address the criteria for design process review. Therefore, the HFE Program Review Model (HFE PRM) was developed as a basis for performing design certification reviews that include design process evaluations as well as review of the final design. A central tenet of the HFE PRM is that the HFE aspects of the plant should be developed, designed, and evaluated on the basis of a structured top-down system analysis using accepted HFE principles. The HFE PRM consists of ten component elements. Each element is divided into four sections: Background, Objective, Applicant Submittals, and Review Criteria. This report describes the development of the HFE PRM and gives a detailed description of each HFE review element.

This paper describes and evaluates the use of a molecular modeling computer program (Alchemy II) in a pharmaceutical education program. Provided are the hardware requirements and basic program features as well as several examples of how this program and its features have been applied in the classroom. (GLR)

This contribution describes the program for one part of automatic Text-to-Speech (TTS) synthesis. Some experiments (for example [14]) documented the considerable improvement of the naturalness of synthetic speech, but this approach requires completing the input feature values by hand. This completion takes a lot of time for big files. We need to improve the prosody by other approaches which use only automatically classified features (input parameters). The artificial neural network (ANN) approach is used for the modeling of prosody parameters. The program package contains all modules necessary for the text and speech signal pre-processing, neural network training, sensitivity analysis, result processing, and a module for the creation of the input data protocol for the Czech speech synthesizer ARTIC [1].

This article examines four influential programs (Citizen Schools, After School Matters, career academies, and Job Corps) to demonstrate the diversity of approaches to career programming for youth. It compares the specific program models and draws from the evaluation literature to discuss strengths and weaknesses of each. The article highlights three key lessons derived from the models that have implications for career development initiatives more generally: (1) career programming can and should be designed for youth across a broad age range, (2) career programming does not have to come at the expense of academic training or preparation for college, and (3) program effectiveness depends on intentional design and high-quality implementation.

In short, dynamic programming is a method for optimizing systems using their mathematical representation in phases, sequences, or, as we say, periods. Such systems are common in economic studies involving the most advanced techniques, for example those of cosmic navigation. Another concept involved in the study of dynamic programs is the economic horizon (the number of periods or phases that a dynamic program requires). This concept often leads to examining the convergence of certain variables over an infinite horizon. In many cases from the real economy, by introducing updating (discounting), dynamic programs can be made convergent.
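
A minimal sketch of dynamic programming over periods: a production plan over T periods with storage, solved by backward induction on the value function V_t(stock). The demands, costs, and capacities are invented for illustration.

```python
from functools import lru_cache

T = 4
demand = [2, 3, 2, 4]          # units required each period
max_produce, max_stock = 4, 3
produce_cost, hold_cost = 5.0, 1.0

@lru_cache(maxsize=None)
def V(t, stock):
    """Minimum cost to satisfy demand in periods t..T-1 given `stock`."""
    if t == T:
        return 0.0
    best = float("inf")
    for produce in range(max_produce + 1):
        next_stock = stock + produce - demand[t]
        if 0 <= next_stock <= max_stock:     # feasibility of this choice
            cost = (produce_cost * produce + hold_cost * next_stock
                    + V(t + 1, next_stock))  # Bellman recursion
            best = min(best, cost)
    return best

print("optimal total cost:", V(0, 0))
```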

The projected Landweber (PL) algorithm has good spectrum-extrapolation ability and can be applied to blurred images with low signal-to-noise ratio, making it a suitable super-resolution algorithm for passive millimeter wave (PMMW) imaging. Its drawbacks are slow convergence and unstable computational cost, which make real-time requirements hard to meet. To address the real-time problem, a projected Newton-Landweber (PNL) super-resolution algorithm is proposed on the basis of the PL algorithm: a fast Newton inversion is first used to obtain a coarse restored image, and the PL algorithm is then applied for fine restoration. Experimental results demonstrate that the PNL algorithm significantly improves convergence speed, with restoration quality approaching that of the PL algorithm, while its computation is stable and far less than that of the PL algorithm.
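
A sketch of the projected Landweber iteration at the heart of PL/PNL restoration: x_{k+1} = P(x_k + β·Aᵀ(y − A·x_k)), where A is the blur operator and P projects onto a constraint set. The 1-D moving-average blur and nonnegativity projection below are my stand-ins, not the paper's PMMW imaging model.

```python
import numpy as np

n = 64
signal = np.zeros(n)
signal[20:24] = 1.0
signal[40] = 2.0                                # sparse "scene"

# Blur operator A: 5-tap moving average (sum of shifted identities).
A = sum(np.eye(n, k=k) for k in range(-2, 3)) / 5.0
y = A @ signal + 0.01 * np.random.default_rng(2).standard_normal(n)

beta = 1.0 / np.linalg.norm(A, 2) ** 2          # step below 2/sigma_max^2
x = np.zeros(n)
for _ in range(500):
    x = x + beta * A.T @ (y - A @ x)            # Landweber update
    x = np.maximum(x, 0.0)                      # projection P (x >= 0)

print("data residual:", float(np.linalg.norm(y - A @ x)))
```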

The objectives of this project were to develop and evaluate promising low-cost dielectric and polymer-protected thin-film reflective metal coatings to be applied to preformed continuously-curved solar reflector panels to enhance their solar reflectance, and to demonstrate protected solar reflective coatings on preformed solar concentrator panels. The opportunity for this project arose from a search by United Solar Technologies (UST) for organizations and facilities capable of applying reflective coatings to large preformed panels. PNL was identified as being uniquely qualified to participate in this collaborative project.

A guide was prepared to allow a user to run the PNL long-range transport model, REGIONAL 1. REGIONAL 1 is a computer model set up to run atmospheric assessments on a regional basis. The model can be run in three modes for a single time period: (1) no deposition, (2) dry deposition, and (3) wet and dry deposition. The guide provides the physical and mathematical basis used in the model for calculating transport, diffusion, and deposition in all three modes. The guide also includes a program listing with an explanation of the listings and an example in the form of a short-term assessment for 48 hours. The purpose of the example is to allow a person with past experience in programming and meteorology to operate the assessment model and compare their results with the guide results. This comparison will assure the user that the program is operating properly.

Energy efficiency is an important goal of modern computing, with direct impact on system operational cost, reliability, usability, and environmental sustainability. This dissertation describes the design and implementation of two innovative programming languages for constructing energy-aware systems. First, it introduces ET, a strongly typed programming language to promote and facilitate energy-aware programming, with a novel type system design called Energy Types. Energy Types is built upon a key insight into today's energy-efficient systems and applications: despite the popular perception that energy and power can only be described in joules and watts, real-world energy management is often based on discrete phases and modes, which in turn can be reasoned about very effectively by type systems. A phase characterizes a distinct pattern of program workload, and a mode represents an energy state the program is expected to execute in. Energy Types is designed to reason about energy phases and energy modes, bringing programmers into the optimization of energy management. Second, the dissertation develops Eco, an energy-aware programming language centering around sustainability. A sustainable program built with Eco adaptively adjusts its own behavior to stay on a given energy budget, avoiding both a deficit that would lead to battery drain or CPU overheating, and a surplus that could have been used to improve the quality of the program's output. Sustainability is viewed as a form of supply-and-demand matching, and a sustainable program consistently maintains the equilibrium between supply and demand. ET is implemented as a prototype compiler for smartphone programming on Android, and Eco is implemented as a minimal extension to Java. Programming practices and benchmarking experiments in these two new languages showed that ET can lead to significant energy savings for Android apps and that Eco can efficiently promote battery awareness and temperature awareness in real
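
A hedged sketch of the supply-and-demand matching idea behind Eco, not Eco's actual API: a program periodically compares energy spent against a budgeted rate and adapts a quality knob to stay on budget. The meter readings are simulated and all names are illustrative.

```python
budget_joules, horizon_steps = 100.0, 50
quality = 5                       # knob: 1 (cheap) .. 10 (expensive)

def energy_spent(q):              # simulated per-step cost of quality q
    return 0.35 * q

spent = 0.0
for step in range(1, horizon_steps + 1):
    spent += energy_spent(quality)
    expected = budget_joules * step / horizon_steps   # budgeted pace
    if spent > expected and quality > 1:              # deficit: degrade
        quality -= 1
    elif spent < 0.9 * expected and quality < 10:     # surplus: improve
        quality += 1

print(f"spent {spent:.1f} J of {budget_joules} J, final quality {quality}")
```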

Activities in five major areas were undertaken during the WRIP: experiments using PNL-developed bend-over-sheave fatigue test machines to generate data on which to base a model for predicting large-diameter rope performance from that of small-diameter ropes; bend-over-sheave fatigue testing to determine differences in rope failure rates at varying rope loads; analyses to determine how wire ropes actually fail; development of a load sensor to record and quantify operational loads on drag and hoist ropes; and technology transfer activities to disseminate useful program findings to coal mine operators. Data obtained during the 6-year program are included. High loads on wire ropes are damaging; as an adjunct, however, potentially useful countermeasures to high loads were identified. Large-diameter rope bend-over-sheave performance can be predicted from small-diameter rope test behavior over some ranges.

This report was prepared for the Office of Buildings and Community Systems, US Department of Energy (DOE). The principal objective of the report is to present information on existing Home Energy Rating Systems (HERS) and their features. Much of the information in this report updates a 1982 report (PNL-4359), also prepared by the Pacific Northwest Laboratory (PNL) for DOE. Secondary objectives of the report are to qualitatively examine the benefits and costs of HERS programs, review survey results on the attitudes of various user groups toward the programs, and discuss selected design and implementation issues.

California State Dept. of Education, Sacramento. Bureau of Industrial Education.

Intended to provide assistance for developing new programs and improving existing ones, the guide was constructed by dental assisting instructors and other professional participants in a 1965 workshop conference. Elements of the model program were derived from a statistical analysis of California junior college programs in dental assisting and…

Nonprofit agencies are a critical component of the health and human services system in the US. Programs that offer energy efficiency services to nonprofits have clearly demonstrated that, with minimal investment, nonprofits can reduce their energy consumption by ten to thirty percent. This energy conservation potential motivated the Department of Energy and Oak Ridge National Laboratory to conceive a project to help states develop energy efficiency programs for nonprofits. The purpose of the project was two-fold: (1) to analyze existing programs to determine which design and delivery mechanisms are particularly effective, and (2) to create model programs for states to follow in tailoring their own plans for helping nonprofits with energy efficiency programs. Twelve existing programs were reviewed, and three model programs were devised and put into operation. The model programs provide various forms of financial assistance to nonprofits and serve as a source of information on energy efficiency as well. After examining the results from the model programs (which are still ongoing) and from the existing programs, several "replicability factors" were developed for use in the implementation of programs by other states. These factors, some concrete and practical, others more generalized, serve as guidelines for states devising programs based on their own particular needs and resources.

The traditional methods of assessing the academic programs in the liberal arts are inappropriate for evaluating vocational and technical programs. In traditional academic disciplines, assessment of instruction is conducted in two fashions: student evaluation at the end of a course and institutional assessment of its goals and mission. Because of…

This study examines detailed usage of online training videos that were designed to address specific course problems that were encountered in an online computer programming course. The study presents the specifics of a programming course where training videos were used to provide students with a quick start path to learning a new programming…

The Division of International Affairs of the Energy Research and Development Administration is assessing the long-range economic effects of energy research and development programs in the U.S. and other countries, particularly members of the International Energy Agency (IEA). In support of this effort, a program was designed to coordinate the capabilities of five research groups: Rand, Virginia Polytechnic Institute, Brookhaven National Laboratory, Lawrence Livermore Laboratory, and Pacific Northwest Laboratory. The program could evaluate the international economics of proposed or anticipated sources of energy. It is designed to be general, flexible, and capable of evaluating a diverse collection of potential energy-related (nuclear and nonnuclear) problems. For example, the newly developed methodology could evaluate the international and domestic economic impact not only of nuclear-related energy sources but also of existing nonnuclear and potential energy sources such as solar, geothermal, and wind. Major items to be included would be the cost of exploration, cost of production, prices, profit, market penetration, investment requirements and investment goods, economic growth, changes in the balance of payments, etc. In addition, the changes in the cost of producing all goods and services would be identified for each new energy source. PNL developed (1) a means of estimating the demands for major forms of energy by country, and (2) a means of identifying results or impacts on each country. The results for each country were then to be compared to assess relative advantages. PNL relied on its existing general econometric model, EXPLOR, to forecast the demand for energy by country. (MCW)

The issue of global warming and related climatic changes from increasing concentrations of greenhouse gases in the atmosphere has received prominent attention during the past few years. The Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP) Climate Modeling Program is designed to contribute directly to this rapid improvement. The goal of the CHAMMP Climate Modeling Program is to develop, verify, and apply a new generation of climate models within a coordinated framework that incorporates the best available scientific and numerical approaches to represent physical, biogeochemical, and ecological processes; that fully utilizes the hardware and software capabilities of new computer architectures; that probes the limits of climate predictability; and finally that can be used to address the challenging problem of understanding the greenhouse climate issue through the ability of the models to simulate time-dependent climatic changes over extended times and with regional resolution.

We present Mantis, a new framework that automatically predicts program performance with high accuracy. Mantis integrates techniques from programming languages and machine learning for performance modeling, and is a radical departure from traditional approaches. Mantis extracts program features, which are information about program execution runs, through program instrumentation. It uses machine learning techniques to select the features relevant to performance and creates prediction models as a function of the selected features. Through program analysis, it then generates compact code slices that compute these feature values for prediction. Our evaluation shows that Mantis can achieve more than 93% accuracy with a training set of less than 10% of the data, a significant improvement over models that are oblivious to program features. The system generates code slices that make feature values cheap to compute.
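
A sketch of the feature-based modeling idea: treat measurements of program features (loop trip counts, input sizes, branch counts) as predictors, fit a model of running time, and keep only the features that matter. The data are synthetic, and the thresholding below is a crude stand-in for Mantis's actual feature-selection step.

```python
import numpy as np

rng = np.random.default_rng(3)
runs = 200
features = rng.uniform(1, 100, size=(runs, 4))   # 4 candidate features
# True cost depends on features 0 and 2 only, plus measurement noise.
time = 3.0 * features[:, 0] + 0.5 * features[:, 2] + rng.normal(0, 2, runs)

# Least-squares fit with an intercept, then drop negligible coefficients.
X = np.column_stack([features, np.ones(runs)])
coef, *_ = np.linalg.lstsq(X, time, rcond=None)
selected = [i for i in range(4) if abs(coef[i]) > 0.1]
print("coefficients:", np.round(coef, 2))
print("selected features:", selected)            # expect [0, 2]
```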

To study the relationship between the evolution of Chinese Traditional Culture (CTC) and program organization, an outline of the CTC is generalized from a review of the literature and divided into two aspects from the viewpoint of economic philosophy: the traditional philosophy aspect and value judgment. Based on three dimensions, namely the philosophy aspect (P), the program organization model (P), and value judgment from the economic-philosophy viewpoint (V), and on this evolutionary sequence, the CTC's influence on the program organization model over the course of this evolution is discussed; a cultural spatial evolution model for program organization based on these three dimensions (PPV) is then constructed. From an analysis of the P-P plane matrix and an empirical investigation of the organizational models of construction enterprises, it is found that the ancient Chinese government organizational model still has a prevailing influence on the modern program organizational model in China.

A model of MPI synchronization-communication programs is presented, and three basic simplified models of it are defined. A series of theorems and methods for deciding whether deadlocks will occur in the three models is given and proved rigorously. These theories and methods for deadlock detection in the simple models are the necessary basis for deadlock detection in real MPI programs. The methods are based on static analysis of programs, with runtime detection in necessary cases; for two of the three basic models, they can determine before compilation whether a program will deadlock. For the third model, some deadlock cases can be found before compilation and others at runtime. Our theorems can be used to prove the correctness of currently popular MPI program deadlock detection algorithms. Our methods may reduce the amount of code those algorithms need to add to the MPI source or profiling interface and may detect deadlocks ahead of program execution, so the overheads can be reduced greatly.
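
The classic head-to-head pattern such deadlock theorems address, sketched with mpi4py under the assumption of exactly two ranks (run with `mpiexec -n 2`): with blocking synchronous sends, both ranks wait for a matching receive that can never start, and reordering one rank's calls removes the cycle.

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
peer = 1 - rank                    # assumes a 2-rank world
data = {"from": rank}

if False:                          # the deadlocking version (do not enable)
    comm.ssend(data, dest=peer)    # both ranks block here forever
    msg = comm.recv(source=peer)

# A safe ordering: the even rank sends first, the odd rank receives first,
# breaking the circular wait that static analysis would flag.
if rank % 2 == 0:
    comm.ssend(data, dest=peer)
    msg = comm.recv(source=peer)
else:
    msg = comm.recv(source=peer)
    comm.ssend(data, dest=peer)
print(f"rank {rank} got {msg}")
```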

This study explored the possibility of modeling the effects of a study abroad program on students from a university in the northeastern United States. A program effect model was proposed after conducting an extensive literature review and empirically examining a sample of 265 participants in 2005. Exploratory factor analysis (EFA), confirmatory factor analysis (CFA),...

The maximum clique and maximum independent set problems are classical problems in graph theory. Combining Boolean algebra and integer programming, two integer programming models for the maximum clique problem, which improve on earlier formulations, are designed in this paper. The programming model for the maximum independent set then follows as a corollary of the main results. These two models are easily implemented in software and are suitable for graphs of any scale. Finally, the models are presented as Lingo algorithms, verified and compared on several examples.
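
For concreteness, the standard edge formulation of maximum clique as an integer program (not necessarily the paper's improved models): maximize Σ x_i subject to x_i + x_j ≤ 1 for every non-edge {i, j}, with x_i binary. The brute-force check below verifies the formulation on a toy graph; a solver such as LINGO handles large instances.

```python
from itertools import combinations

V = range(5)
E = {(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)}   # triangle 0-1-2 plus a tail
non_edges = [(i, j) for i, j in combinations(V, 2)
             if (i, j) not in E and (j, i) not in E]

best = ()
for size in range(len(V), 0, -1):              # try largest sets first
    for subset in combinations(V, size):
        chosen = set(subset)
        # Feasibility: no non-edge may have both endpoints chosen,
        # which is exactly the constraint x_i + x_j <= 1.
        if all(not (i in chosen and j in chosen) for i, j in non_edges):
            best = subset
            break
    if best:
        break
print("maximum clique:", best)                 # -> (0, 1, 2)
```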

Conventional statistical analysis includes the capacity to systematically assign individuals to groups. We suggest alternative assignment procedures, utilizing a set of interrelated goal programming formulations. This paper represents an effort to suggest ways by which the discriminant problem might reasonably be addressed via straightforward linear goal programming formulations. Simple and direct, such formulations may ultimately compete with conventional approaches, free of the classical assumptions and possessing a stronger intuitive appeal. We further demonstrate via a simple illustration the potential of these procedures to play a significant part in addressing the discriminant problem, and indicate fundamental ideas that lay the foundation for other, more sophisticated approaches.

In 2006, Honeywell Federal Manufacturing & Technologies (FM&T) announced an updated vision statement for the organization. The vision is "To be the most admired team within the NNSA [National Nuclear Security Administration] for our relentless drive to convert ideas into the highest quality products and services for National Security by applying the right technology, outstanding program management and best commercial practices." The challenge to provide outstanding program management was taken up by the Program Management division and the Program Integration Office (PIO) of the company. This article describes how Honeywell developed and deployed a program management maturity model to drive toward excellence.

Little attention has been paid to the organizational and administrative characteristics of effective community support programs for the chronic mentally ill. The authors analyzed three successful support programs in Wisconsin that employ three different models of service delivery: one provides services through caseworkers who carry specialized caseloads, another through local nonprofessionals who work with a centrally located program coordinator, and the third through a team of various mental health workers. Each program has tailored its organizational process to suit the types of clients it sees, the size of its catchment area, and the availability of other professional resources. The interrelated strengths and weaknesses of each model are discussed.

With a growing number of leadership programs in universities and colleges in North America, leadership educators and researchers are engaged in a wide ranging dialogue to propose clear processes, content, and designs for providing academic leadership education. This research analyzes the curriculum design of 52 institutions offering a "Minor…

Answer Set Programming (ASP) is a logic programming paradigm that has proven a useful tool in various application areas due to its expressive modelling language. These application areas include Bounded Model Checking (BMC). BMC is a verification technique recognized for its strong ability to find errors in computer systems. To apply BMC, a system needs to be modelled in a formal specification language, such as the widely used formalism of Abstract State Machines (ASMs). In ...

Teacher education programs in the United States face increasing pressure to demonstrate their effectiveness through pupils' learning gains in classrooms where program graduates teach. The link between teacher candidates' learning in teacher education programs and pupils' learning in K-12 classrooms implicit in the policy discourse suggests a one-to-one correspondence. However, the logical steps leading from what teacher candidates have learned in their programs to what they are doing in classrooms that may contribute to their pupils' learning are anything but straightforward. In this paper, we argue that the logic model approach from scholarship on evaluation can enhance research on teacher education by making explicit the logical links between program processes and intended outcomes. We demonstrate the usefulness of the logic model approach through our own work on designing a longitudinal study that focuses on examining the process and impact of an undergraduate mathematics and science teacher education program.

Systemic infrastructure is key to public health achievements. Individual public health program infrastructure feeds into this larger system. Although program infrastructure is rarely defined, it needs to be operationalized for effective implementation and evaluation. The Ecological Model of Infrastructure (EMI) is one approach to defining program infrastructure. The EMI consists of 5 core (Leadership, Partnerships, State Plans, Engaged Data, and Managed Resources) and 2 supporting (Strategic Understanding and Tactical Action) elements that are enveloped in a program's context. We conducted a literature search across public health programs to determine support for the EMI. Four of the core elements were consistently addressed, and the other EMI elements were intermittently addressed. The EMI provides an initial and partial model for understanding program infrastructure, but additional work is needed to identify evidence-based indicators of infrastructure elements that can be used to measure success and link infrastructure to public health outcomes, capacity, and sustainability.

This SpringerBrief explores the internal workings of service systems. The authors propose a lightweight semantic model for effectively representing and capturing the essence of service systems. Key topics include modeling frameworks, service descriptions and linked data, creating service instances, tool support, and applications in enterprises. Previous books on service system modeling and various streams of scientific developments used an external perspective to describe how systems can be integrated. This brief introduces the concept of white-box service system modeling as an approach to mo...

In this paper, we present a method for automatic verification of real-time control programs running on LEGO(R) RCX(TM) bricks using the verification tool UPPAAL. The control programs, consisting of a number of tasks running concurrently, are automatically translated into the mixed automata model...

This book provides a fuzzy programming approach to solving real-life decision problems in a fuzzy environment. Within the framework of credibility theory, it provides a self-contained, comprehensive and up-to-date presentation of fuzzy programming models, algorithms and applications in portfolio analysis.

Background: Following the 2000 Sydney Olympics, the NSW Premier, Mr Bob Carr, launched a school-based initiative in NSW government primary schools called the "Gold Medal Fitness Program" to encourage children to be fitter and more active. The Program was introduced into schools through a model of professional development, "Quality…

PLANS, a software package for integrated timber-harvest planning, uses digital terrain models to provide the topographic data needed to fit harvest and transportation designs to specific terrain. MAP, an integral program in the PLANS package, is used to construct the digital terrain models required by PLANS. MAP establishes digital terrain models using digitizer-traced...

This paper presents the approach used by the Technical Assistance Center (TAC) of the University of Minnesota's Refugee Assistance Program in Mental Health for identifying successful and culturally sensitive mental health service delivery models. It divides these into four categories: the psychiatric model; the community mental health model; the…

This paper proposes a Genetic Programming-Based Modeling (GPM) algorithm on chaotic time series. GP is used here to search for appropriate model structures in function space, and the Particle Swarm Optimization (PSO) algorithm is used for Nonlinear Parameter Estimation (NPE) of dynamic model structures. In addition, GPM integrates the results of Nonlinear Time Series Analysis (NTSA) to adjust the parameters and takes them as the criteria of established models. Experiments showed the effectiveness of such improvements on chaotic time series modeling.
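No code accompanies the abstract; as a minimal sketch of the parameter-estimation half of such a pipeline, the following Python fragment uses a basic particle swarm to recover the growth parameter of a logistic map from a synthetic chaotic series. The model structure, swarm settings, and data are illustrative assumptions, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

def logistic_series(r, x0=0.3, n=200):
    """Generate a logistic-map time series x_{t+1} = r * x_t * (1 - x_t)."""
    x = np.empty(n)
    x[0] = x0
    for t in range(n - 1):
        x[t + 1] = r * x[t] * (1.0 - x[t])
    return x

observed = logistic_series(r=3.7)          # synthetic "measured" chaotic series

def cost(r):
    """Sum of squared one-step prediction errors for a candidate parameter r."""
    pred = r * observed[:-1] * (1.0 - observed[:-1])
    return np.sum((pred - observed[1:]) ** 2)

# Basic particle swarm over the scalar parameter r in [2.5, 4.0].
n_particles, n_iters = 20, 60
pos = rng.uniform(2.5, 4.0, n_particles)
vel = np.zeros(n_particles)
pbest = pos.copy()
pbest_cost = np.array([cost(r) for r in pos])
gbest = pbest[np.argmin(pbest_cost)]

for _ in range(n_iters):
    r1, r2 = rng.random(n_particles), rng.random(n_particles)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 2.5, 4.0)
    costs = np.array([cost(r) for r in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[np.argmin(pbest_cost)]

print(f"estimated r = {gbest:.4f}")        # should be close to 3.7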

For AC networks with large amounts of induction generators, as in the case of e.g. windmills, the paper demonstrates a significant discrepancy in the simulated voltage recovery after faults in weak networks when comparing results obtained with dynamic stability programs and transient programs, respectively, with and without a model of the mechanical shaft. The reasons for the discrepancies are explained, and it is shown that the phenomenon is due partly to the presence of DC offset currents in the induction machine stator, and partly to the mechanical shaft system of the wind turbine and the generator rotor. It is shown that it is possible to include a transient model in dynamic stability programs and thus obtain correct results also in dynamic stability programs. A mechanical model of the shaft system has also been included in the generator model.

Programmable logic controllers (PLCs) are embedded computers widely used in industrial control systems. Ensuring that PLC software complies with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software but is still underused in industry due to the complexity of building and managing formal models of real applications. In this paper, we propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g., CTL, LTL) on PLC programs. This methodology is based on an intermediate model (IM), meant to transform PLC programs written in various standard languages (ST, SFC, etc.) to different modeling languages of verification tools. We present the syntax and semantics of the IM and the transformation rules of the ST and SFC languages to the nuXmv model checker passing through the intermediate model. Finally, two real case studies of CERN PLC programs, written mainly in th...

The Hanford Site Surface Barrier Development Program was organized in 1985 to test the effectiveness of various barrier designs in minimizing the effects of water infiltration; plant, animal and human intrusion; and wind and water erosion on buried wastes, plus preventing or minimizing the emanation of noxious gases. A team of scientists from the Pacific Northwest Laboratory (PNL) and engineers from Westinghouse Hanford Company (WHC) direct the barrier development effort. ICF Kaiser Hanford Company, in conjunction with WHC and PNL, developed design drawings and construction specifications for a 5-acre prototype barrier. The highlight of efforts in FY 1994 was the construction of the prototype barrier. The prototype barrier was constructed on the Hanford Site at the 200 BP-1 Operable Unit of the 200 East Area. Construction was completed in August 1994, and monitoring instruments are being installed so experiments on the prototype barrier can begin in FY 1995. The purpose of the prototype barrier is to provide insights and experience with issues regarding barrier design, construction, and performance that have not been possible with individual tests and experiments conducted to date. Additional knowledge and experience were gained in FY 1994 on erosion control, physical stability, water infiltration control, model testing, Resource Conservation and Recovery Act (RCRA) comparisons, biointrusion control, long-term performance, and technology transfer.

Complete documentation of the marketing and distribution (M and D) computer model is provided. The purpose is to estimate the costs of selling and transporting photovoltaic solar energy products from the manufacturer to the final customer. The model adjusts for the inflation and regional differences in marketing and distribution costs. The model consists of three major components: the marketing submodel, the distribution submodel, and the financial submodel. The computer program is explained including the input requirements, output reports, subprograms and operating environment. The program specifications discuss maintaining the validity of the data and potential improvements. An example for a photovoltaic concentrator collector demonstrates the application of the model.

Considerable evidence indicates that variability in implementation of prevention programs is related to the outcomes achieved by these programs. However, while implementation has been conceptualized as a multidimensional construct, few studies examine more than a single dimension, and no theoretical framework exists to guide research on the effects of implementation. We seek to address this need by proposing a theoretical model of the relations between the dimensions of implementation and outcomes of prevention programs that can serve to guide future implementation research. In this article, we focus on four dimensions of implementation, which we conceptualize as behaviors of program facilitators (fidelity, quality of delivery, and adaptation) and behaviors of participants (responsiveness) and present the evidence supporting these as predictors of program outcomes. We then propose a theoretical model by which facilitator and participant dimensions of implementation influence participant outcomes. Finally, we provide recommendations and directions for future implementation research.

The 0-1 programming problem is an important problem in operations research with very widespread applications. In this paper, a new DNA computation model utilizing solution-based and surface-based methods is presented to solve the 0-1 programming problem. This model combines the major benefits of both solution-based and surface-based methods, including vast parallelism, extraordinary information density and ease of operation. The result, verified by biological experimentation, revealed the potential of DNA computation in solving complex programming problems.

A theoretical framework for the interpretation of satellite measurements of stratospheric temperature and trace gases is provided. This problem is quite complicated since the distributions of trace gases are dependent on dynamics and photochemistry. Therefore, the problem was attacked with models employing varying degrees of photochemical and dynamical complexity. Topics discussed include the relationship between dynamics and trace gas transport; wave transience, dissipation and critical levels and the net (permanent) transport of trace gases; the role of photochemistry in trace gas transport; photochemistry and dynamics in altering the mean-zonal distribution of stratospheric ozone; and approximations to simplify the interpretation of observations and General Circulation Models.

CoMD-Em is a suite of implementations of the CoMD [4] proxy app using different emerging programming models. It is intended to analyze the features and capabilities of novel programming models that could help ensure code and performance portability and scalability across heterogeneous platforms while improving programmer productivity. Another goal is to provide the authors and vendors with meaningful feedback regarding the capabilities and limitations of their models. The actual application is a classical molecular dynamics (MD) simulation using either the Lennard-Jones (LJ) method or the embedded atom method (EAM) for primary particle interaction. The code can be extended to support alternate interaction models. The code is expected to run on a wide class of heterogeneous hardware configurations, including shared/distributed/hybrid memory, GPUs, and any other platform supported by the underlying programming model.

Geometric information is important for automatic programming of an arc welding robot. Complete geometric models of robotic arc welding are established in this paper. In the geometric model of the weld seam, an equation with seam length as its parameter is introduced to represent any weld seam, and a method to determine discrete programming points on a weld seam is presented. In the geometric model of the weld workpiece, three classes of primitives and a CSG tree are used to describe the workpiece, and a detailed data structure is presented. For pose transformation of the torch, the world frame, torch frame and active frame are defined, and transformations between frames are presented. Based on these geometric models, an automatic programming software package for robotic arc welding, RAWCAD, has been developed. Experiments show that the geometric models are practical and reliable.
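As a toy illustration of representing a weld seam by an equation parameterized by seam length and then choosing discrete programming points, the sketch below parameterizes a hypothetical planar seam by arc length and samples points at equal spacing; the curve and the spacing value are assumptions, not taken from RAWCAD.

import numpy as np

def seam_curve(t):
    """Hypothetical planar weld seam given as a parametric curve (x(t), y(t))."""
    return np.column_stack([t, 0.2 * np.sin(2.0 * np.pi * t)])

# Densely sample the curve and build a cumulative arc-length table.
t = np.linspace(0.0, 1.0, 2001)
pts = seam_curve(t)
seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
s = np.concatenate([[0.0], np.cumsum(seg)])      # arc length at each sample

# Discrete programming points: positions at equal arc-length spacing.
spacing = 0.1                                    # assumed distance between weld points
targets = np.arange(0.0, s[-1], spacing)
t_at_s = np.interp(targets, s, t)                # invert s(t) by interpolation
program_points = seam_curve(t_at_s)

print(f"seam length = {s[-1]:.3f}, programming points = {len(program_points)}")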

A Body Sensor Network (BSN) must be designed to work autonomously. On the other hand, BSNs need mechanisms that allow changes in their behavior in order to become a clinically useful tool. The purpose of this paper is to present a new programming model that will be useful for programming BSN sensor nodes. This model is based on an intelligent intermediate-level compiler. The main purpose of the proposed compiler is to increase the efficiency of system use and the lifetime of the application, considering its requirements, hardware possibilities and specialist knowledge. With this model, it is possible to maintain the autonomous operation capability of the BSN and still offer tools that allow users with little grasp of programming techniques to program these systems.

Health care providers' opinions can influence how parents place their infant to sleep. Neonatal nurses can improve how they teach and model safe infant sleep practices to parents. To increase neonatal nurses' knowledge, a sudden infant death syndrome (SIDS) prevention program was implemented. Program components included a computerized teaching tool, a crib card, sleep sacks, and discharge instructions. Initial program evaluation showed that 98 percent of infants slept supine and 93 percent slept in sleep sacks in open cribs. However, nurses continued to swaddle some infants with blankets to improve thermoregulation. To increase nursing compliance in modeling safe infant sleep practices, Halo SleepSack Swaddles were provided for nurses to use in place of a blanket to regulate infant temperature. Recent data show that 100 percent of infants in open cribs are now sleeping supine wearing a Halo Swaddle or a traditional Halo SleepSack. This model program can easily be replicated to enhance neonatal nurses' knowledge about SIDS prevention.

A need for modeling abnormal behavior on a comprehensive, systematic basis exists. Computer modeling and simulation tools offer especially good opportunities to establish such a program of studies. Issues concern deciding which modeling tools to use, how to relate models to behavioral data, what level of modeling to employ, and how to articulate theory to facilitate such modeling. Four levels or types of modeling, two qualitative and two quantitative, are identified. Their properties are examined and interrelated to include illustrative applications to the study of abnormal behavior, with an emphasis on schizophrenia.

A study was carried out on tools of Neuro-Linguistic Programming (known in Spanish as PNL) for selection, employment and training that allow choosing appropriate personnel on the basis of language and behavior. For its development, theories of Neuro-Linguistic Programming and of the recruitment and selection process, grounded in the interview, were reviewed. The conclusions address the importance and convenience, for human resources management, of applying Neuro-Linguistic Programming as a personnel selection tool. Finally, it is recommended that the proposal be applied within a framework of adaptability according to the needs and demands of each organization.

This annual report briefly describes the technical progress within each segment of the WCPE from October 1979 through September 1980. It includes the progress accomplished directly by the Pacific Northwest Laboratory (PNL) and by subcontractors funded directly by DOE or through PNL. To expedite management of the activities required to produce the needed information, the WCPE has been divided into three program areas: Wind Energy Prospecting, Support for Design and Operations, and Site Evaluation. Accomplishments in each of these program areas highlight WCPE activities in FY 1980.

a methodology adopted from water distribution network that automatically sets up the independent loops and is easy to implement into a computer program. Finally an example of verification of the model is given which demonstrates the ability of the models to accurately predict the airflow of a simple multizone...

This paper presents a novel nonlinear binary programming model designed to improve the reliability indices of a distribution network. This model identifies the type and location of protection devices that should be installed in a distribution feeder and is a generalization of the classical optimizat...

Background/Objectives: Diet models based on goal programming (GP) are valuable tools in designing diets that comply with nutritional, palatability and cost constraints. Results derived from GP models are usually very sensitive to the type of achievement function that is chosen. This paper aims to pr...

Presented is an introduction to the operation and mechanics of the ACTP production system, a version of Anderson's (1976) ACT system. ACTP is already in use modeling geometry theorem proving and counting of a set of objects, and has been identified as a potentially useful programming framework for developing models of the cognitive processes used…

The present work is part of the ACC autonomous car project. This paper focuses on the control program architecture. To design this architecture we start from a human driver behavior model. Using this model we have constructed a three-level control program. Preliminary results are presented.

Two teaching models of a service delivery program designed to prevent speech-language problems in lower socioeconomic children were compared. Specific goals included increasing mothers' awareness of the sensory input to which infants are responsive and increasing mothers' abilities to read infant nonverbal signals. In Model 1, two speech-language…

In this article, we examine the use of a new binary integer programming (BIP) model to detect arbitrage opportunities in currency exchanges. This model showcases an excellent application of mathematics to the real world. The concepts involved are easily accessible to undergraduate students with basic knowledge in Operations Research. Through this…
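The BIP formulation described above typically maximizes the sum of log exchange rates over a closed cycle of binary edge variables; the brute-force sketch below checks the equivalent condition directly (the product of rates around a cycle exceeding 1) on a hypothetical rate matrix, so it illustrates the arbitrage condition the model detects rather than the BIP model itself.

from itertools import permutations

# Hypothetical exchange-rate matrix: rate[a][b] = units of b received per unit of a.
currencies = ["USD", "EUR", "GBP", "JPY"]
rate = {
    "USD": {"EUR": 0.92, "GBP": 0.79, "JPY": 151.0},
    "EUR": {"USD": 1.09, "GBP": 0.86, "JPY": 164.5},
    "GBP": {"USD": 1.27, "EUR": 1.17, "JPY": 191.0},
    "JPY": {"USD": 0.0066, "EUR": 0.0061, "GBP": 0.0052},
}

best_cycle, best_product = None, 1.0
for n in (2, 3, 4):                      # cycle lengths to examine
    for cycle in permutations(currencies, n):
        legs = list(cycle) + [cycle[0]]  # close the cycle
        product = 1.0
        for a, b in zip(legs, legs[1:]):
            product *= rate[a][b]
        if product > best_product:       # arbitrage iff the product exceeds 1
            best_cycle, best_product = legs, product

if best_cycle:
    print(" -> ".join(best_cycle), f"multiplier = {best_product:.4f}")
else:
    print("no arbitrage cycle found")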

A description is given of the equations and the computer program being developed to model the complete Otto cycle. The program incorporates such important features as: (1) heat transfer, (2) finite combustion rates, (3) complete chemical kinetics in the burned gas, (4) exhaust gas recirculation, and (5) manifold vacuum or supercharging. Changes in thermodynamic, kinetic and transport data as well as model parameters can be made without reprogramming. Preliminary calculations indicate that: (1) chemistry and heat transfer significantly affect composition and performance, (2) there seems to be a strong interaction among model parameters, and (3) a number of cycles must be calculated in order to obtain steady-state conditions.

The current shortfall in effectiveness within conservation biology is illustrated by increasing interest in "evidence-based conservation," whose proponents have identified the need to benchmark conservation initiatives against actions that lead to proven positive effects. The effectiveness of conservation policies, approaches, and evaluation is under increasing scrutiny, and in these areas models of excellence used in business could prove valuable. Typically, conservation programs require years of effort and involve rigorous long-term implementation processes. Successful balance of long-term efforts alongside the achievement of short-term goals is often compromised by management or budgetary constraints, a situation also common in commercial businesses. "Business excellence" is an approach many companies have used over the past 20 years to ensure continued success. Various business excellence evaluations have been promoted that include concepts that could be adapted and applied in conservation programs. We describe a conservation excellence model that shows how scientific processes and results can be aligned with financial and organizational measures of success. We applied the model to two well-documented species conservation programs. In the first, the Po'ouli program, several aspects of improvement were identified, such as more authority for decision making in the field and better integration of habitat management and population recovery processes. The second example, the black-footed ferret program, could have benefited from leadership effort to reduce bureaucracy and to encourage use of best-practice species recovery approaches. The conservation excellence model enables greater clarity in goal setting, more-effective identification of job roles within programs, better links between technical approaches and measures of biological success, and more-effective use of resources. The model could improve evaluation of a conservation program's effectiveness and may be

Mediation modeling can explain the nature of the relation among three or more variables. In addition, it can be used to show how a variable mediates the relation between levels of intervention and outcome. The Sobel test, developed in 1990, provides a statistical method for determining the influence of a mediator on an intervention or outcome. Although interactive Web-based and stand-alone methods exist for computing the Sobel test, SPSS and SAS programs that automatically run the required regression analyses and computations increase the accessibility of mediation modeling to nursing researchers. To illustrate the utility of the Sobel test and to make this programming available to the Nursing Research audience in both SAS and SPSS. The history, logic, and technical aspects of mediation testing are introduced. The syntax files sobel.sps and sobel.sas, created to automate the computation of the regression analysis and test statistic, are available from the corresponding author. The reported programming allows the user to complete mediation testing with the user's own data in a single-step fashion. A technical manual included with the programming provides instruction on program use and interpretation of the output. Mediation modeling is a useful tool for describing the relation between three or more variables. Programming and manuals for using this model are made available.
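The sobel.sps and sobel.sas programs themselves are not reproduced here; as a rough Python analogue on simulated data, the sketch below runs the two regressions and applies the standard Sobel formula z = ab / sqrt(b^2*s_a^2 + a^2*s_b^2). The data-generating values are hypothetical.

import numpy as np

rng = np.random.default_rng(1)

# Simulated data: intervention X -> mediator M -> outcome Y (all hypothetical).
n = 200
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)          # path a
y = 0.4 * m + rng.normal(size=n)          # path b

def slope_and_se(pred, resp):
    """OLS slope of resp on pred (with intercept) and its standard error."""
    X = np.column_stack([np.ones_like(pred), pred])
    beta, *_ = np.linalg.lstsq(X, resp, rcond=None)
    resid = resp - X @ beta
    sigma2 = resid @ resid / (len(resp) - 2)
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[1], np.sqrt(cov[1, 1])

a, sa = slope_and_se(x, m)                # effect of intervention on mediator

# Path b: effect of mediator on outcome, controlling for the intervention.
Xc = np.column_stack([np.ones(n), x, m])
beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
resid = y - Xc @ beta
sigma2 = resid @ resid / (n - 3)
b = beta[2]
sb = np.sqrt((sigma2 * np.linalg.inv(Xc.T @ Xc))[2, 2])

# Sobel z-statistic for the indirect effect a*b.
z = (a * b) / np.sqrt(b**2 * sa**2 + a**2 * sb**2)
print(f"a={a:.3f}, b={b:.3f}, Sobel z = {z:.2f}")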

The Constellation Program (CxP) is NASA's effort to replace the Space Shuttle, return humans to the moon, and prepare for a human mission to Mars. The major elements of the Constellation Lunar sortie design reference mission architecture are shown. Unlike the Apollo Program of the 1960's, affordability is a major concern of United States policy makers and NASA management. To measure Constellation affordability, a total ownership cost life-cycle parametric cost estimating capability is required. This capability is being developed by the Constellation Systems Engineering and Integration (SE&I) Directorate, and is called the Lifecycle Cost Analysis Model (LCAM). The requirements for LCAM are based on the need to have a parametric estimating capability in order to do top-level program analysis, evaluate design alternatives, and explore options for future systems. By estimating the total cost of ownership within the context of the planned Constellation budget, LCAM can provide Program and NASA management with the cost data necessary to identify the most affordable alternatives. LCAM is also a key component of the Integrated Program Model (IPM), an SE&I developed capability that combines parametric sizing tools with cost, schedule, and risk models to perform program analysis. LCAM is used in the generation of cost estimates for system level trades and analyses. It draws upon the legacy of previous architecture level cost models, such as the Exploration Systems Mission Directorate (ESMD) Architecture Cost Model (ARCOM) developed for Simulation Based Acquisition (SBA), and ATLAS. LCAM is used to support requirements and design trade studies by calculating changes in cost relative to a baseline option cost. Estimated costs are generally low fidelity to accommodate available input data and available cost estimating relationships (CERs). LCAM is capable of interfacing with the Integrated Program Model to provide the cost estimating capability for that suite of tools.

Because of the high development costs of IC (Integrated Circuit) test programs, recycling existing test programs from one kind of ATE (Automatic Test Equipment) to another, or generating them directly from CAD simulation modules, is more and more valuable. In this paper, a new approach to migrating test programs is presented. A virtual ATE model based on the object-oriented paradigm is developed; it runs Test C++ (an intermediate test control language) programs and TeIF (Test Intermediate Format, an intermediate pattern format), migrates test programs among three kinds of ATE (Ando DIC8032, Schlumberger S15 and GenRad 1732), and generates test patterns from two kinds of CAD (Daisy and Panda) automatically.

Since 1995, a model cancer screening program has been in operation in Hungary, the overall purpose of which is to promote the establishment of effective and efficient screening programs by adapting the internationally agreed principles of organized screening to the needs and opportunities in Hungary. The establishment and operation of a national population-based cancer registration system is another aim of the Program. The model program, financed partly from a loan from the World Bank and partly from local funds provided by the Government of Hungary, is to develop standard procedures for cervical, breast and colorectal screening and to produce tested recommendations for the introduction of organized screening of proven effectiveness, integrated into the health care system, on a country-wide service basis in Hungary.

One of the responsibilities of a power market regulator is setting rules for selecting and prioritizing demand response (DR) programs. There are many different DR program alternatives for improving load profile characteristics and achieving customer satisfaction. The regulator should find the optimal solution reflecting the perspectives of each DR stakeholder. Multi Attribute Decision Making (MADM) is a proper method for handling such optimization problems. In this paper, an extended responsive load economic model is developed. The model is based on price elasticity and the customer benefit function. Prioritizing of DR programs is realized by means of the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) method. The considerations of the ISO, utility and customers regarding the weighting of attributes are captured by the entropy method. An Analytical Hierarchy Process (AHP) is used for selecting the most effective DR program. Numerical studies are conducted on the load curve of the Iranian power grid in 2007. (author)
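As a minimal sketch of the TOPSIS step described above, the fragment below ranks three hypothetical DR programs on three benefit attributes; the decision matrix and weights are invented for illustration and do not come from the Iranian grid study.

import numpy as np

# Hypothetical decision matrix: rows = DR programs, columns = attributes
# (e.g., peak reduction, energy saved, customer benefit); larger is better here.
D = np.array([
    [120.0, 4.5, 0.80],   # program 1
    [ 95.0, 6.0, 0.70],   # program 2
    [140.0, 3.0, 0.60],   # program 3
])
w = np.array([0.5, 0.3, 0.2])           # attribute weights (e.g., from entropy/AHP)

# Vector-normalize each column, then weight.
R = D / np.linalg.norm(D, axis=0)
V = R * w

# Ideal and anti-ideal solutions (all attributes treated as benefits).
v_pos, v_neg = V.max(axis=0), V.min(axis=0)

# Closeness coefficient: distance to anti-ideal over total distance.
d_pos = np.linalg.norm(V - v_pos, axis=1)
d_neg = np.linalg.norm(V - v_neg, axis=1)
closeness = d_neg / (d_pos + d_neg)

ranking = np.argsort(-closeness) + 1    # 1-based program indices, best first
print("closeness:", np.round(closeness, 3), "ranking:", ranking)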

Rice University's achievements as part of the Center for Programming Models for Scalable Parallel Computing include: (1) design and implementation of cafc, the first multi-platform CAF compiler for distributed and shared-memory machines, (2) performance studies of the efficiency of programs written using the CAF and UPC programming models, (3) a novel technique to analyze explicitly-parallel SPMD programs that facilitates optimization, (4) design, implementation, and evaluation of new language features for CAF, including communication topologies, multi-version variables, and distributed multithreading to simplify development of high-performance codes in CAF, and (5) a synchronization strength reduction transformation for automatically replacing barrier-based synchronization with more efficient point-to-point synchronization. The prototype Co-array Fortran compiler cafc developed in this project is available as open source software from http://www.hipersoft.rice.edu/caf.

Programming models and environments play the essential role in high performance computing of enabling the conception, design, implementation and execution of science and engineering application codes. Programmer productivity is strongly influenced by the effectiveness of our programming models and environments, as is software sustainability, since our codes have lifespans measured in decades. The advent of new computing architectures, increased concurrency, concerns for resilience, and the increasing demands for high-fidelity, multi-physics, multi-scale and data-intensive computations mean that we have new challenges to address as part of our fundamental R&D requirements. Fortunately, we also have new tools and environments that make design, prototyping and delivery of new programming models easier than ever. The combination of new and challenging requirements and new, powerful toolsets enables significant synergies for the next generation of programming models and environments R&D. This report presents the topics discussed and results from the 2014 DOE Office of Science Advanced Scientific Computing Research (ASCR) Programming Models & Environments Summit, and subsequent discussions among the summit participants and contributors to topics in this report.

Developmental programming can be defined as a response to a specific challenge to the mammalian organism during a critical developmental time window that alters the trajectory of development with persistent effects on offspring phenotype and predisposition to future illness. We focus on the need for studies in relevant, well-characterized animal models in the context of recent research discoveries on the challenges, mechanisms and outcomes of developmental programming. We discuss commonalitie...

The paper presents a brief summary of research on a Business English training model for MBA program students. The study is devoted to the problem of developing the professional foreign language communicative competency of MBA program participants. A particular feature of the additional MBA qualification is its international status, which presupposes that its graduates (mid-level and top managers) should carry out their professional tasks in a foreign language. The analysis of literary ...

The difficulty of developing reliable parallel software is generating interest in deterministic environments, where a given program and input can yield only one possible result. Languages or type systems can enforce determinism in new code, and runtime systems can impose synthetic schedules on legacy parallel code. To parallelize existing serial code, however, we would like a programming model that is naturally deterministic without language restrictions or artificial scheduling. We propose "...

In this paper a canonical neural network with adaptively changing synaptic weights and activation function parameters is presented to solve general nonlinear programming problems. The basic part of the model is a sub-network used to find a solution of quadratic programming problems with simple upper and lower bounds. By sequentially activating the sub-network under the control of an external computer, or of a special analog or digital processor that adjusts the weights and parameters, one then solves general nonlinear programming problems. A convergence proof and numerical results are given.
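The paper's network dynamics are not reproduced here; as a discrete-time stand-in for the sub-network's role, the sketch below runs projected gradient dynamics for a quadratic program with simple upper and lower bounds, where clipping plays the part of a saturating activation function. The problem data are hypothetical.

import numpy as np

# Box-constrained QP: minimize 0.5 x'Qx + c'x  subject to  lo <= x <= hi.
Q = np.array([[4.0, 1.0], [1.0, 3.0]])   # symmetric positive definite
c = np.array([-1.0, -2.0])
lo, hi = np.array([0.0, 0.0]), np.array([1.0, 1.0])

x = np.zeros(2)
eta = 0.1                                 # step size (plays the role of a time constant)
for _ in range(500):
    # Gradient step followed by projection onto the box [lo, hi]: this mimics
    # neuron dynamics whose activation functions saturate at the bounds.
    x = np.clip(x - eta * (Q @ x + c), lo, hi)

print("x* =", np.round(x, 4), "objective =", 0.5 * x @ Q @ x + c @ x)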

The proposed implementation of work hour restrictions has presented a significant challenge to maintaining the quality of resident education and ensuring the adequate hands-on experience that is essential for novice surgeons. To maintain the level of resident surgical competency, revision of the apprentice model of surgical education to include supplementary educational methods, such as laboratory and virtual reality (VR) simulations, has become a frequent topic of discussion. We aimed to better understand the role of supplementary educational methods in Canadian neurosurgery residency training. An online survey was sent to the program directors of all 14 Canadian neurosurgical residency programs and to active resident members of the Canadian Neurosurgical Society (N=85). We asked 16 questions focusing on surgeon perception, current implementation and barriers to supplementary educational models. Of the 99 surveys sent, 8 of 14 (57%) program directors and 37 of 85 (44%) residents completed the survey. Of the 14 neurosurgery residency programs across Canada, 7 reported utilizing laboratory-based teaching within their educational plan, while only 3 programs reported using VR simulation as a supplementary teaching method. The biggest barriers to implementing supplementary educational methods were resident availability, lack of resources, and cost. Work-hour restrictions threaten to compromise the traditional apprentice model of surgical training. The potential value of supplementary educational methods for surgical education is evident, as reported by both program directors and residents across Canada. However, availability and utilization of laboratory and VR simulations are limited by numerous factors such as time constraints and lack of resources.

Mathematical equations are fundamental to modeling biological networks, but as networks get large and revisions frequent, it becomes difficult to manage equations directly or to combine previously developed models. Multiple simultaneous efforts to create graphical standards, rule-based languages, and integrated software workbenches aim to simplify biological modeling but none fully meets the need for transparent, extensible, and reusable models. In this paper we describe PySB, an approach in which models are not only created using programs, they are programs. PySB draws on programmatic modeling concepts from little b and ProMot, the rule-based languages BioNetGen and Kappa and the growing library of Python numerical tools. Central to PySB is a library of macros encoding familiar biochemical actions such as binding, catalysis, and polymerization, making it possible to use a high-level, action-oriented vocabulary to construct detailed models. As Python programs, PySB models leverage tools and practices from the open-source software community, substantially advancing our ability to distribute and manage the work of testing biochemical hypotheses. We illustrate these ideas using new and previously published models of apoptosis.
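A minimal model-as-program in the spirit of PySB's published examples is sketched below (reversible ligand-receptor binding); the monomer names, rates and amounts are invented for illustration, and exact API details should be checked against the PySB documentation.

from pysb import Model, Monomer, Parameter, Initial, Rule, Observable
from pysb.simulator import ScipyOdeSimulator
import numpy as np

Model()  # PySB's self-export makes `model` available in this module

Monomer('L', ['b'])   # ligand with one binding site
Monomer('R', ['b'])   # receptor with one binding site

Parameter('kf', 1e-3)
Parameter('kr', 1e-1)
Parameter('L_0', 100)
Parameter('R_0', 200)

Initial(L(b=None), L_0)
Initial(R(b=None), R_0)

# Reversible binding: free L and R associate into a complex and dissociate.
Rule('L_binds_R', L(b=None) + R(b=None) | L(b=1) % R(b=1), kf, kr)

Observable('LR', L(b=1) % R(b=1))

tspan = np.linspace(0, 100, 101)
result = ScipyOdeSimulator(model, tspan).run()
print(result.observables['LR'][-1])   # complex abundance at the final time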

The Hanford Site Protective Barrier Development Program was jointly developed by Pacific Northwest Laboratory (PNL) and Westinghouse Hanford Company (WHC) to design and test an earthen cover system(s) that can be used to inhibit water infiltration; plant, animal, and human intrusion; and wind and water erosion. The joint PNL/WHC program was initiated in FY 1986. To date, research findings support the initial concepts of barrier designs for the Hanford Site. A fine-soil surface is planned to partition surface water into runoff and temporary storage. Transpiration by vegetation that grows in the fine-soil layer will return stored water to the atmosphere as will surface evaporation. A capillary break created by the interface of the fine-soil layer and coarser textured materials below will further limit the downward migration of surface water, making it available over a longer period of time for cycling to the atmosphere. Should water pass the interface, it will drain laterally through a coarse textured sand/gravel layer. Tested barrier designs appear to work adequately to prevent drainage under current and postulated wetter-climate (added precipitation) conditions. Wind and water erosion tasks are developing data to predict the extent of erosion on barrier surfaces. Data collected during the last year confirm the effectiveness of small burrowing animals in removing surface water. Water infiltrating through burrows of larger mammals was subsequently lost by natural processes. Natural analog and climate change studies are under way to provide credibility for modeling the performance of barrier designs over a long period of time and under shifts in climate. 10 refs., 30 figs.

The aim of this article is to propose a general approach to link a stochastic programming enabler to a mathematical programming modeling language. Modelers often choose to formulate their problems in well-tested, general purpose modeling languages such as GAMS and AMPL, but these modeling languages do not currently implement a natural syntax for stochastic programming. Specialized stochastic programming tools are available to efficiently generate and solve large-scale stochastic programs, but they lack many of the convenient features of the modeling languages. The lack of a well-developed link between these tools and modeling languages prevents many modelers from accessing a powerful and convenient technique for taking uncertainties into account. As an attempt to fill this gap, we present SISP (Simplified Interface for Stochastic Programming), an interface between algebraic modeling languages and specialized stochastic programming solvers, also known as SP solvers. 12 Refs.

The goal of the Benchmark Models Program is to provide data useful in the development and evaluation of aeroelastic computational fluid dynamics (CFD) codes. To that end, a series of three similar wing models are being flutter tested in the Langley Transonic Dynamics Tunnel. These models are designed to simultaneously acquire model response data and unsteady surface pressure data during wing flutter conditions. The supercritical wing is the second model of this series. It is a rigid semispan model with a rectangular planform and a NASA SC(2)-0414 supercritical airfoil shape. The supercritical wing model was flutter tested on a flexible mount, called the Pitch and Plunge Apparatus, that provides a well-defined, two-degree-of-freedom dynamic system. The supercritical wing model and associated flutter test apparatus are described, and experimentally determined wind-off structural dynamic characteristics of the combined rigid model and flexible mount system are included.

Researchers and practitioners alike recognize that "the national goal that every child in the United States has access to high-quality school education in science and mathematics cannot be realized without the availability of effective professional development of teachers" (Hewson, 1997, p. 16). Further, there is a plethora of reports calling for the improvement of professional development efforts (Guskey & Huberman, 1995; Kyle, 1995; Loucks-Horsley, Hewson, Love, & Stiles, 1997). In this study I analyze a successful 3-year teacher enhancement program, one form of professional development, to: (1) identify essential components of an effective teacher enhancement program; and (2) create a model to identify and articulate the critical issues in designing, implementing, and evaluating teacher enhancement programs. Five primary sources of information were converted into data: (1) exit questionnaires, (2) exit surveys, (3) exit interview transcripts, (4) focus group transcripts, and (5) other artifacts. Additionally, a focus group was used to conduct member checks. Data were analyzed in an iterative process which led to the development of the list of essential components. The Components are categorized by three organizers: Structure (e.g., science research experience, a mediator throughout the program), Context (e.g., intensity, collaboration), and Participant Interpretation (e.g., perceived to be "safe" to examine personal beliefs and practices, actively engaged). The model is based on: (1) a 4-year study of a successful teacher enhancement program; (2) an analysis of professional development efforts reported in the literature; and (3) reflective discussions with implementors, evaluators, and participants of professional development programs. The model consists of three perspectives, cognitive, symbolic interaction, and organizational, representing different viewpoints from which to consider issues relevant to the success of a teacher enhancement program. These…

In August 1992, the Energy Research Center (ERC) at the University of Kansas was awarded a contract by the US Department of Energy (DOE) to develop a technology transfer regional model. This report describes the development and testing of the Kansas Technology Transfer Model (KTTM) which is to be utilized as a regional model for the development of other technology transfer programs for independent operators throughout oil-producing regions in the US. It describes the linkage of the regional model with a proposed national technology transfer plan, an evaluation technique for improving and assessing the model, and the methodology which makes it adaptable on a regional basis. The report also describes management concepts helpful in managing a technology transfer program.

This report is one in a series of documents describing research activities in support of the US Department of Energy (DOE) Building Energy Codes Program. The Pacific Northwest Laboratory (PNL) leads the program for DOE. The goal of the program is to develop and support the adoption, implementation, and enforcement of Federal, State, and local energy codes for new buildings. The program approach to meeting the goal is to initiate and manage individual research and standards and guidelines development efforts that are planned and conducted in cooperation with representatives from throughout the buildings community. Projects under way involve practicing architects and engineers, professional societies and code organizations, industry representatives, and researchers from the private sector and national laboratories. Research results and technical justifications for standards criteria are provided to standards development and model code organizations and to Federal, State, and local jurisdictions as a basis to update their codes and standards. This effort helps to ensure that building standards incorporate the latest research results to achieve maximum energy savings in new buildings, yet remain responsive to the needs of the affected professions, organizations, and jurisdictions. Also supported are the implementation, deployment, and use of energy-efficient codes and standards. This report documents findings from an analysis conducted by PNL of the State's building codes to determine if the codes meet or exceed the 1992 MEC energy efficiency requirements (CABO 1992a).

To evaluate the potential annual net cost savings of implementing an ICU early rehabilitation program. Using data from existing publications and actual experience with an early rehabilitation program in the Johns Hopkins Hospital Medical ICU, we developed a model of net financial savings/costs and presented results for ICUs with 200, 600, 900, and 2,000 annual admissions, accounting for both conservative- and best-case scenarios. Our example scenario provided a projected financial analysis of the Johns Hopkins Medical ICU early rehabilitation program, with 900 admissions per year, using actual reductions in length of stay achieved by this program. U.S.-based adult ICUs. Financial modeling of the introduction of an ICU early rehabilitation program. Net cost savings generated in our example scenario, with 900 annual admissions and actual length of stay reductions of 22% and 19% for the ICU and floor, respectively, were $817,836. Sensitivity analyses, which used conservative- and best-case scenarios for length of stay reductions and varied the per-day ICU and floor costs, across ICUs with 200-2,000 annual admissions, yielded financial projections ranging from -$87,611 (net cost) to $3,763,149 (net savings). Of the 24 scenarios included in these sensitivity analyses, 20 (83%) demonstrated net savings, with a relatively small net cost occurring in the remaining four scenarios, mostly when simultaneously combining the most conservative assumptions. A financial model, based on actual experience and published data, projects that investment in an ICU early rehabilitation program can generate net financial savings for U.S. hospitals. Even under the most conservative assumptions, the projected net cost of implementing such a program is modest relative to the substantial improvements in patient outcomes demonstrated by ICU early rehabilitation programs.
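Only the admissions count and the 22%/19% length-of-stay reductions in the sketch below come from the abstract; the baseline lengths of stay, per-day costs, and program cost are hypothetical placeholders, so the fragment reproduces the structure of the net-savings calculation rather than the published $817,836 figure.

# Net-savings structure of an ICU early-rehabilitation financial model.
admissions          = 900      # annual admissions (from the abstract's example)
icu_los_days        = 5.0      # assumed baseline ICU length of stay
floor_los_days      = 6.0      # assumed baseline floor length of stay
icu_cost_per_day    = 1500.0   # assumed variable cost per ICU day
floor_cost_per_day  = 600.0    # assumed variable cost per floor day
icu_reduction       = 0.22     # 22% ICU LOS reduction (from the abstract)
floor_reduction     = 0.19     # 19% floor LOS reduction (from the abstract)
program_cost        = 500000.0 # assumed annual staffing/equipment cost

icu_savings   = admissions * icu_los_days * icu_reduction * icu_cost_per_day
floor_savings = admissions * floor_los_days * floor_reduction * floor_cost_per_day
net = icu_savings + floor_savings - program_cost

print(f"ICU savings:   ${icu_savings:,.0f}")
print(f"Floor savings: ${floor_savings:,.0f}")
print(f"Net:           ${net:,.0f}")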

In this article, the feasibility of a unified modelling and programming paradigm is discussed from the perspective of large scale system development and verification in collaborative development environments, in particular for complex cyber-physical systems or systems of systems. We motivate the necessity to utilise multiple formalisms for development and verification. Though modelling, programming, and verification will certainly become more closely integrated in the future, we do not expect a single formalism to become universally applicable and accepted by the development and verification communities.

Most of CERN's industrial installations rely on PLC-based (Programmable Logic Controller) control systems developed using the UNICOS framework. This framework contains common, reusable program modules, and their correctness is a high priority. Testing is already applied to find errors, but this method has limitations. In this work an approach is proposed to automatically transform PLC programs into formal models, with the goal of applying formal verification to ensure their correctness. We target model checking, which is a precise, mathematics-based method for automatically checking formalized requirements against the system.

Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers as it can rarely be automated. This paper proposes a model based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. This paper defines the translation procedure and rules from UNICOS to BIP which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.

Reuse is viewed as a realistically effective approach to solving the software crisis. For an organization that wants to build a reuse program, technical and non-technical issues must be considered in parallel. In this paper, a model-based approach to building a systematic reuse program is presented. Component-based reuse is currently the dominant approach to software reuse, and building the right reusable component model is the first important step in it. In order to achieve systematic reuse, a set of component models should be built from different perspectives. Each of these models gives a specific view of the components so as to satisfy the different needs of the different persons involved in the enterprise reuse program. There already exist some component models for reuse from technical perspectives, but less attention has been paid to reusable components from a non-technical view, especially from the view of process and management. In our approach, a reusable component model, the FLP model for reusable components, is introduced. This model describes components in three dimensions (Form, Level, and Presentation) and views components and their relationships from the perspective of process and management. It determines the sphere of reusable components, the points in the development process at which components are reused, and the means needed to present components in terms of abstraction level, logic granularity and presentation media. Being the basis on which management and technical decisions are made, our model will be used as the kernel model to initialize and normalize a systematic enterprise reuse program.

Simulation systems that show the road in action are a central part of computer-aided road design; modeling the functioning of a road in such a simulation system amounts to testing the road design on a computer. This article describes three modules, PARK, PROFILE, and COMPOSITION, that together form a set of programs for simulating the functioning of a road. The PARK module provides a normative reference database of the technical and economic parameters of the vehicle types in the traffic stream, which significantly increases the accuracy of simulation results; the completeness of this framework allows continuous adjustment and updating of vehicle parameters, prevents miscalculation of construction and operating costs in economic justifications of design solutions, and increases the reliability of capital-investment evaluations for road construction and reconstruction. The PROFILE program (and the BASIS program built on it) analyzes the geometric elements of the plan and longitudinal section, compressing the geometric information about the road for subsequent traffic simulation; it is the link between the design programs and the programs that simulate traffic. Transport and road performance are modeled for a particular automobile stream: the COMPOSITION module selects the technical and economic parameters of the vehicle types in the flow (up to 20, which is sufficient for practical and research tasks) and their percentages from the normative reference framework established by the PARK module, and writes them to a working file for subsequent use by the RIDE module.

This document describes the phantom dosimetry used for the PFP Area Monitoring program and establishes the basis for the Plutonium Finishing Plant's (PFP) area monitoring dosimetry program in accordance with the following requirements: Title 10, Code of Federal Regulations (CFR), Part 835, "Occupational Radiation Protection," Section 835.403; Hanford Site Radiological Control Manual (HSRCM-1), Part 514; HNF-PRO-382, Area Dosimetry Program; and PNL-MA-842, Hanford External Dosimetry Technical Basis Manual.

Operational semantics has established itself as a flexible but rigorous means to describe the meaning of programming languages. Oftentimes, it is felt necessary to keep a semantics small, for example to facilitate its use for model checking by avoiding state space explosion. However, omitting many details in a semantics typically makes results valid for a limited core language only, leaving a wide gap towards any real implementation. In this paper we present a full-fledged semantics of the concurrent object-oriented programming language SCOOP (Simple Concurrent Object-Oriented Programming). The semantics has been found detailed enough to guide an implementation of the SCOOP compiler and runtime system, and to detect and correct a variety of errors and ambiguities in the original informal specification and prototype implementation. In our formal specification, we use abstract data types with preconditions and axioms to describe the state, and introduce a number of special run-time operations to model the runti...

Diabetes mellitus (DM) is a chronic metabolic disease characterized by a higher than normal blood glucose level (normal range: 80-120 mg/dl). This study investigates type 2 DM, which is mostly caused by unhealthy eating habits. With regard to eating habits, DM patients need dietary menu planning with extra care for their nutrient intake (energy, protein, fat and carbohydrate). The measure taken here is therefore to organize a nutritious dietary menu for diabetes mellitus patients, with appropriate amounts of nutrients determined by considering the required calories, proteins, fats and carbohydrates. In this study, a Goal Programming model is employed to determine optimal dietary menu variations for diabetes mellitus patients while paying attention to optimal expenses. Using data obtained from hospitals in Yogyakarta, optimal menu variations are analyzed with the Goal Programming model and solved using the LINGO computer program.
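As a minimal sketch of such a diet model, the fragment below solves a goal program with under- and over-achievement deviation variables using scipy instead of LINGO; the foods, nutrient contents, targets, and weights are hypothetical, not the Yogyakarta hospital data.

import numpy as np
from scipy.optimize import linprog

# Per-serving nutrient content (hypothetical foods): energy (kcal), protein,
# fat, carbohydrate (g). Columns: rice, chicken, vegetables.
A = np.array([
    [205.0, 231.0, 50.0],   # energy
    [  4.3,  43.0,  3.0],   # protein
    [  0.4,   5.0,  0.5],   # fat
    [ 44.5,   0.0, 10.0],   # carbohydrate
])
targets = np.array([1800.0, 60.0, 50.0, 250.0])   # daily goals for a DM diet
weights = np.array([1.0, 2.0, 2.0, 1.0])          # penalty per unit deviation

n_goals, n_foods = A.shape

# Variables: [servings (3) | under-achievement d- (4) | over-achievement d+ (4)].
# Goal constraints: A x + d- - d+ = target (deviations absorb shortfall/excess).
A_eq = np.hstack([A, np.eye(n_goals), -np.eye(n_goals)])
b_eq = targets

# Objective: minimize the weighted sum of deviations from the nutrient goals.
c = np.concatenate([np.zeros(n_foods), weights, weights])

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (n_foods + 2 * n_goals))
servings = res.x[:n_foods]
print("servings:", np.round(servings, 2), "total deviation cost:", round(res.fun, 2))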

As part of the Green Plan, introduced by the Federal Government in late 1990, a network of model forests was developed to demonstrate the concept of sustainable forest management in practical terms on a working scale. This annual report describes the competitive site selection process, the forests involved in the project, program milestones, the operation of model forests, and highlights of the project. Financial data is included.

The State of Alaska Department of Education has created a handbook for establishing budgets for the following three types of construction projects: new schools or additions; renovations; and combined new work and renovations. The handbook supports a demand cost model computer program that includes detailed renovation cost data, itemized by…

Inventory problems generally have a structure that can be exploited for computational purposes. Here, we look at the duals of two seemingly unrelated inventory models that suggest an interesting duality between discrete time optimal control and optimization over an ordered sequence of variables. Concepts from conjugate duality and generalized geometric programming are used to establish the duality.

It is the purpose of this article to discuss the development and application of a logic model in the context of a large scientific research program within the Commonwealth Scientific and Industrial Research Organisation (CSIRO). CSIRO is Australia's national science agency and is a publicly funded part of Australia's innovation system. It conducts…

This article describes Universal Instructional Design as an inclusive pedagogical model for use in educational programs, whether provided by traditional educational institutions, community-based initiatives, or workplace literacy projects. For the benefit of public relations specialists and classroom educators alike, the article begins with a…

Model programs designed to promote diversity within the West Valley-Mission Community College District (WVMCCD) in California are discussed and described in this report. First, an introductory chapter, "The Importance of Cultural Issues to Higher Education," by Gustavo A. Mellander and Fred Prochaska, reviews the diversity recommendations of the…

The web is a large repository of information, and to facilitate the search and retrieval of pages from it, categorization of web documents is essential. An effective means of handling the complexity of information retrieval from the internet is automatic classification of web pages. Although many automatic classification algorithms and systems have been presented, most existing approaches are computationally expensive. In order to overcome this challenge, we have proposed a parallel algorithm based on the MapReduce programming model to automatically categorize web pages. This approach incorporates three components: a web crawler, the MapReduce programming model, and the proposed web page categorization approach. Initially, the web crawler mines the World Wide Web, and the crawled web pages are given directly as input to the MapReduce programming model. The MapReduce programming model, adapted to our proposed web page categorization approach, finds the appropriate category of each web page according to its content. The experimental results show that our proposed parallel web page categorization approach achieves satisfactory results in finding the right category for any given web page.
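A minimal sequential sketch of the map and reduce phases is shown below; the keyword matching stands in for the paper's content-based categorization, and in a real deployment the two functions would run under a MapReduce runtime such as Hadoop rather than this toy driver.

from collections import defaultdict

# Hypothetical crawled pages: (url, text) pairs.
pages = [
    ("http://example.com/a", "stock market shares trading economy"),
    ("http://example.com/b", "football league match goal player"),
    ("http://example.com/c", "election parliament vote policy economy"),
]

# Simple keyword lists standing in for the paper's categorization approach.
KEYWORDS = {
    "finance":  {"stock", "market", "shares", "trading", "economy"},
    "sports":   {"football", "league", "match", "goal", "player"},
    "politics": {"election", "parliament", "vote", "policy"},
}

def mapper(url, text):
    """Map phase: emit (category, url) for the best-matching category."""
    words = set(text.split())
    best = max(KEYWORDS, key=lambda c: len(words & KEYWORDS[c]))
    yield best, url

def reducer(category, urls):
    """Reduce phase: collect all pages assigned to one category."""
    return category, sorted(urls)

# Sequential driver emulating the MapReduce runtime's shuffle step.
shuffled = defaultdict(list)
for url, text in pages:
    for category, u in mapper(url, text):
        shuffled[category].append(u)

for category, urls in shuffled.items():
    print(reducer(category, urls))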

Presents an adaptable, context-sensitive model for ESL/EFL program evaluation, consisting of seven steps that guide an evaluator through consideration of relevant issues, information, and design elements. Examples from an evaluation of the Reading for Science and Technology Project at the University of Guadalajara, Mexico are given. (31…

The Arctic Climate Modeling Program (ACMP) offered yearlong science, technology, engineering, and math (STEM) professional development to teachers in rural Alaska. Teacher training focused on introducing youth to workforce technologies used in Arctic research. Due to challenges in making professional development accessible to rural teachers, ACMP…

The Council for Accreditation of Counseling and Related Educational Programs (2001) has approved the use of triadic supervision as an alternative to individual supervision in clinical instruction. However, literature describing this mode of supervision is very limited. A model for triadic supervision is described, including presession planning,…

There is currently a large research and development effort within the high-performance computing community on advanced parallel programming models. This research can potentially have an impact on parallel applications, system software, and computing architectures in the next several years. Given Sandia's expertise and unique perspective in these areas, particularly on very large-scale systems, there are many areas in which Sandia can contribute to this effort. This technical report provides a survey of past and present parallel programming model research projects and provides a detailed description of the Partitioned Global Address Space (PGAS) programming model. The PGAS model may offer several improvements over the traditional distributed memory message passing model, which is the dominant model currently being used at Sandia. This technical report discusses these potential benefits and outlines specific areas where Sandia's expertise could contribute to current research activities. In particular, we describe several projects in the areas of high-performance networking, operating systems and parallel runtime systems, compilers, application development, and performance evaluation.
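
To make the PGAS idea concrete, here is a conceptual single-process Python sketch (not Sandia's work or any real PGAS runtime such as UPC or Coarray Fortran): a globally addressable array is partitioned across "ranks", with local access to the owned partition and explicit one-sided get/put on remote partitions.

    class PGASArray:
        """Toy model of a partitioned global address space array."""
        def __init__(self, n, nranks):
            self.n, self.nranks = n, nranks
            self.chunk = (n + nranks - 1) // nranks
            self.partitions = [[0] * self.chunk for _ in range(nranks)]

        def owner(self, i):
            # block distribution: which rank owns global index i
            return i // self.chunk

        def get(self, rank, i):
            # local read if `rank` owns i, otherwise a (simulated) remote get
            return self.partitions[self.owner(i)][i % self.chunk]

        def put(self, rank, i, value):
            # one-sided write: no matching receive on the owning rank
            self.partitions[self.owner(i)][i % self.chunk] = value

    a = PGASArray(8, nranks=4)
    a.put(rank=0, i=6, value=42)   # rank 0 writes into rank 3's partition
    print(a.get(rank=1, i=6))      # any rank can address the global space

The `rank` argument only marks the calling rank in this toy; real PGAS systems exploit it so that owned-partition accesses compile down to plain loads and stores while remote accesses become one-sided communication.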

The paper deals with integer linear programming problems. As is well known, these are extremely complex problems, even when the number of integer variables is quite low. The literature provides examples of various methods to solve such problems, some of which are of a heuristic nature. This paper proposes an alternative strategy based on the Hopfield neural network. The advantage of the strategy essentially lies in the fact that a hardware implementation of the neural model allows the time required to obtain a solution to be independent of the size of the problem to be solved. The paper presents a particular class of integer linear programming problems, which includes well-known problems such as the Travelling Salesman Problem and the Set Covering Problem. After a brief description of this class of problems, it is demonstrated that the original Hopfield model is incapable of supplying valid solutions. This is attributed to the presence of constant bias currents in the dynamics of the neural model. A demonstration of this is given, and then a novel neural model is presented which continues to be based on the same architecture as the Hopfield model, but introduces modifications thanks to which the integer linear programming problems presented can be solved. Some numerical examples and concluding remarks highlight the solving capacity of the novel neural model.
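
For orientation, the discrete Hopfield dynamics at issue can be sketched in a few lines of Python. In ILP applications the constraints are folded into the weights and biases as quadratic penalties; the bias term b below corresponds to the "constant bias currents" the paper identifies as problematic. The data here are random placeholders, not one of the paper's problem instances.

    import numpy as np

    rng = np.random.default_rng(0)

    # symmetric weights with zero diagonal (classic Hopfield requirements)
    n = 8
    W = rng.normal(size=(n, n)); W = (W + W.T) / 2; np.fill_diagonal(W, 0.0)
    b = rng.normal(size=n)

    def energy(x):
        # E(x) = -1/2 x'Wx - b'x; asynchronous updates never increase it
        return -0.5 * x @ W @ x - b @ x

    x = rng.integers(0, 2, size=n).astype(float)
    for sweep in range(20):
        for i in rng.permutation(n):               # asynchronous updates
            x[i] = 1.0 if W[i] @ x + b[i] > 0 else 0.0
    print("final state:", x, "energy:", energy(x))

The network settles into a local minimum of the energy; the paper's point is that for the ILP class it studies, the bias currents can make those minima violate the problem constraints, motivating its modified model.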

Schools are adopting evidence-based programs designed to enhance students' emotional and behavioral competencies at increasing rates (Hemmeter et al. in Early Child Res Q 26:96-109, 2011). At the same time, teachers express the need for increased support surrounding implementation of these evidence-based programs (Carter and Van Norman in Early Child Educ 38:279-288, 2010). Ongoing professional development in the form of coaching may enhance teacher skills and implementation (Noell et al. in School Psychol Rev 34:87-106, 2005; Stormont et al. 2012). There exists a need for a coaching model that can be applied to a variety of teacher skill levels and one that guides coach decision-making about how best to support teachers. This article provides a detailed account of a two-phased coaching model with empirical support developed and tested with coaches and teachers in urban schools (Becker et al. 2013). In the initial universal coaching phase, all teachers receive the same coaching elements regardless of their skill level. Then, in the tailored coaching phase, coaching varies according to the strengths and needs of each teacher. Specifically, more intensive coaching strategies are used only with teachers who need additional coaching supports, whereas other teachers receive just enough support to consolidate and maintain their strong implementation. Examples of how coaches used the two-phased coaching model when working with teachers who were implementing two universal prevention programs (i.e., the PATHS curriculum and PAX Good Behavior Game [PAX GBG]) provide illustrations of the application of this model. The potential reach of this coaching model extends to other school-based programs as well as other settings in which coaches partner with interventionists to implement evidence-based programs.

A computer program entitled PAIN (Propeller Aircraft Interior Noise) has been developed to permit calculation of the sound levels in the cabin of a propeller-driven airplane. The fuselage is modeled as a cylinder with a structurally integral floor, the cabin sidewall and floor being stiffened by ring frames, stringers and floor beams of arbitrary configurations. The cabin interior is covered with acoustic treatment and trim. The propeller noise consists of a series of tones at harmonics of the blade passage frequency. Input data required by the program include the mechanical and acoustical properties of the fuselage structure and sidewall trim. Also, the precise propeller noise signature must be defined on a grid that lies in the fuselage skin. The propeller data are generated with a propeller noise prediction program such as the NASA Langley ANOPP program. The program PAIN permits the calculation of the space-average interior sound levels for the first ten harmonics of a propeller rotating alongside the fuselage. User instructions for PAIN are given in the report. Development of the analytical model is presented in NASA CR 3813.

Modern real-time applications are increasingly large, complex and concurrent systems which must meet stringent performance and predictability requirements. Programming those systems requires fundamental advances in programming languages and runtime systems. This talk presents our work on Flexotasks, a programming model for concurrent, real-time systems inspired by stream processing and concurrent active objects. Among the key innovations in Flexotasks is that it supports both real-time garbage collection and region-based memory with an ownership type system for static safety. Communication between tasks is performed by channels with a linear type discipline to avoid copying messages, and by a non-blocking transactional memory facility. We have evaluated our model empirically within two distinct implementations, one based on Purdue's Ovm research virtual machine framework and the other on WebSphere, IBM's production real-time virtual machine. We have written a number of small programs, as well as a 30 KLOC avionics collision detector application. We show that Flexotasks are capable of executing periodic threads at 10 kHz with a standard deviation of 1.2 µs and have performance competitive with hand-coded C programs.

Gstat is a computer program for variogram modelling, and geostatistical prediction and simulation. It provides a generic implementation of the multivariable linear model with trends modelled as a linear function of coordinate polynomials or of user-defined base functions, and independent or dependent, geostatistically modelled, residuals. Simulation in gstat comprises conditional or unconditional (multi-) Gaussian sequential simulation of point values or block averages, or (multi-) indicator sequential simulation. Besides many of the popular options found in other geostatistical software packages, gstat offers the unique combination of (i) an interactive user interface for modelling variograms and generalized covariances (residual variograms), that uses the device-independent plotting program gnuplot for graphical display, (ii) support for several ascii and binary data and map file formats for input and output, (iii) a concise, intuitive and flexible command language, (iv) user customization of program defaults, (v) no built-in limits, and (vi) free, portable ANSI-C source code. This paper describes the class of problems gstat can solve, and addresses aspects of efficiency and implementation, managing geostatistical projects, and relevant technical details.

PDDP, the Parallel Data Distribution Preprocessor, is a data parallel programming model for distributed memory parallel computers. PDDP implements High Performance Fortran compatible data distribution directives and parallelism expressed by the use of Fortran 90 array syntax, the FORALL statement, and the WHERE construct. Distributed data objects belong to a global name space; other data objects are treated as local and replicated on each processor. PDDP allows the user to program in a shared-memory style and generates code that is portable to a variety of parallel machines. For interprocessor communication, PDDP uses the fastest communication primitives on each platform.
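
For readers who know array programming in Python rather than Fortran, the constructs PDDP builds on have close NumPy analogues. A rough correspondence (this is not PDDP itself, which compiles Fortran with data distribution directives):

    import numpy as np

    a = np.arange(12.0).reshape(3, 4)
    b = np.ones((3, 4))

    c = a + 2.0 * b                 # Fortran 90 array syntax:  c = a + 2.0*b
    d = np.where(a > 5.0, a, 0.0)   # WHERE (a > 5.0) d = a; ELSEWHERE d = 0.0
    e = a[1:, :] - a[:-1, :]        # FORALL(i=2:3, j=1:4) e(i,j) = a(i,j) - a(i-1,j)
    print(c.sum(), d.sum(), e.sum())

The attraction of such whole-array notation for a preprocessor like PDDP is that each statement exposes its parallelism directly: every element of c, d and e can be computed independently, so the compiler is free to partition the arrays across processors.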

A selection program for available watershed models (also known as SPAWM) was developed. Thirty-three commonly used watershed models were analyzed in depth and classified according to their attributes. These attributes are: (1) land use; (2) event or continuous; (3) time steps; (4) water quality; (5) distributed or lumped; (6) subsurface; (7) overland sediment; and (8) best management practices. Each of these attributes was further classified into sub-attributes. Based on user-selected sub-attributes, the most appropriate watershed model is selected from the library of watershed models. SPAWM is implemented in Excel Visual Basic and is designed for use by novices as well as experts on watershed modeling. It ensures that the necessary sub-attributes required by the user are captured and made available in the selected watershed model.
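
The core selection step is simple attribute matching, which can be sketched as follows. The model names and attribute sets below are hypothetical stand-ins, not SPAWM's actual 33-model library (which is implemented in Excel Visual Basic):

    # Hypothetical subset of a watershed-model attribute library
    LIBRARY = {
        "ModelA": {"continuous", "distributed", "water_quality", "bmp"},
        "ModelB": {"event", "lumped"},
        "ModelC": {"continuous", "lumped", "water_quality"},
    }

    def select_models(required):
        # return models whose attribute set covers everything the user asked for
        return [m for m, attrs in LIBRARY.items() if required <= attrs]

    print(select_models({"continuous", "water_quality"}))  # ['ModelA', 'ModelC']

The subset test (`required <= attrs`) captures SPAWM's guarantee that every sub-attribute the user needs is actually supported by the recommended model.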

Program SMART (Spectra and Model Atmospheres by Radiative Transfer) has been composed for modelling atmospheres and spectra of hot stars (O, B and A spectral classes) and for studying different physical processes in them (Sapar & Poolamäe 2003, Sapar et al. 2007). Line-blanketed models are computed assuming a plane-parallel, static and horizontally homogeneous atmosphere in radiative, hydrostatic and local thermodynamic equilibrium. The main advantages of SMART are its shortness, simplicity, user friendliness and flexibility for the study of different physical processes. SMART runs successfully on PCs under both Windows and Linux.

An artificial immune system was modelled with self/non-self selection to overcome abnormity in a mobile robot demo. The immune modelling includes innate immune modelling and adaptive immune modelling. Self/non-self selection includes detection and recognition; self/non-self detection is based on the normal model of the demo. After detection, non-self recognition is based on learning unknown non-self for adaptive immunization. The learning was designed on a neural network or on a mechanism of learning from examples. The last step is elimination of all non-self and failover of the demo. The immunization of the mobile robot demo was programmed in Java to test the effectiveness of the approach. Some worms infected the mobile robot demo and caused the abnormity. The results of the immunization simulations show that the immune program can detect 100% of the worms, recognize all known worms and most unknown worms, and eliminate the worms. Moreover, the damaged files of the mobile robot demo can all be repaired through the normal model and immunization. Therefore, the immune modelling of the mobile robot demo is effective and programmable in some anti-worm and abnormity detection applications.

Future multiscale and multiphysics models must use the power of high performance computing (HPC) systems to enable research into human disease, translational medical science, and treatment. Previously we showed that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message passing processes (e.g. the message passing interface (MPI)) with multithreading (e.g. OpenMP, POSIX pthreads). The objective of this work is to compare the performance of such hybrid programming models when applied to the simulation of a lightweight multiscale cardiac model. Our results show that the hybrid models do not perform favourably when compared to an implementation using only MPI, which is in contrast to our results using complex physiological models. Thus, with regards to lightweight multiscale cardiac models, the user may not need to increase programming complexity by using a hybrid programming approach. However, considering that model complexity will increase, as will HPC system size in both node count and number of cores per node, it is still foreseeable that we will achieve faster than real time multiscale cardiac simulations on these systems using hybrid programming models.
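
The hybrid pattern being compared can be sketched in Python as a toy stand-in: mpi4py for the message-passing layer and a thread pool for the within-node layer. The cell-update function is an invented placeholder, not the record's cardiac model, and Python threads share the GIL, so real codes use OpenMP or pthreads at this level.

    # run with e.g.: mpirun -np 4 python hybrid_toy.py
    from mpi4py import MPI
    from concurrent.futures import ThreadPoolExecutor

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    def cell_update(i):
        # placeholder for one cardiac-cell state update
        return i * 0.001 + rank

    cells = range(rank * 1000, (rank + 1) * 1000)    # distribute cells over ranks
    with ThreadPoolExecutor(max_workers=4) as pool:  # multithread within a rank
        local = sum(pool.map(cell_update, cells))

    total = comm.allreduce(local, op=MPI.SUM)        # message passing across ranks
    if rank == 0:
        print(total)

The study's finding is essentially that for lightweight models the inner threading layer adds complexity without outperforming a flat all-MPI decomposition of the same work.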

Background: Patients receiving cancer treatment mostly start lifestyle changes at the end of the treatment, during the rehabilitation period. Most often, the first step is a dietary change and physical exercises built into the daily routine. Patients who do this in groups led by qualified therapists, and based on professional counseling, can build more effective and more permanent changes into their life. The aim was to develop a complex rehabilitation program which, in the short term, familiarizes patients with a lifestyle that harmonizes the physical, mental, spiritual and social spheres of life and, in the long term, builds it into their everyday life in order to ameliorate their physical and mental state and reduce the psychological symptoms and the isolation of patients. The physical component focuses on diet and exercise. The psycho-social-spiritual support focuses on discovering inner sources of strength, developing active coping mechanisms and helping to achieve more open communication. Participants and procedure: In February and March 2011, 8 patients treated for malignant tumors participated in the model program. The components of the model program were psychotherapy, physiotherapy, cancer consultation, nutrition counseling, creative activities and walking. Results: During the period of the model program the isolation of the patients decreased, and their social support and ability to cope with the illness improved. They reported an easing of anxiety and depression in their everyday activities. According to feedback, their communication with each other, with the staff and with their relatives became more open. Altogether this had advantageous effects on the functioning of the ward and the mood of the staff. Conclusions: The rehabilitation program confirmed that besides individual psycho-social support, beneficial and economic psycho-social support can be provided for the patients in group form along with the most effective assignment of the…

Developmental programming can be defined as a response to a specific challenge to the mammalian organism during a critical developmental time window that alters the trajectory of development with persistent effects on offspring phenotype and predisposition to future illness. We focus on the need for studies in relevant, well-characterized animal models in the context of recent research discoveries on the challenges, mechanisms and outcomes of developmental programming. We discuss commonalities and differences in general principles of developmental programming as they apply to several species, including humans. The consequences of these differences are discussed. Obesity, metabolic disorders and cardiovascular diseases are associated with the highest percentage of morbidity and mortality worldwide. Although many of the causes are associated with lifestyle, high-energy diets and lack of physical activity, recent evidence has linked developmental programming to the epidemic of metabolic diseases. A better understanding of comparative systems physiology of mother, fetus and neonate using information provided by rapid advances in molecular biology has the potential to improve the lifetime health of future generations by providing better women's health, diagnostic tools and preventative and therapeutic interventions in individuals exposed during their development to programming influences.

The effects of flash EEPROM floating gate (FG) morphology on the generation and density of fast programming bits in a 2 Mbit flash EEPROM array have been characterized. These fast programming bits exhibit subthreshold characteristics identical to those of a normal bit after UV erase, thus establishing that the initial charge stored on the FG of both fast and normal bits is the same. Experimental results clearly indicate that the fast programming phenomenon results from an interaction of the programming process with the FG polysilicon microstructure. In-depth experimentation previously reported reveals that the FG poly deposition and doping processes are crucial for controlling the desired Fowler-Nordheim tunneling. A correlation is established between the fast bit density observed in the memory arrays, the FG polysilicon grain size, and the tunneling field enhancement factor μ(Rc). A compact model of the fast programming bit memory threshold voltage as a function of the effective FG polysilicon grain area factor Geff and the tunneling field enhancement factor μ(Rc) has been developed for the first time.

An effective office ergonomics program can produce positive results in reducing musculoskeletal injury rates, enhancing productivity, and improving staff well-being and job satisfaction. Its objective is to provide a systematic solution to manage the potential risk of musculoskeletal disorders among computer users in an office setting. The FITS Model Office Ergonomics Program has been developed, drawing on the legislative requirements for promoting the health and safety of workers using computers for extended periods as well as on previous research findings. The Model is developed according to practical industrial knowledge in ergonomics, occupational health and safety management, and human resources management in Hong Kong and overseas. This paper proposes a comprehensive office ergonomics program, the FITS Model, which considers (1) Furniture Evaluation and Selection; (2) Individual Workstation Assessment; (3) Training and Education; and (4) Stretching Exercises and Rest Breaks as elements of an effective program. An experienced ergonomics practitioner should be included in the program design and implementation. Through the FITS Model Office Ergonomics Program, the risk of musculoskeletal disorders among computer users can be eliminated or minimized, and workplace health and safety and employees' wellness enhanced.

Alcohol and other drug (AOD) abuse affects every sector of society, and student-athletes are no exception. Because many factors affecting athletes do not affect other students, athletic departments commonly approach prevention through AOD education. Different educational approaches are described in this article, particularly the Athletic Prevention Programming and Leadership Education (APPLE) model. Project APPLE is designed to enable an athletic department to systematically analyze its AOD p...

This technical report presents the methodologies, processes, and results of comparing three Building Energy Modeling Programs (BEMPs) for load calculations: EnergyPlus, DeST and DOE-2.1E. This joint effort between Lawrence Berkeley National Laboratory, USA, and Tsinghua University, China, was part of research projects under the US-China Clean Energy Research Center on Building Energy Efficiency (CERC-BEE). The Energy Foundation, an industrial partner of CERC-BEE, co-sponsored this work. It is widely known that large discrepancies in simulation results can exist between different BEMPs. The result is a lack of confidence in building simulation amongst many users and stakeholders. In the fields of building energy code development and energy labeling programs, where building simulation plays a key role, there are also confusing and misleading claims that some BEMPs are better than others. In order to address these problems, it is essential to identify and understand the differences between widely-used BEMPs, and the impact of these differences on load simulation results, by detailed comparisons of these BEMPs from source code to results. The primary goal of this work was to research methods and processes that would allow a thorough scientific comparison of the BEMPs. The secondary goal was to provide a list of strengths and weaknesses for each BEMP, based on an in-depth understanding of their modeling capabilities, mathematical algorithms, advantages and limitations. This is to guide the use of BEMPs in the design and retrofit of buildings, especially to support China's building energy standard development and energy labeling program. The research findings could also serve as a good reference to improve the modeling capabilities and applications of the three BEMPs. The methodologies, processes, and analyses employed in the comparison work could also be used to compare other programs. The load calculation method of each program was analyzed and compared to…

In this article, the feasibility of a unified modelling and programming paradigm is discussed from the perspective of large-scale system development and verification in collaborative development environments. We motivate the necessity to utilise multiple formalisms for development and verification, in particular for complex cyber-physical systems or systems of systems. Though modelling, programming, and verification will certainly become more closely integrated in the future, we do not expect a single formalism to become universally applicable and accepted by the development and verification communities. It is illustrated by means of a case study from the railway domain how this can be achieved, using concepts from the theory of institutions. This also enables the utilisation of verification tools in different formalisms, despite the fact that these tools are usually developed for one specific formal method.

This report summarizes our investigations into multi-core processors and programming models for parallel scientific applications. The motivation for this study was to better understand the landscape of multi-core hardware, future trends, and the implications on system software for capability supercomputers. The results of this study are being used as input into the design of a new open-source light-weight kernel operating system being targeted at future capability supercomputers made up of multi-core processors. A goal of this effort is to create an agile system that is able to adapt to and efficiently support whatever multi-core hardware and programming models gain acceptance by the community.

The purpose of the study was to describe the science learning program in junior high schools in Bone Bolango district based on the Regulation of the Minister of Education and Culture of the Republic of Indonesia, Number 65 of 2013, about the Processing Standard of Primary and Secondary Education. The study used Stake's Countenance evaluation model. The data were collected using observation, interview and documentation techniques. The conclusions were: (1) the planning of science learning was categorized as fair (68%); the lesson plans were found not to be in accordance with the learning processing standard; (2) the implementation of science learning was categorized as fair (57%), which did not conform with the learning processing implementation standard; (3) student learning outcomes had not met the minimum completeness criteria (KKM) and were categorized as enough (65%); and (4) there was contingency between the planning, the learning process and the outcomes. Keywords: Program Evaluation, Stake's Countenance, Science Learning

This study develops a hybrid multiple criteria decision making (MCDM) model to select program projects for nonprofit TV stations on the basis of managers' perceptions. Using the concepts of the balanced scorecard (BSC) and corporate social responsibility (CSR), we collect criteria for selecting the best program project. The fuzzy Delphi method, which can lead to better criteria selection, is used to modify the criteria. Next, considering the interdependence among the selection criteria, the analytic network process (ANP) is used to obtain their weights. To avoid the calculation and additional pairwise comparisons of ANP, the technique for order preference by similarity to ideal solution (TOPSIS) is used to rank the alternatives. A case study is presented to demonstrate the applicability of the proposed model.
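
The TOPSIS ranking step is compact enough to sketch directly. The decision matrix and weights below are invented placeholders; in the study the criteria come from BSC/CSR via fuzzy Delphi and the weights from ANP.

    import numpy as np

    def topsis(scores, weights, benefit):
        # scores: alternatives x criteria; benefit[j] True if larger-is-better
        norm = scores / np.linalg.norm(scores, axis=0)   # vector normalisation
        v = norm * weights                               # weighted normalised matrix
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
        worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_pos = np.linalg.norm(v - ideal, axis=1)        # distance to ideal
        d_neg = np.linalg.norm(v - worst, axis=1)        # distance to anti-ideal
        return d_neg / (d_pos + d_neg)                   # relative closeness

    scores = np.array([[7.0, 0.4, 3.0],   # three candidate program projects
                       [8.0, 0.7, 5.0],
                       [6.0, 0.9, 4.0]])
    weights = np.array([0.5, 0.3, 0.2])   # e.g. obtained from ANP
    closeness = topsis(scores, weights, benefit=np.array([True, True, False]))
    print(closeness.argsort()[::-1])      # ranking of alternatives, best first

Ranking by relative closeness is what lets the authors skip the extra pairwise comparisons that a pure ANP ranking of alternatives would require.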

As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide infrastructure for CAF that we rely on, implementing new language and runtime features, producing an open source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

This paper describes how two runtime analysis algorithms, an existing data race detection algorithm and a new deadlock detection algorithm, have been implemented to analyze Java programs. Runtime analysis is based on the idea of executing the program once, and observing the generated run to extract various kinds of information. This information can then be used to predict whether other, different runs may violate some properties of interest, in addition, of course, to demonstrating whether the generated run itself violates such properties. These runtime analyses can be performed stand-alone to generate a set of warnings. It is furthermore demonstrated how these warnings can be used to guide a model checker, thereby reducing the search space. The described techniques have been implemented in the home-grown Java model checker called PathFinder.

…frameworks, we have in recent years pursued an agenda of translating hard-real-time embedded safety-critical programs written in the Safety Critical Java Profile [33] into networks of timed automata [4] and subjecting those to automated analysis using the UPPAAL model checker [10]. Several tools have been built and the tools have been used to analyse a number of systems for properties such as worst case execution time, schedulability and energy optimization [12–14,19,34,36,38]. In this paper we will elaborate on the theoretical underpinning of the translation from Java programs to timed automata models and briefly summarize some of the results based on this translation. Furthermore, we discuss future work, especially relations to the work in [16,24], as Java recently has adopted first-class higher-order functions in the form of lambda abstractions.

Real-time, model-based deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.

…would assist the Monitors in the assignment process. Though these studies contain very thorough analyses, they differ from the approach taken in this thesis in that they do not look into using a low-cost, yet very efficient, decision modeling approach of integer programming as a method of…

In this paper we introduce the concept of the generalized d-graph (admitting cycles) as a special dependency graph for modelling dynamic programming (DP) problems. We describe d-graph versions of three famous single-source shortest path algorithms (the algorithm based on the topological order of the vertices, Dijkstra's algorithm, and the Bellman-Ford algorithm), which can be viewed as general DP strategies for three different classes of optimization problems. The new modelling method also makes it possible to classify DP problems and the corresponding DP strategies in terms of graph theory.
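
As a reference point for the first of those strategies, here is the standard DP formulation of single-source shortest paths on a DAG processed in topological order (a textbook sketch, not the paper's d-graph formalism):

    import math
    from collections import defaultdict

    def dag_shortest_paths(n, edges, source):
        # edges: list of (u, v, w); vertices are assumed numbered so that
        # index order is a topological order, letting one DP pass suffice
        adj = defaultdict(list)
        for u, v, w in edges:
            adj[u].append((v, w))
        dist = [math.inf] * n
        dist[source] = 0.0
        for u in range(n):                  # topological order = index order
            if dist[u] < math.inf:
                for v, w in adj[u]:
                    # Bellman's recurrence: dist(v) = min over predecessors
                    dist[v] = min(dist[v], dist[u] + w)
        return dist

    print(dag_shortest_paths(5, [(0, 1, 2), (0, 2, 4), (1, 2, 1),
                                 (2, 3, 3), (1, 4, 7)], source=0))

Dijkstra's and Bellman-Ford's algorithms replace the topological sweep with, respectively, a greedy order and repeated relaxation sweeps, which is exactly the axis along which the paper classifies DP strategies.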

The influence of nutrition on offspring metabolism has become a hot topic in recent years owing to the growing prevalence of maternal and childhood obesity. Studies in mammals have identified several factors correlating with parental and early offspring dietary influences on progeny health; however, the molecular mechanisms that underlie these factors remain undiscovered. Mammalian metabolic tissues and pathways are heavily conserved in Drosophila melanogaster, making the fly an invaluable genetic model organism for studying metabolism. In this review, we discuss the metabolic similarities between mammals and Drosophila and present evidence supporting its use as an emerging model of metabolic programming.

A fuzzy goal programming (FGP) model for biodiesel production in the Philippines was formulated, with coconut (Cocos nucifera) and jatropha (Jatropha curcas) as sources of biodiesel. The objectives were maximization of feedstock production and overall revenue, and minimization of the energy used in production and of the working capital for farming, subject to biodiesel and non-biodiesel requirements and the availability of land, labor, water and machine time. All of these objectives and constraints were assumed to be fuzzy. The model was tested for different sets of weights. Results for all sets of weights showed the same optimal allocation. Coconut alone can satisfy the biodiesel requirement of 2% per volume.

This book systematically introduces the development of simulation models as well as the implementation and evaluation of simulation experiments with Tecnomatix Plant Simulation. It addresses all users of Plant Simulation who have more complex tasks to handle, while also offering an easy entry into the program. Particular attention has been paid to introducing the simulation flow language SimTalk and its use in various areas of the simulation. The author demonstrates with over 200 examples how to combine the blocks into simulation models and how to use SimTalk for complex control and analysis.

Permissive-Nominal Logic (PNL) extends first-order predicate logic with term-formers that can bind names in their arguments. It takes a semantics in (permissive-)nominal sets. In PNL, the forall-quantifier or lambda-binder are just term-formers satisfying axioms, and their denotation is functions on nominal atoms-abstraction. Then we have higher-order logic (HOL) and its models in ordinary (i.e. Zermelo-Fraenkel) sets; the denotation of forall or lambda is functions on full or partial function spaces. This raises the following question: how are these two models of binding connected? What translation is possible between PNL and HOL, and between nominal sets and functions? We exhibit a translation of PNL into HOL, and from models of PNL to certain models of HOL. It is natural, but also partial: we translate a restricted subsystem of full PNL to HOL. The extra part which does not translate is the symmetry properties of nominal sets with respect to permutations. To use a little nominal jargon: we can translate na...

AXE, An Experimental Environment for Parallel Systems, was designed to model and simulate parallel systems at the process level. It provides an integrated environment for specifying computation models, multiprocessor architectures, data collection, and performance visualization. AXE is being used at NASA-Ames for developing resource management strategies, parallel problem formulation, multiprocessor architectures, and operating system issues related to the High Performance Computing and Communications Program. AXE's simple, structured user interface enables the user to model parallel programs and machines precisely and efficiently. Its quick turnaround time keeps the user interested and productive. AXE models multicomputers. The user may easily modify various architectural parameters including the number of sites, connection topologies, and overhead for operating system activities. Parallel computations in AXE are represented as collections of autonomous computing objects known as players, whose use and behavior are described. Performance data of the multiprocessor model can be observed on a color screen. These include CPU and message routing bottlenecks, and the dynamic status of the software.

The purpose was to propose a balanced program of aerosol backscatter research leading to the development of a global model of aerosol backscatter. Such a model is needed for feasibility studies and systems simulation studies for NASA's prospective satellite-based Doppler lidar wind measurement system. Systems of this kind measure the Doppler shift in the backscatter return from small atmospheric aerosol wind tracers (of order 1 micrometer diameter). The accuracy of the derived local wind estimates and the degree of global wind coverage for such a system are limited by the local availability and by the global scale distribution of natural aerosol particles. The discussions here refer primarily to backscatter model requirements at CO2 wavelengths, which have been selected for most of the Doppler lidar systems studies to date. Model requirements for other potential wavelengths would be similar.

We present an extension of the programming-by-contract (PBC) paradigm to a concurrent and distributed environment. Classical PBC is characterized by absolute conformance of code to its specification, assigning blame in case of failures, and a hierarchical, cooperative decomposition model – none of which extend naturally to a distributed environment with multiple administrative peers. We therefore propose a more nuanced contract model based on quantifiable performance of implementations; assuming responsibility for success; and a fundamentally adversarial model of system integration, where each component provider is optimizing its behavior locally, with respect to potentially conflicting demands. This model gives rise to a game-theoretic formulation of contract-governed process interactions that supports compositional reasoning about contract conformance.

The learning model is a vital element in education. An appropriate learning model can achieve the goals of learning efficiently and effectively. The lecturers of the education and teacher training program of STAIN Samarinda implement various teaching and learning models in their teaching, such as contextual teaching, social interaction, information processing, personal-based learning, behaviorism, cooperative learning, and problem-based learning.

Complex multidimensional nonlinear dynamic systems are controlled with the use of control computers differing in their complexity, from large ones at the level of modeling the dynamics of the system and synthesizing the control algorithm, to microcomputers directly controlling the system. One of the main decisive problems in controlling an industrial robot is the generation of a dynamic on-line model. For this purpose, a separate microcomputer or several computers operating in multiprocessor mode are used. The existing methods of robot dynamics simulation, which are based on the equations of Newton-Euler, Lagrange and others, require a large number of numerical operations and cannot be realized in real time: the time spent on the construction of a model is several times greater than that acceptable for the computer. This is explained by the following factors: the complexity of the mathematical model; the use of a high-level language for programming; the desire to construct an algorithm of a general kind, which leads to high redundancy of numerical operations; and the use of recursion relations, which excludes the possibility of performing parallel operations during realization in a computer. Special characteristics of the construction of a program system which is to a considerable degree free of the above drawbacks are considered below. Instead of directly constructing a model in numerical form on the basis of one of the approaches of classical mechanics, the concept of constructing a model in symbols is proposed here. It is possible to show that in this way one can obtain a compact mathematical model which requires fewer numerical operations (multiplication, addition) for its realization, by two orders of magnitude, in comparison with the methods used earlier. 9 references, 2 tables.

Introduction: In e-learning, people get involved in a process and create the content (product) and make it available for virtual learners. The present study was carried out in order to evaluate the first virtual master program in medical education at Shiraz University of Medical Sciences according to P3 Model. Methods: This is an evaluation research study with post single group design used to determine how effective this program was. All students 60 who participated more than one year in this virtual program and 21 experts including teachers and directors participated in this evaluation project. Based on the P3 e-learning model, an evaluation tool with 5-point Likert rating scale was designed and applied to collect the descriptive data. Results: Students reported storyboard and course design as the most desirable element of learning environment (2.30±0.76), but they declared technical support as the less desirable part (1.17±1.23). Conclusion: Presence of such framework in this regard and using it within the format of appropriate tools for evaluation of e-learning in universities and higher education institutes, which present e-learning curricula in the country, may contribute to implementation of the present and future e-learning curricula efficiently and guarantee its implementation in an appropriate way.

This paper deals with development of a seasonal fraction-removal policy model for waste load allocation in streams addressing uncertainties due to randomness and fuzziness. A stochastic dynamic programming (SDP) model is developed to arrive at the steady-state seasonal fraction-removal policy. A fuzzy decision model (FDM) developed by us in an earlier study is used to compute the system performance measure required in the SDP model. The state of the system in a season is defined by streamflows at the headwaters during the season and the initial DO deficit at some pre-specified checkpoints. The random variation of streamflows is included in the SDP model through seasonal transitional probabilities. The decision vector consists of seasonal fraction-removal levels for the effluent dischargers. Uncertainty due to imprecision (fuzziness) associated with water quality goals is addressed using the concept of fuzzy decision. Responses of pollution control agencies to the resulting end-of-season DO deficit vector and that of dischargers to the fraction-removal levels are treated as fuzzy, and modelled with appropriate membership functions. Application of the model is illustrated with a case study of the Tungabhadra river in India.

The overall objective of this project was to develop an updated model Energy Conservation training program for stationary engineers. This revision to the IUOE National Training Fund’s existing Energy Conservation training curriculum is designed to enable stationary engineers to incorporate essential energy management into routine building operation and maintenance tasks. The curriculum uses a blended learning approach that includes classroom, hands-on, computer simulation and web-based training in addition to a portfolio requirement for a workplace-based learning application. The Energy Conservation training program goal is development of a workforce that can maintain new and existing commercial buildings at optimum energy performance levels. The grant start date was July 6, 2010 and the project continued through September 30, 2012, including a three month non-funded extension.

The objective of this paper is to present a model to design effective Production Improvement Programs (PIP) in order to contribute to the solution of the problematic situations generally faced by Mexican manufacturing micro, small and medium-sized enterprises (M-SME). In this proposal, we imply that facilitating their development is a natural way to improve their performance, especially in terms of productive efficiency. The study picked up empirical evidence from the Processes Reengineering Workshop (PRW), one of the leading services of the National Committee of Productivity and Technological Innovation (NCPTI), which is considered a Mexican success case. We show through a comparative analysis that it is possible to have better programs when they follow a continuous improvement process involving the owner of the firm and workforce participation. Furthermore, we suggest a series of methods for planning, structuring and improvement according to the imitative, tacit and qualitative M-SME specific competence.

Environmental factors, particularly nutrition during pregnancy and early life can influence the risk of chronic diseases in later life. The underlying mechanism, termed "programing", postulates that an environmental stimulus during a critical window of time, early in life, has a permanent effect on subsequent structure and function of the organism. In this study we review the concept of fetal programing on chronic diseases and the proposed hypotheses for the association between early development and later disease, including epigenetic variation. We concentrate on specific aspects of maternal nutrition, particularly under-nutrition and over-nutrition, in humans and animal models. An adequate maternal nutrition during pregnancy is crucial for the health outcome of the offspring at adulthood.

Contemporary astronomy is characterized by increasingly complex instruments and observational techniques, higher data collection rates, and large data archives, placing severe stress on software analysis systems. The object-oriented paradigm represents a significant new approach to software design and implementation that holds great promise for dealing with this increased complexity. The basic concepts of this approach will be characterized in contrast to more traditional procedure-oriented approaches. The fundamental features of object-oriented programming will be discussed from a C++ programming language perspective, using examples familiar to astronomers. This discussion will focus on objects, classes and their relevance to the data type system; the principle of information hiding; and the use of inheritance to implement generalization/specialization relationships. Drawing on the object-oriented approach, features of a new database model to support astronomical data analysis will be presented.

This paper traces the historical development of cartography graduate programs, establishes an evolutionary model, and evaluates the model to determine if it has some utility today for the development of programs capable of producing highly skilled cartographers. Cartography is defined to include traditional cartography, computer cartography,…

Accurate predictions of the thermodynamic state of the cryogenic propellants, pressurization rate, and performance of pressure control techniques in cryogenic tanks are required for development of cryogenic fluid long-duration storage technology and planning for future space exploration missions. This Technical Memorandum (TM) presents the analytical tool, Tank System Integrated Model (TankSIM), which can be used for modeling pressure control and predicting the behavior of cryogenic propellant for long-term storage for future space missions. Utilizing TankSIM, the following processes can be modeled: tank self-pressurization, boiloff, ullage venting, mixing, and condensation on the tank wall. This TM also includes comparisons of TankSIM program predictions with the test data and examples of multiphase mission calculations.

Complex networks theory has commonly been used for modelling and understanding the interactions taking place between the elements composing complex systems. More recently, the use of generative models has gained momentum, as they allow identifying which forces and mechanisms are responsible for the appearance of given structural properties. In spite of this interest, several problems remain open, one of the most important being the design of robust mechanisms for finding the optimal parameters of a generative model, given a set of real networks. In this contribution, we address this problem by means of Probabilistic Constraint Programming. By using as an example the reconstruction of networks representing brain dynamics, we show how this approach is superior to other solutions, in that it allows a better characterisation of the parameters space, while requiring a significantly lower computational cost.

Traditional scheduling problems assume that there are always infinitely many resources for delivering finished jobs to their destinations and that no time is needed for their transportation, so that finished products can reach customers without delay. To coordinate these two different activities in the implementation of a supply chain solution, we studied the problem of synchronizing production and air transportation scheduling using mathematical programming models. The overall problem is decomposed into two sub-problems, an air transportation allocation problem and a single machine scheduling problem, which are considered together. We have taken into consideration different constraints and assumptions in our modeling, such as special flights, delivery tardiness and no delivery tardiness. For these purposes, a variety of models have been proposed to minimize the supply chain total cost, which encompasses transportation, makespan, delivery earliness-tardiness and departure time earliness-tardiness costs.

…the C programming language in its automatic generation of C programs from the input MODEL language specification. Therefore, though the MTE is primarily… program (the MODEL compiler), it is known beforehand that not all of the features of the C programming language will be used, and the MTE has been…

Iron is a trace metal, key for the development of living organisms. Its absorption process is complex and highly regulated at the transcriptional, translational and systemic levels. Recently, the internalization of the DMT1 transporter has been proposed as an additional regulatory mechanism at the intestinal level, associated to the mucosal block phenomenon. The short-term effect of iron exposure in apical uptake and initial absorption rates was studied in Caco-2 cells at different apical iron concentrations, using both an experimental approach and a mathematical modeling framework. This is the first report of short-term studies for this system. A non-linear behavior in the apical uptake dynamics was observed, which does not follow the classic saturation dynamics of traditional biochemical models. We propose a method for developing mathematical models for complex systems, based on a genetic programming algorithm. The algorithm is aimed at obtaining models with a high predictive capacity, and considers an additional parameter fitting stage and an additional Jackknife stage for estimating the generalization error. We developed a model for the iron uptake system with a higher predictive capacity than classic biochemical models. This was observed both with the apical uptake dataset used for generating the model and with an independent initial rates dataset used to test the predictive capacity of the model. The model obtained is a function of time and the initial apical iron concentration, with a linear component that captures the global tendency of the system, and a non-linear component that can be associated to the movement of DMT1 transporters. The model presented in this paper allows the detailed analysis, interpretation of experimental data, and identification of key relevant components for this complex biological process. This general method holds great potential for application to the elucidation of biological mechanisms and their key components in other complex

The Buildings Energy Program at PNL conducts research and development (R&D) for DOE`s Office of Building Technologies (OBT). The OBT`s mission is to lead a national program supporting private and federal sector efforts to improve the energy efficiency of the nation`s buildings and to increase the use of renewable energy sources. Under an arrangement with DOE, Battelle staff also conduct research and development projects for other federal agencies and private clients. This annual report contains an account of the buildings-related research projects conducted at PNL during fiscal year (FY) 1991. A major focus of PNL`s energy projects is to improve the energy efficiency of commercial and residential buildings. Researchers who are developing solutions to energy-use problems view a building as an energy-using system. From this perspective, a desirable solution is not only one that is cost-effective and responsive to the needs of the occupants, but also one that optimizes the interaction among the energy components and systems that compose the whole.

A generalized fish life-cycle population model and computer program have been prepared to evaluate the long-term effect of changes in mortality in age class 0. The general question concerns what happens to a fishery when density-independent sources of mortality are introduced that act on age class 0, particularly entrainment and impingement at power plants. This paper discusses the model formulation and computer program, including sample results. The population model consists of a system of difference equations involving age-dependent fecundity and survival. The fecundity for each age class is assumed to be a function of both the fraction of females sexually mature and the weight of females as they enter each age class. Natural mortality for age classes 1 and older is assumed to be independent of population size. Fishing mortality is assumed to vary with the number and weight of fish available to the fishery. Age class 0 is divided into six life stages. The probability of survival for age class 0 is estimated considering both density-independent mortality (natural and power plant) and density-dependent mortality for each life stage. Two types of density-dependent mortality are included. These are cannibalism of each life stage by older age classes and intra-life-stage competition.
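
A bare-bones rendition of such a system of difference equations might look like the following Python sketch. The life-history parameters are invented, and the sketch omits the paper's six age-0 life stages and both density-dependent mortality mechanisms (cannibalism and intra-life-stage competition), keeping only the density-independent structure:

    import numpy as np

    # Hypothetical life-history parameters for an age-structured stock
    fecundity = np.array([0.0, 0.0, 50.0, 120.0, 200.0])  # eggs per female by age
    survival  = np.array([0.001, 0.5, 0.7, 0.8])          # age-to-age survival
    plant_mortality = 0.2   # extra density-independent loss acting on age class 0

    n = np.array([1e6, 400.0, 150.0, 80.0, 40.0])         # initial abundance by age
    for year in range(50):
        recruits = (fecundity * n).sum() / 2.0            # assume half are female
        survivors = survival * n[:-1]
        survivors[0] *= (1.0 - plant_mortality)           # entrainment/impingement
        n = np.concatenate(([recruits], survivors))       # advance all age classes
    print(n)

Running the loop with and without the plant mortality factor reproduces, in miniature, the paper's central question: how much long-term change in the fishery results from an added density-independent loss in age class 0.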

The objective of our investigation is to establish robust inverse algorithms to convert GRACE gravity and ICESat altimetry mission data into global current and past surface mass variations. To assess the separation of global sources of change and to evaluate spatio-temporal resolution and accuracy statistically from full posterior covariance matrices, a high-performance version of a global simultaneous grid inverse algorithm is essential. One means to accomplish this is to implement a general, well-optimized, parallel global model on massively parallel supercomputers. In our present work, an efficient parallel version of a global inverse program has been implemented on the Origin 2000 using the OpenMP programming model. In this paper, porting a sequential global code to a shared-memory computing system is discussed; several efficient strategies to optimize the code are reported; well-optimized scientific libraries are used; the detailed parallel implementation of the global model is reported; and performance data of the code are analyzed. Scaling performance on a shared-memory system is also discussed. The parallel version of the software gives good speedup and dramatically reduces total data processing time.

A dynamic alcohol consumption model with awareness programs and time delay is formulated and analyzed. The aim of this model is to capture the effects of awareness programs and time delay in controlling alcohol problems. We introduce awareness programs by media into the model as a separate class, with the growth rate of their cumulative density proportional to the number of mortalities induced by heavy drinking. Due to such programs, the susceptible population will isolate themselves and avoid contact with heavy drinkers, or become aware of the risk of heavy drinking and decline to drink. In particular, we incorporate a time delay because the nonconsumer population takes a period of time to become alcohol consumers. We find that the model has two equilibria: one without alcohol problems and one where alcohol problems are endemic in the population. The model analysis shows that though awareness programs cannot eradicate alcohol problems, they are effective measures for controlling them. Further, we conclude that the time delay in the development of the alcohol consumption habit in the susceptible population may result in Hopf bifurcation as the value of the time delay increases. Some numerical simulation results are also given to support our theoretical predictions.

The article analyses the use of logical variables in economic models solved by linear programming. Focus is given to the way logical constraints are obtained and to definition rules based on predicate logic. Emphasis is also put on the possibility of using logical variables to construct a linear objective function on intervals. Such functions are encountered when costs or unit revenues differ on disjoint intervals of the production volumes achieved or sold. Other uses of Boolean variables are connected to constraint systems with conditions, and to the case of a variable that takes values from a finite set of integers.
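
As a small worked example of one such use of logical variables, the sketch below models a unit cost that differs on two disjoint production intervals (a quantity discount) using a binary variable and big-M constraints. The data are invented and the example assumes the PuLP library.

from pulp import LpProblem, LpVariable, LpMinimize, LpBinary, value

M = 10_000                        # big-M: larger than any feasible production level
BREAK = 100                       # the unit cost changes at 100 units

prob = LpProblem("piecewise_cost", LpMinimize)
x  = LpVariable("production", lowBound=0)
x1 = LpVariable("x_low", lowBound=0)            # units priced at the first rate
x2 = LpVariable("x_high", lowBound=0)           # units priced at the discounted rate
y  = LpVariable("above_break", cat=LpBinary)    # 1 iff production exceeds the breakpoint

prob += 8 * x1 + 5 * x2           # objective: 8/unit up to 100 units, 5/unit beyond
prob += x == x1 + x2
prob += x1 <= BREAK
prob += x1 >= BREAK * y           # discount usable only once the first band is full
prob += x2 <= M * y               # without y, the solver would skip the expensive band
prob += x >= 130                  # demand to satisfy

prob.solve()
print(value(x1), value(x2), value(prob.objective))   # 100.0 30.0 950.0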

Technological evolution creates the prerequisites for the emergence of new informational concepts and approaches to a fundamentally new understanding of biological objects. The aim was to study the activators of programmed cell death in an isolated system model. Cell culture aging parameters were measured on a flow cytometer. The working theory was that changes in the concentrations of metal ions, and an increase in their extracellular concentration, form a negative gradient into the cells that affects the regulation of cell death. The influence of the metal ion concentrations was shown.

This article presents a model program for managing problem employees that includes a description of the basic types of problem employees and employee problems, as well as practical recommendations for (1) selection and screening, (2) education and training, (3) coaching and counseling, (4) discipline, (5) psychological fitness-for-duty evaluations, (6) mental health services, (7) termination, and (8) leadership and administrative strategies. Throughout, the emphasis is on balancing the need for order and productivity in the workplace with fairness and concern for employee health and well-being.

Within health science programs there has been a call for more faculty development, particularly for teaching and learning. The primary objectives of this review were to describe the current landscape for faculty development programs for teaching and learning and make recommendations for the implementation of new faculty development programs. A thorough search of the pertinent health science databases was conducted, including the Education Resource Information Center (ERIC), MEDLINE, and EMBASE, and faculty development books and relevant information found were reviewed in order to provide recommendations for best practices. Faculty development for teaching and learning comes in a variety of forms, from individuals charged to initiate activities to committees and centers. Faculty development has been effective in improving faculty perceptions on the value of teaching, increasing motivation and enthusiasm for teaching, increasing knowledge and behaviors, and disseminating skills. Several models exist that can be implemented to support faculty teaching development. Institutions need to make informed decisions about which plan could be most successfully implemented in their college or school. PMID:24954939

The U.S. Geological Survey (USGS) Cooperative Fish and Wildlife Research Units (CRU) program is a unique model of cooperative partnership among the USGS, other U.S. Department of the Interior and Federal agencies, universities, State fish and wildlife agencies, and the Wildlife Management Institute. These partnerships are maintained as one of the USGS’s strongest links to Federal and State land and natural resource management agencies. Established in 1935 to meet the need for trained professionals in the growing field of wildlife management, the program currently consists of 40 Cooperative Fish and Wildlife Research Units located on university campuses in 38 States and supports 119 research scientist positions when fully funded. The threefold mission of the CRU program is to (1) conduct scientific research for the management of fish, wildlife, and other natural resources; (2) provide technical assistance to natural resource managers in the application of scientific information to natural resource policy and management; and (3) train future natural resource professionals.

Faculty development is an imperative if institutions are to develop professional and competent teachers, educators, researchers and leaders. Planning of faculty development currently focuses on meeting the perceived needs of staff and their interests. We would like to propose the Compass Model as a conceptual framework to plan faculty development, which was inspired by the interplay between intrinsic and extrinsic forces for learning, as outlined in the Self-Determination Theory (SDT). In planning faculty development, the Compass Model acknowledges four agendas (directions) from various stakeholders: Strategies (N), Competencies (E), Resources (S) and Wish lists (W). The model then describes four avenues for faculty development offerings (quadrants): Foundation (NE), Innovation (SE), Response (SW) and Motivation (NW), i.e. outputs and activities. The model was compared theoretically with another approach to faculty development planning. It was then piloted as a quality measure for a current program to check for omissions or missed opportunities. We plan to use it in a multi-center study to compare approaches to faculty development planning in different contexts. We hope our model assists faculty developers to consider all stakeholders’ agendas when planning faculty development, beyond the current standard customer-based approach.

TRAC (Transient Reactor Analysis Code) is a computer code for best-estimate analysis for the thermal hydraulic conditions in a reactor system. The development and assessment of the BWR component models developed under the Refill/Reflood Program that are necessary to structure a BWR-version of TRAC are described in this report. These component models are the jet pump, steam separator, steam dryer, two-phase level tracking model, and upper-plenum mixing model. These models have been implemented into TRAC-B02. Also a single-channel option has been developed for individual fuel-channel analysis following a system-response calculation.

In this work, MagnetoEncephaloGram (MEG) recordings of epileptic patients are modeled using a genetic programming approach. This is the first time that genetic programming has been used to model MEG signals. Numerous experiments were conducted, giving highly successful results. It is demonstrated that genetic programming can produce very simple nonlinear models that fit the observed MEG data with great accuracy.

A bus timetable gives passengers the information needed to ensure the availability of bus services. A timetable is optimal when the frequency of bus trips adapts to passenger demand: in peak periods the number of trips should be larger than in off-peak periods. If trips are more frequent than the optimal condition, the bus operator incurs high operating costs; conversely, if trips are less frequent, passengers receive poor service quality. In this paper, the bus timetabling problem is solved by an integer programming model with a modified genetic algorithm. The modifications lie in the chromosome design, the initial-population recovery technique, chromosome reconstruction, and chromosome extermination at specific generations. The model gives the optimal solution with an accuracy of 99.1%.
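
A toy sketch of the general idea follows (not the paper's modified algorithm; demand figures, capacities, and costs are invented): a genetic algorithm picks trips per period so that frequency tracks demand at minimum cost.

import random

demand = [120, 480, 200, 150, 520, 180]      # passengers per period
CAP, COST, PENALTY = 60, 10.0, 2.0           # bus capacity, cost/trip, cost/unserved rider

def fitness(freqs):                          # total cost: operating plus unmet demand
    cost = sum(COST * f for f in freqs)
    cost += sum(PENALTY * max(0, d - CAP * f) for d, f in zip(demand, freqs))
    return cost

def mutate(ch):
    i = random.randrange(len(ch))
    ch = ch[:]; ch[i] = max(1, ch[i] + random.choice((-1, 1))); return ch

def crossover(a, b):
    cut = random.randrange(1, len(a)); return a[:cut] + b[cut:]

pop = [[random.randint(1, 12) for _ in demand] for _ in range(40)]
for gen in range(200):
    pop.sort(key=fitness)
    elite = pop[:10]                          # keep the best timetables
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(30)]
print("best trips per period:", min(pop, key=fitness))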

This paper is based on a resource-constrained active network project in which the constraints on local resources and on the time of cooperation resources are considered simultaneously, together with the respective benefits of the manager and the cooperation partners. A cooperation-planning model based on bilevel multi-objective programming is designed according to the due time and total cost, and an extended contract net protocol (CNP) based on permitted ranges for resource and time requests is presented. A larger task set in the scheduling cycle is permitted for requests of cooperation resources and time, while the task manager itself may be permitted to bid for tasks. As a result, the optimization space for cooperation planning is enlarged: not every bidding task is won by an invitee, and the task manager itself takes on some bidding tasks. Finally, a genetic algorithm is given, and the validity and feasibility of the model are demonstrated by a case study.

Based on the definition of component ontology, an effective component classification mechanism and a facet named component relationship are proposed, and an application-domain-oriented, hierarchical component organization model is established. Finally, a hierarchical component semantic network (HCSN) described by the ontology interchange language (OIL) is presented and its function described. Using HCSN in cooperation with other component-retrieval algorithms based on component descriptions, other component information and the assembly or composite modes related to a key component can be found. Based on HCSN, a component directory library is catalogued and a prototype system constructed. The prototype system shows that component library organization based on this model helps guarantee the reliability of component assembly during program mining.

Many viruses produce multiple proteins from a single mRNA sequence by encoding overlapping genes. One mechanism to decode both genes, which reside in alternate reading frames, is -1 programmed ribosomal frameshifting. Although recognized for over 25 years, the molecular and physical mechanism of -1 frameshifting remains poorly understood. We have developed a mathematical model that treats mRNA translation and associated -1 frameshifting as a stochastic process in which the transition probabilities are based on the energetics of local molecular interactions. The model predicts both the location and efficiency of -1 frameshift events in HIV-1. Moreover, we compute -1 frameshift efficiencies upon mutations in the viral mRNA sequence and variations in relative tRNA abundances, predictions that are directly testable in experiment.
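
The following schematic sketch illustrates the flavor of such a stochastic treatment: elongation proceeds codon by codon, and at a slippery site the ribosome shifts to the -1 frame with a probability set by a Boltzmann-style weighting of local energies. The site position and energies are invented placeholders, not the published HIV-1 parameterization.

import math, random

def shift_probability(dG_slip, dG_pause):
    # toy Boltzmann weighting: slippage competes with normal translocation
    w_shift = math.exp(-dG_slip)
    w_normal = math.exp(-dG_pause)
    return w_shift / (w_shift + w_normal)

SLIPPERY_CODON = 50                 # position of a slippery site in a toy mRNA
P_SHIFT = shift_probability(dG_slip=2.0, dG_pause=0.2)

def translate_once(length=100):
    frame = 0
    for codon in range(length):
        if codon == SLIPPERY_CODON and random.random() < P_SHIFT:
            frame = -1              # ribosome slips into the -1 reading frame
    return frame

trials = 10_000
eff = sum(translate_once() == -1 for _ in range(trials)) / trials
print(f"simulated -1 frameshift efficiency: {eff:.1%}")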

Current research suggests that many students do not know how to program very well at the conclusion of their introductory programming course. We believe that a reason novices have such difficulties learning programming is because engineering novices often learn through a lecture format where someone with programming knowledge lectures to novices, the novices attempt to absorb the content, and then reproduce it during exams. By primarily appealing to programming novices who prefer to understand visually, we research whether programming novices understand programming better if computer science concepts are presented using a visual programming language than if these programs are presented using a text-based programming language. This method builds upon previous research that suggests that most engineering students are visual learners, and we propose that using a flow-based visual programming language will address some of the most important and difficult topics to novices of programming. We use an existing flow-model tool, RAPTOR, to test this method, and share the program understanding results using this theory.

In High Performance Computing, energy consumption is becoming an important aspect to consider. Because of the high cost of energy production in all countries, there is a strong incentive to find ways to save energy, reflected in efforts to reduce the energy requirements of hardware components and applications. Several options have appeared for scaling down energy use and, consequently, scaling up energy efficiency. One of these strategies is the multithreaded programming paradigm, whose purpose is to produce parallel programs able to use the full set of computing resources available in a microprocessor. That energy-saving strategy focuses on the efficient use of the multicore processors found in various computing devices, from High Performance Computing servers to mobile devices, where they have been a growing trend since 2003. However, it is not clear how multiprogramming affects energy efficiency. This paper presents an analysis of different types of multicore-based architectures used in computing and then proposes a model, based on Amdahl's Law, that considers different scenarios of energy use in multicore architectures. Interesting results were found in experiments with the developed algorithm, which was executed in both parallel and sequential versions: a lower limit of energy consumption was found for one type of multicore architecture and observed experimentally.
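
A small sketch of the kind of model described follows, combining Amdahl's-law speedup with a simple per-core power model to estimate the energy of a run on n cores; the power figures and parallel fraction are illustrative assumptions.

def speedup(p, n):
    """Amdahl's law: p = parallel fraction, n = cores."""
    return 1.0 / ((1.0 - p) + p / n)

def energy(p, n, t1=100.0, p_core=10.0, p_base=30.0):
    """Energy = (static power + n active cores) * parallel runtime."""
    t = t1 / speedup(p, n)
    return (p_base + n * p_core) * t

for n in (1, 2, 4, 8, 16):
    print(f"{n:2d} cores: speedup {speedup(0.9, n):5.2f}, "
          f"energy {energy(0.9, n):8.1f} J")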

The World Wide Web is a decentralized system consisting of a repository of information in the form of web pages, which act as sources of information or data in the present analytics world. Web crawlers extract useful information from web pages for different purposes. First, they are used in web search engines, where pages are indexed to form a corpus of information that users can query. Second, they are used for web archiving, where pages are stored for later analysis. Third, they can be used for web mining, where pages are monitored for copyright purposes. The amount of information processed by a web crawler needs to be increased by using the capabilities of modern parallel processing technologies. To address the parallelism and throughput of crawling, this work proposes optimizing Crawler4j, a web crawler that retrieves useful information about the pages it visits, using the Hadoop MapReduce programming model to parallelize the processing of large input data. Crawler4j coupled with the data and computational parallelism of the Hadoop MapReduce programming model improves the throughput and accuracy of web crawling. The experimental results demonstrate that the proposed solution achieves significant improvements in performance and throughput, carving out a new methodology for optimizing web crawling.
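
For readers unfamiliar with the paradigm, the sketch below is a pure-Python illustration of the MapReduce programming model applied to crawled pages (a word count); it is not Crawler4j or Hadoop code, only the shape of the computation being parallelized.

from collections import defaultdict
from itertools import chain

pages = {                                 # stand-in for fetched page text
    "http://a.example": "model program model",
    "http://b.example": "program crawl program",
}

def map_phase(url, text):                 # map: emit (word, 1) pairs per page
    return [(word, 1) for word in text.split()]

def shuffle(pairs):                       # group emitted values by key
    grouped = defaultdict(list)
    for key, val in pairs:
        grouped[key].append(val)
    return grouped

def reduce_phase(key, vals):              # reduce: sum counts per word
    return key, sum(vals)

pairs = chain.from_iterable(map_phase(u, t) for u, t in pages.items())
result = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(result)    # {'model': 2, 'program': 3, 'crawl': 1}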

This paper aims to develop a replacement model for the city bus fleet operated in Bandung City. The study is driven by real cases encountered by the Damri Company in its efforts to improve services to the public. The replacement model propounds two policy alternatives: first, to keep a vehicle in service; second, to replace it with a new one, taking into account operating costs, revenue, salvage value, and the acquisition cost of a new vehicle. A deterministic dynamic programming approach is used to solve the model, and the optimization process was executed heuristically using empirical data from Perum Damri. The output of the model is the replacement schedule and the best policy once a vehicle has passed its economic life. Based on the results, the technical life of a bus is approximately 20 years, while the economic life averages nine years; after a bus has been operated for nine years, managers should consider rejuvenating the fleet.
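
A minimal sketch of such a deterministic dynamic program follows, with invented cost data: each year the operator either keeps the bus or replaces it, trading rising operating costs against acquisition cost net of salvage.

from functools import lru_cache

HORIZON, MAX_AGE = 12, 20     # planning years, technical life in years
ACQ = 900                     # acquisition cost of a new bus

def net(age):                 # yearly revenue minus operating cost at a given age
    return 300 - 15 * age

def salvage(age):             # resale value declines with age
    return max(ACQ - 80 * age, 0)

@lru_cache(maxsize=None)
def best(year, age):
    """Maximum profit from `year` onward, currently holding a bus of `age`."""
    if year == HORIZON:
        return salvage(age), ()
    keep = float("-inf"); plan_k = ()
    if age < MAX_AGE:
        keep, plan_k = best(year + 1, age + 1)
        keep += net(age)
    rep, plan_r = best(year + 1, 1)
    replace = salvage(age) - ACQ + net(0) + rep
    if keep >= replace:
        return keep, ("keep",) + plan_k
    return replace, ("replace",) + plan_r

profit, plan = best(0, 5)     # start the horizon with a five-year-old bus
print(profit, plan)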

Outcomes-based program evaluation is a systematic approach to identifying outcome indicators and measuring results against those indicators. One dimension of program evaluation is assessing the level of learner acquisition to determine if learning objectives were achieved as intended. The purpose of the proposed model is to use Bloom's Taxonomy to…

This paper presents Signal-Meta, the metamodel designed for the synchronous data-flow language Signal. It relies on the Generic Modeling Environment (GME), a configurable object-oriented toolkit that supports the creation of domain-specific modeling and program synthesis environments. The graphical description constitutes the base to build multi-clock environments, and a good front-end for the Polychrony platform. To complete this frontend, we develop a tool that trans...

This document supplements the Manufactured Housing Acquisition Program (MAP) impact evaluation report, Lee et al. (1995). MAP is a voluntary energy-efficiency program for HUD-code manufactured homes conducted in the Pacific Northwest beginning in April 1992. Pacific Northwest Laboratory (PNL) prepared this and the impact evaluation reports for the Bonneville Power Administration (Bonneville). Lee et al. (1995) presents the objectives, methodology, and findings of the program evaluation. This report presents more details about specific aspects of the analysis. The authors used a three-tier approach to analyze the energy consumption of MAP and baseline homes. Chapter 2 discusses Tier 1, the billing data and simplified regression analysis. Chapter 3 presents the details of the Tier 2 analysis, the PRInceton Scorekeeping Method (PRISM). Chapter 4 presents details of the primary analysis technique that they used, a comprehensive regression analysis. Chapters 5 and 6 review two other studies of energy savings associated with MAP. Chapter 5 discusses the simulation model analysis conducted by Ecotope, Inc. Chapter 6 reviews the analysis by Regional Economic Research conducted for three Pacific Northwest investor-owned utilities. The final chapter, Chapter 7, presents details of the Bonneville levelized cost methodology used to estimate the cost of energy savings associated with MAP. Results are presented and discussed in many cases for the three different climate zones found in the Pacific Northwest. 18 refs., 29 tabs.

This paper deals with the problem of assigning students to elective courses according to their preferences. This process often places numerous obstacles before academic institutions, the most typical being the limited number of students who can be assigned to any particular class. Furthermore, due to financial or technical reasons, the maximum number of elective courses is determined in advance, meaning that the institution decides which courses to conduct. Therefore, the expectation that all students will be assigned to their first choice of courses (perfect satisfaction) is not realistic. This paper presents an integer programming model that maximizes total student satisfaction subject to a number of different constraints. The measure of student satisfaction is based on a student's order of preference, on the principle that the better a choice is met, the higher the satisfaction. Following the basic model, several versions of the model are generated to cover possible real-life situations, taking into consideration the manner in which student satisfaction is measured as well as the preferences of the academic institution within set technical and financial constraints. The main contribution of the paper is the concept of a minimal student satisfaction level, which reduces the number of students dissatisfied with the courses to which they were assigned.
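
A minimal sketch of the basic assignment model follows (invented data; assumes the PuLP library): maximize total satisfaction, where a higher-ranked choice scores more, subject to class capacities and one course per student.

from pulp import LpProblem, LpVariable, LpMaximize, lpSum, LpBinary, value

students = ["s1", "s2", "s3", "s4"]
courses = ["math", "art"]
capacity = {"math": 2, "art": 2}
score = {                       # satisfaction: 2 = first choice, 1 = second
    ("s1", "math"): 2, ("s1", "art"): 1, ("s2", "math"): 2, ("s2", "art"): 1,
    ("s3", "math"): 2, ("s3", "art"): 1, ("s4", "art"): 2, ("s4", "math"): 1,
}

x = {(s, c): LpVariable(f"x_{s}_{c}", cat=LpBinary) for s in students for c in courses}
prob = LpProblem("electives", LpMaximize)
prob += lpSum(score[s, c] * x[s, c] for s in students for c in courses)
for s in students:
    prob += lpSum(x[s, c] for c in courses) == 1          # one course per student
for c in courses:
    prob += lpSum(x[s, c] for s in students) <= capacity[c]

prob.solve()
print({s: c for s in students for c in courses if value(x[s, c]) == 1})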

NASA Space Radiation Program Element scientists have been actively involved in the development of an integrative risk-model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with opportunities for hands-on demonstrations. Brief descriptions of the tools: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation using multi-type blood cell counts; GERMcode for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI for modeling the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counts; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

The ability to assess and optimize the economics of biomass resource utilization for the production of fuels, chemicals and power is essential for the ultimate success of a bioenergy industry. The team of authors, consisting of members from the National Renewable Energy Laboratory (NREL) and the Idaho National Laboratory (INL), has developed simple biorefinery linear programming (LP) models to enable the optimization of theoretical or existing biorefineries. The goal of this analysis is to demonstrate how such models can benefit the developing biorefining industry. It focuses on a theoretical multi-pathway, thermochemical biorefinery configuration and demonstrates how a biorefinery can use LP models for operations planning and optimization in ways comparable to the petroleum refining industry. Using LP modeling tools developed under efforts funded by the U.S. Department of Energy's Bioenergy Technologies Office (DOE-BETO), the authors investigate optimization challenges for theoretical biorefineries such as (1) the optimal feedstock slate based on available biomass and prices, (2) breakeven price analysis for available feedstocks, (3) impact analysis for changes in feedstock costs and product prices, (4) optimal biorefinery operations during unit shutdowns/turnarounds, and (5) incentives for increased processing capacity. These biorefinery examples are comparable to the crude oil purchasing and operational optimization studies that petroleum refiners perform routinely using LPs and other optimization models. It is important to note that the analyses presented in this article are strictly theoretical and are not based on current energy market prices. The pricing structure assigned for this demonstrative analysis is consistent with $4 per gallon gasoline, which clearly assumes an economic environment that would favor the construction and operation of biorefineries. The analysis approach and examples provide valuable insights into the usefulness of analysis tools for the developing biorefining industry.
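
As an illustration of challenge (1), the optimal feedstock slate, the sketch below solves a tiny feedstock-blending LP with SciPy. Prices, yields, and availabilities are invented and unrelated to the article's data.

from scipy.optimize import linprog

# feedstocks:        pine residue, corn stover, switchgrass
cost = [55.0, 70.0, 65.0]            # $/dry ton delivered
gal_per_ton = [80.0, 70.0, 75.0]     # fuel yield per dry ton
available = [400.0, 600.0, 500.0]    # dry tons on offer
TARGET = 60_000.0                    # gallons required

res = linprog(
    c=cost,
    A_ub=[[-y for y in gal_per_ton]],   # -yield*x <= -TARGET, i.e. meet the target
    b_ub=[-TARGET],
    bounds=list(zip([0] * 3, available)),
)
print(res.x, res.fun)                # optimal tons of each feedstock and total cost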

This paper describes a model for computer programming outreach workshops aimed at second-level students (ages 15-16). Participants engage in a series of programming activities based on the Scratch visual programming language, and a very strong group-based pedagogy is followed. Participants are not required to have any prior programming experience.…

A suite of computer programs was developed by U.S. Geological Survey personnel for forward and inverse modeling of acoustic and electromagnetic data. This report describes the computer resources that are needed to execute the programs, the installation of the programs, the program designs, some tests of their accuracy, and some suggested improvements.

Current performance-prediction analytical models try to characterize the performance behavior of actual machines through a small set of parameters. In practice, substantial deviations are observed; these differences are due to factors such as memory hierarchies or network latency. A natural approach is to associate a different proportionality constant with each basic block and, analogously, to associate different latencies and bandwidths with each "communication block". Unfortunately, this approach implies that the parameters must be evaluated for each algorithm. This is a heavy task, implying experiment design, timing, statistics, pattern recognition and multi-parameter fitting algorithms, so software support is required. We present a compiler that takes as source a C program annotated with complexity formulas and produces as output an instrumented code. The trace files obtained from the execution of the resulting code are analyzed with an interactive interpreter, giving us, among other information, the values of those parameters.
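
A toy version of the parameter-fitting step follows (not the authors' interpreter): given measured run times and an annotated complexity formula t(n) ~ a*n + b*n*log2(n), the per-block constants a and b can be recovered by linear least squares. The timing data are synthetic.

import numpy as np

n = np.array([1e3, 2e3, 4e3, 8e3, 16e3])
t_measured = np.array([0.012, 0.026, 0.058, 0.13, 0.29])   # seconds (synthetic)

# design matrix: one column per basic-block term in the annotated formula
X = np.column_stack([n, n * np.log2(n)])
(a, b), *_ = np.linalg.lstsq(X, t_measured, rcond=None)
print(f"a = {a:.3e}, b = {b:.3e}")
print("predicted t(32000):", a * 32e3 + b * 32e3 * np.log2(32e3))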

Charged particle acceleration in imploding plasmas is an important phenomenon which occurs in various natural and laboratory plasmas. A new research project at the Naval Research Laboratory (NRL) has been started to investigate this phenomenon both experimentally-in a dense plasma focus (DPF) device-and theoretically using analytical and computational modeling. The DPF will be driven by the high-inductance (607 nH) Hawk pulsed-power generator, with a rise time of 1.2 μs and a peak current of 665 kA. In this poster we present an overview of the research project, and some preliminary results from fluid simulations of the m = 0 instability in an idealized DPF pinch. This work was supported by the Naval Research Laboratory Base Program.

The complexity of road operation, caused by road conditions that vary continuously from station to station and by the variety of design parameters of the (existing) road, of vehicle types with their technical and economic parameters, and of climatic and weather conditions, requires the development of a complex set of simulation programs. This paper describes a set of programs that form the core of the subsystem "driver-vehicle-road-environment". The developed modules of the WAY and COLUMN type contribute to the optimization of design solutions not by using averaged transport and road performance indicators but through a detailed process model of the functioning of the road. The WAY module provides continuous sequential modeling of the perception of road elements by the mechanical subsystem "road-car" (by continuous formation and solution of the equations of motion) and of the characteristics of this mode. The WAY module (with the PARK module) resolves the technical contradiction between the 20-year road design horizon and the existing practice of justifying design decisions with the technical parameters of today's cars. The random nature of traffic required the inclusion of the STREAM module in the computer-aided design of roads. The STREAM module makes it possible to obtain simulation results for a random process sufficient to optimize design decisions both in general and in areas of local variation of the plan, longitudinal section, roadside situation, etc. The variety of road conditions can be classified by the specifics of the formation of flow regimes, building on the results of studying the movement of cars in the stream.

The marketing mix model was applied with a focus on Web media to re-strategize a Web-based Master's program at a southern state university in the U.S. The program's existing marketing strategy was examined using the four components of the model: product, price, place, and promotion, in hopes of repackaging the program (product) to prospective students…

Functioning program infrastructure is necessary for achieving public health outcomes. It is what supports program capacity, implementation, and sustainability. The public health program infrastructure model presented in this article is grounded in data from a broader evaluation of 18 state tobacco control programs and previous work. The newly developed Component Model of Infrastructure (CMI) addresses the limitations of a previous model and contains 5 core components (multilevel leadership, managed resources, engaged data, responsive plans and planning, networked partnerships) and 3 supporting components (strategic understanding, operations, contextual influences). The CMI is a practical, implementation-focused model applicable across public health programs, enabling linkages to capacity, sustainability, and outcome measurement.

The transition to fatherhood, with its numerous challenges, has been well documented. Likewise, fathers' relationships with health and social services have also begun to be explored. Yet despite the problems fathers experience in interactions with healthcare services, few programs have been developed for them. To explain this, some authors point to the difficulty practitioners encounter in developing and structuring the theory of programs they are trying to create to promote and support father involvement (Savaya, R., & Waysman, M. (2005). Administration in Social Work, 29(2), 85), even when such theory is key to a program's effectiveness (Chen, H.-T. (2005). Practical program evaluation. Thousand Oaks, CA: Sage Publications). The objective of the present paper is to present a tool, the logic model, to bridge this gap and to equip practitioners for structuring program theory. This paper addresses two questions: (1) What would be a useful instrument for structuring the development of program theory in interventions for fathers? (2) How would the concepts of a father involvement program best be organized? The case of the Father Friendly Initiative within Families (FFIF) program is used to present and illustrate six simple steps for developing a logic model that are based on program theory and demonstrate its relevance.

Molecular modeling using structure elucidation programs in conjunction with molecular simulation programs has been performed on asphaltene molecules, the heaviest fraction of crude oil, in order to obtain a chemical model allowing the tentative study of their physicochemical properties. Boscan asphaltenes (Venezuela) derived from a marine source rock have been analysed. The different steps of this molecular modeling are described. First, a 3-D chemical representation of Boscan asphaltene is defined from an analytical data set. Second, the results of molecular dynamic simulations indicate that only a few stable conformations are possible due to the high reticulation of the model of the asphaltene unit obtained. 42 refs., 6 figs., 9 tabs.

The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. The various input parameters required in the analysis are compiled in data systems. The data are organized and prepared by various input subroutines for utilization by the hydraulic and transport codes. The hydrologic models simulate the groundwater flow systems and provide water flow directions, rates, and velocities as inputs to the transport models. Outputs from the transport models are basically graphs of radionuclide concentration in the groundwater plotted against time. After dilution in the receiving surface-water body (e.g., lake, river, bay), these data are the input source terms for the dose models, if dose assessments are required. The dose models calculate radiation dose to individuals and populations. The CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) Data System is a storage and retrieval system for model input and output data, including graphical interpretation and display. This is the second of four volumes of the description of the CIRMIS Data System.

A logic model was developed based on an analysis of the 2012 American School Counselor Association (ASCA) National Model in order to provide direction for program evaluation initiatives. The logic model identified three outcomes (increased student achievement/gap reduction, increased school counseling program resources, and systemic change and…

VERDI is a flexible, modular, Java-based program used for visualizing multivariate gridded meteorology, emissions and air quality modeling data created by environmental modeling systems such as the CMAQ model and WRF.

To overcome inefficiency in traditional logic programming, a declarative programming language COPS is designed based on the notion of concurrent constraint programming (CCP). The improvement is achieved by the adoption of constraint-based heuristic strategy and the introduction of deterministic components in the framework of CCP. Syntax specification and an operational semantic description are presented.

This document contains the appendices to Technology Transfer Recommendations for the US Department of Energy's Storage Program (PNL-6484, Vol. 1). These appendices are a list of projects, publications, and presentations connected with the Energy Storage (STOR) program. In Volume 1, the technology transfer activities of the STOR program are examined and mechanisms for increasing the effectiveness of those activities are recommended.

This thesis addresses aspects of support for programming models in Network-on-Chip-based many-core architectures. The main focus is to consider architectural support for a plethora of programming models in a single system. The thesis has three main parts. The first part considers parallelization and scalability in an image processing application with the aim of providing insight into parallel programming issues. The second part proposes and presents the tile-based Clupea many-core architecture, which has the objective of providing configurable support for programming models to allow different programming...

The Logic Model Workshop Toolkit is designed to help practitioners learn the purpose of logic models, the different elements of a logic model, and the appropriate steps for developing and using a logic model for program evaluation. Topics covered in the sessions include an overview of logic models, the elements of a logic model, an introduction to…

The ATLAS B0 model coil has been extensively tested, reproducing the operational conditions of the final ATLAS Barrel Toroid coils. Two test campaigns have taken place on B0, at the CERN facility where the individual BT coils are about to be tested. The first campaign aimed to test the cool-down, warm-up phases and to commission the coil up to its nominal current of 20.5 kA, reproducing Lorentz forces similar to the ones on the BT coil. The second campaign aimed to evaluate the margins above the nominal conditions. The B0 was tested up to 24 kA and specific tests were performed to assess: the coil temperature margin with respect to the design value, the performance of the double pancake internal joints, static and dynamic heat loads, behavior of the coil under quench conditions. The paper reviews the overall test program with emphasis on second campaign results not covered before. 10 Refs.

Suppose a small compact object (black hole or neutron star) of mass $m$ orbits a large black hole of mass $M \gg m$. This system emits gravitational waves (GWs) that have a radiation-reaction effect on the particle's motion. EMRIs (extreme-mass-ratio inspirals) of this type will be important GW sources for LISA; LISA's data analysis will require highly accurate EMRI GW templates. In this article I outline the "Capra" research program to try to model EMRIs and calculate their GWs ab initio, assuming only that $m \ll M$ and that the Einstein equations hold. Here we treat the EMRI spacetime as a perturbation of the large black hole's "background" (Schwarzschild or Kerr) spacetime and use the methods of black-hole perturbation theory, expanding in the small parameter $m/M$. The small body's motion can be described either as the result of a radiation-reaction "self-force" acting in the background spacetime or as geodesic motion in a perturbed spacetime. Several different lines of reasoning lead to the (s...

Industrial logistics is a fundamental pillar for the survival of companies in today's increasingly competitive market. It is not exclusively about controlling the flow of external material between suppliers and the company, but also about studying in detail how to plan, control, handle, and package those materials. Logistics activities must ensure maximum efficiency in the use of corporate resources, since they do not add value to the final product. A logistics plan for each part of the company's production has to adapt to the demand parameters, seasonal or not, over the timeline. Thus, the definition of packaging (for transportation and consumption) must be adjusted in accordance with demand, so that logistics planning can work constantly with economic order batches. The packaging calculation for each part under every demand can become quite complicated because of the large number of parts in the production process. Automating the calculation process for choosing the right package for each part is an effective method in logistics planning. This article presents a simple and practical mathematical model for automating the packaging calculation, together with a logic program, written in Visual Basic within Excel, that generates graphics showing how the packages are filled.

The objective of this Cooperative Research and Development Agreement (CRADA) was to develop and commercialize a technology conceived by scientists at Pacific Northwest National Laboratory (PNNL) and manufactured by Beckman Instruments, Inc. (Beckman), and to apply this technology to the characterization of sediments and soils. The technology is the Unsaturated Flow Apparatus (UFA). The UFA provides a highly efficient method of direct, rapid measurement of hydraulic conductivity and other flow properties according to Darcy-Buckingham principles because the operator controls both the fluid driving force, using an ultracentrifuge, and the flow into the sample while it is spinning, with a rotating seal assembly. The concept of using centrifugation to significantly decrease the time needed for studies of subsurface transport, from years or months to days, particularly under unsaturated conditions, was conceived by James Conca, Ph.D., and Judith Wright, Ph.D., in 1986. The prototype UFA was developed in 1988 because there was a need to rapidly and accurately determine transport parameters in soils, sediments, and rocks for the Grout Waste Disposal Program. Transport parameters are critical to modeling outcomes for site-specific solutions to environmental remediation and waste disposal problems.

The Simple instructional systems design (ISD) model is based on fast development, usability testing, and continuous feedback, which are necessary for educational program development in medical school. This study aims to assess the usability of the Simple ISD model for a medical ethics education program by describing the development details of each phase and their evaluation results. The research was conducted in two steps. First, while the researchers participated in program development using the Simple ISD model, empirical data on each development activity were collected. Second, the developed program was evaluated through a web-based usability test by students, a focus group interview with 8 students, and individual interviews with 5 faculty members, across 4 domains: learning content, instructional methods and strategies, achievement evaluation, and self-evaluation. Following the circular process of analysis, design, development, and usability testing of the Simple ISD model, a 10-week medical ethics program covering 9 instructional topics was developed. The average response scores for the developed medical ethics program increased from 3.96 to 4.59 in 2008 and to 4.41 in 2009, respectively. The prospects and limitations of the program are discussed. From this development study of a medical ethics program using the Simple ISD model, we could implement a more usable medical ethics program and found four benefits of the Simple ISD model: rapid development of an educational program, program improvement through continuous feedback, faculty members' engagement in instructional design, and professional development of the faculty members.

The notion of dominance most familiar to agricultural economists is perhaps the decision theoretic concept entailed in comparing one risky prospect to others. But dominance concepts are also relevant in the linear programming context, for example in identifying redundant constraints. In this note, the standard concept of dominance in linear programming is generalized by defining dominance with respect to differing levels of information about the programming problem.
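
For reference, the standard redundancy test that this note generalizes can be written as a small LP exercise: constraint i of Ax <= b is redundant if the maximum of a_i.x over the remaining constraints does not exceed b_i. The sketch below uses SciPy on an invented example.

import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([2.0, 2.0, 5.0])    # x <= 2, y <= 2, x + y <= 5 (the last is redundant)

def is_redundant(i):
    rest = [k for k in range(len(b)) if k != i]
    # maximize a_i.x  ==  minimize -a_i.x over the other constraints
    res = linprog(-A[i], A_ub=A[rest], b_ub=b[rest], bounds=[(0, None)] * 2)
    return res.status == 0 and -res.fun <= b[i] + 1e-9

for i in range(len(b)):
    print(f"constraint {i} redundant: {is_redundant(i)}")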

Recently we developed a mathematical model for microbial growth in food. The model successfully predicted microbial growth under various temperature patterns. In this study, we developed a program to fit data to the model with a spreadsheet program, Microsoft Excel. Users can instantly obtain curves fitted to the model by inputting growth data and choosing the slope portion of a curve. The program can also estimate growth parameters, including the rate constant of growth and the lag period. This program should be a useful tool for analyzing growth data and further predicting microbial growth.
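
The same fitting task can also be done outside a spreadsheet; the sketch below assumes a logistic growth form (not necessarily the authors' exact model) and recovers a rate constant from synthetic count data with SciPy.

import numpy as np
from scipy.optimize import curve_fit

def logistic(t, n0, nmax, mu):
    return nmax / (1.0 + (nmax / n0 - 1.0) * np.exp(-mu * t))

t = np.array([0, 2, 4, 6, 8, 10, 12], dtype=float)          # hours
logN = np.array([3.0, 3.1, 3.6, 4.8, 6.2, 7.1, 7.3])        # log10 CFU/g (synthetic)

popt, _ = curve_fit(logistic, t, logN, p0=[3.0, 7.5, 0.5])
n0, nmax, mu = popt
print(f"rate constant mu = {mu:.2f} 1/h, N0 = {n0:.2f}, Nmax = {nmax:.2f}")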

...) into the Watershed Modeling System (WMS) was initiated as part of an overall goal of the Water Quality Research Program to provide water quality capabilities within the framework of a comprehensive graphical modeling environment...

The use of multiprocessors is an important way to increase the performance of a supercomputing program. This means that the program has to be parallelized to make use of the multiple processors. Parallelization is unfortunately not an easy task, so development tools supporting parallel programs are important. Further, it is the customer that decides the number of processors in the target machine, and as a result the developer has to make sure that the program runs efficiently on any number...

A knowledge representation has been proposed using the state-space theory of Artificial Intelligence for the Dynamic Programming Model, in which a model can be defined as a six-tuple M = (I, G, O, T, D, S). A building-block modeling method uses the modules of the six-tuple to form a rule-based solution model. Moreover, a rule-based system has been designed and set up to solve the Dynamic Programming Model. This knowledge-based representation can easily express symbolic knowledge and the dynamic characteristics of the Dynamic Programming Model, and the inference based on this knowledge in the process of solving the model can also be conveniently realized on a computer.

The need to estimate indoor temperatures, heating or cooling load, and energy requirements for buildings arises at many stages of a building's life cycle, e.g. at the early layout stage, during the design of the building, and in planning for energy retrofitting. Another purpose is to meet the authorities' requirements given in building codes. All these situations require good calculation methods. The main purpose of this report is to present the authors' work on problems related to thermal models and calculation methods for determining temperatures and heating or cooling loads in buildings. Thus the major part of the report deals with the treatment of solar radiation in glazing systems, the shading of solar and sky radiation, and the computer program JULOTTA, used to simulate the thermal behavior of rooms and buildings. Other parts of thermal models of buildings are discussed more briefly and included in order to give an overview of existing problems and available solutions. A brief presentation of how thermal models can be built up is also given, and it is hoped that the report can be useful as an introduction to this part of building physics as well as during the development of calculation methods and computer programs. The report may also serve as a help for users of energy-related programs. Independent of which method or program a user chooses to work with, it is his or her own responsibility to understand the limits of the tool, or else wrong conclusions may be drawn from the results. 52 refs, 22 figs, 4 tabs

This final report discusses the activities and outcomes of the Navajo Assistive Bank of Loanable Equipment (Navajo-ABLE), a federally funded program designed to provide assistive technology (AT) devices, services, technical information, funding information, and training for Navajo children and youth with disabilities. The program was operated and…

Pacific Northwest Laboratory (PNL) provided technical assistance to the Office of Operational Safety (OOS) in developing their Assurance Program for Remedial Action (APRA). The APRA Bibliography Management System (BMS), a microcomputer-operated system designed to file, locate and retrieve project-specific bibliographic data, was developed to manage the documentation associated with APRA. The BMS uses APRABASE, a PNL-developed computer program written in dBASE II language, which is designed to operate using the commercially available dBASE II database software. This document describes the APRABASE computer program, its associated subprograms, and the dBASE II APRA file. A User's Manual is also provided in the document. Although the BMS was designed to manage APRA-associated documents, it could be easily adapted for use in handling bibliographic data associated with any project.

This article proposes a STEM workforce education logic model, tailored to the particular context of the National Science Foundation's Innovative Technology Experiences for Students and Teachers (ITEST) program. This model aims to help program designers and researchers address challenges particular to designing, implementing, and studying education innovations in the ITEST program, considering ongoing needs and challenges in STEM workforce education in the USA. It is grounded in conceptual frameworks developed previously by teams of ITEST constituents, for their part intended to frame STEM career education, consider how people select and prepare for STEM careers, and reinforce the important distinction between STEM content and STEM career learnings. The authors take a first step in what they hope will be an ongoing discussion and research agenda by test-fitting assumptions of the model to exploratory case studies of recent NSF ITEST projects. Brief implications for future research and other considerations are provided.

The relevancy of program curricula in tourism and hospitality education has been called into question by key stakeholders in light of ongoing changes in the multifaceted tourism and hospitality industry. Various program models have been identified. Program content and quality of student preparedness have been debated. Balance and areas of emphasis…

Outdoor programs can offset initial investment costs in services and products by developing integrated program areas. The experience of Outdoors Unlimited, a recently created kayaking program at Brigham Young University (Utah), is provided as a model. The purchase of 11 kayaks for rental was followed by the introduction of retail sales, repair…

Geriatric psychosocial problems are prevalent and significantly affect the physical health and overall well-being of older adults. Geriatrics fellows require psychosocial education, and yet to date, geriatrics fellowship programs have not developed a comprehensive geriatric psychosocial curriculum. Fellowship programs in the New York tristate area…

The digital simulation system that has been developed for impact tests against safety barriers has proved to be a valuable tool; it may reduce the cost of a program or, better, largely increase the extent of a program without increasing the cost. In fact, it may permit a considerable reduction in the number

The number of American students studying abroad increases every year. That might suggest that recruiting students to participate in such an educational opportunity would present little difficulty. On the contrary, as domestic student participation in such programs has risen, so has the number of competing programs. Thus, the viability of any study…

Schools are adopting evidence-based programs designed to enhance students' emotional and behavioral competencies at increasing rates (Hemmeter et al. in "Early Child Res Q" 26:96-109, 2011). At the same time, teachers express the need for increased support surrounding implementation of these evidence-based programs (Carter and Van Norman in "Early…

In this paper, we propose an efficient game-semantics-based approach for verifying open program families, i.e., program families with free (undefined) identifiers. We use a symbolic representation of algorithmic game semantics, where concrete values are replaced with symbolic ones. In this way, we can compactly...

Modern computers often use multi-core architectures, covering clusters of homogeneous cores for high-performance computing to heterogeneous architectures typically found in embedded systems. To efficiently program such architectures, it is important to be able to partition and map programs on...

As the demand for registered nurses continues to rise, so too has the creation of accelerated baccalaureate nursing programs for second-degree students. This article describes an 11-month Accelerated Career Entry (ACE) Nursing Program's innovative curriculum design, which has a heavy emphasis on technology, professional socialization, and the use of a standardized patient experience as a form of summative evaluation. In addition, challenges of this program are presented. Since 2002, the ACE Program has graduated over 500 students with an average first-time NCLEX pass rate of 95-100%. Although the number of graduates from accelerated programs does not solve the severe nursing shortage, the contributions of these intelligent, assertive, pioneering graduates are important for health care.

Social TV is a social media service via TV and social networks through which TV users exchange their experiences about TV programs that they are viewing. For social TV service, two technical aspects are envisioned: grouping of similar TV users to create social TV communities and recommending TV programs based on group and personal interests for personalizing TV. In this paper, we propose a unified topic model based on grouping of similar TV users and recommending TV programs as a social TV service. The proposed unified topic model employs two latent Dirichlet allocation (LDA) models. One is a topic model of TV users, and the other is a topic model of the description words for viewed TV programs. The two LDA models are then integrated via a topic proportion parameter for TV programs, which enforces the grouping of similar TV users and associated description words for watched TV programs at the same time in a unified topic modeling framework. The unified model identifies the semantic relation between TV user groups and TV program description word groups so that more meaningful TV program recommendations can be made. The unified topic model also overcomes an item ramp-up problem such that new TV programs can be reliably recommended to TV users. Furthermore, from the topic model of TV users, TV users with similar tastes can be grouped as topics, which can then be recommended as social TV communities. To verify our proposed method of unified topic-modeling-based TV user grouping and TV program recommendation for social TV services, in our experiments, we used real TV viewing history data and electronic program guide data from a seven-month period collected by a TV poll agency. The experimental results show that the proposed unified topic model yields an average 81.4% precision for 50 topics in TV program recommendation and its performance is an average of 6.5% higher than that of the topic model of TV users only. For TV user prediction with new TV programs, the average
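
A highly simplified sketch of the user-side grouping step follows (toy data; a single LDA over viewing histories rather than the paper's unified two-LDA model), using scikit-learn.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# each "document" is one user's viewing history as a bag of program IDs
histories = [
    "news news weather talkshow",
    "cartoon cartoon anime movie",
    "news weather documentary",
    "anime cartoon movie movie",
]

counts = CountVectorizer().fit_transform(histories)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
groups = lda.transform(counts)       # per-user topic (community) proportions
for user, dist in enumerate(groups):
    print(f"user {user}: group proportions {dist.round(2)}")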

In this paper, it is shown that the stable model semantics, perfect model semantics, and partial stable model semantics of disjunctive logic programs have the same expressive power with respect to polynomial-time model-equivalent reduction. That is, taking perfect model semantics and stable model semantics as an example, any logic program P can be transformed in polynomial time into another logic program P' such that the perfect models (resp. stable models) of P correspond one-to-one to the stable models (resp. perfect models) of P', and the correspondence itself can be computed in polynomial time. However, the minimal model semantics has weaker expressive power than the other semantics mentioned, unless the polynomial hierarchy collapses to NP.

This paper proposes a novel SPMD programming model for OpenACC. Our model integrates the different granularities of parallelism, from vector-level parallelism to node-level parallelism, into a single, unified model based on OpenACC. It allows programmers to write programs for multiple accelerators using a uniform programming model, whether they are in shared or distributed memory systems. We implement a prototype of our model and evaluate its performance on a GPU-based supercomputer using three benchmark applications.

In this article we try to show how new devices and methods can help in the education of programming. At Kecskemét College, programmable mobile robots were used together with constructivist, rather than behavioral, pedagogical methods. Our experiments have supported our hypothesis that improved methodical education using such devices can give more practical programming knowledge, improve attitudes towards programming, and help students develop a positive programming self-image. The results of the experimental and control groups were compared at the beginning and at the end of the semester, when programming knowledge and motives were measured. During the learning process, only the experimental groups used the devices and new methods.

This viewgraph presentation reviews some of the complications in developing metrics for systems integration. Specifically it reviews a case study of how two programs within NASA try to develop and measure performance while meeting the encompassing organizational goals.

Examined effects of a pilot voucher program on the price, supply, and quality of day care. Findings offered no conclusive evidence concerning expected benefits. Discusses vouchers' potential for easing the day care crisis. (RJC)

The objectives of the project were (1) to develop an evaluation model in the form of a how-to-do-it manual which outlines procedures for obtaining immediate information regarding the degree to which a pilot program achieves its stated final objectives, (2) to evaluate this model by using it to evaluate two ongoing pilot programs, and (3) to…

This annex supplements the Puerto Rico Experimental Model Dental Training Program Comprehensive Report (CE 028 213) and is comprised of exhibits A through F. Among the information included in the exhibits is the experimental model schedule, the schematic representation, the content display, and the course outlines for all courses in the program.…

This article presents a method for representing the C/C++ function call in terms of compositional Petri nets. Principles of modeling functions and function calls in a program are described. Formal composition operations to construct a program model from mod ...
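
As an illustration of the general idea (not the article's compositional formalism), a function call can be sketched as a tiny Petri net whose two transitions move tokens between caller and callee places; the place and transition names below are hypothetical.

    # Illustrative sketch: a function call modeled as a small Petri net with
    # call/return transitions; names are hypothetical.
    marking = {"caller_active": 1, "callee_idle": 1,
               "callee_active": 0, "caller_waiting": 0}

    transitions = {
        # call: caller suspends, callee starts
        "call":   ({"caller_active": 1, "callee_idle": 1},
                   {"caller_waiting": 1, "callee_active": 1}),
        # return: callee finishes, caller resumes
        "return": ({"callee_active": 1, "caller_waiting": 1},
                   {"caller_active": 1, "callee_idle": 1}),
    }

    def fire(name):
        pre, post = transitions[name]
        if all(marking[p] >= n for p, n in pre.items()):  # transition enabled?
            for p, n in pre.items():
                marking[p] -= n
            for p, n in post.items():
                marking[p] += n
            return True
        return False

    fire("call"); fire("return")
    print(marking)  # back to the initial marking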

The train-the-trainer model has great potential for expanding information literacy programs without placing undue burden on already overextended librarians; it is surprisingly underused in academic libraries. At the University of Kentucky, we employed this model to create a new information literacy program in an introductory biology lab. We…

In an effort to standardize training delivery and to individualize staff development based on observation and reflective practice, the Air Force implemented the Developmental Training Model (DTM) in its Child Development Programs. The goal of the Developmental Training Model is to enhance high quality programs through improvements in the training…

Assigned 26 kindergarten children to either a sexual abuse prevention program which taught self-protective skills through modeling and active rehearsal (PM) or a program which taught the same skills by having children watch the skills modeled by an experimenter (SM). Results provide support for the greater efficacy of PM relative to SM for learning of…

The Women's Health Program at the University of Michigan was established in 1993 and has developed into a successful, federally supported program that links clinical research and education activities across the University. It has focused on human resource capacity building, sustainable financial support and infrastructure, and adaptability to change and opportunities. Widely accepted standards, demonstrated value, committed leaders/champions, and participatory culture have contributed to its success and are important to its future.

This report describes the development and testing of the Kansas Technology Transfer Model (KTTM), which is to be utilized as a regional model for the development of other technology transfer programs for independent operators throughout oil-producing regions in the US. It describes the linkage of the regional model with a proposed national technology transfer plan, an evaluation technique for improving and assessing the model, and the methodology which makes it adaptable on a regional basis. The report also describes management concepts helpful in managing a technology transfer program. The original Tertiary Oil Recovery Project (TORP) activities, upon which the KTTM is based, were developed and tested for Kansas and have proved to be effective in assisting independent operators in utilizing technology. Through joint activities of TORP and the Kansas Geological Survey (KGS), the KTTM was developed and documented for application in other oil-producing regions. During the course of developing this model, twelve documents describing the implementation of the KTTM were developed as deliverables to DOE. These include: (1) a problem identification (PI) manual describing the format and results of six PI workshops conducted in different areas of Kansas, (2) three technology workshop participant manuals on advanced waterflooding, reservoir description, and personal computer applications, (3) three technology workshop instructor manuals which provide instructor material for all three workshops, (4) documentation of three technologies as demonstration projects, covering reservoir management, permeability modification, and utilization of a liquid-level acoustic measuring device, (5) a bibliography of all literature utilized in the documents, and (6) a document which describes the KTTM.

A room acoustical model capable of modelling point, line and surface sources is presented. Line and surface sources are modelled using a special ray-tracing algorithm detecting the radiation pattern of the surfaces in the room. Point sources are modelled using a hybrid calculation method combining this ray-tracing method with image source modelling. With these three source types, it is possible to model large and complex sound sources in workrooms.

This implementation guide contains information based on experiences that occurred during the development and implementation of the Rhode Island Tech Prep Model. It is intended to assist educators in addressing challenges and obstacles faced by the program early in the planning process. It begins with a rationale for tech prep. Rhode Island…

The Model Forest Program was initiated in response to concerns expressed by Canadians about their environment during a nationwide consultative process carried out in 1990. The Program is designed to promote the creation of local partnerships and to encourage these partnerships to formulate and implement their own working vision of sustainable forest management. This document presents developments to date, the Model Forest Network, and models across the country. Information is also included on the International Model Forest Program and Russia joining the Network. A budget for the year and an organizational chart are included.

This research aims to compile programs for poverty alleviation based on a community empowerment model and to review program selection as an effectiveness evaluation of poverty alleviation programs that have not yet worked properly. The stages of compiling the poverty alleviation program are: mapping the socioeconomic conditions of the poor, basic infrastructure conditions, socio-cultural issues, and potential issues; identifying hopes and predicting economic development opportunities; creating the poverty alleviation program through SWOT analysis; and planning the implementation program with KPD. Based on the results of the SWOT and scoring analysis, the selected programs are training and assistance, the establishment of a savings-and-loan cooperative, clean water for poor households, rural development with the utilization of clean water, household waste management, and the package A, B, and C education programs.

In the management of higher education, demands for change can be met effectively when leadership and management are well developed. In general, internal conflicts in education management occur due to poor leadership. This study is a qualitative study with a phenomenological approach. The subject of the study is the Graduate Program of University Ahmad Dahlan, which manages five master programs. Selecting each person with consideration of his or her competence, knowledge, experience, and personal attributes will affect performance and the operational, organizational, and public leadership roles. Cross-departmental activities and external parties become a focus for leadership in achieving the vision and mission of the organization.

The Program Demand Cost Model for Alaskan Schools (Cost Model) is a tool for use by school districts and their consultants in estimating school construction costs in the planning phase of a project. This document sets out the sixth edition of the demand-cost model, a rewrite of the whole system. The model can be used to establish a complete budget…

Developed a needs-assessment model and validated the model with five groups of stakeholders connected with an undergraduate university nursing program in Canada. Used focus groups, questionnaires, a hermeneutic approach, and the magnitude-estimation scaling model to validate the model. Results show that respondents must define need to clarify the…

The Department of Energy's Atmospheric Science Program (ASP) conducts research pertinent to radiative forcing of climate change by atmospheric aerosols. The program consists of approximately 40 highly interactive peer-reviewed research projects that examine aerosol properties and processes and the evolution of aerosols in the atmosphere. Principal components of the program are instrument development, laboratory experiments, field studies, theoretical investigations, and modeling. The objectives of the Program are to 1) improve the understanding of aerosol processes associated with light scattering and absorption properties and interactions with clouds that affect Earth's radiative balance and to 2) develop model-based representations of these processes that enable the effects of aerosols on Earth's climate system to be properly represented in global-scale numerical climate models. Although only a few of the research projects within ASP are explicitly identified as primarily modeling activities, modeling actually comprises a substantial component of a large fraction of ASP research projects. This document describes the modeling activities within the Program as a whole, the objectives and intended outcomes of these activities, and the linkages among the several modeling components and with global-scale modeling activities conducted under the support of the Department of Energy's Climate Sciences Program and other aerosol and climate research programs.

One of the main obstacles that prevents model checking from being widely used for industrial control systems is the complexity of building formal models out of PLC programs, especially when timing aspects need to be integrated. This paper addresses this obstacle by proposing a methodology to model and verify timing aspects of PLC programs. Two approaches are proposed to allow users to balance the trade-off between the complexity of the model, i.e., its number of states, and the set of specifications that can be verified. A tool supporting the methodology, which produces models for different model checkers directly from PLC programs, has been developed. Verification of timing aspects for real-life PLC programs is presented in this paper using NuSMV.

This paper provides an overview of the Tripler Army Medical Center LEAN Program for the treatment of obesity, hypercholesterolemia, and essential hypertension. The LEAN Program, a multi-disciplinary prevention program, emphasizes healthy Lifestyles, Exercise and Emotions, Attitudes, and Nutrition for active duty service members. The treatment model offers a medically healthy, emotionally safe, and reasonable, low-intensity exercise program to facilitate weight loss. We will discuss the philosophy behind the LEAN Program and the major components. Thereafter, we will briefly discuss the preliminary results.

Along with the development of Demand Response Programs (DRPs), suitable opportunities have been created for the demand side to take part in electricity markets. The results of such programs are improvements in some technical and economic characteristics of the power system. DRPs are divided into two categories: price-based and incentive-based demand response programs. This paper presents the application of power modeling to Real-Time Pricing (RTP) programs, the most prevalent price-based DRPs. The nonlinear behavioral characteristic of elastic loads is considered, which leads to more realistic modeling of demand response to RTP rates. To evaluate the proposed model, the impact of running RTP programs using the proposed power model on the load profile of the peak day of the Iranian power system in 2007 is investigated.

Cardiac rehabilitation programs are comprehensive lifestyle programs aimed at preventing recurrence of a cardiac event. However, current programs globally have significantly low levels of uptake. A home-based model can be a viable alternative to hospital-based programs. We developed and analysed a service and business model for home-based cardiac rehabilitation based on personal mentoring using mobile phones and web services. We analysed the different organizational and economic aspects of setting up and running the home-based program and propose a potential business model for a sustainable and viable service. The model can be extended to the management of other chronic conditions to enable the transition from hospital- and care-centre-based treatments to sustainable home-based care.

We describe new capabilities for modeling MPEC problems within the Pyomo modeling software. These capabilities include new modeling components that represent complementarity conditions, modeling transformations for re-expressing models with complementarity conditions in other forms, and meta-solvers that apply transformations and numeric optimization solvers to optimize MPEC problems. We illustrate the breadth of Pyomo's modeling capabilities for MPEC problems, and we describe how Pyomo's meta-solvers can perform local and global optimization of MPEC problems.
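
A hedged sketch of what these components look like in practice; the toy model, bounds, and objective below are hypothetical rather than taken from the paper, but the Complementarity component, the complements expression, and the mpec.simple_nonlinear transformation are part of Pyomo's MPEC support.

    # Sketch of a small MPEC in Pyomo; the model itself is hypothetical.
    from pyomo.environ import (ConcreteModel, Var, Objective, minimize,
                               TransformationFactory)
    from pyomo.mpec import Complementarity, complements

    m = ConcreteModel()
    m.x = Var(bounds=(-2, 2))
    m.y = Var(bounds=(0, 10))
    m.obj = Objective(expr=(m.x - 1) ** 2 + m.y, sense=minimize)

    # Complementarity condition: 0 <= y  perpendicular to  y - x >= 0
    m.compl = Complementarity(expr=complements(m.y >= 0, m.y - m.x >= 0))

    # Re-express the complementarity condition in NLP form; a meta-solver or
    # an NLP solver such as ipopt could then be applied to the result.
    TransformationFactory("mpec.simple_nonlinear").apply_to(m)
    m.pprint()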

A room acoustical model capable of modeling point sources, line sources, and surface sources is presented. Line and surface sources are modeled using a special ray-tracing algorithm detecting the radiation pattern of the surfaces of the room. Point sources are modeled using a hybrid calculation method combining this ray-tracing method with image source modeling. With these three source types it is possible to model huge and complex sound sources in industrial environments. Compared to a calculation with only point sources, the use of extended sound sources is shown to improve the agreement...

Since the implementation of the ePortfolio Program in 2006, Clemson University has incorporated peer review for the formative feedback process. One of the challenges with this large-scale implementation has been ensuring that all work is reviewed and constructive feedback is provided in a timely manner. In this article, I discuss the strategies…

Purpose: This paper aims to show the process of designing and measuring learning competencies in program development. Design/methodology/approach: The paper includes cross-sectoral comparisons to draw on programmatic and pedagogical strategies, more commonly utilized in vocational education, and transfer the application of these strategies into…

The effect of literacy education is often disappointing, because many participants do not practice their literacy, so that it becomes dull. The government has made efforts to preserve literacy through the Koran Ibu program. This program is an effort to improve women's literacy, implemented after basic and advanced literacy education through journalism activities. The research focus is the improvement of women's literacy ability through Koran Ibu, studied with a case study method. The research subjects are 20 housewives who have completed the basic and advanced literacy education program. The results show that participants significantly developed their literacy skills, as indicated by their ability to contribute to Koran Ibu's rubrics with literary works based on reportage and personal experience. The program succeeded owing to coordination among local participants (Lurah, Camat, PKK at the local level), cooperation with the proper stakeholders (university and local tutors), an appropriate adult-learning approach, a flexible but planned implementation strategy, and routine evaluation.

To address continually decreasing enrollment and rising attrition in post-secondary STEM degree (science, technology, engineering, and mathematics) programs, particularly for women, the present study examines the utility of motivation and emotion variables to account for persistence and achievement in science in male and female students…

The Women in Machining (WIM) program is a Machine Action Project (MAP) initiative that was developed in response to a local skilled metalworking labor shortage, despite a virtual absence of women and people of color from area shops. The project identified post-war stereotypes and other barriers that must be addressed if women are to have an equal…

Describes two adolescent substance abuse treatment programs in New England psychiatric center: Osgood Three, which is no longer in existence, and Tyler Three, which replaced it and is struggling to grow. Considers transition from Osgood Three to Tyler Three, process of change, and learning what can be preserved from past and what must be…

In 1971 the College of the Holy Cross (Minnesota) set up a summer language and cultural program in Cuernavaca, Mexico for their students of Spanish. After intensive grammar in the classroom students are sent out on "survival" situations involving verbal communication with Mexicans in the market place, schools, prisons, etc. (SC)

Over the last five years, the Urban Assembly School for Law and Justice (SLJ), a new, small public high school in Brooklyn, and The Essentials, a professional theater company, have joined forces to offer a low-budget, high-quality, in-house afterschool theater program for SLJ students. Both SLJ and The Essentials were in nascent stages when the…

High failure and drop-out rates from introductory programming courses continue to be of significant concern to computer science disciplines despite extensive research attempting to address the issue. In this study, we include the three entities of the didactic triangle, instructors, students and curriculum, to explore the learning difficulties…

The portfolio creation model developed for the Capital Investment Program Plan Review (CIPPR) is described. The model was developed to produce project portfolios for CIPPR and is one element of the portfolio approach that has been envisioned for CIPPR in order to enable better decisions concerning the...

SCOOP is a concurrent object-oriented programming model based on contracts. The model introduces processors as a new concept, and it generalizes existing object-oriented concepts for the concurrent context. Simplicity is the main objective of SCOOP. The model guarantees the absence of data races in any execution of a SCOOP program. This article is a technical description of SCOOP as defined by Nienaltowski [11] and Meyer [7,9,10].

Suppose a small compact object (black hole or neutron star) of mass m orbits a large black hole of mass M ≫ m. This system emits gravitational waves (GWs) that have a radiation-reaction effect on the particle's motion. EMRIs (extreme-mass-ratio inspirals) of this type will be important GW sources for LISA. To fully analyze these GWs, and to detect weaker sources also present in the LISA data stream, will require highly accurate EMRI GW templates. In this article I outline the "Capra" research program to try to model EMRIs and calculate their GWs ab initio, assuming only that m ≪ M and that the Einstein equations hold. Because m ≪ M the timescale for the particle's orbit to shrink is too long for a practical direct numerical integration of the Einstein equations, and because this orbit may be deep in the large black hole's strong-field region, a post-Newtonian approximation would be inaccurate. Instead, we treat the EMRI spacetime as a perturbation of the large black hole's "background" (Schwarzschild or Kerr) spacetime and use the methods of black-hole perturbation theory, expanding in the small parameter m/M. The particle's motion can be described either as the result of a radiation-reaction "self-force" acting in the background spacetime or as geodesic motion in a perturbed spacetime. Several different lines of reasoning lead to the (same) basic O(m/M) "MiSaTaQuWa" equations of motion for the particle. In particular, the MiSaTaQuWa equations can be derived by modelling the particle as either a point particle or a small Schwarzschild black hole. The latter is conceptually elegant, but the former is technically much simpler and (surprisingly for a nonlinear field theory such as general relativity) still yields correct results. Modelling the small body as a point particle, its own field is singular along the particle worldline, so it is difficult to formulate a meaningful "perturbation" theory or equations of motion there. Detweiler and Whiting found…

Presents a simple cost model to analyze trade-offs involved in considering storage and weeding as alternatives to new construction for academic libraries. References are provided, and the Palmour cost model is presented as an appendix. (RAA)

This paper discusses a Fortran 90 program referred to as BIEMS (Bayesian inequality and equality constrained model selection) that can be used for calculating Bayes factors of multivariate normal linear models with equality and/or inequality constraints between the model parameters versus a model containing no constraints, which is referred to as the unconstrained model. The prior that is used under the unconstrained model is the conjugate expected-constrained posterior prior and the prior un...

After more than 40 years of effort, energy efficiency program administrators and associated contractors still find it challenging to penetrate the home retrofit market, especially at levels commensurate with state and federal goals for energy savings and emissions reductions. Residential retrofit programs further have not coalesced around a reliably successful model. They still vary in design, implementation and performance, and they remain among the more difficult and costly options for acquiring savings in the residential sector. If programs are to contribute fully to meeting resource and policy objectives, administrators need to understand what program elements are key to acquiring residential savings as cost effectively as possible. To that end, the U.S. Department of Energy (DOE) sponsored a comprehensive review and analysis of home energy upgrade programs with proven track records, focusing on those with robustly verified savings and constituting good examples for replication. The study team reviewed evaluations for the period 2010 to 2014 for 134 programs that are funded by customers of investor-owned utilities. All are programs that promote multi-measure retrofits or major system upgrades. We paid particular attention to useful design and implementation features, costs, and savings for nearly 30 programs with rigorous evaluations of performance. This meta-analysis describes program models and implementation strategies for (1) direct install retrofits; (2) heating, ventilating and air-conditioning (HVAC) replacement and early retirement; and (3) comprehensive, whole-home retrofits. We analyze costs and impacts of these program models, in terms of both energy savings and emissions avoided. These program models can be useful guides as states consider expanding their strategies for acquiring energy savings as a resource and for emissions reductions. We also discuss the challenges of using evaluations to create program models that can be confidently applied in

A multi-period, multi-product production-planning model of an operational meat processing plant is presented. The model input is the time-varying customer demand, and the output is the optimum product mix. The model results are interpreted and compared with actual data. Various production strategies are evaluated.
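
For flavor, a toy version of such a model can be written as a linear program; the products, capacities, and demands below are hypothetical, and the real model would add inventory balance and many more constraints.

    # Toy multi-period product-mix LP in the spirit of the model above; all
    # numbers are hypothetical. scipy's linprog minimizes, so profit is negated.
    import numpy as np
    from scipy.optimize import linprog

    profit = np.array([4.0, 3.0, 4.0, 3.0])   # products A, B in periods 1, 2
    # Capacity: A + B <= 100 units per period (shared processing line).
    A_ub = np.array([[1, 1, 0, 0],
                     [0, 0, 1, 1]])
    b_ub = np.array([100.0, 100.0])
    # Demand caps per product per period.
    bounds = [(0, 80), (0, 60), (0, 50), (0, 90)]

    res = linprog(-profit, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    print("optimal mix:", res.x, "profit:", -res.fun)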

In this paper, a variation of traditional Genetic Programming (GP) is used to model the magnetoencephalogram (MEG) of epileptic patients. This variation is Linear Genetic Programming (LGP). LGP is a particular subset of GP wherein the computer programs in the population are represented as a sequence of instructions from an imperative programming language or machine language. The models derived from this method were simplified using genetic algorithms. The proposed method was used to model the MEG signal of epileptic patients using 6 different datasets. Each dataset uses a different number of previous values of the MEG to predict the next value. The models were tested on datasets different from the ones used to produce them, and the results were very promising.
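
A minimal sketch of the LGP representation described above, assuming hypothetical register conventions: an individual is a list of register instructions, and fitness evaluation runs it on previous signal values to predict the next one.

    # Sketch of a linear-GP program representation; register conventions and
    # the example individual are hypothetical.
    import operator

    OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

    def run(program, x1, x2, n_regs=4):
        r = [0.0] * n_regs
        r[1], r[2] = x1, x2            # inputs: two previous MEG samples
        for op, dst, a, b in program:  # instruction = (op, dest, src1, src2)
            r[dst] = OPS[op](r[a], r[b])
        return r[0]                    # r[0] holds the predicted next value

    # A hand-written individual predicting x1 + (x1 - x2), i.e. a linear trend:
    prog = [("-", 3, 1, 2),   # r3 = x1 - x2
            ("+", 0, 1, 3)]   # r0 = x1 + r3
    print(run(prog, 1.2, 1.0))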

Parallel programming is designed for the use of parallel computer systems to solve time-consuming problems that cannot be solved on a sequential computer in a reasonable time. These problems can be divided into two classes: (1) processing large data arrays (including processing images and signals in real time) and (2) simulation of complex physical processes and chemical reactions. For each of these classes, prospective methods are designed for solving problems. For data processing, one of the most promising technologies is the use of artificial neural networks. The particle-in-cell method and cellular automata are very useful for simulation. Problems of the scalability of parallel algorithms and the transfer of existing parallel programs to future parallel computers are very acute now. An important task is to optimize the use of the equipment (including the CPU cache) of parallel computers. Along with parallelizing information processing, it is essential to ensure the processing reliability by the relevant organization ...

An HTTP server is a computer program that serves webpage content to clients. A webpage is a document or resource of information that is suitable for the World Wide Web and can be accessed through a web browser and displayed on a computer screen. This information is usually in HTML format and may provide navigation to other webpages via hypertext links. Webpages may be retrieved from a local computer or from a remote HTTP server. Webpages are requested and served from HTTP servers using the Hypertext Transfer Protocol (HTTP). Webpages may consist of files of static or dynamic text stored within the HTTP server's file system. Client-side scripting can make webpages more responsive to user input once in the client browser. This paper covers the creation of an HTTP server program using the Java language, with basic support for HTML and JavaScript.
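
The paper's server is written in Java; purely as a sketch of the same idea, here is a minimal HTTP server in Python's standard library that returns a small HTML page containing a JavaScript snippet (the port and page content are arbitrary).

    # Minimal HTTP server sketch using Python's standard library; serves one
    # static HTML page with an embedded JavaScript snippet.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    PAGE = b"""<html><body><h1>Hello</h1>
    <script>document.body.append(' from JavaScript');</script>
    </body></html>"""

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(PAGE)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), Handler).serve_forever()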

The software which implements two spring wheat phenology models is described. The main program routines for the Doraiswamy/Thompson crop phenology model and the basic Robertson crop phenology model are DTMAIN and BRMAIN. These routines read meteorological data files and coefficient files, accept the planting date information and other information from the user, and initiate processing. Daily processing for the basic Robertson program consists only of calculation of the basic Robertson increment of crop development. Additional processing in the Doraiswamy/Thompson program includes the calculation of a moisture stress index and correction of the basic increment of development. Output for both consists of listings of the daily results.

The (TLO) Program has become an institution that is relied upon by participant jurisdictions for intelligence and information sharing between federal, state... vetted-out partners throughout the public safety community to assist jurisdictions in addressing many high-risk events and incidents.

Background: Patients receiving cancer treatment mostly start lifestyle changes at the end of the treatment, during the rehabilitation period. Most often, the first step is a dietary change and physical exercises built into the daily routine. Patients who do this in groups led by qualified therapists, and based on professional counseling, can build more effective and more permanent changes into their lives. To develop a complex rehabilitation program which, in the short term, aims to famil...

The Cyclotron Analytic Model Program (CAMP), written in C++ with the use of Visual C++, is described. The program is intended for calculating the mean magnetic field of an isochronous cyclotron with allowance for flutter. The program algorithm was developed on the basis of the paper 'Calculation of Isochronous Fields for Sector-Focused Cyclotrons' by M. M. Gordon (Particle Accelerators, 1983, V. 13). The accuracy of the calculations performed with this program was tested using maps of the isochronous magnetic fields of different cyclotrons with azimuthally varying fields (AVF cyclotrons) in which ion beams were produced. The calculations by CAMP showed that the isochronous mean magnetic field curve for the measured magnetic field, in which the ion beam was produced, corresponded exactly to the curve of the isochronous mean magnetic field calculated with allowance for flutter, for all the AVF cyclotrons considered. As is evident from the calculations, this program can be used for cal...

Generalized computer codes capable of forming the basis for numerical models of fractured rock masses are being used within the NWTS program. Little additional development of these codes is considered justifiable, except in the area of representation of discrete fractures. On the other hand, model preparation requires definition of medium-specific constitutive descriptions and site characteristics and is therefore legitimately conducted by each of the media-oriented projects within the National Waste Terminal Storage program. However, it is essential that a uniform approach to the role of numerical modeling be adopted, including agreement upon the contribution of modeling to the design and licensing process and the need for, and means of, model qualification for particular purposes. This report discusses the role of numerical modeling, reviews the capabilities of several computer codes that are being used to support design or performance assessment, and proposes a framework for future numerical modeling activities within the NWTS program.

The article considers a mathematical model of a beam adapted for use in general-purpose software systems for analyzing dynamic characteristics. The elastic properties of the beam in tension, bending, and torsion are taken into account. Such a model significantly expands the functional capabilities of these systems. The mathematical model of the beam designed for the finite element method is taken as a basis. It is then adapted to account for joining the beam to arbitrary points of a rigid solid, thereby yielding a model suitable for analyzing objects with lumped parameters. The beam parameters are the material parameters, the geometric characteristics, and the coordinates of the points of attachment to the solids. The paper describes in detail the algorithm of the computations performed at each step of the numerical integration of the systems of ordinary differential equations, and presents an equivalent diagram of the mathematical model of the beam. Mathematical models of elastic rail guides, cylindrical and prismatic, derived from the mathematical model of the beam are more functional than models based on kinematic equations. The prismatic rail guide (V-guide), unlike the beam, does not counteract the translational motion of solids along it, i.e., it only works in torsion and bending, and the deforming part of the rail guide length is variable. The cylindrical rail guide works in bending only. These differences can be easily implemented by modifying the equations of the mathematical model of the beam. Using these models allows us to connect solids by two or more rail guides, and it does not lead to degeneration of the Jacobi matrix (unlike models based on kinematic equations). The models are implemented in the PA8 and PA9 software and methodological support complexes for the analysis of dynamic objects developed at the Department of CAD at Bauman MSTU.

Modeled after Barbara Byrne's other best-selling structural equation modeling (SEM) books, this practical guide reviews the basic concepts and applications of SEM using Mplus Versions 5 & 6. The author reviews SEM applications based on actual data taken from her own research. Using non-mathematical language, it is written for the novice SEM user. In each application chapter, the author "walks" the reader through all steps involved in testing the SEM model, including an explanation of the issues addressed and illustrated and annotated testing of the hypothesized and post hoc models expl

The minimal model forms the basis for verifying the LHA model. We consider two techniques to verify the reactive properties specified as CTL formulas: (i) reachability analysis and (ii) model checking. A systematic translation of LHA models into constraint logic programs is defined. This is mechanised by a compiler. To facilitate forward and backward reasoning, two different ways to model an LHA are defined. A framework consisting of general-purpose constraint logic program tools is presented to accomplish the reachability analysis to verify a class of safety and liveness properties. A tool to compute...

Recently, massive focus has been placed on demand response (DR) programs, aimed at reducing electricity prices, resolving transmission line congestion, enhancing security, and improving market liquidity. Basically, demand response programs are divided into two main categories, namely incentive-based programs and time-based programs. The focus of this paper is on Interruptible/Curtailable service (I/C) and capacity market programs (CAP), which are incentive-based demand response programs that include penalties for customers who do not respond to load reduction. First, using the concepts of the price elasticity of demand and the customer benefit function, an economic model of the above-mentioned programs is developed. The proposed model helps the independent system operator (ISO) to identify and employ the relevant DR program, one that both improves the characteristics of the load curve and is also welcomed by customers. To evaluate the performance of the model, a simulation study has been conducted using the load curve of the peak day of the Iranian power system grid in 2007. In the numerical study section, the impact of these programs on load shape and load level, the benefit to customers, and the reduction of energy consumption are shown. In addition, using strategy success indices, the results of the simulation studies for different scenarios are analyzed to determine the priority of the scenarios. (author)
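
For orientation, the price-elasticity portion of such economic models is commonly written as a linear responsive-load relation; the sketch below uses that standard form with entirely hypothetical loads, prices, and elasticities, not the paper's actual parameter values.

    # Sketch of the standard price-elasticity load model used in this
    # literature; all numbers are hypothetical. Diagonal entries of E are
    # self-elasticities (negative), off-diagonal entries cross-elasticities.
    import numpy as np

    d0 = np.array([900.0, 1200.0, 1500.0])   # baseline load: valley, shoulder, peak
    p0 = np.array([30.0, 30.0, 30.0])        # flat baseline price ($/MWh)
    p  = np.array([20.0, 30.0, 55.0])        # time-varying / penalty-adjusted price
    E  = np.array([[-0.10,  0.01,  0.01],
                   [ 0.01, -0.10,  0.01],
                   [ 0.01,  0.01, -0.10]])

    # d_i = d0_i * (1 + sum_j E_ij * (p_j - p0_j) / p0_j)
    d = d0 * (1.0 + E @ ((p - p0) / p0))
    print("responsive load:", d.round(1))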

The development of a simulation model, "JohneSSim", was part of a research program aimed at designing a national Johne's disease control program for The Netherlands. Initially, the focus was mainly directed towards different compulsory "test-and-cull" strategies. However, the results from the JohneS

This paper presents preliminary evidence for the effectiveness of a model of care designed to provide safe and effective services in both short-term shelter and short-term staff secure detention programs. Boys Town short-term crisis shelter programs were designed to provide a safe and therapeutic environment for homeless and runaway youth in need…

This annex supplements the Puerto Rico Experimental Model Dental Training Program Comprehensive Report (CE 028 213) and is comprised of exhibits G through L. Among the information included in the exhibits is the evaluation reports of the commission on accreditation, the detailed curriculum, and the accredited program's scope, sequence, and course…

A logic model is a visual representation of the assumptions and theory of action that underlie the structure of an education program. A program can be a strategy for instruction in a classroom, a training session for a group of teachers, a grade-level curriculum, a building-level intervention, or a district-or statewide initiative. This guide, an…

The teaching of introductory computer programming seems far from successful, with many first-year students performing more poorly than expected. One possible reason for this is that novices hold "non-viable" mental models (internal explanations of how something works) of key programming concepts which then cause misconceptions and difficulties. An…

Today, Montessori infant & toddler programs around the country usually have a similar look and feel--low floor beds, floor space for movement, low shelves, natural materials, tiny wooden chairs and tables for eating, and not a highchair or swing in sight. But Montessori toddler programs seem to fall into two paradigms--one model seeming more…

The Future Scientists Program of Texas A&M University and the Agricultural Research Service branch of USDA serves as a model program of effective collaboration between a federal agency and K-12. It demonstrates true partnership that contextualizes the learning of science and provides quality professional development, benefiting teachers and their…

Assumptions made and techniques used in modeling the power network to the 480 volt level are discussed. Basic computational techniques used in the short circuit program are described along with a flow diagram of the program and operational procedures. Procedures for incorporating network changes are included in this user's manual.

A leadership program was created for students to gain skills and/or change their behavior using Appreciative Inquiry and Video Self Modeling (VSM). In 2011, a youth who experiences a disability had been unable to achieve a skill utilizing traditional methods of skill acquisition. He employed the Appreciative Inquiry and VSM leadership program and…

This article presents a computer program for the estimation of multivariate (bivariate and trivariate) volatility processes, written in EViews Version 4.1. In order to estimate multivariate volatility processes for analysis of the Serbian financial market, I had to write new subprograms within the EViews software package. The programs are written for the diagonal vector ARCH model (DVEC) in bivariate and trivariate versions.
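
As a sketch of the process such subprograms estimate, the bivariate DVEC(1,1) recursion can be simulated directly; the parameter matrices below are hypothetical and merely chosen to keep the conditional covariance positive definite.

    # Illustrative simulation of the bivariate diagonal VEC (DVEC) recursion;
    # parameter values are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    C = np.array([[0.05, 0.01], [0.01, 0.05]])
    A = np.array([[0.10, 0.05], [0.05, 0.10]])   # ARCH loadings (elementwise)
    B = np.array([[0.85, 0.80], [0.80, 0.85]])   # GARCH loadings (elementwise)

    T = 1000
    H = C.copy()
    eps = np.zeros((T, 2))
    for t in range(1, T):
        # H_t = C + A o (eps_{t-1} eps_{t-1}') + B o H_{t-1}  (o = Hadamard)
        H = C + A * np.outer(eps[t-1], eps[t-1]) + B * H
        L = np.linalg.cholesky(H)
        eps[t] = L @ rng.standard_normal(2)
    print("sample covariance of simulated returns:\n", np.cov(eps.T).round(3))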

This chapter describes the application of the software tool MRQT in combination with lumped parameter models in the System Identification Competition. MRQT identifies the parameters of a deterministic model by minimizing the simulation error using the Marquardt-Levenberg method. More than presenti

The Association to Advance Collegiate Schools of Business (AACSB) International's assurance of learning (AoL) standards require that schools develop a sophisticated continuous-improvement process. The authors review various assessment models and develop a practical, 6-step AoL model based on the literature and the authors' AoL-implementation…

This paper presents a stochastic robust framework for two-stage power system optimization problems with uncertainty. The model optimizes the probabilistic expectation of different worst-case scenarios with different uncertainty sets. A case study of unit commitment shows the effectiveness of the proposed model and algorithms.

Described is a computer-based wind energy conversion system analysis model (WECSAM) developed to predict the technical and economic performance of wind energy conversion systems (WECS). The model is written in CDC FORTRAN V. The version described accesses a data base containing wind resource data, application loads, WECS performance characteristics, utility rates, state taxes, and state subsidies for a six state region (Minnesota, Michigan, Wisconsin, Illinois, Ohio, and Indiana). The model is designed for analysis at the county level. The computer model includes a technical performance module and an economic evaluation module. The modules can be run separately or together. The model can be run for any single user-selected county within the region or looped automatically through all counties within the region. In addition, the model has a restart capability that allows the user to modify any data-base value written to a scratch file prior to the technical or economic evaluation. Thus, any user-supplied data for WECS performance, application load, utility rates, or wind resource may be entered into the scratch file to override the default data-base value. After the model and the inputs required from the user and derived from the data base are described, the model output and the various output options that can be exercised by the user are detailed. The general operation is set forth and suggestions are made for efficient modes of operation. Sample listings of various input, output, and data-base files are appended. (LEW)

The authors describe their initial 5-year experience with a new model of residency directorship: a triumvirate of shared leadership consisting of a director and 2 associate directors with specific areas of expertise and assigned responsibility. The major appeal of this model is its potential to draw on the diverse talents of 3 individuals with responsibilities matched to their specific areas of strength. A major benefit of the model is that each director has more time and energy to devote to specific duties, resulting in a greater opportunity for innovation and creativity. In this article, the authors describe the roles, responsibilities, and accomplishments of each of the 3 directors. They also discuss potential benefits of the triumvirate model in comparison with a traditional residency directorship and potential pitfalls to avoid when implementing this model.

The Oasis Enrichment Model aims to promote the cognitive enrichment of the gifted. Several experts in gifted education were involved in the design, implementation and evaluation of the results. The model was tested in a group of public schools in the Kingdom of Saudi Arabia (KSA). Before this pilot study, the model was revised based on the opinions of 75 expert teachers. The pilot study was conducted under the supervision of 14 managers. The administrators were staff from the Ministry of Education. During the experimental period, regular reports were written by teachers, administrators and school principals and were presented to the developers of the model in order to improve it. In addition, both qualitative and quantitative evaluations were administered to ensure the validity, reliability and efficiency of the model and to modify it when necessary (MOE, 2004).

Pareto genetic programming methodology is extended by additional generic model selection and generation strategies that (1) drive the modeling engine toward the creation of models of reduced non-linearity and increased generalization capabilities, and (2) improve the effectiveness of the search for robust m

The aim of this research is to develop a learning model which blends factors from the learning environment with engineering design concepts for learning in a computer programming course. The usage of the model was also analyzed. This study presents the design, implementation, and evaluation of the model. The research methodology is divided into three…

This slide presentation reviews the use of animal models to conduct and inform bone morphology research, in particular human research on bone loss resulting from low-gravity environments. Reasons for the use of animal models as tools for human research programs include time efficiency, cost effectiveness, the possibility of invasive measures, and predictability, as some models are predictive of drug effects.

The following chapters describe the structure and code of MAMO, and walk the reader through running the different components of the program with sample data. This manual should be used alongside a computer running R, so that the reader can copy and paste code into R, observe the output, and follow along interactively. Taken together, chapters 2–4 will allow the user to replicate a simulation study investigating the consequences of climate change and two potential management actions on the population dynamics of a vulnerable and iconic Hawaiian forest bird, the ‘I‘iwi (Drepanis coccinea; hereafter IIWI).

This is a descriptive case study investigating the use of two computer-based programming environments (CPEs), MicroWorlds(TM) (MW) and Stagecast Creator(TM) (SC), as modeling tools for collaborative fifth grade science learning. In this study I investigated how CPEs might support fifth grade student work and inquiry in science. There is a longstanding awareness of the need to help students learn about models and modeling in science, and CPEs are promising tools for this. A computer program can be a model of a physical system, and modeling through programming may make the process more tangible: programming involves making decisions and assumptions; the code is used to express ideas; running the program shows the implications of those ideas. In this study I have analyzed and compared students' activities and conversations in two after-school clubs, one working with MW and the other with SC. The findings confirm the promise of CPEs as tools for teaching the practices of modeling and science, and they suggest advantages and disadvantages of particular aspects of CPE designs for that purpose. MW is an open-ended, textual CPE that uses procedural programming. MW students focused on breaking down phenomena into small programmable pieces, which is useful for scientific modeling. Developing their programs, the students focused on writing, testing and debugging code, which are also useful for scientific modeling. SC is a non-linear, object-oriented CPE that uses a visual program language. SC students saw their work as creating games. They focused on the overall story, which they then translated into SC rules, which was in conflict with SC's object-oriented interface. However, telling the story of individual causal agents was useful for scientific modeling. Programming in SC was easier, whereas reading code in MW was more tangible. The latter helped MW students to use the code as the representation of the phenomenon rather than merely as a tool for creating a simulation. The

A computer program was developed for predicting nonlinear uniaxial material responses using viscoplastic constitutive models. Four specific models, i.e., those due to Miller, Walker, Krieg-Swearengen-Rhode, and Robinson, are included. Any other unified model is easily implemented into the program in the form of subroutines. Analysis features include stress-strain cycling, creep response, stress relaxation, thermomechanical fatigue loop, or any combination of these responses. An outline is given on the theoretical background of uniaxial constitutive models, analysis procedure, and numerical integration methods for solving the nonlinear constitutive equations. In addition, a discussion on the computer program implementation is also given. Finally, seven numerical examples are included to demonstrate the versatility of the computer program developed.
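
As an illustration of the general scheme (explicit integration of a unified model under strain-controlled cycling), here is a sketch using a generic Perzyna-type overstress law rather than any of the four models named above; all material constants are hypothetical.

    # Hedged sketch: explicit time integration of a generic Perzyna-type
    # viscoplastic model under strain-controlled uniaxial cycling. Constants
    # are hypothetical, not those of the Miller, Walker, Krieg-Swearengen-
    # Rhode, or Robinson models.
    import math

    E, sigma_y, eta, n = 200e3, 250.0, 1.0e3, 3.0   # MPa, MPa, MPa*s, -
    dt, rate, eps_max = 1e-3, 1e-3, 0.02            # s, 1/s, -

    eps = eps_in = sigma = 0.0
    direction, history = 1, []
    for _ in range(200000):
        eps += direction * rate * dt                 # prescribed strain path
        if abs(eps) >= eps_max:
            direction = -direction                   # reverse: cyclic loading
        over = abs(sigma) - sigma_y                  # overstress
        if over > 0:                                 # inelastic flow (Perzyna)
            eps_in += dt * (over / eta) ** n * math.copysign(1.0, sigma)
        sigma = E * (eps - eps_in)                   # elastic relation
        history.append((eps, sigma))
    print("final state: eps=%.4f sigma=%.1f MPa" % history[-1])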

Discusses the results of a study designed to investigate the effects of five different program models on both acquisition and maintenance of Spanish by native Spanish-speaking kindergarten children. (Author/CFM)

Kirkpatrick reviews his 1959 article presenting his four-level model of evaluation. He suggests that training professionals should evaluate their programs and understanding those four levels is a good start. The text of the original article is included. (JOW)

For a generic, flexible, efficient array antenna receiver platform, a hierarchical reconfigurable tiled architecture has been proposed. The architecture provides a flexible reconfigurable solution, but partitioning, mapping, modeling and programming such systems remain an issue. We will advocate a mod

The objectives of this mixed methods research were (1) to study the effects of the health behavior modification program (HBMP) conducted under the principles of the PROMISE Model and the CIPP Model and (2) to compare the 3-self health behaviors and the biomedical indicators before and after the program's completion. During the program, three sample groups comprising 30 program leaders, 30 commanders and 120 clients were assessed, and assessments were taken on 4,649 volunteers who were at risk of metabolic syndrome before and after the program, which was conducted in 17 hospitals. The collected data were analyzed by the t-test and path analysis. The research instruments were questionnaires used for program evaluation, structured interview forms, and questionnaires used for 3-self health behavior assessment. The findings were as follows: (1) During the program, comparing the overall opinions toward the program among the three sample groups showed no difference (F = 2.219). (2) The program management factors based on the PROMISE Model (positive reinforcement, optimism, context, and process or activity provision) had an overall influence on the product, or success, of the HBMP (p < 0.05), with effect sizes of 0.37, 0.13, 0.31 and 0.88, respectively. Together, these factors could predict 69% of the program's product. (3) After participating in the program, the clients' 3-self health behaviors (self-efficacy, self-regulation, and self-care) were significantly higher than before participation (p < 0.05), and their biomedical indicators (BMI, blood pressure, waistline, blood glucose, lipid profiles, cholesterol, and HbA1c) were significantly lower than those measured before the program (p < 0.05).

NAMMU is a computer program for modelling groundwater flow and transport through porous media. This document provides an overview of the use of the program for geosphere modelling in performance assessment calculations and gives a detailed description of the program itself. The aim of the document is to give an indication of the grounds for having confidence in NAMMU as a performance assessment tool. In order to achieve this the following topics are discussed. The basic premises of the assessment approach and the purpose of and nature of the calculations that can be undertaken using NAMMU are outlined. The concepts of the validation of models and the considerations that can lead to increased confidence in models are described. The physical processes that can be modelled using NAMMU and the mathematical models and numerical techniques that are used to represent them are discussed in some detail. Finally, the grounds that would lead one to have confidence that NAMMU is fit for purpose are summarised.

Capital budgeting is concerned with maximizing the total net profit subject to budget constraints by selecting an appropriate combination of projects. This paper presents chance-maximizing models for capital budgeting with fuzzy input data and multiple conflicting objectives. When the decision maker sets a prospective profit level and wants to maximize the chances of the total profit achieving the prospective profit level, a fuzzy dependent-chance programming model, a fuzzy multi-objective dependent-chance programming model, and a fuzzy goal dependent-chance programming model are used to formulate the fuzzy capital budgeting problem. A fuzzy-simulation-based genetic algorithm is used to solve these models. Numerical examples are provided to illustrate the effectiveness of the simulation-based genetic algorithm and the potential applications of these models.

In large industrial control systems such as the ones installed at CERN, one of the main issues is the ability to verify the correct behaviour of the Programmable Logic Controller (PLC) programs. While manual and automated testing can achieve good results, some obvious problems remain unsolved such as the difficulty to check safety or liveness properties. This paper proposes a general methodology and a tool to verify PLC programs by automatically generating formal models for different model checkers out of ST code. The proposed methodology defines an automata-based formalism used as intermediate model (IM) to transform PLC programs written in ST language into different formal models for verification purposes. A tool based on Xtext has been implemented that automatically generates models for the NuSMV and UPPAAL model checkers and the BIP framework.

Recent results showed PPAD-completeness of the problem of computing an equilibrium for Fisher's market model under additively separable, piecewise-linear, concave utilities. We show that introducing perfect price discrimination in this model renders its equilibrium polynomial time computable. Moreover, its set of equilibria is captured by a convex program that generalizes the classical Eisenberg-Gale program and always admits a rational solution.
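
For reference, the classical Eisenberg-Gale convex program computes Fisher equilibria for linear utilities by maximizing the budget-weighted sum of log utilities:

    \max \sum_i m_i \log u_i \quad \text{s.t.} \quad u_i = \sum_j u_{ij} x_{ij} \ \forall i, \qquad \sum_i x_{ij} \le 1 \ \forall j, \qquad x_{ij} \ge 0,

where m_i is buyer i's budget and u_{ij} the utility of good j to buyer i; the program described above generalizes this formulation to the price-discrimination setting.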

We have updated the parton and hadron cascade model PACIAE for relativistic nuclear collisions from the version based on JETSET 6.4 and PYTHIA 5.7; the updated model is referred to as PACIAE 2.0. The main physics concerning the stages of parton initiation, parton rescattering, hadronization, and hadron rescattering is discussed. The structure of the programs is briefly explained. In addition, some calculated examples are compared with experimental data. It turns out that this model (program) works well.

The aim of this paper is to give a brief overview of the concepts of network-on-chip and programming models in the multiprocessor system-on-chip world, an attractive and relatively new field for academia. Numerous proposals from academia and industry are selected to highlight the evolution of implementation approaches both in NoC proposals and in programming model proposals.

Causal modeling and the accompanying learning algorithms provide useful extensions for in-depth statistical investigation and automation of performance modeling. We enlarged the scope of existing causal structure learning algorithms by using the form-free information-theoretic concept of mutual information and by introducing a complexity criterion for selecting direct relations among equivalent relations. The underlying probability distribution of the experimental data is estimated by kernel density estimation. We then report on the benefits of a dependency analysis and the decompositional capacities of causal models. Useful qualitative models, providing insight into the role of every performance factor, were inferred from experimental data. This paper reports results for an LU decomposition algorithm and a study of the parameter sensitivity of the Kakadu implementation of the JPEG-2000 standard. Next, the analysis was used to search for generic performance characteristics of the applications.
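
As a minimal sketch of the form-free ingredient described here (a plug-in estimator built on scipy's Gaussian kernel density estimation, not the authors' implementation), the mutual information between two sampled performance factors can be estimated as:

    import numpy as np
    from scipy.stats import gaussian_kde

    def mutual_information(x, y):
        """Plug-in estimate of I(X;Y) in nats from paired samples, using
        Gaussian kernel density estimates of p(x, y), p(x), and p(y)."""
        xy = np.vstack([x, y])              # joint samples, shape (2, n)
        p_xy = gaussian_kde(xy)
        p_x, p_y = gaussian_kde(x), gaussian_kde(y)
        # Average log density ratio over the observed samples.
        ratio = p_xy(xy) / (p_x(x) * p_y(y))
        return float(np.mean(np.log(ratio)))

    # Hypothetical demo: correlated Gaussians, analytic MI = -0.5 * ln(1 - rho^2).
    rng = np.random.default_rng(0)
    rho = 0.8
    x = rng.standard_normal(2000)
    y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(2000)
    print("KDE estimate:", mutual_information(x, y))
    print("analytic    :", -0.5 * np.log(1 - rho**2))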

The model showed that the nutritional strategy applied is dependent upon the availability, price and ... to produce least-cost formulations for dairy cows, and enables the ... the trends which occur in practice. Description ... Mineral supplement.

The demand for health professionals continues to increase, partially due to the aging population and the high proportion of practitioners nearing retirement. The University of British Columbia (UBC) has developed a program to address this demand by providing support for internationally trained physiotherapists in their preparation for the national physiotherapy competency examinations. The aim was to create a program comprising the educational tools and infrastructure to support internationally educated physiotherapists (IEPs) in their preparation for entry to practice in Canada and to improve their pass rate on the national competency examination. The program was developed using a logic model and evaluated using program evaluation methodology. Program tools and resources included educational modules and curricular packages, which were developed and refined based on feedback from clinical experts, IEPs, and clinical physical therapy mentors. An examination bank was created and used to include test-enhanced education. Clinical mentors were recruited and trained to provide clinical and cultural support for participants. The IEP program has recruited 124 IEPs, with 69 now integrated into the Canadian physiotherapy workforce, and more IEPs continuing to apply to the program. International graduates who participated in the program had an improved pass rate on the national Physiotherapy Competency Examination (PCE); participation in the program gave them a 28% (95% CI, 2% to 59%) greater likelihood of passing the written section than their counterparts who did not take the program. In 2010, 81% of all IEP candidates who completed the UBC program passed the written component, and 82% passed the clinical component. The program has proven to be successful and sustainable. This program model could be replicated to support the successful integration of other international health professionals into the workforce.

The FIELD II simulation program can be used for simulating any kind of linear ultrasound field. The program is capable of describing multi-element transducers used with any kind of excitation, apodization, and focusing. The program has been widely used both in academia and by commercial ultrasound companies for investigating novel transducer geometries and advanced linear imaging schemes. The program models transducer geometries using a division of the transducer elements into either rectangles, triangles, or bounding lines. The precision of the simulation and the simulation time are intimately linked...

We develop a nonlinear mathematical model of the effect of awareness programs on binge drinking. Because awareness programs are capable of inducing behavioral changes in nondrinkers, we introduce a separate class of individuals who avoid contact with heavy drinkers. Furthermore, we assume that the cumulative density of awareness programs increases at a rate proportional to the number of heavy drinkers. We establish sufficient conditions for the stability of the alcohol-free and alcohol-present equilibria and give numerical simulations to explain our main result. Our results show that awareness programs are an effective measure for reducing alcohol problems.
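
Schematically, and as a generic rendering of the structure described rather than the authors' exact system, such a model couples nondrinkers S, aware nondrinkers A, heavy drinkers H, and the cumulative density of awareness programs M:

    \frac{dS}{dt} = \Lambda - \beta S H - \lambda S M + \delta A - \mu S, \qquad
    \frac{dA}{dt} = \lambda S M - \delta A - \mu A,
    \frac{dH}{dt} = \beta S H - (\mu + \gamma) H, \qquad
    \frac{dM}{dt} = \theta H - \theta_0 M,

where the last equation encodes the assumption that awareness programs accumulate at a rate proportional to the number of heavy drinkers.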

Insufficient theoretical definition of heterogeneous catalysts is the major difficulty confronting industrial suppliers who seek catalyst systems which are more active, selective, and stable than those currently available. In contrast, progress was made in tailoring homogeneous catalysts to specific reactions because more is known about the reaction intermediates promoted and/or stabilized by these catalysts during the course of reaction. However, modeling heterogeneous catalysts on a microscopic scale requires compiling and verifying complex information on reaction intermediates and pathways. This can be achieved by adapting homogeneous catalyzed reaction intermediate species, applying theoretical quantum chemistry and computer technology, and developing a better understanding of heterogeneous catalyst system environments. Research in microscopic reaction modeling is now at a stage where computer modeling, supported by physical experimental verification, could provide information about the dynamics of the reactions that will lead to designing supported catalysts with improved selectivity and stability.

Recent research suggests that school-based kindness education programs may benefit the learning and social-emotional development of youth and may improve school climate and school safety outcomes. However, how and to what extent kindness education programming influences positive outcomes in schools is poorly understood, and such programs are difficult to evaluate in the absence of a conceptual model for studying their effectiveness. In partnership with Kind Campus, a widely adopted school-based kindness education program that uses a bottom-up program framework, a methodology called concept mapping was used to develop a conceptual model for evaluating school-based kindness education programs from the input of 123 middle school students and approximately 150 educators, school professionals, and academic scholars. From the basis of this model, recommendations for processes and outcomes that would be useful to assess in evaluations of kindness education programs are made, and areas where additional instrument development may be necessary are highlighted. The utility of the concept mapping method as an initial step in evaluating other grassroots or non-traditional educational programming is also discussed.

Introductory programming seems far from being successful at both university and high school levels. Research data already published offer significant knowledge regarding university students' deficiencies in computer programming and the alternative representations they build about abstract programming constructs. However, secondary education students' learning and development in computer programming has not been extensively studied. This paper reports on the use of the SOLO taxonomy to explore secondary education students' representations of the concept of the programming variable and the assignment statement. Data were collected in the form of students' written responses to programming tasks involving short programs. The responses were mapped to the different levels of the SOLO taxonomy. The results showed that more than half of the students in the sample tended to give prestructural, unistructural, and multistructural responses to the research tasks. In addition, the findings provide evidence that students' thinking and application patterns are predominantly based on mathematics-like mental models of the programming variable and the assignment statement. The paper concludes with suggestions for instructional design and practice to help students build coherent and viable mental models of the programming variable and the assignment statement.
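
For concreteness, a classic probe of these mental models (a hypothetical task of the kind such studies use, not one taken from the paper's instrument) is the variable-swap exercise, where reading the assignment statement as mathematical equality gives the wrong prediction:

    # Reading '=' as mathematical equality suggests a and b simply trade values;
    # tracing the state shows why the first assignment destroys the old value.
    a, b = 3, 7

    a = b          # a == 7, b == 7: the old value of a is gone
    b = a          # b == 7, not 3

    # Correct version using a temporary variable as a buffer.
    a, b = 3, 7
    tmp = a        # remember the old value of a
    a = b          # a == 7
    b = tmp        # b == 3
    print(a, b)    # prints: 7 3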

A three-dimensional static model is described to evaluate the forces on low-back muscles and on the spine during manual handling tasks and other forceful activities. It is simple to use either with a calculator or when programmed on a microcomputer, whilst being more accurate than existing simple models. Comparisons are made with a more sophisticated model that requires mathematical libraries and programming skills. As the predictions are similar, so is the area of validity: the proposed model's accuracy is good for light tasks but poorer for strenuous ones.

A model to optimize the planning of an integrated chemical system comprising multiple devices and products is proposed in this paper. With the objective of increasing profit, the traditional model for optimizing production planning is first presented. The price of chemicals, the market demand, and the production capacity are then treated as varying parameters, an improved model in which some parameters are not constant is developed, and a new method to solve the resulting grey linear programming problem is proposed. In the grey programming model, the value of credibility can...

Distribution system reconfiguration aims to choose a switching combination of branches of the system that optimizes certain performance criteria of power supply while maintaining some specified constraints. The ability to automatically reconfigure the network quickly and reliably is a key requirement of self-healing networks, an important part of the future Smart Grid. We present a unified mathematical framework, which allows us to consider different objectives of distribution system reconfiguration problems in a flexible manner, and investigate its performance. The resulting optimization problem is in quadratic form and can be solved efficiently by a quadratic mixed integer programming (QMIP) solver. The proposed method has been applied to reconfiguring different standard test distribution systems.
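
Generically, the resulting problem has the quadratic mixed-integer form

    \min_x \; x^\top Q x + c^\top x \quad \text{s.t.} \quad A x \le b, \quad x_i \in \{0, 1\},

where the binary vector x encodes switch states, Q and c express the chosen performance criterion (for example, losses), and the linear constraints enforce requirements such as radiality and feeder limits; this generic form is an illustration, not the paper's exact formulation.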

This paper describes the introduction of stochastic linear programming into Operations DER-CAM, a tool used to obtain optimal operating schedules for a given microgrid under local economic and environmental conditions. This application follows previous work on optimal scheduling of a lithium-iron-phosphate battery given the output uncertainty of a 1 MW molten carbonate fuel cell. Both are in the Santa Rita Jail microgrid, located in Dublin, California. This fuel cell has proven unreliable, partially justifying the consideration of storage options. Several stochastic DER-CAM runs are executed to compare different scenarios to values obtained by a deterministic approach. Results indicate that using a stochastic approach provides a conservative yet more lucrative battery schedule. Lower expected energy bills result; given fuel cell outages, potential savings exceed 6 percent.

PrimeSupplier, a supplier cross-program and element-impact simulation model, with supplier solvency indicator (SSI), has been developed so that the Shuttle program can see early indicators of supplier and product line stability, while identifying the various elements and/or programs that have a particular supplier or product designed into the system. The model calculates two categories of benchmarks to determine the SSI, with one category focusing on agency programmatic data and the other focusing on a supplier's financial liquidity. PrimeSupplier was developed to help NASA smoothly transition design, manufacturing, and repair operations from the Shuttle program to the Constellation program, without disruption in the industrial supply base.

This article presents a new physical and mathematical model of a bevel gear for studying the influence of design parameters and operating factors on the dynamic state of the gear transmission. The process of verifying the correct operation of the authors' own calculation program, used to obtain solutions of the dynamic model of the bevel gear, is discussed. A block diagram of the computational algorithm used to create the numerical simulation program is presented. The program source code is written in MATLAB, an interactive environment for scientific and engineering calculations.

No single model of parallel programming is likely to serve for all tasks, however. Early vision algorithms are intensely data parallel, often utilizing fine-grain parallel computations that share an image, while cognition algorithms decompose naturally by function, often consisting of loosely coupled, coarse-grain parallel units. A typical animate vision application will likely consist of many tasks, each of which may require a different parallel programming model, and all of which must cooperate to achieve the desired behavior. These multi-model programs require an...

Background: Tobacco use is highly prevalent and culturally accepted in rural Maharashtra, India. Aims: To study the knowledge, attitude, and practices (KAP) regarding tobacco consumption; identify reasons for initiation and continuation of tobacco use; identify the prevalence of tobacco consumption and its relation with different precancerous lesions; provide professional help for quitting tobacco; and develop local manpower for tobacco cessation activities. Settings, Design, Methods and Material: The present study was conducted for one year in a chemical industrial unit in Ratnagiri district. All employees (104) were interviewed and screened for oral neoplasia. Their socio-demographic features, habits, awareness levels, etc. were recorded. Active intervention in the form of awareness lectures, focus group discussions, one-to-one counseling and, if needed, pharmacotherapy was offered to the tobacco users. Results: All employees actively participated in the program. Overall, 48.08% of the employees were found to use tobacco, among whom the smokeless forms were predominant. Peer pressure and pleasure were the main reasons for initiation of tobacco consumption; the belief that, though injurious, it would not harm them, avoidance of the physical discomfort of quitting, and stress relief were important factors in continuation of the habit. Employees had poor knowledge regarding the ill effects of tobacco. 40% of tobacco users had oral precancerous lesions, which were predominant in employees consuming smokeless forms of tobacco. Conclusions: Identifying reasons for initiation and continuation of tobacco consumption, along with a baseline assessment of knowledge, attitudes, and practices regarding tobacco use, is important in formulating strategies for a comprehensive workplace tobacco cessation program.

In order to rank all decision making units (DMUs) on the same basis, this paper proposes a multiobjective programming (MOP) model based on a compensatory data envelopment analysis (DEA) model to derive a common set of weights that can be used for the full ranking of all DMUs. We first revisit a compensatory DEA model for ranking all units, point out the existing problem in solving the model, and present an improved algorithm by which an approximate global optimal solution of the model can be obtained by solving a sequence of linear programming problems. Then, we apply the key idea of the compensatory DEA model to develop the MOP model, in which the objectives are to simultaneously maximize all common weights under the constraints that the sum of the efficiency values of all DMUs is equal to unity and the sum of all common weights is also equal to unity. In order to solve the MOP model, we transform it into a single objective programming (SOP) model using a fuzzy programming method and solve the SOP model using the proposed approximation algorithm. To illustrate the proposed ranking method, two numerical examples are solved.
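
Schematically, and in generic notation rather than the authors' own, the MOP model takes the form

    \max \ (u_1, \dots, u_s, v_1, \dots, v_m)
    \text{s.t.} \quad \sum_{j=1}^{n} E_j = 1, \qquad E_j = \frac{\sum_{r} u_r y_{rj}}{\sum_{i} v_i x_{ij}}, \qquad \sum_r u_r + \sum_i v_i = 1, \qquad u_r, v_i \ge 0,

where all common weights are maximized simultaneously, the sum of the DMU efficiency values E_j is fixed to unity, and the sum of the common weights is fixed to unity; the fuzzy programming step then scalarizes this vector objective into a single objective.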

A growing body of evidence suggests that the intrauterine (IU) environment has a significant and lasting effect on the long-term health of the growing fetus and the development of metabolic disease in later life, as put forth in the fetal origins of disease hypothesis. Metabolic diseases have been associated with alterations in the epigenome that occur without changes in the DNA sequence, such as cytosine methylation of DNA, histone posttranslational modifications, and micro-RNA. Animal models of epigenetic modifications secondary to an altered IU milieu are an invaluable tool to study the mechanisms that determine the development of metabolic diseases, such as diabetes and obesity. Rodents and non-litter-bearing animals are good models for the study of disease, because they have embryology, anatomy, and physiology similar to humans. Thus, it is feasible to monitor and modify the IU environment of animal models in order to gain insight into the molecular basis of human metabolic disease pathogenesis. In this review, the PubMed database was searched for articles published between 1999 and 2011. Key words included epigenetic modifications, IU growth retardation, small for gestational age, animal models, metabolic disease, and obesity. The inclusion criteria used to select studies were animal models of epigenetic modifications during fetal and neonatal development associated with adult metabolic syndrome. Experimental manipulations included changes in the nutritional status of the pregnant female (calorie-restricted, high-fat, or low-protein diets during pregnancy) as well as of the father; interference with placental function or uterine blood flow; environmental toxin exposure during pregnancy; and dietary modifications during the neonatal (lactation) and pubertal periods. This review article is focused solely on studies in animal models that demonstrate epigenetic changes that are correlated with manifestation of metabolic disease, including diabetes

Through Small Business Innovation Research (SBIR) contracts with Ames Research Center, Intelligent Automation Inc., based in Rockville, Maryland, advanced specialized software the company had begun developing with U.S. Department of Defense funding. The agent-based infrastructure now allows NASA's Airspace Concept Evaluation System to explore ways of improving the utilization of the National Airspace System (NAS), providing flexible modeling of every part of the NAS down to individual planes, airports, control centers, and even weather. The software has been licensed to a number of aerospace and robotics customers, and has even been used to model the behavior of crowds.

Design automation and analysis tools targeting embedded platforms, developed using a component-based design approach, must be able to reason about the capabilities of the platforms. In the general case, where nothing is assumed about the components comprising a platform or the platform topology, analysis must be employed to determine its capabilities. This kind of analysis is the subject of this dissertation. The main contribution of this work is the Service Relation Model, used to describe and analyze the flow of service in models of platforms and systems composed of re-usable components...

Teaching pharmacokinetic-pharmacodynamic (PK/PD) models can be made more effective using computer simulations. We propose the programming of educational PK or PK/PD computer simulations as an alternative to the use of pre-built simulation software. This approach has the advantage of adaptability to non-standard or complicated PK or PK/PD models. Simplicity of the programming procedure was achieved by selecting the LabVIEW programming environment. An intuitive user interface to visualize the time courses of drug concentrations or effects can be obtained with pre-built elements. The environment uses a wiring analogy that resembles electrical circuit diagrams rather than abstract programming code. The goal of high interactivity of the simulation was attained by allowing the program to run in continuously repeating loops, which makes the program respond flexibly to user input. The programming is described with the aid of a 2-compartment PK simulation. Examples of more sophisticated simulation programs are also given, in which the PK/PD simulation shows drug input, concentrations in plasma and at the effect site, and the effects themselves as a function of time. A multi-compartmental model of morphine, including metabolite kinetics and effects, is also included. The programs are available for download from the World Wide Web at http://www.klinik.uni-frankfurt.de/zpharm/klin/PKPDsimulation/content.html. For pharmacokineticists who program only occasionally, this offers the possibility of building computer simulations, together with the flexible interactive simulation algorithm, for clinical pharmacology teaching in the field of PK/PD models.
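
The paper's simulations are built in LabVIEW; purely as a language-agnostic sketch of the underlying 2-compartment bolus model (all parameter values are invented), the same loop-based simulation can be written as:

    import numpy as np
    import matplotlib.pyplot as plt

    # Illustrative micro rate constants (1/h) for a hypothetical drug.
    k10, k12, k21 = 0.3, 0.5, 0.4   # elimination and inter-compartment transfer
    V1 = 10.0                        # central volume of distribution (L)
    dose = 100.0                     # IV bolus dose (mg)

    dt, t_end = 0.01, 24.0
    A1, A2 = dose, 0.0               # drug amounts in central/peripheral compartments
    times, conc = [], []

    # Explicit Euler integration of the mass-balance ODEs; a continuously
    # repeating loop like this mirrors the interactive simulation style.
    for i in range(int(t_end / dt)):
        dA1 = -(k10 + k12) * A1 + k21 * A2
        dA2 = k12 * A1 - k21 * A2
        A1 += dA1 * dt
        A2 += dA2 * dt
        times.append(i * dt)
        conc.append(A1 / V1)         # plasma concentration (mg/L)

    plt.plot(times, conc)
    plt.xlabel("time (h)")
    plt.ylabel("plasma concentration (mg/L)")
    plt.show()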

Less than 6% of U.S. medical school applicants are African-American. The lack of diversity among physicians, by race as well as other measures, confers a negative impact on the American healthcare system because underrepresented minority (URM) physicians are more likely to practice in underserved communities and deliver more equitable, culturally competent care. MERIT (Medical Education Resources Initiative for Teens) is a nonprofit organization based in Baltimore, Maryland, USA. MERIT prepares URM high school students for health careers by providing a holistic support system for seven consecutive years. The program model, which utilizes weekly Saturday sessions, summer internships, and longitudinal mentoring, is built on four foundational pillars: (1) Ignite the Fire, (2) Illuminate the Path, (3) Create the Toolkit, and (4) Sustain the Desire. Since 2011, MERIT has supported 51 students in the Baltimore City Public School System. For the past two years, 100% (n = 14) of MERIT seniors enrolled in universities, compared to only 20.2% of Baltimore City students overall. While it is too early to know whether MERIT alumni will realize their goals of becoming healthcare professionals, they are currently excelling in universities and over 75% (n = 17) are still planning to pursue graduate degrees in health-related fields. After piloting an effective program model, MERIT now has three key priorities moving forward: (1) creating a sustainable and thriving organization, (2) increasing the number of scholars the program supports in Baltimore, and (3) expanding MERIT to other cities.

This article aims to describe how schools should structure the development of academic talent at all levels of the K-12 educational system. Adopting as its theoretical framework the "Differentiating Model of Giftedness and Talent," the author proposes (a) a formal definition of academic talent development (ATD) inspired by the principles…

Educators today are challenged with the task of designing curricula and standards for students of varying abilities. While technology and innovation steadily improve classroom learning, teachers and administrators continue to struggle in developing the best methodologies and practices for students with disabilities. "Models for Effective…

This manual is intended to increase awareness of Title IX and related equity issues at the local school district level by providing materials and resources to specialists in school districts. The manual: (1) describes a model traveling equity resource display; and (2) provides instructions, agendas, and participant materials for a two-day training…

Deep learning is based on a set of algorithms that attempt to model high-level abstractions in data. Specifically, the restricted Boltzmann machine (RBM) is a deep learning algorithm used in this project; its time performance is improved through an efficient parallel implementation with the OpenACC tool, applying the best possible optimizations to the RBM to harness the massively parallel power of NVIDIA GPUs. GPU development in the last few years has contributed to the growth of deep learning. OpenACC is a directive-based approach to computing in which directives provide compiler hints to accelerate code. The traditional restricted Boltzmann machine is a stochastic neural network that essentially performs a binary version of factor analysis. The RBM is a useful neural network basis for larger modern deep learning models, such as the Deep Belief Network. RBM parameters are estimated using an efficient training method called Contrastive Divergence. Parallel implementations of the RBM are available using different models such as OpenMP and CUDA, but this project is the first attempt to apply the OpenACC model to the RBM.

This report was prepared for the US Department of Energy (DOE) Office of Codes and Standards by the Pacific Northwest Laboratory (PNL) through its Building Energy Standards Program (BESP). The purpose of this task was to identify demand-side management (DSM) strategies for new construction that utilities have adopted or developed to promote energy-efficient design and construction. PNL conducted a survey of utilities and used the information gathered to extrapolate lessons learned and to identify evolving trends in utility new-construction DSM programs. The ultimate goal of the task is to identify opportunities where states might work collaboratively with utilities to promote the adoption, implementation, and enforcement of energy-efficient building energy codes.

Message passing interface (MPI) is the de facto standard for writing parallel scientific applications on distributed memory systems. Performance prediction of MPI programs on current or future parallel systems can help to find system bottlenecks or optimize programs. To effectively analyze and predict the performance of a large and complex MPI program, an efficient and accurate communication model is highly needed. A series of communication models have been proposed, such as the LogP model family, which assume that the sending overhead, message transmission, and receiving overhead of a communication are not overlapped and that there is a maximum overlap degree between computation and communication. However, this assumption does not always hold for MPI programs, because either sending or receiving overhead introduced by MPI implementations can decrease the potential overlap for large messages. In this paper, we present a new communication model, named LogGPO, which captures the potential overlap of computation with communication in MPI programs. We design and implement a trace-driven simulator to verify the LogGPO model by predicting the performance of point-to-point communication and two real applications, CG and Sweep3D. The average prediction errors of the LogGPO model are 2.4% and 2.0% for these two applications respectively, while the average prediction errors of the LogGP model are 38.3% and 9.1% respectively.
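
For orientation, in the baseline LogGP model that LogGPO refines, the end-to-end time to send a k-byte message is

    T(k) = L + 2o + (k - 1)G,

with latency L, per-message send/receive overhead o, and per-byte gap G for long messages; LogGPO's contribution is to model how much of this cost can actually overlap with computation rather than assuming a fixed maximum overlap degree.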

Be The Match® Patient and Health Professional Services (PHPS) supports patients undergoing hematopoietic cell transplant (HCT) and caregivers by providing educational programs and resources. HCT is a potentially curative therapy for blood cancers such as leukemia and lymphoma. To help meet the increasing demand for support services, PHPS implemented a multipronged plan to build and sustain the organization's capacity to conduct evaluation of its programs and resources. To do so, PHPS created and operationalized an internal evaluation model, developed customized resources to help stakeholders incorporate evaluation in program planning, and implemented utilization-focused evaluation for quality improvement. Formal mentorship was also critical in the development of an evidence-based, customized model and navigating inherent challenges throughout the process. Our model can serve as a guide for evaluators on establishing and operationalizing an internal evaluation program. Ultimately, we seek to improve support and education services from the time of diagnosis through survivorship.

Preprocessing and postprocessing computer programs that enhance the utility of the U.S. Geological Survey radial-flow model have been developed. The preprocessor program: (1) generates a triangular finite element mesh from minimal data input, (2) produces graphical displays and tabulations of data for the mesh, and (3) prepares an input data file to use with the radial-flow model. The postprocessor program is a version of the radial-flow model, which was modified to (1) produce graphical output for simulation and field results, (2) generate a statistic for comparing the simulation results with observed data, and (3) allow hydrologic properties to vary in the simulated region. Examples of the use of the processor programs for a hypothetical aquifer test are presented. Instructions for the data files, format instructions, and a listing of the preprocessor and postprocessor source codes are given in the appendixes. (Author's abstract)

This paper characterizes quality, budget, and demand as fuzzy variables in a fuzzy vendor selection expected value model and a fuzzy vendor selection chance-constrained programming model, to maximize the total quality level. The two models have distinct advantages over existing methods for selecting vendors in fuzzy environments. A genetic algorithm based on fuzzy simulations is designed to solve these two models. Numerical examples show the effectiveness of the algorithm.

The work is devoted to the application of Bourbaki's structure theory to substantiate the synthesis of simulation models of complex multicomponent systems, where every component may be a complex system itself. The application of Bourbaki's structure theory offers a new approach to the design and computer implementation of simulation models of complex multicomponent systems: model synthesis and model-oriented programming. It differs from the traditional object-oriented approach. The central concept of this new approach, and at the same time the basic building block for the construction of more complex structures, is the concept of the model-component. A model-component is endowed with a more complicated structure than, for example, the object in object-oriented analysis. This structure provides the model-component with independent behavior: the ability to give standard responses to standard requests from its internal and external environment. At the same time, the computer implementation of a model-component's behavior is invariant under the integration of model-components into complexes. This fact allows one, firstly, to construct fractal models of any complexity and, secondly, to implement the computational process for such constructions uniformly, by a single universal program. In addition, the proposed paradigm allows one to exclude imperative programming and to generate computer code with a high degree of parallelism.
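
A minimal sketch of the model-component idea (class names and the request protocol are invented for illustration): every component answers the same standard requests, and a complex of components is itself a component, so the construction nests fractally and a single universal driver can run it:

    class ModelComponent:
        """A component with standard responses to standard requests."""
        def __init__(self, name):
            self.name = name

        def handle(self, request, **kwargs):
            # The standard protocol: every component understands the same requests.
            if request == "init":
                return self.on_init(**kwargs)
            if request == "step":
                return self.on_step(**kwargs)
            raise ValueError(f"{self.name}: unknown request {request!r}")

        def on_init(self, **kwargs): pass
        def on_step(self, **kwargs): pass

    class Complex(ModelComponent):
        """A complex of components is itself a component, so models nest
        to any depth and one universal driver runs them uniformly."""
        def __init__(self, name, parts):
            super().__init__(name)
            self.parts = parts

        def on_init(self, **kwargs):
            for part in self.parts:
                part.handle("init", **kwargs)

        def on_step(self, **kwargs):
            # The parts respond independently, so this loop parallelizes naturally.
            for part in self.parts:
                part.handle("step", **kwargs)

    class Counter(ModelComponent):
        def on_init(self, **kwargs): self.value = 0
        def on_step(self, **kwargs): self.value += 1

    system = Complex("system", [Counter("a"), Complex("sub", [Counter("b")])])
    system.handle("init")
    for _ in range(3):
        system.handle("step")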

Under the mandate contained in the FY 1976 NASA Authorization Act, the National Aeronautics and Space Administration (NASA) has developed and is implementing a comprehensive program of research, technology development, and monitoring of the Earth's upper atmosphere, with emphasis on the upper troposphere and stratosphere. This program aims at expanding our chemical and physical understanding to permit both the quantitative analysis of current perturbations as well as the assessment of possible future changes in this important region of our environment. It is carried out jointly by the Upper Atmosphere Research Program (UARP) and the Atmospheric Chemistry Modeling and Analysis Program (ACMAP), both managed within the Research Division in the Office of Earth Science at NASA. Significant contributions to this effort have also been provided by the Atmospheric Effects of Aviation Project (AEAP) of NASA's Office of Aero-Space Technology. The long-term objectives of the present program are to perform research to: understand the physics, chemistry, and transport processes of the upper troposphere and the stratosphere and their control on the distribution of atmospheric chemical species such as ozone; assess possible perturbations to the composition of the atmosphere caused by human activities and natural phenomena (with a specific emphasis on trace gas geographical distributions, sources, and sinks and the role of trace gases in defining the chemical composition of the upper atmosphere); understand the processes affecting the distributions of radiatively active species in the atmosphere, and the importance of chemical-radiative-dynamical feedbacks on the meteorology and climatology of the stratosphere and troposphere; and understand ozone production, loss, and recovery in an atmosphere with increasing abundances of greenhouse gases. The current report is composed of two parts. Part 1 summarizes the objectives, status, and accomplishments of the research tasks supported

'Fetal programming' is a newly emerging field that is revealing astounding insights into the prenatal origins of adult disease, including metabolic, endocrine, and cardiovascular pathophysiology. In the present study, we tested the hypothesis that rat pups conceived, gestated and born at 2-g have significantly reduced birth weights and increased adult body weights as compared to 1-g controls. Offspring were produced by mating young adult male and female rats that were adapted to 2-g centrifugation. Female rats underwent conception, pregnancy and birth at 2-g. Newborn pups in the 2-g condition were removed from the centrifuge and fostered to non-manipulated, newly parturient dams maintained at 1-g. Comparisons were made with 1-g stationary controls, also cross-fostered at birth. As compared to 1-g controls, birth weights of pups gestated and born at 2-g were significantly reduced. Pup body weights were significantly reduced until Postnatal day (P)12. Beginning on P63, body weights of 2-g-gestated offspring exceeded those of 1-g controls by 7-10%. Thus, prenatal rearing at 2-g restricts neonatal growth and increases adult body weight. Collectively, these data support the hypothesis that 2-g centrifugation alters the intrauterine milieu, thereby inducing persistent changes in adult phenotype.

A new computer program, LATIS, being developed at Lawrence Livermore National Laboratory is used to study the effect of pulsed laser irradiation on endovascular patch welding. Various physical and biophysical effects are included in these simulations: laser light scattering and absorption, tissue heating and heat conduction, vascular cooling, and tissue thermal damage. The geometry of a patch being held against the inner vessel wall (500 µm inner diameter) by a balloon is considered. The system is exposed to light pulsed from an optical fiber inside the balloon. A minimum in the depth of damage into the vessel wall is found. The minimum damage zone is about the thickness of the patch material that is heated by the laser. The more ordered the tissue, the thinner the minimum zone of damage. The pulse length which minimizes the zone of damage is found to be the time for energy to diffuse across the layer. The delay time between the pulses is determined by the time for the heated layer to cool down. An optimal pulse length exists which minimizes the total time needed to weld the patch to the wall while keeping the thickness of the damaged tissue to less than 100 µm. For the case that is considered, a patch dyed with light-absorbing ICG on the side next to the vessel (thickness of the dyed layer is 60 µm), the best protocol is found to be 65-200 ms pulses applied over 2 min.

... in order to coordinate, in a structured way, the flow of resources between the links in the chain, starting with "harvesting", going through the intermediate steps of "extraction" and "conversion", up to the final step, "mixture", which are involved in the supply network. The goal of the model is to determine a distribution plan for palm, oil, biodiesel, and diesel throughout the chain, along with a production and inventory plan and a capacity increase plan for biorefineries, in a way that minimizes the total cost of the production chain over a predefined planning horizon. The application of the model results in a projection to the year 2043 showing the behavior of the chain, specifically the soil requirements for such production levels.

This research aimed to analyze the model of empowering dry-land farmers in Central Java, the actors involved, the constraints faced, the impact, and the level of effectiveness. The study used two approaches, qualitative and quantitative: in the qualitative approach, data were analyzed using an interactive model, while the quantitative analysis was carried out using cost-benefit analysis. The results showed that CSR was carried out through the assistance of technical consultants applying the concept of "one product, one village"; the institutions involved include SOEs (State-Owned Enterprises), the Provincial Government, Local Government, Village Government, private companies, and the community. External constraints came from the cultural differences between government and private organizations as well as from unpredictable extreme weather, while internal constraints derived from the farmers' level of knowledge. The ROI calculation showed that planting horticultural commodities was profitable.

Abstract descriptions of how curricula are structured and run. The American National Standards Institute (ANSI) MedBiquitous Curriculum Inventory Standard provides a technical syntax through which a wide range of different curricula can be expressed and subsequently compared and analyzed. This standard has the potential to shift curriculum mapping and reporting from a somewhat disjointed and institution-specific undertaking to something that is shared among multiple medical schools and across whole medical education systems. Given the current explosion of different models of curricula (time-free, competency-based, socially accountable, distributed, accelerated, etc.), the ability to consider this diversity using a common model has particular value in medical education management and scholarship. This article describes the development and structure of the Curriculum Inventory Standard as a way of standardizing the modeling of different curricula for audit, evaluation and research purposes. It also considers the strengths and limitations of the current standard and the implications for a medical education world in which this level of commonality, precision, and accountability for curricular practice is the norm rather than the exception.

This report describes the Pacific Northwest Laboratory`s (PNL`s) evaluation of the Washington State Energy Code Program (WSECP). In 1990, the Washington State Legislature passed a residential energy efficiency code to be effective July 1, 1992. Bonneville supported passage and implementation of the code to ensure that new residences in the State of Washington were as energy efficient as economically feasible. The Washington State Energy Office (WSEO) is conducting the WSECP for Bonneville to support code implementation. This support takes several forms, including providing training to code enforcement officials, technical support both in the field and through telephone "hot lines," and computerized tools to review house plans for code compliance. WSEO began implementing the WSECP in 1992, prior to the effective date of the new code. This first phase of the WSECP was the subject of an earlier process evaluation conducted by PNL. From that evaluation PNL found that most new homes being built immediately after the code went into effect were "grandfathered" under the old code. The training program for the new code was in place and sessions were being attended by the jurisdictions, but it was too early to determine if the training was effective in improving code compliance and easing the transition to the new energy code. That is the subject of this evaluation.

The University of Alabama's Graduate Geropsychology Education program (GGE) was conceived and implemented in the years prior to the design of the Pike's Peak Model (PPM) of geropsychology training. The GGE program provides a unique opportunity to evaluate the PPM, and this paper outlines the GGE program in the framework of the model. Three primary goals defined the GGE program: recruitment and retention of students in the geropsychology program, a doctoral level interdisciplinary class, and a set of clinical rotations in urban and rural sites. Outcomes were promising, indicating that geropsychology students were able to provide services with positive outcomes to underserved older adults in primary care settings and in a legal clinic, students from several disciplines rated the course very highly, and psychology students indicated that they were likely to continue in the field of geriatric care. Participating students have gone on to careers in geropsychology. Findings from this program support the design of the Pike's Peak Model, and provide support for broader implementation of similar training programs.

The paper considers a variable length Markov chain model associated with a group of stationary processes that share the same context tree but potentially different conditional probabilities. We propose a new model selection and estimation method, and develop oracle inequalities and model selection properties for the estimator. These results also provide conditions under which the use of the group structure can lead to improvements in the overall estimation. Our work is also motivated by two methodological applications: discrete stochastic dynamic programming and dynamic discrete choice models. We analyze the uniform estimation of the value function for dynamic programming and the uniform estimation of average dynamic marginal effects for dynamic discrete choice models, accounting for possible imperfect model selection. We also derive the typical behavior of our estimator when applied to polynomially $\beta$-mixing stochastic processes. For parametric models, we derive uniform rates of convergence for the estimation...

Hazardous materials transportation is an important and pressing public safety issue. Based on the shortest path model, this paper presents a fuzzy multi-objective programming model that minimizes transportation risk to life, travel time, and fuel consumption. First, we present the risk model, the travel time model, and the fuel consumption model. Furthermore, we formulate a chance-constrained programming model within the framework of credibility theory, in which the lengths of arcs in the transportation network are assumed to be fuzzy variables. A hybrid intelligent algorithm integrating fuzzy simulation and a genetic algorithm is designed to find a satisfactory solution. Finally, numerical examples are given to demonstrate the efficiency of the proposed model and algorithm.

Optima is a software package for modeling HIV epidemics and interventions that we developed to address practical policy and program problems encountered by funders, governments, health planners, and program implementers. Optima's key feature is its ability to perform resource optimization to meet strategic HIV objectives, including HIV-related financial commitment projections and health economic assessments. Specifically, Optima allows users to choose a set of objectives (such as minimizing new infections, minimizing HIV-related deaths, and/or minimizing long-term financial commitments) and then determine the optimal resource allocation (and thus program coverage levels) for meeting those objectives. These optimizations are based on the following: calibrations to epidemiological data; assumptions about the costs of program implementation and the corresponding coverage levels; and the effects of these programs on clinical, behavioral, and other epidemiological outcomes. Optima is flexible for which population groups (specified by behavioral, epidemiological, and/or geographical factors) and which HIV programs are modeled, the amount of input data used, and the types of outputs generated. Here, we introduce this model and compare it with existing HIV models that have been used previously to inform decisions about HIV program funding and coverage targets. Optima has already been used in more than 20 countries, and there is increasing demand from stakeholders to have a tool that can perform evidence-based HIV epidemic analyses, revise and prioritize national strategies based on available resources, set program coverage targets, amend subnational program implementation plans, and inform the investment strategies of governments and their funding partners.

Graphics processing units and similar accelerators have been intensively used in general purpose computations for several years. In the last decade, GPU architecture and organization changed dramatically to support an ever-increasing demand for computing power. Along with changes in hardware, novel programming models have been proposed, such as NVIDIA's Compute Unified Device Architecture (CUDA) and the Khronos group's Open Computing Language (OpenCL). Although numerous commercial and scientific applications have been developed using these two models, they still pose a significant challenge for less experienced users. There are users from various scientific and engineering communities who would like to speed up their applications without the need to deeply understand a low-level programming model and the underlying hardware. In 2011, the OpenACC programming model was launched. Much like OpenMP for multicore processors, OpenACC is a high-level, directive-based programming model for manycore processors like GPUs. This paper presents an analysis of the OpenACC programming model and its applicability in typical domains like image processing. Three simple image processing algorithms have been implemented for execution on the GPU with OpenACC. The results were compared with their sequential counterparts and are briefly discussed.

The high performance computing community has experienced an explosive improvement in distributed-shared memory hardware. Driven by increasing real-world problem complexity, this explosion has ushered in vast numbers of new systems. Each new system presents new challenges to programmers and application developers. Part of the challenge is adapting to new architectures with new performance characteristics. Different vendors release systems with widely varying architectures that perform differently in different situations. Furthermore, since vendors need only provide a single performance number (total MFLOPS, typically for a single benchmark), they only have strong incentive initially to optimize the API of their choice. Consequently, only a fraction of the available APIs are well optimized on most systems. This causes issues porting and writing maintainable software, let alone issues for programmers burdened with mastering each new API as it is released. Also, programmers wishing to use a certain machine must choose their API based on the underlying hardware instead of the application. This thesis argues that a flexible, extensible translator for distributed-shared memory APIs can help address some of these issues. For example, a translator might take as input code in one API and output an equivalent program in another. Such a translator could provide instant porting for applications to new systems that do not support the application's library or language natively. While open-source APIs are abundant, they do not perform optimally everywhere. A translator would also allow performance testing using a single base code translated to a number of different APIs. Most significantly, this type of translator frees programmers to select the most appropriate API for a given application based on the application (and developer) itself instead of the underlying hardware.

Walsh et al. (2012) emphasized the importance of obtaining evidence to assess the effects of management actions on state variables relevant to objectives of conservation programs. They focused on malleefowl Leipoa ocellata, ground-dwelling Australian megapodes listed as vulnerable. They noted that although fox Vulpes vulpes baiting is the main management action used in malleefowl conservation throughout southern Australia, evidence of the effectiveness of this action is limited and currently debated. Walsh et al. (2012) then used data from 64 sites monitored for malleefowl and foxes over 23 years to assess key functional relationships relevant to fox control as a conservation action for malleefowl. In one set of analyses, Walsh et al. (2012) focused on two relationships: fox baiting investment versus fox presence, and fox presence versus malleefowl population size and rate of population change. Results led to the counterintuitive conclusion that increases in investments in fox control produced slight decreases in malleefowl population size and growth. In a second set of analyses, Walsh et al. (2012) directly assessed the relationship between investment in fox baiting and malleefowl population size and rate of population change. This set of analyses showed no significant relationship between investment in fox population control and malleefowl population growth. Both sets of analyses benefited from the incorporation of key environmental covariates hypothesized to influence these management relationships. Walsh et al. (2012) concluded that "in most situations, malleefowl conservation did not effectively benefit from fox baiting at current levels of investment." In this commentary, I discuss the work of Walsh et al. (2012) using the conceptual framework of structured decision making (SDM). In doing so, I accept their analytic results and associated conclusions as accurate and discuss basic ideas about evidence, conservation and limits to management.

In order to balance the temporal-spatial distribution of urban traffic flow, a model is established for combined urban traffic signal control and traffic flow guidance. In consideration of the wide use of fixed signal control at intersections, traffic assignment under traffic flow guidance, and the dynamic characteristics of urban traffic management, a tri-level programming model is presented. To reflect the impact of intersection delay on traffic assignment, the lower level model is a modified user equilibrium model. The middle level model, which contains several definitional constraints for different phase modes, is built for traffic signal control optimization. To address the problem of tidal lane management, the upper level model is built on nonlinear 0-1 integer programming. A heuristic iterative optimization algorithm (HIOA) is set up to solve the tri-level programming model: the lower level model is solved by the method of successive averages (MSA), the middle level model by the non-dominated sorting genetic algorithm II (NSGA II), and the upper level model by a genetic algorithm (GA). A case study is presented to show the efficiency and applicability of the proposed modelling and computing method.

Accounting has been faced with a severe shortage in the supply of qualified doctoral faculty. Drawing upon the international mobility of foreign scholars and the spirit of the international medical graduate program, this article suggests a model to fill the demand in accounting doctoral faculty. The underlying assumption of the suggested model is…

... by youths after high school graduation. It is assumed that the decision is taken year by year, and it is analyzed in a discrete choice dynamic programming model. In this forward-looking behavioral model, it is shown that a small bonus would remove interruptions of the educational careers just after high...
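
In a discrete choice dynamic programming model of this kind, the year-by-year schooling decision is governed by a Bellman equation of the standard form (written generically, not in the paper's notation):

    V_t(s_t) = \max_{a \in A(s_t)} \left\{ u(s_t, a) + \beta \, \mathbb{E}\left[ V_{t+1}(s_{t+1}) \mid s_t, a \right] \right\},

so a bonus for uninterrupted enrollment enters the flow utility u(s_t, a) of the "continue education" choice and, for a forward-looking youth, raises the value of continuing.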

The twofold purpose of this article is to highlight the importance of fostering social competence within inclusive preschool programs and to describe a model for training teachers in research-based social facilitation strategies so as to promote social interaction between children with and without disabilities. This model was developed to address…

High-stakes standardized student assessments are increasingly used in value-added evaluation models to connect teacher performance to P-12 student learning. These assessments are also being used to evaluate teacher preparation programs, despite validity and reliability threats. A more rational model linking student performance to candidates who…

An alternative management model was implemented to increase teacher productivity in an institutional school for 185 mildly to severely mentally retarded children. The model included three components: a management structure that allowed for problem solving while still motivating staff; an inservice program for staff; and an evaluation system to…

The Predictive Microbiology Program (PMP) is based on the fact that most bacterial behaviors are reproducible and can be quantified by characterizing the environmental factors that affect growth, survival, and inactivation using mathematical modeling. The contents of PMP, a collection of models, are ...