Sample records for processing system development

Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. Educators therefore need to impart this crucial skill to students. However, Situational Method Engineering (SME) is an inherently complex process that…

An effort was undertaken to perform simple digital processing tasks using pre-existing general-purpose digital computers. An experimental software package, LIGMALS, was obtained and modified for this purpose. The resulting software permits basic processing tasks to be performed, including level slicing, gray mapping, and ratio processing. The experience gained in this project indicates a possible direction that other developing countries may follow to obtain digital processing capabilities.

The Spacelab Data Processing Facility (SLDPF) is an integral part of the Space Shuttle data network for missions that involve attached scientific payloads. Expert system prototypes were developed to aid in the quality assurance of Spacelab and/or Attached Shuttle Payload processed telemetry data. Two expert systems, the Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS), were developed to determine their feasibility and potential in the quality assurance of processed telemetry data. The capabilities and performance of these systems are discussed.

Systems engineering is the discipline of designing and optimizing systems. This paper reviews the systems engineering process and indicates how it can be applied in the development of advanced human support systems. Systems engineering develops the performance requirements, subsystem specifications, and detailed designs needed to construct a desired system. Systems design is difficult, requiring both art and science and balancing human and technical considerations. The essential systems engineering activity is trading off and compromising between competing objectives such as performance and cost, schedule and risk. Systems engineering is not a complete, independent process; it usually supports a system development project. This review emphasizes the NASA project management process as described in NASA Procedural Requirement (NPR) 7120.5B. The process is a top-down, phased approach that includes the most fundamental activities of systems engineering: requirements definition, systems analysis, and design. NPR 7120.5B also requires projects to perform the engineering analyses needed to ensure that the system will operate correctly with regard to reliability, safety, risk, cost, and human factors. We review the system development project process, the standard systems engineering design methodology, and some of the specialized systems analysis techniques, and discuss how they could apply to advanced human support system development. The purpose of advanced system development is not directly to supply human space flight hardware, but rather to provide superior candidate systems that will be selected for implementation by future missions. The most direct application of systems engineering is in guiding the development of prototype and flight experiment hardware. However, anticipatory systems engineering of possible future flight systems would be useful in identifying the most promising development projects.

A diagnostic expert system for tea processing, which can infer the cause of defects in processed tea, was developed to contribute to the improvement of tea processing. The system, which consists of several programs, can be used over the Internet. The inference engine, the core of the system, adopts a production system, an approach widely used in artificial intelligence, and is coded in Prolog, an artificial-intelligence-oriented language. At present, 176 inference rules have been registered in the system. The system will infer more accurately as more rules are added.
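The production-system style of inference described above can be illustrated with a minimal forward-chaining sketch. The rule and symptom names below are hypothetical examples for illustration, not the 176 rules of the actual system:

```python
# Minimal production-system sketch (hypothetical rules, not the real system's).
# Each rule maps a set of observed symptoms to a presumed defect cause.
RULES = [
    ({"burnt smell", "high drying temperature"}, "over-firing during final drying"),
    ({"reddish leaf", "long withering time"}, "excessive withering"),
]

def diagnose(observations, rules):
    """Fire every rule whose conditions are all present in the observations."""
    facts = set(observations)
    return [cause for conditions, cause in rules if conditions <= facts]
```

A Prolog implementation would express each rule as a clause and let unification do the matching; the set-inclusion test above plays the same role.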

The developing central nervous system (CNS) is the organ system most frequently observed to exhibit congenital abnormalities. While the developing CNS lacks a blood-brain barrier, the characteristics of known teratogens indicate that differential doses to the developing vs. mature brain are not the major factor in differential sensitivity. Instead, most agents seem to act on processes that occur only during development. Thus, it appears that the susceptibility of the developing brain compared to the mature one depends to a great extent on the presence of processes sensitive to disruption. Yet cell proliferation, migration, and differentiation characterize many other developing organs, so the difference between the CNS and other organs must depend on other properties of the developing CNS. The most important of these is probably the fact that nervous system development takes much longer than development of other organs, making it subject to injury over a longer period. PMID:7925182

Spacelab Data Processing Facility (SLDPF) expert system prototypes have been developed to assist in the quality assurance of Spacelab and/or Attached Shuttle Payload (ASP) processed telemetry data. SLDPF functions include the capturing, quality monitoring, processing, accounting, and forwarding of mission data to various user facilities. Prototypes for the two SLDPF functional elements, the Spacelab Output Processing System and the Spacelab Input Processing Element, are described. The prototypes have produced beneficial results including an increase in analyst productivity, a decrease in the burden of tedious analyses, the consistent evaluation of data, and the provision of concise historical records.

This paper describes a system that directly supports a design process in a mechanical domain. The system is based on a thinking-process development diagram that draws distinctions between requirements, tasks, solutions, and implementation, which enables designers to expand and deepen their design thinking. The system provides five main functions that designers require in each phase of the proposed design process: (1) thinking-process description support, which enables designers to describe their thoughts; (2) creativity support by term association with thesauri; (3) timely display of design knowledge, including know-how obtained through earlier failures, general design theories, standard-parts data, and past designs; (4) design problem-solving support using 46 kinds of thinking operations; and (5) proper technology transfer support, which accumulates not only design conclusions but also the design process. Though this system is applied to mechanical engineering as the first target domain, it can easily be expanded to many other domains such as architecture and electrical engineering.

Challenges of processing metal-containing materials need to be addressed before this technology can be applied. The behavior of metal-containing materials in coater/developer processing, including the coating process, the develop process, and tool metal contamination, was studied using the CLEAN TRACK™ LITHIUS Pro™ Z (Tokyo Electron Limited). Through this work, coating uniformity and coating film defectivity were studied. Metal-containing material performance was comparable to that of conventional materials. In particular, a new dispense system (NDS) demonstrated up to an 80% reduction in coating defects for metal-containing materials. As for processed-wafer metal contamination, coated-wafer metal contamination of less than 1.0E10 atoms/cm2 was achieved with 3 materials. Post-develop metal contamination of less than 1.0E10 atoms/cm2 was also achieved with 2 materials. Furthermore, through the metal defect study, metal residues and metal contamination were reduced by developer rinse optimization.

Washington state's public workers' compensation system has had a formal process for developing and implementing evidence-based clinical practice guidelines since 2007. Collaborating with the Industrial Insurance Medical Advisory Committee and clinicians from the medical community, the Office of the Medical Director has provided the leadership and staff support necessary to develop guidelines that have improved outcomes and reduced the number of potentially harmful procedures. Guidelines are selected according to a prioritization schema and follow a development process consistent with that of the national Institute of Medicine. Evaluation criteria are also applied. Guidelines continue to be developed to provide clinical recommendations for optimizing care and reducing risk of harm.

The aim of the research described in this article was to evaluate the usefulness of a university information system prior to its reorganization. The study was conducted among representatives of all stakeholders and system users: candidates, students, and university authorities. System users expressed a need to change the approach to the system's construction from purely informational to process-oriented, which is consistent with the current process approach in systems design, reinforced by the fashionable service-oriented architecture (SOA). This thread was developed by conducting literature research and an analysis of best practices in student information systems. As a result, the processes whose implementation may assist the university system were selected and described. The research results can be used by system designers to improve the system.

This article presents the development of an image-based sun position sensor and an algorithm for aiming at the Sun precisely using image processing. Four-quadrant light sensors and bar-shadow photo sensors have been used to detect the Sun's position in past years. Nevertheless, neither of them can maintain high accuracy under low-irradiation conditions. Using the image-based Sun position sensor with image processing can address this drawback. To verify the performance of the Sun-tracking system, which includes an image-based Sun position sensor and a tracking controller with an embedded image processing algorithm, we established a Sun image tracking platform and performed testing in the laboratory; the results show that the proposed Sun-tracking system had the capability to overcome the problem of unstable tracking in cloudy weather and achieved a tracking accuracy of 0.04°. PMID:23615582
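One common way an image-based sensor estimates the Sun's position is an intensity-weighted centroid of pixels above a brightness threshold. The sketch below is a generic illustration of that idea under that assumption, not the paper's actual algorithm:

```python
def sun_centroid(image, threshold):
    """Intensity-weighted centroid (x, y) of pixels at or above threshold.

    `image` is a row-major 2-D grid of brightness values; returns None
    if no pixel clears the threshold (e.g., heavy cloud cover).
    """
    total = sx = sy = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            if v >= threshold:
                total += v
                sx += v * x
                sy += v * y
    if total == 0:
        return None
    return (sx / total, sy / total)
```

The tracking controller would then convert the pixel offset from the optical center into azimuth/elevation corrections for the mount.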

A processing system capable of producing solar cell junctions by ion implantation followed by pulsed electron beam annealing was developed and constructed. The machine was to be capable of processing 4-inch diameter single-crystal wafers at a rate of 10^7 wafers per year. A microcomputer-controlled pulsed electron beam annealer with a vacuum-interlocked wafer transport system was designed, built, and demonstrated to produce solar cell junctions on 4-inch wafers with an AM1 efficiency of 12%. Experiments showed that a non-mass-analyzed (NMA) ion beam could implant 10 keV phosphorus dopant to form solar cell junctions which were equivalent to mass-analyzed implants. An NMA ion implanter, compatible with the pulsed electron beam annealer and wafer transport system, was designed in detail but was not built because of program termination.

A solar cell junction processing system was developed and fabricated. A pulsed electron beam machine for four-inch wafers was assembled and tested; wafers were successfully pulsed, and solar cells were fabricated. Assembly of the transport locks was completed. The transport was operated successfully, but not with sufficient reproducibility. An experimental test facility to examine potential scale-up problems associated with the proposed ion implanter design was constructed and operated. Cells were implanted and found to have efficiency identical to the normal Spire implant process.

This contribution presents a rapid prototyping approach for the real-time demonstration of image processing algorithms. As an example, EADS/LFK has developed a basic IR target tracking system implementing this approach. Traditionally, in research and industry, time-independent simulation of image processing algorithms is performed on a host computer. This method is good for demonstrating the algorithms' capabilities. Rarely done is a time-dependent simulation or even a real-time demonstration on a target platform to prove the real-time capabilities. In 1D signal processing applications, time-dependent simulation and real-time demonstration have already been used for quite a while. For time-dependent simulation, Simulink from The MathWorks has become established as an industry standard. Combined with The MathWorks' Real-Time Workshop, the simulation model can be transferred to a real-time target processor. The executable is generated automatically by the Real-Time Workshop directly out of the simulation model. In 2D signal processing applications like image processing, The MathWorks' Matlab is commonly used for time-independent simulation. To achieve time-dependent simulation and real-time demonstration capabilities, the algorithms can be transferred to Simulink, which in fact runs on top of Matlab. Additionally, to increase performance, Simulink models or parts of them can be transferred to Xilinx FPGAs using Xilinx' System Generator. With a single model and the automatic workflow, both time-dependent simulation and real-time demonstration are covered, leading to an easy and flexible rapid prototyping approach. EADS/LFK is going to use this approach for a wider spectrum of IR image processing applications like automatic target recognition, image-based navigation, or imaging laser radar target recognition.

The purpose of this directory is to provide a basis for market development activities through a location listing of key trade associations, trade periodicals, and key firms for three target groups. Potential industrial users and potential IPH system designers were identified as the prime targets for market development activities. The bulk of the directory is a listing of these two groups. The third group, solar IPH equipment manufacturers, was included to provide an information source for potential industrial users and potential IPH system designers. Trade associations and their publications are listed for selected four-digit Standard Industrial Code (SIC) industries. Since industries requiring relatively lower temperature process heat probably will comprise most of the near-term market for solar IPH systems, the 80 SICs included in this chapter have process temperature requirements less than 350°F. Some key statistics and a location list of the largest plants (according to number of employees) in each state are included for 15 of the 80 SICs. Architectural/engineering and consulting firms known to have solar experience are listed. Professional associations and periodicals to which information on solar IPH systems may be directed are also included. Solar equipment manufacturers and their associations are listed. The listing is based on the SERI Solar Energy Information Data Base (SEIDB).

Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
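The Petri-net execution semantics underlying such process models can be sketched in a few lines: a transition is enabled when its input places hold enough tokens, and firing it consumes input tokens and produces output tokens. The order-approval transition below is a hypothetical example, not part of the chapter's toolset:

```python
# A transition is a pair (pre, post) of {place: token-count} dicts.
APPROVE = ({"received": 1}, {"approved": 1})  # hypothetical example transition

def enabled(marking, transition):
    """True if every input place holds at least the required tokens."""
    pre, _ = transition
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, transition):
    """Return the new marking after consuming pre- and producing post-tokens."""
    pre, post = transition
    m = dict(marking)
    for p, n in pre.items():
        m[p] = m[p] - n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m
```

XML nets extend this basic token game by attaching XML documents to tokens and filter expressions to arcs.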

A set of data processing software for processing NBI spectroscopic data is presented in this paper. For better and more systematic management and querying, these data are managed uniformly by the NBI data server. The data processing software offers functions for uploading original and analytic beam spectral data to the data server manually and automatically, querying and downloading all the NBI data, and handling local LZO data. The software consists of a server program and a client program. The server software is programmed in C/C++ under a CentOS development environment. The client software is developed on a VC 6.0 platform, which offers convenient operational human interfaces. The network communications between the server and the client are based on TCP. With the help of this software, the NBI spectroscopic analysis system realizes unattended automatic operation, and the clear interface also makes it much more convenient to provide beam intensity distribution data and beam power data to operators for operational decision-making. Supported by the National Natural Science Foundation of China (No. 11075183) and the Chinese Academy of Sciences Knowledge Innovation
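The TCP query exchange between such a client and server can be sketched as a line-oriented request/response protocol. The sketch below uses Python's standard library for brevity (the actual system is C/C++ and VC 6.0), and the message format and shot identifier are invented for illustration:

```python
import socket
import socketserver
import threading

class QueryHandler(socketserver.StreamRequestHandler):
    """Answer one line-oriented query per connection (format is hypothetical)."""
    def handle(self):
        shot = self.rfile.readline().strip().decode()
        # A real server would look up spectral data for this shot identifier.
        self.wfile.write(f"DATA:{shot}\n".encode())

def query(host, port, shot):
    """Client side: send a shot identifier, read one reply line."""
    with socket.create_connection((host, port)) as s:
        s.sendall(shot.encode() + b"\n")
        return s.makefile().readline().strip()

server = socketserver.TCPServer(("127.0.0.1", 0), QueryHandler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()
host, port = server.server_address
reply = query(host, port, "shot_1234")
server.shutdown()
```

A production server would add framing for binary spectral payloads and authentication; the request/reply shape stays the same.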

One of the main heritage tools used in scientific and engineering data spectrum analysis is the Fourier Integral Transform and its high-performance digital equivalent, the Fast Fourier Transform (FFT). The Fourier view of nonlinear mechanics that has existed for a long time, and the associated FFT (a fairly recent development), carry strong a priori assumptions about the source data, such as linearity and stationarity. Natural phenomena measurements are essentially nonlinear and nonstationary. A very recent development at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC), known as the Hilbert-Huang Transform (HHT), proposes a novel approach to the solution of the nonlinear class of spectrum analysis problems. Using the Empirical Mode Decomposition (EMD) followed by the Hilbert Transform (HT) of the empirical decomposition data, the HHT allows spectrum analysis of nonlinear and nonstationary data through an engineering a posteriori data processing approach based on the EMD algorithm. This results in a non-constrained decomposition of a source real-valued data vector into a finite set of Intrinsic Mode Functions (IMF) that can be further analyzed for spectrum interpretation by the classical Hilbert Transform. This paper describes phase one of the development of a new engineering tool, the HHT Data Processing System (HHTDPS). The HHTDPS allows applying the HHT to a data vector in a fashion similar to the heritage FFT. It is a generic, low-cost, high-performance personal computer (PC) based system that implements the HHT computational algorithms in a user-friendly, file-driven environment. This paper also presents a quantitative analysis for a complex waveform data sample, a summary of technology commercialization efforts, and the lessons learned from this new technology development.
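The Hilbert-spectral step of the HHT (applied to each IMF after EMD) can be illustrated with the standard discrete Hilbert transform: form the analytic signal by zeroing negative frequencies in the DFT, then read off envelope and phase. This is a generic textbook construction, not the HHTDPS code; the O(n²) DFT keeps the sketch dependency-free:

```python
import cmath
import math

def analytic_signal(x):
    """Analytic signal via the DFT: zero negative frequencies, double positive ones."""
    n = len(x)
    X = [sum(x[k] * cmath.exp(-2j * math.pi * j * k / n) for k in range(n))
         for j in range(n)]
    # Frequency-domain weights for the one-sided (analytic) spectrum.
    h = [0.0] * n
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        for j in range(1, n // 2):
            h[j] = 2.0
    else:
        for j in range(1, (n + 1) // 2):
            h[j] = 2.0
    Z = [hj * Xj for hj, Xj in zip(h, X)]
    # Inverse DFT of the one-sided spectrum yields the analytic signal.
    return [sum(Z[j] * cmath.exp(2j * math.pi * j * k / n) for j in range(n)) / n
            for k in range(n)]

# For a pure cosine with an integer number of cycles, the envelope |z| is 1.
n = 64
x = [math.cos(2 * math.pi * 4 * k / n) for k in range(n)]
envelope = [abs(z) for z in analytic_signal(x)]
```

Instantaneous frequency, the quantity the Hilbert spectrum plots, is the time derivative of the analytic signal's phase.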

Techniques for producing model metal-metal oxide systems for the purpose of evaluating the results of processing such systems in the low-gravity environment afforded by a drop tower facility are described. Because of the lack of success in producing suitable materials samples and techniques for processing in the 3.5 seconds available, the program was discontinued.

There are over 40 subsystems being developed for the future SLS and Orion launch systems at Kennedy Space Center. These subsystems, developed by the Kennedy Space Center Engineering Directorate, follow a comprehensive design process which requires several different product deliverables during each phase of each subsystem. This paper describes this process and gives an example of where the process has been applied.

An expert system which captures the expertise of workshop technicians in the drilling domain was developed. The expert system is aimed at novice technicians who know how to operate the machines but have not acquired the decision making skills that are gained with experience. This paper describes the domain background and the stages of development of the expert system.

The Future Armored Resupply Vehicle (FARV) will be the companion ammunition resupply vehicle to the Advanced Field Artillery System (AFAS). These systems are currently being investigated by the US Army for future acquisition. The FARV will sustain the AFAS with ammunition and fuel and will significantly increase capabilities over current resupply vehicles. Currently, ammunition is transferred to field artillery almost entirely by hand. The level of automation to be included in the FARV is still under consideration. At the request of the US Army's Project Manager, AFAS/FARV, Oak Ridge National Laboratory (ORNL) identified and evaluated various concepts for the automated upload, processing, storage, and delivery equipment for the FARV. ORNL, working with the sponsor, established basic requirements and assumptions for concept development and the methodology for concept selection. A preliminary concept has been selected, and the associated critical technologies have been identified. ORNL has provided technology demonstrations of many of these critical technologies. A technology demonstrator which incorporates all individual components into a total process demonstration is planned for late FY 1995.

A review of existing information pertaining to spacecraft power processing systems and equipment was accomplished with a view towards applicability to the modularization of multi-kilowatt power processors. Power requirements for future spacecraft were determined from the NASA mission model-shuttle systems payload data study, which provided the limits for modular power equipment capabilities. Three power processing systems were compared against evaluation criteria to select the system best suited for modularity. The shunt-regulated direct energy transfer system was selected by this analysis for a conceptual design effort which produced equipment specifications, schematics, envelope drawings, and power module configurations.

The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g., missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.

This paper explains the goals and challenges of NASA's risk communication efforts and how the Aerospace Systems Engineering Process (ASEP) was used to map the risk communication strategy used at the Jet Propulsion Laboratory to achieve these goals.

Tracker is an object-tracking and image-processing program designed and developed at the NASA Lewis Research Center to help with the analysis of images generated by microgravity combustion and fluid physics experiments. Experiments are often recorded on film or videotape for analysis later. Tracker automates the process of examining each frame of the recorded experiment, performing image-processing operations to bring out the desired detail, and recording the positions of the objects of interest. It can load sequences of images from disk files or acquire images (via a frame grabber) from film transports, videotape, laser disks, or a live camera. Tracker controls the image source to automatically advance to the next frame. It can employ a large array of image-processing operations to enhance the detail of the acquired images and can analyze an arbitrarily large number of objects simultaneously. Several different tracking algorithms are available, including conventional threshold and correlation-based techniques, and more esoteric procedures such as "snake" tracking and automated recognition of character data in the image. The Tracker software was written to be operated by researchers, so every attempt was made to make the software as user friendly and self-explanatory as possible. Tracker is used by most of the microgravity combustion and fluid physics experiments performed by Lewis, and by visiting researchers. This includes experiments performed on the space shuttles, Mir, sounding rockets, zero-g research airplanes, drop towers, and ground-based laboratories. This software automates the analysis of a flame's or liquid's physical parameters such as position, velocity, acceleration, size, shape, intensity characteristics, color, and centroid, as well as a number of other measurements. It can perform these operations on multiple objects simultaneously. Another key feature of Tracker is that it performs optical character recognition (OCR). This feature is useful in
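Correlation-based tracking of the kind mentioned above can be sketched as template matching: slide a reference patch over the frame and keep the position with the smallest sum of squared differences (SSD). The toy data below are illustrative, not Tracker's actual code:

```python
def track(frame, template):
    """Return the (row, col) where template best matches frame, by minimum SSD."""
    th, tw = len(template), len(template[0])
    best_ssd, best_pos = None, None
    for r in range(len(frame) - th + 1):
        for c in range(len(frame[0]) - tw + 1):
            ssd = sum((frame[r + i][c + j] - template[i][j]) ** 2
                      for i in range(th) for j in range(tw))
            if best_ssd is None or ssd < best_ssd:
                best_ssd, best_pos = ssd, (r, c)
    return best_pos
```

Threshold-based tracking, the simpler alternative Tracker also offers, instead binarizes each frame and follows the centroid of the bright region.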

An overview is given of seven process development activities which were presented at this session. Pulsed excimer laser processing of photovoltaic cells was presented. A different pulsed excimer laser annealing approach was described using a 50 W laser. Diffusion barrier research focused on lowering the chemical reactivity of amorphous thin films on silicon. In another effort, adherent and conductive films were successfully achieved. Other efforts were aimed at achieving a simultaneous front and back junction. Microwave-enhanced plasma deposition experiments were performed. An updated version of the Solar Array Manufacturing Industry Costing Standards (SAMICS) was presented, along with a life cycle cost analysis of high-efficiency cells. The last presentation was on the evaluation of the ethyl vinyl acetate encapsulation system.

There are over 40 subsystems being developed for the future SLS and Orion launch systems at Kennedy Space Center. These subsystems are developed by the Kennedy Space Center Engineering Directorate, which follows a comprehensive design process requiring several different product deliverables during each phase of each subsystem. This presentation describes this process with examples of where the process has been applied.

The purpose of this program is to demonstrate the technical readiness of a cost-effective process sequence that has the potential for the production of flat-plate photovoltaic modules meeting the 1986 price goal of $0.70 or less per peak watt. Program efforts included: a preliminary design review; preliminary cell fabrication using the proposed process sequence; verification of sandblasting back cleanup; a study of resist parameters; evaluation of the pull strength of the proposed metallization; measurement of the contact resistance of electroless Ni contacts; optimization of process parameters; design of the MEPSDU module; identification and testing of insulator tapes; development of a lamination process sequence; identification of, discussions and demonstrations with, and visits to candidate equipment vendors; and evaluation of proposals for a tabbing and stringing machine.

Nowadays, when there is a need to introduce various innovations into the educational process, most efforts are aimed at simplifying the learning process. To that end, electronic textbooks, testing systems, and other software are being developed. Most of them are intended to run on personal computers with limited mobility. Smart education is…

Design work for a photovoltaic module, fabricated using single-crystal silicon dendritic web sheet material, resulted in the identification of a surface treatment for the module glass superstrate which improved module efficiencies. A final solar module environmental test, a simulated hailstone impact test, was conducted on full-size module superstrates to verify that the module's tempered glass superstrate can withstand specified hailstone impacts near the corners and edges of the module. Process sequence design work on the selective metallization process, liquid dopant investigation, dry processing, and antireflective/photoresist application technique tasks, and the optimum thickness for Ti/Pd, are discussed. A noncontact cleaning method for raw web cleaning was identified, and antireflective and photoresist coatings for the dendritic webs were selected. The design of a cell string conveyor, an interconnect feed system, and a rolling ultrasonic spot bonding head, and the identification of the optimal commercially available programmable control system, are also discussed. An economic analysis to assess cost goals of the process sequence is also given.

gathering from the user. Further, methodologies have been developed to generate DEVS models from BPMN/BPEL-based and message-based requirement specifications…[27] 3. BPMN/BPEL-based system specifications: Business Process Modeling Notation (BPMN) [bpm] or Business Process Execution Language (BPEL) provide a…information is stored in .wsdl and .bpel files for BPEL but in a proprietary format for BPMN. 4. DoDAF-based requirement specifications: Department of

In this paper, I describe software product lines and why a Ground Data Processing System should use one. I also describe how to develop a software product line, using examples from an imaginary Ground Data Processing System.

This report and set of appendices are a collection of memoranda originally drafted circa 2007-2009 for the purpose of describing and detailing a models-based systems engineering approach for satisfying enterprise and system-of-systems life cycle process requirements. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. The main thrust of the material presents a rational exposé of a structured enterprise development life cycle that uses the scientific method as a framework, with further rigor added from adapting relevant portions of standard systems engineering processes. While the approach described invokes application of the Department of Defense Architectural Framework (DoDAF), it is suitable for use with other architectural description frameworks.

The state-of-the-art Carbon Dioxide (CO2) Reduction Assembly (CRA) on the International Space Station (ISS) facilitates the recovery of oxygen from metabolic CO2. The CRA utilizes the Sabatier process to produce water with methane as a byproduct. The methane is currently vented overboard as a waste product. Because the CRA relies on hydrogen for oxygen recovery, the loss of methane ultimately results in a loss of oxygen. For missions beyond low Earth orbit, it will prove essential to maximize oxygen recovery. For this purpose, NASA is exploring an integrated post-processor system to recover hydrogen from CRA methane. The post-processor, called a Plasma Pyrolysis Assembly (PPA), partially pyrolyzes methane to recover hydrogen with acetylene as a byproduct. In-flight operation of the post-processor will require a Methane Purification Assembly (MePA) and an Acetylene Separation Assembly (ASepA). Recent efforts have focused on the design, fabrication, and testing of these components. The results and conclusions of these efforts will be discussed as well as future plans.

A cost effective process sequence and machinery for the production of flat plate photovoltaic modules are described. Cells were fabricated using the process sequence which was optimized, as was a lamination procedure. Insulator tapes and edge seal material were identified and tested. Encapsulation materials were evaluated.

In Butler County, Ohio, Butler Technology and Career Development Schools (Butler Tech) firmly believes that systematic delivery of career development theory and practice integrated with academic content standards will enable students to do all of the above. Because of this, Butler Tech's Career Initiatives division delivers a countywide career…

Restructuring research objectives from a technical readiness demonstration program to an investigation of high risk, high payoff activities associated with producing photovoltaic modules using non-CZ sheet material is reported. Deletion of the module frame in favor of a frameless design, and modification in cell series parallel electrical interconnect configuration are reviewed. A baseline process sequence was identified for the fabrication of modules using the selected dendritic web sheet material, and economic evaluations of the sequence were completed.

The major component fabrication program was completed. Assembly and system testing of the pulsed electron beam annealing machine are described. The design program for the transport reached completion, and the detailed drawings were released for fabrication and procurement of the long lead time components.

Primary avionics software system; software development approach; user support and problem diagnosis; software releases and configuration; quality/productivity programs; and software development/production facilities are addressed. Also examined are the external evaluations of the IBM process.

PETC has implemented a number of advanced combustion research projects that will lead to the establishment of a broad, commercially acceptable engineering data base for the advancement of coal as the fuel of choice for boilers, furnaces, and process heaters. Vortec Corporation's Coal-Fired Combustion System for Industrial Process Heating Applications has been selected for Phase III development under contract DE-AC22-91PC91161. This advanced combustion system research program is for the development of innovative coal-fired process heaters which can be used for high temperature melting, smelting, recycling, and refining processes. The process heater concepts to be developed are based on advanced glass melting and ore smelting furnaces developed and patented by Vortec Corporation. The process heater systems to be developed have multiple use applications; however, the Phase III research effort is being focused on the development of a process heater system to be used for producing glass frits and wool fiber from boiler and incinerator ashes. The primary objective of the Phase III project is to develop and integrate all the system components, from fuel through total system controls, and then test the complete system in order to evaluate its potential marketability. The economic evaluation of commercial scale CMS processes has begun. In order to accurately estimate the cost of the primary process vessels, preliminary designs for 25, 50, and 100 ton/day systems have been started under Task 1. These data will serve as input for the life cycle cost analysis performed as part of the techno-economic evaluations. The economic evaluations of commercial CMS systems will be an integral part of the commercialization plan.

Work is being performed to develop a general-purpose systems engineering model for the AAA separation process. The work centers on the development of a new user interface for the AMUSE code and on the specification of a systems engineering model. This paper presents background information and an overview of work completed to date. (authors)

The Biospheric Information Systems Branch at NASA's Goddard Space Flight Center has developed three generations of Science Investigator-led Processing Systems for use with various remote sensing instruments. The first system is used for data from the MODIS instruments flown on NASA's Earth Observing System (EOS) Terra and Aqua spacecraft, launched in 1999 and 2002 respectively. The second generation is for the Ozone Monitoring Instrument (OMI) flying on the EOS Aura spacecraft launched in 2004. We are now developing a third generation of the system for evaluation science data processing for the Ozone Mapping and Profiler Suite (OMPS) to be flown by the NPOESS Preparatory Project (NPP) in 2006. The initial system was based on large scale proprietary hardware, operating and database systems. The current OMI system and the OMPS system being developed are based on commodity hardware, the Linux operating system, and PostgreSQL, an open source RDBMS. The new system distributes its data archive across multiple server hosts and processes jobs on multiple processor boxes. We have created several instances of this system, including one for operational processing, one for testing and reprocessing, and one for applications development and scientific analysis. Prior to receiving the first data from OMI we applied the system to reprocessing information from the Solar Backscatter Ultraviolet (SBUV) and Total Ozone Mapping Spectrometer (TOMS) instruments flown from 1978 until now. The system was able to process 25 years (108,000 orbits) of data and produce 800,000 files (400 GiB) of level 2 and level 3 products in less than a week. We will describe the lessons we have learned and tradeoffs between system design, hardware, operating systems, operational staffing, user support and operational procedures. During each generational phase, the system has become more generic and reusable. While the system is not currently shrink wrapped we believe it is to the point where it could be readily
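The reprocessing figures quoted above imply sustained throughput rates that are easy to check with back-of-envelope arithmetic. The sketch below derives per-day rates from the abstract's totals, assuming the full seven days of the "less than a week" upper bound (so the true rates were at least this high):

```python
# Throughput implied by the SBUV/TOMS reprocessing run: 25 years of
# data (108,000 orbits) producing 800,000 files (400 GiB) in under a
# week. Seven days is assumed as the conservative upper bound.

orbits = 108_000
files = 800_000
gib = 400
days = 7

orbits_per_day = orbits / days
files_per_day = files / days
mib_per_day = gib * 1024 / days

print(f"{orbits_per_day:.0f} orbits/day")   # -> 15429 orbits/day
print(f"{files_per_day:.0f} files/day")     # -> 114286 files/day
print(f"{mib_per_day:.0f} MiB/day")
```

Rates like these are what motivate distributing the archive and job processing across multiple commodity hosts, as the abstract describes.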

Silicon material research in the Republic of China (ROC) parallels its development in the electronic industry. A brief outline of the historical development in ROC silicon material research is given. Emphasis is placed on the recent Silane Project managed by the National Science Council, ROC, including project objectives, task forces, and recent accomplishments. An introduction is also given to industrialization of the key technologies developed in this project.

A brief overview is given for two software developments related to the ITS code system. These developments provide parallel processing and visualization capabilities and thus allow users to perform ITS calculations more efficiently. Timing results and a graphical example are presented to demonstrate these capabilities.

Many geoscientific problems, such as underground waste disposal, nuclear waste disposal, CO2 sequestration, geothermal energy, etc., require a numerical simulation system for prediction of ongoing processes as well as for risk and safety assessment. The governing processes are thermal heat transfer (T), hydraulic flow in multi-phase systems (H), mechanical deformation (M) and geochemical reactions (C), which interact in a complex way (THMC). The development of suitable simulation systems requires a large amount of effort for code development, verification and applications. OpenGeoSys (OGS) is an open source scientific initiative for the simulation of these THMC processes in porous media. A flexible numerical framework based on the Finite Element Method is provided and applied to the governing process equations. Due to the object- and process-oriented character of the code, functionality enhancement and code coupling with external simulators can be performed reasonably effectively. This structure also allows for distributed development, with developers at different locations contributing to the common code. The code is platform independent, accessible via the internet for development and application, and checked regularly by an automated benchmarking system.

A medical information system is an extreme case of reliance on tacit knowledge: practitioners and medical experts such as physicians draw heavily on knowledge that embodies a great deal of experience and is often not explicitly formulated. This differs from other discipline areas such as embedded engineering systems. Developing such a system critically depends on how effectively this varied knowledge is organized and integrated in implementing the system. The development process in which customers, management, engineers, and teams are involved must therefore be evaluated from this viewpoint. The existence of tacit knowledge may not be sensed well enough at the beginning of a project, yet it is necessary for project success. This paper describes these problems and how the Personal Software Process (PSP) and Team Software Process (TSP) manage them, and then discusses typical performance results. It may be said that PSP individuals and TSP teams operate as CMMI level 4 units, respectively.

Ares I Crew Launch Vehicle Upper Stage is designed and developed based on sound systems engineering principles. Systems engineering starts with the Concept of Operations and mission requirements, which in turn determine the launch system architecture and its performance requirements. The Ares I Upper Stage is designed and developed to meet these requirements. Designers depend on the support of materials, processes and manufacturing during the design, development and verification of subsystems and components. The requirements relative to reliability, safety, operability and availability are also dependent on materials availability, characterization, process maturation and vendor support. This paper discusses the roles and responsibilities of materials and manufacturing engineering during the various phases of Ares I Upper Stage development, including design and analysis, hardware development, test and verification. Emphasis is placed on how materials, processes and manufacturing support is integrated over the Upper Stage Project, both horizontally and vertically. In addition, the paper describes the approach used to ensure compliance with materials, processes, and manufacturing requirements during the project cycle, with focus on hardware systems design and development.

The SEE data processing system, developed in 1985, manages and processes test results. General information is provided on the SEE system: objectives, characteristics, basic principles, general organization, and operation. Full documentation is accessible by computer using the HELP SEE command.

Formic Acid: Development of an Analytical Method and Use as a Process Indicator in Anaerobic Systems. A Special Research Problem report presented by Sharon L. Perkins (DTIC accession AD-A250 668, 1992).

Direct osmotic concentration (DOC) has been identified as a high potential technology for recycling of wastewater to drinking water in advanced life support (ALS) systems. As a result the DOC process has been selected for a NASA Rapid Technology Development Team (RTDT) effort. The existing prototype system has been developed to Technology Readiness Level (TRL) 3. The current project focuses on advancing the development of this technology from TRL 3 to TRL 6 (appropriate for human rated testing). A new prototype of a DOC system is being designed and fabricated to address the deficiencies encountered during testing of the original system and allow the new prototype to achieve TRL 6. Background information is provided about the technologies investigated and their capabilities, results from preliminary tests, and the milestone plan and activities for the RTDT program intended to develop a second generation prototype of the DOC system.

The Office of Civilian Radioactive Waste Management (OCRWM) is executing a plan for improvement of the systems implemented to carry out its responsibilities under the Nuclear Waste Policy Act of 1982 (NWPA). As part of the plan, OCRWM is performing a systems engineering analysis of both the physical system, i.e., the Nuclear Waste Management System (NWMS), and the programmatic functions that must be accomplished to bring the physical system into being. The purpose of the program analysis is to provide a systematic identification and definition of all program functions, functional process flows, and function products necessary and sufficient to provide the physical system. The analysis resulting from this approach provides a basis for development of a comprehensive and integrated set of policies, standard practices, and procedures for the effective and efficient execution of the program. Thus, this analysis will form a basis for revising current OCRWM policies and procedures, or developing new ones as necessary. The primary purposes of this report are as follows: (1) to summarize the major functional processes and process flows that have been developed as part of the program analysis, and (2) to provide an introduction and assistance in understanding the detailed analysis information contained in the three-volume report titled The Analysis of the Program to Develop the Nuclear Waste Management System (Woods 1991a).

The Ergonomist's Role in the Weapon System Development Process in Canada. D. Beevis, Defence and Civil Institute of Environmental Medicine, Downsview; DCIEM Report No. 83-C-583 (DTIC accession AD-A145 573). In the Canadian Forces, a weapons system is defined as a composite of equipment, facilities, skills, and techniques forming a self-sufficient instrument…

TRW's Ada Process Model has proven to be key to the Command Center Processing and Display System-Replacement (CCPDS-R) project's success to date in developing over 300,000 lines of Ada source code executing in a distributed VAX VMS environment. The Ada Process Model is, in simplest terms, a … software progress metrics. This paper provides an overview of the techniques and benefits of the Ada Process Model and describes some of the experience and …

A high-throughput high-Tc SQUID fabrication process that can provide the appropriate number of SQUIDs for a 51-channel magnetocardiograph (MCG) has been developed. A new deposition system-based on a pulsed-laser-deposition technique to increase the process throughput in fabricating superconducting YBa2Cu3Oy thin films-was developed. In this system, nine superconducting thin films are successively deposited on bicrystal substrates in one deposition sequence. A mask aligner, which was customized for the bicrystal substrate, was also developed. This system enables mask alignment for the bicrystal grain boundary without the need for preprocessing to visualize it. In addition, the magnetometer pattern was designed to improve the yield for magnetometer fabrication. In this directly coupled magnetometer, four SQUIDs were connected with the same pickup coil. Accordingly, the yield of magnetometer could be enhanced by selecting the best SQUID among the four.

This paper describes a 3-D motion capture system for the quantitative evaluation of a finger-nose test using image processing. In the field of clinical medicine, qualitative and quantitative evaluation of voluntary movements is necessary for correct diagnosis of disorders. For this purpose, we have developed a 3-D measuring system with a multi-camera system. The configuration of the system is described and examples of movement data are shown for normal subjects and patients. In the finger-nose test at a fast trial speed, a discriminant analysis using Mahalanobis generalized distances shows a discriminant rate of 93% between normal subjects and spinocerebellar degeneration (SCD) patients.
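The discriminant analysis mentioned above can be sketched as follows: a feature vector is assigned to the group (normal or SCD) whose mean it is closest to in Mahalanobis distance. The 2-D features, group means, and covariances below are illustrative placeholders, not the paper's data:

```python
# Minimal two-class Mahalanobis discriminant sketch. Group statistics
# are invented for illustration (e.g. movement smoothness vs. speed);
# the paper's actual features and values are not reproduced here.

def mahalanobis_sq(x, mean, cov_inv):
    """Squared Mahalanobis distance for a 2-D feature vector."""
    d0, d1 = x[0] - mean[0], x[1] - mean[1]
    return (d0 * (cov_inv[0][0] * d0 + cov_inv[0][1] * d1)
            + d1 * (cov_inv[1][0] * d0 + cov_inv[1][1] * d1))

def inv2x2(m):
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [[ m[1][1] / det, -m[0][1] / det],
            [-m[1][0] / det,  m[0][0] / det]]

# Hypothetical group statistics.
normal = {"mean": (1.0, 1.0), "cov_inv": inv2x2([[0.2, 0.0], [0.0, 0.2]])}
scd    = {"mean": (3.0, 0.4), "cov_inv": inv2x2([[0.5, 0.0], [0.0, 0.1]])}

def classify(x):
    dn = mahalanobis_sq(x, normal["mean"], normal["cov_inv"])
    ds = mahalanobis_sq(x, scd["mean"], scd["cov_inv"])
    return "normal" if dn < ds else "SCD"

print(classify((1.2, 0.9)))  # -> normal (close to the normal-group mean)
```

Using the inverse covariance rather than raw Euclidean distance accounts for the different spread of each group's features, which is why Mahalanobis distance is the standard choice for this kind of clinical discriminant.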

Secondary use of medical big data is increasingly popular in healthcare services and clinical research. Understanding the logic behind medical big data demonstrates tendencies in hospital information technology and shows great significance for hospital information systems that are designing and expanding services. Big data has four characteristics--Volume, Variety, Velocity and Value (the 4 Vs)--that make traditional standalone systems incapable of processing these data. Apache Hadoop MapReduce is a promising software framework for developing applications that process vast amounts of data in parallel on large clusters of commodity hardware in a reliable, fault-tolerant manner. With the Hadoop framework and MapReduce application program interface (API), we can more easily develop our own MapReduce applications to run on a Hadoop framework that can scale up from a single node to thousands of machines. This paper investigates a practical case of a Hadoop-based medical big data processing system. We developed this system to intelligently process medical big data and uncover some features of hospital information system user behaviors. This paper studies user behaviors regarding various data produced by different hospital information systems for daily work. In this paper, we also built a five-node Hadoop cluster to execute distributed MapReduce algorithms. Our distributed algorithms show promise in facilitating efficient data processing with medical big data in healthcare services and clinical research compared with single nodes. Additionally, with medical big data analytics, we can design our hospital information systems to be much more intelligent and easier to use by making personalized recommendations.
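The user-behavior counting described above follows the classic MapReduce pattern: map each log record to key/value pairs, shuffle pairs by key, then reduce each group to a count. The sketch below runs the three phases in-process; in Hadoop they would be distributed across the cluster. Record fields and values are illustrative, not the paper's data:

```python
# In-process sketch of the MapReduce pattern used for counting
# user behaviors in hospital information system logs. Hadoop would
# distribute these phases; here they run serially for illustration.

from collections import defaultdict

records = [
    {"user": "nurse01", "action": "order_entry"},
    {"user": "dr_lee",  "action": "chart_review"},
    {"user": "nurse01", "action": "order_entry"},
]

def map_phase(record):
    # Emit ((user, action), 1), as a Hadoop Mapper would.
    yield (record["user"], record["action"]), 1

def shuffle(pairs):
    # Group values by key, mimicking Hadoop's shuffle/sort stage.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Sum the per-key values, as a Hadoop Reducer would.
    return key, sum(values)

pairs = [kv for r in records for kv in map_phase(r)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(counts[("nurse01", "order_entry")])  # -> 2
```

Because map and reduce operate on independent keys, the same logic scales from this single process to the paper's five-node cluster without structural change, which is the central appeal of the framework.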

This paper describes an engineering approach toward implementing the current neuroscientific understanding of how the primate brain fuses, or integrates, 'information' in the decision-making process. We describe a System of Systems (SoS) design for improving the overall performance, capabilities, operational robustness, and user confidence in Identification (ID) systems and show how it could be applied to biometrics security. We use the Physio-associative temporal sensor integration algorithm (PATSIA), which is motivated by observed functions and interactions of the thalamus, hippocampus, and cortical structures in the brain. PATSIA utilizes signal theory mathematics to model how the human efficiently perceives and uses information from the environment. The hybrid architecture implements a possible SoS-level description of the US Joint Directors of Laboratories Fusion Working Group's functional description involving 5 levels of fusion and their associated definitions. This SoS architecture proposes dynamic sensor and knowledge-source integration by implementing multiple Emergent Processing Loops for prediction, feature extraction, matching, and searching of both static and dynamic databases, like MSTAR's PEMS loops. Biologically, this effort demonstrates these objectives by modeling similar processes from the eyes, ears, and somatosensory channels, through the thalamus, and to the cortices as appropriate, while using the hippocampus for short-term memory search and storage as necessary. The particular approach demonstrated incorporates commercially available speaker verification and face recognition software and hardware to collect data and extract features for the PATSIA. The PATSIA maximizes the confidence levels for target identification or verification in dynamic situations using a belief filter. The proof of concept described here is easily adaptable and scalable to other military and nonmilitary sensor fusion applications.

A general overview of the development of a data acquisition and processingsystem is presented for a pulsed, 2-micron coherent Doppler Lidar system located in NASA Langley Research Center in Hampton, Virginia, USA. It is a comprehensive system that performs high-speed data acquisition, analysis, and data display both in real time and offline. The first flight missions are scheduled for the summer of 2010 as part of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The system as well as the control software is reviewed and its requirements and unique features are discussed.

The Institute for Marine Research and Observation (IMRO) of the Ministry of Marine Affairs and Fisheries Republic of Indonesia (MMAF) has developed a potential fishing zone (PFZ) forecast using satellite data, called Peta Prakiraan Daerah Penangkapan Ikan (PPDPI). Since 2005, IMRO has disseminated daily PPDPI maps for fisheries marine ports and 3-day averages for national areas. The accuracy in determining the PFZ and the processing time of the maps depend greatly on the experience of the operators creating them. This paper presents our research in developing an automated processing system for PPDPI in order to increase accuracy and shorten processing time. PFZs are identified by combining MODIS sea surface temperature (SST) and chlorophyll-a (CHL) data in order to detect the presence of upwelling, thermal fronts and biological productivity enhancement, where the integration of these phenomena generally represents the PFZ. The whole process involves data download, map geo-processing and layout, carried out automatically by Python and ArcPy. The results showed that the automated processing system could be used to reduce dependence on the operator in determining the PFZ and speed up processing time.
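The core PFZ rule described above combines two raster criteria: a strong horizontal SST gradient (a thermal front) coinciding with elevated chlorophyll-a. A minimal sketch of that logic is shown below on tiny hand-made grids; the thresholds, grid values, and cell-neighborhood test are all illustrative assumptions, whereas the operational system applies comparable logic to full MODIS rasters via ArcPy:

```python
# Hedged sketch of the SST-front + chlorophyll PFZ rule. A cell is
# flagged when an SST jump to a horizontal neighbor exceeds FRONT_DC
# and local chlorophyll-a exceeds CHL_MIN. All values are illustrative.

SST = [[29.0, 28.9, 26.5],
       [29.1, 28.8, 26.4]]   # sea surface temperature, deg C
CHL = [[0.1, 0.4, 0.6],
       [0.1, 0.5, 0.7]]      # chlorophyll-a, mg/m^3

FRONT_DC = 0.5   # SST difference (deg C) taken to mark a thermal front
CHL_MIN = 0.3    # chlorophyll-a enrichment threshold

def is_pfz(i, j):
    front = any(abs(SST[i][j] - SST[i][k]) > FRONT_DC
                for k in (j - 1, j + 1) if 0 <= k < len(SST[i]))
    return front and CHL[i][j] >= CHL_MIN

pfz = [[is_pfz(i, j) for j in range(3)] for i in range(2)]
print(pfz[0][2])  # -> True: cold side of the front with high CHL
```

Encoding the rule this way is what removes the operator dependence: the front and enrichment thresholds are fixed parameters of the pipeline rather than a judgment made per map.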

The notion of using operational scenarios as part of requirements development during mission formulation (Phases A & B) is widely accepted as good system engineering practice. In the context of developing a Mission Operations System (MOS), there are numerous practical challenges to translating that notion into the cost-effective development of a useful set of requirements. These challenges can include such issues as a lack of Project-level focus on operations issues, insufficient or improper flowdown of requirements, flowdown of immature or poor-quality requirements from Project level, and MOS resource constraints (personnel expertise and/or dollars). System engineering theory must be translated into a practice that provides enough structure and standards to serve as guidance, but that retains sufficient flexibility to be tailored to the needs and constraints of a particular MOS or Project. We describe a detailed, scenario-based process for requirements development. Identifying a set of attributes for high quality requirements, we show how the portions of the process address many of those attributes. We also find that the basic process steps are robust, and can be effective even in challenging Project environments.

The technical readiness of a cost effective process sequence with the potential to produce flat plate photovoltaic modules meeting the 1986 price goal of $0.70 or less per peak watt was demonstrated. The proposed process sequence was reviewed and laboratory verification experiments were conducted. The preliminary process includes the following features: semicrystalline silicon (10 cm by 10 cm) as the silicon input material; spray-on dopant diffusion source; Al paste BSF formation; spray-on AR coating; electroless Ni plate solder dip metallization; laser scribed edges; K & S tabbing and stringing machine; and laminated EVA modules.

The basic objectives of the program are the following: (1) to design, develop, construct and deliver a junction processing system which will be capable of producing solar cell junctions by means of ion implantation followed by pulsed electron beam annealing; (2) to include in the system a wafer transport mechanism capable of transferring 4-inch-diameter wafers into and out of the vacuum chamber where the ion implantation and pulsed electron beam annealing processes take place; (3) to integrate, test and demonstrate the system prior to its delivery to JPL along with detailed operating and maintenance manuals; and (4) to estimate component lifetimes and costs, as necessary for the contract, for the performance of comprehensive analyses in accordance with the Solar Array Manufacturing Industry Costing Standards (SAMICS). Under this contract the automated junction formation equipment to be developed involves a new system design incorporating a modified, government-owned, JPL-controlled ion implanter into a Spire-developed pulsed electron beam annealer and wafer transport system. When modified, the ion implanter will deliver a 16 mA beam of ³¹P⁺ ions with a fluence of 2.5 × 10¹⁵ ions per square centimeter at an energy of 10 keV. The throughput design goal rate for the junction processor is 10⁷ four-inch-diameter wafers per year.
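The beam parameters above are enough to estimate the implant time per wafer: total dose charge divided by beam current. The sketch below does that arithmetic, assuming singly charged ions and ignoring beam duty-cycle and wafer-transport overheads, and shows the result is consistent with the stated throughput goal:

```python
# Implant time per wafer implied by the figures above: a 16 mA beam
# of singly charged 31P+ delivering 2.5e15 ions/cm^2 to a
# 4-inch-diameter wafer. Transport and duty-cycle overheads ignored.

import math

Q_E = 1.602e-19            # charge per singly ionized atom, C
dose = 2.5e15              # ions/cm^2
current = 16e-3            # beam current, A
radius_cm = 4 * 2.54 / 2   # 4-inch wafer radius in cm
area = math.pi * radius_cm ** 2

charge = dose * area * Q_E      # total beam charge per wafer, C
t_implant = charge / current    # beam-on seconds per wafer
print(f"{t_implant:.1f} s per wafer")  # ~2 s of beam time
```

At roughly 2 s of beam time per wafer, continuous operation would allow on the order of 1.5 × 10⁷ wafers per year, so the 10⁷ wafers/year design goal leaves margin for transport and annealing overhead.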

The gravure offset printing process is very cost-effective for printed electronics, such as printed solar cells, printed batteries, printed TFTs, printed RFID tags and so on. In gravure offset printing, there are two kinds of ink transfer processes: the off and set processes. In the off process, an elastic blanket cylinder picks up the ink from a patterned plate or patterned cylinder. In the set process, ink on the elastic blanket cylinder is transferred onto the target substrate. These two ink transfer processes determine printing quality; therefore, understanding the ink transfer mechanism during the off and set processes is very important for controlling printing quality. In this study, we developed an ink transfer monitoring system for roll-to-plate gravure printing. We visualized ink transfer from the patterned plate to the rolling blanket cylinder (off process) and from the rolling blanket cylinder to the plate substrate (set process) by using a high-speed camera and a long range microscope. We investigated the effects of pattern size, printing speed, rotational effect of the blanket cylinder, contact angle and rheological properties of the ink to understand the gravure offset printing mechanism.

We present an image processing and analysis system to facilitate detailed performance analysis of free flow electrophoresis (FFE) chips. It consists of a cost-effective self-built imaging setup and a comprehensive customizable software suite. Both components were designed modularly to be accessible, adaptable, versatile, and automatable. The system provides tools for i) automated identification of chip features (e.g. separation zone and flow markers), ii) extraction and analysis of stream trajectories, and iii) evaluation of flow profiles and separation quality (e.g. determination of resolution). Equipped with these tools, the presented image processing and analysis system will enable faster development of FFE chips and applications. It will also serve as a robust detector for fluorescence-based analytical applications of FFE.
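One concrete form the separation-quality evaluation above can take is a chromatography-style resolution between two detected streams, computed from their lateral positions and widths. The formula and the stream values below are illustrative assumptions; the paper's software may define resolution differently:

```python
# Hedged sketch of a separation-resolution metric for two FFE streams:
# Rs = 2*|x1 - x2| / (w1 + w2), computed from stream centers and
# baseline widths as a trajectory-extraction step might report them.

def resolution(x1, w1, x2, w2):
    """Resolution between two streams from centers (x) and widths (w)."""
    return 2 * abs(x1 - x2) / (w1 + w2)

# Illustrative stream centers (mm across the separation zone) and
# baseline widths (mm); these are placeholder values.
rs = resolution(x1=4.0, w1=0.8, x2=6.1, w2=1.2)
print(f"Rs = {rs:.2f}")  # -> Rs = 2.10
```

A threshold such as Rs ≥ 1.5 is commonly taken to indicate baseline separation, giving the automated tools a single scalar to report per stream pair.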

Tritium processing technologies of the two European Test Blanket Systems (TBS), HCLL (Helium Cooled Lithium Lead) and HCPB (Helium Cooled Pebble Bed), play an essential role in meeting the main objectives of the TBS experimental campaign in ITER. Compliance with the ITER interface requirements, in terms of space availability, service fluids, limits on tritium release, and constraints on maintenance, is driving the design of the TBS tritium processing systems. Other requirements come from the characteristics of the relevant test blanket module and the scientific programme that has to be developed and implemented. This paper identifies the main requirements for the design of the TBS tritium systems and equipment and, at the same time, provides an updated overview of the current design status, mainly focusing on the tritium extractor from Pb-16Li and TBS tritium accountancy. Considerations are also given on the possible extrapolation to the DEMO breeding blanket. (authors)

Since small scale is key for successful introduction of continuous techniques in the pharmaceutical industry to allow its use during formulation development and process optimization, it is essential to determine whether the product quality is similar when small quantities of materials are processed compared to the continuous processing of larger quantities. Therefore, the aim of this study was to investigate whether material processed in a single cell of the six-segmented fluid bed dryer of the ConsiGma™-25 system (a continuous twin screw granulation and drying system introduced by GEA Pharma Systems, Collette™, Wommelgem, Belgium) is predictive of granule and tablet quality during full-scale manufacturing when all drying cells are filled. Furthermore, the performance of the ConsiGma™-1 system (a mobile laboratory unit) was evaluated and compared to the ConsiGma™-25 system. A premix of two active ingredients, powdered cellulose, maize starch, pregelatinized starch and sodium starch glycolate was granulated with distilled water. After drying and milling (1000 μm, 800 rpm), granules were blended with magnesium stearate and compressed using a Modul™ P tablet press (tablet weight: 430 mg, main compression force: 12 kN). Single cell experiments using the ConsiGma™-25 system and ConsiGma™-1 system were performed in triplicate. Additionally, a 1 h continuous run using the ConsiGma™-25 system was executed. Process outcomes (torque, barrel wall temperature, product temperature during drying) and granule (residual moisture content, particle size distribution, bulk and tapped density, Hausner ratio, friability) as well as tablet (hardness, friability, disintegration time and dissolution) quality attributes were evaluated. By performing the 1 h continuous run, it was detected that a stabilization period was needed for torque and barrel wall temperature due to initial layering of the screws and the screw chamber walls with material. Consequently, slightly deviating

In a joint effort, NASA, Micro Medical Devices, and the Cleveland Clinic have developed a microarthroscopy system with digital image processing. This system consists of a disposable endoscope the size of a needle that is aimed at expanding the use of minimally invasive surgery on the knee, ankle, and other small joints. This device not only allows surgeons to make smaller incisions (by improving the clarity and brightness of images), but it gives them a better view of the injured area to make more accurate diagnoses. Because of its small size, the endoscope helps reduce physical trauma and speeds patient recovery. The faster recovery rate also makes the system cost effective for patients. The digital image processing software used with the device was originally developed by the NASA Glenn Research Center to conduct computer simulations of satellite positioning in space. It was later modified to reflect lessons learned in enhancing photographic images in support of the Center's microgravity program. Glenn's Photovoltaic Branch and Graphics and Visualization Lab (G-VIS) computer programmers and software developers enhanced and sped up graphic imaging for this application. Mary Vickerman at Glenn developed algorithms that enabled Micro Medical Devices to eliminate interference and improve the images.

We have developed an integrated process model (IPM) for a Laser Inertial Fusion-Fission Energy (LIFE) power plant. The model includes cost and performance algorithms for the major subsystems of the plant, including the laser, fusion target fabrication and injection, fusion-fission chamber (including the tritium and fission fuel blankets), heat transfer and power conversion systems, and other balance of plant systems. The model has been developed in Visual Basic with an Excel spreadsheet user interface in order to allow experts in various aspects of the design to easily integrate their individual modules and provide a convenient, widely accessible platform for conducting the system studies. Subsystem modules vary in level of complexity; some are based on top-down scaling from fission power plant costs (for example, electric plant equipment), while others are bottom-up models based on conceptual designs being developed by LLNL (for example, the fusion-fission chamber and laser systems). The IPM is being used to evaluate design trade-offs, do design optimization, and conduct sensitivity analyses to identify high-leverage areas for R&D. We describe key aspects of the IPM and report on the results of our systems analyses. Designs are compared and evaluated as a function of key design variables such as fusion target yield and pulse repetition rate.
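The "top-down scaling from fission power plant costs" mentioned above is commonly implemented as a power-law cost-capacity relation. The sketch below shows that pattern for a single subsystem; the exponent and reference values are illustrative placeholders, not LIFE design numbers, and the actual IPM modules are far more detailed:

```python
# Hedged sketch of top-down cost scaling for one plant subsystem:
# cost = ref_cost * (P / P_ref) ** n. Exponent and reference figures
# are illustrative assumptions, not values from the LIFE model.

def scaled_cost(ref_cost_musd, ref_power_mwe, power_mwe, exponent=0.6):
    """Classic cost-capacity scaling law for a plant subsystem."""
    return ref_cost_musd * (power_mwe / ref_power_mwe) ** exponent

# e.g. electric plant equipment costed at a hypothetical $300M for a
# 1000 MWe reference plant, scaled to a 1500 MWe design point.
cost = scaled_cost(ref_cost_musd=300, ref_power_mwe=1000, power_mwe=1500)
print(f"${cost:.0f}M")  # a bit under $400M with these placeholders
```

An exponent below 1 encodes economies of scale, which is one reason such models are useful for the design trade-offs and sensitivity analyses the abstract describes: the optimizer can see how subsystem costs grow sublinearly with plant size.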

As part of the Consolidated Fuel Reprocessing Program of the Fuel Recycle Division at the Oak Ridge National Laboratory (ORNL), an Integrated Process Demonstration (IPD) facility has been constructed for development of reprocessing plant technology. Through the use of cold materials, the IPD facility provides for the integrated operation of the major equipment items of the chemical-processing portion of a nuclear fuel reprocessing plant. The equipment, processes, and the extensive use of computers in data acquisition and control are prototypical of future reprocessing facilities and provide a unique test-bed for nuclear safeguards demonstrations. The data acquisition and control system consists of several microprocessors that communicate with one another and with a host minicomputer over a common data highway. At intervals of a few minutes, a "snapshot" is taken of the process variables, and the data are transmitted to a safeguards computer and minicomputer workstation for analysis. This paper describes this data acquisition system and the data-handling procedures leading to microscopic process monitoring for safeguards purposes.
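The snapshot-style monitoring described above can be sketched minimally: record the process variables at an interval, then check a simple material balance against an alarm threshold. The variable names, quantities, and threshold below are hypothetical and are not the IPD facility's actual safeguards logic.

```python
# Hypothetical sketch of snapshot-based safeguards monitoring: periodically
# record process variables and flag any snapshot whose material balance
# (feed minus product minus inventory) exceeds an alarm threshold.

def take_snapshot(read_variable, variables):
    """Record the current value of each named process variable."""
    return {name: read_variable(name) for name in variables}

def material_balance_alarm(snapshot, threshold=1.0):
    """True if the in/out/inventory imbalance exceeds the threshold (kg)."""
    imbalance = snapshot["feed_kg"] - snapshot["product_kg"] - snapshot["inventory_kg"]
    return abs(imbalance) > threshold

readings = {"feed_kg": 120.0, "product_kg": 95.5, "inventory_kg": 24.0}
snap = take_snapshot(readings.__getitem__, readings)
print(material_balance_alarm(snap))  # False: 0.5 kg imbalance, below threshold
```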

The PRISM project includes the development of a set of visualization and processing tools for use by earth system scientists. A list of requirements has been formulated, based upon information provided by the PRISM community. Having reviewed the requirements and the software packages available, the team is ready to begin development of two visualization systems: a web-enabled system designed for monitoring and quality-controlling model runs while they are running (Low-End graphics), and another system for high-quality analysis of data which includes the ability to do 3-D plots, animations, etc., with the option of controlling plot generation through scripts or graphical interfaces (High-End graphics). Both the Low-End and High-End graphics tools will use netCDF-CF metadata, the chosen PRISM standard data format. This poster is intended as a showcase for our current ideas and early plans. We invite comments from the wide community of earth system modellers about which functionalities would be most useful.

Objectives This paper aims to present the archetype modelling process used by the Health Department of Minas Gerais State, Brazil (SES/MG), to support building its regional EHR system, and the lessons learned during this process. Methods This study was undertaken within the Minas Gerais project. The EHR system architecture was built on the reference model of the ISO 13606 standard. The whole archetype development process took about ten months and was coordinated by a clinical team composed of three health professionals and one systems analyst from the SES/MG, supported by around 30 health professionals from the internal SES/MG areas and 5 systems analysts from the PRODEMGE. Based on a bottom-up approach, the project team used technical interviews and brainstorming sessions to conduct the modelling process. Results The main steps of the archetype modelling process were identified and described, and 20 archetypes were created. Lessons learned: – The set of principles established during the selection of PCS elements helped the clinical team keep the focus on their objectives; – The initial focus on the archetype structural organization was important; – The data elements identified were subjected to a rigorous analysis to determine the most suitable clinical domain; – Levelling the concepts to accommodate them within the hierarchical levels of the reference model was by no means easy, and the use of a mind-mapping tool facilitated the modelling process; – Part of the difficulty experienced by the clinical team was related to a view focused on the original forms previously used; – The use of worksheets facilitated the modelling process by health professionals; – It was important to have, as a member of the clinical team, a health professional who knew the domain tables and health classifications of the Brazilian Federal Government. Conclusion The archetypes (referencing terminology, domain tables and term lists) provided a

A knowledge-based system to assist process engineers in evaluating the processability and moldability of poly-isocyanurate (PIR) formulations for the thermal protection system of the Space Shuttle external tank (ET) is discussed. The Reaction Injection Molding Process Development Advisor (RIM-PDA) is a coupled system which takes advantage of both symbolic and numeric processing techniques. This system will aid the process engineer in identifying a startup set of mold schedules and in refining the mold schedules to remedy specific process problems diagnosed by the system.

[Fragment of a requirements table: the system shall have the specified probability of completing a 2-hour mission without a system abort (threshold: 0.95; objective: 0.99). Exception handling: if the error/exception is cleared, the robot is ready to continue the mission; if actions cannot be taken to resolve the exception, the mission is aborted.]

In 2001, NASA Goddard Space Flight Center's Laboratory for Terrestrial Physics (LTP) started the construction of a science Investigator-led Processing System (SIPS) for processing data from the Ozone Monitoring Instrument (OMI), which will launch on the Aura platform in mid-2004. OMI is a contribution of the Netherlands Agency for Aerospace Programs (NIVR), in collaboration with the Finnish Meteorological Institute (FMI), to the Earth Observing System (EOS) Aura mission. It will continue the Total Ozone Mapping Spectrometer (TOMS) record for total ozone and other atmospheric parameters related to ozone chemistry and climate. OMI measurements will be highly synergistic with the other instruments on the EOS Aura platform. The LTP previously developed the Moderate Resolution Imaging Spectroradiometer (MODIS) Data Processing System (MODAPS), which has been in full operations since the launches of the Terra and Aqua spacecraft in December 1999 and May 2002, respectively. During that time, it has continually evolved to better support the needs of the MODIS team. We now run multiple instances of the system, managing faster-than-real-time reprocessing of the data as well as continuing forward processing. The new OMI Data Processing System (OMIDAPS) was adapted from the MODAPS. It will ingest raw data from the satellite ground station and process it to produce calibrated, geolocated higher-level data products. These data products will be transmitted to the Goddard Distributed Active Archive Center (GDAAC) instance of the EOS Data and Information System (EOSDIS) for long-term archive and distribution to the public. The OMIDAPS will also provide data distribution to the OMI Science Team for quality assessment, algorithm improvement, calibration, etc. We have taken advantage of lessons learned from the MODIS experience and software already developed for MODIS. We made some changes in the hardware system organization, database and

Agricultural products typically exhibit high variance in quality characteristics. To assure customer satisfaction and control manufacturing productivity, quality classification is necessary to screen off defective items and to grade the products. This article presents an application of image processing techniques to squid grading and defect discrimination. A preliminary study indicated that surface color was an effective determinant of the quality of splendid squids. In this study, a computer vision system (CVS) was developed to examine the characteristics of splendid squids. Using image processing techniques, squids could be classified into three quality grades in accordance with an industry standard. The developed system first sifted through squid images to reject ones with black marks. Qualified squids were then graded, using fuzzy logic, on the proportions of white, pink, and red regions appearing on their bodies. The system was evaluated on 100 images of squids at different quality levels. The accuracy obtained by the proposed technique was 95% compared with sensory evaluation by an expert.
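The grading idea, rejecting marked specimens and then grading by color-region proportions, can be sketched as below. This is not the authors' CVS code; the grade thresholds and crisp (rather than fuzzy) rules are invented for illustration.

```python
# Illustrative sketch: grade a squid from the proportions of white, pink,
# and red pixels on its body, after rejecting images with black marks.
# Thresholds are invented; the paper uses fuzzy logic rather than crisp rules.

def grade_squid(white, pink, red, black_marks=False):
    """Return a quality grade from color-region proportions (summing to ~1)."""
    if black_marks:
        return "reject"
    if white >= 0.7:          # predominantly white body -> top grade
        return "grade A"
    if white + pink >= 0.7:   # white/pink mix -> middle grade
        return "grade B"
    return "grade C"          # mostly red -> lowest grade

print(grade_squid(0.8, 0.15, 0.05))      # grade A
print(grade_squid(0.4, 0.4, 0.2))        # grade B
print(grade_squid(0.2, 0.2, 0.6))        # grade C
print(grade_squid(0.9, 0.1, 0.0, True))  # reject
```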

The engineering design, fabrication, assembly, operation, economic analysis, and process support research and development for an Experimental Process System Development Unit for producing semiconductor-grade silicon using the silane-to-silicon process are reported. The design activity was completed. About 95% of purchased equipment was received. The draft of the operations manual was about 50% complete, and the design of the free-space system continued. The system using silicon powder transfer, melting, and shotting on a pseudocontinuous basis was demonstrated.

Neutron supermirrors and supermirror polarizers are thin-film multilayer-based devices which are used for reflecting and polarizing neutrons in various neutron-based experiments. In the present communication, the in-house development of a 9 m long in-line dc sputtering system is described which is suitable for deposition of neutron supermirrors on large (1500 mm × 150 mm) substrates and in large numbers. The optimisation of the deposition of Co and Ti thin films, Co/Ti periodic multilayers, and aperiodic supermirrors is also described. The system has been used to deposit thin-film multilayer supermirror polarizers which show high reflectivity up to a reasonably large critical wavevector transfer of ∼0.06 Å⁻¹ (corresponding to m = 2.5, i.e., 2.5 times the critical wavevector transfer of natural Ni). The computer code for designing these supermirrors has also been developed in-house.

Air quality modeling is a useful methodology to investigate air quality degradation in various locations and to analyze the effectiveness of emission reduction plans. A comprehensive air quality model usually requires a coordinated set of emissions inputs for all necessary chemical species. We have developed an anthropogenic emissions processing system for Asia, named SMOKE-Asia, in support of air quality modeling and analysis over Asia. The SMOKE (Sparse Matrix Operator Kernel Emissions) system, which was developed by the U.S. EPA and has been maintained by the Carolina Environmental Program (CEP) of the University of North Carolina, was used to develop our emissions processing system. A merged version of the INTEX 2006 and TRACE-P 2000 inventories was used as an initial Asian emissions inventory. The IDA (Inventory Data Analyzer) format was used to create SMOKE-ready emissions. Source Classification Codes (SCCs) and country/state/county (FIPS) codes, the two key data fields of the SMOKE IDA data structure, were created for Asia. In all, 38 SCCs and 2752 FIPS codes were allocated to our SMOKE-ready emissions for more comprehensive processing. The U.S. EPA's MIMS (Multimedia Integrated Modeling System) Spatial Allocator software, along with many global and regional GIS shapes, was used to create spatial allocation profiles for Asia. Temporal allocation and chemical speciation profiles were partly regionalized using Asia-based studies. Initial data production using the developed SMOKE-Asia system was successfully performed. NOx and VOC emissions for the year 2009 were projected to increase by 50% from those of 1997. Emission hotspots, such as large cities and large point sources, are distinguished in the domain due to spatial allocation. Regional emission peaks were distinguished due to temporally resolved emission information. PAR (paraffin carbon bond) and XYL (xylene and other polyalkyl aromatics) showed the first and second largest emission rates among VOC species
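The allocation step an emissions processor like SMOKE performs can be sketched in miniature: a regional annual total is split over grid cells with a spatial surrogate and over months with a temporal profile. The surrogate fractions, profile, and totals below are invented for illustration, not SMOKE-Asia's data.

```python
# Toy emissions allocation: one region's annual total distributed in space
# (via surrogate fractions) and time (via a monthly profile). Illustrative only.

annual_total_ton = 1200.0

# Spatial surrogate: fraction of the region's emissions assigned to each cell.
spatial_surrogate = {"cell_1": 0.5, "cell_2": 0.3, "cell_3": 0.2}

# Temporal profile: fraction of annual emissions in each month (sums to 1).
monthly_profile = [1 / 12.0] * 12

gridded = {cell: annual_total_ton * frac for cell, frac in spatial_surrogate.items()}
cell_1_january = gridded["cell_1"] * monthly_profile[0]
print(gridded)         # tons per cell per year
print(cell_1_january)  # 50.0 tons in cell_1 for January
```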

Existing global and continental scale river models, mainly designed for integration with global climate models, are of very coarse spatial resolution, and they lack many important hydrological processes, such as overbank flow, irrigation diversion, and groundwater seepage/recharge, which operate at a much finer resolution. Thus, these models are not suitable for producing streamflow forecasts at fine spatial resolution and water accounts at sub-catchment levels, which are important for water resources planning and management at regional and national scales. A large-scale river system model has been developed and implemented for water accounting in Australia as part of the Water Information Research and Development Alliance between Australia's Bureau of Meteorology (BoM) and CSIRO. The model, developed using a node-link architecture, includes all major hydrological processes, anthropogenic water utilisation, and storage routing that influence the streamflow in both regulated and unregulated river systems. It includes an irrigation model to compute water diversion for irrigation use and associated fluxes and stores, and a storage-based floodplain inundation model to compute overbank flow from river to floodplain and associated floodplain fluxes and stores. An auto-calibration tool has been built within the modelling system to automatically calibrate the model in large river systems using the Shuffled Complex Evolution optimiser and user-defined objective functions. The auto-calibration tool makes the model computationally efficient and practical for large basin applications. The model has been implemented in several large basins in Australia, including the Murray-Darling Basin, covering more than 2 million km2. The results of calibration and validation of the model show highly satisfactory performance. The model has been operationalised in the BoM for producing various fluxes and stores for national water accounting. This paper introduces this newly developed river system model

Studies have been undertaken to design an efficient zeolite ion exchange system for use at the ORNL Process Waste Treatment Plant to remove cesium and strontium to meet discharge limits. This report focuses on two areas: (1) design of column hardware and pretreatment steps needed to eliminate column plugging and channeling and (2) development of equilibrium models for the wastewater system. Results indicate that zeolite columns do not plug as quickly when the wastewater equalization is performed in the new Bethel Valley Storage Tanks instead of the former equalization basin where suspended solids concentration is high. A down-flow column with spent zeolite was used successfully as a prefilter to prevent plugging of the zeolite columns being used to remove strontium and cesium. Equilibrium studies indicate that a Langmuir isotherm models binary zeolite equilibrium data, while the modified Dubinin-Polanyi model predicts multicomponent data.
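The Langmuir isotherm mentioned above has the closed form q = q_max·K·C / (1 + K·C), where C is the liquid-phase concentration and q the equilibrium loading on the zeolite. A minimal sketch with illustrative (not report-derived) parameter values:

```python
# Langmuir isotherm: equilibrium zeolite loading as a function of liquid
# concentration. q_max and k values below are illustrative placeholders.

def langmuir_loading(c, q_max, k):
    """Equilibrium loading q at liquid concentration c: q_max*k*c/(1+k*c)."""
    return q_max * k * c / (1.0 + k * c)

# Loading rises linearly at low concentration and saturates toward q_max:
print(langmuir_loading(0.1, q_max=2.0, k=5.0))    # low-concentration regime
print(langmuir_loading(100.0, q_max=2.0, k=5.0))  # near saturation (just under 2.0)
```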

A numerical model has been developed for simulating freezing phenomena in food solution systems. The cell model was simplified for application to food solution systems, incorporating three regions: unfrozen, frozen, and moving boundary layers. Moreover, a model for the moving rate of the freezing front was also introduced and calculated using the variable space network method proposed by Murray and Landis (1957). To demonstrate the validity of the model, it was applied to the freezing processes of coffee solutions. Since the model required the phase diagram of the material to be frozen, the initial freezing temperatures of 1-55% coffee solutions were measured by the DSC method. The effective thermal conductivity of coffee solutions was determined as a function of temperature and solute concentration using the Maxwell-Eucken model. The one-dimensional freezing process of a 10% coffee solution was simulated based on its phase diagram and thermo-physical properties. The results were in good agreement with the experimental data, showing that the model can accurately describe the change in the location of the freezing front and the distributions of temperature as well as ice fraction during a freezing process.
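The Maxwell-Eucken model used above for effective thermal conductivity of a two-phase mixture (e.g. ice dispersed in solution) can be written down directly. The conductivity values below are illustrative placeholders, not the paper's measured properties.

```python
# Maxwell-Eucken effective conductivity of a dispersed phase d (volume
# fraction v_d) in a continuous phase c. Values below are illustrative.

def maxwell_eucken(k_c, k_d, v_d):
    """Effective conductivity: k_c*(2k_c + k_d - 2v_d(k_c-k_d))/(2k_c + k_d + v_d(k_c-k_d))."""
    num = 2 * k_c + k_d - 2 * v_d * (k_c - k_d)
    den = 2 * k_c + k_d + v_d * (k_c - k_d)
    return k_c * num / den

# Sanity checks at the limits: no dispersed phase -> k_c; all dispersed -> k_d.
print(maxwell_eucken(0.55, 2.2, 0.0))  # 0.55 (unfrozen solution only)
print(maxwell_eucken(0.55, 2.2, 1.0))  # 2.2 (ice only)
```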

The Image Retrieval and Processing System (IRPS) is a software package developed at Washington University and used by the NASA Regional Planetary Image Facilities (RPIFs). The IRPS combines database management and image processing components to allow the user to examine catalogs of image data, locate the data of interest, and perform radiometric and geometric calibration of the data in preparation for analysis. Version 1.0 of IRPS was completed in Aug. 1989 and was installed at several RPIFs. Other RPIFs use remote logins via NASA Science Internet to access IRPS at Washington University. Work was begun on designing and populating a catalog of Magellan image products that will be part of IRPS Version 2.0, planned for release by the end of calendar year 1991. With this catalog, a user will be able to search by orbit and by location for Magellan Basic Image Data Records (BIDRs), Mosaicked Image Data Records (MIDRs), and Altimetry-Radiometry Composite Data Records (ARCDRs). The catalog will include the Magellan CD-ROM volume, directory, and file name for each data product. The image processing component of IRPS is based on the Planetary Image Cartography Software (PICS) developed by the U.S. Geological Survey, Flagstaff, Arizona. To augment PICS capabilities, a set of image processing programs was developed that is compatible with PICS-format images. This software includes general-purpose functions that PICS does not have, analysis and utility programs for specific data sets, and programs from other sources that were modified to work with PICS images. Some of the software will be integrated into the Version 2.0 release of IRPS. A table is presented that lists the programs with a brief functional description of each.

The barotropic processes associated with the development of a precipitation system are investigated through analysis of cloud-resolving model simulations of Mei-yu torrential rainfall events over eastern China in mid-June 2011. During the model integration period, there were three major heavy rainfall events: 9-12, 13-16 and 16-20 June. The kinetic energy is converted from perturbation to mean circulations in the first and second period, whereas it is converted from mean to perturbation circulations in the third period. Further analysis shows that kinetic energy conversion is determined by vertical transport of zonal momentum. Thus, the prognostic equation of vertical transport of zonal momentum is derived, in which its tendency is associated with dynamic, pressure gradient and buoyancy processes. The kinetic energy conversion from perturbation to mean circulations in the first period is mainly associated with the dynamic processes. The kinetic energy conversion from mean to perturbation circulations in the third period is generally related to the pressure gradient processes.

Mallinckrodt Institute of Radiology (MIR) is using a digital image processing system which employs NASA-developed technology. MIR's computer system is the largest radiology system in the world. It is used in diagnostic imaging. Blood vessels are injected with x-ray dye, and the images which are produced indicate whether arteries are hardened or blocked. A computer program developed by the Jet Propulsion Laboratory, known as Mini-VICAR/IBIS, was supplied to MIR by COSMIC. The program provides the basis for developing the computer imaging routines for data processing, contrast enhancement, and picture display.

The most notable effect in processing dielectrics with micro- and millimeter-waves is volumetric heating of these materials, offering the opportunity of very high heating rates for the samples. In comparison to conventional heating, where the heat transfer is diffusive and depends on the thermal conductivity of the material, the microwave field penetrates the sample and acts as an instantaneous heat source at each point of the sample. Owing to this unique property, microwave heating at the 2.45 GHz and 915 MHz ISM (Industrial, Scientific, Medical) frequencies has been established as an important industrial technology for more than 50 years. Successful application of microwaves in industry has been reported, e.g., in food processing systems, domestic ovens, the rubber industry, vacuum drying, etc. The present paper outlines microwave system development at Forschungszentrum Karlsruhe, IHM, transferring properties from the higher frequency regime (millimeter-waves) to lower frequency applications. However, the need for using higher frequencies such as 24 GHz (an ISM frequency) for industrial applications has to be carefully verified with respect to specific physical/engineering advantages or to the limits that standard microwave technology meets for the specific problem.

The Naval Enlisted Professional Development Information System (NEPDIS) was designed to function as a fully computerized information assembly and analysis system to support labor force, personnel, and training management. The NEPDIS comprises separate training development, instructional, training record and evaluation, career development, and…

Describes the Thai Learning System, which is designed to help learners acquire the Thai word order system. The system facilitates the lessons on the Web using HyperText Markup Language and Perl programming, which interfaces with natural language processing by means of Prolog. (Author/VWL)

How to facilitate efficient video manipulation and access in a web-based environment is becoming a popular trend for video applications. In this paper, we present a web-oriented video management and application processing system, based on our previous work on multimedia databases and content-based retrieval. In particular, we extend the VideoMAP architecture with specific web-oriented mechanisms, which include: (1) concurrency control facilities for the editing of video data among different types of users, such as Video Administrator, Video Producer, Video Editor, and Video Query Client; different users are assigned various priority levels for different operations on the database; (2) a versatile video retrieval mechanism which employs a hybrid approach by integrating a query-based (database) mechanism with content-based retrieval (CBR) functions; its specific language (CAROL/ST with CBR) supports spatio-temporal semantics of video objects, and also offers an improved mechanism to describe the visual content of videos by a content-based analysis method; (3) a query profiling database which records the 'histories' of various clients' query activities; such profiles can be used to provide a default query template when a similar query is encountered by the same kind of user. An experimental prototype system is being developed based on the existing VideoMAP prototype system, using Java and VC++ on the PC platform.

A multi-faceted study was initiated in November 1993 to provide assurance that needed testing capabilities, facilities, and support infrastructure (sampling systems, casks, transportation systems, permits, etc.) would be available when needed for process and equipment development to support pretreatment and vitrification facility design and construction schedules. This first major report provides a snapshot of the known testing needs for pretreatment, low-level waste (LLW) and high-level waste (HLW) vitrification, and documents the results of a series of preliminary studies and workshops to define the issues needing resolution by cold or hot testing. Identified in this report are more than 140 Hanford Site tank waste pretreatment and LLW/HLW vitrification technology issues that can only be resolved by testing. The report also broadly characterizes the level of testing needed to resolve each issue. A second report will provide a strategy(ies) for ensuring timely test capability. Later reports will assess the capabilities of existing facilities to support needed testing and will recommend siting of the tests together with needed facility and infrastructure upgrades or additions.

The potential of digital image processing systems costing $20,000 or less for geographic information systems is assessed, with emphasis on the volume of data to be handled, the commercial hardware systems available, and the basic software for: (1) data entry, conversion, and digitization; (2) georeferencing and geometric correction; (3) data structuring; (4) editing and updating; (5) analysis and retrieval; (6) output drivers; and (7) data management. Both tangible and intangible cost factors must also be considered.

The development of a cost-effective process sequence that has the potential for the production of flat-plate photovoltaic modules which meet the 1986 price goal of 70 cents or less per peak watt is described. The major accomplishments include (1) an improved AR coating technique; (2) the use of sandblast back clean-up to reduce clean-up costs and to allow much of the Al paste to serve as a back conductor; and (3) the development of wave soldering for use with solar cells. Cells were processed to evaluate different process steps, a cell and minimodule test plan was prepared, and data were collected for a preliminary SAMICS cost analysis.

The engineering design, fabrication, assembly, operation, economic analysis, and process support R&D for an Experimental Process System Development Unit (EPSDU) are reported. About 95% of purchased equipment has been received and will be reshipped to the West Coast location. The Data Collection System is completed. In the area of melting/consolidation, the system using silicon powder transfer, melting, and shotting on a pseudocontinuous basis has been demonstrated. It is proposed to continue the very promising fluid-bed work.

The development and processing of three-dimensional images as a "hypercube" of spectral data in hyperspectral optical-electronic remote sensing systems are described in a formalized manner. The correlation-based identification of observed objects on the basis of spectral features is considered. The criterion for determining the similarity between vectors of recorded and reference spectral images of objects is based on their cross-correlation. Taking into account the fact that the total spectral data array recorded by currently applicable hyperspectrometers is excessive for the solution of many issues related to remote sensing of the Earth, this paper proposes a method making it possible to reduce spectral data redundancy by selection of the most informative spectral channels. The essential dimension of the spectral data makes it possible to solve issues related to identification and classification of objects by spectral features through a limited number of highly informative spectral channels, selected in the areas where the function describing a spectral image of the observed object undergoes well-defined changes in behavior. The algorithm for selection of the most informative spectral channels, which is based on the determination of the coordinates of jumps (major changes) in a spectral image, is substantiated. The selected channels meet the maximum likelihood criterion. Experimental research data on object identification quality obtained with real hyperspectral data from aerospace Earth remote sensing systems are reported. Five to twenty spectral readouts are needed to provide identification with a limited number of highly informative spectral channels. This confirms the idea of the essential dimensionality of the spectral data.
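The two ideas in this abstract can be sketched in a hedged, simplified form: (1) identify an observed spectrum by normalized cross-correlation against a reference, and (2) select informative channels where the spectral curve jumps sharply. The spectra and jump threshold below are invented for illustration.

```python
# Simplified sketch of spectral identification by cross-correlation and
# informative-channel selection by jump detection. All data are illustrative.
import math

def norm_corr(a, b):
    """Normalized cross-correlation of two equal-length spectra (in [-1, 1])."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den

def informative_channels(spectrum, threshold):
    """Indices where the channel-to-channel change exceeds the threshold."""
    return [i for i in range(1, len(spectrum))
            if abs(spectrum[i] - spectrum[i - 1]) > threshold]

reference = [0.2, 0.2, 0.8, 0.8, 0.3]        # e.g. a red-edge-like signature
observed  = [0.25, 0.22, 0.78, 0.81, 0.28]
print(norm_corr(observed, reference))         # close to 1 -> same class
print(informative_channels(reference, 0.3))   # channels with sharp jumps
```

In practice the comparison would run over a library of reference signatures, restricted to the selected channels.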

We have been developing a new laser cladding system to repair damage to parts in aging plants. It consists of several devices: a laser torch, a composite-type optical fiber, a QCW fiber laser, etc. All devices are installed in a mobile rack, so we can carry the system to plants, laboratories, or anywhere we want to use it. In laser cladding, the laser beam and filler wire must be aimed at the workpiece with the best possible accuracy, for which a composite-type optical fiberscope is useful. This fiberscope is composed of a center fiber for beam delivery surrounded by 20000 fibers for visible image delivery; thus it always keeps the target at the center of the gun-sight. We succeeded in making a line laser cladding on the inside wall of a 1-inch tube with our system. Before this success, we solved two serious problems: the contamination of the optics and the deformation of the droplet. Observing the laser cladding process by X-ray imaging with SPring-8 synchrotron radiation, we found that the molten pool depth was under a hundred micrometers for 10 milliseconds. A quasi-CW fiber laser with 1 kW was employed as a heat source to generate the shallow molten pool. The X-ray shadowgraph clarified that a molten droplet was formed at the edge of the wire, up to a millimeter in size; it grew if the wire did not contact the tube wall in the initial state. Here we succeeded in measuring the thermo-electromotive force voltage between the wire and the tube metal to confirm whether the two were in contact. We propose to apply this laser cladding technology to the maintenance of aging industrial plants and nuclear facilities.

Most of the time (and cost) involved in planning a hot forging process is related to activities strongly dependent on human expertise, intuition, and creativity, and also to an iterative procedure involving extensive experimental work. In this paper, the development of an expert system for forging process design, which emphasizes materials' workability, is discussed. Details of the forging process design expert system, its basic modules, design and implementation details, and deliverables are explained. The system uses the vast database available on the hot workability of more than 200 technologically important materials and the knowledge acquired from a materials expert. The C Language Integrated Production System (CLIPS) has been adopted to develop this expert system. The expert system can address three types of functions, namely, forging process design, materials information system, and forging defect analysis. The expert system will aid and prompt a novice engineer in designing a forging process by providing accurate information on the process parameters, lubricants, type of machine, die material, and type of process (isothermal versus non-isothermal) for a given material with a known specification or code and prior history.
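The rule-lookup idea behind such an advisor can be sketched as below. This is not the paper's CLIPS implementation; the materials, temperature ranges, lubricants, and process types are invented examples of the kind of recommendation the system produces.

```python
# Toy rule base mapping a material to forging recommendations, in the spirit
# of an expert-system advisor. All entries below are invented examples.

FORGING_RULES = {
    # material: (temperature range degC, lubricant, process type)
    "Ti-6Al-4V": ((900, 950),   "glass coating", "isothermal"),
    "AISI 4340": ((1100, 1200), "graphite",      "non-isothermal"),
}

def advise(material):
    """Return forging recommendations, or None if the material is unknown."""
    rule = FORGING_RULES.get(material)
    if rule is None:
        return None
    (t_lo, t_hi), lubricant, process = rule
    return {"temperature_degC": (t_lo, t_hi),
            "lubricant": lubricant,
            "process": process}

print(advise("Ti-6Al-4V"))
print(advise("unknown alloy"))  # None -> prompt the engineer for more data
```

A production system like CLIPS generalizes this from a table lookup to chained pattern-matching rules over working memory.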

[Fragmentary record on depleted uranium (DU) soil remediation: DU is reactive, oxidizing once it has been deposited into soil; the recovery process needs a variety of measurement systems for precise location; soil fines remain contaminated by tightly bound DU oxides, from which a leaching system removes the uranium.]

The Interactive Software Invocation System (NASA-ISIS) was first transported to the M68000 microcomputer and then rewritten in the programming language Path Pascal. Path Pascal is a significantly enhanced derivative of Pascal, allowing concurrent algorithms to be expressed using the simple and elegant concept of Path Expressions. The primary result of this contract was to verify the viability of Path Pascal as a systems development language. The NASA-ISIS implementation using Path Pascal is a prototype of a large, interactive system in Path Pascal. As such, it is an excellent demonstration of the feasibility of using Path Pascal to write even more extensive systems. It is hoped that future efforts will build upon this research and, ultimately, that a full Path Pascal/ISIS Operating System (PPIOS) might be developed.

Human aging is a complex process with pivotal changes in the gene expression of biological pathways. Immune system dysfunction has been recognized as one of the most important abnormalities induced by senescence, termed immunosenescence. Emerging evidence suggests a role for microRNAs (miRs) in immunosenescence. We aimed to systematically review all relevant reports to clearly state miR effects on the immunosenescence process. Sensitive electronic searches were carried out, and quality assessment was performed. Since the majority of the included studies were laboratory works, and therefore heterogeneous, we discussed miR effects on the immunological aging process non-statistically. Forty-six articles were found in the initial search; after exclusion of 34 articles, 12 studies were included in the final stage. We found that miRs have crucial roles in the proper function of the immune system. MiRs are involved in the regulation of the aging process in the immune system components and target certain genes, promoting or inhibiting the immune system's reaction to invasion. MiRs also control the life span of immune system members by regulating the genes involved in apoptosis. Interestingly, we found that immunosenescence is controllable by proper manipulation of the expression of various miRs. DNA methylation and histone acetylation have been discovered as novel strategies, altering NF-κB binding ability to the miR promoter sites. Evidence of miR effects on age-related impairment of immune system function is emerging. Although it has been accepted that miRs have determinant roles in the regulation of immunosenescence, most of the reports are drawn from animal/laboratory works, suggesting the necessity of more investigations in humans.

ABO typing is the first test done on blood that is to be used for transfusion. A person must receive ABO-matched blood, as ABO incompatibility is the major cause of fatal transfusion reactions. Until now, blood typing has been done manually, and there is therefore a need for an automated typing machine that uses a very small volume of blood. In this paper, we present a new micro blood-typing system with a fully three-dimensional geometry, realized using micro-stereolithography. The system was fabricated with a novel integration process based on a virtual environment, and blood-typing experiments using the system were performed successfully.

This presentation identifies the key materials with the highest payoff for advancing the state of the art in materials for turbomachinery. Current showstoppers to advancing the state of the art in materials and processes are identified, technical issues associated with incorporating key materials into new systems are discussed, and opportunities to overcome these technical challenges are identified where they exist.

The hot forming process of steel requires temperatures of up to 1300°C. Usually, the invested energy is lost to the environment by the subsequent cooling of the forged parts to room temperature. Thermoelectric systems are able to recover this wasted heat by converting the heat into electrical energy and feeding it into the power grid. The proposed thermoelectric system covers an absorption surface of half a square meter, and it is equipped with 50 Bismuth-Telluride based thermoelectric generators, five cold plates, and five inverters. Measurements were performed under production conditions of the industrial environment of the forging process. The heat distribution and temperature profiles are measured and modeled based on the prevailing production conditions and geometric boundary conditions. Under quasi-stationary conditions, the thermoelectric system absorbs a heat radiation of 14.8 kW and feeds electrical power of 388 W into the power grid. The discussed model predicts the measured values with slight deviations.
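
As a quick sanity check on the figures reported above, the system-level conversion efficiency follows directly from the absorbed heat and the delivered electrical power (values taken from the abstract; this is a back-of-envelope sketch, not part of the study):

```python
# Back-of-envelope check of the reported thermoelectric conversion efficiency.
absorbed_heat_w = 14_800   # absorbed heat radiation, 14.8 kW (from the abstract)
electrical_w = 388         # electrical power fed into the grid (from the abstract)

efficiency = electrical_w / absorbed_heat_w
print(f"System-level conversion efficiency: {efficiency:.1%}")  # → 2.6%
```

This low single-digit efficiency is typical for bismuth-telluride generators operating on radiant waste heat.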

Development of efficient bioprocesses is essential for cost-effective manufacturing of recombinant therapeutic proteins. To achieve further process improvement and rationalization, comprehensive analysis of both process data and phenotypic cell-level data is essential. Here, we present a framework for advanced bioprocess data analysis consisting of multivariate data analysis (MVDA), metabolic flux analysis (MFA), and pathway analysis for mapping large-scale gene expression data sets. This data analysis platform was applied in a process development project with an IgG-producing Chinese hamster ovary (CHO) cell line in which the maximal product titer could be increased from about 5 to 8 g/L. Principal component analysis (PCA), k-means clustering, and partial least-squares (PLS) models were applied to analyze the macroscopic bioprocess data. MFA and gene expression analysis revealed intracellular information on the characteristics of high-performance cell cultivations. By MVDA, for example, correlations between several essential amino acids and the product concentration were observed, and a grouping into cell-specific-productivity-driven and process-control-driven processes could be unraveled. By MFA, phenotypic characteristics in glycolysis, glutaminolysis, the pentose phosphate pathway, the citrate cycle, the coupling of amino acid metabolism to the citrate cycle, and the energy yield could be identified. By gene expression analysis, 247 deregulated metabolic genes were identified that are involved, inter alia, in amino acid metabolism, transport, and protein synthesis.
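
The simplest building block of the MVDA correlation screen described above is a plain Pearson correlation between a process variable and the product titer. The sketch below uses hypothetical batch data, not values from the study:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical batches: residual amino acid level vs. final titer (g/L)
leucine = [2.1, 1.8, 1.5, 1.2, 0.9]
titer   = [5.0, 5.8, 6.5, 7.2, 8.0]
print(f"r = {pearson(leucine, titer):.3f}")  # strongly negative, r ≈ -1.0
```

In practice such pairwise screens are run across all measured variables before building the PCA/PLS models.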

This paper outlines an evaluation system for elementary education based on cognitive and social development research. Piaget has defined certain core mental structures that develop in all humans in the same order. Both Piaget and Kohlberg have done work on the process of developmental change and the conditions necessary for optimal growth of the…

Successful exploitation of the vast amount of heat stored beneath the earth’s surface in hydrothermal and fluid-limited, low permeability geothermal resources would greatly expand the Nation’s domestic energy inventory and thereby promote a more secure energy supply, a stronger economy and a cleaner environment. However, a major factor limiting the expanded development of current hydrothermal resources as well as the production of enhanced geothermal systems (EGS) is insufficient knowledge about the chemical processes controlling subsurface fluid flow. With funding from past grants from the DOE geothermal program and other agencies, we successfully developed advanced equation of state (EOS) and simulation technologies that accurately describe the chemistry of geothermal reservoirs and energy production processes via their free energies for wide XTP ranges. Using the specific interaction equations of Pitzer, we showed that our TEQUIL chemical models can correctly simulate behavior (e.g., mineral scaling and saturation ratios, gas break out, brine mixing effects, down hole temperatures and fluid chemical composition, spent brine incompatibilities) within the compositional range (Na-K-Ca-Cl-SO4-CO3-H2O-SiO2-CO2(g)) and temperature range (T < 350°C) associated with many current geothermal energy production sites that produce brines with temperatures below the critical point of water. The goal of research carried out under DOE grant DE-FG36-04GO14300 (10/1/2004-12/31/2007) was to expand the compositional range of our Pitzer-based TEQUIL fluid/rock interaction models to include the important aluminum and silica interactions (T < 350°C). Aluminum is the third most abundant element in the earth’s crust; and, as a constituent of aluminosilicate minerals, it is found in two thirds of the minerals in the earth’s crust. The ability to accurately characterize effects of temperature, fluid mixing and interactions between major rock-forming minerals and hydrothermal and

Between 2008 and 2035 global energy demand is expected to grow by 53%. While most industry-level analyses of manufacturing in the United States (U.S.) have traditionally focused on high energy consumers such as the petroleum, chemical, paper, primary metal, and food sectors, the remaining sectors account for the majority of establishments in the U.S. Specifically, of the establishments participating in the Energy Information Administration's Manufacturing Energy Consumption Survey in 2006, the "non-energy intensive" sectors still consumed 4 × 10^9 GJ of energy, i.e., one-quarter of the energy consumed by the manufacturing sectors, which is enough to power 98 million homes for a year. The increasing use of renewable energy sources and the introduction of energy-efficient technologies in manufacturing operations support the advancement towards a cleaner future, but having a good understanding of how the systems and processes function can reduce the environmental burden even further. To facilitate this, methods are developed to model the energy of manufacturing across three hierarchical levels: production equipment, factory operations, and industry; these methods are used to accurately assess the current state and provide effective recommendations to further reduce energy consumption. First, the energy consumption of production equipment is characterized to provide machine operators and product designers with viable methods to estimate the environmental impact of the manufacturing phase of a product. The energy model of production equipment is tested and found to have an average accuracy of 97% for a product requiring machining with a variable material removal rate profile. However, changing the use of production equipment alone will not result in an optimal solution since machines are part of a larger system. Which machines to use, how to schedule production runs while accounting for idle time, the design of the factory layout to facilitate production, and even the

The design, fabrication, and installation of an experimental process system development unit (EPSDU) were analyzed. Supporting research and development were performed to provide an information database usable both for the EPSDU and for the technological design and economic analysis of potential scale-up of the process. Iterative economic analyses were conducted to estimate the product cost for the production of semiconductor-grade silicon in a facility capable of producing 1000 MT/yr.

The process technology for the manufacture of semiconductor-grade silicon in a large commercial plant by 1986, at a price of less than $14 per kilogram of silicon in 1975 dollars, is discussed. The engineering design, installation, checkout, and operation of an Experimental Process System Development Unit are described. Quality control in scaling up the process and an economic analysis of product and production costs are also discussed.

The process for dry powder impregnation of carbon fiber tows being developed at LaRC overcomes many of the difficulties associated with melt, solution, and slurry prepregging. In the process, fluidized powder is deposited on spread tow bundles and fused to the fibers by radiant heating. Impregnated tows have been produced for preform, weaving, and composite materials applications. Design and operating data correlations were developed for scale up of the process to commercial operation. Bench scale single tow experiments at tow speeds up to 50 cm/sec have demonstrated that the process can be controlled to produce weavable towpreg. Samples were woven and molded into preform material of good quality.

Pain is a serious problem for infants and children and treatment options are limited. Moreover, infants born prematurely or hospitalized for illness likely have concurrent infection that activates the immune system. It is now recognized that the immune system in general and glia in particular influence neurotransmission and that the neural bases of pain are intimately connected to immune function. We know that injuries that induce pain activate immune function and suppressing the immune system alleviates pain. Despite this advance in our understanding, virtually nothing is known of the role that the immune system plays in pain processing in infants and children, even though pain is a serious clinical issue in pediatric medicine. This brief review summarizes the existing data on immune-neural interactions in infants, providing evidence for the immaturity of these interactions.

This paper presents an introduction to the development of the flux map processing code MAPLE, developed by the China Nuclear Power Technology Research Institute (CNPPJ) of the China Guangdong Nuclear Power Group (CGN). The method used to obtain the three-dimensional 'measured' power distribution from the measurement signals is also described. Three methods, namely the Weight Coefficient Method (WCM), the Polynomial Expansion Method (PEM), and the Thin Plate Spline (TPS) method, have been applied to fit the deviation between measured and predicted results on the two-dimensional radial plane. The measured flux map data of the LINGAO nuclear power plant (NPP) are processed using MAPLE as a test case, combined with the 3D neutronics code COCO, to compare the effectiveness of the three methods. Assembly power distribution results show that the MAPLE results are reasonable and satisfactory. More verification and validation of the MAPLE code will be carried out in the future. (authors)

The process of geometric parameter estimation to accuracies of one centimeter, i.e., multilateration, is defined and applications are listed. A brief functional explanation of the theory is presented. Next, various multilateration systems are described in order of increasing system complexity. Expected system accuracy is discussed from a general point of view and a summary of the errors is listed. An outline of the design of a software processing system for multilateration, called MICRODOT, is presented next. The links of this software, which can be used for multilateration data simulations or operational data reduction, are examined on an individual basis. Functional flow diagrams are presented to aid in understanding the software capability. MICRODOT capability is described with respect to vehicle configurations, interstation coordinate reduction, geophysical parameter estimation, and orbit determination. Numerical results obtained from MICRODOT via data simulations are displayed both for hypothetical and real-world vehicle/station configurations such as those used in the GEOS-3 Project. These simulations show the inherent power of the multilateration procedure.
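
To illustrate the multilateration principle itself (this is a generic sketch, not the MICRODOT implementation), a 2-D position can be recovered from ranges to known stations by differencing the range equations and solving the resulting linear least-squares problem:

```python
import math

def multilaterate(stations, ranges):
    """Least-squares 2-D position fix from ranges to known stations.

    Differencing each range equation against the first station removes the
    quadratic terms, leaving a linear system in the unknown coordinates.
    """
    (x0, y0), r0 = stations[0], ranges[0]
    rows, rhs = [], []
    for (xi, yi), ri in zip(stations[1:], ranges[1:]):
        ax, ay = xi - x0, yi - y0
        rows.append((ax, ay))
        rhs.append((ax * ax + ay * ay + r0 * r0 - ri * ri) / 2)
    # Normal equations A^T A p = A^T b for the two unknowns (dx, dy)
    s_xx = sum(ax * ax for ax, _ in rows)
    s_xy = sum(ax * ay for ax, ay in rows)
    s_yy = sum(ay * ay for _, ay in rows)
    t_x = sum(ax * b for (ax, _), b in zip(rows, rhs))
    t_y = sum(ay * b for (_, ay), b in zip(rows, rhs))
    det = s_xx * s_yy - s_xy * s_xy
    dx = (s_yy * t_x - s_xy * t_y) / det
    dy = (s_xx * t_y - s_xy * t_x) / det
    return x0 + dx, y0 + dy

stations = [(0, 0), (10, 0), (0, 10), (10, 10)]
truth = (3.0, 4.0)
ranges = [math.dist(s, truth) for s in stations]
print(multilaterate(stations, ranges))  # → (3.0, 4.0)
```

Real multilateration adds a third coordinate, measurement noise weighting, and the geophysical parameters mentioned in the abstract, but the linearized least-squares core is the same.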

A scanner-based storage system employs a head mounted on a scanner which oscillates over the moving media. The head moves in an approximately sinusoidal path relative to the media at a high frequency, time-multiplexing the read/write signals of several tracks. The resulting multi-channel readback can yield higher data rates over a conventional system with a head that moves linearly relative to the media. Scanner-based storage systems are not commercially available at present. We are envisioning a system that uses an opto-electronic scanner, developed at CMU, in which the deflection of a laser beam is controlled by an input voltage. Since no mechanical motion is involved, this scanner has a high bandwidth which makes it well suited to our application.

The development of animal systems is described in terms of a series of overlapping phases: pattern specification; differentiation; growth; and aging. The extent to which altered (micro)gravity (g) affects those phases is briefly reviewed for several animal systems. As a model, the amphibian egg/early embryo is described. Recent data derived from clinostat protocols indicate that microgravity simulation alters early pattern specification (dorsal/ventral polarity) but does not adversely influence subsequent morphogenesis. Possible explanations for the absence of catastrophic microgravity effects on amphibian embryogenesis are discussed.

Natural rubber processing wastewater contains high concentrations of organic compounds, nitrogen, and other contaminants. In this study, a treatment system composed of a baffled reactor (BR), an upflow anaerobic sludge blanket (UASB) reactor, and a downflow hanging sponge (DHS) reactor was used to treat natural rubber processing wastewater in Vietnam. The BR showed good total suspended solids removal of 47.6%, as well as acidification of the wastewater. The UASB reactor achieved a high chemical oxygen demand (COD) removal efficiency of 92.7% ± 2.3% and energy recovery in the form of methane at an organic loading rate of 12.2 ± 6.6 kg-COD·m(-3)·day(-1). The DHS reactor showed high performance in removing residual organic matter from the UASB effluent. In total, the system achieved high-level total COD removal of 98.6% ± 1.2% and total suspended solids removal of 98.0% ± 1.4%. Massive parallel 16S rRNA gene sequencing of the retained sludge in the UASB reactor showed the predominant microbial phyla to be Bacteroidetes, Firmicutes, Proteobacteria, WWE1, and Euryarchaeota. Uncultured bacteria belonging to the phyla Bacteroidetes and WWE1 were predominant in the UASB reactor. This microbial assemblage utilizes the organic compounds contained in natural rubber processing wastewater. In addition, the methane-producing archaea Methanosaeta sp. and Methanolinea sp. were detected.

Advanced furnace systems are being developed for use in space. Systems are being tested for current experiment applications and modified for future experiment requirements. Future projects are: (1) fabrication and testing of the Advanced Automated Directional Solidification Furnace (AADSF) flight hardware; (2) development of a Heat Pipe Furnace (HPF) for use in space. Heat pipes will be tested for space flight qualification in conjunction with the furnace development. The HPF design will be based on the AADSF development and will be of modular design including capabilities of operating with or without heat pipes; and (3) the AADSF furnace will be modified and tested to operate at temperatures up to 1700 C in the heated cavity. This will be accomplished by developing a new hot end heating module and insulation package for the existing AADSF. Refurbishment of the Drop Tower Furnace (DTF) is under way. The DTF can operate at temperatures up to 1700 C. The sample size will be approximately 3/8 in. dia. x 5/8 in. long. Design improvements for the General Purpose Rocket Furnace (GPRF) for use in the Material Experiment Assembly (MEA) are to be accomplished.

KRW Energy Systems, Inc., is engaged in the continuing development of a pressurized, fluidized-bed gasification process at its Waltz Mill Site in Madison, Pennsylvania. The overall objective of the program is to demonstrate the viability of the KRW process for the environmentally acceptable production of low- and medium-Btu fuel gas from a variety of fossilized carbonaceous feedstocks and industrial fuels. This report presents a process analysis of the 24-ton-per-day Process Development Unit (PDU) operations and is a continuation of the process analysis work performed in 1980 and 1981. Included is work performed on PDU process data; gasification; char-ash separation; ash agglomeration; fines carryover, recycle, and consumption; deposit formation; materials; and environmental, health, and safety issues. 63 figs., 43 tabs.

To support the development of flexible and reusable multi-agent systems (MAS), we have built a framework designated MAS-CF. MAS-CF is a component framework that implements a layered architecture based on contextual composition. Interaction rules, controlled by architecture mechanisms, ensure very low coupling, making possible the sharing of distributed services in a transparent, dynamic, and independent way. These properties promote large-scale reuse, since organizational abstractions can be reused and propagated to all instances created from the framework. The objective is to reduce the complexity and development time of multi-agent systems through the reuse of generic organizational abstractions.

Activity analysis in small laboratory animals is an effective procedure in various bioscience fields. The simplest way to obtain animal activity data is manual observation and recording, but this is labor-intensive and rather subjective. Analyzing animal movement automatically and objectively usually requires expensive equipment. In the present study, we developed a low-cost animal activity analysis system based on template matching applied to video-recorded movements of laboratory animals.
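
The core of template matching can be sketched with a sum-of-squared-differences (SSD) search, the simplest variant of the approach described; the toy 2-D array below stands in for a grayscale video frame:

```python
def ssd(patch, template):
    """Sum of squared differences between a patch and the template."""
    return sum((p - t) ** 2
               for prow, trow in zip(patch, template)
               for p, t in zip(prow, trow))

def match(frame, template):
    """Return (row, col) of the best template match in a 2-D grayscale frame."""
    th, tw = len(template), len(template[0])
    best, best_pos = float("inf"), (0, 0)
    for r in range(len(frame) - th + 1):
        for c in range(len(frame[0]) - tw + 1):
            patch = [row[c:c + tw] for row in frame[r:r + th]]
            score = ssd(patch, template)
            if score < best:
                best, best_pos = score, (r, c)
    return best_pos

# Toy frame: a bright 2x2 blob (the "animal") at row 2, col 3
frame = [[0] * 6 for _ in range(5)]
for r, c in [(2, 3), (2, 4), (3, 3), (3, 4)]:
    frame[r][c] = 255
template = [[255, 255], [255, 255]]
print(match(frame, template))  # → (2, 3)
```

Tracking the best-match position frame by frame yields the activity trace; production systems use normalized cross-correlation for robustness to lighting changes.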

An Integrated Waste Management-Water System (WM-WS) which uses radioisotopes for thermal energy is described and results of its trial in a 4-man, 180 day simulated space mission are presented. It collects urine, feces, trash, and wash water in zero gravity, processes the wastes to a common evaporator, distills and catalytically purifies the water, and separates and incinerates the solid residues using little oxygen and no chemical additives or expendable filters. Technical details on all subsystems are given along with performance specifications. Data on recovered water and heat loss obtained in test trials are presented. The closed loop incinerator and other projects underway to increase system efficiency and capacity are discussed.

Torrefaction is currently developing as an important preprocessing step to improve the quality of biomass in terms of physical properties, and proximate and ultimate composition. Torrefaction is a slow heating of biomass in an inert or reduced environment to a maximum temperature of 300 °C. Torrefaction can also be defined as a group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in the temperature ranges of 200–230 °C and 270–280 °C. Thus, the process can also be called a mild pyrolysis, as it occurs at the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, producing a final product that has a lower mass but a higher heating value. An important aspect of research is to establish a degree of torrefaction where gains in heating value offset the loss of mass. There is a lack of literature on torrefaction reactor designs and a design sheet for estimating the dimensions of the torrefier based on capacity. This study includes (a) conducting a detailed review of the torrefaction of biomass in terms of understanding the process, product properties, off-gas compositions, and methods used, and (b) designing a moving bed torrefier, taking into account the fundamental heat and mass transfer calculations. Specific objectives include calculating dimensions such as the diameter and height of the moving packed bed for different capacities, designing the heat loads and gas flow rates, and developing an interactive Excel sheet where the user can define design specifications. In this report, capacities of 25–1000 kg/hr are used in the design equations for the torrefier, with example calculations and specifications.
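
The trade-off mentioned above, where gains in heating value must offset the loss of mass, is commonly summarized by the energy yield: mass yield times the ratio of torrefied to raw higher heating value. The numbers below are hypothetical, for illustration only:

```python
# Hypothetical illustrative values; the abstract reports the trade-off,
# not these specific numbers.
mass_yield = 0.70          # fraction of solid mass retained after torrefaction
hhv_raw = 18.0             # higher heating value of raw biomass, MJ/kg
hhv_torrefied = 21.0       # higher heating value of torrefied product, MJ/kg

energy_yield = mass_yield * hhv_torrefied / hhv_raw
print(f"Energy yield: {energy_yield:.1%}")  # → 81.7%
```

A "good" degree of torrefaction keeps the energy yield high while still densifying the energy content per kilogram.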

The development of high temperature containerless processing equipment and the design and evaluation of associated systems required for microgravity materials processing and property measurements are discussed. Efforts were directed towards the following task areas: design and development of a High Temperature Acoustic Levitator (HAL) for containerless processing and property measurements at high temperatures; testing of the HAL module to establish this technology for use as a positioning device for microgravity uses; construction and evaluation of a brassboard hot wall Acoustic Levitation Furnace; construction and evaluation of a noncontact temperature measurement (NCTM) system based on AGEMA thermal imaging camera; construction of a prototype Division of Amplitude Polarimetric Pyrometer for NCTM of levitated specimens; evaluation of and recommendations for techniques to control contamination in containerless materials processing chambers; and evaluation of techniques for heating specimens to high temperatures for containerless materials experimentation.

The ARGOS project (Advanced Rayleigh guided Ground layer adaptive Optics System) will upgrade the Large Binocular Telescope (LBT) with an AO System consisting of six Rayleigh laser guide stars. This adaptive optics system integrates several control loops and many different components like lasers, calibration swing arms and slope computers that are dispersed throughout the telescope. The purpose of the instrument control software (ICS) is running this AO system and providing convenient client interfaces to the instruments and the control loops. The challenges for the ARGOS ICS are the development of a distributed and safety-critical software system with no defects in a short time, the creation of huge and complex software programs with a maintainable code base, the delivery of software components with the desired functionality and the support of geographically distributed project partners. To tackle these difficult tasks, the ARGOS software engineers reuse existing software like the novel middleware from LINC-NIRVANA, an instrument for the LBT, provide many tests at different functional levels like unit tests and regression tests, agree about code and architecture style and deliver software incrementally while closely collaborating with the project partners. Many ARGOS ICS components are already successfully in use in the laboratories for testing ARGOS control loops.

Significant advances have been made in the last decade in the areas of Geographic Information Systems (GIS) and spatial analysis technology, both in hardware and software. Science user requirements are so problem specific that currently no single system can satisfy all of the needs. The work presented here forms part of a conceptual framework for an all-encompassing science-user workstation system. While definition and development of the system as a whole will take several years, it is intended that small-scale projects such as the current work will address some of the more short-term needs. Such projects can provide a quick mechanism to integrate tools into the workstation environment, forming a larger, more complete hydrologic analysis platform. Two components that are very important to the practical use of remote sensing and digital map data in hydrology are described here: a graph-theoretic technique to rasterize elevation contour maps, and a system to manipulate synthetic aperture radar (SAR) data files and extract soil moisture data.

The present investigation is concerned with the potential applications of trusted computer system technologies in space. It is suggested that the rapidly expanding roles of new space defense missions will require space-borne command, control, communication, intelligence, and battle management (C3/I-BM) systems. Trusted computer system technology can be extended to develop new computer architectures able to support the broader requirements of C3/I-BM processing. The Gemini Trusted Multiple Microcomputer Base product is being developed to meet these demanding requirements and to support multiple capabilities simultaneously. Attention is given to recent important events in trusted computer system development, and to the Gemini system architecture.

Background The lack of a uniform way to evaluate vaccine candidates under development, qualitatively and quantitatively, led us to set up a standardized scheme for vaccine efficacy and safety evaluation. We developed and implemented molecular and immunology methods, and designed support tools for immunization data storage and analyses. Such a collection can create a unique opportunity for immunologists to analyse data delivered from their laboratories. Results We designed and implemented GeVaDSs (Genetic Vaccine Decision Support system), an interactive system for efficient storage, integration, retrieval and representation of data. Moreover, GeVaDSs allows for relevant association and interpretation of data, and thus for knowledge-based generation of testable hypotheses of vaccine responses. Conclusions GeVaDSs has been tested by several laboratories in Europe, and proved its usefulness in vaccine analysis. A case study of its application is presented in the additional files. The system is available at: http://gevads.cs.put.poznan.pl/preview/(login: viewer, password: password). PMID:22574945

The 2011 volcanic unrest at El Hierro Island illustrated the need for a Volcanic Alert System (VAS) specifically designed for the management of volcanic crises developing after long repose periods. The VAS comprises the monitoring network, the software tools for analysis of the monitoring parameters, the Volcanic Activity Level (VAL) management, and the assessment of hazard. The VAS presented here focuses on phenomena related to moderate eruptions, and on potentially destructive volcano-tectonic earthquakes and landslides. We introduce a set of new data analysis tools, aimed to detect data trend changes, as well as spurious signals related to instrumental failure. When data-trend changes and/or malfunctions are detected, a watchdog is triggered, issuing a watch-out warning (WOW) to the Monitoring Scientific Team (MST). The changes in data patterns are then translated by the MST into a VAL that is easy to use and understand by scientists, technicians, and decision-makers. Although the VAS was designed specifically for the unrest episodes at El Hierro, the methodologies may prove useful at other volcanic systems.
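
The watchdog idea described above can be sketched as a rolling-mean deviation check; the window and threshold below are illustrative placeholders, not the actual VAS parameters:

```python
from collections import deque

class Watchdog:
    """Toy trend-change watchdog: flag a watch-out warning (WOW) when a new
    sample deviates from the recent rolling mean by more than `threshold`."""

    def __init__(self, window=10, threshold=3.0):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def feed(self, sample):
        wow = False
        if len(self.history) == self.history.maxlen:
            mean = sum(self.history) / len(self.history)
            wow = abs(sample - mean) > self.threshold
        self.history.append(sample)
        return wow

wd = Watchdog(window=5, threshold=2.0)
readings = [1.0, 1.1, 0.9, 1.0, 1.2, 1.1, 6.0]   # last value is a jump
alerts = [wd.feed(r) for r in readings]
print(alerts)  # → [False, False, False, False, False, False, True]
```

A real monitoring pipeline would add per-instrument thresholds and checks for spurious signals from instrumental failure, as the abstract notes.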

The basic objectives of the program are: (1) to design, develop, construct, and deliver a junction processing system capable of producing solar cell junctions by means of ion implantation followed by pulsed electron beam annealing; (2) to include in the system a wafer transport mechanism capable of transferring 4-inch-diameter wafers into and out of the vacuum chamber where the ion implantation and pulsed electron beam annealing processes take place; (3) to integrate, test, and demonstrate the system prior to its delivery to JPL, along with detailed operating and maintenance manuals; and (4) to estimate component lifetimes and costs, as necessary for the contract, for the performance of comprehensive analyses in accordance with the Solar Array Manufacturing Industry Costing Standards (SAMICS). Under this contract, the automated junction formation equipment to be developed involves a new system design incorporating a modified, government-owned, JPL-controlled ion implanter into a Spire-developed pulsed electron beam annealer and wafer transport system. When modified, the ion implanter will deliver a 16 mA beam of 31P+ ions with a fluence of 2.5 × 10^15 ions per square centimeter at an energy of 10 keV. The throughput design goal for the junction processor is 10^7 four-inch-diameter wafers per year. Work on the pulsed electron beam subsystem development is described. (WHK)
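
From the figures quoted in the abstract, a back-of-envelope estimate of the ion-limited implant time per wafer follows directly (a rough check, not part of the contract analysis):

```python
import math

# Figures from the abstract
beam_current_a = 0.016            # 16 mA beam of 31P+ ions
fluence = 2.5e15                  # ions per cm^2
wafer_diameter_cm = 4 * 2.54      # 4-inch wafer
e_charge = 1.602e-19              # coulombs per singly charged ion

area_cm2 = math.pi * (wafer_diameter_cm / 2) ** 2
ions_needed = fluence * area_cm2
ions_per_second = beam_current_a / e_charge
print(f"Implant time per wafer: {ions_needed / ions_per_second:.1f} s")  # → 2.0 s
```

A roughly two-second implant per wafer is consistent with the 10^7-wafer-per-year throughput goal once transport and annealing overheads are included.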

Astro-H is the sixth Japanese X-ray space observatory, scheduled for launch in 2014. Two of the onboard instruments of Astro-H, the Hard X-ray Imager and the Soft Gamma-ray Detector, are surrounded by a large number of Bismuth Germanate (Bi4Ge3O12; BGO) scintillators. An optimal readout system for the scintillation light from these BGOs is essential to reduce background signals and achieve high performance for the main detectors, because most gamma-rays arriving from outside the field of view of the main detectors, or produced inside them by activated radioisotopes, can be eliminated by an anti-coincidence technique using the BGO signals. We apply Avalanche Photodiodes (APDs) as light sensors for these BGO detectors, since their compactness and high quantum efficiency make it easy to design such a large BGO detector system. For signal processing from the APDs, a digital filter and other trigger logic on a Field-Programmable Gate Array (FPGA) are used instead of discrete analog circuits, owing to the limited circuit implementation area on the spacecraft. For efficient observations, we have to achieve as low an anti-coincidence threshold as possible by utilizing digital filtering. In addition, the anti-coincidence signal should be sent to the main detector within 5 μs so that it arrives in time to veto the A-D conversion. Considering this requirement and the constraint on FPGA logic size, we adopt two types of filter: an 8-delay-tap filter with only 2-bit-precision coefficients and a 16-delay-tap filter with 8-bit-precision coefficients. The simpler former filter provides the anti-coincidence signal quickly in orbit, and the latter is used for detailed analysis after the data are downlinked.
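
The tapped-delay-line (FIR) filtering described above can be sketched in a few lines; the coefficients below are a stand-in for the fast 8-tap trigger path, not the flight values:

```python
def fir(samples, coeffs):
    """Simple FIR (tapped-delay-line) filter: y[n] = sum_k coeffs[k] * x[n-k]."""
    out = []
    for n in range(len(samples)):
        acc = 0
        for k, c in enumerate(coeffs):
            if n - k >= 0:
                acc += c * samples[n - k]
        out.append(acc)
    return out

# Illustrative 8-tap filter with coarse unit coefficients (2-bit-like),
# smoothing a small digitized scintillation pulse.
fast_taps = [1] * 8
pulse = [0, 0, 4, 3, 2, 1, 0, 0, 0, 0]
print(fir(pulse, fast_taps))  # → [0, 0, 4, 7, 9, 10, 10, 10, 10, 10]
```

Thresholding the filter output, rather than the raw samples, is what lets the fast path fire the anti-coincidence veto reliably at low signal levels.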

A data acquisition and control system for a triple quadrupole mass spectrometer has been developed using several microprocessors in a distributed processing system. This system includes four processors, one acting as the system master controlling three slave processors. In such a distributed processing system, each processor is assigned a specific task. Critical to this application is the allocation of the tasks of data acquisition, ion path control, and peak finding to separate slave processors. This modular approach leads to a system in which each major section of the instrument has its own dedicated intelligence. This parallel processing system allows operations that are often implemented in hardware (for speed considerations) to be performed in software. The use of triple quadrupole mass spectrometry, an MS/MS technique, to detect selected species in middle distillate fuels was examined. Collision-activated dissociation (CAD) spectra were obtained for reference compounds from several heteroatom-containing compound classes. The CAD results were used to select screening reactions for each compound class. The effectiveness of these screening reactions was demonstrated by identifying the presence of various species in samples of Jet A aviation fuel, a shale-oil-derived fuel, and No. 2 diesel fuel.
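The master/slave task split described above can be sketched as a message-dispatch loop: the master routes each job to the slave dedicated to that task. The class names, task names, and message formats are illustrative assumptions, not the actual instrument protocol.

```python
import queue
import threading

class Slave(threading.Thread):
    """A dedicated worker that drains its own inbox of task messages."""
    def __init__(self, name):
        super().__init__(daemon=True)
        self.name, self.inbox, self.done = name, queue.Queue(), []
    def run(self):
        while True:
            msg = self.inbox.get()
            self.done.append((self.name, msg))  # stand-in for the real work
            self.inbox.task_done()

# One slave per instrument task, mirroring the dedicated-intelligence design.
slaves = {t: Slave(t) for t in ("acquisition", "ion_path", "peak_finding")}
for s in slaves.values():
    s.start()

# The master assigns each message to the dedicated slave's inbox.
for task, payload in [("acquisition", "scan 1"), ("peak_finding", "scan 1"),
                      ("ion_path", "set m/z 219")]:
    slaves[task].inbox.put(payload)
for s in slaves.values():
    s.inbox.join()   # wait until every dispatched message is handled
```

The per-task inbox mirrors the design benefit the abstract claims: each section of the instrument runs at its own pace without blocking the others.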

Progress in the development of techniques to grow silicon web at a 25 sq cm/min output rate is reported. Feasibility of web growth with simultaneous melt replenishment is discussed. Other factors covered include: (1) tests of aftertrimmers to improve web width; (2) evaluation of growth lid designs to raise speed and output rate; (3) tests of melt replenishment hardware; and (4) investigation of directed gas flow systems to control unwanted oxide deposition in the system and to improve convective cooling of the web. Compatibility with sufficient solar cell performance is emphasized.

The use of clinical imaging modalities within the pharmaceutical research space provides value and challenges. Typical clinical settings will utilize a Picture Archive and Communication System (PACS) to transmit and manage Digital Imaging and Communications in Medicine (DICOM) images generated by clinical imaging systems. However, a PACS is complex and provides many features that are not required within a research setting, making it difficult to generate a business case and determine the return on investment. We have developed a next-generation DICOM processing system using open-source software, commodity server hardware such as Apple Xserve®, high-performance network-attached storage (NAS), and in-house-developed preprocessing programs. DICOM-transmitted files are arranged in a flat file folder hierarchy easily accessible via our downstream analysis tools and a standard file browser. This next-generation system had a minimal construction cost due to the reuse of all the components from our first-generation system with the addition of a second server for a few thousand dollars. Performance metrics were gathered and the system was found to be highly scalable, performed significantly better than the first-generation system, is modular, has satisfactory image integrity, and is easier to maintain than the first-generation system. The resulting system is also portable across platforms and utilizes minimal hardware resources, allowing for easier upgrades and migration to smaller form factors at the hardware end-of-life. This system has been in production successfully for 8 months and services five clinical instruments and three pre-clinical instruments. This system has provided us with the necessary DICOM C-Store functionality, eliminating the need for a clinical PACS for day-to-day image processing.
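The flat file-folder hierarchy described above can be sketched as a simple routing function that maps DICOM header fields to a browsable path. The tag choices and layout are illustrative assumptions; the actual in-house convention is not specified in the abstract.

```python
from pathlib import PurePosixPath

def route(meta):
    """Map DICOM header fields (already parsed to a dict) to a folder path."""
    # Replace characters unsafe for file names with underscores.
    safe = lambda s: "".join(c if c.isalnum() or c in "-_" else "_" for c in s)
    return PurePosixPath(safe(meta["PatientID"]),
                         safe(meta["StudyDate"]),
                         safe(meta["SeriesDescription"]),
                         f'{meta["InstanceNumber"]:04d}.dcm')

# A hypothetical incoming C-STORE object, reduced to the fields we route on.
path = route({"PatientID": "RAT-042", "StudyDate": "20240115",
              "SeriesDescription": "T2w FSE", "InstanceNumber": 7})
# path -> RAT-042/20240115/T2w_FSE/0007.dcm
```

Because the layout is derived purely from header metadata, downstream tools and a plain file browser can locate a series without any PACS database query.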

Several major modifications were made to the design presented at the PDR. The frame was deleted in favor of a "frameless" design which will provide a substantially improved cell packing factor. Potential shaded cell damage resulting from operation into a short circuit can be eliminated by a change in the cell series/parallel electrical interconnect configuration. The baseline process sequence defined for the MEPSON was refined and equipment design and specification work was completed. SAMICS cost analysis work accelerated, format A's were prepared and computer simulations completed. Design work on the automated cell interconnect station was focused on bond technique selection experiments.

The Hard X-ray Imager (HXI) and Soft Gamma-ray Detector (SGD) onboard ASTRO-H provide high sensitivity to hard X-rays (5-80 keV) and soft gamma-rays (60-600 keV), respectively. To reduce the background, both instruments are actively shielded by large, thick Bismuth Germanate (BGO) scintillators. We have developed the signal processing system for the avalanche photodiodes in the BGO active shields and have demonstrated its effectiveness after assembly in the flight models of the HXI/SGD sensors and after integration into the satellite. The energy threshold achieved is about 150 keV, and the anti-coincidence efficiency for cosmic-ray events is almost 100%. Installed in the BGO active shield, the developed signal processing system successfully reduces the room background level of the main detector.

Nuclear Thermal Propulsion (NTP) is under development for deep space exploration. NTP's high specific impulse (>850 seconds) enables a large range of destinations, shorter trip durations, and improved reliability. W-60vol%UO2 CERMET fuel development efforts emphasize fabrication, performance testing, and process optimization to meet service life requirements. Fuel elements must be able to survive operation in excess of 2850 K and exposure to flowing hydrogen (H2), vibration, acoustic, and radiation conditions. The CTE mismatch between W and UO2 results in high thermal stresses that lead to mechanical failure, as does UO2 reduction by hot hydrogen (H2) [1]. Improved powder metallurgy fabrication process control and mitigated fuel loss can be attained by coating UO2 starting powders with a layer of high-density tungsten [2]. This paper discusses the advances of a fluidized bed chemical vapor deposition (CVD) system that utilizes the H2-WCl6 reduction process.

A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics, and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology.

The main objectives of this project were the development of a four-compartment electrolytic cell using highly selective membranes to remove nitrate from crop residue leachate and convert it to nitric acid, and the development of a six-compartment electrodialysis cell to selectively remove sodium from urine wastes. The recovery of nutrients from both inedible plant biomass and human wastes to sustain a biomass production system is an important aspect of developing a controlled ecological life support system (CELSS) to provide the basic human needs required for life support during long-term space missions. A four-compartment electrolytic cell has been proposed to selectively remove nitrate from crop residue and convert it to nitric acid, which is currently used in the NASA-KSC Controlled Ecological Life Support System to control the pH of the aerobic bioreactors and biomass production chamber. Human activities in a closed system require large amounts of air, water, and minerals to sustain life, and also generate wastes. Before human wastes are used as nutrients, they must be treated to reduce organic content and to remove some minerals that have adverse effects on plant growth. Of all the minerals present in human urine, sodium chloride (NaCl) is the only one that cannot be used as a nutrient for most plants. Human activities also require sodium chloride as part of the diet. Therefore, technology to remove and recover sodium chloride from wastes is highly desirable. A six-compartment electrodialysis cell using highly selective membranes has been proposed to remove and recover NaCl from human urine.

Research efforts during the first year focused on numerical simulations of two convective systems with the Penn State/NCAR mesoscale model. The first of these systems was tropical cyclone Irma, which occurred in 1987 in Australia's Gulf of Carpentaria during the AMEX field program. Comparison simulations of this system were done with two different convective parameterization schemes (CPSs), the Kain-Fritsch (KF; Kain and Fritsch 1993) and the Betts-Miller (BM; Betts 1986) schemes. The second system was the June 10-11, 1985 squall line, which occurred over the Kansas-Oklahoma region during the PRE-STORM experiment. Simulations of this system using the KF scheme were examined in detail.

The wet oxidation process is considered a potential treatment method for wastes aboard manned spacecraft for these reasons: (1) fecal and urine wastes are processed to sterile water and CO2 gas, although the water requires post-treatment to remove salts and odor; (2) the residual ash is negligible in quantity, sterile, and easily collected; and (3) the product CO2 gas can be processed through a reduction step to aid in material balance if needed. Reaction of waste materials with oxygen at elevated temperature and pressure also produces some nitrous oxide, as well as trace amounts of a few other gases.

...transmitted and received by complex electrical apparatus, the performance of which could be subjected to mathematical analysis. Signal processing now... The definition of the term "signal" now includes almost any physical variable of interest, and the techniques of signal analysis and processing are...

Whether or not a computerized system enhances working conditions in its application domain depends very much on the user interface. Graphical user interfaces attract the interest of users but often ignore basic rules of visual information processing, leading to systems that are difficult to use, lowering productivity, and increasing working stress (cognitive load and workload). In this work we present some fundamental ergonomic considerations and their application to the medical image processing and archiving domain. We introduce the extensions to an existing concept needed to control and guide the development of GUIs with respect to domain-specific ergonomics. The suggested concept, called Model-View-Controller Constraints (MVCC), can be used to programmatically implement ergonomic constraints, and thus has some advantages over written style guides. We conclude with a presentation of existing norms and methods to evaluate user interfaces.

NiCrAlY coatings were deposited using the Mettech Axial III™ plasma spray system. The microstructural features of the coatings, such as the porosity, crack, unmelted particle, and oxide content, were analyzed to investigate the effects of the spray process parameters on these features. Two Taguchi arrays were used to examine the effects of the spray process parameters such as powder size, ratio of (H2 + N2) gas flow over total gas flow, current, spray-gun nozzle size, and spray distance, on the microstructural features of the coatings. The results from statistical analysis are used to create regression equations to predict the microstructural features of the coatings. In the regression equations, a process index (PI) is used as a complex variable incorporating a number of process parameters. The results from an additional set of experiments are used to verify the validity of the regression equations. It has been demonstrated that the equations correlate well with the results from the subsequent set of experiments. It is concluded from this study that the PI can be used to categorize coating qualities with respect to the extent of crack, porosity, unmelted particle, and oxide content in the coating. These equations can also serve as an initial step in developing process parameters by means of the Mettech Axial III™ System.
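The regression-equation idea above, with a process index (PI) as a single composite predictor of a coating feature, can be sketched as follows. The PI definition, run data, and resulting coefficients are invented for illustration and are not the published equations.

```python
def process_index(current_A, spray_dist_mm, h2_fraction):
    # Assumed composite form: more current and fuel gas increase melting,
    # longer spray distance reduces it. Purely illustrative.
    return current_A * h2_fraction / spray_dist_mm

def fit_line(x, y):
    """Ordinary least squares for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Invented Taguchi-style runs: (current A, spray distance mm, H2 fraction).
runs = [(200, 100, 0.10), (220, 100, 0.12), (240, 120, 0.15), (260, 120, 0.18)]
porosity = [4.1, 3.6, 3.2, 2.5]            # invented measurements (%)

pi = [process_index(*r) for r in runs]
a, b = fit_line(pi, porosity)              # porosity ≈ a + b*PI
predicted = a + b * process_index(230, 110, 0.13)
```

Collapsing several parameters into one index keeps the regression to a single predictor, which is the practical appeal of the PI approach for categorizing coating quality.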

At the West Valley Demonstration Project (WVDP), the Vitrification Facility (VF) is designed to convert the high-level radioactive waste (HLW) stored on the site to a stable glass for disposal at a Department of Energy (DOE)-specified federal repository. The Scaled Vitrification System III (SVS-III) verification tests were conducted between February 1995 and August 1995 as a supplemental means of supporting the vitrification process flowsheet, but at only one-seventh scale. During these tests, the process flowsheet was refined and optimized. The SVS-III test series focused on confirming the applicability of the Redox Forecasting Model, which was based on the Index of Feed Oxidation (IFO) developed during the Functional and Checkout Testing of Systems (FACTS) and SVS-I tests. Additional goals were to investigate the prototypical feed preparation cycle and to test the new target glass composition. Included in this report are the basis and current designs of the major components of the Scaled Vitrification System and the results of the SVS-III tests. The major subsystems described are the feed preparation and delivery, melter, and off-gas treatment systems. In addition, correlations between the melter's operation and various parameters, including feed rate, cold cap coverage, the oxidation-reduction (redox) state of the glass, melter power, plenum temperature, and airlift analysis, were developed.

A guided wave-based in-process cure monitoring technique for carbon fiber reinforced polymer (CFRP) composites was investigated at NASA Langley Research Center. A key cure transition point (vitrification) was identified and the degree of cure was monitored using metrics such as amplitude and time of arrival (TOA) of guided waves. Using an automated system preliminarily developed in this work, high-temperature piezoelectric transducers were utilized to interrogate a twenty-four ply unidirectional composite panel fabricated from Hexcel® IM7/8552 prepreg during cure. It was shown that the amplitude of the guided wave increased sharply around vitrification and the TOA curve possessed an inverse relationship with degree of cure. The work is a first step in demonstrating the feasibility of transitioning the technique to perform in-process cure monitoring in an autoclave, defect detection during cure, and ultimately a closed-loop process control to maximize composite part quality and consistency.
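The two cure metrics named above, guided-wave amplitude and time of arrival (TOA), can be sketched as simple waveform features. The waveform, sampling interval, and threshold below are synthetic stand-ins, not data from the experiment.

```python
def wave_metrics(samples, dt, threshold):
    """Peak amplitude, and TOA = time of the first sample over threshold."""
    amp = max(abs(s) for s in samples)
    toa = next((i * dt for i, s in enumerate(samples) if abs(s) >= threshold),
               None)
    return amp, toa

# Synthetic digitized waveform: quiet baseline, then an arriving wave packet.
wf = [0.0] * 40 + [0.2, -0.5, 0.9, -1.2, 0.8, -0.4, 0.1]
amp, toa = wave_metrics(wf, dt=1e-6, threshold=0.3)
# Per the abstract's trend, amp rises sharply near vitrification
# while toa falls (inverse relationship with degree of cure).
```

Tracking these two scalars per interrogation is what makes the method cheap enough to run continuously during an autoclave cycle.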

A barrier crucible design which consistently maintains melt stability over long periods of time was successfully tested and used in long growth runs. The pellet feeder for melt replenishment was operated continuously for growth runs of up to 17 hours. The liquid level sensor, a laser/sensor system, performed well and meets the requirements for maintaining liquid level height during growth and melt replenishment. An automated feedback loop connecting the feed mechanism and the liquid level sensing system was designed, constructed, and operated successfully for 3.5 hours, demonstrating the feasibility of semi-automated dendritic web growth. The sensitivity of the cost of sheet to variations in capital equipment cost and recycling dendrites was calculated, and it was shown that these factors have relatively little impact on sheet cost. Dendrites from web which had gone all the way through the solar cell fabrication process, when melted and grown into web, produce crystals which show no degradation in cell efficiency. Material quality remains high, and cells made from web grown at the start, during, and at the end of a run from a replenished melt show comparable efficiencies.

Canberra Industries, Inc. has designed a new truck monitoring system for a facility in Japan. The customer desires to separately quantify the Cs-137 and Cs-134 content of truck cargo entering and leaving a Waste Consolidation Area. The content of the trucks will be some combination of sand, soil, and vegetation with densities ranging from 0.3 g/cc - 1.6 g/cc. The typical weight of the trucks will be approximately 10 tons, but can vary between 4 and 20 tons. The system must be sensitive enough to detect 100 Bq/kg in 10 seconds (with less than 10% relative standard deviation) but still have enough dynamic range to measure 1,000,000 Bq/kg material. The system will be operated in an outdoor environment. Starting from these requirements, Canberra explored all aspects of the counting system in order to provide the customer with the optimized solution. The desire to separately quantify Cs-137 and Cs-134 favors the use of a spectroscopic system as a solution. Using the In Situ Object Counting System (ISOCS) mathematical efficiency calculation tool, we explored various detector types, number, and physical arrangement for maximum performance. Given the choice of detector, the ISOCS software was used to investigate which geometric parameters (fill height, material density, etc.) caused the most fluctuations in the efficiency results. Furthermore, these variations were used to obtain quantitative estimates of the uncertainties associated with the possible physical variations in the truck size, detector positioning, and material composition, density, and fill height. Various shielding options were also explored to ensure that any measured Cs content would be from the truck and not from the surrounding area. The details of the various calculations along with the final design are given. (authors)
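One way the geometry-driven uncertainty components mentioned above (fill height, density, positioning, composition) could be combined is a quadrature sum of independent relative uncertainties, a standard propagation assumption. The component values below are illustrative, not Canberra's estimates.

```python
import math

def combined_rel_uncertainty(components):
    """Quadrature sum of independent relative uncertainties (fractions)."""
    return math.sqrt(sum(u ** 2 for u in components.values()))

# Hypothetical per-component relative uncertainties on detection efficiency.
u = combined_rel_uncertainty({
    "fill_height": 0.06,
    "density": 0.08,
    "detector_position": 0.03,
    "composition": 0.04,
})
# u is the overall relative efficiency uncertainty for the truck measurement
```

Quadrature addition is only valid if the components are independent; correlated effects (e.g. density and fill height for a fixed cargo mass) would need a covariance term.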

A Review on Torrefaction Process and Design of Moving Bed Torrefaction System for Biomass Processing. Jaya Shankar Tumuluru (1), Shahab Sokhansanj (2), and Christopher T. Wright (1). (1) Idaho National Laboratory, Biofuels and Renewable Energy Technologies Department, Idaho Falls, Idaho 83415; (2) Oak Ridge National Laboratory, Bioenergy Resource and Engineering Systems Group, Oak Ridge, TN 37831. Abstract: Torrefaction is currently developing as an important preprocessing step to improve the quality of biomass in terms of physical properties and proximate and ultimate composition. Torrefaction is a slow heating of biomass in an inert or reduced environment to a maximum temperature of 300 °C. Torrefaction can also be defined as a group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in a temperature range of 200-230 °C and 270-280 °C. Thus, the process can also be called mild pyrolysis, as it occurs at the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, which produces a final product with a lower mass but a higher heating value. There is a lack of literature on the design aspects of torrefaction reactors and on design sheets for estimating the dimensions of a torrefier based on capacity. This study includes (a) a detailed review of the torrefaction of biomass in terms of understanding the process, product properties, off-gas compositions, and methods used, and (b) the design of a moving bed torrefier, taking into account basic fundamental heat and mass transfer calculations. Specific objectives include calculating dimensions such as the diameter and height of the moving packed bed torrefier for capacities ranging from 25-1000 kg/hr, designing the heat loads and gas flow rates, and
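The sizing exercise described above can be sketched from first principles: bed hold-up follows from capacity and residence time, bed volume from bulk density, and diameter from an assumed height-to-diameter ratio. All input values and the aspect ratio are illustrative assumptions, not the study's design sheet.

```python
import math

def torrefier_dims(capacity_kg_hr, bulk_density_kg_m3,
                   residence_time_hr, height_to_diameter=2.0):
    """Rough moving-bed sizing under steady plug-flow assumptions."""
    holdup_kg = capacity_kg_hr * residence_time_hr      # mass in the bed
    volume_m3 = holdup_kg / bulk_density_kg_m3          # bed volume
    # V = (pi/4) d^2 h with h = r*d  ->  d = (4V / (pi*r))^(1/3)
    d = (4.0 * volume_m3 / (math.pi * height_to_diameter)) ** (1.0 / 3.0)
    return d, height_to_diameter * d

# Illustrative case at the top of the 25-1000 kg/hr range.
d, h = torrefier_dims(capacity_kg_hr=1000, bulk_density_kg_m3=250,
                      residence_time_hr=0.5)
```

A real design sheet would iterate this with the heat-load and gas-flow calculations, since residence time itself depends on how fast the bed can be heated to torrefaction temperature.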

The Spitzer Telemetry Processing System (SirtfTlmProc) was designed to address objectives of JPL's Multi-mission Image Processing Lab (MIPL) in processing spacecraft telemetry and distributing the resulting data to the science community. To minimize costs and maximize operability, the software design focused on automated error recovery, performance, and information management. The system processes telemetry from the Spitzer spacecraft and delivers Level 0 products to the Spitzer Science Center. SirtfTlmProc is a unique system with automated error notification and recovery, with a real-time continuous service that can go quiescent after periods of inactivity. The software can process 2 GB of telemetry and deliver Level 0 science products to the end user in four hours. It provides analysis tools so the operator can manage the system and troubleshoot problems. It automates telemetry processing in order to reduce staffing costs.

Synthetic aperture radar (SAR) is uniquely suited to help solve the Search and Rescue problem, since it can be utilized day or night and through both dense fog and thick cloud cover. Other papers in this session, and in this session in 1997, describe the various SAR image processing algorithms that are being developed and evaluated within the Search and Rescue Program. All of these approaches to using SAR data require substantial amounts of digital signal processing: for the SAR image formation, and possibly for the subsequent image processing. In recognition of the demanding processing that will be required for an operational Search and Rescue Data Processing System (SARDPS), NASA/Goddard Space Flight Center and NASA/Stennis Space Center are conducting a technology demonstration utilizing SHARC multi-chip modules from Boeing to perform SAR image formation processing.

...taking into account the effects of polycrystalline microstructures, elastic anisotropy of the crystals, and material damage due to microplasticity and... Models of anisotropic crystal elasticity, intragranular microplasticity, and intergranular microfracture have been developed and implemented into the ABAQUS codes... Zhang, K. S., Wu, M. S., and Feng, R. (2005). Simulation of microplasticity-induced deformation in uniaxially strained ceramics by 3-D Voronoi...

In volcanic areas with long repose periods (such as El Hierro), recently installed monitoring networks offer no instrumental record of past eruptions and no experience in handling a volcanic crisis. Both conditions, uncertainty and inexperience, make the communication of hazard more difficult. In fact, in the initial phases of the unrest at El Hierro, the perception of volcanic risk was somewhat distorted, as even relatively low volcanic hazards caused a high political impact. The need for a Volcanic Alert System then became evident. In general, a Volcanic Alert System comprises the monitoring network, the software tools for the analysis of the observables, the management of the Volcanic Activity Level, and the assessment of the threat. The Volcanic Alert System presented here places special emphasis on phenomena associated with moderate eruptions, as well as on volcano-tectonic earthquakes and landslides, which in some cases, as in El Hierro, may be more destructive than an eruption itself. As part of the Volcanic Alert System, we introduce here the Volcanic Activity Level, which continuously applies a routine analysis of monitoring data (particularly seismic and deformation data) to detect data trend changes or monitoring network failures. The data trend changes are quantified according to the Failure Forecast Method (FFM). When data changes and/or malfunctions are detected by an automated watchdog, warnings are automatically issued to the Monitoring Scientific Team. Changes in the data patterns are then translated by the Monitoring Scientific Team into a simple Volcanic Activity Level that is easy to use and understand by the scientists and technicians in charge of the technical management of the unrest. The main features of the Volcanic Activity Level are its objectivity, as it does not depend on expert opinions, which are left to the Scientific Committee, and its capability for early detection of precursors. As a consequence of the El Hierro
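The Failure Forecast Method (FFM) mentioned above can be sketched for its classic linearized case: when the inverse of an observable's rate decays linearly in time, its extrapolated zero crossing forecasts the failure (eruption) time. The synthetic rate series below is constructed so the forecast is exact; real monitoring data would be noisy.

```python
def ffm_forecast(times, rates):
    """Fit 1/rate = a + b*t by least squares; forecast t_f = -a/b."""
    inv = [1.0 / r for r in rates]
    n = len(times)
    mt, mi = sum(times) / n, sum(inv) / n
    b = sum((t - mt) * (y - mi) for t, y in zip(times, inv)) \
        / sum((t - mt) ** 2 for t in times)
    a = mi - b * mt
    return -a / b

# Synthetic accelerating seismicity: rate = 1 / (10 - t), so t_f = 10.
t = [0, 2, 4, 6, 8]
r = [1.0 / (10 - ti) for ti in t]
t_failure = ffm_forecast(t, r)   # ≈ 10
```

Because the fit is a plain least-squares line, it can run unattended inside a watchdog and re-issue the forecast every time a new monitoring sample arrives.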

Regenerative Life Support Systems (RLSS), which maintain human lives by recycling substances essential for living, are comprised of humans, plants, and material circulation systems. The plants supply food to the humans and regenerate water and gases by photosynthesis, while the material circulation systems physicochemically recycle and circulate substances disposed of by humans and plants. RLSS has attracted attention as manned space activities have shifted from short trips to long-term stays at bases such as a space station, a lunar base, and a Mars base. The present typical space base is the International Space Station (ISS), a manned experimental base for prolonged stays, where the RLSS recycles only water and air. To accommodate prolonged and extended manned activity in future space bases, the development of RLSS that implements food production and regeneration of resources at once, using plants, is expected. The configuration of an RLSS should be designed to suit its own mission, for which design requirements for an RLSS with an unprecedented configuration may arise. Accordingly, it is necessary to establish a conceptual design method for generalized RLSS. It is difficult, however, to systematize the design process by analyzing previous designs, because there are only a few ground-experimental facilities, namely CEEF (Closed Ecology Experiment Facilities) in Japan, BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) in the U.S., and BIOS-3 in Russia. Thus, for the above reasons, a conceptual design method that does not rely on previous design examples is required for generalized RLSS. This study formalizes a conceptual design process and develops a conceptual design support tool for RLSS based on this design process.

Adolescence is a period of development in which peer relationships become especially important. A computer-based game (Cyberball) has been used to explore the effects of social exclusion in adolescents and adults. The current functional magnetic resonance imaging (fMRI) study used Cyberball to extend prior work to the cross-sectional study of younger children and adolescents (7 to 17 years), identifying age-related changes in the neural correlates of social exclusion across the important transition from middle childhood into adolescence. Additionally, a control task illustrated the specificity of these age-related changes for social exclusion as distinct from expectancy violation more generally. During exclusion, activation in and functional connectivity between ventrolateral prefrontal cortex and ventral anterior cingulate cortex increased with age. These effects were specific to social exclusion and did not exist for expectancy violation. Our results illustrate developmental changes from middle childhood through adolescence in both affective and regulatory brain regions during social exclusion.

A system and method for monitoring an industrial process and/or industrial data source. The system includes generating time-varying data from industrial data sources; processing the data to obtain time correlation of the data; determining the range of the data; determining learned states of normal operation and using these states to generate expected values; comparing the expected values to current actual values to identify the current state of the process closest to a learned, normal state; generating a set of modeled data; and processing the modeled data to identify a data pattern, generating an alarm upon detecting a deviation from normalcy.
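The learned-state comparison in the claim above can be sketched as a nearest-neighbor check: pick the learned normal state closest to the current readings, treat it as the expected value, and alarm when the deviation exceeds a band. The states, readings, and tolerance are invented for illustration.

```python
def check(current, learned_states, tolerance):
    """Return (expected_state, alarm) for one vector of sensor readings."""
    def dist(a, b):
        # Euclidean distance between two sensor vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    expected = min(learned_states, key=lambda s: dist(s, current))
    return expected, dist(expected, current) > tolerance

# Hypothetical learned normal states: (temperature, flow) pairs.
learned = [(100.0, 5.0), (120.0, 7.0), (80.0, 4.0)]

exp_ok, alarm_ok = check((101.0, 5.1), learned, tolerance=3.0)   # near normal
exp_bad, alarm_bad = check((140.0, 9.0), learned, tolerance=3.0)  # deviant
```

A production system would normalize each sensor channel before computing distances so that one large-magnitude variable does not dominate the comparison.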

The overall project objective was to apply high-throughput experimentation and combinatorial methods, together with novel syntheses, to discover and optimize efficient, practical, and economically sustainable materials for photoelectrochemical production of bulk hydrogen from water. Automated electrochemical synthesis and photoelectrochemical screening systems were designed, constructed, and used to study a variety of new photoelectrocatalytic materials. We evaluated photocatalytic performance in the dark and under illumination, with or without applied bias, in a high-throughput manner, and did detailed evaluation on many materials. Significant attention was given to α-Fe2O3-based semiconductor materials, and thin films with different dopants were synthesized by co-electrodeposition techniques. Approximately 30 dopants, including Al, Zn, Cu, Ni, Co, Cr, Mo, Ti, and Pt, were investigated. Hematite thin films doped with Al, Ti, Pt, Cr, and Mo exhibited significant improvements in efficiency for photoelectrochemical water splitting compared with undoped hematite. In several cases we collaborated with theorists who used density functional theory to help explain performance trends and suggest new materials. The best materials were investigated in detail by X-ray diffraction (XRD), scanning electron microscopy (SEM), ultraviolet-visible spectroscopy (UV-Vis), and X-ray photoelectron spectroscopy (XPS). The photoelectrocatalytic performance of the thin films was evaluated and their incident photon

Areas of concern with respect to processing, storage, and output requirements of a generalized information processing system are considered. Special emphasis is placed on multiple-access systems. Problems of system management and control are discussed, including hierarchies of storage levels. Facsimile, digital, and mass random access storage…

A chemical compatibility study was conducted between SiC filament and the following P/M matrix alloys: Waspaloy, Hastelloy-X, NiCrAlY, Ha-188, S-57, FeCrAlY, and Incoloy 800. None of the couples demonstrated sufficient chemical compatibility to withstand the minimum HIP consolidation temperatures (996 C) or intended application temperature of the composite (982 C). However, Waspaloy, Haynes 188, and Hastelloy-X were the least reactive with SiC of the candidate alloys. Chemical vapor deposited tungsten was shown to be an effective diffusion barrier between the superalloy matrix and SiC filament providing a defect-free coating of sufficient thickness. However, the coating breaks down when the tungsten is converted into intermetallic compounds by interdiffusion with matrix constituents. Waspaloy was demonstrated to be the most effective matrix alloy candidate in contact with the CVD tungsten barrier because of its relatively low growth rate constant of the intermediate compound and the lack of formation of Kirkendall voids at the matrix-barrier interface. Fabrication methods were developed for producing panels of uniaxial and angle ply composites utilizing CVD tungsten coated filament.

We have developed a new target platform to study Laser Plasma Interaction in ignition-relevant conditions at the Omega laser facility (LLE/Rochester) [1]. By shooting an interaction beam along the axis of a gas-filled hohlraum heated by up to 17 kJ of heater beam energy, we were able to create a millimeter-scale underdense uniform plasma at electron temperatures above 3 keV. Extensive Thomson scattering measurements allowed us to benchmark our hydrodynamic simulations performed with HYDRA [1]. As a result of this effort, we can use these simulations with confidence as input parameters for our LPI simulation code pF3d [2]. In this paper, we show that by using accurate hydrodynamic profiles and full three-dimensional simulations, including a realistic modeling of the laser intensity pattern generated by various smoothing options, fluid LPI theory reproduces the SBS thresholds and absolute reflectivity values and the absence of measurable SRS. This good agreement was made possible by the recent increase in computing power routinely available for such simulations.

The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information. The process evaluation module utilizes a personal digital assistant (PDA). The data display module is in communication with the database server and includes a website for viewing collected process data in a desired metrics form; the data display module also provides desired editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module minimizes the requirement for manual input of the collected process data.
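As a rough illustration (not part of the patent), the data flow among the described modules can be sketched with an in-memory SQLite database; the table and column names here are hypothetical:

```python
import sqlite3

# Hypothetical minimal schema: the administration module writes
# observation criteria, the process evaluation module stores collected
# process data against them, and the display module reads both back.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE criteria (id INTEGER PRIMARY KEY, description TEXT)")
conn.execute("CREATE TABLE process_data (id INTEGER PRIMARY KEY,"
             " criteria_id INTEGER REFERENCES criteria(id), value REAL)")

# Administration module: provide an observation criterion.
conn.execute("INSERT INTO criteria (description) VALUES ('cycle time, s')")

# Process evaluation module (e.g., running on a PDA): record data.
conn.execute("INSERT INTO process_data (criteria_id, value) VALUES (1, 41.7)")

# Data display module: join collected data with its criterion.
row = conn.execute(
    "SELECT c.description, p.value FROM process_data p"
    " JOIN criteria c ON c.id = p.criteria_id").fetchone()
print(row)  # ('cycle time, s', 41.7)
```

Because every module talks to the same server-side store, no collected value needs to be re-keyed by hand, which is the connectivity benefit the claim describes.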

Neutron emission profiles are routinely measured in the JT-60U tokamak. Stilbene neutron detectors (SNDs), which combine a stilbene organic crystal scintillation detector (stilbene detector) with an analog neutron-gamma pulse shape discrimination (PSD) circuit, have been used to measure neutron flux efficiently. Although the SND has many advantages as a neutron detector, the maximum count rate is limited to ~1x10^5 counts/s by the dead time of the analog PSD circuit. To overcome this issue, a digital signal processing (DSP) system using a flash ADC has been developed. In this system, anode signals from the photomultiplier of the stilbene detector are fed to the flash ADC and digitized. The PSD between neutrons and gamma rays is then performed in software. The photomultiplier tube is also modified to suppress and correct gain fluctuation of the photomultiplier. The DSP system has been installed in the center channel of the vertical neutron collimator system in JT-60U and applied to measurements of neutron flux in JT-60U experiments. Neutron flux is successfully measured at count rates up to ~1x10^6 counts/s without pile-up effects from detected pulses. The performance of the DSP system as a neutron detector is demonstrated.
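The software PSD step typically uses a charge-comparison (tail-to-total) ratio on the digitized anode pulses; the sketch below uses synthetic two-exponential pulses whose time constants are illustrative assumptions, not JT-60U values:

```python
import math

def psd_ratio(pulse, tail_start):
    """Tail-to-total charge ratio: gamma pulses decay faster than
    neutron pulses in stilbene, so neutrons give a larger ratio."""
    total = sum(pulse)
    tail = sum(pulse[tail_start:])
    return tail / total

def make_pulse(fast_frac, slow_tau, n=64):
    """Synthetic pulse: fast component (tau = 3 samples) plus a slow
    (delayed-light) component with time constant slow_tau."""
    return [fast_frac * math.exp(-t / 3.0)
            + (1 - fast_frac) * math.exp(-t / slow_tau)
            for t in range(n)]

gamma = make_pulse(fast_frac=0.95, slow_tau=25.0)
neutron = make_pulse(fast_frac=0.80, slow_tau=25.0)  # more delayed light

# Neutron recoils produce more slow scintillation light, so the tail
# fraction separates the two particle types.
assert psd_ratio(neutron, 10) > psd_ratio(gamma, 10)
```

In the real system this comparison runs on every digitized pulse, which is what removes the analog circuit's dead-time bottleneck.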

This phase consists of the engineering design, fabrication, assembly, operation, economic analysis, and process support R&D for an Experimental Process System Development Unit (EPSDU). The mechanical bid package was issued and the bid responses are under evaluation. The electrical bid package was likewise issued, but responses are not yet due. The majority of all equipment is on order or has been received at the EPSDU site. The pyrolysis/consolidation process design package was issued. Preparation of the process and instrumentation diagram for the free-space reactor was started. In the area of melting/consolidation, Kayex successfully melted chunk silicon and produced silicon shot. The free-space reactor powder was successfully transported pneumatically from a storage bin to the auger feeder twenty-five feet up and was melted. The fluid-bed PDU has operated successfully at silane feed concentrations up to 21%. Writing of the operating manual has started. Overall, the design phase is nearing completion.

This report describes research on an operations system concept in which equipment would be linked up at Corps level to form weapon systems, which would then be dispatched to divisions. There is provision in the concept for ...

A laser material processing system and method are provided. A further aspect of the present invention employs a laser for micromachining. In another aspect of the present invention, the system uses a hollow waveguide. In another aspect of the present invention, a laser beam pulse is given broad bandwidth for workpiece modification.

... entity's real estate situation and condition for use by customers including (but not limited to) the business entity. Information is processed to ... the score to provide a well-rounded picture of a particular real estate situation. Stratmann discloses a method for assisting an individual in ... identify a potential flaw in the opportunity analysis. These criteria include whether the process is dealing with a real customer, if it is ...

was analyzed and improvements were implemented to the Veeco PVD-AlN prototype system to establish a specification and baseline PVD-AlN films on sapphire; in parallel, the evaluation of PVD AlN on silicon substrates began. In Phase II of the project, a Beta tool based on a scaled-up process module capable of depositing uniform films on batches of 4" or 6" diameter substrates in a production-worthy operation was developed and qualified. In Phase III, the means to increase the throughput of the PVD-AlN system were evaluated, focused primarily on minimizing the impact of the substrate heating and cooling times that dominated the overall cycle time.

The silicon web process takes advantage of natural crystallographic stabilizing forces to grow long, thin single crystal ribbons directly from liquid silicon. The ribbon, or web, is formed by the solidification of a liquid film supported by surface tension between two silicon filaments, called dendrites, which border the edges of the growing strip. The ribbon can be propagated indefinitely by replenishing the liquid silicon as it is transformed to crystal. The dendritic web process has several advantages for achieving low cost, high efficiency solar cells. These advantages are discussed.

X-ray astronomy research is often limited by the size, weight, complexity, and cost of functioning x-ray optics. Micropore optics promise an economical alternative to traditional (e.g., glass or foil) x-ray optics; however, many manufacturing difficulties have prevented micropore optics from being a viable solution. Ezoe et al. introduced microelectromechanical systems (MEMS) micropore optics having curvilinear micropores in 2008. Made by either deep reactive ion etching or x-ray lithography, electroforming, and molding (LIGA), MEMS micropore optics suffer from high micropore sidewall roughness (10-30 nm RMS) which, by current standards, cannot be improved. In this research, a new alternating magnetic-field-assisted finishing process was developed using a mixture of ferrofluid and microscale abrasive slurry. A machine was built, and a set of working process parameters including alternating frequency, abrasive size, and polishing time was selected. A polishing experiment on a LIGA-fabricated MEMS micropore optic was performed, and a reduction in micropore sidewall roughness from 9.3±2.5 nm RMS to 5.7±0.7 nm RMS was measured. An improvement in x-ray reflectance was also seen. This research shows the feasibility and confirms the effects of this new polishing process on MEMS micropore optics.
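Why a few nanometres of roughness matter at x-ray wavelengths can be illustrated with a Debye-Waller-type attenuation factor for specular reflectance; the wavelength and grazing angle below are assumed illustrative values, not parameters of this study:

```python
import math

def reflectance_factor(sigma_nm, wavelength_nm, graze_deg):
    """Debye-Waller-type roughness attenuation of specular x-ray
    reflectance: exp(-(q*sigma)^2) with q = 4*pi*sin(theta)/lambda."""
    q = 4.0 * math.pi * math.sin(math.radians(graze_deg)) / wavelength_nm
    return math.exp(-(q * sigma_nm) ** 2)

# Roughness values from the polishing result; optics geometry assumed.
before = reflectance_factor(9.3, wavelength_nm=1.0, graze_deg=0.5)
after = reflectance_factor(5.7, wavelength_nm=1.0, graze_deg=0.5)

# Reducing sidewall roughness raises the specular reflectance factor.
assert after > before
```

Even this crude model shows a substantial reflectance gain for the measured roughness reduction, consistent with the improvement the experiment observed.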

An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity.
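The standard-deviation approach to determining blend homogeneity can be sketched as a moving-window test on successive NIR readings; the window size and threshold below are illustrative assumptions, not values from the described system:

```python
from statistics import stdev

def homogeneous(signal, window=5, threshold=0.01):
    """Declare the blend homogeneous once the standard deviation of
    the last `window` NIR readings falls below `threshold`."""
    if len(signal) < window:
        return False
    return stdev(signal[-window:]) < threshold

# Early in blending the spectral signal varies; later it settles.
early = [0.42, 0.55, 0.38, 0.61, 0.47]
late = [0.501, 0.499, 0.502, 0.500, 0.498]
assert not homogeneous(early)
assert homogeneous(late)
```

Running such a test in real time on each new spectrum is what lets the SCADA software stop the blender as soon as homogeneity is reached.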

Thirty-five (35) furnace runs were carried out during this quarter, of which 25 produced a total of 120 web crystals. The two main thermal models for the dendritic growth process were completed and are being used to assist the design of the thermal geometry of the web growth apparatus. The first model, a finite element representation of the susceptor and crucible, was refined to give greater precision and resolution in the critical central region of the melt. The second thermal model, which describes the dissipation of the latent heat to generate thickness-velocity data, was completed. Dendritic web samples were fabricated into solar cells using a standard configuration and a standard process for an N(+)-P-P(+) configuration. The detailed engineering design was completed for a new dendritic web growth facility of greater width capability than previous facilities.
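The second thermal model trades ribbon thickness against growth velocity through the dissipation of latent heat. A textbook-style energy balance conveys the idea (a sketch only; the symbols and this simple form are assumptions, not the model's actual formulation):

```latex
% Latent heat released at the growth front per unit ribbon width,
% \rho L t v, must be carried away at the achievable heat-loss rate
% q_{\mathrm{loss}}, giving an inverse thickness--velocity trade-off:
\rho \, L \, t \, v = q_{\mathrm{loss}}
\quad\Longrightarrow\quad
v_{\max} \approx \frac{q_{\mathrm{loss}}}{\rho \, L \, t}
```

where rho is the silicon density, L the latent heat of fusion, t the web thickness, and v the growth velocity: thinner webs can be pulled faster for the same heat dissipation, which is why the model generates thickness-velocity data.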

The Environmental Laboratories Automation Software System, or PALMA (its Spanish abbreviation), was developed by a multidisciplinary team to support the main tasks of heterogeneous air quality control networks. The software process for PALMA development, which can be applied equally well to similar multidisciplinary projects, was (a) well defined, (b) agreed between environmental technicians and IT specialists, (c) based on quality guides, and (d) clearly user-centred. Moreover, it introduces some interesting advantages over classical step-by-step approaches. PALMA is a web-based system that allows 'off-line' and automated telematic data acquisition from distributed immission stations belonging not only to homogeneous but also to heterogeneous air quality control networks. It provides graphic and tabular representations for comprehensive and centralised analysis of acquired data, and covers the daily work associated with such networks: validation of the acquired data, alerts for (periodical) tasks (e.g., analyser verification), downloading of files with environmental information (e.g., dust forecasts), etc. The implementation of PALMA has provided qualitative and quantitative improvements in the work performed by the people in charge of the considered control network.

A zero gravity processing furnace system was designed that will allow acquisition of photographic or other visual information while the sample is being processed. A low temperature (30 to 400 C) test model with a flat specimen heated by quartz-halide lamps was constructed. A high temperature (400 to 1000 C) test model heated by resistance heaters, utilizing a cylindrical specimen and optics, was also built. Each of the test models is discussed in detail. Recommendations are given.

Technical activities are reported in the design of process, facilities, and equipment for producing silicon at a rate and price commensurate with production goals for low-cost solar cell modules. The silane-silicon process has potential for providing high-purity polysilicon on a commercial scale at a price of fourteen dollars per kilogram by 1986 (in 1980 dollars). Commercial process, economic analysis, process support research and development, and quality control are discussed.

This report and set of appendices are a collection of memoranda originally drafted in 2009 for the purpose of providing motivation and the necessary background material to support the definition and integration of engineering and management processes related to technology development. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. As presented herein, the material begins with a survey of open literature perspectives on technology development life cycles, including published data on "what went wrong." The main thrust of the material presents a rational exposé of a structured technology development life cycle that uses the scientific method as a framework, with further rigor added from adapting relevant portions of the systems engineering process. The material concludes with a discussion on the use of multiple measures to assess technology maturity, including consideration of the viewpoint of potential users.

High-throughput systems and processes have typically been targeted for process development and optimization in the bioprocessing industry. For process characterization, bench scale bioreactors have been the system of choice. Due to the need for performing different process conditions for multiple process parameters, the process characterization studies typically span several months and are considered time and resource intensive. In this study, we have shown the application of a high-throughput mini-bioreactor system, viz. the Advanced Microscale Bioreactor (ambr15(TM)), to perform process characterization in less than a month and develop an input control strategy. As a pre-requisite to process characterization, a scale-down model was first developed in the ambr system (15 mL) using statistical multivariate analysis techniques that showed comparability with both manufacturing scale (15,000 L) and bench scale (5 L). Volumetric sparge rates were matched between ambr and manufacturing scale, and the ambr process matched the pCO2 profiles as well as several other process and product quality parameters. The scale-down model was used to perform the process characterization DoE study and product quality results were generated. Upon comparison with DoE data from the bench scale bioreactors, similar effects of process parameters on process yield and product quality were identified between the two systems. We used the ambr data for setting action limits for the critical controlled parameters (CCPs), which were comparable to those from bench scale bioreactor data. In other words, the current work shows that the ambr15(TM) system is capable of replacing the bench scale bioreactor system for routine process development and process characterization.
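Matching volumetric sparge rates between scales simply holds gas flow per working volume (vvm) constant; a minimal sketch with illustrative numbers (the flow and volume values are assumptions, not from the study):

```python
def matched_sparge_rate(q_large_lpm, v_large_l, v_small_l):
    """Scale down by constant volumetric sparge rate (vvm):
    gas flow divided by working volume is held equal across scales."""
    vvm = q_large_lpm / v_large_l
    return vvm * v_small_l

# e.g., 750 L/min of gas into a 15,000 L bioreactor is 0.05 vvm,
# so a 15 mL (0.015 L) ambr vessel would receive 0.05 * 0.015 L/min.
q_small = matched_sparge_rate(750.0, 15000.0, 0.015)
assert abs(q_small - 0.00075) < 1e-12
```

Holding vvm constant is one common way to preserve gas-transfer-driven quantities such as the pCO2 profile across a 10^6-fold volume change.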

Beam Instrumentation Development System (BIDS) is a collection of common support libraries and modules developed during a series of Low-Level Radio Frequency (LLRF) control and timing/synchronization projects. BIDS includes a collection of Hardware Description Language (HDL) libraries and software libraries. The BIDS can be used for the development of any FPGA-based system, such as LLRF controllers. HDL code in this library is generic and supports common Digital Signal Processing (DSP) functions, FPGA-specific drivers (high-speed serial link wrappers, clock generation, etc.), ADC/DAC drivers, Ethernet MAC implementation, etc.

A complete description of the development and initial evaluation of the Strapdown Inertial Reference Unit (SIRU) system is reported. The system development documentation covers the system mechanization with the analytic formulation for the fault detection and isolation processing structure; the hardware redundancy design and the individual modularity features; the computational structure and facilities; and the initial subsystem evaluation results.

Disclosed is a single-chamber ultra-high vacuum processing system for the production of hermetically sealed quartz resonators, wherein electrode metallization and sealing are carried out along with cleaning and bake-out without any air exposure between the processing steps. The system includes a common vacuum chamber in which is located a rotatable wheel-like member adapted to move a plurality of individual component sets of a flat-pack resonator unit past discretely located processing stations in the chamber, whereupon electrode deposition takes place, followed by the placement of ceramic covers over a frame containing a resonator element, and then a sealing stage where a pair of hydraulic rams including heating elements effect a metallized bonding of the covers to the frame.

PETC has implemented a number of advanced combustion research projects that will lead to the establishment of a broad, commercially acceptable engineering data base for the advancement of coal as the fuel of choice for boilers, furnaces, and process heaters. Vortec Corporation's Coal-Fired Combustion System for Industrial Process Heating Applications has been selected for Phase III development under contract DE-AC22-91PC91161. This advanced combustion system research program is for the development of innovative coal-fired process heaters which can be used for high-temperature melting, smelting, recycling, and refining processes. The process heater concepts to be developed are based on advanced glass melting and ore smelting furnaces developed and patented by Vortec Corporation. The process heater systems to be developed have multiple-use applications; however, the Phase III research effort is being focused on the development of a process heater system to be used for producing glass frits and wool fiber from boiler and incinerator ashes. The primary objective of the Phase III project is to develop and integrate all the system components, from fuel through total system controls, and then test the complete system in order to evaluate its potential marketability. The economic evaluation of commercial-scale CMS processes has begun. In order to accurately estimate the cost of the primary process vessels, preliminary designs for 25, 50, and 100 ton/day systems have been started under Task 1. These data will serve as input for the life cycle cost analysis performed as part of the techno-economic evaluations. The economic evaluations of commercial CMS systems will be an integral part of the commercialization plan.

Optimizing performance on work activities and processes requires metrics of performance that management can monitor and analyze in order to support further improvements in efficiency, effectiveness, safety, reliability, and cost. Information systems are therefore required to assist management in making timely, informed decisions regarding these work processes and activities. Such information systems for Space Shuttle maintenance and servicing do not currently exist. The work to be presented details a system which incorporates various automated and intelligent processes and analysis tools to capture, organize, and analyze work-process-related data in order to make the decisions necessary to meet KSC organizational goals. The advantages and disadvantages of design alternatives for the development of such a system will be discussed, including technologies which would need to be designed, prototyped, and evaluated.

This patent describes an improvement in a computer-controlled processing system for lumber production. It comprises: a computer; a sequence of processing stations for processing a log segment, including an excess-material-removing station for generating opposed flat side surfaces on the log segment, the flat side surfaces being determined by the computer to become sides of boards to be severed from the log segment; a profiling station for forming profiled edges above and below the flat side surfaces to become the side edges of the boards to be severed from the log segment; and a severing station for severing the boards from the log segment; together with a conveyance means establishing a path of conveyance and having continuous control of the log segment while conveying it along the path and through the above-defined sequence of processing stations.

The objective of the SYNODOS collaborative project was to develop a generic IT solution, combining a medical terminology server, a semantic analyser and a knowledge base. The goal of the project was to generate meaningful epidemiological data for various medical domains from the textual content of French medical records. In the context of this project, we built a care pathway oriented conceptual model and corresponding annotation method to develop and evaluate an expert system's knowledge base. The annotation method is based on a semi-automatic process, using a software application (MedIndex). This application exchanges with a cross-lingual multi-termino-ontology portal. The annotator selects the most appropriate medical code proposed for the medical concept in question by the multi-termino-ontology portal and temporally labels the medical concept according to the course of the medical event. This choice of conceptual model and annotation method aims to create a generic database of facts for the secondary use of electronic health records data.

EXPLORING THE DYNAMICS AND MODELING NATIONAL BUDGET AS A SUPPLY CHAIN SYSTEM: A PROPOSAL FOR REENGINEERING THE BUDGETING PROCESS AND FOR DEVELOPING A MANAGEMENT FLIGHT ... The modeled system covers beer production and distribution and consists of four entities: Retailer, Wholesaler, Distributor, and Factory (R, W, D, and F).

Integrated water system modeling is a feasible approach to understanding severe water crises in the world and promoting the implementation of integrated river basin management. In this study, a classic hydrological model (the time variant gain model: TVGM) was extended to an integrated water system model by coupling multiple water-related processes in hydrology, biogeochemistry, water quality, and ecology, and considering the interference of human activities. A parameter analysis tool, which included sensitivity analysis, autocalibration, and model performance evaluation, was developed to improve modeling efficiency. To demonstrate the model's performance, the Shaying River catchment, which is the largest highly regulated and heavily polluted tributary of the Huai River basin in China, was selected as the case study area. The model performance was evaluated on the key water-related components including runoff, water quality, diffuse pollution load (or nonpoint sources), and crop yield. Results showed that our proposed model simulated most components reasonably well. The simulated daily runoff at most regulated and less-regulated stations matched well with the observations. The average correlation coefficient and Nash-Sutcliffe efficiency were 0.85 and 0.70, respectively. Both the simulated low and high flows at most stations were improved when the dam regulation was considered. The daily ammonium-nitrogen (NH4-N) concentration was also well captured, with an average correlation coefficient of 0.67. Furthermore, the diffuse source load of NH4-N and the corn yield were reasonably simulated at the administrative region scale. This integrated water system model is expected to improve simulation performance as more model functionalities are added, and to provide a scientific basis for implementation in integrated river basin management.
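The runoff skill scores quoted above can be computed as follows; a minimal stdlib sketch of the Nash-Sutcliffe efficiency and the correlation coefficient (the observed/simulated series here are invented examples):

```python
import math

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the
    model is no better than the mean of the observations."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def pearson_r(x, y):
    """Pearson correlation coefficient between two series."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

obs = [1.0, 3.0, 2.0, 5.0, 4.0]   # e.g., daily runoff observations
sim = [1.2, 2.7, 2.1, 4.6, 4.4]   # model output for the same days
assert nse(obs, sim) > 0.9
assert pearson_r(obs, sim) > 0.9
```

These are the two metrics behind the reported 0.85 (correlation) and 0.70 (NSE) averages for daily runoff.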

... specification successfully executed by application processes; the technique draws from the techniques of abstract data ... [citation fragment: J. and Farber, David A., "A Model for Verification of Data Security in Operating Systems," Communications of the ACM, Vol. 21, No. 9, September 1978] ... the data being communicated exists in cleartext form as it is passed from one encrypted link to the next by the switch. Therefore the software in the ...

A system to fabricate precise, high-aspect-ratio polymeric molds by a photolithographic process is described. The molds are used for producing micro-scale parts from engineering materials by the LIGA process. The invention is a developer system for developing a PMMA photoresist having exposed patterns comprising features with both very small sizes and very high aspect ratios. The developer system of the present invention comprises a developer tank, an intermediate rinse tank, and a final rinse tank, each tank having a source of high-frequency sonic agitation, temperature control, and continuous filtration. It has been found that by moving a patterned wafer through a specific sequence of developer/rinse solutions, where an intermediate rinse solution completes development of those portions of the exposed resist left undeveloped after the development solution, by agitating the solutions with a source of high-frequency sonic vibration, and by adjusting and closely controlling the temperatures and continuously filtering and recirculating these solutions, it is possible to maintain the kinetic dissolution of the exposed PMMA polymer as the rate-limiting step.

A microcontroller-based system of oceanographic instrumentation providing a comprehensive set of measurements relevant to sediment transport processes has been developed. Analysis of the data provided by the system yields time series of vertical profiles of mean sediment size and concentration, horizontal profiles of bedform geometry, and single-location measurements of flow velocity, pressure, turbidity, and water temperature. Details of the system architecture, including capabilities provided by both hardware and software contained within the system, are given. An improved method for the determination of suspended sediment size and concentration from the system's acoustic backscatter intensity measurements is presented. By retaining the size dependence throughout the derivation for an explicit solution for concentration, a new explicit solution to the acoustic backscatter equation results. This new concentration solution improves the technique for determining median sediment size by incorporating sediment attenuation in the calculation. Because this new technique relies on the minimization of the variance in concentration as determined by different frequency transducers, the previous technique of pairing transducers of different frequencies is replaced by a technique making use of any number of different frequency transducers. The new size/concentration inversion technique is tested using both simulated and laboratory data. Numerical precision is shown to be the only source of error with the use of simulated data. Laboratory tests result in less than 20% error in the determination of both concentration and size over a range of nearly one meter. Finally, suspended sediment concentration data from the nearshore region obtained from an experiment performed in Duck, North Carolina, are examined to find the relevant time scales of sediment suspension. In this location, low frequency forcing mechanisms are as significant in suspending sediment as the incident-band waves.
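The variance-minimisation inversion described above can be sketched as follows; the form function here is a toy stand-in (quadratic in size, linear in frequency), chosen only so the idea is self-contained and testable, not the acoustic form function the system actually uses:

```python
def form_function(size, freq):
    """Toy size- and frequency-dependent backscatter form function."""
    return size ** 2 * freq + size

def best_size(intensities, freqs, candidates):
    """For each candidate sediment size, invert every transducer's
    intensity to a concentration; pick the size whose implied
    concentrations agree best (smallest variance) across frequencies."""
    def spread(size):
        concs = [i / form_function(size, f)
                 for i, f in zip(intensities, freqs)]
        mean = sum(concs) / len(concs)
        return sum((c - mean) ** 2 for c in concs)
    return min(candidates, key=spread)

# Synthetic data generated from a "true" size 0.3 and concentration 2.0:
freqs = [1.0, 2.5, 5.0]
intensities = [2.0 * form_function(0.3, f) for f in freqs]
assert best_size(intensities, freqs, [0.1, 0.2, 0.3, 0.4, 0.5]) == 0.3
```

Because agreement is tested across the whole set of transducers at once, any number of frequencies can be used, replacing the earlier pairwise approach.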

The Network Command Processing System (NCPS) developed for the National Aeronautics and Space Administration (NASA) Ground Network (GN) stations is a spacecraft command system utilizing a MULTIBUS I/68030 microprocessor. This system was developed and implemented at ground stations worldwide to provide a Project Operations Control Center (POCC) with command capability for support of spacecraft operations such as the LANDSAT, Shuttle, Tracking and Data Relay Satellite, and Nimbus-7. The NCPS consolidates multiple modulation schemes for supporting various manned/unmanned orbital platforms. The NCPS interacts with the POCC and a local operator to process configuration requests, generate modulated uplink sequences, and inform users of the ground command link status. This paper presents the system functional description, hardware description, and the software design.

In the early days of general computer systems for science data processing, staff members working on NASA's data systems would most often be hired as mathematicians. Computer engineering roles were very often filled by those with electrical engineering degrees. Today, the Goddard Space Flight Center has special position descriptions for data scientists or, as they are more commonly called, data systems engineers. These staff members are required to have very diverse skills, hence the need for a generalized position description. There is always a need for data systems engineers to develop, maintain, and operate the complex data systems for Earth and space science missions. Today's data systems engineers, however, are not just mathematicians; they are computer programmers, GIS experts, software engineers, visualization experts, and more. They represent many different degree fields. To put together distributed systems like the NASA Earth Observing Data and Information System (EOSDIS), staff are required from many different fields. Sometimes the skilled professional is not available and must be developed in-house. This paper will address the various skills and jobs for data systems engineers at NASA. Further, it explores how to develop staff to become data scientists.

This paper briefly recounts the recent history of process reengineering at the NASA Jet Propulsion Laboratory, with a focus on the engineering processes. The JPL process structure is described and the process development activities of the past several years are outlined. The main focus of the paper is on the current process structure, the emphasis on the flight project life cycle, the governance approach that led to the Flight Project Practices, and the remaining effort to capture process knowledge at the detail level of the work group.

Operators of critical processes, such as nuclear power production, must contend with highly complex systems, procedures, and regulations. Developing human-machine interfaces (HMIs) that better support operators is a high priority for ensuring the safe and reliable operation of critical processes. Human factors engineering (HFE) provides a rich and mature set of tools for evaluating the performance of HMIs, but the set of tools for developing and designing HMIs is still in its infancy. Here we propose that Microsoft Windows Presentation Foundation (WPF) is well suited for many roles in the research and development of HMIs for process control.

This work plan defines the manner in which the Waste Receiving and Processing Facility, Module I Process Area, will be maintained under development control status. This status permits resolution of identified design discrepancies, control system changes, as-building of equipment, and modifications to increase process operability and maintainability to proceed as parallel efforts. This work plan maintains configuration control as these efforts are undertaken. The task will end with system testing and reissue of field-verified design drawings.

A fully automatic ultrasound image processing system that can determine the needle entry site for epidural anesthesia (EA) in real time is presented in this paper. Neither specialist knowledge on the part of the anesthetists nor additional hardware is required to operate the system, which first directs the anesthetists to the desired insertion region in the longitudinal view, i.e., lumbar level L3-L4, and then locates the ideal puncture site by instructing the anesthetists to rotate and slightly adjust the position of the ultrasound probe. To implement these functions, modules including image processing, panorama stitching, feature extraction/selection, template matching, and support vector machine (SVM) classification are incorporated in the system. Additionally, a user-friendly graphical user interface (GUI), which displays the processing results and guides anesthetists intuitively, is designed to conceal the intricacy of the algorithms. The feasibility and effectiveness of the proposed system have been evaluated through a set of real-time tests on 53 volunteers from a local hospital.
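The template-matching module can be illustrated with a one-dimensional normalised cross-correlation; the profiles below are invented stand-ins for an intensity profile around the target anatomy, not data from the study:

```python
import math

def ncc(patch, template):
    """Normalised cross-correlation between an image patch and a
    template; values near 1 indicate a strong match."""
    mp = sum(patch) / len(patch)
    mt = sum(template) / len(template)
    num = sum((p - mp) * (t - mt) for p, t in zip(patch, template))
    dp = math.sqrt(sum((p - mp) ** 2 for p in patch))
    dt = math.sqrt(sum((t - mt) ** 2 for t in template))
    return num / (dp * dt)

template = [0, 2, 9, 2, 0]   # idealised bright-feature profile
good = [1, 3, 10, 3, 1]      # same shape, offset brightness
poor = [5, 5, 5, 1, 5]       # unrelated shape
assert ncc(good, template) > 0.95
assert ncc(good, template) > ncc(poor, template)
```

In the full system, scores like these are combined with extracted features and fed to the SVM classifier to decide whether the current probe position shows the L3-L4 target.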

The pursuit of a high uranium density research reactor fuel plate has led to monolithic fuel, which possesses the greatest possible uranium density in the fuel region. Process developments in fabrication include friction stir welding tool geometry and cooling improvements and a reduction in the time required to complete the transient liquid phase bonding process. Annealing effects on the microstructures of the U-10Mo foil and friction stir welded aluminum 6061 cladding are also examined.

The goal of the Mars Aqueous Processing System (MAPS) is to establish a flexible process that generates multiple products that are useful for human habitation. Selectively extracting useful components into an aqueous solution, and then sequentially recovering individual constituents, yields a suite of refined or semi-refined products. Similarities in the bulk composition (although not necessarily of the mineralogy) of Martian and Lunar soils potentially make MAPS widely applicable. Similar process steps can be conducted on both Mars and Lunar soils while tailoring the reaction extents and recoveries to the specifics of each location. The MAPS closed-loop process selectively extracts, and then recovers, constituents from soils using acids and bases. The emphasis on Mars involves the production of useful materials such as iron, silica, alumina, magnesia, and concrete with recovery of oxygen as a byproduct. On the Moon, similar chemistry is applied with emphasis on oxygen production. This innovation has been demonstrated to produce high-grade materials, such as metallic iron, aluminum oxide, magnesium oxide, and calcium oxide, from lunar and Martian soil simulants. Most of the target products exhibited purities of 80 to 90 percent or more, allowing direct use for many potential applications. Up to one-fourth of the feed soil mass was converted to metal, metal oxide, and oxygen products. The soil residue contained elevated silica content, allowing for potential additional refining and extraction for recovery of materials needed for photovoltaic, semiconductor, and glass applications. A high-grade iron oxide concentrate derived from lunar soil simulant was used to produce a metallic iron component using a novel, combined hydrogen reduction/metal sintering technique. The part was subsequently machined and found to be structurally sound. The behavior of the lunar-simulant-derived iron product was very similar to that produced using the same methods on a Michigan iron

Business process or “soft” costs account for well over 50% of the installed price of residential photovoltaic (PV) systems in the United States, so understanding these costs is crucial for identifying PV cost-reduction opportunities. Among these costs are those imposed by city-level permitting processes, which may add both expense and time to the PV development process. Building on previous research, this study evaluates the effect of city-level permitting processes on the installed price of residential PV systems and on the time required to develop and install those systems. The study uses a unique dataset from the U.S. Department of Energy’s Rooftop Solar Challenge Program, which includes city-level permitting process “scores,” plus data from the California Solar Initiative and the U.S. Census. Econometric methods are used to quantify the price and development-time effects of city-level permitting processes on more than 3,000 PV installations across 44 California cities in 2011. Results indicate that city-level permitting processes have a substantial and statistically significant effect on average installation prices and project development times. The results suggest that cities with the most favorable (i.e., highest-scoring) permitting practices can reduce average residential PV prices by $0.27–$0.77/W (4%–12% of median PV prices in California) compared with cities with the most onerous (i.e., lowest-scoring) permitting practices, depending on the regression model used. Though the empirical models for development times are less robust, results suggest that the most streamlined permitting practices may shorten development times by around 24 days on average (25% of the median development time). These findings illustrate the potential price and development-time benefits of streamlining local permitting procedures for PV systems.
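The econometric approach can be sketched with ordinary least squares on synthetic data. The variables, coefficients, and sample size below are illustrative assumptions, not the study's actual dataset or estimates.

```python
import numpy as np

# Synthetic stand-in for the study's data: installed price ($/W) modeled as
# a function of a city permitting score plus a system-size control.
rng = np.random.default_rng(0)
n = 300
score = rng.uniform(0, 100, n)      # hypothetical permitting-process score
size_kw = rng.uniform(2, 10, n)     # hypothetical system-size control
price = 6.5 - 0.005 * score - 0.05 * size_kw + rng.normal(0, 0.2, n)

# OLS via least squares: price = b0 + b1 * score + b2 * size_kw
X = np.column_stack([np.ones(n), score, size_kw])
beta, *_ = np.linalg.lstsq(X, price, rcond=None)
print(beta[1])  # estimated price effect per score point (negative: higher
                # scores, i.e. friendlier permitting, mean lower prices)
```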

The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

Prepared for use by staff in development workshops at Burlington County College (BCC), in New Jersey, this handbook offers college-wide guidelines for improving the quality of service provided to internal and external customers, and reviews key elements of BCC's Customer Service System (CSS), a computerized method of recording and following-up on…

The population of the study consisted of 15 high school industrial arts students, 10 freshman and sophomore college students, and 10 adults. A polysensory, self-pacing instructional system was developed which included (1) pretests and post tests, (2) a general instruction book, (3) equipment to practice arc welding, (4) programed instruction…

General for Auditing ATTN: Audit Suggestions/13F25-04 4800 Mark Center Drive Alexandria, VA 22350-1500 Acronyms and Abbreviations BPR ...certified to obligate funds in excess of $1 million without determining whether an appropriate business process reengineering ( BPR ) was completed...Section 1072 of Public Law 111-84 requires the DON CMO to determine whether an adequate BPR occurred for ongoing defense business system modernizations to

Explains the use of experimental poetry forms for decoding practice to help develop reading skills in elementary school students. The use of computers for word processing capabilities is discussed; seven forms of poetry are described; and results are examined in terms of motivation and the development of literacy. (four references) (LRW)

The theme of this paper is that governmental resources will not permit the simultaneous development of all viable lunar materials processing (LMP) candidates. Choices will inevitably be made, based on the results of system integration trade studies comparing candidates to each other for high-leverage applications. It is in the best long-term interest of the LMP community to lead the selection process itself, quickly and practically. The paper is in five parts. The first part explains what systems integration means and why the specialized field of LMP needs this activity now. The second part defines the integration context for LMP -- by outlining potential lunar base functions, their interrelationships and constraints. The third part establishes perspective for prioritizing the development of LMP methods, by estimating realistic scope, scale, and timing of lunar operations. The fourth part describes the use of one type of analytical tool for gaining understanding of system interactions: the input/output model. A simple example solved with linear algebra is used to illustrate. The fifth and closing part identifies specific steps needed to refine the current ability to study lunar base system integration. Research specialists have a crucial role to play now in providing the data upon which this refinement process must be based.
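The input/output model the paper describes can be sketched as a small Leontief system solved with linear algebra. The two-product technology matrix and demands below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Hypothetical two-product lunar-base economy: A[i, j] is the amount of
# product i consumed internally to produce one unit of product j.
A = np.array([[0.1, 0.3],
              [0.2, 0.1]])
d = np.array([100.0, 50.0])   # final (external) demand for each product

# Gross output x must cover both internal consumption and final demand:
# x = A @ x + d, hence x = (I - A)^-1 d.
x = np.linalg.solve(np.eye(2) - A, d)
print(x)  # gross outputs required to meet the demand
```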

Histopathology image processing, analysis and computer-aided diagnosis have been shown as effective assisting tools towards reliable and intra-/inter-observer invariant decisions in traditional pathology. Especially for cancer patients, decisions need to be as accurate as possible in order to increase the probability of optimal treatment planning. In this study, we propose a new image collection library (HICL-Histology Image Collection Library) comprising 3831 histological images of three different diseases, for fostering research in histopathology image processing, analysis and computer-aided diagnosis. Raw data comprised 93, 116 and 55 cases of brain, breast and laryngeal cancer respectively collected from the archives of the University Hospital of Patras, Greece. The 3831 images were generated from the most representative regions of the pathology, specified by an experienced histopathologist. The HICL Image Collection is free for access under an academic license at http://medisp.bme.teiath.gr/hicl/ . Potential exploitations of the proposed library may span a broad spectrum, such as in image processing to improve visualization, in segmentation for nuclei detection, in decision support systems for second opinion consultations, in statistical analysis for investigation of potential correlations between clinical annotations and imaging findings and, generally, in fostering research on histopathology image processing and analysis. To the best of our knowledge, the HICL constitutes the first attempt towards creation of a reference image collection library in the field of traditional histopathology, publicly and freely available to the scientific community.

The development of course materials at the Open Learning Institute, British Columbia, Canada, is examined from two perspectives: as an industrial process and as a social process. The public institute provides distance education through paced home-study courses. The course team model used at the Institute is a system approach. Course development…

Collection of computer programs constitutes software tool for simulation of medium- to large-scale software-development projects. Simulation necessarily includes easily identifiable and more readily quantifiable characteristics like costs, times, and numbers of errors. Mathematical model incorporating these and other factors of dynamics of software-development process implemented in the Software Life Cycle Simulator (SLICS) computer program. Simulates dynamics of software-development process. In combination with input and output expert software systems and knowledge-based management software system, develops information for use in managing large software-development project. Intended to aid managers in planning, managing, and controlling software-development processes by reducing uncertainties in budgets, required personnel, and schedules.
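In the spirit of SLICS, a system-dynamics style simulation reduces to integrating stocks and flows over time. The parameters below (development rate, error injection and detection rates) are entirely hypothetical.

```python
# Minimal system-dynamics sketch: two stocks (work remaining, undetected
# errors) integrated with Euler steps. Errors are injected in proportion
# to work completed and removed in proportion to the current error stock.
dt = 0.25                      # simulation step (months)
work_remaining = 1000.0        # tasks
errors = 0.0                   # undetected errors (stock)
dev_rate = 40.0                # tasks completed per month (flow)
inject_per_task = 0.1          # errors injected per task completed
detect_frac = 0.3              # fraction of error stock found per month

t = 0.0
while work_remaining > 0:
    done = min(dev_rate * dt, work_remaining)
    work_remaining -= done
    errors += inject_per_task * done - detect_frac * errors * dt
    t += dt
print(round(t, 2), round(errors, 1))  # → 25.0 13.3
```

The residual error stock approaches the equilibrium where injection balances detection, which is the kind of schedule/quality trade-off such a simulator exposes.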

The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as by human behavior, because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics, which utilizes continuous simulation. Each has unique strengths, and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
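The discrete-event side can be sketched with an event queue: a fixed pool of developers serves a queue of tasks, and the event list tracks when each developer frees up. The task counts and durations below are invented for illustration.

```python
import heapq
import random

# Hypothetical parameters: 40 tasks of varying duration, 4 developers.
random.seed(1)
n_tasks, n_devs = 40, 4
durations = [random.uniform(2, 6) for _ in range(n_tasks)]  # days per task

free_at = [0.0] * n_devs      # next-free times, kept as a min-heap
heapq.heapify(free_at)
for d in durations:
    start = heapq.heappop(free_at)   # event: earliest developer becomes free
    heapq.heappush(free_at, start + d)
finish = max(free_at)
print(round(finish, 1))  # simulated project duration in days
```

Coupling this with a continuous model (e.g. productivity modulated by workforce state) is what the combined approach described above adds.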

Digital system processes spacecraft television pictures by converting images sensed on a photostorage vidicon to pulses which can be transmitted by telemetry. This system can be applied in the processing of medical X-ray photographs and in electron microscopy.

Problems associated with the development of the measurement of air pollution from satellites (MAPS) experiment program are addressed. The primary thrust of this research was the utilization of the MAPS experiment data in three application areas: low altitude aircraft flights (one to six km); mid altitude aircraft flights (eight to 12 km); and orbiting space platforms. Extensive research work in four major areas of data management was the framework for implementation of the MAPS experiment technique. These areas are: (1) data acquisition; (2) data processing, analysis and interpretation algorithms; (3) data display techniques; and (4) information production.

The topics are presented in viewgraph form and include the following: International Organization for Standards (ISO); International Electrotechnical Committee (IEC); ISO/IEC Joint Technical Committee 1 (JTC-1); U.S. interface to JTC-1; ANSI; national organizations; U.S. standards development processes; national and international standards developing organizations; regional organizations; and X3 information processing systems.

A model of program development and evaluation was developed at Genesee Community College, utilizing a system theory/process of deductive and inductive reasoning to ensure coherence and continuity within the program. The model links activities to specific measurable outcomes. Evaluation checks and feedback are built in at various levels so that…

A new concept for processing spacecraft type wastes has been evaluated. The feasibility of reacting various waste materials with steam at temperatures of 538 - 760 C in both a continuous and batch reactor with residence times from 3 to 60 seconds has been established. Essentially complete gasification is achieved. Product gases are primarily hydrogen, carbon dioxide, methane, and carbon monoxide. Water soluble synthetic wastes are readily processed in a continuous tubular reactor at concentrations up to 20 weight percent. The batch reactor is able to process wet and dry wastes at steam to waste weight ratios from 2 to 20. Feces, urine, and synthetic wastes have been successfully processed in the batch reactor.

This advanced combustion system research program is for the development of innovative coal-fired process heaters which can be used for high temperature melting, smelting and waste vitrification processes. The process heater systems to be developed have multiple use applications; however, the Phase III research effort is being focused on the development of a process heater system to be used for producing value added vitrified glass products from boiler/incinerator ashes and industrial wastes. The primary objective of the Phase III project is to develop and integrate all the system components, from fuel through total system, controls, and then test the complete system in order to evaluate its potential marketability. The past quarter began with a two-day test performed in January to determine the cause of pulsations in the batch feed system observed during pilot-scale testing of surrogate TSCA incinerator ash performed in December of 1993. Two different batch feedstocks were used during this test: flyash and cullet. The cause of the pulsations was traced to a worn part in the feeder located at the bottom of the batch feed tank. The problem was corrected by replacing the worn part with the corresponding part on the existing coal feed tank. A new feeder for the existing coal tank, which had previously been ordered as part of the new coal handling system, was procured and installed. The data from the pilot-scale tests performed on surrogate TSCA incinerator ash during December of 1993 was collected and analyzed. All of the glass produced during the test passed both the Toxicity Characteristic Leaching Procedure (TCLP) and the Product Consistency Test (PCT) by approximately two orders of magnitude.

Stanford University's use of a risk assessment methodology to improve the management of systems development projects is discussed. After examining the concepts of hazard, peril, and risk as they relate to the system development process, three ways to assess risk are covered: size, structure, and technology. The overall objective for Stanford…

Improving Process Heating System Performance: A Sourcebook for Industry is a development of the U.S. Department of Energy (DOE) Advanced Manufacturing Office (AMO) and the Industrial Heating Equipment Association (IHEA). The AMO and IHEA undertook this project as part of a series of sourcebook publications developed by AMO on energy-consuming industrial systems and opportunities to improve performance. Other topics in this series include compressed air systems, pumping systems, fan systems, steam systems, and motors and drives.

capable of processing massive amounts of data in both real-time and post-flight settings, and to produce software segments that can be integrated to assist in the task as well. The selected software must be able to: (1) process massive amounts of data (up to 4GB) at a speed useful in real-time settings (small fractions of a second); (2) process data in post-flight settings to allow test reproduction or further data analysis; (3) produce, or make easier to produce, three-dimensional plots/graphs to make the data accessible to flight test engineers; and (4) be customized to allow users to use their own processing formulas or functions and display the data in formats they prefer. Several software programs were evaluated to determine their utility in completing the research objectives. These programs include: OriginLab, Graphis, 3D Grapher, Visualization Sciences Group (VSG) Avizo Wind, Interactive Analysis and Display System (IADS), SigmaPlot, and MATLAB.

This review is the first attempt to integrate the available data on all types of phase equilibria (solubility, extraction and sorption) in systems containing light fullerenes (C60 and C70). In the case of solubility diagrams, the following types of phase equilibria are considered: individual fullerene (C60 or C70)-solvent under polythermal and polybaric conditions; C60-C70-solvent, individual fullerene-solvent(1)-solvent(2), as well as multicomponent systems comprising a single fullerene or an industrial mixture of fullerenes and vegetable oils, animal fats or essential oils under polythermal conditions. All published experimental data on the extraction equilibria in C60-C70-liquid phase(1)-liquid phase(2) systems are described systematically and the sorption characteristics of various materials towards light fullerenes are estimated. The possibility of application of these experimental data for development of pre-chromatographic and chromatographic methods for separation of fullerene mixtures and application of fullerenes as nanomodifiers are described. The bibliography includes 87 references.

Most manufacturing processes require physical pointwise positioning of the components or tools from one location to another. Typical mechanical systems utilize either stop-and-go or fixed feed-rate procession to accomplish the task. The first approach achieves positional accuracy but prolongs overall time and increases wear on the mechanical system. The second approach sustains the throughput but compromises positional accuracy. A computer firmware approach has been developed to optimize this pointwise mechanism by utilizing programmable interrupt controls to synchronize engineering processes 'on the fly'. This principle has been implemented in an eddy current imaging system to demonstrate the improvement. Software programs were developed that enable a mechanical controller card to transmit interrupts to a system controller as a trigger signal to initiate an eddy current data acquisition routine. The advantages are: (1) optimized manufacturing processes, (2) increased throughput of the system, (3) improved positional accuracy, and (4) reduced wear and tear on the mechanical system.
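The "on the fly" triggering idea can be sketched as a callback that fires at each programmed position while motion continues rather than dwelling. The scan function, step size, and acquisition stand-in below are hypothetical, not the actual controller firmware.

```python
# Minimal sketch of interrupt-style triggering: the motion loop invokes a
# trigger callback at each programmed position instead of stopping, so
# data acquisition overlaps continuous motion.
def scan(positions, step, on_trigger):
    """Move continuously through positions, invoking on_trigger at each."""
    x = 0.0
    acquired = []
    for target in positions:
        while x < target:
            x = min(x + step, target)   # continuous motion, no dwell
        acquired.append(on_trigger(x))  # trigger fires at the position
    return acquired

# The lambda stands in for the eddy current data acquisition routine.
readings = scan([1.0, 2.0, 3.0], step=0.1, on_trigger=lambda x: x * 10)
print(readings)  # → [10.0, 20.0, 30.0]
```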

Multipurpose vacuum processing systems are cost-effective: they occupy less space, serve multiple functions under one roof, and are user-friendly. A multipurpose vacuum induction system was designed, fabricated and installed in a record time of 6 months at NFTDC Hyderabad. It was designed to function as: a) vacuum induction melting/refining of oxygen-free electronic copper/pure metals; b) vacuum induction melting furnace for ferrous materials; c) vacuum induction melting for non-ferrous materials; d) large vacuum heat treatment chamber with resistance heating (by detachable coil and hot zone); e) bottom-discharge vacuum induction melting system for non-ferrous materials; f) induction heat treatment system; and g) directional solidification/investment casting. It contains provision for future capacity addition. The attachments required to manufacture multiple shaped castings and continuous rod casting can be added whenever the need arises. Present capacity was decided on the requirement for a 10-year development path; presently the system has 1.2 ton liquid-copper handling capacity, with provision for capacity addition up to 2 tons in the future. Provision is made to carry out the capacity addition quickly in easy steps. For easy operational maintenance and troubleshooting, the design was made in easily detachable sections. The high-vacuum system is also detachable, independent and easily movable, which is a first of its kind in the country. Detailed design parameters, advantages and development history are presented in this paper.

The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model for the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software Life-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.

A Flash-ADC data acquisition (DAQ) system has been developed for the drift chamber array designed for the External-Target-Experiment at the Cooling Storage Ring at the Heavy Ion Research Facility, Lanzhou. The simplified readout electronics system has been developed using the Flash-ADC modules; the whole waveform in the sampling window is obtained, from which the time and energy information can be deduced with offline processing. A digital filter algorithm has been developed to discriminate between noise and the useful signal. With the digital filtering process, the signal to noise ratio (SNR) is increased and a better time and energy resolution can be obtained. Supported by National Basic Research Program of China (973) (2015CB856903 and 2014CB845405), partly by National Science Foundation of China (U1332207 and 11375094), and by Tsinghua University Initiative Scientific Research Program
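The offline filtering and time/energy extraction can be sketched as follows: smooth the sampled trace with a moving-average digital filter, then take the time from a threshold crossing and the energy from the pulse integral above baseline. The trace, kernel width, and thresholds are illustrative assumptions, not the experiment's actual parameters.

```python
import numpy as np

# Synthetic Flash-ADC trace: Gaussian baseline noise plus a triangular pulse.
rng = np.random.default_rng(42)
n = 200
trace = rng.normal(0, 2, n)                # baseline noise
trace[80:100] += np.linspace(0, 50, 20)    # rising edge of the pulse
trace[100:140] += np.linspace(50, 0, 40)   # falling edge

# Moving-average digital filter improves SNR before feature extraction.
kernel = np.ones(5) / 5
smooth = np.convolve(trace, kernel, mode="same")

baseline = smooth[:50].mean()
threshold = baseline + 10
t0 = int(np.argmax(smooth > threshold))    # leading-edge time, in samples
energy = (smooth[t0:150] - baseline).sum() # pulse integral above baseline
print(t0, round(energy, 1))
```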

UCLA-ENG-7854), August 1978. Popek, G.J. and D.A. Farber. "A Model for Verification of Data Security in Operating Systems," Communications of the ACM...via covert channels is a data security problem. On the other hand, the unauthorized use of the system to communicate is a confinement problem. The...point here is that if there exists a communication channel, it may be accidentally used by a user and information leaked. For a system to be data secure

Eos (Extensible object-oriented system) is a powerful application for image processing of electron micrographs. Normally, Eos offers only character user interfaces (CUI) under operating systems (OS) such as OS X or Linux, which is not user-friendly: users of Eos need to be expert at image processing of electron micrographs and also have some knowledge of computer science. However, not everyone who needs Eos is an expert with a CUI. We therefore extended Eos into an OS-independent web system with graphical user interfaces (GUI) by integrating a web browser. The advantage of using a web browser is not only extending Eos with a GUI, but also enabling Eos to work in a distributed computational environment. Using Ajax (Asynchronous JavaScript and XML) technology, we implemented a more comfortable user interface in the web browser. Eos has more than 400 commands related to image processing for electron microscopy, and the usage of each command differs from the others. Since the beginning of development, Eos has managed its user interfaces using interface definition files called "OptionControlFile", written in CSV (Comma-Separated Value) format; i.e., each command has an "OptionControlFile" that records the information needed to generate its interface and usage. The developed GUI system, called "Zephyr" (Zone for Easy Processing of HYpermedia Resources), also accesses "OptionControlFile" and produces a web user interface automatically, because this mechanism is mature and convenient. The basic client-side actions were implemented properly and support auto-generation of web forms, with functions for execution, image preview, and file upload to a web server. Thus the system can execute Eos commands with the unique options of each command and perform image analysis. Problems remain with the image file format for visualization and the workspace for analysis: the image file format information is useful to check whether the input/output file is correct and we also
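The auto-generation idea can be sketched by turning CSV option rows into HTML form fields. The three-column layout below is a hypothetical simplification, not Eos's actual OptionControlFile format.

```python
import csv
import io

# Hypothetical option-control rows: flag, human-readable label, value kind.
option_control_file = """\
-i,input image file,filename
-o,output image file,filename
-m,mode,integer
"""

# Each CSV row becomes one HTML form field, as Zephyr does per command.
rows = list(csv.reader(io.StringIO(option_control_file)))
fields = []
for flag, label, kind in rows:
    input_type = "file" if kind == "filename" else "number"
    fields.append(
        f'<label>{label} ({flag}) '
        f'<input name="{flag}" type="{input_type}"></label>'
    )
form = "<form>\n" + "\n".join(fields) + "\n</form>"
print(form)
```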

The mechanisms by which infants and children process pain should be viewed within the context of a developing sensory nervous system. The study of the neurophysiological properties and connectivity of sensory neurons in the developing spinal cord dorsal horn of the intact postnatal rat has shed light on the way in which the newborn central nervous system analyzes cutaneous innocuous and noxious stimuli. The receptive field properties and evoked activity of newborn dorsal horn cells to single repetitive and persistent innocuous and noxious inputs are developmentally regulated and reflect the maturation of excitatory transmission within the spinal cord. These changes will have an important influence on pain processing in the postnatal period.

This final report contains the results of a bench-scale program to continue the development of the TRW proprietary Gravimelt Process for chemically cleaning coal. This project consisted of two major efforts, a laboratory study aimed at identifying parameters which would influence the operation of a bench unit for desulfurization and demineralization of coal and the design, construction and operation of two types of continuous plug-flow type bench-scale fused caustic leachers. This present bench scale project has demonstrated modes for the continuous operation of fused caustic leaching of coal at coal throughputs of 1 to 5 pounds per hour. The remaining process unit operations of leach solutions regeneration and coal washing and filtration should be tested at bench scale together with fused caustic leaching of coal to demonstrate the complete Gravimelt Process. 22 figures, 11 tables.

A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

Silicon material research in the Republic of China (ROC) parallels its development in the electronic industry. A brief outline of the historical development in ROC silicon material research is given. Emphasis is placed on the recent Silane Project managed by the National Science Council, ROC, including project objectives, task forces, and recent accomplishments. An introduction is also given to industrialization of the key technologies developed in this project.

systems have been reliable. NTA/Lillestrom and Hamar will take a new initiative medio April regarding 04C. The line will be remeasured and if a certain...estimate of the ambient noise level at the site of the FINESA array, ground motion spectra were calculated for four time intervals. Two intervals were

Two digital video data compression systems directly applicable to the Space Shuttle TV Communication System were described: (1) For the uplink, a low rate monochrome data compressor is used. The compression is achieved by using a motion detection technique in the Hadamard domain. To transform the variable source rate into a fixed rate, an adaptive rate buffer is provided. (2) For the downlink, a color data compressor is considered. The compression is achieved first by intra-color transformation of the original signal vector, into a vector which has lower information entropy. Then two-dimensional data compression techniques are applied to the Hadamard transformed components of this last vector. Mathematical models and data reliability analyses were also provided for the above video data compression techniques transmitted over a channel encoded Gaussian channel. It was shown that substantial gains can be achieved by the combination of video source and channel coding.
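The Hadamard-domain approach relies on the transform concentrating a smooth block's energy into a few coefficients, which makes coefficient-domain compression effective. A short sketch makes this concrete; the 4×4 block and orthonormal scaling here are illustrative.

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of the n x n Hadamard matrix (n a power of two)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

n = 4
H = hadamard(n)
block = np.ones((n, n)) * 5.0   # a flat (smooth) image block
coeffs = H @ block @ H.T / n    # orthonormal scaling: (H/sqrt(n)) X (H/sqrt(n))^T

# All the block's energy lands in the DC coefficient; the rest are zero,
# so only one value needs to be transmitted.
print(coeffs[0, 0], np.abs(coeffs[1:, 1:]).max())  # → 20.0 0.0
```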

The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) has been under development at JPL for the past four years. During this time, a dedicated ground data-processing system has been designed and implemented to store and process the large amounts of data expected. This paper reviews the objectives of this ground data-processing system and describes the hardware. An outline of the data flow through the system is given, and the software and incorporated algorithms developed specifically for the systematic processing of AVIRIS data are described.

Under a previous contract with Morgantown Energy Technology Center (METC), Department of Energy (DOE) Contract No. DE-AC21-84MC21108, UCC Research Corporation (UCCRC) built and tested a 1500 lb/day Mild Gasification Process Development Unit (MGU). The MGU, as tested under the previous contract, is shown in Figure 1. Testing completed under the previous contract showed that good quality hydrocarbon liquids and good quality char can be produced in the MGU. However, the MGU is not optimized. The primary objectives of the current project are to optimize the MGU and determine the suitability of char for several commercial applications. The program consists of four tasks: Task 1, Test Plan; Task 2, Optimization of Mild Gasification Process; Task 3, Evaluation of Char and Char/Coal Blends as a Boiler/Blast Furnace Fuel; and Task 4, Analysis of Data and Preparation of Final Report. Task 1 has been completed while work continued on Task 2.

The coconut processing industry generates a significant amount of liquid waste. New technologies targeting the treatment of industrial effluents have emerged, including advanced oxidation processes, the Fenton reaction, and electrochemical processes, which produce strong oxidizing species to remove organic matter. In this study we combined the Fenton reaction and electrochemical process to treat wastewater generated by the coconut industry. We prepared a synthetic wastewater consisting of a mixture of coconut milk and water and assessed how the Fenton reagents' concentration, the cathode material, the current density, and the implementation of associated technologies affect its treatment. Electrochemical treatment followed by the Fenton reaction diminished turbidity and chemical oxygen demand (COD) by 85 and 95%, respectively. The Fenton reaction followed by the electrochemical process reduced turbidity and COD by 93 and 85%, respectively. Therefore, a combination of the Fenton and electrochemical technologies can effectively treat the effluent from the coconut processing industry.
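The benefit of running two treatment stages in series can be illustrated with a quick calculation. This is a generic sketch, not the study's data: the per-stage fractions below are hypothetical, and the compounding rule simply assumes each stage removes a fixed fraction of the load it receives.

```python
# Sketch only: hypothetical per-stage removal fractions, assuming each
# stage removes a fixed fraction of the contaminant load it receives.
def overall_removal(stage_fractions):
    remaining = 1.0
    for r in stage_fractions:
        remaining *= (1.0 - r)
    return 1.0 - remaining

# Two hypothetical stages: 60% removal, then 87.5% of what remains.
print(round(overall_removal([0.60, 0.875]), 2))  # 0.95
```

In series, removal compounds multiplicatively, which is why the ordering of the Fenton and electrochemical stages can shift the balance between turbidity and COD removal without changing this simple compounding arithmetic.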

This report discusses test campaign GCT4 of the Kellogg Brown & Root, Inc. (KBR) transport reactor train with a Siemens Westinghouse Power Corporation (Siemens Westinghouse) particle filter system at the Power Systems Development Facility (PSDF) located in Wilsonville, Alabama. The transport reactor is an advanced circulating fluidized-bed reactor designed to operate as either a combustor or a gasifier using one of two possible particulate control devices (PCDs). The transport reactor was operated as a pressurized gasifier during GCT4. GCT4 was planned as a 250-hour test run to continue characterization of the transport reactor using a blend of several Powder River Basin (PRB) coals and Bucyrus limestone from Ohio. The primary test objectives were: Operational Stability--Characterize reactor loop and PCD operations with short-term tests by varying coal-feed rate, air/coal ratio, riser velocity, solids-circulation rate, system pressure, and air distribution. Secondary objectives included the following: Reactor Operations--Study the devolatilization and tar cracking effects from transient conditions during transition from start-up burner to coal. Evaluate the effect of process operations on heat release, heat transfer, and accelerated fuel particle heat-up rates. Study the effect of changes in reactor conditions on transient temperature profiles, pressure balance, and product gas composition. Effects of Reactor Conditions on Synthesis Gas Composition--Evaluate the effect of air distribution, steam/coal ratio, solids-circulation rate, and reactor temperature on CO/CO2 ratio, synthesis gas Lower Heating Value (LHV), carbon conversion, and cold and hot gas efficiencies. Research Triangle Institute (RTI) Direct Sulfur Recovery Process (DSRP) Testing--Provide syngas in support of the DSRP commissioning. Loop Seal Operations--Optimize loop seal operations and investigate increases to the previously achieved maximum solids-circulation rate.

Real-time monitoring of batch processes is attracting increasing attention, as it can ensure safety and provide products of consistent quality. The design of a simulation system for batch-process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical nonlinear, dynamic, multi-stage batch production process, is taken as the research object, and a visual, human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system provides an effective platform for research on batch-process fault diagnosis.
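As a toy illustration of the kind of check such a fault-diagnosis platform supports (entirely hypothetical signal, setpoint, and threshold, not the paper's system), one can flag samples of a monitored batch variable that drift outside a fixed band around the setpoint:

```python
import random

# Toy sketch (not the paper's system): flag a fault when a monitored batch
# variable drifts beyond a fixed band around its nominal setpoint.
def detect_faults(samples, setpoint, tolerance):
    return [i for i, x in enumerate(samples) if abs(x - setpoint) > tolerance]

random.seed(0)
# Simulated temperature trace around a 25.0 degC setpoint; a fault is
# injected at index 30, far outside the normal measurement noise.
trace = [25.0 + random.gauss(0, 0.1) for _ in range(50)]
trace[30] += 2.0
print(detect_faults(trace, setpoint=25.0, tolerance=1.0))  # [30]
```

Real batch-process monitoring uses multivariate, stage-aware statistics rather than a single fixed band, but the flag-and-locate pattern is the same.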

For large organizations, updating instructional programs presents a challenge to keep abreast of constantly changing business processes and policies. Each time a process or policy changes, significant resources are required to locate and modify the training materials that convey the new content. Moreover, without the ability to track learning…

Photovoltaic manufacturing is an emerging industry that promises a carbon-free, nearly limitless source of energy for our nation. However, the high-temperature manufacturing processes used for conventional silicon-based photovoltaics are extremely energy-intensive and expensive. This high cost imposes a critical barrier to the widespread implementation of photovoltaic technology. Argonne National Laboratory and its partners recently invented new methods for manufacturing nanostructured photovoltaic devices that allow dramatic savings in materials, process energy, and cost. These methods are based on atomic layer deposition, a thin film synthesis technique that has been commercialized for the mass production of semiconductor microelectronics. The goal of this project was to develop these low-cost fabrication methods for the high efficiency production of nanostructured photovoltaics, and to demonstrate these methods in solar cell manufacturing. We achieved this goal in two ways: 1) we demonstrated the benefits of these coatings in the laboratory by scaling-up the fabrication of low-cost dye sensitized solar cells; 2) we used our coating technology to reduce the manufacturing cost of solar cells under development by our industrial partners.

This project met the objective to further the development of an integrated multi-contaminant removal process in which H2S, NH3, HCl and heavy metals including Hg, As, Se and Cd present in the coal-derived syngas can be removed to specified levels in a single/integrated process step. The process supports the mission and goals of the Department of Energy's Gasification Technologies Program, namely to enhance the performance of gasification systems, thus enabling U.S. industry to improve the competitiveness of gasification-based processes. The gasification program will reduce equipment costs, improve process environmental performance, and increase process reliability and flexibility. Two sulfur conversion concepts were tested in the laboratory under this project, i.e., the solvent-based, high-pressure University of California Sulfur Recovery Process High Pressure (UCSRP-HP) and the catalytic-based, direct oxidation (DO) section of the CrystaSulf-DO process. Each process required a polishing unit to meet the ultra-clean sulfur content goals of <50 ppbv (parts per billion by volume) as may be necessary for fuel cells or chemical production applications. UCSRP-HP was also tested for the removal of trace, non-sulfur contaminants, including ammonia, hydrogen chloride, and heavy metals. A bench-scale unit was commissioned and limited testing was performed with simulated syngas. Aspen Plus®-based computer simulation models were prepared and the economics of the UCSRP-HP and CrystaSulf-DO processes were evaluated for a nominal 500 MWe, coal-based, IGCC power plant with carbon capture. This report covers the progress on the UCSRP-HP technology development and the CrystaSulf-DO technology.

A parallel image detection and image processing system for scanning transmission electron microscopy was developed using a multidetector array consisting of a multianode photomultiplier tube arranged in an 8 x 8 square array. The system enables 64 images to be taken simultaneously from different scattering directions with a scanning time of 2.6 s. Using the 64 images, phase- and amplitude-contrast images of gold particles on an amorphous carbon thin film could be separately reconstructed by applying respective 8-shaped bandpass Fourier filters to each image and multiplying by the phase- and amplitude-reconstructing factors.
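The reconstruction step relies on bandpass filtering in Fourier space. As a simplified NumPy sketch (an annular mask stands in for the paper's 8-shaped filters; the radii and image are illustrative):

```python
import numpy as np

def annular_bandpass(image, r_in, r_out):
    """Apply an annular (ring-shaped) bandpass filter in Fourier space.
    Spatial frequencies with radius in [r_in, r_out) cycles/pixel are kept."""
    f = np.fft.fft2(image)
    fy = np.fft.fftfreq(image.shape[0])[:, None]
    fx = np.fft.fftfreq(image.shape[1])[None, :]
    radius = np.hypot(fy, fx)
    mask = (radius >= r_in) & (radius < r_out)
    return np.real(np.fft.ifft2(f * mask))

img = np.random.rand(64, 64)       # stand-in for one detector image
filtered = annular_bandpass(img, 0.05, 0.2)
print(filtered.shape)  # (64, 64)
```

A faithful implementation would use a direction-selective (8-shaped) mask per detector, since each of the 64 detectors samples a different scattering direction.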

A three-phase research and development program has resulted in the development and commercialization of a Cyclone Melting System (CMS™), capable of being fueled by pulverized coal, natural gas, and other solid, gaseous, or liquid fuels, for the vitrification of industrial wastes. The Phase 3 research effort focused on the development of a process heater system to be used for producing value-added glass products from the vitrification of boiler/incinerator ashes and industrial wastes. The primary objective of the Phase 3 project was to develop and integrate all the system components, from fuel through total system controls, and then test the complete system in order to evaluate its potential for successful commercialization. The demonstration test consisted of one test run with a duration of 105 hours, approximately one-half (46 hours) performed with coal as the primary fuel source (70% to 100%), the other half with natural gas. Approximately 50 hours of melting operation were performed, vitrifying approximately 50,000 lbs of a coal-fired utility boiler flyash/dolomite mixture and producing a fully-reacted vitrified product. Appendix A contains 89 figures presenting the data from the demonstration tests undertaken under Phase 3.

As the original magnet designer and manufacturer of ORNL's 9T, 5-inch ID bore magnet, American Magnetics Inc. (AMI) has collaborated with ORNL's Materials Processing Group, and this partnership has been instrumental in the development of our unique thermo-magnetic facilities and expertise. Consequently, AMI and ORNL have realized that the commercial implementation of the High Magnetic Field Processing (HMFP) technology will require the evolution of robust, automated superconducting (SC) magnet systems that will be cost-effective and easy to operate in an industrial environment. The goal of this project and CRADA is to significantly expedite the timeline for implementing this revolutionary and pervasive cross-cutting technology for future US-produced industrial components. The successful completion of this project is anticipated to significantly assist in the timely commercialization and licensing of our HMFP intellectual property for a broad spectrum of industries, and to open up a new market for AMI. One notable outcome of this project is that the ThermoMagnetic Processing Technology won a prestigious 2009 R&D 100 Award, which acknowledges and recognizes our TMP Technology as one of the top 100 innovative US technologies in 2009. By successfully establishing the design requirements for a commercial-scale magnetic processing system, this project effort has accomplished a key first step in facilitating the building and demonstration of a superconducting magnetic processing coil, enabling the transition of the High Magnetic Field Processing Technology beyond a laboratory novelty into a commercially viable and industrially scalable manufacturing technology.

This paper reports a highly effective method for the mass production of large-area plastic optical films with a microlens array pattern based on a continuous roll-to-roll film extrusion and roller embossing process. In this study, a thin steel mold with a micro-circular hole array pattern is fabricated by photolithography and a wet chemical etching process. The thin steel mold was then wrapped onto a metal cylinder to form an embossing roller mold. During the roll-to-roll process operation, a thermoplastic raw material (polycarbonate grains) was put into the barrel of the plastic extruder with a flat T-die. Then, the molten polymer film was extruded and immediately pressed against the surface of the embossing roller mold. Under the proper processing conditions, the molten polymer will only partially fill the micro-circular holes of the mold and, due to surface tension, form a convex lens surface. A continuous plastic optical film with a microlens array pattern was obtained. Experiments were carried out to investigate the effect of plastic microlens formation on the roll-to-roll process. Finally, the geometrical and optical properties of the fabricated plastic optical film were measured and proved satisfactory. This technique shows great potential for the mass production of large-area plastic optical films with a microlens array pattern.
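The geometry of the partially filled hole admits a quick back-of-the-envelope estimate of the resulting lens optics. The dimensions and refractive index below are assumed, not measurements from the paper: the spherical-cap relation gives the radius of curvature from the base radius and sag, and the thin-lens formula gives the plano-convex focal length.

```python
# Back-of-the-envelope sketch; dimensions and n are assumed, not measured.
def lens_radius_of_curvature(base_radius_um, sag_um):
    # Spherical-cap relation: R = (a^2 + h^2) / (2h)
    a, h = base_radius_um, sag_um
    return (a**2 + h**2) / (2.0 * h)

def focal_length_um(base_radius_um, sag_um, n=1.585):
    # Thin plano-convex lens: f = R / (n - 1); n ~ 1.585 for polycarbonate
    return lens_radius_of_curvature(base_radius_um, sag_um) / (n - 1.0)

# Hypothetical microlens: 50 um base radius, 10 um sag height
print(round(lens_radius_of_curvature(50.0, 10.0), 1))  # 130.0
print(round(focal_length_um(50.0, 10.0), 1))           # 222.2
```

Such an estimate shows why controlling the degree of hole filling (and hence the sag) is the key process knob for tuning the microlens focal length.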

This paper presents several brief papers about the process of systemic change. These are: (1) Step-Up-To-Excellence: A Protocol for Navigating Whole-System Change in School Districts by Francis M. Duffy; (2) The Guidance System for Transforming Education by Charles M. Reigeluth; (3) The Schlechty Center For Leadership In School Reform by Monica…

In 2011, significant progress was made in developing and deploying technologies to remove, transport, and interim store remote-handled sludge from the 105-K West Fuel Storage Basin on the Hanford Site in south-central Washington State. The sludge in the 105-K West Basin is an accumulation of degraded spent nuclear fuel and other debris that collected during long-term underwater storage of the spent fuel. In 2010, an innovative, remotely operated retrieval system was used to successfully retrieve over 99.7% of the radioactive sludge from 10 submerged temporary storage containers in the K West Basin. In 2011, a full-scale prototype facility was completed for use in technology development, design qualification testing, and operator training on systems used to retrieve, transport, and store highly radioactive K Basin sludge. In this facility, three separate systems for characterizing, retrieving, pretreating, and processing remote-handled sludge were developed. Two of these systems were successfully deployed in 2011. One of these systems was used to pretreat knockout pot sludge as part of the 105-K West Basin cleanup. Knockout pot sludge contains pieces of degraded uranium fuel ranging in size from 600 µm to 6350 µm mixed with pieces of inert material, such as aluminum wire and graphite, in the same size range. The 2011 pretreatment campaign successfully removed most of the inert material from the sludge stream and significantly reduced the remaining volume of knockout pot product material. Removing the inert material significantly minimized the waste stream and reduced costs by reducing the number of transportation and storage containers. Removing the inert material also improved worker safety by reducing the number of remote-handled shipments. Also in 2011, technology development and final design were completed on the system to remove knockout pot material from the basin and transport the material to an onsite facility for interim storage. This system is

Airborne, Maritime and Fixed Station Joint Tactical Radio System Acquisition Research: Creating Synergy for Informed Change. Program...perspective to cost estimation and control, and may be enriched by enhancing with system engineering activities that are also focused on similar areas ...mission and requirements definition, functional analysis, alternative synthesis, and evaluation, trade-off, and selection. This methodology is

NASA's Advanced Exploration Systems (AES) Life Support System (LSS) Project is chartered with developing advanced life support systems that will enable NASA human exploration beyond low Earth orbit (LEO). The goal of AES is to increase the affordability of long-duration life support missions, and to reduce the risk associated with integrating and infusing new enabling technologies required to ensure mission success. Because of the robust nature of distillation systems, the AES LSS Project is pursuing development of the Cascade Distillation Subsystem (CDS) as part of its technology portfolio. Currently, the system is being developed into a flight forward Generation 2.0 design.

Open source is a still-unexploited opportunity for healthcare organizations and technology providers to answer a growing demand for innovation and to combine economic benefits with a new way of managing hospital information systems. This chapter will present the case of the web enterprise clinical portal developed in Italy by Niguarda Hospital in Milan with the support of Fondazione Politecnico di Milano, to enable a paperless environment for clinical and administrative activities in the ward. This also represents a rare case of open source technology and reuse in the healthcare sector, as the system's porting is now taking place at Besta Neurological Institute in Milan. This institute is customizing the portal to feed researchers with structured clinical data collected in its portal's patient records, so that they can be analyzed, e.g., through business intelligence tools. Both organizational and clinical advantages are investigated, from process monitoring, to semantic data structuring, to recognition of common patterns in care processes.

The purpose of this deconstructive case study was to conduct a Foucauldian power/knowledge analysis constructed from the perceptions of three teachers at an intermediate school in South Texas regarding the role of the teacher evaluation process and its influence on instructional practices. Using Foucault's (1977a) work on power/knowledge, of…

Cleaning and sanitation in food processing facilities is a critical step in reducing the risk of transfer of pathogenic organisms to food consumed by the public. Current methods to check the effectiveness of sanitation procedures rely on visual observation and sub-sampling tests such as ATP biolumin...

The Launch Processing System represents Kennedy Space Center's role in providing a major integrated hardware and software system for the test, checkout, and launch of a new space vehicle. Past programs considered the active flight-vehicle-to-ground interfaces as part of the flight systems, and therefore the related ground system was provided by the Development Center. The major steps taken to transform the Launch Processing System from a concept to reality, culminating in the successful launches of the Shuttle Program's Space Transportation System, are addressed.

the results of the SECD Process Model Task. The SECD Process Model is a system acquisition and development model that emphasizes System Engineering...activities over the entire system lifecycle. The Process Model is a graphical representation of the System Engineering Lifecycle activities, agents, flows...feedbacks, and work products. This interactive Process Model provides a multi-dimensional view of government acquisition and contractor development

Many schools and school systems have been deliberately working towards full implementation of Assessment for Learning for more than a decade, yet success has been elusive. Using a leader's implementation of Assessment for Learning in one school as an illustration, this article examines eight positional leaders' experiences as they implemented both…

The purpose of the Observatory Software System (OSW) is to integrate all software and hardware components of the Thirty Meter Telescope (TMT) to enable observations and data capture; thus it is a complex software system that is defined by four principal software subsystems: Common Software (CSW), Executive Software (ESW), Data Management System (DMS) and Science Operations Support System (SOSS), all of which have interdependencies with the observatory control systems and data acquisition systems. Therefore, the software development process and plan must consider dependencies to other subsystems, manage architecture, interfaces and design, manage software scope and complexity, and standardize and optimize use of resources and tools. Additionally, the TMT Observatory Software will largely be developed in India through TMT's workshare relationship with the India TMT Coordination Centre (ITCC) and use of Indian software industry vendors, which adds complexity and challenges to the software development process, communication and coordination of activities and priorities as well as measuring performance and managing quality and risk. The software project management challenge for the TMT OSW is thus a multi-faceted technical, managerial, communications and interpersonal relations challenge. The approach TMT is using to manage this multifaceted challenge is a combination of establishing an effective geographically distributed software team (Integrated Product Team) with strong project management and technical leadership provided by the TMT Project Office (PO) and the ITCC partner to manage plans, process, performance, risk and quality, and to facilitate effective communications; establishing an effective cross-functional software management team composed of stakeholders, OSW leadership and ITCC leadership to manage dependencies and software release plans, technical complexities and change to approved interfaces, architecture, design and tool set, and to facilitate

Some technologies, developed recently in Europe, combine several processes to separate and reuse materials from solid waste. These plants have in common, generally, that they are reasonably small, have a composting component for the organic portion, and often have a refuse-derived fuel component for combustible waste. Many European communities also have very effective drop-off center programs for recyclables such as bottles and cans. By maintaining the integrity of several different fractions of the waste, there is less to landfill and less to burn. The importance of these hybrid systems is that they introduce in one plant an approach that encompasses the key concept of today's solid waste planning: recover as much as possible and landfill as little as possible. The plants also introduce various risks, particularly of finding secure markets. There are a number of companies offering various combinations of materials recovery, composting, and waste combustion. Four examples are included: multiple materials recovery and refuse-derived fuel production in Eden Prairie, Minnesota; multiple materials recovery, composting and refuse-derived fuel production in Perugia, Italy; composting, refuse-derived fuel, and gasification in Tolmezzo, Italy; and a front-end system on a mass burning waste-to-energy plant in Neuchatel, Switzerland.

An environment-friendly decentralized wastewater treatment process comprising an activated sludge process (ASP) and wetland vegetation, named the vegetation-activated sludge process (V-ASP), was developed for decentralized wastewater treatment. The long-term experimental results showed that the vegetation sequencing batch reactor (V-SBR) process had consistently stable, higher removal efficiencies of organic substances and nutrients from domestic wastewater compared with a traditional sequencing batch reactor (SBR). The vegetation allocated to the V-SBR system could not only remove nutrients through vegetation transpiration but also provide a large surface area for enhanced microorganism activity. This high vegetation transpiration ratio enhanced nutrient removal from wastewater mainly through flux enhancement, accelerated oxygen and substrate transport, and stimulated vegetation respiration. A mathematical model based on ASM2d was successfully established by incorporating the specific function of vegetation to simulate system performance. The simulation results on the influence of operational parameters on V-ASP treatment effectiveness demonstrated that V-SBR had a high resistance to seasonal temperature fluctuations and influent loading shocks.

Progress is reported on the engineering design, fabrication, assembly, operation, economic analysis, and process support R&D for an Experimental Process System Development Unit (EPSDU) for producing semiconductor-grade silicon using the silane-to-silicon process. Most of the process-related equipment has been ordered and is being fabricated. Equipment and building foundations have been completed at the EPSDU site, and all the steel was erected for the gantry. The switchgear/control building and the melter building will be completed during the next quarter. The data collection system design is progressing. Various computer programs are being written which will be used to convert electrical, pneumatic, and other raw signals into engineering values. The free-space reactor development work was completed with a final 12-hour run in which the free-space reactor PDU ran flawlessly. Also, the quality control method development task was completed. Slim rods were grown from seed silicon rods for subsequent float-zone operation and impurity characterization. An excellent-quality epitaxial film was deposited on a silicon wafer. Both undoped and doped films were deposited, and the resistivity of the films has been measured. (WHK)

Researchers at the Timken Company conceived a project to develop an on-line instrument for wall thickness measurement of steel seamless mechanical tubing based on laser ultrasonic technology. The instrument, which has been installed and tested at a piercing mill, provides data on tube eccentricity and concentricity. Such measurements permit fine-tuning of manufacturing processes to eliminate excess material in the tube wall and therefore provide a more precisely dimensioned product for their customers. The resulting process energy savings are substantial, as is the lowered environmental burden. The expected savings are $85.8 million per year in seamless mechanical tube piercing alone. Applied across the industry, this measurement has the potential of reducing energy consumption by 6 × 10^12 BTU per year, greenhouse gas emissions by 0.3 million metric tons carbon equivalent per year, and toxic waste by 0.255 million pounds per year. The principal technical contributors to the project were the Timken Company, Industrial Materials Institute (IMI, a contractor to Timken), and Oak Ridge National Laboratory (ORNL). Timken provided mill access as well as process and metallurgical understanding. Timken researchers had previously developed the fundamental ultrasonic analysis methods on which this project is based. IMI developed and fabricated the laser ultrasonic generation and receiver systems. ORNL developed Bayesian- and wavelet-based real-time signal processing and spread-spectrum wireless communication, and explored feature extraction and pattern recognition methods. The resulting instrument has successfully measured production tubes at one of Timken's piercing mills. This report concentrates on ORNL's contribution through the CRADA mechanism. The three components of ORNL's contribution were met with mixed success. The real-time signal-processing task accomplished its goal of improvement in detecting time-of-flight information with a minimum of false data. The signal processing
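The core wall-thickness arithmetic behind such a pulse-echo ultrasonic measurement is simple; the sketch below uses assumed, generic values (a textbook sound speed for steel, not the instrument's actual calibration):

```python
# Pulse-echo sketch with assumed parameters (not the instrument's calibration).
# The pulse crosses the wall twice, so thickness = speed * time_of_flight / 2.
def wall_thickness_mm(time_of_flight_us, sound_speed_mm_per_us=5.9):
    # ~5.9 mm/us is a typical longitudinal sound speed in steel
    return sound_speed_mm_per_us * time_of_flight_us / 2.0

print(round(wall_thickness_mm(3.4), 2))  # 10.03
```

Eccentricity and concentricity then follow from comparing such thickness readings at several circumferential positions around the tube.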

ASPEN (Advanced System for Process Engineering) is a state-of-the-art process simulator and economic evaluation package designed for use in engineering fossil energy conversion processes. ASPEN can represent multiphase streams including solids, and handle complex substances such as coal. The system can perform steady-state material and energy balances, determine equipment size and cost, and carry out preliminary economic evaluations. It is supported by a comprehensive physical property system for computation of major properties such as enthalpy, entropy, free energy, molar volume, equilibrium ratio, fugacity coefficient, viscosity, thermal conductivity, and diffusion coefficient for specified phase conditions: vapor, liquid, or solid. The properties may be computed for pure components, mixtures, or components in a mixture, as appropriate. The ASPEN Input Language is oriented towards process engineers.

In support of technology development to utilize coal for efficient, affordable, and environmentally clean power generation, the Power Systems Development Facility (PSDF), located in Wilsonville, Alabama, has routinely demonstrated gasification technologies using various types of coals. The PSDF is an engineering scale demonstration of key features of advanced coal-fired power systems, including a Transport Gasifier, a hot gas particulate control device, advanced syngas cleanup systems, and high-pressure solids handling systems. This final report summarizes the results of the technology development work conducted at the PSDF through January 31, 2009. Twenty-one major gasification test campaigns were completed, for a total of more than 11,000 hours of gasification operation. This operational experience has led to significant advancements in gasification technologies.

A fiber Bragg grating is a portion of a core of a fiber optic strand that has been treated to affect the way light travels through the strand. Light within a certain narrow range of wavelengths will be reflected along the fiber by the grating, while light outside that range will pass through the grating mostly undisturbed. Since the range of wavelengths that can penetrate the grating depends on the grating itself as well as temperature and mechanical strain, fiber Bragg gratings can be used as temperature and strain sensors. This capability, along with the light weight of the fiber optic strands in which the gratings reside, makes fiber optic sensors an ideal candidate for flight testing and monitoring in which temperature and wing strain are factors. The purpose of this project is to research the availability of software capable of processing massive amounts of data in both real-time and post-flight settings, and to produce software segments that can be integrated to assist in the task as well.
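The sensing principle rests on two standard textbook relations (not formulas from this project's software): the Bragg condition λ_B = 2·n_eff·Λ, and the approximate strain response Δλ/λ_B ≈ (1 − p_e)·ε, where p_e is the effective photo-elastic coefficient. A minimal sketch with typical, assumed silica-fiber values:

```python
# Standard FBG textbook relations; all parameter values are typical,
# assumed ones, not measurements from this project.
def bragg_wavelength_nm(n_eff, period_nm):
    # Bragg condition: lambda_B = 2 * n_eff * Lambda
    return 2.0 * n_eff * period_nm

def strain_from_shift(shift_nm, lambda_b_nm, p_e=0.22):
    # Inverts delta_lambda / lambda_B ~ (1 - p_e) * strain;
    # p_e ~ 0.22 is a typical effective photo-elastic coefficient for silica
    return shift_nm / (lambda_b_nm * (1.0 - p_e))

lam = bragg_wavelength_nm(1.447, 535.0)
print(round(lam, 1))  # 1548.3
```

Temperature shifts λ_B as well, so practical systems must separate thermal and mechanical contributions, for example with an unstrained reference grating.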

As the original magnet designer and manufacturer of ORNL's 9T, 5-inch ID bore magnet, American Magnetics Inc. (AMI) has collaborated with ORNL's Materials Processing Group, and this partnership has been instrumental in the development of our unique thermo-magnetic facilities and expertise. Consequently, AMI and ORNL have realized that the commercial implementation of the High Magnetic Field Processing (HMFP) technology will require the evolution of robust, automated superconducting (SC) magnet systems that will be cost-effective and easy to operate in an industrial environment. The goal of this project and CRADA is to significantly expedite the timeline for implementing this revolutionary and pervasive cross-cutting technology for future US-produced industrial components. The successful completion of this project is anticipated to significantly assist in the timely commercialization and licensing of our HMFP intellectual property for a broad spectrum of industries, and to open up a new market for AMI. One notable outcome of this project is that the ThermoMagnetic Processing Technology won a prestigious 2009 R&D 100 Award, which recognizes our TMP Technology as one of the top 100 innovative US technologies in 2009. By successfully establishing the design requirements for a commercial-scale magnetic processing system, this project has accomplished a key first step in facilitating the building and demonstration of a superconducting magnetic processing coil, enabling the transition of High Magnetic Field Processing from a laboratory novelty into a commercially viable and industrially scalable manufacturing technology.

Green petroleum coke (GPC) is an oil refining byproduct that can be used directly as a solid fuel or as a feedstock for the production of calcined petroleum coke. GPC contains a high amount of volatiles and sulfur. During the calcination process, the GPC is heated to remove the volatiles and sulfur to produce purified calcined coke, which is used in the production of graphite, electrodes, metal carburizers, and other carbon products. Currently, more than 80% of calcined coke is produced in rotary kilns or rotary hearth furnaces. These technologies provide partial heat utilization of the calcined coke to increase the efficiency of the calcination process, but they also share some operating disadvantages. Coke calcination in an electrothermal fluidized bed (EFB), however, opens up a number of potential benefits for production enhancement while reducing capital and operating costs. The increased usage of heavy crude oil in recent years has resulted in higher sulfur content in the green coke produced by oil refining, which requires a significant increase in the calcination temperature and residence time. The calorific value of the process off-gas is quite substantial and can be effectively utilized as an “opportunity fuel” for combined heat and power (CHP) production to complement the energy demand. Heat recovered from product cooling can also contribute to the overall economics of the calcination process. Preliminary estimates indicated a decrease in energy consumption of 35-50%, as well as a proportional decrease in greenhouse gas emissions. As such, the efficiency improvement of coke calcination systems is attracting the close attention of researchers and engineers throughout the world. The developed technology is intended to accomplish the following objectives: - Reduce the energy and carbon intensity of the calcined coke production process. - Increase utilization of opportunity fuels such as industrial waste off-gas from the novel

Potential space missions of the nineties and the next century require that we look at the broad category of remote systems as an important means to achieve cost-effective operations, exploration, and colonization objectives. This paper addresses such missions, which can use remote systems technology as the basis for identifying required capabilities which must be provided. The relationship of the space-based tasks to similar tasks required for terrestrial applications is discussed. The development status of the required technology is assessed and major issues which must be addressed to meet future requirements are identified. These include the proper mix of humans and machines, from pure teleoperation to full autonomy; the degree of worksite compatibility for a robotic system; and the required design parameters, such as degrees of freedom. Methods for resolution are discussed, including analysis, graphical simulation, and the use of laboratory test beds. Grumman experience in the application of these techniques to a variety of design issues is presented, utilizing the Telerobotics Development Laboratory, which includes a 17-DOF robot system, a variety of sensing elements, Deneb/IRIS graphics workstations, and control stations. The use of task/worksite mockups, remote system development test beds, and graphical analysis is discussed, with examples of typical results such as estimates of task times, task feasibility, and resulting recommendations for design changes. The relationship of this experience and lessons learned to future development of remote systems is also discussed.

An uplink controlling assembly speeds data processing using a special parallel codeblock technique. A correct start sequence initiates processing of a frame. Two possible start sequences can be used; the one which is used determines whether data polarity is inverted or non-inverted. Processing continues until uncorrectable errors are found. The frame ends by intentionally sending a block with an uncorrectable error. Each of the codeblocks in the frame has a channel ID. Each channel ID can be separately processed in parallel, which obviates the problem of waiting for error correction processing. If the channel number is zero, however, it indicates that the frame of data represents a critical command only; that data is handled in a special way, independent of the software. Otherwise, the processed data is further handled using special double buffering techniques to avoid problems from overrun. When overrun does occur, the system takes action to lose only the oldest data.
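The routing and buffering policy above can be sketched in a few lines. The class name, buffer depth, and critical-command handling below are hypothetical illustrations of the technique, not the actual uplink assembly's implementation.

```python
from collections import deque

# Sketch of per-channel dispatch with bounded double buffering:
# codeblocks are routed by channel ID, channel 0 is treated as a
# critical command handled outside the normal path, and a full buffer
# drops only the oldest entry on overrun.

BUFFER_DEPTH = 2  # double buffering: at most two codeblocks per channel

class ChannelBuffers:
    def __init__(self):
        self.buffers = {}   # channel ID -> bounded deque of codeblocks
        self.critical = []  # channel 0 traffic, handled specially

    def accept(self, channel_id, codeblock):
        if channel_id == 0:
            # Critical command: bypasses the normal software path.
            self.critical.append(codeblock)
            return
        buf = self.buffers.setdefault(channel_id, deque(maxlen=BUFFER_DEPTH))
        buf.append(codeblock)  # on overrun, deque drops the oldest entry
```

A `deque` with `maxlen` implements the "lose only the oldest data" rule automatically: appending to a full deque silently discards the item at the opposite end.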

This study examines Strategic Planning concepts and how they relate to the development of Hospital Information Systems. The author recommends that Strategic Planning methods be utilized in the development of Hospital Information Systems, and provides guidance on how to do so. Keywords: Theses; Integrated information systems; Hospital administration; Computer networks; Information exchange; Health care; Strategic planning; Information systems.

The Advanced Information Processing System (AIPS) is a computer systems philosophy, a set of validated hardware building blocks, and a set of validated services as embodied in system software. The goal of AIPS is to provide the knowledge base which will allow achievement of validated fault-tolerant distributed computer system architectures, suitable for a broad range of applications, having failure probability requirements of 10E-9 at 10 hours. A background and description are given, followed by program accomplishments, the current focus, applications, technology transfer, FY92 accomplishments, and funding.

The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground-based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High-technology developments in hardware, software, and networking models address technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP), and security. Hardware and software design, development, and implementation focus on the interconnectivity and interoperability of an open system architecture, and practical OSA components are being developed to apply the new technology. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, the program provides for regular interactions with standardization working groups, e.g., the International Standards Organization (ISO), American National Standards Institute (ANSI), Society of Automotive Engineers (SAE), and Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.

two volumes. Volume 1 is the Development Methodology and Volume 2 is an Evaluation Methodology containing methods for evaluation, validation and...system are written in an English-like language which almost anyone can understand. Thus programming in rule-based systems can become "programming for...computers and others have little understanding about how computers work. The knowledge engineer must therefore be willing and able to teach the expert

This report discusses test campaign GCT3 of the Halliburton KBR transport reactor train with a Siemens Westinghouse Power Corporation (Siemens Westinghouse) particle filter system at the Power Systems Development Facility (PSDF) located in Wilsonville, Alabama. The transport reactor is an advanced circulating fluidized-bed reactor designed to operate as either a combustor or a gasifier using one of two possible particulate control devices (PCDs). The transport reactor was operated as a pressurized gasifier during GCT3. GCT3 was planned as a 250-hour test run to commission the loop seal and continue the characterization of the limits of operational parameter variations using a blend of several Powder River Basin coals and Bucyrus limestone from Ohio. The primary test objectives were: (1) Loop Seal Commissioning--Evaluate the operational stability of the loop seal with sand and limestone as a bed material at different solids circulation rates and establish a maximum solids circulation rate through the loop seal with the inert bed. (2) Loop Seal Operations--Evaluate the loop seal operational stability during coal feed operations and establish maximum solids circulation rate. Secondary objectives included the continuation of reactor characterization, including: (1) Operational Stability--Characterize the reactor loop and PCD operations with short-term tests by varying coal feed, air/coal ratio, riser velocity, solids circulation rate, system pressure, and air distribution. (2) Reactor Operations--Study the devolatilization and tar cracking effects from transient conditions during transition from start-up burner to coal. Evaluate the effect of process operations on heat release, heat transfer, and accelerated fuel particle heat-up rates. Study the effect of changes in reactor conditions on transient temperature profiles, pressure balance, and product gas composition. (3) Effects of Reactor Conditions on Syngas Composition--Evaluate the effect of air distribution, steam

process for a major program. The United States DOD Directive 5000.1 defines four distinct phases of the acquisition process: concept exploration, demon...Unified or Specified Command. 1. Concept Exploration Phase The first phase for a major system is the concept exploration phase. During this phase... exploration phase progresses. Premature introduction of operating and support details may have a negative effect by closing out promising alternatives [Ref

The CALIOPE receiver system development at LANL is the story of two technologies. The first of these technologies consists of off-the-shelf mercury-cadmium-telluride (MCT) detectors and amplifiers. The vendor for this system is Kolmar Technologies. This system was fielded in the Tan Trailer I (TTI) in 1995 and will be referred to in this paper as GEN I. The second system consists of an MCT detector procured from Santa Barbara Research Center (SBRC) and an amplifier designed and built by LANL. This system was fielded in the Tan Trailer II (TTII) system at the NTS tests in 1996 and will be referred to as GEN II. The LANL CALIOPE experimental plan for 1996 was to improve the lidar system by progressing to a higher rep rate laser to perform many shots in a much shorter period of time. In keeping with this plan, the receiver team set a goal of developing a detector system that was background limited for the projected 100 nanosecond (ns) laser pulse. A set of detailed simulations of the DIAL lidar experiment was performed. From these runs, parameters such as optimal detector size, field of view of the receiver system, and nominal laser return power were extracted. With this information, detector physics and amplifier electronic models were developed to obtain the required specifications for each of these components. These derived specs indicated that a substantial improvement over commercially available, off-the-shelf amplifier and detector technologies would be needed to meet the goals. To determine if the original GEN I detector was usable, the authors performed tests on a 100 micron square detector at cryogenic temperatures. The results of this test and others convinced them that an advanced detector was required. Eventually, a suitable detector was identified and a number of these single element detectors were procured from SBRC. These single element detectors served as witnesses for the detector arrays built for another DOE project.

Design and development of a second generation Plasma Pyrolysis Assembly (PPA) reactor is currently underway as part of NASA's Atmosphere Revitalization Resource Recovery effort. By recovering up to 75% of the hydrogen currently lost as methane in the Sabatier reactor effluent, the PPA helps to minimize life support resupply costs for extended duration missions. To date, second generation PPA development has demonstrated significant technology advancements over the first generation device by doubling the methane processing rate while, at the same time, more than halving the required power. One development area of particular interest to NASA system engineers is fouling of the PPA reactor with carbonaceous products. As a mitigation plan, NASA MSFC has explored the feasibility of using an oxidative plasma based upon metabolic CO2 to regenerate the reactor window and gas inlet ports. The results and implications of this testing are addressed along with the advanced PPA reactor development work.

Xcpu2 is a new process management system that allows users to specify a custom file system for a running job. Most cluster management systems enforce a single software distribution running on all nodes. Xcpu2 allows programs running on the cluster to work in an environment identical to the user's desktop, using the same versions of the libraries and tools the user installed locally, and accessing configuration files in the same places they are located on the desktop. Xcpu2 builds on our earlier work with the Xcpu system. Like Xcpu, Xcpu2's process management interface is represented as a set of files exported by a 9P file server. It supports heterogeneous clusters and multiple head nodes. Unlike Xcpu, it uses a pull model instead of a push model. In this paper we describe the Xcpu2 clustering model, its operation, and how the per-job file system configuration can be used to solve some common problems when running a cluster.
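The pull model mentioned above can be contrasted with push scheduling in a few lines: instead of the head node assigning work to nodes, each node requests the next job when it is ready. This is a generic single-process illustration of the scheduling idea, not Xcpu2's actual 9P file-server protocol.

```python
from queue import Queue

# Pull-model sketch: the head node only publishes jobs; workers pull
# work at their own pace, so faster nodes naturally take more jobs.

def head_node(jobs):
    """Publish jobs on a shared queue; nothing is pushed to workers."""
    q = Queue()
    for job in jobs:
        q.put(job)
    return q

def worker(q, results):
    """A node pulls jobs until none remain."""
    while not q.empty():
        job = q.get()                 # the pull: node asks for work
        results.append(job.upper())   # stand-in for executing the job
        q.task_done()
```

In a real cluster the queue would be replaced by the file-server interface, and multiple workers would pull concurrently; the key property, load balancing without a central dispatcher tracking node state, is the same.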

State-of-the-art (SOA) carbon dioxide (CO2) reduction technology for the International Space Station produces methane as a byproduct. This methane is subsequently vented overboard. The associated loss of hydrogen ultimately reduces the mass of oxygen that can be recovered from CO2 in a closed-loop life support system. As an alternative to SOA CO2 reduction technology, NASA is exploring a Series-Bosch system capable of reducing CO2 with hydrogen to form water and solid carbon. This results in 100% theoretical recovery of oxygen from metabolic CO2. In the past, Bosch-based technology did not trade favorably against SOA technology due to a high power demand, low reaction efficiencies, concerns with carbon containment, and large resupply requirements necessary to replace expended catalyst cartridges. An alternative approach to Bosch technology, labeled "Series-Bosch," employs a new system design with optimized multi-stage reactors and a membrane-based separation and recycle capability. Multi-physics modeling of the first stage reactor, along with chemical process modeling of the integrated system, has resulted in a design with potential to trade significantly better than previous Bosch technology. The modeling process and resulting system architecture selection are discussed.
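The 100% theoretical oxygen recovery claim follows from the Bosch stoichiometry, CO2 + 2H2 -> C + 2H2O: both oxygen atoms of CO2 end up in water, which electrolysis can split back into O2 and H2. A quick check with standard molar masses:

```python
# Back-of-the-envelope check of the "100% theoretical recovery" claim.
# 1 mol CO2 yields 2 mol H2O via the Bosch reaction, and electrolysis of
# that water yields 1 mol O2, i.e. all of the oxygen originally in CO2.

M_CO2 = 44.01  # g/mol
M_O2 = 32.00   # g/mol

def o2_recoverable_per_kg_co2():
    """Mass of O2 theoretically recoverable per kg of CO2 processed."""
    return M_O2 / M_CO2

print(round(o2_recoverable_per_kg_co2(), 3))  # prints 0.727
```

By contrast, venting the methane from a Sabatier reactor (CO2 + 4H2 -> CH4 + 2H2O) discards hydrogen that would otherwise be recycled, which is why the SOA loop cannot reach this limit.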

This paper urges the curriculum developer to assume the accountability for his decisions necessitated by the actual ways our society functions. The curriculum developer is encouraged to recognize that he is a salesman with a commodity (the curriculum). He is urged to realize that if he cannot market the package to the customers (the various…

Whether it be monitoring the thermal subsystem of Space Station Freedom, or controlling the navigation of the autonomous rover on Mars, NASA missions in the 1990s cannot enjoy an increased level of autonomy without the efficient implementation of expert systems. Merely increasing the computational speed of uniprocessors may not be able to guarantee that real-time demands are met for larger systems. Speedup via parallel processing must be pursued alongside the optimization of sequential implementations. Prototypes of parallel expert systems have been built at universities and industrial laboratories in the U.S. and Japan. The state-of-the-art research in progress related to parallel execution of expert systems is surveyed. The survey discusses multiprocessors for expert systems, parallel languages for symbolic computations, and mapping expert systems to multiprocessors. Results to date indicate that the parallelism achieved for these systems is small. The main reasons are (1) the body of knowledge applicable in any given situation and the amount of computation executed by each rule firing are small, (2) dividing the problem solving process into relatively independent partitions is difficult, and (3) implementation decisions that enable expert systems to be incrementally refined hamper compile-time optimization. In order to obtain greater speedups, data parallelism and application parallelism must be exploited.

The development of the Skylab M512 Materials Processing Facility is traced from the design of a portable, self-contained electron beam welding system for terrestrial applications to the highly complex experiment system ultimately developed for three Skylab missions. The M512 experiment facility was designed to support six in-space experiments intended to explore the advantages of manufacturing materials in the near-zero-gravity environment of Earth orbit. Detailed descriptions of the M512 facility and related experiment hardware are provided, with discussions of hardware verification and man-machine interfaces included. An analysis of the operation of the facility and experiments during the three Skylab missions is presented, including discussions of the hardware performance, anomalies, and data returned to earth.

This cooperative research and development agreement (CRADA) between Concurrent Technologies Corporation (CTC) and Lockheed Martin Energy Systems (LMES) combines CTC's expertise in the welding area and that of LMES to develop computer models and simulation software for welding processes. This development is of significant impact to the industry, including materials producers and fabricators. The main thrust of the research effort was to develop a comprehensive welding simulation methodology. A substantial amount of work has been done by several researchers to numerically model several welding processes. The primary drawback of most of the existing models is the lack of sound linkages between the mechanistic aspects (e.g., heat transfer, fluid flow, and residual stress) and the metallurgical aspects (e.g., microstructure development and control). A comprehensive numerical model which can be used to elucidate the effect of welding parameters/conditions on the temperature distribution, weld pool shape and size, solidification behavior, and microstructure development, as well as stresses and distortion, does not exist. It was therefore imperative to develop a comprehensive model which would predict all of the above phenomena during welding. The CRADA built upon an already existing three-dimensional (3-D) welding simulation model developed by LMES, which is capable of predicting weld pool shape and the temperature history in 3-D single-pass welds. However, the model does not account for multipass welds, microstructural evolution, distortion, and residual stresses. Additionally, the model requires large resources of computing time, which limits its use for practical applications. To overcome this, CTC and LMES have developed through this CRADA the comprehensive welding simulation model described above.
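As a point of reference for the thermal side of such models, the classic Rosenthal thick-plate solution gives the quasi-steady temperature field around a moving point heat source. The material values below are rough figures for steel chosen for illustration; they are not taken from the CRADA work, which uses far more detailed physics.

```python
import math

# Rosenthal thick-plate solution for a moving point heat source:
#   T = T0 + q / (2*pi*k*R) * exp(-v * (R + x) / (2*alpha))
# in source-fixed coordinates, with x along the travel direction
# (x < 0 trailing the source) and R = sqrt(x^2 + y^2 + z^2).

def rosenthal_temp(x, y, z, q=2000.0, v=0.005, t0=300.0,
                   k=30.0, alpha=8e-6):
    """Quasi-steady temperature [K] at (x, y, z) in meters.

    q: net heat input [W], v: travel speed [m/s],
    k: thermal conductivity [W/m-K], alpha: diffusivity [m^2/s].
    """
    r = math.sqrt(x * x + y * y + z * z)
    return t0 + q / (2.0 * math.pi * k * r) * math.exp(
        -v * (r + x) / (2.0 * alpha))
```

The exponential factor makes the trailing side of the source much hotter than the leading side at the same distance, reproducing the familiar elongated teardrop shape of weld isotherms.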

The word 'trauma' describes the disease entity resulting from physical injury. Trauma is one of the leading causes of death worldwide and deaths due to injury look set to increase. As early as the 1970s, it became evident that centralisation of resources and expertise could reduce the mortality rate from serious injury and that organisation of trauma care delivery into formal systems could improve outcome further. Internationally, trauma systems have evolved in various forms, with widespread reports of mortality and functional outcome benefits when major trauma management is delivered in this way. The management of major trauma in England is currently undergoing significant change. The London Trauma System began operating in April 2010 and others throughout England became operational this year. Similar systems exist internationally and continue to be developed. Anaesthetists have been and continue to be involved with all levels of trauma care delivery, from the provision of pre-hospital trauma and retrieval teams, through to chronic pain management and rehabilitation of patients back into society. This review examines the international development of major trauma care delivery and the components of a modern trauma system.

The development of an internal insulation system for cryogenic liquids is described. The insulation system is based on a gas layer concept in which capillary or surface tension effects are used to maintain a stable gas layer within a cellular core structure between the tank wall and the contained cryogen. In this work, a 1.8 meter diameter tank was insulated and tested with liquid hydrogen. Ability to withstand cycling of the aluminum tank wall to 450 K was a design and test condition.

Auditory development involves changes in the peripheral and central nervous system along the auditory pathways, and these occur naturally, and in response to stimulation. Human development occurs along a trajectory that can last decades, and is studied using behavioral psychophysics, as well as physiologic measurements with neural imaging. The auditory system constructs a perceptual space that takes information from objects and groups, segregates sounds, and provides meaning and access to communication tools such as language. Auditory signals are processed in a series of analysis stages, from peripheral to central. Coding of information has been studied for features of sound, including frequency, intensity, loudness, and location, in quiet and in the presence of maskers. In the latter case, the ability of the auditory system to perform an analysis of the scene becomes highly relevant. While some basic abilities are well developed at birth, there is a clear prolonged maturation of auditory development well into the teenage years. Maturation involves auditory pathways. However, non-auditory changes (attention, memory, cognition) play an important role in auditory development. The ability of the auditory system to adapt in response to novel stimuli is a key feature of development throughout the nervous system, known as neural plasticity. PMID:25726262

Requirements capture the system-level capabilities in a set of complete, necessary, clear, attainable, traceable, and verifiable statements of need. Requirements should not be unduly restrictive, but should set limits that eliminate items outside the boundaries drawn, encourage competition (or alternatives), and capture the source and reason for each requirement. If it is not needed by the customer, it is not a requirement. Requirements also establish the verification methods that will lead to product acceptance; these must be reproducible assessment methods.

A new process for forming ice shapes on an aircraft wing was developed at the NASA Lewis Research Center. The innovative concept was formed by Lewis' Icing Research Tunnel (IRT) team, and the hardware was manufactured by Lewis' Manufacturing Engineering Division. This work was completed to increase our understanding of the stability and control of aircraft during icing conditions. This project will also enhance our evaluation of true aerodynamic wind tunnel effects on aircraft. In addition, it can be used as a design tool for evaluating ice protection systems.

The Advanced Information Processing System (AIPS) is a multi-computer architecture composed of hardware and software building blocks that can be configured to meet a broad range of application requirements. The hardware building blocks are fault-tolerant, general-purpose computers; fault- and damage-tolerant networks (both computer and input/output); and interfaces between the networks and the computers. The software building blocks are the major software functions: local system services, input/output system services, inter-computer system services, and the system manager. The foundation of the local system services is an operating system with the functions required for a traditional real-time multi-tasking computer, such as task scheduling, inter-task communication, memory management, interrupt handling, and time maintenance. Resting on this foundation are the redundancy management functions necessary in a redundant computer and the status reporting functions required for an operator interface. The functional requirements, functional design, and detailed specifications for all the local system services are documented.

The Expert System Development Methodology (ESDM) provides an approach to developing expert system software. Because of the uncertainty associated with this process, an element of risk is involved. ESDM is designed to address the issue of risk and to acquire the information needed for this purpose in an evolutionary manner. ESDM presents a life cycle in which a prototype evolves through five stages of development. Each stage consists of five steps, leading to a prototype for that stage. Development may proceed to a conventional development methodology (CDM) at any time if enough has been learned about the problem to write requirements. ESDM produces requirements so that a product may be built with a CDM. ESDM is considered preliminary because it has not yet been applied to actual projects. It has been retrospectively evaluated by comparing the methods used in two ongoing expert system development projects that did not explicitly choose to use this methodology but which provided useful insights into actual expert system development practices and problems.

of microstructural evolution, (5) development of Gamma and Beta-Gamma titanium alloys towards rolled sheets for thermal protection applications, (6)...the hydrostatic stress. This work was published in Metallurgical and Materials Transactions A by Nicolaou, Miller, and Semiatin [6]. 2.2.2 The...observed values for the Titanium 6242S measured by Porter and John, as well as the Ti-6-4 alloy reported on by Chan in Mater. Trans., 2008. In addition

This paper presents a systems approach based on the work of W. Edwards Deming to system wide, high impact staff development. Deming has pointed out the significance of structure in systems. By restructuring the process of staff development we can bring about cost effective improvement of the whole system. We can improve student achievement while…

The consumption of milk in China is increasing as living standards rapidly improve, and huge amounts of aseptic composite milk packaging waste are being generated. Aseptic composite packaging is composed of paper, polyethylene, and aluminum. It is difficult to separate the polyethylene and aluminum, so most of the waste is currently sent to landfill or incinerated with other municipal solid waste, meaning that enormous amounts of resources are wasted. A wet process technique for separating the aluminum and polyethylene from the composite materials after the paper had been removed from the original packaging waste was studied. The separation efficiency achieved using different separation reagents was compared, different separation mechanisms were explored, and the impacts of a range of parameters, such as the reagent concentration, temperature, and liquid-solid ratio, on the separation time and aluminum loss ratio were studied. Methanoic acid was found to be the optimal separation reagent, and the suitable conditions were a reagent concentration of 2-4 mol/L, a temperature of 60-80°C, and a liquid-solid ratio of 30 L/kg. These conditions allowed aluminum and polyethylene to be separated in less than 30 min, with an aluminum loss ratio of less than 3%. A mass balance was produced for the aluminum-polyethylene separation system, and a control technique was developed to keep the ion concentrations in the reaction system stable. This allowed a continuous industrial-scale process for separating aluminum and polyethylene to be developed, and a demonstration facility with a capacity of 50 t/d was built. The demonstration facility gave polyethylene and aluminum recovery rates of more than 98% and more than 72%, respectively. Separating 1 t of aluminum-polyethylene composite packaging material gave a profit of 1769 yuan, meaning that an effective method for recycling aseptic composite packaging waste was achieved.
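The quoted facility figures can be combined into a simple mass balance. The 75/25 polyethylene/aluminum split assumed below is an illustrative composition for the paper-free residue, not a number from the study; the recovery rates are the reported demonstration-facility minimums.

```python
# Illustrative daily mass balance for the 50 t/d demonstration facility.
# pe_fraction is an assumed feed composition; pe_rate and al_rate are
# the reported minimum recovery rates (98% and 72%).

def recovered_masses(feed_t, pe_fraction=0.75, pe_rate=0.98, al_rate=0.72):
    """Return (polyethylene, aluminum) recovered, in tonnes."""
    pe_in = feed_t * pe_fraction
    al_in = feed_t * (1.0 - pe_fraction)
    return pe_in * pe_rate, al_in * al_rate

# At 50 t/d feed: 36.75 t of polyethylene and 9.0 t of aluminum per day
# under these assumptions.
```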

The development of hardware and software for an interactive, minicomputer based processing and display system for atmospheric and oceanographic information extraction and image data analysis is described. The major applications of the system are discussed as well as enhancements planned for the future.

The ERIPS is an interactive computer system used in the analysis of remotely sensed data. It consists of a set of software programs which are executed on an IBM System/360 Model 75J computer under the direction of a trained analyst. The software was a derivative of the Purdue LARSYS program and has evolved to include an extensive pattern recognition system and a number of manipulative preprocessing routines which prepare the imagery for the pattern recognition application. The original purpose of the system was to analyze remotely sensed data, to develop and perfect techniques to process the data, and to determine the feasibility of applying the data to significant earth resources problems. The system developed into a production system. Error recovery and multi-jobbing capabilities were added to the system.

The metal fuel cycle with pyro-processing technology has potential advantages distinct from the oxide fuel cycle with aqueous processing. In addition to the advantages of the metal fuel fast reactor, such as achieving a high breeding ratio over 1.3, pyro-processing with metal electrorefining requires no additional process to separate minor actinides and uses no organic solvent, which degrades under radiation and acid. The 'Feasibility Study on Commercialized Fast Reactor (FR) Cycle Systems' in Japan selected the metal fuel fast reactor cycle with metal electrorefining as the sub-system for future development. CRIEPI has been involved in R&D on pyro-processing technology with metal electrorefining since the 1980s; it was followed by JAERI, which aims to apply the technology to the treatment of spent nitride fuel targets for ADS, and by JNC, now merged into JAEA, and a wider collaboration then started among CRIEPI, JAERI, and JNC. A series of process verification tests starting from MOX pellets has produced U-Pu alloy through reduction, electrorefining, and distillation at the facility installed at JAEA Tokai. Metal fuel fabrication has started from the stage of U-Pu alloy production from a UO2 and PuO2 mixture by electrochemical reduction, and has succeeded in producing a 30 cm U-Pu-Zr fuel slug by injection casting at JAEA Oarai. The alloys are scheduled for irradiation in the JOYO fast reactor core. The development of engineering models of the electrorefiner and the electrochemical reduction device has been successfully conducted using kg-scale UO2. In addition to these domestic R&D efforts, pyro-processing verification with genuine material is proceeding in the joint CRIEPI/ITU study: TRU is extracted into cadmium, by reductive extraction in a caisson installed in a hot cell, from chloride prepared from HLLW through denitration; electrorefining of PHENIX-irradiated metal fuel containing minor actinides is then scheduled. Thus, the R&D on pyro-processing

For the purposes of SO2 reduction and ice wine stabilization, a new antibacterial technique was developed and verified to reduce the content of sulfur dioxide (SO2) while maintaining protein stability during the ice wine aging process. A hazardous bacterial strain (lactic acid bacteria, LAB) and the protein stability of Italian Riesling ice wine were evaluated for different amounts of lysozyme, SO2, and polyphenols and different wine pH by single-factor experiments. Subsequently, a quadratic rotation-orthogonal composite design with four variables was conducted to establish a multiple linear regression model that demonstrated the influence of the different treatments on a synthesis score combining LAB inhibition and protein stability of ice wine. The results showed that the synthesis score was influenced by lysozyme and SO2 concentrations at an extremely significant level. The lysozyme-combined antibacterial system, which is specially designed for ice wine aging, was optimized step by step by response surface methodology and ridge analysis. As a result, the optimal proportions in ice wine should be controlled as follows: 179.31 mg L−1 lysozyme, 177.14 mg L−1 SO2, 0.60 g L−1 polyphenols, and an ice wine pH of 4.01. With this system, the normalized synthesis score between LAB inhibition and protein stability reached its highest point, 0.920. Finally, experiments for verification and comparison indicated that the lysozyme-combined antibacterial system, a practical and prospective method to reduce SO2 concentration and effectively prevent contamination by hazardous LAB, can be used to stabilize ice wine during the aging process. PMID:26405531

Whether it be monitoring the thermal subsystem of Space Station Freedom, or controlling the navigation of the autonomous rover on Mars, NASA missions in the 90's cannot enjoy an increased level of autonomy without the efficient use of expert systems. Merely increasing the computational speed of uniprocessors may not be able to guarantee that real time demands are met for large expert systems. Speed-up via parallel processing must be pursued alongside the optimization of sequential implementations. Prototypes of parallel expert systems have been built at universities and industrial labs in the U.S. and Japan. The state-of-the-art research in progress related to parallel execution of expert systems was surveyed. The survey is divided into three major sections: (1) multiprocessors for parallel expert systems; (2) parallel languages for symbolic computations; and (3) measurements of the parallelism of expert systems. Results to date indicate that the parallelism achieved for these systems is small. In order to obtain greater speed-ups, data parallelism and application parallelism must be exploited.
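The small measured speed-ups can be put in perspective with Amdahl's law, a standard result not taken from the survey itself: the serial fraction of a workload bounds the achievable speed-up no matter how many processors are added.

```python
def amdahl_speedup(parallel_fraction, n_processors):
    """Amdahl's law: speed-up = 1 / (serial + parallel/n).

    The serial fraction (e.g. conflict resolution in a rule engine)
    dominates as n grows.
    """
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_processors)

# Even with 100 processors, a workload that is 20% serial
# cannot reach a 5x speed-up
s = amdahl_speedup(0.8, 100)
```

This is why exploiting data and application parallelism, which raises the parallel fraction itself, matters more than adding processors.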

The intelligent transportation system has generated a strong need for the development of intelligent camera systems to meet the requirements of sophisticated applications, such as electronic toll collection (ETC), traffic violation detection and automatic parking lot control. In order to achieve the highest levels of accuracy in detection, these cameras must have high speed electronic shutters, high resolution, high frame rate, and communication capabilities. A progressive scan interline transfer CCD camera, with its high speed electronic shutter and resolution capabilities, provides the basic functions to meet the requirements of a traffic camera system. Unlike most industrial video imaging applications, traffic cameras must deal with harsh environmental conditions and an extremely wide range of light. Optical character recognition is a critical function of a modern traffic camera system, with detection and accuracy heavily dependent on the camera function. In order to operate under demanding conditions, communication and functional optimization is implemented to control cameras from a roadside computer. The camera operates with a shutter speed faster than 1/2000 sec. to capture highway traffic both day and night. Consequently, camera gain, pedestal level, shutter speed and gamma functions are controlled by a look-up table containing various parameters based on environmental conditions, particularly lighting. Lighting conditions are studied carefully, to focus only on the critical license plate surface. A unique light sensor permits accurate reading under a variety of conditions, such as a sunny day, evening, twilight, storms, etc. These camera systems are being deployed successfully in major ETC projects throughout the world.
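The look-up-table control described above can be sketched as follows; the parameter names, values, and lux thresholds are illustrative assumptions, not specifications of the deployed cameras:

```python
# Hypothetical look-up table mapping a coarse lighting condition to
# camera parameters (all values illustrative)
EXPOSURE_LUT = {
    "sunny":    {"shutter": 1 / 4000, "gain_db": 0,  "pedestal": 16},
    "twilight": {"shutter": 1 / 2000, "gain_db": 6,  "pedestal": 24},
    "night":    {"shutter": 1 / 2000, "gain_db": 12, "pedestal": 32},
}

def select_exposure(lux):
    """Pick a LUT entry from a light-sensor reading (thresholds made up)."""
    if lux > 10000:
        key = "sunny"
    elif lux > 100:
        key = "twilight"
    else:
        key = "night"
    return key, EXPOSURE_LUT[key]

key, params = select_exposure(50)  # a dark-road reading
```

The design point is that the roadside computer only classifies the lighting condition; the per-condition tuning lives in the table and can be updated without touching control logic.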

The goal of any software development project is to produce a product that is delivered on time, within the allocated budget, and with the capabilities expected by the customer; unfortunately, this goal is rarely achieved. However, a properly managed project in a mature software engineering environment can consistently achieve it. In this paper we provide an introduction to three project success factors: a properly managed project, a competent project manager, and a mature software engineering environment. We also present an overview of the benefits of a mature software engineering environment, based on 24 years of data from the Software Engineering Lab, and suggest some first steps that an organization can take to begin benefiting from this environment. The depth and breadth of software engineering exceed the scope of this paper; various references are cited with the goal of raising awareness and encouraging further investigation into software engineering and project management practices.

Experimental data generated by the Very High Temperature Reactor Program need to be made more available to users, in the form of data tables on Web pages that can be downloaded to Excel, or in delimited text formats that can be used directly as input to analysis and simulation codes, statistical packages, and graphics software. One solution that can provide current and future researchers with direct access to the data they need, while complying with records management requirements, is the Nuclear Data Management and Analysis System (NDMAS). This report describes the NDMAS system and its components, defines roles and responsibilities, describes the functions the system performs, describes the internal processes the NDMAS team uses to carry out the mission, and describes the hardware and software used to meet Very High Temperature Reactor Program needs.

Hydrogenolysis systems are provided that can include a reactor housing an Ru-comprising hydrogenolysis catalyst, wherein the contents of the reactor are maintained at a neutral or acidic pH. Reactant reservoirs within the system can include a polyhydric alcohol compound and a base, wherein the weight ratio of the base to the compound is less than 0.05. Systems also include a product reservoir comprising a hydrogenolyzed polyhydric alcohol compound and salts of organic acids, wherein the moles of base are substantially equivalent to the moles of salts of organic acids. Processes are provided that can include an Ru-comprising catalyst within a mixture having a neutral or acidic pH. The weight ratio of the base to the compound can be between 0.01 and 0.05 during exposure.

Digital tomosynthesis offers the advantage of low radiation doses compared to conventional computed tomography (CT) by utilizing small numbers of projections (<80) acquired over a limited angular range. It produces 3D volumetric data, although there are artifacts due to incomplete sampling. Based upon these characteristics, we developed a prototype digital tomosynthesis R/F system for applications in chest imaging. Our prototype chest digital tomosynthesis (CDT) R/F system contains an X-ray tube with a high power R/F pulse generator, a flat-panel detector, an R/F table, electromechanical radiographic subsystems including a precise motor controller, and a reconstruction server. For image reconstruction, users select between analytic and iterative reconstruction methods. Our reconstructed images of Catphan700 and LUNGMAN phantoms clearly and rapidly depicted the internal structures of the phantoms using graphics processing unit (GPU) programming. Contrast-to-noise ratio (CNR) values of the CTP682 module of Catphan700 were higher in images using the simultaneous algebraic reconstruction technique (SART) than in those using filtered back-projection (FBP) for all materials, by factors of 2.60, 3.78, 5.50, 2.30, 3.70, and 2.52 for air, lung foam, low density polyethylene (LDPE), Delrin® (acetal homopolymer resin), bone 50% (hydroxyapatite), and Teflon, respectively. Total elapsed times for producing a 3D volume were 2.92 s and 86.29 s on average for FBP and SART (20 iterations), respectively. The times required for reconstruction were clinically feasible. Moreover, the total radiation dose from our system (5.68 mGy) was lower than that of a conventional chest CT scan. Consequently, our prototype tomosynthesis R/F system represents an important advance in digital tomosynthesis applications.
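One common definition of CNR (the abstract does not state the paper's exact formula) can illustrate how such ratios between reconstruction methods arise; the ROI statistics below are made-up numbers, not phantom data:

```python
def contrast_to_noise(mean_roi, mean_bg, std_bg):
    """CNR: signal difference between a material insert and the background,
    normalised by background noise. One common definition among several."""
    return abs(mean_roi - mean_bg) / std_bg

# Illustrative ROI statistics: same contrast, but the iterative
# reconstruction suppresses background noise more than FBP does
cnr_fbp = contrast_to_noise(120.0, 100.0, 5.0)
cnr_sart = contrast_to_noise(120.0, 100.0, 2.0)
ratio = cnr_sart / cnr_fbp
```

With these placeholder values the SART/FBP CNR ratio is 2.5, in the same range as the 2.30 to 5.50 factors reported for the Catphan700 inserts.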

of polymer solution suspended in water or from a spray. Hollow PS particles were obtained by swelling PS latex with solvent, freezing in liquid nitrogen, and drying in vacuum. It is shown that the particle morphology is due to phase separation in the polymer emulsion droplets upon freezing in liquid nitrogen, and that morphological changes are driven largely by lowering interfacial free energy. The dried hollow particles were resuspended in a dispersing medium and exposed to a plasticizer, which imparts mobility to polymer chains, to close the surface opening and form microcapsules surrounding an aqueous core. The interfacial free energy difference between the hydrophobic inside and hydrophilic outside surfaces is the major driving force for closing the hole on the surface. A controlled release biodegradable vehicle for drugs was made by encapsulating procaine hydrochloride, a water-soluble drug, into the core of poly(DL-lactide) (PLA) microcapsules, which were made by the freeze-drying and subsequent closing process. The encapsulation efficiency is affected by the hollow particle morphology, amount of closing agent, exposure time, surfactant, and method of dispersing the hollow particles in water. Controlled release of procaine hydrochloride from the microcapsules into phosphate buffer was observed. The use of benign solvents, dimethyl carbonate in spray/freeze-drying and CO2 for closing, would eliminate concerns about residual harmful solvent in the product. The ease of separation of CO2 from the drug solution may also enable recycling of the drug solution to increase the overall encapsulation efficiency using these novel hollow particles.

The previous 'Art of Injector Design' is maturing and merging with the more systematic 'Science of Combustion Device Analysis.' This technology can be based upon observation, correlation, experimentation and ultimately analytical modeling grounded in basic engineering principles. This methodology is more systematic and far superior to the historical injector design process of 'Trial and Error' or blindly 'Copying Past Successes.' The benefit of such an approach is the ability to rank candidate design concepts for relative probability of success or technical risk in all the important combustion device design requirements and combustion process development risk categories before committing to an engine development program. Even if a single analytical design concept cannot be developed that is predicted to satisfy all requirements simultaneously, a series of risk-mitigating key enabling technologies can be identified for early resolution. Lower cost subscale or laboratory experimentation to demonstrate proof of principle, critical instrumentation requirements, and design discriminating test plans can be developed based on the physical insight provided by these analyses.

The search for past or present signs of life is one of the primary goals of future Mars exploratory missions. To this end, the Mars Organic Molecule Analyzer (MOMA) module of the upcoming ExoMars 2013 European space mission is designed for the in situ analysis, in the Martian soil, of organic molecules of exobiological interest such as amino acids, carboxylic acids, nucleobases and polycyclic aromatic hydrocarbons (PAHs). In the frame of the MOMA experiment we have been developing a Sample Processing System (SPS) compatible with gas chromatography (GC) analysis. The main goal of the SPS is to allow the extraction and gas chromatography separation of refractory organic compounds from a solid matrix at trace level within space-compatible operating conditions. The SPS is a mini-reactor, containing the solid sample (~500 mg), able to increase (or decrease) the internal temperature from 20 to 500°C within 13 s. The extraction step is performed by thermodesorption, with the best extraction yield obtained at 300°C for 10 to 20 min. Notably, the temperature can be increased up to 500°C without a significant loss of efficiency if the heating run time is kept below 3 min. After thermodesorption, chemical derivatization of the extracted compounds is performed directly on the soil with a mixture of MTBSTFA and DMF [Buch et al.]. By decreasing the polarity of the target molecules, this step allows their volatilization at a temperature below 250°C without any chemical degradation. Once derivatized, the targeted volatile molecules are transferred through a heated transfer line into the gas chromatograph, which is coupled with a mass spectrometer for detection. The SPS is a "one step/one pot" sample preparation system which should allow the MOMA experiment to detect refractory molecules adsorbed in the Martian soil at a detection limit below the ppb level. A. Buch, R. Sternberg, C. Szopa, C. Freissinet, C. Garnier, J. El Bekri

Development of a rapid and accurate method for visceral fat measurement is an important task, given the recent increase in the number of patients with metabolic syndrome. In this study, we optimized the Fast Low Angle Shot (FLASH) sequence using a binomial radiofrequency excitation pulse, for which the acquisition time is short, and measured changes in the amount of visceral fat in subjects after a period of wearing clothes with a fat-reducing effect during walking. We solved the reproducibility problem associated with the number of slices, and developed automatic measurement software for high-precision separation and extraction of abdominal visceral fat images. This software was developed using intensity correction based on the coil position, derivation of a threshold by histogram analysis, and fat separation by template matching for abdominal images. The cross-sectional area of a single slice varies with every acquisition due to visceral organ movement, but the relative error largely converged for seven slices. The measured amount of abdominal fat tended to be consistent with changes in the body fat and waist circumference of the subjects. The correlation coefficients between automatic extraction using the measurement software and manual extraction were 0.9978 for subcutaneous fat and 0.9972 for visceral fat, showing very strong positive correlations. The consistency rates were 0.9502 ± 0.0167 for subcutaneous fat and 0.9395 ± 0.0147 for visceral fat, and the shapes of the regions were also extracted very accurately. These results show that the magnetic resonance imaging acquisition method and image processing system developed in this study are beneficial for the measurement of abdominal visceral fat. Therefore, this method may have a major role in the future diagnosis of metabolic syndrome.
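The abstract does not name the specific "threshold by histogram analysis" algorithm; Otsu's method is one plausible choice, sketched here on a toy bimodal histogram:

```python
def otsu_threshold(histogram):
    """Otsu's method: return the grey level maximising between-class variance.

    `histogram` is a list of pixel counts indexed by grey level. Levels
    at or below the returned threshold form the background class.
    """
    total = sum(histogram)
    sum_all = sum(i * h for i, h in enumerate(histogram))
    best_t, best_var, w_bg, sum_bg = 0, -1.0, 0, 0.0
    for t, h in enumerate(histogram):
        w_bg += h                      # background pixel count so far
        if w_bg == 0:
            continue
        w_fg = total - w_bg            # foreground pixel count
        if w_fg == 0:
            break
        sum_bg += t * h
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Toy bimodal histogram: one tissue mode near level 2, a fat mode near level 7
hist = [0, 5, 20, 5, 0, 0, 4, 18, 6, 0]
t = otsu_threshold(hist)
```

The threshold lands in the valley between the two modes, which is the behaviour a fat/non-fat intensity split relies on.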

This paper presents a short synopsis of the important developments in casting/solidification processes, as well as the important advances in the conventional methods. These developments are discussed related to quality aspects. The position of each process with respect to practice, as well as expected gains in cost, are examined. The paper briefly features the author's work on innovative processes (directional solidification, rheocasting, squeeze-casting and rapid solidification) as well as work of other investigators on developments in conventional methods.

Test and checkout systems are essential components in ensuring safety and reliability of aircraft and related systems for space missions. A variety of systems, developed over several years, are in use at the NASA/KSC. Many of these systems are configured as distributed data processing systems with the functionality spread over several multiprocessor nodes interconnected through networks. To be cost-effective, a system should take the least amount of resources and perform a given testing task in the least amount of time. There are two aspects of performance evaluation: monitoring and benchmarking. While monitoring is valuable to system administrators in operating and maintaining a system, benchmarking is important in designing and upgrading computer-based systems. These two aspects of performance evaluation are the foci of this project. This paper first discusses various issues related to software, hardware, and hybrid performance monitoring as applicable to distributed systems, and specifically to the TCMS (Test Control and Monitoring System). Next, a comparison of several probing instructions is made to show that the hybrid monitoring technique developed by NIST (the National Institute of Standards and Technology) is the least intrusive and takes only one-fourth of the time taken by software monitoring probes. In the rest of the paper, issues related to benchmarking a distributed system are discussed, and finally a prescription for developing a micro-benchmark for the TCMS is provided.
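The relative intrusiveness of software probes can be estimated with a generic differential-timing micro-benchmark; this is a sketch of the technique, not the NIST hybrid-monitoring measurement itself:

```python
import time

def probe_overhead(n_calls=100_000):
    """Estimate per-call overhead of a software monitoring probe by
    timing an instrumented loop against a bare one."""
    events = []

    def probe(tag):
        # a minimal software probe: timestamp an event and record it
        events.append((tag, time.perf_counter()))

    t0 = time.perf_counter()
    for _ in range(n_calls):
        pass                    # un-instrumented workload
    bare = time.perf_counter() - t0

    t0 = time.perf_counter()
    for _ in range(n_calls):
        probe("op")             # instrumented workload
    instrumented = time.perf_counter() - t0

    return (instrumented - bare) / n_calls

overhead_s = probe_overhead()   # seconds of intrusion per probed event
```

A hardware or hybrid monitor moves the timestamping off the measured processor, which is why its intrusion can be a fraction of this figure.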

The architecture of a dynamic security assessment processing system (DSAPS) is proposed to address online dynamic security assessment (DSA), with the focus of the dissertation on low-probability, high-consequence events. DSAPS upgrades current online DSA functions and adds new functions to fit into the modern power grid. Trajectory sensitivity analysis is introduced and its applications in power systems are reviewed. An index is presented to assess transient voltage dips quantitatively using trajectory sensitivities. The framework of an anticipatory computing system (ACS) for cascading defense is then presented as an important function of DSAPS. The ACS addresses various security problems and the uncertainties in cascading outages. Corrective control design is automated to mitigate system stress in cascading progressions. The corrective controls introduced in the dissertation include corrective security-constrained optimal power flow, a two-stage load control for severe under-frequency conditions, and transient-stability-constrained optimal power flow for cascading outages. With state-of-the-art computing facilities to perform high-speed extended-term time-domain simulation and optimization for large-scale systems, DSAPS/ACS efficiently addresses online DSA for low-probability, high-consequence events, which are not addressed by today's industrial practice. Human interference in the computationally burdensome analysis is reduced.

A fuzzy classifier system that discovers rules for controlling a mathematical model of a pH titration system was developed by researchers at the U.S. Bureau of Mines (USBM). Fuzzy classifier systems successfully combine the strengths of learning classifier systems and fuzzy logic controllers. Learning classifier systems resemble familiar production rule-based systems, but they represent their IF-THEN rules by strings of characters rather than in the traditional linguistic terms. Fuzzy logic is a tool that allows for the incorporation of abstract concepts into rule-based systems, thereby allowing the rules to resemble the familiar 'rules-of-thumb' commonly used by humans when solving difficult process control and reasoning problems. Like learning classifier systems, fuzzy classifier systems employ a genetic algorithm to explore and sample new rules for manipulating the problem environment. Like fuzzy logic controllers, fuzzy classifier systems encapsulate knowledge in the form of production rules. The results presented in this paper demonstrate the ability of fuzzy classifier systems to generate a fuzzy logic-based process control system.
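How a fuzzy classifier system represents IF-THEN rules as character strings can be illustrated with a minimal sketch; the encoding, membership functions, and variables below are illustrative assumptions, not those of the USBM system:

```python
# Each character of a rule string selects a fuzzy set for one input
# variable: L = low, M = medium, H = high, # = don't care.
# Inputs are assumed normalised to [0, 1].
SETS = {
    "L": lambda x: max(0.0, 1.0 - x / 0.5),              # peaks at 0
    "M": lambda x: max(0.0, 1.0 - abs(x - 0.5) / 0.5),   # peaks at 0.5
    "H": lambda x: max(0.0, (x - 0.5) / 0.5),            # peaks at 1
    "#": lambda x: 1.0,                                  # wildcard
}

def rule_strength(rule, inputs):
    """Firing strength of one rule: min over per-variable memberships
    (the common Mamdani AND)."""
    return min(SETS[c](x) for c, x in zip(rule, inputs))

# "IF pH-error is High AND reagent-flow is don't-care THEN ..."
# (the consequent, also encoded in the string in a real system, is omitted)
s = rule_strength("H#", [0.75, 0.2])
```

Because the condition part is a fixed-alphabet string, the genetic algorithm can apply ordinary crossover and mutation to it, which is exactly what makes the classifier-system machinery compatible with fuzzy rules.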

Functional systems theory was used to consider the process of reinforcement: the action on the body of reinforcing factors, i.e., the results of behavior satisfying the body's original needs. The systems process of reinforcement includes reverse afferentation entering the CNS from receptors acted upon by various parameters of the desired results, and mechanisms for comparing reverse afferentation with the apparatus which accepts the results of the action and the corresponding emotional component. A tight interaction between reinforcement and the dominant motivation is generated on the basis of the hologram principle. Reinforcement forms an apparatus for predicting a desired result, i.e., a result-of-action acceptor. Reinforcement produces significant changes in the activities of individual neurons in the various brain structures involved in dominant motivation, transforming their spike activity from a burst pattern to regular discharges; there are also molecular changes in neuron properties. After preliminary reinforcement, the corresponding motivation induces the ribosomal system of neurons to start synthesizing special effector molecules, which organize molecular engrams of the acceptor of the action's result. Sensory mechanisms of reinforcement are considered, with particular reference to the information role of emotions.

The telemedicine optoelectronic biomedical data processing system was created to share medical information for the oversight of health rights and a timely, rapid response to crises. The system includes the following main blocks: a bioprocessor, an analog-digital converter for biomedical images, an optoelectronic module for image processing, an optoelectronic module for parallel recording and storage of biomedical images, and a matrix-screen display of biomedical images. The rated temporal characteristics of the blocks are defined by the particular triggering optoelectronic couple in the analog-digital converters and by the imaging time of the matrix screen. The element base for the hardware implementation of the developed matrix screen is integrated optoelectronic couples produced by selective epitaxy.

This report discusses Test Campaign TC12 of the Kellogg Brown & Root, Inc. (KBR) Transport Gasifier train with a Siemens Westinghouse Power Corporation (SW) particle filter system at the Power Systems Development Facility (PSDF) located in Wilsonville, Alabama. The Transport Gasifier is an advanced circulating fluidized-bed reactor designed to operate as either a combustor or a gasifier using a particulate control device (PCD). While operating as a gasifier, either air or oxygen can be used as the oxidant. Test run TC12 began on May 16, 2003, with the startup of the main air compressor and the lighting of the gasifier start-up burner. The Transport Gasifier operated until May 24, 2003, when a scheduled outage occurred to allow maintenance crews to install the fuel cell test unit and modify the gas clean-up system. On June 18, 2003, the test run resumed when operations relit the start-up burner, and testing continued until the scheduled end of the run on July 14, 2003. TC12 totaled 733 hours of operation using Powder River Basin (PRB) subbituminous coal. Over the course of the entire test run, gasifier temperatures varied between 1,675 and 1,850°F at pressures from 130 to 210 psig.

The process of reinforcement is considered in the context of the general theory of functional systems as an important part of behavioural act organization, closely interacting with the dominant motivation. It is shown that reinforcement substantially changes the activities of individual neurons in different brain structures involved in dominant motivation. After preliminary reinforcement, under the influence of the corresponding motivation, the ribosomal apparatus of neurons begins to synthesize special molecular engrams of the action acceptor. The sensory mechanisms of reinforcement and, especially, the role of emotions are considered in detail in the paper.

This report provides an alternative strategy evolved from the current Hanford Site Tank Waste Remediation System (TWRS) programmatic baseline for accomplishing the treatment and disposal of the Hanford Site tank wastes. This optimized processing strategy performs the major elements of the TWRS Program, but modifies the deployment of selected treatment technologies to reduce the program cost. The present program for development of waste retrieval, pretreatment, and vitrification technologies continues, but the optimized processing strategy reuses a single facility to accomplish the separations/low-activity waste (LAW) vitrification and the high-level waste (HLW) vitrification processes sequentially, thereby eliminating the need for a separate HLW vitrification facility.

A new fine coal dewatering technology has been developed and tested in the present work. The work was funded by the Solid Fuels and Feedstocks Grand Challenge PRDA. The objective of this program was to 'develop innovative technical approaches to ensure a continued supply of environmentally sound solid fuels for existing and future combustion systems with minimal incremental fuel cost.' Specifically, this solicitation is aimed at developing technologies that can (i) improve the efficiency or economics of the recovery of carbon when beneficiating fine coal from both current production and existing coal slurry impoundments and (ii) assist in the greater utilization of coal fines by improving the handling characteristics of fine coal via dewatering and/or reconstitution. The results of the test work conducted during Phase I of the current project demonstrated that the new dewatering technologies can substantially reduce the moisture from fine coal, while the test work conducted during Phase II successfully demonstrated the commercial viability of this technology. It is believed that availability of such efficient and affordable dewatering technology is essential to meeting the DOE's objectives.

Two different system architectures are presented, derived from two different data flows within the Spacelab Output Processing System. The major difference between the architectures is the position of the decommutation function: the first architecture performs decommutation in the latter half of the system, and the second performs it in the front end. For examination, the system was divided into five stand-alone subsystems: Work Assembler, Mass Storage System, Output Processor, Peripheral Pool, and Resource Monitor. The workload of each subsystem was estimated independently of the specific devices to be used. Candidate devices were surveyed from a wide sampling of off-the-shelf devices. Analytical expressions were developed to quantify the projected workload in conjunction with typical devices that would adequately handle the subsystem tasks. All of the study efforts were then directed toward preparing performance and cost curves for each architecture's subsystems.

The Hybrid Sulfur (HyS) Thermochemical Process is a means of producing hydrogen via water-splitting through a combination of chemical reactions and electrochemistry. Energy is supplied to the system as high temperature heat (approximately 900 C) and electricity. Advanced nuclear reactors (Generation IV) or central solar receivers can be the source of the primary energy. Large-scale hydrogen production based on this process could be a major contributor to meeting the needs of a hydrogen economy. This project's objectives include optimization of the HyS process design, analysis of technical issues and concerns, creation of a development plan, and laboratory-scale proof-of-concept testing. The key component of the HyS Process is the SO2-depolarized electrolyzer (SDE). Studies were performed that showed that an electrolyzer operating in the range of 500-600 mV per cell can lead to an overall HyS cycle efficiency in excess of 50%, which is superior to all other currently proposed thermochemical cycles. Economic analysis indicated hydrogen production costs of approximately $1.60 per kilogram for a mature nuclear hydrogen production plant. However, in order to meet commercialization goals, the electrolyzer should be capable of operating at high current density, have a long operating lifetime, and have an acceptable capital cost. The use of proton-exchange-membrane (PEM) technology, which leverages work for the development of PEM fuel cells, was selected as the most promising route to meeting these goals. The major accomplishments of this project were the design and construction of a suitable electrolyzer test facility and the proof-of-concept testing of a PEM-based SDE.
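Why the 500-600 mV operating range matters can be seen from standard electrochemistry (textbook relations, not figures from the report): hydrogen evolution transfers two electrons per molecule, so the electrical work is 2FV per mole of H2, roughly a third of that of conventional water electrolysis at ~1.8 V per cell.

```python
F = 96485.0  # Faraday constant, C/mol

def electrolyzer_energy_kwh_per_kg(cell_voltage):
    """Electrical energy per kg of H2 for a two-electron electrolysis step:
    W = 2*F*V per mole of H2. The thermal input of the HyS cycle is not
    included; this covers only the electrolyzer's electrical demand."""
    joules_per_mol = 2 * F * cell_voltage
    mol_per_kg = 1000.0 / 2.016          # molar mass of H2 = 2.016 g/mol
    return joules_per_mol * mol_per_kg / 3.6e6  # J -> kWh

e_low = electrolyzer_energy_kwh_per_kg(0.5)    # ~13 kWh/kg at 500 mV
e_high = electrolyzer_energy_kwh_per_kg(0.6)   # ~16 kWh/kg at 600 mV
```

Keeping the cell voltage low therefore trades electrical input for the high-temperature heat supplied by the rest of the thermochemical cycle, which is the basis of the >50% cycle efficiency claim.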

Development of the liquid carbon dioxide process for the cleaning of coal was performed in batch, variable-volume (semi-continuous), and continuous tests. Continuous operation at feed rates up to 4.5 kg/hr (10 lb/hr) was achieved with the Continuous System. Coals tested included Upper Freeport, Pittsburgh, Illinois No. 6, and Middle Kittanning seams. Results showed that the ash and pyrite rejections agreed closely with washability data for each coal at the particle size tested (-200 mesh). A 0.91 metric ton (1-ton) per hour Proof-of-Concept Plant was conceptually designed. A 181 metric ton (200-ton) per hour and a 45 metric ton (50-ton) per hour plant were sized sufficiently to estimate costs for economic analyses. The processing costs for the 181 metric ton (200-ton) per hour and 45 metric ton (50-ton) per hour plants were estimated to be $18.96 per metric ton ($17.20 per ton) and $11.47 per metric ton ($10.40 per ton), respectively. The costs for the 45 metric ton per hour plant are lower because it is assumed to be a fines-recovery plant, which does not require a grinding circuit or complex waste-handling system.
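The paired cost figures are a straight unit conversion (1 metric ton ≈ 1.10231 short tons, the standard factor assumed here); a quick arithmetic check reproduces the quoted numbers to within a cent:

```python
SHORT_TONS_PER_METRIC_TON = 1.10231  # standard conversion factor

def cost_per_metric_ton(cost_per_short_ton):
    """Convert a $/short-ton processing cost to $/metric ton."""
    return cost_per_short_ton * SHORT_TONS_PER_METRIC_TON

print(round(cost_per_metric_ton(17.20), 2))  # 18.96, matching the 200-ton/hr plant
print(round(cost_per_metric_ton(10.40), 2))  # 11.46, within a cent of the quoted 11.47
```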

The Concise Data Processing Assessment (CDPA) was developed to probe student abilities related to the nature of measurement and uncertainty and to handling data. The diagnostic is a ten-question, multiple-choice test that can be used as both a pre-test and post-test. A key component of the development process was interviews with students, which…
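For a pre/post diagnostic of this kind, class improvement is often summarized with the Hake normalized gain; a minimal sketch (the formula is a standard physics-education-research convention, not something specific to the CDPA):

```python
def normalized_gain(pre_pct, post_pct):
    """Hake normalized gain <g>: the fraction of the available
    improvement actually realized between pre- and post-test,
    with scores given as percentages."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# A class averaging 40% before instruction and 70% after:
print(normalized_gain(40.0, 70.0))  # 0.5 - half the possible gain achieved
```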

The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760).

When NASA designs a spacecraft to undertake a new mission, innovation does not stop after the design phase. In many cases, these spacecraft are firsts of their kind, requiring not only remarkable imagination and expertise in their conception but new technologies and methods for their manufacture. In the realm of manufacturing, NASA has from necessity worked on the cutting-edge, seeking new techniques and materials for creating unprecedented structures, as well as capabilities for reducing the cost and increasing the efficiency of existing manufacturing technologies. From friction stir welding enhancements (Spinoff 2009) to thermoset composites (Spinoff 2011), NASA's innovations in manufacturing have often transferred to the public in ways that enable the expansion of the Nation's industrial productivity. NASA has long pursued ways of improving upon and ensuring quality results from manufacturing processes ranging from arc welding to thermal coating applications. But many of these processes generate blinding light (hence the need for special eyewear during welding) that obscures the process while it is happening, making it difficult to monitor and evaluate. In the 1980s, NASA partnered with a company to develop technology to address this issue. Today, that collaboration has spawned multiple commercial products that not only support effective manufacturing for private industry but also may support NASA in the use of an exciting, rapidly growing field of manufacturing ideal for long-duration space missions.

This report discusses Test Campaign TC15 of the Kellogg Brown & Root, Inc. (KBR) Transport Gasifier train with a Siemens Power Generation, Inc. (SPG) particle filter system at the Power Systems Development Facility (PSDF) located in Wilsonville, Alabama. The Transport Gasifier is an advanced circulating fluidized-bed reactor designed to operate as either a combustor or gasifier using a particulate control device (PCD). While operating as a gasifier, either air or oxygen can be used as the oxidant. Test run TC15 began on April 19, 2004, with the startup of the main air compressor and the lighting of the gasifier startup burner. The Transport Gasifier was shut down on April 29, 2004, accumulating 200 hours of operation using Powder River Basin (PRB) subbituminous coal. About 91 hours of the test run occurred during oxygen-blown operations. Another 6 hours of the test run was in enriched-air mode. The remainder of the test run, approximately 103 hours, took place during air-blown operations. The highest operating temperature in the gasifier mixing zone mostly varied from 1,800 to 1,850°F. The gasifier exit pressure ran between 200 and 230 psig during air-blown operations and between 110 and 150 psig in oxygen-enhanced air operations.

Summary Despite a great deal of progress, more than 10% of pregnant women in the USA smoke. Epidemiological studies have demonstrated correlations between developmental tobacco smoke exposure and sensory processing deficits, as well as a number of neuropsychiatric conditions, including attention deficit hyperactivity disorder. Significantly, data from animal models of developmental nicotine exposure have suggested that the nicotine in tobacco contributes significantly to the effects of developmental smoke exposure. Consequently, we hypothesize that nicotinic acetylcholine receptors (nAChRs) are critical for setting and refining the strength of corticothalamic-thalamocortical loops during critical periods of development and that disruption of this process by developmental nicotine exposure can result in long-lasting dysregulation of sensory processing. The ability of nAChR activation to modulate synaptic plasticity is likely to underlie the effects of both endogenous cholinergic signaling and pharmacologically-administered nicotine to alter cellular, physiological and behavioral processes during critical periods of development. PMID:18692078

The Ground Data Systems Resource Allocation Process at the Jet Propulsion Laboratory provides medium- and long-range planning for the use of Deep Space Network and Mission Control and Computing Center resources in support of NASA's deep space missions and Earth-based science. Resources consist of radio antenna complexes and associated data processing and control computer networks. A semi-automated system was developed that allows operations personnel to interactively generate, edit, and revise allocation plans spanning periods of up to ten years (as opposed to only two or three weeks under the manual system) based on the relative merit of mission events. It also enhances scientific data return. A software system known as the Resource Allocation and Planning Helper (RALPH) merges the conventional methods of operations research, rule-based knowledge engineering, and advanced data base structures. RALPH employs a generic, highly modular architecture capable of solving a wide variety of scheduling and resource sequencing problems. The rule-based RALPH system has saved significant labor in resource allocation. Its successful use affirms the importance of establishing and applying event priorities based on scientific merit, and the benefit of continuity in planning provided by knowledge-based engineering. The RALPH system exhibits a strong potential for minimizing development cycles of resource and payload planning systems throughout NASA and the private sector.
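The abstract's central idea, allocating contested resources by the relative merit of mission events, can be sketched as a greedy, priority-ordered allocator. This is a toy illustration only; RALPH itself merges operations research, rule-based knowledge engineering, and advanced database structures, and all event names and numbers below are hypothetical:

```python
def allocate(requests, capacity_hours):
    """Grant antenna time to the highest-merit events first,
    skipping any request that no longer fits in the remaining capacity."""
    granted, remaining = [], capacity_hours
    for req in sorted(requests, key=lambda r: r["merit"], reverse=True):
        if req["hours"] <= remaining:
            granted.append(req["event"])
            remaining -= req["hours"]
    return granted, remaining

requests = [
    {"event": "routine telemetry", "merit": 2, "hours": 6},
    {"event": "planetary encounter", "merit": 9, "hours": 8},
    {"event": "spacecraft emergency", "merit": 10, "hours": 4},
]
print(allocate(requests, 12))
# the two highest-merit events are granted, exhausting the 12 hours
```

A real scheduler must also handle time windows, antenna capabilities, and conflict resolution, which is where the rule-based component comes in.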

This report describes a personal computer based system for automatic and semiautomatic tracking of objects on film or video tape, developed to meet the needs of the Microgravity Combustion and Fluids Science Research Programs at the NASA Lewis Research Center. The system consists of individual hardware components working under computer control to achieve a high degree of automation. The most important hardware components include 16-mm and 35-mm film transports, a high-resolution digital camera mounted on an x-y-z micro-positioning stage, an S-VHS tape deck, a Hi8 tape deck, a video laserdisk, and a framegrabber. All of the image input devices are remotely controlled by a computer. Software was developed to integrate the overall operation of the system including device frame incrementation, grabbing of image frames, image processing of the object's neighborhood, locating the position of the object being tracked, and storing the coordinates in a file. This process is performed repeatedly until the last frame is reached. Several different tracking methods are supported. To illustrate the process, two representative applications of the system are described. These applications represent typical uses of the system and include tracking the propagation of a flame front and tracking the movement of a liquid-gas interface with extremely poor visibility.
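The frame-by-frame loop described above (grab a frame, search the object's neighborhood, record the position, advance) can be sketched for the simplest case of following a bright feature; this is purely illustrative, with frames as plain 2-D lists rather than grabbed images, and is not one of the report's actual tracking methods:

```python
def track(frames, start, radius=2):
    """Follow the brightest pixel across frames, searching only a small
    window around the last known position, and record each position."""
    path = []
    y, x = start
    for frame in frames:
        h, w = len(frame), len(frame[0])
        best = (y, x)
        for j in range(max(0, y - radius), min(h, y + radius + 1)):
            for i in range(max(0, x - radius), min(w, x + radius + 1)):
                if frame[j][i] > frame[best[0]][best[1]]:
                    best = (j, i)
        y, x = best          # re-center the search window on the object
        path.append(best)
    return path

def frame_with_spot(y, x, size=5):
    """Synthetic frame: dark background with one bright pixel."""
    f = [[0] * size for _ in range(size)]
    f[y][x] = 9
    return f

frames = [frame_with_spot(2, 2), frame_with_spot(2, 3), frame_with_spot(3, 4)]
print(track(frames, start=(2, 2)))  # [(2, 2), (2, 3), (3, 4)]
```

Restricting the search to the object's neighborhood is what keeps per-frame processing cheap enough for long film sequences.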

Hormonal regulation of cellular function involves the binding of small molecules with receptors that then coordinate subsequent interactions with other signal transduction proteins. These dynamic, multicomponent processes are difficult to track in cells and even in reconstituted in vitro systems, and most methods can monitor only two-component interactions, often with limited capacity to follow dynamic changes. Through a judicious choice of three organic acceptor fluorophores paired with a terbium donor fluorophore, we have developed the first example of a one-donor/three-acceptor multicolor time-resolved fluorescence energy transfer (TR-FRET) system, and we have exemplified its use by monitoring a ligand-regulated protein-protein exchange process in a four-component biological system. By careful quantification of the emission from each of the three acceptors at the four channels for terbium donor emission, we demonstrate that any of these donor channels can be used to estimate the magnitude of the three FRET signals in this terbium-donor triple-acceptor system with minimal bleedthrough. Using this three-channel terbium-based, TR-FRET assay system, we show in one experiment that the addition of a fluorescein-labeled estrogen agonist displaces a SNAPFL-labeled antiestrogen from the ligand binding pocket of a terbium-labeled estrogen receptor, at the same time causing a Cy5-labeled coactivator to be recruited to the estrogen receptor. This experiment demonstrates the power of a four-color TR-FRET experiment, and it shows that the overall process of estrogen receptor ligand exchange and coactivator binding is a dynamic but precisely coordinated process.
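Quantifying each acceptor "with minimal bleedthrough" amounts to solving a small linear system: each channel's reading is a weighted sum of the three acceptor amounts. A hedged sketch with made-up signature coefficients (real calibration values would come from single-dye control measurements):

```python
def det3(m):
    """Determinant of a 3x3 matrix given as a list of rows."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def unmix(signatures, measured):
    """Recover three acceptor amounts from three channel readings.
    signatures[i][j] = response of channel i to a unit amount of dye j;
    the 3x3 system is solved by Cramer's rule."""
    d = det3(signatures)
    amounts = []
    for j in range(3):
        mj = [row[:] for row in signatures]
        for i in range(3):
            mj[i][j] = measured[i]  # replace column j with the readings
        amounts.append(det3(mj) / d)
    return amounts

# Hypothetical calibration: mostly diagonal, with small bleedthrough terms.
signatures = [[1.00, 0.10, 0.00],
              [0.05, 1.00, 0.10],
              [0.00, 0.05, 1.00]]
measured = [2.3, 3.5, 4.15]  # synthesized from true amounts [2, 3, 4]
print(unmix(signatures, measured))  # recovers approximately [2.0, 3.0, 4.0]
```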

This article recounts the development of process automation systems that rely on nucleonic sources for nondestructive measurement. The author details the formation and growth of a manufacturer and supplier of these systems. The development of the Accu Ray, the first nucleonic gage, is discussed.

The implementation of a laboratory-based ultrasound tomography system to an industrial process application is not straightforward. In the present work, a tomography system with 16 transducers has been applied to an industrial 50 mm hydrocyclone to visualize its air-core size and position. Hydrocyclones are used to separate fine particles from a slurry. The efficiency of the separation process depends on the size of the air core within the cyclone. If the core is too large due to spigot wear, there will be a detrimental effect on the slurry throughput. Conversely, if the throughput is increased to an extent where the air core becomes unstable or disappears, the particle separation will no longer take place, and the processed batches may become contaminated. Ultrasound tomography presents a very good tool with which to visualize the size, position and movement of the air core and monitor its behaviour under varying input parameters. Ultimately, it could be used within this application both to control the input flow rate depending on the air core size and to detect spigot wear. This paper describes the development of an ultrasonic tomography system applied to an instrumented hydrocyclone. Time-of-flight data are captured by a dedicated acquisition system that pre-processes the information using a DSP and transfers the results to a PC via a fast serial link. The hardware of the tomography system is described, and cursory results are presented in the form of reconstructed images of the air core within the hydrocyclone.
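The time-of-flight data map directly onto the air-core geometry: a pulse from a wall-mounted transducer reflects off the slurry/air interface, so the one-way distance is half the echo time multiplied by the sound speed in the slurry. A minimal sketch with illustrative numbers (not values from the paper):

```python
def air_core_radius_mm(cyclone_radius_mm, sound_speed_mm_per_us, echo_time_us):
    """Air-core radius seen by one transducer: the pulse travels to the
    slurry/air interface and back, so the one-way depth is c*t/2."""
    depth_mm = sound_speed_mm_per_us * echo_time_us / 2.0
    return cyclone_radius_mm - depth_mm

# 50 mm hydrocyclone (25 mm internal radius), water-like slurry (~1.5 mm/us),
# and a hypothetical 20 us round-trip echo time:
print(air_core_radius_mm(25.0, 1.5, 20.0))  # 10.0 mm air-core radius
```

Combining such one-way depths from all 16 transducers is what allows the size and position of the core to be reconstructed rather than just its radius along one chord.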

The Department of Energy is seeking to modernize its special nuclear material (SNM) production facilities and concurrently reduce radiation exposures and the process and incidental radioactive waste generated. As part of this program, a Lawrence Livermore National Laboratory (LLNL) led team is developing and adapting generic and specific applications of commercial robotic technologies to SNM pyrochemical processing and other operations. A working gantry robot within a sealed processing glove box and a telerobot control test bed are manifestations of this effort. This paper describes the development challenges and progress in adapting processing, robotic, and nuclear safety technologies to the application.

Chromatographic separation serves as "a workhorse" for downstream process development and plays a key role in removal of product-related, host cell-related, and process-related impurities. Complex and poorly characterized raw materials and feed material, low feed concentration, product instability, and poor mechanistic understanding of the processes are some of the critical challenges that are faced during development of a chromatographic step. Traditional process development is performed as trial-and-error-based evaluation and often leads to a suboptimal process. A high-throughput process development (HTPD) platform involves an integration of miniaturization, automation, and parallelization and provides a systematic approach for time- and resource-efficient chromatography process development. Creation of such platforms requires integration of mechanistic knowledge of the process with various statistical tools for data analysis. The relevance of such a platform is high in view of the constraints with respect to time and resources that the biopharma industry faces today. This protocol describes the steps involved in performing HTPD of a process chromatography step. It describes the operation of a commercially available device (PreDictor™ plates from GE Healthcare). This device is available in 96-well format with 2 or 6 μL well size. We also discuss the challenges that one faces when performing such experiments as well as possible solutions to alleviate them. Besides describing the operation of the device, the protocol also presents an approach for statistical analysis of the data that is gathered from such a platform. A case study involving use of the protocol for examining ion-exchange chromatography of granulocyte colony-stimulating factor (GCSF), a therapeutic product, is briefly discussed. This is intended to demonstrate the usefulness of this protocol in generating data that is representative of the data obtained at the traditional lab scale. The agreement in the
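The quantity typically extracted from each well of such a batch-uptake plate is the binding capacity, computed from the depletion of protein in the supernatant; a hedged sketch (the function name and the numbers are illustrative, not taken from the protocol):

```python
def binding_capacity_mg_per_ml(c0_mg_ml, c_sup_mg_ml, liquid_ul, resin_ul):
    """Adsorbent capacity from supernatant depletion in one well:
    mass bound = (initial conc - supernatant conc) * liquid volume,
    normalized per mL of resin in the well."""
    bound_mg = (c0_mg_ml - c_sup_mg_ml) * liquid_ul / 1000.0
    return bound_mg / (resin_ul / 1000.0)

# A well with 6 uL resin loaded with 200 uL at 2 mg/mL,
# with 0.5 mg/mL measured in the supernatant at equilibrium:
print(binding_capacity_mg_per_ml(2.0, 0.5, 200.0, 6.0))  # 50.0 mg per mL resin
```

Running this calculation across the 96 wells, each with different buffer conditions, is what turns one plate into a capacity map over the screened operating space.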

This cooperative research and development agreement (CRADA) between Concurrent Technologies Corporation (CTC) and Lockheed Martin Energy Systems (LMES) combines CTC's expertise in the welding area with that of LMES to develop computer models and simulation software for welding processes. This development is of significant impact to the industry, including materials producers and fabricators. The main thrust of the research effort was to develop a comprehensive welding simulation methodology. A substantial amount of work has been done by several researchers to numerically model welding processes. The primary drawback of most existing models is the lack of sound linkages between the mechanistic aspects (e.g., heat transfer, fluid flow, and residual stress) and the metallurgical aspects (e.g., microstructure development and control). A comprehensive numerical model which can be used to elucidate the effect of welding parameters/conditions on the temperature distribution, weld pool shape and size, solidification behavior, and microstructure development, as well as stresses and distortion, does not exist. It was therefore imperative to develop a comprehensive model which would predict all of the above phenomena during welding. The CRADA built upon an existing three-dimensional (3-D) welding simulation model developed by LMES, which is capable of predicting weld pool shape and the temperature history in 3-D single-pass welds. However, the model does not account for multipass welds, microstructural evolution, distortion, or residual stresses. Additionally, the model requires large amounts of computing time, which limits its use for practical applications. To overcome this, CTC and LMES have developed through this CRADA the comprehensive welding simulation model described above. The following technical tasks have been accomplished as part of the CRADA. 1. The LMES welding code has been ported to the Intel Paragon parallel computer at ORNL
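For the temperature-distribution piece of such a model, the classical starting point is Rosenthal's quasi-steady solution for a moving point heat source on a thick plate; a sketch under textbook assumptions (constant properties, no fluid flow, point source), which are exactly the simplifications a comprehensive model is meant to go beyond. Material values below are generic mild-steel figures, not data from this CRADA:

```python
import math

def rosenthal_temperature(q_w, speed_m_s, k_w_mk, alpha_m2_s,
                          xi_m, y_m, z_m, t0_c=25.0):
    """Quasi-steady temperature near a moving point source on a thick plate:
    T = T0 + Q/(2*pi*k*R) * exp(-v*(R + xi)/(2*alpha)),
    where xi is the distance ahead (+) or behind (-) the source."""
    r = math.sqrt(xi_m**2 + y_m**2 + z_m**2)
    return t0_c + q_w / (2.0 * math.pi * k_w_mk * r) * math.exp(
        -speed_m_s * (r + xi_m) / (2.0 * alpha_m2_s))

# A 2 kW arc moving at 5 mm/s over mild-steel-like material
# (k ~ 30 W/m-K, alpha ~ 8e-6 m^2/s), sampled 5 mm behind and ahead:
behind = rosenthal_temperature(2000.0, 0.005, 30.0, 8e-6, -0.005, 0.0, 0.0)
ahead = rosenthal_temperature(2000.0, 0.005, 30.0, 8e-6, 0.005, 0.0, 0.0)
print(behind > ahead)  # True: the trailing side of the weld stays hotter
```

The asymmetry between leading and trailing temperatures is the elongated weld pool shape that the 3-D LMES model resolves numerically.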

Positron Emission Tomography (PET) historically has major clinical and preclinical applications in oncology, neurology, and cardiovascular disease. Recently, in a new direction, an application-specific PET system is being developed at Thomas Jefferson National Accelerator Facility (Jefferson Lab) in collaboration with Duke University, the University of Maryland at Baltimore (UMAB), and West Virginia University (WVU), targeted for plant eco-physiology research. The new plant imaging PET system is versatile and scalable such that it could adapt to several plant imaging needs - imaging many important plant organs including leaves, roots, and stems. The mechanical arrangement of the detectors is designed to accommodate the unpredictable and random distribution in space of the plant organs without requiring that the plant be disturbed. Prototyping such a system requires a new data acquisition system (DAQ) and data processing system which are adaptable to the requirements of these unique and versatile detectors.

In the evolution of life forms nonphotosynthetic mechanisms are developed. The question remains whether a total life system could evolve which is not dependent upon photosynthesis. In trying to visualize life on other planets, the photosynthetic process has problems. On Mars, the high intensity of light at the surface is a concern and alternative mechanisms need to be defined and analyzed. In the UV search for alternate mechanisms, several different areas may be identified. These involve activated inorganic compounds in the atmosphere, such as the products of photodissociation of carbon dioxide and the organic material which may be created by natural phenomena. In addition, a life system based on the pressure of the atmospheric constituents, such as carbon dioxide, is a possibility. These considerations may be important for the understanding of evolutionary processes of life on another planet. Model systems which depend on these alternative mechanisms are defined and related to presently planned and future planetary missions.

A precise control of composite material processing would not only improve part quality, but it would also directly reduce the overall manufacturing cost. The development and incorporation of sensors will help to generate real-time information for material processing relationships and equipment characteristics. In the present work, thermocouple, pressure transducer, and dielectrometer technologies were investigated. The monitoring sensors were integrated with the computerized control system in three non-autoclave fabrication techniques: hot press, self-contained tool (self-heating and pressurizing), and pressure vessel. The sensors were implemented in the parts and tools.

The MILDGAS process is capable of processing both eastern caking and western non-caking coals to yield a slate of liquid and solid products. The liquids can be processed to produce: feedstocks for chemicals; pitch for use as a binder for electrodes in the aluminum industry; and fuels. Depending on the feed coal characteristics and the operating conditions, the char can be used as an improved fuel for power generation or can be used to make form coke for steel-making blast furnaces or for foundry cupola operations. The specific objectives of the program are to: design, construct, and operate a 24-ton/day adiabatic process development unit (PDU) to obtain process performance data suitable for design scaleup; obtain large batches of coal-derived co-products for industrial evaluation; prepare a detailed design of a demonstration unit; and develop technical and economic plans for commercialization of the MILDGAS process. In this paper, the authors present the process design of the PDU facility, a description of the expected product distribution, and the project test plan to be implemented in the program.

Conventional photoresist processing involves resist coating, exposure, post-exposure bake, development, rinse, and spin drying of a wafer. The main mechanism of pattern collapse is the capillary force governed by the surface tension of rinse water and its asymmetrical recession from both sides of the lines during the drying step of the develop process. The dry development rinse process (DDRP) mitigates pattern collapse by applying a special polymer material (DDRM) that replaces the exposed/developed part of the photoresist material before the wafer is spin dried. DDRP essentially eliminates these failure mechanisms by replacing the remaining rinse water with DDRM and providing a structural framework that supports resist lines from both sides during the spin-dry process. In this way DDRP eliminates the root causes responsible for pattern collapse of photoresist line structures. Because these collapse mechanisms are mitigated without the need for changes to the photoresist itself, the achievable resolution of state-of-the-art EUV photoresists can be further improved.
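The capillary force driving collapse can be made concrete with the Laplace pressure of rinse water bridging two resist lines; a back-of-envelope sketch with generic values (water's surface tension and a hypothetical 50 nm gap, not figures from this work):

```python
import math

def capillary_pressure_mpa(surface_tension_n_m, gap_nm, contact_angle_deg=0.0):
    """Laplace pressure pulling two parallel resist lines together
    across a liquid-filled gap: P = 2*gamma*cos(theta)/d."""
    gap_m = gap_nm * 1e-9
    pressure_pa = 2.0 * surface_tension_n_m * math.cos(
        math.radians(contact_angle_deg)) / gap_m
    return pressure_pa / 1e6

# Water (gamma ~ 0.072 N/m) in a 50 nm gap with a fully wetting resist:
print(capillary_pressure_mpa(0.072, 50.0))  # ~2.9 MPa on the line sidewalls
```

Megapascal-scale sidewall loading on tall, narrow lines is why replacing the water with a rigid polymer before drying, as DDRP does, removes the failure mode at its source.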

The Herschel Space Observatory, the fourth cornerstone mission in the ESA science program, was launched on the 14th of May 2009. As a cryogenic mission, Herschel's operational lifetime is consumable-limited by its supply of liquid helium, estimated to be depleted by March 2013. Afterwards the mission will start a 4.75-year-long post-operations phase. Originally it was considered sufficient to provide astronomers with raw data and software tools to carry out a basic data reduction, and no “data products” were to be generated and delivered. Following the realisation that the expectations of the astronomical community on the deliverables of an observatory mission had evolved, it was agreed to implement a single ‘cradle to grave’ data analysis system supporting the needs of all users for the whole project cycle. We will summarise the lessons learned during those ten years of Herschel data processing development, address the main challenges of this major software development project, and reflect on what went well, what needed to be adapted, and our open points.

The oil field's first intelligent rod pumping system designed specifically to reduce the cost of pumping oil wells now is a reality. As a plus benefit, the system (called Liftronic) is compact and quiet. The new system combines an efficient mechanical design with a computer control system to reduce pumping costs. The unit stands less than 8 ft high, or approx. one-fourth the height of a comparable beam unit. It also mounts directly on the wellhead. The entire system can be concealed behind a fence or enclosed within a small building to make it a more attractive neighbor in residential, commercial, or recreational areas. It is useful also for agricultural areas where overhead irrigation systems restrict the use of many oil field pumping systems.

In this technical document, we describe the development of the Aviation System Analysis Capability (ASAC) Executive Assistant (EA) Proof of Concept (POC) and Beta version. We describe the genesis and role of the ASAC system, discuss the objectives of the ASAC system and provide an overview of components and models in the ASAC system, and describe the design process and the results of the ASAC EA POC and Beta system development. We also describe the evaluation process and results for applicable COTS software. The document has seven chapters, a bibliography, and two appendices.

High-performance, brittle materials are the materials of choice for many of today's engineering applications. This paper describes three separate precision grinding processes developed at Lawrence Livermore National Laboratory to machine precision ceramic components. Included in the discussion of the precision processes is a variety of grinding wheel dressing, truing, and profiling techniques.

The Genesis soil washing system is an integrated system of modular design allowing for maximum material handling capabilities, with optimized use of space for site mobility. The Surfactant Activated Bio-enhanced Remediation Equipment-Generation 1 (SABRE-1, Patent Applied For) modification was developed specifically for removing petroleum byproducts from contaminated soils. Scientifically formulated surfactants, introduced by high pressure spray nozzles, displace the contaminant from the surface of the soil particles into the process solution. Once the contaminant is dispersed into the liquid fraction of the process, it is either mechanically removed, chemically oxidized, or biologically oxidized. The contaminated process water is pumped through the Genesis Biosep (Patent Applied For) filtration system where the fines portion is flocculated, and the contaminant-rich liquid portion is combined with an activated mixture of nutrients and carefully selected bacteria to decompose the hydrocarbon fraction. The treated soil and dewatered fines are transferred to a bermed stockpile where bioremediation continues during drying. The process water is reclaimed, filtered, and recycled within the system.

This action research study examined the effectiveness of the process implemented by Partnerships to Uplift Communities (PUC) Schools Charter Management Organization to develop their school leader evaluation system in collaboration with current PUC school leaders. The development of the leadership evaluation system included the collective voices of…

The integration of the many life support systems necessary to construct a stable habitat is difficult. The correct identification of the appropriate technologies and corresponding interfaces is an exhaustive process. Once technologies are selected, secondary issues such as mechanical and electrical interfaces must be addressed. The required analytical and testing work must be approached in a piecewise fashion to achieve timely results. A repeatable process has been developed to identify and prioritize system-level assessments and testing needs. This Assessment Selection Process has been defined to assess cross-cutting integration issues at the system or component level. Assessments are used to identify risks, encourage future actions to mitigate risks, or spur further studies.

One hundred manufacturers expressed interest in bidding for a system on school construction called SCSD, or School Construction Systems Development, to the first California Commission on School Construction Systems. Twenty-two buildings comprised the project. The objective was to develop an integrated system of standard school building components…

A system for the online, non-contact measurement of wall thickness in steel seamless mechanical tubing has been developed and demonstrated at a tubing production line at the Timken Company in Canton, Ohio. The system utilizes laser generation of ultrasound and laser detection of time of flight with interferometry, laser-Doppler velocimetry, and pyrometry, all with fiber coupling. Accuracy (<1% error) and precision (1.5%) are at targeted levels. Cost and energy savings have exceeded estimates. The system has shown good reliability in measuring over 200,000 tubes in its first six months of deployment.
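The underlying measurement reduces to a pulse-echo relation: the time between echoes from the outer and inner tube surfaces, multiplied by the sound speed in steel, halved because the pulse crosses the wall twice. A minimal sketch (the ~5.9 mm/µs longitudinal velocity is a textbook value for steel, and the nominal thickness is hypothetical):

```python
def wall_thickness_mm(velocity_mm_per_us, echo_delta_t_us):
    """Pulse-echo wall thickness: the ultrasound pulse crosses the wall
    twice between successive back-wall echoes, hence the factor of 2."""
    return velocity_mm_per_us * echo_delta_t_us / 2.0

def percent_error(measured, nominal):
    """Deviation from the nominal thickness, as a percentage."""
    return abs(measured - nominal) / nominal * 100.0

t = wall_thickness_mm(5.9, 2.0)    # 2 us echo spacing -> 5.9 mm wall
print(t, percent_error(t, 5.95))   # comfortably under the 1% accuracy target
```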

The objective of this effort is to benchmark the development of process-based quality tools for application in CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in the application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced by and dependent on model quality and modeling process integrity. The goal is to illustrate and expedite modeling and downstream model-based technologies, using available or conceptual methods and tools, to achieve maximum economic advantage and advance process-based quality concepts.

Autonomous control systems provide self-governance beyond that of conventional control systems. As the complexity of mechanical and electrical systems increases, there is a natural drive to develop robust control systems to manage complicated operations. By bridging conventional automated systems and knowledge-based self-awareness systems, nominal control of operations can evolve to rely on safety-critical mitigation processes to handle any off-nominal behavior. Current research and development efforts led by the Autonomous Propellant Loading (APL) group at NASA Kennedy Space Center aim to improve cryogenic propellant transfer operations by developing an automated control and health monitoring system. As an integrated system, the center aims to produce an Autonomous Operations System (AOS) capable of integrating health management operations with automated control to produce a fully autonomous system.

The ground processing and launch of Shuttle vehicles and their payloads is the primary task of Kennedy Space Center. It is a process which is largely manual and contains little inherent automation. Business is conducted today much as it was during previous NASA programs such as Apollo. In light of new programs and decreasing budgets, NASA must find more cost-effective ways in which to do business while retaining the quality and safety of activities. Advanced technologies including artificial intelligence (AI) could cut manpower and processing time. This paper is an overview of the research and development in AI technology at KSC, with descriptions of the systems which have been implemented, as well as a few under development which are promising additions to ground processing software. Projects discussed cover many facets of ground processing activities, including computer sustaining engineering, subsystem monitor and diagnosis tools, and launch team assistants. The deployed AI applications have proven an effectiveness which has helped to demonstrate the benefits of utilizing intelligent software in the ground processing task.

The author describes a comprehensive career development system implemented by Coca-Cola USA. The system's objectives are (1) to promote from within, (2) to develop talent for the future, (3) to make managers responsible for development efforts, and (4) to make individuals ultimately responsible for their development. (CH)

The Exploration Medical Capability (ExMC) Element systems engineering goals include defining the technical system needed to implement exploration medical capabilities for Mars. This past year, scenarios captured in the medical system concept of operations laid the foundation for systems engineering technical development work. The systems engineering team analyzed scenario content to identify interactions between the medical system, crewmembers, the exploration vehicle, and the ground system. This enabled the definition of the functions the medical system must provide and its interfaces to crewmembers and other systems. These analyses additionally led to the development of a conceptual medical system architecture. The work supports the ExMC community-wide understanding of the functional exploration needs to be met by the medical system, the subsequent development of medical system requirements, and the system verification and validation approach utilizing terrestrial analogs and precursor exploration missions.

Documentation for the U.S. Navy's curriculum development system is brought together in this paper, beginning with a description of the Naval Technical Training System. This description includes the Navy Training Plan (NTP) process, which is the current mechanism for introducing new courses; the organization and administration of the system; the…

Viewgraphs are included on process development in aqueous cleaning taking place at the Aerojet Advanced Solid Rocket Motor (ASRM) Division under a NASA Marshall Space Flight Center contract for design, development, test, and evaluation of the ASRM, including new production facilities. The ASRM will utilize aqueous cleaning in several manufacturing process steps to clean case segments, nozzle metal components, and igniter closures. ASRM manufacturing process development is underway, including agent selection, agent characterization, subscale process optimization, bonding verification, and scale-up validation. Process parameters are currently being tested for optimization using a Taguchi matrix, including agent concentration, cleaning solution temperature, agitation and immersion time, rinse water amount and temperature, and use/non-use of drying air. Based on results of process development testing to date, several observations are offered: aqueous cleaning appears effective for steels and SermeTel-coated metals in ASRM processing; aqueous cleaning agents may stain and/or attack bare aluminum metals to various extents; aqueous cleaning appears unsuitable for thermal-sprayed aluminum-coated steel; aqueous cleaning appears to adequately remove a wide range of contaminants from flat metal surfaces, but supplementary assistance may be needed to remove clumps of tenacious contaminants embedded in holes, etc.; and hot rinse water appears beneficial for drying bare steel and retarding its oxidation rate.

Each phase of the sequence development process had to overcome many operational challenges due to the immense complexity of the spacecraft, tour design, pointing capabilities, flight rules, and software development. This paper will address the specific challenges related to each of those complexities and the methods used to overcome them during operation.

In this article we explored the theories of Arnold Gesell, Erik Erikson, and Jean Piaget about how human beings develop. In this component we will analyze the cognitive processes of how children perceive and develop, in particular children from a cross-cultural background. How learning takes place, and how the influences of culture, and…

This study aimed at exploring how high school students deal with designing an information system, for example, for a small business or a medical clinic, the extent to which students develop as independent learners while working on their projects, and the factors that help or hinder fostering students' design skills. The three-phase dual-loop…

Vacuum-Assisted Resin Transfer Molding (VARTM) is a Liquid Composite Molding (LCM) process where both resin injection and fiber compaction are achieved under pressures of 101.3 kPa or less. Originally developed over a decade ago for marine composite fabrication, VARTM is now considered a viable process for the fabrication of aerospace composites (1,2). In order to optimize and further improve the process, a finite element analysis (FEA) process model is being developed to include the coupled phenomena of resin flow, preform compaction, and resin cure. The model input parameters are obtained from resin and fiber-preform characterization tests. In this study, the compaction behavior and the Darcy permeability of a commercially available carbon fabric are characterized. The resulting empirical model equations are input to the 3-Dimensional Infiltration, version 5 (3DINFILv.5) process model to simulate infiltration of a composite panel.
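The resin flow described above is governed by Darcy's law. As a minimal illustration (not the 3DINFILv.5 model itself), the 1-D constant-pressure fill-time estimate below uses invented placeholder material values, not data from the study:

```python
# Illustrative 1-D Darcy's-law estimate of VARTM fill time.
# All material values below are assumed placeholders, not measured data.

def vartm_fill_time(length_m, permeability_m2, porosity, viscosity_pa_s, delta_p_pa):
    """Time for the resin front to traverse `length_m` under a constant
    pressure difference, from the 1-D Darcy solution:
        t = phi * mu * L^2 / (2 * K * dP)
    """
    return porosity * viscosity_pa_s * length_m**2 / (2.0 * permeability_m2 * delta_p_pa)

t = vartm_fill_time(
    length_m=0.5,            # panel length
    permeability_m2=2e-10,   # assumed preform permeability
    porosity=0.5,            # assumed fiber-bed porosity
    viscosity_pa_s=0.2,      # assumed resin viscosity
    delta_p_pa=101.3e3,      # vacuum-driven pressure difference (<= 1 atm)
)
print(f"estimated fill time: {t:.0f} s")
```

Because VARTM pressure differences are capped at one atmosphere, fill time scales with the square of panel length, which is one motivation for the full process model.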

Most medical information systems are interactive information systems, since they provide their users with conversational access to data. The design of an interactive information system requires attention to data design, process design, and user interface design so that the resulting system will be easy to use and reliable. This paper describes some automated tools aimed at assisting software designers and developers in creating interactive information systems, with emphasis on the Software through Pictures environment and the User Software Engineering (USE) methodology.

Trends in recent literature advocate a family systems approach to career development. To examine associations between process aspects of adolescent career development and family adaptability-family cohesion, 262 Virginia high school students (157 females, 105 males) completed the Career Development Inventory, the Assessment of Career Decision…

One of the recent challenges in the aerospace industry has been to smoothly transition operations-oriented computer systems to meet increasing demands on smaller budgets. Sometimes the best solution is not affordable, but the current situation is equally untenable.

The application of Artificial Intelligence to electronic and process control can help attain the autonomy and safety requirements of manned space systems. An overview of documented applications within various industries is presented. The development process is discussed, along with associated issues in implementing an intelligent process control system.

Process recommendation technologies have gained increasing attention in the field of intelligent business process modeling as a way to assist process modeling. However, most existing technologies use only process structure analysis and do not take the social features of processes into account, even though process modeling is complex and comprehensive in most situations. This paper studies the feasibility of applying social network research technologies to process recommendation and builds a social network system of processes based on feature similarities. Three process matching degree measurements are then presented, and the system implementation is discussed. Finally, experimental evaluations and future work are introduced.
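A minimal sketch of feature-similarity process matching of the kind described, assuming set-valued process features and Jaccard similarity; the paper's three matching-degree measures are not specified here, and all process names and features are invented:

```python
# Hypothetical feature-based process recommendation sketch.

def jaccard(a, b):
    """Jaccard similarity between two feature sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Invented example processes with set-valued features.
processes = {
    "order_fulfillment": {"receive", "validate", "pack", "ship"},
    "returns_handling":  {"receive", "validate", "refund"},
    "procurement":       {"request", "approve", "order", "receive"},
}

def recommend(query_features, k=2):
    """Return the k process names most similar to the query features."""
    scored = sorted(processes.items(),
                    key=lambda kv: jaccard(query_features, kv[1]),
                    reverse=True)
    return [name for name, _ in scored[:k]]

print(recommend({"receive", "validate", "ship"}))
```

In a social-network formulation, these pairwise similarities would become edge weights, and recommendation would traverse the resulting process network rather than rank a flat list.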

The Compact Microscope Imaging System (CMIS) is a diagnostic tool with intelligent controls for use in space, industrial, medical, and security applications. The CMIS can be used in situ with a minimum amount of user intervention. This system, which was developed at the NASA Glenn Research Center, can scan, find areas of interest, focus, and acquire images automatically. Large numbers of multiple cell experiments require microscopy for in situ observations; this is only feasible with compact microscope systems. CMIS is a miniature machine vision system that combines intelligent image processing with remote control capabilities. The software also has a user-friendly interface that can be used independently of the hardware for post-experiment analysis. CMIS has potential commercial uses in the automated online inspection of precision parts, medical imaging, security industry (examination of currency in automated teller machines and fingerprint identification in secure entry locks), environmental industry (automated examination of soil/water samples), biomedical field (automated blood/cell analysis), and microscopy community. CMIS will improve research in several ways: It will expand the capabilities of MSD experiments utilizing microscope technology. It may be used in lunar and Martian experiments (Rover Robot). Because of its reduced size, it will enable experiments that were not feasible previously. It may be incorporated into existing shuttle orbiter and space station experiments, including glove-box-sized experiments as well as ground-based experiments.

The process used in the design and evaluation of modules of instruction with the PLATO IV Computer System for stimulus display and response recording is described. Steps in the instructional design process are listed as problem identification and task analysis, identification of entry characteristics, development of performance objectives,…

This slide presentation reviews NASA's use of systems engineering for the complete life cycle of a project. Systems engineering is a methodical, disciplined approach to the design, realization, technical management, operations, and retirement of a system. Each phase of a NASA project is terminated with a Key Decision Point (KDP), which is supported by major reviews.

test system for rapid species identification in addition to the oligonucleotide fingerprint. The test organisms for this work were Bacillus bacteria...comparisons. Results: Different species of bacteria, including Escherichia coli, Bacillus bacteria, and Geobacillus stearothermophilus produce qualitatively...of Bacillus bacteria were “hotter” (red) than the Escherichia coli strains (blue), which suggested gene expressions at these features were greater

A versatile array processor-based system for general-purpose image processing was developed. At the heart of this system is an extensive, flexible software package that incorporates the array processor for effective interactive image processing. The software system is described in detail, and its application to a diverse set of applications at LLNL is briefly discussed. 4 figures, 1 table.

The negative tone development (NTD) process has proven benefits for superior imaging performance in 193nm lithography. Shrink materials, such as AZ® RELACS® have found widespread use as a resolution enhancement technology in conventional 248nm (DUV), 193 nm dry (ArF) and 193 nm immersion (ArFi) lithography. Surfactant rinses, such as AZ® FIRM® are employed as yield enhancement materials to improve the lithographic performance by avoiding pattern collapse, eliminating defects, and improving CDU. This paper describes the development and recent achievements obtained with new shrink and rinse materials for application in NTD patterning processes.

Objective was to provide basis for producing, processing, and forming UHCS (ultrahigh carbon steel) on a commercial scale. Business plans were developed for potential commercialization. Effort was directed at improving the combination of flow stress and forming rates in UHCS alloys in order to make near net shape superplastic forming competitive; the result was the development of a series of UHCS alloys and processing, the selection of which depends on the specific requirements of the commercial application. Useful ancillary properties of these materials include: improved mechanical properties, wear resistance, and oxidation resistance at elevated temperatures.

A Two-Stage Modified FT (MFT) process has been developed for producing high-octane gasoline from coal-based syngas. The main R&D effort is focused on the development of catalysts and process technologies. Duration tests were completed in a single-tube reactor, a pilot plant (100 T/Y), and an industrial demonstration plant (2000 T/Y). A series of satisfactory results has been obtained in terms of operating reliability of equipment, performance of catalysts, purification of coal-based syngas, optimum operating conditions, properties of the gasoline, and economics. Further scale-up to a commercial plant is being considered.

Compatibility between an arcjet propulsion system and a communications satellite was verified by testing a Government-furnished, 1.4 kW hydrazine arcjet system with the FLTSATCOM qualification model satellite in a 9.1-meter (30-foot) diameter thermal-vacuum test chamber. Background pressure was maintained at 10^-5 torr during arcjet operation by cryopumping the thruster exhaust with an array of 5 K liquid helium cooled panels. Power for the arcjet system was obtained from the FLTSATCOM battery simulator. Spacecraft telemetry was monitored during each thruster firing period. No changes in telemetry data attributable to arcjet operation were detected in any of the tests. Electromagnetic compatibility data obtained included radiated emission measurements, conducted emission measurements, and cable coupling measurements. Significant noise was observed at lower frequencies. Above 500 MHz, radiated emissions were generally within limits, indicating that communication links at S-band and higher frequencies will not be affected. Other test data taken with a diagnostic array of calorimeters, radiometers, witness plates, and a residual gas analyzer evidenced compatible operation, and added to the data base for arcjet system integration. Two test series were conducted. The first series only included the arcjet and diagnostic array operating at approximately 0.1 torr background pressure. The second series added the qualification model spacecraft, a solar panel, and the helium cryopanels. Tests were conducted at 0.1 torr and 10^-5 torr. The arcjet thruster was canted 20 degrees relative to the solar panel axis, typical of the configuration used for stationkeeping thrusters on geosynchronous communications satellites.

This article stresses that terrestrial volcanism represents only part of the range of volcanism in the solar system. Earth processes of volcanicity are dominated by plate tectonics, which does not seem to operate on other planets, except possibly on Venus. Lunar volcanicity is dominated by lava effusion at enormous rates. Mars is similar, with the addition of huge shield volcanoes developed over fixed hotspots. Io, the moon closest to Jupiter, is the most active body in the Solar System, emitting large amounts of sulphur and silicates. The eruptions of Io are generated by heating caused by tides induced by Jupiter. Nearby Europa seems to emit water from fractures, and Ganymede is similar. The satellites of Saturn and Uranus are also marked by volcanic craters, but these involve very low temperature melts, possibly of ammonia and water. The volcanism of the solar system is generally more exotic the greater the distance from Earth. -A.Scarth

A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.
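The blackboard architecture described can be sketched minimally as follows. The engine names, fields, and confidence scores are hypothetical, and a real controller would merge hypotheses incrementally and opportunistically rather than simply keeping the maximum:

```python
# Toy blackboard: knowledge sources post candidate readings of check fields;
# a trivial controller selects the highest-confidence hypothesis per field.
# All sources, fields, and scores below are invented for illustration.

class Blackboard:
    def __init__(self):
        self.hypotheses = []  # (field, value, confidence, source)

    def post(self, field, value, confidence, source):
        """A knowledge source contributes a hypothesis for a field."""
        self.hypotheses.append((field, value, confidence, source))

    def best(self, field):
        """Return the highest-confidence hypothesis for a field, if any."""
        candidates = [h for h in self.hypotheses if h[0] == field]
        return max(candidates, key=lambda h: h[2]) if candidates else None

bb = Blackboard()
bb.post("courtesy_amount", "125.00", 0.81, "numeral_engine_A")
bb.post("courtesy_amount", "120.00", 0.55, "numeral_engine_B")
bb.post("legal_amount", "one hundred twenty-five", 0.77, "word_engine")
print(bb.best("courtesy_amount"))
```

The design's appeal is that recognition engines stay independent: each posts partial results to the shared blackboard, and cross-field consistency checks (e.g. courtesy amount versus legal amount) can be added as further knowledge sources without changing the engines.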

a block diagram of the SoC based system. The digital portion of the chip includes a general purpose microcontroller for fully programmable...to a NVM. It has a programmable 16b MSP430 microcontroller and hardware accelerators for programmable FFT, FIR filtering, Cordic co-processing, and...the Vcap to a 1.2V and 0.5V supply, which are leveraged by the microcontroller in active, sleep and deep sleep modes. Once the Vcap is charged, the

By formalizing the hiring process in your EMS system, you help ensure the command staff provides the organization with high-quality employees. It is far better to focus on the quality of the people in your department than the quantity. Many departments have lots of members on their roster, but only a handful who actively contribute.

Both behavioral and neuroimaging evidence indicate that individuals with autism demonstrate marked abnormalities in the processing of faces. These abnormalities are often explained as either the result of an innate impairment to specialized neural systems or as a secondary consequence of reduced levels of social interest. A review of the…

The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. The conference is interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, vision, speech and signal processing, reinforcement learning and control, implementations, and diverse applications. Only about 30 percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. This CD-ROM contains the entire proceedings of the twelve Neural Information Processing Systems conferences from 1988 to 1999. The files are available in the DjVu image format developed by Yann LeCun and his group at AT&T Labs. The CD-ROM includes free browsers for all major platforms. Michael I. Jordan is Professor of Computer Science and of Statistics at the University of California, Berkeley. Yann LeCun is Head of the Image Processing Research Department at AT&T Labs-Research. Sara A. Solla is Professor of Physics and of Physiology at Northwestern University.

The low-pressure hydride/dehydride process was developed from the need to recover thin-film coatings of plutonium metal from the inner walls of an isotope separation chamber located at Los Alamos and to improve the safety operation of a hydride recovery process using hydrogen at a pressure of 0.7 atm at Rocky Flats. This process is now the heart of the Advanced Recovery and Integrated Extraction System (ARIES) project.

This research project is designed to apply artificial intelligence technology, including expert systems, dynamic interfacing of neural networks, and hypertext, to construct an expert system developer. The developer environment is specifically suited to building expert systems that monitor the performance of ground support equipment for propulsion systems and testing facilities. The expert system developer, through the use of a graphics interface and a rule network, will be transparent to the user during rule construction and data scanning of the knowledge base. The project will result in a software system that allows its user to build monitoring-type expert systems for various equipment used in propulsion systems or ground testing facilities and that accrues system performance information in a dynamic knowledge base.

The US Department of Energy's (DOE) Ocean Energy Technology Program seeks to develop the technology for converting the ocean's vast energy resource into usable forms to the point where industry can assess its potential commercial utility. The current focus of the program is on the utilization of open-cycle OTEC to produce electricity. The open-cycle OTEC process is one of the few alternative energy options with the potential for baseload-carrying capability. This paper provides a very brief overview of the program activities and focuses on results recently obtained from the program's experimental facility, designed to allow testing of OC-OTEC subsystems under actual operating conditions using seawater. The facility, referred to as the Seacoast Test Facility (STF), is currently composed of a Heat and Mass Transfer Scoping Test Apparatus (HMTSTA) supplied by up to 1600 gallons per minute of warm seawater and 1000 gallons per minute of cold seawater. Researchers have obtained experimental data on the performance of evaporators and surface condensers, as well as information on mist elimination and deaeration processes. Plans call for modification of the HMTSTA to accommodate the addition of direct-contact condensers. Summary results are discussed from recent studies by Argonne National Laboratory (ANL) of corrosion and biofouling of aluminum alloy surface condensers. Also discussed are the production of desalinated seawater using an open-cycle OTEC process, recent developments in OTEC turbines, and an assessment of the seawater supply systems required for OTEC. A brief overview of the program's future plans is also presented. 4 refs., 11 figs., 2 tabs.

Two recent books (Jiang, 2014, "Advances in Chinese as a second language"; Wang, 2013, "Grammatical development of Chinese among non-native speakers") provide new resources for exploring the role of processing in acquiring Chinese as a second language (L2). This review article summarizes, assesses and compares some of the…

Presents descriptions of the management, systematic, and open-access curriculum development models to identify the decisionmaking bases, operational processes, evaluation requirements, and curriculum control methods of each model. A possible relationship among these models is then suggested. (Author/DN)

The reflective and interrogative processes required for developing effective qualitative research questions can give shape and direction to a study in ways that are often underestimated. Good research questions do not necessarily produce good research, but poorly conceived or constructed questions will likely create problems that affect all…

Recent work has demonstrated the importance of derivational morphology to later language development and has led to a consensus that derivation is a lexical process. In this review, derivational morphology is discussed in terms of lexical representation models from both linguistic and psycholinguistic perspectives. Input characteristics, including…

The European Spallation Source (ESS) is a neutron-scattering facility funded and supported in collaboration with 17 European countries in Lund, Sweden. Cryogenic cooling at ESS is vital particularly for the linear accelerator, the hydrogen target moderators, a test stand for cryomodules, the neutron instruments and their sample environments. The paper will focus on specific process design criteria, design decisions and their motivations for the helium cryoplants and auxiliary equipment. Key issues for all plants and their process concepts are energy efficiency, reliability, smooth turn-down behaviour and flexibility. The accelerator cryoplant (ACCP) and the target moderator cryoplant (TMCP) in particular need to be prepared for a range of refrigeration capacities due to the intrinsic uncertainties regarding heat load definitions. Furthermore the paper addresses questions regarding process arrangement, 2 K cooling methodology, LN2 precooling, helium storage, helium purification and heat recovery.

A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.

The increased contribution of solar energy among power generation sources requires an accurate estimation of surface solar irradiance, conditioned by geographical, temporal, and meteorological factors. Knowledge of the variability of these factors is essential to estimate the expected energy production, and it therefore helps stabilize the electricity grid and increase the reliability of available solar energy. The use of numerical meteorological models in combination with statistical post-processing tools has the potential to satisfy the requirements for short-term forecasting of solar irradiance up to several days ahead and its application in solar devices. In this contribution, we present an assessment of a short-term irradiance prediction system based on the WRF-ARW mesoscale meteorological model (Skamarock et al., 2005) and several post-processing tools intended to improve the overall skill of the system in an annual simulation of the year 2004 in Spain. The WRF-ARW model is applied with 4 km x 4 km horizontal resolution and 38 vertical layers over the Iberian Peninsula. The hourly model irradiance is evaluated against more than 90 surface stations. The stations are used to assess the temporal and spatial fluctuations and trends of the system, evaluating three different post-processing methods: the Model Output Statistics technique (MOS; Glahn and Lowry, 1972), a recursive statistical method (REC; Boi, 2004), and a Kalman Filter Predictor (KFP; Bozic, 1994; Roeger et al., 2003). A first evaluation of the system without post-processing tools shows an overestimation of the surface irradiance, due to attenuation by atmospheric absorbers other than clouds that is not included in the meteorological model. This produces an annual BIAS of 16 W m-2 h-1, an annual RMSE of 106 W m-2 h-1, and an annual NMAE of 42%. The largest errors are observed in spring and summer, reaching an RMSE of 350 W m-2 h-1. Results using the Kalman Filter Predictor show a reduction of 8% in RMSE, 83% in BIAS
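A Kalman-filter predictor of the kind used here as a post-processing tool can be sketched as a scalar recursive bias estimator: each hour, the filter updates its estimate of the model's systematic error from the latest forecast-observation pair and subtracts it from the next forecast. The variances `q` and `r` below are assumed tuning values, and the numbers are invented, not data from the study:

```python
# Minimal scalar Kalman-filter bias corrector for irradiance forecasts.
# q (process variance) and r (measurement variance) are assumed tuning values.

class KalmanBias:
    def __init__(self, q=1.0, r=10.0):
        self.b = 0.0   # current bias estimate (W m^-2)
        self.p = 1.0   # variance of the bias estimate
        self.q, self.r = q, r

    def update(self, forecast, observation):
        self.p += self.q                  # predict: bias may drift
        k = self.p / (self.p + self.r)    # Kalman gain
        err = forecast - observation      # observed forecast error
        self.b += k * (err - self.b)      # correct the bias estimate
        self.p *= (1.0 - k)
        return forecast - self.b          # bias-corrected forecast

kf = KalmanBias()
forecasts = [520, 540, 510, 530]          # invented model irradiance values
observations = [500, 515, 492, 508]       # invented station measurements
corrected = [kf.update(f, o) for f, o in zip(forecasts, observations)]
```

Because the model overestimates in this toy series, the filter learns a positive bias and the corrected forecasts drop below the raw ones, which mirrors the BIAS reduction reported above.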

A novel process was developed for converting scum, a waste material from wastewater treatment facilities, to biodiesel. Scum is an oily waste skimmed from the surface of primary and secondary settling tanks in wastewater treatment plants. Currently, scum is treated either by anaerobic digestion or landfilling, both of which raise environmental issues. The newly developed process uses a six-step method to convert scum to biodiesel, a higher-value product. A combination of acid washing and acid-catalyzed esterification was developed to remove soap and impurities while converting free fatty acids to methyl esters. A glycerol washing was used to facilitate the separation of biodiesel and glycerin after base-catalyzed transesterification. As a result, 70% of the dried and filtered scum was converted to biodiesel, equivalent to about 134,000 gallons of biodiesel per year for the Saint Paul wastewater treatment plant in Minnesota.

The significance of compiling case histories of empirical process knowledge and the role of such histories in improving the efficiency of manufacturing process development is discussed in this paper. Methods of representing important investigations as cases and using the information from such cases to eliminate redundancy of empirical investigations in analogous process development situations are also discussed. A system is proposed that uses such methods to capture the problem-solving framework of the application domain. A conceptual design of the system is presented and discussed.

Research at Virginia Tech led to the development of two complementary concepts for improving the removal of inorganic sulfur from many eastern U.S. coals. These concepts are referred to as Electrochemically Enhanced Sulfur Rejection (EESR) and Polymer Enhanced Sulfur Rejection (PESR) processes. The EESR process uses electrochemical techniques to suppress the formation of hydrophobic oxidation products believed to be responsible for the floatability of coal pyrite. The PESR process uses polymeric reagents that react with pyrite and convert floatable middlings, i.e., composite particles composed of pyrite with coal inclusions, into hydrophilic particles. These new pyritic-sulfur rejection processes do not require significant modifications to existing coal preparation facilities, thereby enhancing their adoptability by the coal industry. It is believed that these processes can be used simultaneously to maximize the rejection of both well-liberated pyrite and composite coal-pyrite particles. The project was initiated on October 1, 1992 and all technical work has been completed. This report is based on the research carried out under Tasks 2-7 described in the project proposal. These tasks include Characterization, Electrochemical Studies, In Situ Monitoring of Reagent Adsorption on Pyrite, Bench Scale Testing of the EESR Process, Bench Scale Testing of the PESR Process, and Modeling and Simulation.

In complex systems with stochastic components, systems laws often emerge that describe higher level behavior regardless of lower level component configurations. In this paper, emergent laws for describing mechanochemical systems are investigated for processive myosin-actin motility systems. On the basis of prior experimental evidence that longer processive lifetimes are enabled by larger myosin ensembles, it is hypothesized that emergent scaling laws could coincide with myosin-actin contact probability or system energy consumption. Because processivity is difficult to predict analytically and measure experimentally, agent-based computational techniques are developed to simulate processive myosin ensembles and produce novel processive lifetime measurements. It is demonstrated that only systems energy relationships hold regardless of isoform configurations or ensemble size, and a unified expression for predicting processive lifetime is revealed. The finding of such laws provides insight for how patterns emerge in stochastic mechanochemical systems, while also informing understanding and engineering of complex biological systems.
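
The ensemble-size effect described above can be made concrete with a minimal agent-based sketch in Python. It is not the authors' simulator: the attachment and detachment rates, time step, and ensemble sizes are illustrative placeholders, and a run is simply taken to end when every head is detached.

```python
import random

def processive_lifetime(n_heads, k_on=20.0, k_off=100.0, dt=1e-4, rng=random):
    """Simulate one processive run: each detached head attaches with
    rate k_on (1/s), each attached head detaches with rate k_off (1/s),
    and the run ends when every head is detached.  Rates are
    illustrative placeholders, not measured myosin constants."""
    attached = [True] * n_heads          # all heads start bound
    t = 0.0
    while any(attached):
        attached = [(rng.random() >= k_off * dt) if a
                    else (rng.random() < k_on * dt) for a in attached]
        t += dt
    return t

def mean_lifetime(n_heads, runs=300, seed=1):
    """Average processive lifetime over many independent runs."""
    rng = random.Random(seed)
    return sum(processive_lifetime(n_heads, rng=rng) for _ in range(runs)) / runs
```

In this toy model, as in the experimental trend cited above, larger ensembles persist longer, because a run terminates only when all heads happen to be detached simultaneously.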

The SANC system is used for systematic calculations of various processes within the Standard Model in the one-loop approximation. QED, electroweak, and QCD corrections are computed for a number of processes of interest to modern and future high-energy experiments. Several applications for the LHC physics program are presented. Development of the system and the general problems and perspectives for future improvement of the theoretical precision are discussed.

Onboard image processing systems for a hyperspectral sensor have been developed in order to maximize image data transmission efficiency for large-volume and high-speed data downlink capacity. Since more than 100 channels are required for hyperspectral sensors on Earth observation satellites, fast and small-footprint lossless image compression capability is essential for reducing the size and weight of a sensor system. A fast lossless image compression algorithm has been developed and implemented in the onboard circuitry that corrects the sensitivity and linearity of Complementary Metal Oxide Semiconductor (CMOS) sensors, in order to maximize the compression ratio. The image compression method employed is based on the Fast, Efficient, Lossless Image compression System (FELICS), a hierarchical predictive coding method with resolution scaling. To improve FELICS's image decorrelation and entropy coding, we apply a two-dimensional interpolation prediction and adaptive Golomb-Rice coding. The method supports progressive decompression using resolution scaling while maintaining superior performance in terms of speed and complexity. Coding efficiency and compression speed enlarge the effective capacity of signal transmission channels, which leads to reduced onboard hardware by multiplexing sensor signals into a smaller number of compression circuits. The circuitry is embedded into the data formatter of the sensor system without adding size, weight, power consumption, or fabrication cost.
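
The adaptive Golomb-Rice stage can be illustrated with a short sketch. This is not the flight coder: the zig-zag mapping and the brute-force parameter search below merely stand in for the running adaptation that FELICS-style coders apply to prediction errors.

```python
def rice_encode(value, k):
    """Rice code for a non-negative integer: a unary quotient
    (value >> k), a '0' terminator, then k remainder bits."""
    q, r = value >> k, value & ((1 << k) - 1)
    bits = '1' * q + '0'
    if k:
        bits += format(r, f'0{k}b')
    return bits

def rice_decode(bits, k):
    """Inverse of rice_encode for a single code word."""
    q = bits.index('0')                  # length of the unary part
    r = int(bits[q + 1:q + 1 + k], 2) if k else 0
    return (q << k) | r

def zigzag(e):
    """Map a signed prediction error to a non-negative integer so
    that small-magnitude errors receive short codes."""
    return 2 * e if e >= 0 else -2 * e - 1

def best_k(errors, k_range=range(8)):
    """Exhaustively pick the Rice parameter giving the shortest total
    code -- a simple stand-in for FELICS's running adaptation."""
    return min(k_range,
               key=lambda k: sum(len(rice_encode(zigzag(e), k)) for e in errors))
```

Sharply peaked prediction errors drive best_k toward 0, while broader error distributions push it up; this adaptation is what keeps the average code length close to the entropy of the residuals.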

An important new frontier is being opened in steel processing with the emergence of thin strip casting. Casting steel directly to thin strip has enormous benefits in energy savings by potentially eliminating the need for hot reduction in a hot strip mill. This has been the driving force for numerous current research efforts into the direct strip casting of steel. The US Department of Energy initiated a program to evaluate the development of thin strip casting in the steel industry. In earlier phases of this program, planar flow casting on an experimental caster was studied by a team of engineers from Westinghouse Electric Corporation and Armco Inc. A subsequent research program was designed as a fundamental and developmental study of both planar and melt overflow casting processes. This study was arranged as several separate and distinct tasks, which were often completed by different teams of researchers. An early task was to design and build a water model to study fluid flow through different designs of planar flow casting nozzles. Another important task was mathematical modeling of the melt overflow casting process. A mathematical solidification model for the formation of the strip in the melt overflow process was written. A study of the material and conditioning of casting substrates was made on the small wheel caster using the melt overflow casting process. This report discusses work on the development of thin steel casting.

The aim of the Soldier Integrated Headwear System Technology Demonstration Project (SIHS-TDP) is to empirically determine the most promising headwear integration concept that significantly enhances the survivability and effectiveness of the future Canadian…

Most innovations have contextual pre-cursors that prompt new ways of thinking and in their turn help to give form to the new reality. This was the case with the e-scape software development process. The origins of the system existed in software components and ideas that we had developed through previous projects, but the ultimate direction we took…

Three overlapping areas of research activities are presented: (1) Information theory and optimal filtering are extended to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing. (2) Focal-plane processing techniques and technology are developed to combine effectively image gathering with coding. The emphasis is on low-level vision processing akin to the retinal processing in human vision. (3) A breadboard adaptive image-coding system is being assembled. This system will be used to develop and evaluate a number of advanced image-coding technologies and techniques as well as research the concept of adaptive image coding.

Transient gene expression (TGE) is a rapid method for the production of recombinant proteins in mammalian cells. While the volumetric productivity of TGE has improved significantly over the past decade, most methods involve extensive cell line engineering and plasmid vector optimization in addition to long fed-batch cultures lasting up to 21 days. Our colleagues have recently reported the development of a CHO K1SV GS-KO host cell line. By creating a bi-allelic glutamine synthetase knockout of the original CHOK1SV host cell line, they were able to improve the efficiency of generating high-producing stable CHO lines for drug product manufacturing. We developed a TGE method using the same CHO K1SV GS-KO host cell line without any further cell line engineering. We also refrained from performing plasmid vector engineering. Our objective was to set up a TGE process that mimics the protein quality attributes obtained from a stable CHO cell line. Polyethyleneimine (PEI)-mediated transfections were performed at high cell density (4 × 10^6 cells/mL) followed by immediate growth arrest at 32 °C for 7 days. Optimizing DNA and PEI concentrations proved to be important. Interestingly, we found the direct transfection method (where DNA and PEI were added sequentially) to be superior to the more common indirect method (where DNA and PEI are first pre-complexed). Moreover, the addition of a single feed solution and a polar solvent (N,N-dimethylacetamide) significantly increased product titers. The scalability of the process from 2 mL to 2 L was demonstrated using multiple proteins and multiple expression volumes. Using this simple, short, 7-day TGE process, we were able to successfully produce 54 unique proteins in a fraction of the time that would have been required to produce the respective stable CHO cell lines. The list of 54 unique proteins includes mAbs, bispecific antibodies, and Fc-fusion proteins. Antibody titers of up to 350 mg/L were achieved with the simple 7-day process. Titers

Pilot Carrousel testing was conducted for about three months on wastewaters generated at a major potato processing facility in 1993. The testing focused on removal of BOD, NH3 and NO3, and Total-P. After the five to six weeks it took for the system to reach steady-state operation, the pilot plant was able to treat the wastewaters quite well. Effluent BOD5 and TKN values were less than 8 and 4 mg/L, respectively, during the second half of testing. Total-P in the effluent was less than 10 mg/L, although this step was not optimized. Based on the pilot testing, a full-scale Carrousel activated sludge plant was designed and commissioned in 1994. This plant is currently treating all the wastewaters from the facility and performing contaminant removals at a very high level.

The speed of growth in high technology differs for software, hardware, and firmware. Hardware innovations come in leaps, as opposed to the gradual improvements in software and firmware technologies. This inhibits the full utilization of hardware advances and reduces the cost-benefit ratio of high-technology ventures. The Microelectronics Systems Branch has committed its resources to a look-ahead technique whereby a technology is researched with respect to its current uses, its development track, and its future capabilities. This knowledge is used to meet the requirements of the space program not only in the nineties but also through 2005. This paper illustrates how Analytical Hierarchy Process techniques were used effectively and very successfully to support projects that reuse systems with predicted enhancements.
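
The core of the Analytical Hierarchy Process is deriving priority weights from a matrix of pairwise comparisons. A minimal sketch follows, using a made-up comparison matrix rather than one from the Branch's actual evaluations:

```python
def ahp_weights(M, iters=200):
    """Approximate the principal eigenvector of a pairwise comparison
    matrix by power iteration; normalized to sum to 1, it is the AHP
    priority (weight) vector."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Hypothetical comparison of three candidate technologies on Saaty's
# 1-9 scale: M[i][j] says how strongly option i is preferred over
# option j, with M[j][i] = 1 / M[i][j] for reciprocity.
M = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
weights = ahp_weights(M)
```

Here the first option dominates, so it receives the largest weight; in practice a consistency check (Saaty's consistency ratio) would follow before the weights are trusted.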

where we stand in the development of operational theories of organizational learning. We make this assessment and propose a new model of organizational learning. Our goal is to define two perspectives on organizational learning, to contrast and connect them, and thereby to facilitate movement…

Multiple process approaches have been used historically to manufacture cylindrical nuclear fuel compacts. Scale-up of fuel compacting was required for the Next Generation Nuclear Plant (NGNP) project to achieve an economically viable automated production process capable of providing a minimum of 10 compacts/minute with high production yields. In addition, the scale-up effort was required to achieve matrix density equivalent to baseline historical production processes, and allow compacting at fuel packing fractions up to 46% by volume. The scale-up approach of jet milling, fluid-bed overcoating, and hot-press compacting adopted in the U.S. Advanced Gas Reactor (AGR) Fuel Development Program involves significant paradigm shifts to capitalize on distinct advantages in simplicity, yield, and elimination of mixed waste. A series of compaction trials have been completed to optimize compaction conditions of time, temperature, and forming pressure using natural uranium oxycarbide (NUCO) fuel at packing fractions exceeding 46% by volume. Results from these trials are included. The scale-up effort is nearing completion with the process installed and operable using nuclear fuel materials. Final process testing is in progress to certify the process for manufacture of qualification test fuel compacts in 2012.

The study of resilience in child development has overturned many negative assumptions about children growing up in adverse conditions. An examination of findings from variable- and person-focused investigations suggests that resilience is common and usually arises from the normative functions of human adaptational systems, with the greatest…

This paper describes experience gained in the development and introduction of a full-scale process control system for the PGU-450T power production unit of station No. 3 at the TETs-27 heat and electric power station of JSC 'Mosenergo', based on the latest, fourth-generation SPPA-T3000 software package, which is being used for the first time in Russia for steam-gas units. The fundamental technical solutions for the structure of the process control system are described, along with the features of the algorithms for control of the main engineering equipment in electric power plants based on the PGU-450.

This viewgraph presentation provides information on the development of a system by which aircraft pilots will be warned of turbulence. This networked system of in situ sensors will be mounted on various aircraft all of which are linked through a ground based parabolic antenna. As its end result, this system will attempt to reduce the number of accidents arising from turbulence.

Life depends as much on the flow of information as on the flow of energy. Here we review the many efforts to make this intuition precise. Starting with the building blocks of information theory, we explore examples where it has been possible to measure, directly, the flow of information in biological networks, or more generally where information-theoretic ideas have been used to guide the analysis of experiments. Systems of interest range from single molecules (the sequence diversity in families of proteins) to groups of organisms (the distribution of velocities in flocks of birds), and all scales in between. Many of these analyses are motivated by the idea that biological systems may have evolved to optimize the gathering and representation of information, and we review the experimental evidence for this optimization, again across a wide range of scales.
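
As a concrete instance of measuring information flow, the plug-in estimator below computes the mutual information between two discrete variables from joint samples. It is a textbook estimator offered only to make the idea tangible; real analyses of protein families or flocks require bias corrections and continuous-variable estimators.

```python
from collections import Counter
from math import log2

def mutual_information(pairs):
    """Plug-in estimate of I(X;Y) in bits from a list of (x, y)
    samples, using empirical joint and marginal frequencies."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum(c / n * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# A noiseless binary channel carries exactly one bit per symbol.
samples = [(0, 0), (1, 1)] * 50
print(mutual_information(samples))  # → 1.0
```

For independent variables the estimate is (up to sampling fluctuations) zero, and it grows toward the entropy of the input as the statistical dependence tightens.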

A safeguards system has been developed since 1993 in the course of supporting a fuel cycle process to fabricate CANDU fuel with spent PWR fuel (known as Direct Use of PWR spent fuel In CANDU, DUPIC). The major safeguards technology involved was the design and fabrication of a neutron coincidence counting system for process accountability, together with an unattended continuous monitoring system in association with independent verification by the IAEA. This combined technology produces information on nuclear material content and maintains knowledge of the continuity of nuclear material flow. In addition to hardware development, diagnostic software is being developed to assist data acquisition, data review, and data evaluation based on a neural network system on the IAEA C/S system.

This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during a volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis" (Figure 1-1). The objective of this analysis was to develop the BDCFs for the

In recent years, there has been much interest in the development of solid oxide fuel cell technology operating directly on hydrocarbon fuels. The development of a catalytic fuel processing system, which is integrated with the solid oxide fuel cell (SOFC) power source, is outlined here. The catalytic device utilises a novel three-way catalytic system consisting of an in situ pre-reformer catalyst, the fuel cell anode catalyst, and a platinum-based combustion catalyst. The three individual catalytic stages have been tested in a model catalytic microreactor. Both temperature-programmed and isothermal reaction techniques have been applied. Results from these experiments were used to design the demonstration SOFC unit. The apparatus used for catalytic characterisation can also perform in situ electrochemical measurements as described in previous papers [C.M. Finnerty, R.H. Cunningham, K. Kendall, R.M. Ormerod, Chem. Commun. (1998) 915-916; C.M. Finnerty, N.J. Coe, R.H. Cunningham, R.M. Ormerod, Catal. Today 46 (1998) 137-145]. This enabled the performance of the SOFC to be determined over a range of temperatures and reaction conditions, with a current output of 290 mA cm^-2 at 0.5 V being recorded. Methane and butane have been evaluated as fuels. Thus, optimisation of the in situ partial oxidation pre-reforming catalyst was essential, with catalysts producing high H2/CO ratios at reaction temperatures between 873 K and 1173 K being chosen. These included Ru and Ni/Mo-based catalysts. Hydrocarbon fuels were directly injected into the catalytic SOFC system. Microreactor measurements revealed the reaction mechanisms as the fuel was transported through the three-catalyst device. The demonstration system showed that the fuel processing could be successfully integrated with the SOFC stack.

A process for making an interconnect system for a multilayer circuit pattern. The interconnect system is formed having minimized through-hole space consumption so as to be suitable for high density, closely meshed circuit patterns.

General Dynamics has developed advanced hardware, software, and algorithms for use with the Tomahawk cruise missile and other unmanned vehicles. We have applied this technology to the problem of locating and determining the orientation of the docking port of a target vehicle with respect to an approaching spacecraft. The system described in this presentation utilizes a multi-processor based computer to digitize and process television imagery and extract parameters such as range to the target vehicle, approach velocity, and pitch and yaw angles. The processor is based on the Inmos T-800 Transputer and is configured as a loosely coupled array. Each processor operates asynchronously and has its own local memory. This allows additional processors to be easily added if additional processing power is required for more complex tasks. Total system throughput is approximately 100 MIPS (scalar) and 60 MFLOPS and can be expanded as desired. The algorithm implemented on the system uses a unique adaptive thresholding technique to locate the target vehicle and determine the approximate position of the docking port. A target pattern surrounding the port is then analyzed in the imagery to determine the range and orientation of the target. This information is passed to an autopilot which uses it to perform course and speed corrections. Future upgrades to the processor are described which will enhance its capabilities for a variety of missions.
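
The abstract does not spell out the adaptive thresholding algorithm, but its flavor can be shown with a simple local-mean variant: each pixel is compared against the mean of its neighborhood, so a bright docking-target blob stands out even under uneven illumination. The window size and bias below are arbitrary illustrative choices.

```python
def adaptive_threshold(img, win=3, bias=0):
    """Binarize a grayscale image (list of rows) by marking pixels
    brighter than the mean of their (win x win) neighborhood plus an
    optional bias."""
    h, w = len(img), len(img[0])
    r = win // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[cy][cx]
                    for cy in range(max(0, y - r), min(h, y + r + 1))
                    for cx in range(max(0, x - r), min(w, x + r + 1))]
            out[y][x] = 1 if img[y][x] > sum(vals) / len(vals) + bias else 0
    return out
```

Because the threshold is local, a global change in scene brightness leaves the segmentation largely unchanged, which is what makes this family of techniques robust for target acquisition.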

The selective hydrophobic coagulation (SHC) process is based on the recent finding that hydrophobic particles can be selectively coagulated without using traditional agglomerating agents or flocculants. The driving force for the coagulation is the attractive energy between hydrophobic surfaces, an interaction that has been overlooked in classical colloid chemistry. In most cases, selective separations can be achieved using simple pH control to disperse the mineral matter, followed by recovery of the coal coagula using techniques that take advantage of the size enlargement. In the present work, studies have been carried out to further investigate the fundamental mechanisms of the SHC process and the parameters that affect the process of separating coal from the ash-forming minerals and pyritic sulfur. Studies have included direct force measurements of the attractive interaction between model hydrophobic surfaces, in-situ measurements of the size distributions of coagula formed under a variety of operating conditions, and development of a population balance model to describe the coagulation process. An extended DLVO colloid stability model which includes a hydrophobic interaction energy term has also been developed to explain the findings obtained from the experimental studies. In addition to the fundamental studies, bench-scale process development test work has been performed to establish the best possible method of separating the coagula from dispersed mineral matter. Two types of separators, i.e., a sedimentation tank and a rotating drum screen, were examined in this study. The sedimentation tank proved to be the more efficient unit, achieving ash reductions as high as 60% in a single pass while recovering more than 90% of the combustible material. This device, which minimizes turbulence and coagula breakage, was used in subsequent test work to optimize design and operating parameters.
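
The extended DLVO model mentioned above augments the classical van der Waals and electrical double-layer terms with a hydrophobic attraction. A sketch for two flat plates follows; every constant is an illustrative placeholder rather than a fitted coal or pyrite value, and the double-layer term uses the weak-overlap, constant-potential approximation.

```python
import math

EPS_WATER = 78.5 * 8.854e-12    # permittivity of water (F/m)

def extended_dlvo(h, A=1e-20, kappa=1e8, psi=25e-3, C=5e-3, D=1e-9):
    """Interaction energy per unit area (J/m^2) between two flat
    plates at separation h (m): van der Waals attraction (Hamaker
    constant A) + weak-overlap double-layer repulsion (inverse Debye
    length kappa, surface potential psi) + an exponential hydrophobic
    attraction of strength C and decay length D."""
    v_vdw = -A / (12 * math.pi * h ** 2)
    v_edl = 2 * EPS_WATER * kappa * psi ** 2 * math.exp(-kappa * h)
    v_hyd = -C * math.exp(-h / D)
    return v_vdw + v_edl + v_hyd
```

With C = 0 the expression reduces to classical DLVO; a positive C deepens the attractive well between hydrophobic surfaces, which is the extra coagulation driving force the SHC process exploits.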

Accurate and timely information on rice crop growth and yield helps governments and other stakeholders adapt their economic policies and enables relief organizations to better anticipate and coordinate relief efforts in the wake of a natural catastrophe. Such delivery of rice growth and yield information is made possible by regular earth observation using space-borne Synthetic Aperture Radar (SAR) technology combined with a crop modeling approach to estimate yield. Radar-based remote sensing is capable of observing rice vegetation growth irrespective of cloud coverage, an important feature given that during flooding the sky is often cloud-covered. The system allows rapid damage assessment over the area of interest. Rice yield monitoring is based on a crop growth simulation and SAR-derived key information, particularly start of season and leaf growth rate. Results from pilot study sites in South and South East Asian countries suggest that incorporation of SAR data into the crop model improves estimation of actual yields. Remote-sensing data assimilation into the crop model effectively captures responses of rice crops to environmental conditions over large spatial coverage, which otherwise is practically impossible to achieve. Such improvement of actual yield estimates offers practical applications, such as in a crop insurance program. A process-based crop simulation model is used in the system to ensure climate information is adequately captured and to enable mid-season yield forecasts.

We developed a hydraulic fracturing simulator by coupling a flow simulator to a geomechanics code, namely T+M simulator. Modeling of the vertical fracture development involves continuous updating of the boundary conditions and of the data connectivity, based on the finite element method for geomechanics. The T+M simulator can model the initial fracture development during the hydraulic fracturing operations, after which the domain description changes from single continuum to double or multiple continua in order to rigorously model both flow and geomechanics for fracture-rock matrix systems. The T+H simulator provides two-way coupling between fluid-heat flow and geomechanics, accounting for thermo-poro-mechanics, treats nonlinear permeability and geomechanical moduli explicitly, and dynamically tracks changes in the fracture(s) and in the pore volume. We also fully account for leak-off in all directions during hydraulic fracturing. We first test the T+M simulator, matching numerical solutions with the analytical solutions for poromechanical effects, static fractures, and fracture propagations. Then, from numerical simulation of various cases of the planar fracture propagation, shear failure can limit the vertical fracture propagation of tensile failure, because of leak-off into the reservoirs. Slow injection causes more leak-off, compared with fast injection, when the same amount of fluid is injected. Changes in initial total stress and contributions of shear effective stress to tensile failure can also affect formation of the fractured areas, and the geomechanical responses are still well-posed.

Starting in January 2004, NASA instituted a set of internal working groups to develop ongoing recommendations for the continuing broad evolution of Earth Science Data Systems development and management within NASA. One of these Data Systems Working Groups is called the Standards Process Group (SPG). This group's goal is to facilitate broader use of standards that have proven implementation and operational benefit to NASA Earth science, by facilitating the approval of proposed standards and directing the evolution of standards. We have found that the candidate standards that self-defined communities are proposing for approval to the SPG are of 3 types: (1) a NASA community-developed standard used within at least one self-defined community, where the proposed standard has not been approved or adopted by an external standards organization and where new implementations are expected to be developed from scratch, using the proposed standard as the implementation specification; (2) a NASA community-developed standard used within at least one self-defined community, where the proposed standard has not been approved or adopted by an external standards organization and where new implementations are not expected to be developed from scratch but use existing software libraries or code; (3) a standard already approved by an external standards organization that is being proposed for use by the NASA Earth science community. There are 3 types of reviews potentially needed to evaluate a proposed standard: (1) a detailed technical review to determine the quality, accuracy, and clarity of the proposed specification, which ensures that implementers can use the proposed standard as an implementation specification for any future implementations with confidence; (2) a "usefulness" user review that determines whether the proposed standard is useful, helpful, or necessary for users to carry out their work; (3) an operational review that evaluates if the

An effective systems engineering approach applied through the project life cycle can help Langley produce a better product. This paper demonstrates how an enhanced systems engineering process for in-house flight projects assures that each system will achieve its goals with quality performance and within planned budgets and schedules. This paper also describes how the systems engineering process can be used in combination with available software tools.

A description is given of the hardware and software of a multiprocessor test bed - the second generation Hypercluster system. The Hypercluster architecture consists of a standard hypercube distributed-memory topology, with multiprocessor shared-memory nodes. By using standard, off-the-shelf hardware, the system can be upgraded to use rapidly improving computer technology. The Hypercluster's multiarchitecture nature makes it suitable for researching parallel algorithms in computational field simulation applications (e.g., computational fluid dynamics). The dedicated test-bed environment of the Hypercluster and its custom-built software allows experiments with various parallel-processing concepts such as message passing algorithms, debugging tools, and computational 'steering'. Such research would be difficult, if not impossible, to achieve on shared, commercial systems.

The entropy generation approach has been developed for the analysis of complex systems, with particular regard to biological systems, in order to evaluate their stationary states. Entropy generation is related to the transport processes associated with exergy flows. Moreover, cancer can be described as an open, complex, dynamic, and self-organizing system. Consequently, it is used here as an example to evaluate the different thermo-chemical quantities of the transport processes in normal and tumoral cell systems.

The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.

Four experiment systems which have fundamental significance in the field of biotechnology are developed for the Get Away Special (GAS). Unique considerations were necessary to develop the systems which carry out biotechnological experiments under GAS's restricted conditions: delicate thermal control, fluid handling and protection from contamination. All experimental processes are controlled by internal sequencers and results of the experiments are recorded as images and numerical data within the systems. The systems are standardized in order to enable repeated use with a variety of experiments by replacement of the experiment modules and modification of experiment sequencing programs.

Many dynamical systems, both natural and artificial, are stimulated by time dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This is bounded by the number of linearly independent state variables of the dynamical system, equaling it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory.
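The capacity measure described above can be estimated numerically: drive the system with an i.i.d. input, fit a linear readout for each delayed copy of the input, and sum the resulting coefficients of determination. A rough sketch under our own assumptions (a small tanh echo-state network stands in for the dynamical system; all parameter choices are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, washout = 20, 5000, 100
u = rng.uniform(-1, 1, T)                       # i.i.d. input stimulus

# Random recurrent network, scaled to spectral radius 0.9 (fading memory)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-1, 1, N)

x = np.zeros((T, N))
s = np.zeros(N)
for t in range(T):
    s = np.tanh(W @ s + w_in * u[t])
    x[t] = s

# Linear memory capacity: sum over delays k of the R^2 obtained when a
# linear readout reconstructs u(t-k) from the state x(t).
X = x[washout:]
capacity = 0.0
for k in range(1, 40):
    y = u[washout - k:T - k]
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = np.corrcoef(X @ w, y)[0, 1]
    capacity += r ** 2

# The theory bounds the total capacity by the number of state variables N.
print(round(capacity, 2))
```

As the abstract states, the summed capacity cannot exceed the number of linearly independent state variables (here N = 20); raising the network's nonlinearity trades linear memory for nonlinear processing.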

Although the application of ultrasonic energy to produce or enhance a wide variety of processes has been explored since about the middle of the 20th century, only a small number of ultrasonic processes have been established at the industrial level. However, during the last ten years interest in ultrasonic processing has revived, particularly in industrial sectors where ultrasonic technology may represent a clean and efficient tool to improve classical existing processes, or an innovative alternative for the development of new processes. Such seems to be the case in relevant sectors such as the food industry, the environment, pharmaceutical and chemical manufacture, machinery, mining, etc., where power ultrasound is becoming an emerging technology for process development. Possibly the major problem in the application of high-intensity ultrasound to industrial processing is the design and development of efficient power ultrasonic systems (generators and reactors) capable of successful large-scale operation specifically adapted to each individual process. In the area of ultrasonic processing in fluid media, and more specifically in gases, the development of stepped-plate transducers and other power generators with extensive radiating surfaces has strongly contributed to the implementation, at the semi-industrial and industrial stage, of several commercial applications in sectors such as the food and beverage industry (defoaming, drying, extraction, etc.), the environment (air cleaning, sludge filtration, etc.), and machinery and manufacturing processes (textile washing, paint manufacture, etc.). The development of different cavitational reactors for liquid treatment in continuous flow is helping to introduce into industry the wide potential of sonochemistry. Processes such as water and effluent treatment, crystallization, and soil remediation have already been implemented at the semi-industrial and/or industrial stage. Other single advances in sectors like mining or energy have

This report details efforts to scale-up and re-establish the manufacturing process for the curing agent known as Hylene MP. First, small scale reactions were completed with varying conditions to determine key drivers for yielding high quality product. Once the optimum conditions were determined on the small scale, the scaled-up process conditions were determined. New equipment was incorporated into the manufacturing process to create a closed production system and improve chemical exposure controls and improve worker safety. A safe, efficient manufacturing process was developed to manufacture high quality Hylene MP in large quantities.

Focuses on the development of a pilot unit for use in an advanced separations process laboratory in an effort to develop experiments on such processes as reverse osmosis, ultrafiltration, adsorption, and chromatography. Discusses reverse osmosis principles, the experimental system design, and some experimental studies.

An experimental glow discharge (plasma) carburizing apparatus, using an anomalous glow discharge created in a methane/hydrogen gas mixture at a pressure in the range 1-25 mbar, was developed. Carbon concentration profiles obtained using the apparatus were compared with similar data for vacuum carburizing and for the alternative methods; the glow discharge process gave a metallurgically superior product and savings in energy and treatment gas. It is indicated that a production glow discharge carburizing system is technically feasible and meets criteria such as rapid loading/unloading and the fast heat-up required in a commercial system.

Work to define and develop a full scale Space Station Freedom (SSF) mockup with the flexibility to evolve into future designs, to validate techniques for maintenance and logistics, and to verify human task allocations and support trade studies is described. This work began in early 1985 and ended in August 1991. The mockups are presently being used at MSFC in Building 4755 as a technology and design testbed, as well as for public display. Micro Craft also began work on the Process Material Management System (PMMS) under this contract. The PMMS simulator was a sealed enclosure for testing to identify liquid, gaseous, and particulate samples and specimens, including urine, waste water, condensate, hazardous gases, surrogate gases, liquids, and solids. The SSF would require many trade studies to validate techniques for maintenance and logistics and to verify system task allocations; it was therefore necessary to develop a full scale mockup that was representative of the current SSF design while remaining easy to change as the SSF design evolved. The tasks defined for Micro Craft were to provide the personnel, services, tools, and materials for the SSF mockup, which would consist of four modules, nodes, interior components, and part-task mockups of MSFC-responsible engineering systems. This included the Environmental Control and Life Support System (ECLSS) testbed. For the initial study, the mockups were low fidelity, soft mockups of graphics art bottle, and other low cost materials, which evolved into higher fidelity mockups as the R&D design evolved, by modifying or rebuilding, an important cost saving factor in the design process. We designed, fabricated, and maintained the full size mockup shells and support stands. The shells consisted of cylinders, end cones, rings, longerons, docking ports, crew airlocks, and windows. The ECLSS required a heavier cylinder to support the ECLSS systems test program. Details of this activity will be covered. Support stands were

The Tomo-e Gozen camera is a next-generation, extremely wide field optical camera, equipped with 84 CMOS sensors. The camera records about a 20 square degree area at 2 Hz, providing "astronomical movie data". We have developed a prototype of the Tomo-e Gozen camera (hereafter, Tomo-e PM) to evaluate the basic design of the Tomo-e Gozen camera. Tomo-e PM, equipped with 8 CMOS sensors, can capture a 2 square degree area at up to 2 Hz. Each CMOS sensor has about 2.6 M pixels. The data rate of Tomo-e PM is about 80 MB/s, corresponding to about 280 GB/hour. We have developed an operating system and reduction software to handle such a large amount of data. Tomo-e PM was mounted on the 1.0-m Schmidt Telescope at Kiso Observatory of the University of Tokyo. Experimental observations were carried out in the winter of 2015 and the spring of 2016. The observations and software implementation were successfully completed. The data reduction is now in execution.
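The quoted data rate follows from simple arithmetic. A quick sanity check, assuming 2 bytes per pixel (the bit depth is not stated in the abstract, so that factor is our assumption):

```python
sensors = 8          # CMOS sensors on the Tomo-e PM prototype
pixels = 2.6e6       # pixels per CMOS sensor
fps = 2              # frames per second (2 Hz readout)
bytes_per_px = 2     # assumed 16-bit samples (not stated in the abstract)

rate_MB_s = sensors * pixels * fps * bytes_per_px / 1e6
hourly_GB = rate_MB_s * 3600 / 1e3
print(round(rate_MB_s, 1), "MB/s;", round(hourly_GB), "GB/hour")
```

This lands in the same ballpark as the quoted ~80 MB/s and ~280 GB/hour, which suggests the prototype streams roughly 2 bytes per pixel per frame.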

Landfill leachate pollution presents a serious environmental problem. It would be valuable to develop a sustainable method, one that is inexpensive and requires little energy, to eliminate the pollution and dispose of the waste. In a previous study, we reported the results of a leachate treatment for landfills in which we relied on the moss, Scopelophia cataractae, to support a sustainable method of waste reduction. In this study, toward the development of a waste reduction system for landfill leachate treatment, we attempted to produce zinc as a useful metal and ethanol as a fuel from the moss remaining after wastewater treatment. Steam explosion, a physicochemical pretreatment that exposes the raw material to saturated steam under high pressure and temperature, was used to pretreat the moss. Zinc was recovered by electrolysis, and the maximum zinc recovery after wastewater treatment was 0.504 at 2.0 MPa steam pressure (211 °C) and 5 min steaming time. Then, by simultaneous saccharification and fermentation using Meicelase and Saccharomyces cerevisiae AM12, a maximum ethanol concentration of 0.42 g dm-3 was produced from 10 g dm-3 of exploded moss at 2.5 MPa steam pressure (223 °C) and 1 min steaming time.

Controlled parallel bioreactor systems allow fed-batch operation at early stages of process development. The characteristics of shaken bioreactors operated in parallel (shake flask, microtiter plate), sparged bioreactors (small-scale bubble column) and stirred bioreactors (stirred-tank, stirred column) are briefly summarized. Parallel fed-batch operation is achieved with an intermittent feeding and pH-control system for up to 16 bioreactors operated in parallel on a scale of 100 ml. Examples of the scale-up and scale-down of pH-controlled microbial fed-batch processes demonstrate that controlled parallel reactor systems can result in more effective bioprocess development. Future developments are also outlined, including units of 48 parallel stirred-tank reactors with individual pH and pO2 controls, automation, and a liquid-handling system, operated on the milliliter scale.

We describe the methodology, tools and technologies for designing and implementing communication and control systems for networked automated or driver assist vehicles. In addressing design, we discuss enabling methodologies and our suite of enabling computational tools for formal modeling, simulation, and implementation. We illustrate our description with design, development and implementation work we have performed for Automated Highway Systems, Autonomous Underwater Vehicles, Mobile Offshore Base, Unmanned Air Vehicles, and Cooperative Adaptive Cruise Control. We conclude with the assertion - borne from our experience - that ground vehicle systems with any degree of automated operation could benefit from the type of integrated development process that we describe.

NASA utilizes an evidence-based system to perform risk assessments for the human system for spaceflight missions. The center of this process is the multi-disciplinary Human System Risk Board (HSRB). The HSRB is chartered from the Office of the Chief Health and Medical Officer (OCHMO) at NASA Headquarters. The HSRB reviews all human system risks via an established comprehensive risk and configuration management plan based on a project management approach. The HSRB facilitates the integration of human research (terrestrial and spaceflight), medical operations, occupational surveillance, systems engineering, and many other disciplines in a comprehensive review of human system risks. The HSRB considers all factors that influence human risk: pre-mission considerations such as screening criteria, training, age, sex, and physiological condition; in-mission factors such as available countermeasures, mission duration, and location; and post-mission factors such as time to return to baseline (reconditioning), post-mission health screening, and available treatments. All of these factors influence the total risk assessment for each human risk. The HSRB performed a comprehensive review of all potential inflight medical conditions and events and, over the course of several reviews, consolidated the number of human system risks to 30, where the greatest emphasis is placed for investing program dollars for risk mitigation. The HSRB considers all available evidence from human research, medical operations, and occupational surveillance in assessing the risks for appropriate mitigation and future work. All applicable DRMs (low Earth orbit for 6 and 12 months, deep space for 30 days and 1 year, a lunar mission for 1 year, and a planetary mission for 3 years) are considered as human system risks are modified by the hazards associated with space flight, such as microgravity, exposure to radiation, distance from the Earth, isolation, and a closed environment. Each risk has a summary

The IMAGES interactive image processing system was created specifically for undergraduate remote sensing education in geography. The system is interactive, relatively inexpensive to operate, almost hardware independent, and responsive to numerous users at one time in a time-sharing mode. Most important, it provides a medium whereby theoretical remote sensing principles discussed in lecture may be reinforced in laboratory as students perform computer-assisted image processing. In addition to its use in academic and short course environments, the system has also been used extensively to conduct basic image processing research. The flow of information through the system is discussed including an overview of the programs.

A tubeless evaporation process (TEP), which has the potential to combine the advantages of both evaporation and freezing processes without their disadvantages, is being developed. The TEP is capable of concentrating process solutions of such things as sugar, caustic soda, salt, sodium sulfate, black liquor from the pulp and paper industry, cooling tower blowdown, ''spent'' pickling liquor (sulfuric acid) from the steel industry, and nitric acid, with potential energy savings of half to three-quarters of the energy required by conventional evaporators at about half of the capital and maintenance cost. It has similar potential for the production of fresh water from seawater. The process uses working fluids (WFs) at their freezing point to effect direct-contact heat exchange. The purpose of this project was to find additional and lower-cost WFs in the laboratory, to obtain sizing information for the major equipment in a bench-scale plant for an economic evaluation and a pilot plant design, and to perform the economic evaluation, the pilot plant design, and the cost estimate. 6 refs., 37 figs., 7 tabs.

This report describes development of processes for cladding APT Target tungsten components with a thin layer (0.127-mm) of Alloy 718, Alloy 600 or 316L stainless steel alloy. The application requires that the cladding be thermally bonded to the tungsten in order to transfer heat generated in the tungsten volume to a surrounding coolant. High temperature diffusion bonding using the hot isostatic pressing (HIP) technique was selected as the method for creating a metallurgical bond between pure tungsten tubes and rods and the cladding materials. Bonding studies using a uniaxially loaded vacuum hot press were conducted in preliminary experiments to determine acceptable time-temperature conditions for diffusion bonding. The results were successfully applied in cladding tungsten rods and tubes with these alloys. Temperatures of 800-810 °C were suitable for cladding tungsten with Alloy 600 and 316L stainless steel alloy, whereas tungsten was clad with Alloy 718 at 1020 °C.

Developing production-quality software to process the large volumes of scientific data is the responsibility of the TES Ground Data System, which is being developed at the Jet Propulsion Laboratory together with support contractor Raytheon/ITSS. The large data volume and processing requirements of the TES pose significant challenges to the design.

A new model for the development of process information systems is proposed. It is robust and inexpensive, and capable of providing timely, necessary information to the user by integrating Products, Instructions, Examples, Tools, and Process.

This review describes the recent results in hydrothermal liquefaction (HTL) of biomass in continuous-flow processing systems. Although much has been published about batch reactor tests of biomass HTL, there is only limited information yet available on continuous-flow tests, which can provide a more reasonable basis for process design and scale-up for commercialization. High-moisture biomass feedstocks are the most likely to be used in HTL. These materials are described and results of their processing are discussed. Engineered systems for HTL are described; however, they are of limited size and do not yet approach a demonstration scale of operation. With the results available, process models have been developed, and mass and energy balances determined. From these models, process costs have been calculated and provide some optimism as to the commercial likelihood of the technology.

The Advanced High-Temperature Reactor (AHTR) is a design concept for a central station-type [1500 MW(e)] Fluoride salt–cooled High-temperature Reactor (FHR) that is currently undergoing development by Oak Ridge National Laboratory for the U.S. Department of Energy, Office of Nuclear Energy’s Advanced Reactor Concepts program. FHRs, by definition, feature low-pressure liquid fluoride salt cooling, coated-particle fuel, a high-temperature power cycle, and fully passive decay heat rejection. The overall goal of the AHTR development program is to demonstrate the technical feasibility of FHRs as low-cost, large-size power producers while maintaining full passive safety. The AHTR is approaching a preconceptual level of maturity. An initial integrated layout of its major systems, structures, and components (SSCs), and an initial, high-level sequence of operations necessary for constructing and operating the plant are nearing completion. An overview of the current status of the AHTR concept has been recently published [1], and a report providing a more detailed overview of the AHTR structures and mechanical systems is currently in preparation. This report documents the refueling components and processes envisioned at this early development phase. The report is limited to the refueling aspects of the AHTR and does not include overall reactor or power plant design information. The report, however, does include a description of the materials envisioned for the various components and the instrumentation necessary to control the refueling process. The report begins with an overview of the refueling strategy. Next a mechanical description of the AHTR fuel assemblies and core is provided. The reactor vessel upper assemblies are then described. Following this the refueling path structures and the refueling mechanisms and components are described. The sequence of operations necessary to fuel and defuel the reactor is then discussed. The report concludes with a discussion of the

A system for non-destructively measuring an object and controlling industrial processes in response to the measurement is disclosed in which an impulse laser generates a plurality of sound waves over timed increments in an object. A polarizing interferometer is used to measure surface movement of the object caused by the sound waves and sensed by phase shifts in the signal beam. A photon multiplier senses the phase shift and develops an electrical signal. A signal conditioning arrangement modifies the electrical signals to generate an average signal correlated to the sound waves, which in turn is correlated to a physical or metallurgical property of the object, such as temperature, which property may then be used to control the process. External, random vibrations of the workpiece are utilized to develop discernible signals which can be sensed in the interferometer by only one photon multiplier. In addition, the interferometer includes an arrangement for optimizing its sensitivity so that movement attributed to various waves can be detected in opaque objects. The interferometer also includes a mechanism for sensing objects with rough surfaces which produce speckle light patterns. Finally, the interferometer per se, with the addition of a second photon multiplier, is capable of accurately recording beam length distance differences with only one reading.

This project was focused on the development of tools for the automatic configuration of signal processing systems. The goal is to develop tools that will be useful in a variety of Government and commercial areas and usable by people who are not signal processing experts. In order to get the most benefit from signal processing techniques, deep technical expertise is often required in order to select appropriate algorithms, combine them into a processing chain, and tune algorithm parameters for best performance on a specific problem. Therefore a significant benefit would result from the assembly of a toolbox of processing algorithms that has been selected for their effectiveness in a group of related problem areas, along with the means to allow people who are not signal processing experts to reliably select, combine, and tune these algorithms to solve specific problems. Defining a vocabulary for problem domain experts that is sufficiently expressive to drive the configuration of signal processing functions will allow the expertise of signal processing experts to be captured in rules for automated configuration. In order to test the feasibility of this approach, we addressed a lightning classification problem, which was proposed by DOE as a surrogate for problems encountered in nuclear nonproliferation data processing. We coded a toolbox of low-level signal processing algorithms for extracting features of RF waveforms, and demonstrated a prototype tool for screening data. We showed examples of using the tool for expediting the generation of ground-truth metadata, for training a signal recognizer, and for searching for signals with particular characteristics. The public benefits of this approach, if successful, will accrue to Government and commercial activities that face the same general problem - the development of sensor systems for complex environments. It will enable problem domain experts (e.g. analysts) to construct signal and image processing chains without
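As an illustration of the kind of low-level waveform feature extraction such a toolbox might contain (the feature set and names here are our own generic choices, not the ones used in the project):

```python
import numpy as np

def extract_features(x: np.ndarray, fs: float) -> tuple[float, float, float]:
    """Toy feature vector for a sampled waveform: RMS energy,
    dominant frequency, and spectral centroid (all in SI units)."""
    X = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    rms = float(np.sqrt(np.mean(x ** 2)))
    dom = float(freqs[np.argmax(X)])              # peak of the magnitude spectrum
    centroid = float((freqs * X).sum() / X.sum())  # amplitude-weighted mean frequency
    return rms, dom, centroid

# Sanity check on a pure 50 Hz tone sampled at 1 kHz for 1 s:
fs = 1000.0
t = np.arange(0, 1, 1 / fs)
rms, dom, cen = extract_features(np.sin(2 * np.pi * 50 * t), fs)
print(rms, dom, cen)  # rms ≈ 0.707, dom = 50.0 Hz
```

A classifier or rule-based configuration tool of the kind described would then operate on vectors of such features rather than on raw samples.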

conducted that demonstrated that certified organizations using quality standards produce better products or services than non-certified ones. Advantages ... quality to process quality. There are several organizational process standards available, but the most popular one is ISO 9001, which is the internationally recognized series of standards for an effective quality conformance system, with over a million certified organizations worldwide. Other

In February 1976, the Energy Research and Development Administration (ERDA) published the Handbook of Gasifiers and Gas Treatment Systems. The intent of this handbook was to provide a ready reference to systems that are or may be applicable to coal conversion technology. That handbook was well received by users and was subsequently reprinted many times. The Department of Energy (successor agency to the ERDA) expands, revises and updates the Handbook in this volume. This new Handbook is not intended as a comparative evaluation, but rather as an impartial reference on recent and current technology. The Handbook now presents 39 gasification technologies and 40 gas processing systems that are or may be applicable to coal conversion technology. The information presented has been approved or supplied by the particular licensor/developer.

The Department of Defense and the aerospace industry are responsible for decades of successful implementation of systems engineering process models used for the development of complex systems. The process models implemented throughout acquisition life cycles have proven to be comprehensive and flexible, and hence are designed to reduce acquisition schedule variability and the inherent risks of life-cycle cost overruns. While implementing the appropriate process model is important, various process models do not evaluate and quantify potential technical, manufacturing, scheduling and cost risks that may impact acquisition activities throughout the acquisition life cycle of the complex system. A potential way to effectively manage these risks with the appropriate process model is through the incorporation of the Synthesized Framework, the proposed method developed in this dissertation. With the described Synthesized Framework (SF), process models and risk drivers can be analyzed using this comprehensive approach, which implements qualitative and quantitative risk analysis techniques through Monte Carlo simulation. The result is a repeatable, inherent, risk-driven commitment process that can stabilize and synchronize both systems engineering and acquisition processes.
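The Monte Carlo element of such a framework can be sketched in a few lines: sample each activity's duration from a three-point (triangular) estimate and read risk-adjusted schedule commitments off the simulated distribution. The task names and numbers below are purely illustrative, not taken from the dissertation:

```python
import random

random.seed(1)
# (optimistic, most likely, pessimistic) duration estimates in weeks
tasks = {
    "requirements": (10, 14, 22),
    "build":        (20, 26, 40),
    "test":         (8, 10, 18),
}

trials = 20000
totals = sorted(
    sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks.values())
    for _ in range(trials)
)
p50 = totals[int(0.50 * trials)]   # median schedule outcome
p80 = totals[int(0.80 * trials)]   # an 80%-confidence schedule commitment
print(f"P50 = {p50:.1f} weeks, P80 = {p80:.1f} weeks")
```

The gap between P50 and P80 is one simple quantification of the schedule risk that a deterministic process model alone would not surface.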

There is a clear need to better understand and predict future climate change, so that science can more confidently inform climate policy, including adaptation planning and future mitigation strategies. Understanding carbon cycle feedbacks, and the relationship between emissions (fossil and land use) and the resulting atmospheric carbon dioxide (CO2) and methane (CH4) concentrations in a changing climate, has been recognized as an important goal by the IPCC. The existing surface greenhouse gas observing networks provide accurate and precise measurements of background values, but they are not configured to target the extended, complex and dynamic regions of the carbon budget. Space Agencies around the globe are committed to CO2 and CH4 observations: GOSAT-1/2, OCO-2/3, MERLin, TanSat, and CarbonSat. In addition to these Low Earth Orbit (LEO) missions, a new mission in Geostationary Orbit (GEO), geoCARB, which would provide mapping-like measurements of carbon dioxide, methane, and carbon monoxide concentrations over major land areas, has recently been proposed to the NASA Venture Program. These pioneering missions do not provide the spatial/temporal coverage to answer the key carbon-climate questions at process-relevant scales, nor do they address the distribution and quantification of anthropogenic sources at urban scales. They do demonstrate, however, that a well-planned future system of systems integrating space-based LEO and GEO missions with extensive in situ observations could provide the accuracy, spatial resolution, and coverage needed to address critical open issues in the carbon-climate system. Dr. Diana Wickland devoted enormous energy to developing a comprehensive approach to understanding the global carbon cycle; she understood well that an integrated, coordinated, international approach is needed. This shines through in her recent contribution in co-chairing the team that produced the "CEOS Strategy for Carbon Observations from Space." A NASA-funded community

Honeycomb composite structures are widely used in the aerospace and sporting goods industries because of the superior performance and weight-saving advantages they offer over traditional metal structures. However, in order to maximize the mechanical and chemical properties of honeycomb composites, the structures must be specially designed to take advantage of their inherent anisotropic, viscoelastic, and heterogeneous qualities. Little work in the open literature has addressed these relationships; most research efforts have focused on studying and modeling the effects of environmental exposure, impact damage, and energy absorption. The objective of this work was to use a systematic engineering approach to explore the fundamental material relationships of honeycomb composites, with an emphasis on the industrial manufacturing, design, and performance characteristics of these materials. To reach this goal, a methodology was created to develop model honeycomb systems that were characteristically similar to their commercial counterparts. From the model systems, some of the important chemical and mechanical properties that controlled the behavior of honeycomb core were identified. With the knowledge gained from the model system, studies were carried out to correlate the compressive properties of honeycomb rings to honeycomb core. This type of correlation gives paper, resin, and adhesive manufacturers the ability to develop new honeycomb materials without requiring honeycomb manufacturers to divulge their trade secrets. After characterizing the honeycomb core, efforts were made to understand the manufacturing and in-service responses of honeycomb materials. Using three Designs of Experiments, investigations were performed to measure the mechanisms by which composite structures propagate damage and water over a fourteen-month service period. Collectively, this research represents a fundamental starting point for understanding the processing

A technique of 3-D video imaging was developed for use on manned missions for observation and control of remote manipulators. An improved medical diagnostic fluoroscope with a stereo, real-time output was also developed. An explanation of how this system works, along with recommendations for future work in this area, is presented.

Commission of the European Communities, Brussels (Belgium). Directorate-General for Education, Training, and Youth.

The EUROTECNET program was implemented to develop and improve vocational training policies and systems to meet the challenges of change in the economic and social situation through the development of innovative responses and actions. Each Member State of the European Community was asked to identify one issue of strategic and critical importance to…

A number of decisions in the health care field rely heavily on published clinical evidence. A systematic approach to evidence development and publication planning is required to develop a portfolio of evidence that includes at minimum information on efficacy, safety, durability of effect, quality of life, and economic outcomes. The approach requires a critical assessment of available literature, identification of gaps in the literature, and a strategic plan to fill the gaps to ensure the availability of evidence demanded for clinical decisions, coverage/payment decisions and health technology assessments. The purpose of this manuscript is to offer a six-step strategic process leading to a portfolio of evidence that meets the informational needs of providers, payers, and governmental agencies concerning patient access to a therapy.

In addition to reviewing the characteristics of document processing systems, this paper pays considerable attention to the description of a system via a feature-list approach. The purpose of this report is to present features of the systems in parallel fashion to facilitate comparison, so that a potential user may have a basis for evaluation in…

Glaciers and ice sheets modulate global sea level by storing water deposited as snow on the surface, and discharging water back into the ocean through melting. Their physical state can be characterized in terms of their mass balance and dynamics. To estimate the current ice mass balance, and to predict future changes in the motion of the Greenland and Antarctic ice sheets, it is necessary to know the ice sheet thickness and the physical conditions of the ice sheet surface and bed. This information is required at fine resolution and over extensive portions of the ice sheets. A tomographic algorithm has been developed to take raw data collected by a multiple-channel synthetic aperture sounding radar system over a polar ice sheet and convert those data into two-dimensional (2D) ice thickness measurements. Prior to this work, conventional processing techniques only provided one-dimensional ice thickness measurements along profiles.

The world is addicted to ranking: everything, from the reputation of scientists, journals, and universities to purchasing decisions is driven by measured or perceived differences between them. Here, we analyze empirical data capturing real time ranking in a number of systems, helping to identify the universal characteristics of ranking dynamics. We develop a continuum theory that not only predicts the stability of the ranking process, but shows that a noise-induced phase transition is at the heart of the observed differences in ranking regimes. The key parameters of the continuum theory can be explicitly measured from data, allowing us to predict and experimentally document the existence of three phases that govern ranking stability.

The Contained Recovery of Oily Waste (CROW™) technology has been successfully tested in the laboratory and presently is being implemented at field sites contaminated with wood treating wastes and byproducts of town gas production. These field demonstrations will utilize only hot-water displacement without any chemical additives, because the use of chemicals to enhance the hot-water flushing process has only been tested on a preliminary basis. Preliminary testing has shown that low concentrations of chemicals could reduce the contaminant content by an additional 10 to 20 wt %. Western Research Institute (WRI) research, plus research at Carnegie Mellon University, on surfactant enhancement of the solubility of polynuclear aromatic hydrocarbons in water and water-soil systems indicates the potential of chemical enhancement of the CROW process. Chemicals that have been tested and that were used in these tests are totally biodegradable. The objective of this task was to obtain sufficient baseline data to show the effectiveness and environmentally safe use of chemicals, primarily surfactants, to enhance the CROW process. To meet this objective, 14 one-dimensional displacement tests were conducted. Eleven tests were conducted on a material from a former manufactured gas plant (MGP) site and four tests were conducted with a contaminated soil from a former wood treatment facility. The tests investigated the effect of three chemical concentrations (0, 0.5, and 1.0 vol %) at three temperatures (ambient, the projected optimum temperature, and one 40°F [22°C] below the optimum temperature).

This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work, completed using these tools, on the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools covered in this report include: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP is discussed, along with the results of the testing executed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide diagnostics for the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system, including the design, operation, and off-nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.
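The failure-effect propagation that an FFM encodes can be sketched as reachability over a directed graph from failure modes to observation points. The component, failure-mode, and sensor names below are hypothetical, and this is only a minimal illustration of the idea, not the AGSM/IHM tooling itself:

```python
from collections import defaultdict, deque

class FunctionalFaultModel:
    """Minimal sketch of a functional fault model: a directed graph
    whose edges are failure-effect propagation paths."""

    def __init__(self):
        self.edges = defaultdict(list)  # node -> downstream nodes

    def add_propagation(self, source, target):
        """Record that an effect at `source` propagates to `target`."""
        self.edges[source].append(target)

    def effects_of(self, failure_mode):
        """Return every node reachable from a failure mode, i.e. all
        points where its effects could eventually be observed (BFS)."""
        seen, queue = set(), deque([failure_mode])
        while queue:
            node = queue.popleft()
            for nxt in self.edges[node]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen

# Hypothetical valve component of the kind a generic component
# library might instantiate into many system models.
ffm = FunctionalFaultModel()
ffm.add_propagation("valve_stuck_closed", "low_downstream_pressure")
ffm.add_propagation("low_downstream_pressure", "pressure_sensor_P1")
ffm.add_propagation("valve_stuck_closed", "flow_sensor_F1")

print(ffm.effects_of("valve_stuck_closed"))
```

A generic component model is then just a reusable subgraph with agreed-upon boundary nodes, which is what makes the standard-library approach described above pay off.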

We previously found that exposure to glycidol at 1000 ppm in drinking water caused axonopathy in maternal rats and aberrations in late-stage hippocampal neurogenesis, targeting the process of neurite extension in offspring. To identify the profile of developmental neurotoxicity of glycidol, pregnant Sprague-Dawley rats were given drinking water containing glycidol from gestational day 6 until weaning on day 21 after delivery, and offspring at 0, 300 and 1000 ppm were subjected to region-specific global gene expression profiling. Four brain regions were selected to represent both cerebral and cerebellar tissues, i.e., the cingulate cortex, corpus callosum, hippocampal dentate gyrus and cerebellar vermis. Downregulated genes in the dentate gyrus were related to axonogenesis (Nfasc), myelination (Mal, Mrf and Ugt8), and cell proliferation (Aurkb and Ndc80) at ≥ 300 ppm, and upregulated genes were related to neural development (Frzb and Fzd6) at 1000 ppm. Upregulation was observed for genes related to myelination (Kl, Igf2 and Igfbp2) in the corpus callosum and axonogenesis and neuritogenesis (Efnb3, Tnc and Cd44) in the cingulate cortex, whereas downregulation was observed for genes related to synaptic transmission (Thbs2 and Ccl2) in the cerebellar vermis; all of these changes were mostly observed at 1000 ppm. Altered gene expression of Cntn3, which functions in neurite outgrowth promotion, was observed in all four brain regions at 1000 ppm. Gene expression profiles suggest that developmental exposure to glycidol affected the plasticity of neuronal networks in broad brain areas, and dentate gyrus neurogenesis may be the sensitive target of this type of toxicity.

The Field Artillery Ammunition Processing System (FAAPS) is an initiative to introduce a palletized load system (PLS) that is transportable with an automated ammunition processing and storage system for use on the battlefield. System proponents have targeted a 20% increase in the ammunition processing rate over the current operation while simultaneously reducing the total number of assigned field artillery battalion personnel by 30. The overall objective of the FAAPS Project is the development and demonstration of an improved process to accomplish these goals. The initial phase of the FAAPS Project and the subject of this study is the FAAPS concept evaluation. The concept evaluation consists of (1) identifying assumptions and requirements, (2) documenting the process flow, (3) identifying and evaluating technologies available to accomplish the necessary ammunition processing and storage operations, and (4) presenting alternative concepts with associated costs, processing rates, and manpower requirements for accomplishing the operation. This study provides insight into the achievability of the desired objectives.

The growing complexity of physical experiments and the increasing volumes of experimental data necessitate the application of supercomputers and distributed computing systems for data processing. The design and development of such systems, their mathematical modeling, and the investigation of their characteristics and functional capabilities are urgent scientific and practical problems. In the present work, the operating characteristics of such a distributed system for processing physical experiment data are investigated using the apparatus of queuing network theory.
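As a minimal illustration of the queuing-network viewpoint, the closed-form results for a single M/M/1 node (Poisson arrivals, exponential service) give the utilization and mean delay of one stage of such a processing system. The arrival and service rates below are invented for illustration, not taken from the study:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state metrics of an M/M/1 queue: utilization rho,
    mean number in system L, and mean time in system W.
    W follows from L by Little's law (L = arrival_rate * W)."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: arrival rate >= service rate")
    rho = arrival_rate / service_rate   # fraction of time the node is busy
    L = rho / (1 - rho)                 # mean jobs in system
    W = L / arrival_rate                # mean sojourn time per job
    return rho, L, W

# Illustrative numbers: data batches arrive at 8 per second and a
# processing node serves 10 per second.
rho, L, W = mm1_metrics(8.0, 10.0)
print(f"utilization={rho:.2f}, jobs in system={L:.2f}, time in system={W:.2f}s")
```

A network of such nodes, with routing probabilities between them, is the natural next step toward the kind of queuing-network model the abstract describes.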

This report documents work in progress accomplished prior to programmatic changes that negated bringing this effort to conclusion as originally intended. The High Consequence System Surety (HCS²) project pulls together a multidisciplinary team to integrate the elements of surety (safety, security, control, reliability, and quality) into a new, encompassing process. The benefit of using this process is enhanced surety in the design of a high consequence system through an up-front, designed-in approach. This report describes the integrated high consequence surety process and includes a hypothetical example to illustrate the process.

An image processing system is the combination of an image processor with other control and display devices, plus the necessary software, needed to produce an interactive capability to analyze and enhance image data. Such an image processing system installed at NASA Langley Research Center, Instrument Research Division, Acoustics and Vibration Instrumentation Section (AVIS) is described. Although much of the information contained herein can be found in the other references, it is hoped that this single handbook will give the user better access, in concise form, to pertinent information and usage of the image processing system.

This report describes the process modeling done in support of the Integrated Nonthermal Treatment System (INTS) study. This study was performed to supplement the Integrated Thermal Treatment System (ITTS) study and comprises five conceptual treatment systems that treat DOE contact-handled mixed low-level wastes (MLLW) at temperatures of less than 350°F. ASPEN PLUS, a chemical process simulator, was used to model the systems. Nonthermal treatment systems were developed as part of the INTS study and include sufficient processing steps to treat the entire inventory of MLLW. The final result of the modeling is a process flowsheet with a detailed mass and energy balance. In contrast to the ITTS study, which modeled only the main treatment system, the INTS study modeled each of the various processing steps with ASPEN PLUS, release 9.1-1. Trace constituents, such as radionuclides and minor pollutant species, were not included in the calculations.

At the 234th National Meeting of the American Chemical Society, held in Boston, MA, August 19-23, 2007, the ACS BIOT division held two oral sessions on Cell Culture Process Development. In addition, a number of posters were presented in this area. The critical issues facing cell culture process development today are how to effectively respond to the increase in product demands and decreased process timelines while maintaining robust process performance and product quality and responding to the Quality by Design initiative promulgated by the Food and Drug Administration. Two main areas were addressed in the presentations: first, to understand the effects of process conditions on productivity and product quality, and second, to achieve improved production cell lines. A variety of techniques to achieve these goals were presented, including automated flow cytometric analysis, a high-throughput cell analysis and selection method, transcriptional and epigenetic techniques for analysis of cell lines and cell culture systems, and novel techniques for glycoform analysis.

NASA's Standards Process Group (SPG) facilitates the approval of proposed standards that have proven implementation and operational benefit for use in NASA's Earth science data systems. After some initial experience in approving proposed standards, the SPG has tailored its Standards Process to remove redundant reviews and shorten the review process. We have found that the candidate submissions that self-defined communities propose for endorsement to the SPG are of four types: (1) a NASA community-developed standard used within at least one self-defined community, where the proposed standard has not been approved or adopted by an external standards organization and where new implementations are expected to be developed from scratch, using the proposed standard as the implementation specification; (2) a standard already approved by an external standards organization that is being proposed for use by the NASA Earth science community; (3) a de facto standard already widely used; or (4) a technical note. We will discuss real examples of the different types of candidate standards that have been proposed and endorsed (i.e., OPeNDAP's Data Access Protocol, Open Geospatial Consortium's Web Map Server, and the Hierarchical Data Format). We will also discuss a potential de facto standard (NASA's Global Change Master Directory (GCMD) Directory Interchange Format (DIF)) that is currently being reviewed. This past year, the SPG has modified its Standards Process to provide a comprehensive but not redundant review of the submitted RFC. The end result of the process tailoring is that reviews will be completed faster. At each RFC submission, the SPG will decide which reviews will be performed. These reviews are conducted simultaneously and can include three types: (1) a technical review to assess the technical specification and associated implementations; (2) an operational readiness review to evaluate whether the proposed standard works in a NASA environment with NASA Earth

This demonstration illustrates how modern development environments can be used to improve the process of designing and implementing information systems. Following a brief introduction to the topic of application generation, automatic programming, and software environments, one product — TEDIUM* — will be demonstrated.

Since 1985, the Branch of Atlantic Marine Geology has been involved in collecting, processing and digitally mosaicking high- and low-resolution side-scan sonar data. Recent development of a UNIX-based image-processing software system includes a series of task-specific programs for processing side-scan sonar data. This report describes the steps required to process the collected data and to produce an image that has equal along- and across-track resolution.

Dreybrodt deals quantitatively with many of the chemical and hydrological processes involved in the formation of karst systems. The book is divided into three major parts. The first part develops the basic chemical and fluid-flow principles needed in modeling karst systems. The second part investigates the experimental kinetics of calcite dissolution and precipitation and applies the resulting kinetic laws to the modeling of these processes in systems both open and closed to carbon dioxide. The last part of the book includes a qualitative examination of karst systems, quantitative modeling of the development of karst features, and an examination and modeling of the growth of speleothems in caves.

Conventional system architectures, development processes, and tool environments often produce systems which exceed cost expectations and are obsolete before they are fielded. This paper explores some of the reasons for this and provides recommendations for how we can do better. These recommendations are based on DoD and NASA system developments and on our exploration and development of system/software engineering tools.

This report describes the space position data processing system of the NASA Western Aeronautical Test Range. The system is installed at the Dryden Flight Research Facility of NASA Ames Research Center. This operational radar data system (RADATS) provides simultaneous data processing for multiple data inputs and tracking and antenna pointing outputs while performing real-time monitoring, control, and data enhancement functions. Experience in support of the space shuttle and aeronautical flight research missions is described, as well as the automated calibration and configuration functions of the system.

A preliminary list of criteria is proposed for the evaluation of solid waste processing technologies for research and technology development (R&TD) in the Advanced Life Support (ALS) Program. Completion of the proposed list by current and prospective ALS technology developers, with regard to specific missions of interest, may enable identification of appropriate technologies (or lack thereof) and guide future development efforts for the ALS Program solid waste processing area. An attempt is made to include criteria that capture information about the technology of interest as well as its system-wide impacts. Some of the criteria in the list are mission-independent, while the majority are mission-specific. In order for technology developers to respond to mission-specific criteria, critical information must be available on the quantity, composition, and state of the waste stream and the waste processing requirements, as well as top-level mission scenario information (e.g., safety, resource recovery, planetary protection issues, and ESM equivalencies). The technology readiness level (TRL) determines the degree to which a technology developer is able to accurately report on the list of criteria. Thus, a criterion-specific minimum TRL for mandatory reporting has been identified for each criterion in the list. Although this list has been developed to define criteria needed to direct funding of solid waste processing technologies, it possesses significant overlap with the criteria required for technology selection for inclusion in specific tests or missions. Additionally, this approach to technology evaluation may be adapted to other ALS subsystems.
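The TRL-gated reporting rule described above can be sketched as a simple filter over the criteria list. The criterion names and minimum-TRL thresholds below are hypothetical placeholders, not the actual ALS criteria:

```python
# Hypothetical evaluation criteria, each paired with the minimum TRL
# at which a technology developer would be required to report on it.
CRITERIA_MIN_TRL = {
    "mass_estimate": 3,
    "power_estimate": 3,
    "crew_time_required": 5,
    "planetary_protection_compliance": 6,
}

def mandatory_criteria(technology_trl):
    """Return, sorted by name, the criteria a developer at the given
    TRL must report on: those whose threshold it meets or exceeds."""
    return sorted(name for name, min_trl in CRITERIA_MIN_TRL.items()
                  if technology_trl >= min_trl)

print(mandatory_criteria(4))   # only the low-threshold criteria apply yet
```

Mission-specific criteria could be layered on the same way, by keying thresholds on a (mission, criterion) pair instead of the criterion alone.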

This paper describes a component-based framework for data stream processing that allows for configuration, tailoring, and runtime system reconfiguration. The system's architecture is based on a pipes and filters pattern, where data is passed through routes between components. A network of pipes and filters can be dynamically reconfigured in response to a preplanned sequence of processing steps, operator intervention, or a change in one or more data streams. This framework provides several mechanisms supporting dynamic reconfiguration and can be used to build static data stream processing applications such as monitoring or data acquisition systems, as well as self-adjusting systems that can adapt their processing algorithm, presentation layer, or data persistency layer in response to changes in input data streams.
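A minimal sketch of such a pipes-and-filters pipeline with runtime reconfiguration might look like the following; the filter functions are invented for illustration, and the real component-based framework is of course far richer:

```python
class Filter:
    """A processing stage: transforms each item it receives."""
    def __init__(self, fn):
        self.fn = fn

    def process(self, item):
        return self.fn(item)

class Pipeline:
    """A reconfigurable chain of filters. The 'pipes' are implicit:
    each filter's output feeds the next filter's input."""
    def __init__(self, filters=None):
        self.filters = list(filters or [])

    def push(self, item):
        """Route one data item through the current chain of filters."""
        for f in self.filters:
            item = f.process(item)
        return item

    def reconfigure(self, filters):
        """Swap the route at runtime, e.g. in response to operator
        intervention or a change in an incoming data stream."""
        self.filters = list(filters)

# Hypothetical stream: scale raw sensor readings, then clamp outliers.
scale = Filter(lambda x: x * 0.1)
clamp = Filter(lambda x: min(x, 5.0))
pipe = Pipeline([scale, clamp])
print(pipe.push(80))        # scaled to 8.0, then clamped to 5.0

pipe.reconfigure([scale])   # drop the clamp stage while running
print(pipe.push(80))        # now just scaled: 8.0
```

Because the route is just data held by the pipeline object, the same mechanism supports both static monitoring applications and the self-adjusting behavior the paper describes.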