Sample records for Bergbauforschung-Foster Wheeler process

Electron-positron pair production by the Breit-Wheeler process embedded in a strong laser pulse is analyzed. The transverse momentum spectrum displays prominent peaks which are interpreted as caustics, the positions of which are accessible via a stationary-phase analysis. Examples are given for the superposition of an XFEL beam with an optical high-intensity laser beam. Such a configuration is available, e.g., at LCLS at present and at the European XFEL in the near future. It requires a counter-propagating probe photon beam of high energy, which can be generated by synchronized inverse Compton backscattering.
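In stationary-phase language (a generic sketch of the caustic condition, not the record's specific phase function), the spectrum is built from stationary points of the phase Φ with respect to the laser phase φ, and caustics arise where stationary points coalesce:

```latex
\frac{\partial \Phi}{\partial \phi} = 0
\quad \text{(stationary phase)},
\qquad
\frac{\partial^{2} \Phi}{\partial \phi^{2}} = 0
\quad \text{(coalescing stationary points} \;\Rightarrow\; \text{caustic)} .
```

At such points the ordinary stationary-phase approximation diverges, which is why caustics show up as prominent peaks in the momentum spectrum.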

The nonlinear Breit-Wheeler process of electron-positron pair production off a probe photon colliding with a low-frequency and a high-frequency electromagnetic wave that propagate in the same direction is analyzed. We calculate the pair-production probability and the spectra of the created pair in the nonlinear Breit-Wheeler processes of pair production off a probe photon colliding with either both plane waves or only one of them. The differences between these two cases are discussed. We show explicitly, in the two-wave case, the possibility of Breit-Wheeler pair production with simultaneous photon emission into the low-frequency wave, and two pronounced multiphoton phenomena: (i) Breit-Wheeler pair production by absorption of the probe photon and a large number of photons from the low-frequency wave, in addition to the absorption of one photon from the high-frequency wave; and (ii) Breit-Wheeler pair production by absorption of the probe photon and one photon from the high-frequency wave with simultaneous emission of a large number of photons into the low-frequency wave. Photon emission into the wave cannot occur in the one-wave case. Compared with the one-wave case, the contributions from high-order multiphoton processes are greatly enhanced in the two-wave case. The results presented in this article show a possible way to observe photon emission into the wave and high-order multiphoton phenomena in Breit-Wheeler pair production even at laser-beam intensities of order 10^18 W/cm^2.
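Schematically, in standard strong-field notation (our gloss, not the article's own equations), the multiphoton channels correspond to quasi-momentum conservation of the form

```latex
k' + n_{1} k_{1} + n_{2} k_{2} \;=\; q_{-} + q_{+} ,
% k'        : probe-photon four-momentum
% k_1, k_2  : photon four-momenta of the low- and high-frequency waves
% q_-, q_+  : laser-dressed quasi-momenta of the electron and positron
% n_1, n_2  : net numbers of photons exchanged with each wave
```

where channel (i) above corresponds to n_2 = 1 with n_1 ≫ 1 (net absorption from the low-frequency wave), and channel (ii) to n_2 = 1 with n_1 < 0, |n_1| ≫ 1 (net emission into it).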

The case of William Morton Wheeler and Alfred North Whitehead represents a striking example of how biologists and philosophers engaged in a common enterprise in the early twentieth century. Both challenge the notion that the living world is composed of distinct organisms. Based on his studies of the behavior of social insects, Wheeler developed a concept of superorganisms that paved the way for a theory of emergent evolution. This paper argues that Whitehead, whose relation to academic biology has been largely ignored, drew on Wheeler's findings and integrated them into a universal philosophical cosmology. PMID:27477347

As the inverse of Dirac annihilation, the Breit-Wheeler process, the production of an electron-positron pair in the collision of two photons, is the simplest mechanism by which light can be transformed into matter. It is also of fundamental importance in high-energy astrophysics, both in the context of the dense radiation fields of compact objects and the absorption of high-energy gamma rays travelling intergalactic distances. However, in the 80 years since its theoretical prediction, this process has never been observed. Here, we present the design of a new class of photon-photon collider, which is capable of detecting significant numbers of Breit-Wheeler pairs using current-generation technology. We further show how our scheme could be implemented on existing laser facilities; successfully achieving this would represent the advent of a new type of high-energy physics experiment. Supported by the Engineering and Physical Sciences Research Council, AWE, Aldermaston and the John Adams Institute (STFC).
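For orientation (standard two-photon kinematics, not specific to the proposed collider): a pair can be created only if the two-photon invariant mass exceeds twice the electron rest energy. For photon energies ω₁, ω₂ colliding at angle θ (units with c = 1),

```latex
s \;=\; 2\,\omega_{1}\,\omega_{2}\,(1 - \cos\theta) \;\ge\; (2 m_{e})^{2} ,
```

so for a head-on collision (θ = π) the threshold is ω₁ω₂ ≥ m_e², with m_e ≈ 511 keV; this is why the scheme pairs a high-energy gamma beam with a dense, comparatively soft photon bath.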

Counter-propagating and suitably polarized light (laser) beams can provide conditions for pair production. Here, we consider in more detail the following two situations: (i) in the homogeneity regions of anti-nodes of linearly polarized ultra-high intensity laser beams, the Schwinger process is dynamically assisted by a second high-frequency field, e.g. by an XFEL beam; and (ii) a high-energy probe photon beam colliding with a superposition of co-propagating intense laser and XFEL beams gives rise to the laser-assisted Breit-Wheeler process. The prospects of such bi-frequent field constellations with respect to the feasibility of conversion of light into matter are discussed.

Instrumented vehicles are key tools for in-depth understanding of drivers' behaviours, and thus for the design of scientifically based countermeasures to reduce fatalities and injuries. The instrumentation of Powered Two-Wheelers (PTWs) has been less widely implemented than for other vehicles, in part due to the technical challenges involved. The last decade has seen the development in Europe of several tools and methodologies to study motorcycle riders' behaviours and motorcycle dynamics across a range of situations, including crash events involving falls. Thanks to these tools, a broad-ranging research programme has been conducted, from the design and tuning of real-time fall detection to the study of rider training systems, as well as studies focusing on naturalistic riding situations such as filtering and lane splitting. The methodology designed for the in-depth study of riders' behaviours in naturalistic situations can be based upon the combination of several sources of data, such as PTW sensors, a context-based video retrieval system, the Global Positioning System (GPS), and verbal data on the riders' decision-making processes. The goals of this paper are: (1) to present the methodological tools developed and used by INRETS-MSIS (now Ifsttar-TS2/Simu) in the last decade for the study of riders' behaviours in real-world environments as well as on track for situations up to falls; (2) to illustrate the kind of results that can be gained from the conducted studies; (3) to identify the advantages and limitations of the proposed methodology for conducting large-scale naturalistic riding studies; and (4) to highlight how the knowledge gained from this approach will fill many of the knowledge gaps about PTW riders' behaviours and risk factors. PMID:23659861

This is one in a series of status reports prepared by the Tennessee Valley Authority (TVA) for those interested in the conditions of TVA reservoirs. This overview of Wheeler Reservoir summarizes reservoir purposes and operation, reservoir and watershed characteristics, reservoir uses and use impairments, and water quality and aquatic biological conditions. The information presented here is from the most recent reports, publications, and original data available. If no recent data were available, historical data were summarized. If data were completely lacking, environmental professionals with special knowledge of the resource were interviewed. 12 refs., 2 figs.

Designed for use by the general reader, the college student, and the teacher, this book analyzes the life and literary career of Josh Billings (Henry Wheeler Shaw), emphasizing his literary ventures and artistic talents. The analysis reveals Billings' talents as a subtle humorist, homespun philosopher, and artist of the essay. Chapters include…

Number of Contacts Made--Over 44 landowner contacts were made regarding CREP potential. Of those 44 contacts, 15 resulted in on-site visits to the property to discuss available options. Articles were published in the Wheeler SWCD annual report and newsletter, with a total distribution of 1,200. Two informational displays were viewed by approximately 500 people: one at the Wheeler SWCD Annual Meeting and the second at the Wheeler County Fair. Number of Contracts Negotiated and Signed--3 CREP contracts in Wheeler County were signed within this contract period. They included landowners on Stephenson Creek, Bear Creek, and Lost Valley Creek. The project on Lost Valley Creek was handled by the Gilliam Co. Riparian Buffer Specialist, who filled in while the Wheeler position was vacant. Work was also started and is proceeding on another four contracts. Problems Encountered During Contract Year: (1) The Riparian Buffer position was vacated in October 2004 and the District had difficulty filling it, which set the District back on some of the delineated goals; existing District staff are now up to speed on training, etc., and the District is confident of achieving the outlined goals. (2) Issues involving qualification of irrigated rates and how to process irrigated acres through CREP. (3) Issues involving clarification of eligibility as it relates to the financial status of the landowner. (4) Landowner comfort in signing up for federal programs.

Wheeler Ridge and vicinity, California, is a site of major tectonic activity, both historically and over recent geologic time. The epicenter of the magnitude 7.5 Kern County earthquake occurred here on July 21, 1952, and numerous geologic and topographic features indicate rapid geologic processes. The ridge itself (upper-right center) is a geologic fold that is growing out of the southern San Joaquin Valley. A prominent 'wind gap,' now used for passage of the California aqueduct (with the aid of a pumping station), is evidence that the ridge grew faster than the streams traversing it could erode down. Nearby abrupt and/or landslide-scarred mountain fronts similarly indicate a vigorous tectonic setting here, just north of the San Andreas fault. The Interstate 5 freeway can be seen crossing agricultural fields on the right and entering the very rugged and steep Grapevine Canyon toward the bottom.

This anaglyph was generated by first draping a Landsat satellite image over a preliminary topographic map from the Shuttle Radar Topography Mission (SRTM), then generating two differing perspectives, one for each eye. When viewed through special glasses, the result is a vertically exaggerated view of the Earth's surface in its full three dimensions. Anaglyph glasses cover the left eye with a red filter and cover the right eye with a blue filter. Landsat has been providing visible and infrared views of the Earth since 1972. SRTM elevation data matches the 30 meter resolution of most Landsat images and will substantially help in analyses of the large and growing Landsat image archive.
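The red/blue compositing step described above can be sketched with NumPy. This is a minimal illustration of the filtering principle, not any actual SRTM processing pipeline; the array names and shapes are made up for the example.

```python
import numpy as np

def make_anaglyph(left_view: np.ndarray, right_view: np.ndarray) -> np.ndarray:
    """Combine two RGB perspective renderings into a red/blue anaglyph.

    left_view, right_view: (H, W, 3) uint8 arrays rendered from the
    left-eye and right-eye viewpoints, respectively.
    """
    anaglyph = np.zeros_like(left_view)
    # Red filter over the left eye: keep the left view's red channel.
    anaglyph[..., 0] = left_view[..., 0]
    # Blue filter over the right eye: keep the right view's blue channel.
    # (A red/cyan variant would also carry green from the right view;
    # a pure red/blue anaglyph drops it, as done here.)
    anaglyph[..., 2] = right_view[..., 2]
    return anaglyph

# Tiny illustrative inputs: two uniform 2x2 "renderings".
left = np.full((2, 2, 3), 200, dtype=np.uint8)
right = np.full((2, 2, 3), 50, dtype=np.uint8)
out = make_anaglyph(left, right)
```

Viewed through the glasses, each eye then sees only the perspective rendered for it, and the brain fuses the two into the exaggerated 3-D relief described above.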

The elevation data used in this image was acquired by SRTM aboard the Space Shuttle Endeavour, launched on February 11, 2000. SRTM used the same radar instrument that comprised the Spaceborne Imaging Radar-C/X-Band Synthetic Aperture Radar (SIR-C/X-SAR) that flew twice on the Space Shuttle Endeavour in 1994. SRTM was designed to collect three-dimensional measurements of the Earth's surface.

Space physicist Edward Wheeler Hones Jr. died on 17 September 2012 at his home in Los Alamos, N. M. He was 90 years old. The cause of death was a heart attack that came following a brief hospitalization.

Foster Wheeler has introduced a new COMPACT Circulating Fluidized Bed (CFB) boiler design based on a rectangular hot-solids separator. The Compact design also enables easy implementation of new designs for INTREX fluid bed heat exchangers. These new products result in many benefits affecting boiler economy and operation. After initial development, the Compact CFB design has been applied in demonstration and industrial-scale units. The performance of the Compact CFB has been proved to be equivalent to that of conventional Foster Wheeler CFB boilers, with high availability. Several new Foster Wheeler Compact boilers are being built or are already in operation. Operational experiences from different units are discussed in this paper. There are currently Compact units with 100--150 MWe capacity under construction. With the scale-up experience gained with conventional CFB boilers and a proven design approach and scale-up steps, Foster Wheeler will have the ability to provide large Compact CFB boilers of up to 400--600 MWe capacity.

Number of Contacts Made--I have contacted 35 landowners in Wheeler County. Of the 35 contacts, 12 have resulted in meetings on their property to discuss available options. An article was included in the Annual Report and Wheeler SWCD newsletter mailed to 550 landowners. Contacts are made primarily through networking with others here in the office as well as working closely with the NRCS office. Number of Contracts Negotiated--This project has produced five riparian buffers within the past contract year. Each has greater meaning to the landowner than simply a buffer: in most cases the buffer is providing the landowner with improved grazing management and/or a more reliable water source for livestock. Landowners also feel the enhanced wildlife habitat is a bonus to the program. Other Accomplishments--I took part in the John Day Subbasin planning process and was able to offer assistance with the inventory items related to Wheeler County. I was often the only local representative able to attend the meetings. I assisted the Wheeler SWCD in writing a successful OWEB grant to remove 110 acres of junipers for watershed restoration, range rehabilitation, and economic development. One partner in the project is a manufacturer that uses juniper as its primary construction material. The goal is to create a pilot project that may grow into a self-sustaining industry within the county. I also assisted in writing a small grant to improve water usage in the Muddy Creek watershed. I assisted with the Pine Creek Conservation Area ''Twilight Tour'' as well as the Wheeler SWCD ''Annual Meeting and Dinner''; both events were successful in getting information out about our riparian buffer program. I also helped facilitate office training in, and utilization of, advanced GIS technology and mapping. Problems Encountered During Contract Year--The NRCS Cultural Resources Review process has ground to a halt. It takes 6 months to get initial results from the Portland offices, and nearly all requests require site surveys.

The Hill-Wheeler ansatz for the total wave function, within the Generator Coordinate Method framework, is generalized by recourse to the theory of distributions. The ensuing approach allows one to obtain a basis that spans the collective subspace, without having to deal explicitly with the eigenvectors and eigenvalues of the overlap kernel. Applications to an exactly soluble model and anharmonic vibrations illustrate the present treatment.

Disposal of industrial waste resulted in massive DDT contamination at Wheeler National Wildlife Refuge, Alabama. Nearly a decade after the cessation of DDT manufacturing at the facility responsible, concentrations of DDT residues in the local fauna are still high enough to suggest avian reproductive impairment and mortality. Populations of fish-eating birds are low, endangered species are being exposed, and muscle lipids of game birds contain up to 6,900 parts per million of DDT (isomers and metabolites).

Wheeler Ridge and vicinity, California, is a site of major tectonic activity, both historically and over recent geologic time. The epicenter of the magnitude 7.5 Kern County earthquake occurred here on July 21, 1952, and numerous geologic and topographic features indicate rapid geologic processes. The ridge itself (upper-right center) is a geologic fold that is growing out of the southern San Joaquin Valley. A prominent 'wind gap,' now used for passage of the California aqueduct (with the aid of a pumping station), is evidence that the ridge grew faster than the streams traversing it could erode down. Nearby abrupt and/or landslide-scarred mountain fronts similarly indicate a vigorous tectonic setting here, just north of the San Andreas fault. The Interstate 5 freeway can be seen crossing agricultural fields on the right and entering the very rugged and steep Grapevine Canyon toward the bottom.

This stereoscopic image was generated by draping a Landsat satellite image over a preliminary Shuttle Radar Topography Mission (SRTM) elevation model. Two differing perspectives were then calculated, one for each eye. They can be seen in 3-D by viewing the left image with the right eye and the right image with the left eye (cross-eyed viewing), or by downloading and printing the image pair and viewing them with a stereoscope. When stereoscopically merged, the result is a vertically exaggerated view of the Earth's surface in its full three dimensions. Landsat has been providing visible and infrared views of the Earth since 1972. SRTM elevation data matches the 30-meter resolution of most Landsat images and will substantially help in analyses of the large and growing Landsat image archive.

The elevation data used in this image was acquired by SRTM aboard the Space Shuttle Endeavour, launched on February 11, 2000. SRTM used the same radar instrument that comprised the Spaceborne Imaging Radar-C/X-Band Synthetic Aperture Radar (SIR-C/X-SAR) that flew twice on the Space Shuttle Endeavour in 1994.

This is an alternative interpretation of Jacques et al. (2007) on Wheeler's delayed-choice thought experiment. The researchers find that the choice of observables changes the previous behavior of the photon inside the interferometer. Stepping outside the QM box, we propose that elementary waves from the detectors travel backwards through the interferometer, and that the photon follows such a ray in the reverse direction. Thus a change in observables changes the behavior of the photon for the simple reason that the observable is transmitting information to the photon, and the photon is able to change its polarization mid-stream in response to a change in that information. According to this explanation there is no delayed choice; it is an illusion.

The US Department of Energy (DOE) has awarded Foster Wheeler Development Corporation a contract to develop a partial gasification module (PGM) that represents a critical element of several potential coal-fired Vision 21 plants. When utilized for electrical power generation, these plants will operate with efficiencies greater than 60% while producing near-zero emissions of traditional stack gas pollutants. The new process partially gasifies coal at elevated pressure, producing a coal-derived syngas and a char residue. The syngas can be used to fuel the most advanced power-producing equipment, such as solid oxide fuel cells or gas turbines, or processed to produce clean liquid fuels or chemicals for industrial users. The char residue is not wasted; it can also be used to generate electricity by fueling boilers that drive the most advanced ultra-supercritical pressure steam turbines. The unique aspect of the process is that it utilizes a pressurized circulating fluidized bed partial gasifier and does not attempt to consume the coal in a single step. Converting all the coal to syngas in a single step requires extremely high temperatures (approximately 2500 to 2800 °F) that melt and vaporize the coal and drive essentially all coal ash contaminants into the syngas. Since these contaminants can be corrosive to power generating equipment, the syngas must be cooled to near room temperature to enable a series of chemical processes to clean it. Foster Wheeler's process operates at much lower temperatures that control/minimize the release of contaminants; this eliminates/minimizes the need for the expensive, complicated syngas heat exchangers and chemical cleanup systems typical of high-temperature gasification. By performing the gasification in a circulating bed, a significant amount of syngas can still be produced despite the reduced temperature, and the circulating bed allows easy scale-up to large plants. Rather than air, it can also operate with oxygen to facilitate

During the early part of his career, John Archibald Wheeler made an astonishing number of contributions to nuclear and particle physics, as well as to classical electrodynamics, often in collaboration with another physicist. He was also a major contributor to the Manhattan Project (in Chicago and Hanford rather than Los Alamos), and, following World War II, became an influential scientific cold warrior. His early achievements in physics include the calculated scattering of light by light (with Gregory Breit), the prediction of nuclear rotational states (with Edward Teller), the theory of fission (with Niels Bohr), action-at-a-distance electrodynamics (with Richard Feynman), the theory of positronium, the universal weak interaction (with Jayme Tiomno), and the proposed use of the muon as a nuclear probe particle. He gained modest fame as the person who identified xenon 135 as a reactor poison. His Project Matterhorn contributed significantly to the design of the H bomb, and his Project 137, which he had hoped would flower into a major defense lab, served as the precursor to the Jason group.

Protection and enhancement of water quality is essential for attaining the full complement of beneficial uses of TVA reservoirs. The responsibility for improving and protecting TVA reservoir water quality is shared by various federal, state, and local agencies, as well as the thousands of corporations and property owners whose individual decisions affect water quality. TVA's role in this shared responsibility includes collecting and evaluating water resources data, disseminating water resources information, and acting as a catalyst to bring together agencies and individuals that have a responsibility or vested interest in correcting problems that have been identified. This report is one in a series of status reports that will be prepared for each of TVA's reservoirs. The purpose of this status report is to provide an up-to-date overview of the characteristics and conditions of Wheeler Reservoir, including: reservoir purposes and operation; physical characteristics of the reservoir and the watershed; water quality conditions; aquatic biological conditions; designated, actual, and potential uses of the reservoir and impairments of those uses; and ongoing or planned reservoir management activities. Information and data presented here are from the most recent reports, publications, and original data available. 21 refs., 8 figs., 29 tabs.

Along with recent progress in next-generation sequencing technology, it has become easier to process larger amounts of genome sequencing data at a lower cost. The most time-consuming step of next-generation sequencing data analysis is the mapping of read data onto a reference genome. Although the Burrows-Wheeler Alignment (BWA) tool is one of the most widely used open-source software tools for aligning read sequences, it has the limitation that it does not fully support multi-threading during the alignment generation step. In this article, we propose BWA-MT, a tool based on BWA that supports multi-threaded alignment generation. To evaluate BWA-MT, we used an evaluation system equipped with 24 cores and 128 GB of memory. As workloads, we used the hg19 human genome reference sequence and read sets of various sizes, from 1 M to 40 M spots. In our evaluation, BWA-MT showed up to 3.66-times better performance and generated the same Sequence Alignment/Map result file as BWA. Although the achievable speedup depends on computing resources, we confirmed that BWA-MT is a highly effective and fast alignment tool. PMID:27323088
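The abstract does not describe BWA-MT's internals, but the general pattern it relies on (splitting read batches across workers and merging the per-batch results in input order, so the output matches a single-threaded run) can be sketched as follows. The function names and the placeholder "alignment" are invented for illustration only.

```python
from concurrent.futures import ThreadPoolExecutor

def align_chunk(reads):
    """Placeholder for per-chunk alignment; in a real aligner this would
    run the BWT-based search of each read against the indexed reference."""
    return [f"{read}\taligned" for read in reads]

def parallel_align(reads, n_workers=4, chunk_size=2):
    # Split the read list into fixed-size batches.
    chunks = [reads[i:i + chunk_size] for i in range(0, len(reads), chunk_size)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        # Executor.map yields results in input order, so the merged
        # output is record-for-record identical to a serial run.
        batches = pool.map(align_chunk, chunks)
    merged = []
    for batch in batches:
        merged.extend(batch)
    return merged

reads = [f"read{i}" for i in range(5)]
sam_records = parallel_align(reads)
```

Order-preserving merging is the key design point: it is what lets a multi-threaded run produce the same SAM file as the single-threaded baseline, as the evaluation above requires.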

In this paper we study how all the physical ``constants'' vary in the framework of a model in which we take into account a generalized conservation principle for its stress-energy tensor. This possibility enables us to take into account adiabatic matter creation in order to get rid of the entropy problem. We generalize this situation by contemplating multi-fluid components. To validate the results obtained, we explore the possibility of considering the variation of the ``constants'' in the quantum cosmological scenario described by the Wheeler-DeWitt equation. For this purpose we explore the Wheeler-DeWitt equation in different contexts, but from a dimensional point of view. We end by presenting the Wheeler-DeWitt equation for the case in which all the constants vary. The quantum potential is obtained and the tunneling probability is studied.

This dissertation has two objectives. The first objective is to determine where best to situate the study of mentoring (i.e. the 'making of scientists') on the landscape of the history of science and science studies. This task is accomplished by establishing mentoring studies as a link between the robust body of literature dealing with Research Schools and the emerging scholarship surrounding the development, dispersion, and evolution of pedagogy in the training of twentieth-century physicists. The second, and perhaps more significant and novel objective, is to develop a means to quantitatively assess the mentoring workmanship of scientific craftsmen who preside over the final stages of preparation when apprentices are transformed into professional scientists. The project builds upon a 2006 Master's Thesis that examined John Archibald Wheeler's work as a mentor of theoretical physicists at Princeton University in the years 1938--1976. It includes Wheeler's work as a mentor at the University of Texas and is qualitatively and quantitatively enhanced by the author's access to five separate collections with archival holdings of John Wheeler's papers and correspondence, as well as to thirty-one tape-recorded interviews that feature John Wheeler as either the interviewee or a prominent subject of discussion. The project also benefited from the opportunity to meet with and gather background information from a number of John Wheeler's former colleagues and students. Included in the dissertation is a content analysis of the acknowledgements in 949 Ph.D. dissertations, 122 Master's Theses, and 670 Senior Theses that were submitted during Wheeler's career as an active mentor. By establishing a census of the students of the most active mentors at Princeton and Texas, it is possible to tabulate the publication record of these apprentice groups and obtain objective measures of mentoring efficacy. The dissertation concludes by discussing the wider

In 1952 John Wheeler turned his attention from nuclear physics and national defense to a backwater of physics: general relativity. Over the next 25 years, with students and postdocs he led a ``revolution'' that made relativity a major subfield of fundamental physics and a tool for astrophysics. Wheeler viewed curved spacetime as a nonlinear dynamical entity, to be studied via tools of geometrodynamics (by analogy with electrodynamics) -- including numerical relativity, for which his students laid the earliest foundations. With Joseph Weber (his postdoc), he did theoretical work on gravitational waves that helped launch Weber on a career of laying foundations for modern gravitational-wave detectors. Wheeler and his students showed compellingly that massive stars must form black holes; and he gave black holes their name, formulated the theory of their pulsations and stability (with Tullio Regge), and mentored several generations of students in seminal black-hole research (including Jacob Bekenstein's black-hole entropy). Before the discovery of pulsars, Wheeler identified magnetized, spinning neutron stars as the likely power sources for supernova remnants including the Crab nebula. He identified the Planck length and time as the characteristic scales for the laws of quantum gravity, and formulated the concept of quantum fluctuations of spacetime geometry and quantum foam. With Bryce DeWitt, he defined a quantum wave function on the space of 3-geometries and derived the Wheeler-DeWitt equation that governs it, along with its sum-over-histories action principle. Wheeler was a great inspiration to his colleagues and students, pointing the directions toward fruitful research problems and making intuitive-leap speculations about what lies beyond the frontiers of knowledge. Many of his ideas that sounded crazy at the time were ``just crazy enough to be right''.

The world of powered two-wheelers has changed dramatically in recent decades, along with a steady increase in the number and diversity of the fleet and its uses. This evolution has led to some benefits in terms of mobility, but also some drawbacks in terms of safety. The problems involved are neither simple nor monolithic, and there is a lack of knowledge about their different facets and backgrounds that requires research work on risk exposure, accident factors, and impact severity. This special issue of Accident Analysis and Prevention brings together 30 papers devoted to improving this knowledge, exploring the different aspects involved in the safety of powered two-wheelers through complementary methods. PMID:23036376

We evaluated the freshwater mussel fishery on Wheeler Reservoir, a 27,155-hectare mainstem impoundment of the Tennessee River in Alabama. During July 1991 through June 1992, we used a roving creel survey to conduct 285 interviews over 57 weekdays and 12 weekend days. Total harvest during the 12-month survey period was estimated to be 570 metric tons and included 15 mussel species. The most frequently harvested species were the washboard Megalonaias nervosa, Ohio pigtoe Pleurobema cordatum, and butterfly Ellipsaria lineolata. Harvest peaked in June at 290,414 mussels. Among collection techniques, total estimated effort was highest for divers (71,160 musseler-hours). The total estimated value of the 12-month mussel harvest (in terms of money paid to harvesters) from Wheeler Reservoir was US$2,119,921.
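The two survey totals imply an average price paid to harvesters, a figure not stated explicitly above but recoverable by simple arithmetic:

```python
# Figures taken from the survey above.
total_value_usd = 2_119_921   # money paid to harvesters over 12 months
total_harvest_t = 570         # metric tons harvested

price_per_tonne = total_value_usd / total_harvest_t
price_per_kg = price_per_tonne / 1000
# roughly $3,719 per metric ton, i.e. about $3.72 per kilogram
```

This is an average across all 15 species and the full year, so prices for individual species (e.g. washboard shell bound for the cultured-pearl trade) would have varied around it.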

I discuss the meaning of probability in the Everett-Wheeler interpretation of quantum mechanics, together with the problem of defining histories. To resolve these, I propose an understanding of probability arising from a form of temporal logic: the probability of a future-tense proposition is identified with its truth value in a many-valued and context-dependent logic. In short, probability is degree of truth. These ideas relate to traditional naive ideas of time and chance. Indeed, I argue that Everettian quantum mechanics is the only form of scientific theory that truly incorporates the perception that the future is open.

Powered-two-wheelers (PTWs) constitute a very vulnerable type of road user. The notable increase in their share of traffic and the high risk of severe accident occurrence raise the need for further research. However, current research on PTW safety is not as extensive as for other road users (passenger cars, etc.). Consequently, the objective of this research is to provide a critical review of research on Powered-Two-Wheeler behaviour and safety with regard to data collection, methods of analysis and contributory factors, and to discuss the needs for further research. Both macroscopic analyses (accident frequency, accident rates and severity) and microscopic analyses (PTW rider behaviour, interaction with other motorised traffic) are examined and discussed in this paper. The research gaps and the needs for future research are identified, discussed and put in a broad framework. When the interactions between behaviour, accident frequency/rates and severity are co-considered and co-investigated with the various contributory factors (riders, other users, road and traffic environment, vehicles), the accident and injury causes as well as the related solutions are better identified. PMID:24882114

John Archibald Wheeler was born on July 9, 1911, in Jacksonville, Florida, and passed away on April 13, 2008, in Hightstown, New Jersey; his influence on gravitational physics and science in general will remain forever. Among his many important contributions to physics, he was one of the fathers of the renaissance of General Relativity. After a golden starting age in the few years following Einstein's papers of 1915-1916, Einstein's gravitational theory was for many years, to quote the preface of a 1960 book on General Relativity [1], confined to "an ivory tower…and no doubt many a relativist looks forward to the day when the governments will seek his opinion on important questions".

We consider an extension of Wheeler-DeWitt minisuperspace cosmology with additional interaction terms that preserve the linear structure of the theory. General perturbative methods are developed and applied to known semiclassical solutions for a closed Universe filled with a massless scalar. The exact Feynman propagator of the free theory is derived by means of a conformal transformation in minisuperspace. As an example, a stochastic interaction term is considered, and first-order perturbative corrections are computed. It is argued that such an interaction can be used to describe the interaction of the cosmological background with the microscopic degrees of freedom of the gravitational field. A Helmholtz-like equation is considered for the case of interactions that do not depend on the internal time, and the corresponding Green's kernel is obtained exactly. The possibility of linking this approach to fundamental theories of quantum gravity is investigated.

It had been shown that the Qadir-Wheeler inhomogeneous cosmological model, constructed by a ``cut-and-paste'' procedure from two closed Friedmann models, develops a corridor in the late stages of the crunch. It had been argued that the proper length of the corridor tends to infinity with York time. However, this was not proved, nor was it clear how the length would tend to infinity. Likewise, it was argued that the volume of the model would shrink to zero, but it was not apparent precisely how it would do so. Here the earlier conjecture for the first-order behavior of the proper length and volume of the model is proved, and the next-order correction is worked out so as to make clear at what York time the onset of the crunch occurs.

The Foster Wheeler pyrolysis technology is a process in which used, shredded tires are heated in the absence of air to produce a light fuel oil, solid fuel, and high-grade steel. Hot gases (600 deg. C), containing no oxygen, are passed through the bed of tires, causing pyrolysis to occur. Oil in the vapor phase is condensed and collected in the quench column. Remaining gases either fuel the process or are recycled through the reactor. Solid products are continuously removed from the reactor.

Wheeler and eastern Gray Counties are in the east-central part of the Texas Panhandle. The two counties are characterized by rolling to fairly rugged topography with many sand-dune areas and a well-developed drainage system.

Problems Encountered During Contract Year--Wheeler County residents are mostly non-participants when it comes to Farm Services programs. As a result of the county's nonparticipation, rental rates are the lowest in the state. There is a government fear factor as well as an obvious distance limitation; the FSA office is a nearly 150-mile round trip from two of the county's urban areas. I find myself not only selling the CREP-Riparian Buffer but also selling Farm Services in general. Training has been very limited. NRCS is obviously not designed around training and certification; it is an on-the-job training organization. This has caused a hesitation in my outreach program and a great deal of frustration. I feel my confidence will strengthen with the follow-through of the current projects. The most evident problem has come to light as of late: the program is too expensive to implement. The planting is too intensive for a 12''-18'' rainfall area. I provide the potential landowner a spreadsheet with the bonuses, the costs, and the final outcome. No matter the situation, CREP or CCRP, the landowner always balks at the cost. The program assumes the landowner has the capital to make the initial investment. For example, project No. 2 is going to be a minimum-width buffer, approximately 3,000 ft long and 5.5 acres. Tree planting and fencing alone will cost nearly $13,000; with the water developments, the total nears $23,000. That is nearly 10% of the operating budget of a 250-mother-cow operation. For project No. 1, the tree planting estimate is $45,000, which alone is nearly 25% of the same type of budget. I would greatly appreciate any help in finding a third party willing to put money to work covering the initial costs of the program, expecting reimbursement from the Farm Services Agency. I believe this could create a powerful tool for buffering streams in Wheeler County. Outlook for Contract Year 2--I have been in this position now for 6 months. I am beginning to feel a

A DDT manufacturing plant that operated on the Redstone Arsenal near Huntsville, Alabama discharged DDT-laden effluent from 1947 to 1970 into a creek on Wheeler National Wildlife Refuge. Seven to 9 years after the plant closed, high DDT, DDE, and DDD levels were reported in soils, river sediments, and fish in the area. Eleven of 27 mallards (Anas platyrhynchos) collected on the Refuge during February 1979 had carcass DDE residues that exceeded levels associated with eggshell thinning. DDE residues in a smaller number of mallards exceeded levels associated with egg breakage, poor hatchability, and abnormal behavior and poor survival of offspring. Several avian species have disappeared from the Refuge since 1950, probably due to both industrial discharges of DDT from the plant and insecticidal use of DDT in the area. The contamination still presents a threat to herons, waterfowl, and raptors including occasional wintering or migrant eagles (Haliaeetus leucocephalus), and probably many other avian species. A maternity colony of endangered gray bats (Myotis grisescens) is also threatened by this contamination.

Physical properties of the quantum gravitational vacuum state are explored by solving a lattice version of the Wheeler-DeWitt equation. The constraint of diffeomorphism invariance is strong enough to uniquely determine part of the structure of the vacuum wave functional in the limit of infinitely fine triangulations of the three-sphere. In the large fluctuation regime, the nature of the wave function solution is such that a physically acceptable ground state emerges, with a finite nonperturbative correlation length naturally cutting off any infrared divergences. The location of the critical point in Newton’s constant Gc, separating the weak from the strong coupling phase, is obtained, and it is inferred from the general structure of the wave functional that fluctuations in the curvatures become unbounded at this point. Investigations of the vacuum wave functional further suggest that for weak enough coupling, G

Text is a medium that is often used to convey information in both wired and wireless networks. One limitation of wireless systems is network bandwidth. In this study we implemented a text compression application with a lossless compression technique using a combination of the Burrows-Wheeler transform, move-to-front, and Huffman coding methods. With the addition of text compression, we expect to save network resources. The application reports the compression ratio achieved. From the testing process, we conclude that text compression with only the Huffman coding method is efficient when the number of text characters is above 400 characters, whereas text compression with the Burrows-Wheeler transform, move-to-front, and Huffman coding methods is efficient when the number of text characters is above 531 characters. The combination of these methods is more efficient than Huffman coding alone when the number of text characters is above 979 characters. The more characters that are compressed, and the more patterns of the same symbol, the better the compression ratio.
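A minimal sketch of the first two stages of this pipeline (not the study's implementation; the sentinel character and the demo string are illustrative choices): the Burrows-Wheeler transform clusters identical symbols together, and move-to-front then converts those clusters into runs of small integers that Huffman coding can compress efficiently.

```python
def bwt(text, sentinel="\x00"):
    """Burrows-Wheeler transform via sorted rotations of text + sentinel."""
    s = text + sentinel
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

def inverse_bwt(last_col, sentinel="\x00"):
    """Invert the BWT by repeatedly prepending the last column and re-sorting."""
    table = [""] * len(last_col)
    for _ in range(len(last_col)):
        table = sorted(last_col[i] + table[i] for i in range(len(last_col)))
    row = next(r for r in table if r.endswith(sentinel))
    return row[:-1]  # drop the sentinel

def move_to_front(data):
    """Encode each symbol as its index in a self-organizing alphabet list;
    repeated symbols (which the BWT clusters) become runs of zeros."""
    alphabet = sorted(set(data))
    out = []
    for ch in data:
        idx = alphabet.index(ch)
        out.append(idx)
        alphabet.insert(0, alphabet.pop(idx))
    return out

text = "bananabanana"
transformed = bwt(text)
assert inverse_bwt(transformed) == text
# The MTF output is dominated by small values, which Huffman coding exploits.
print(move_to_front(transformed))
```

The naive sorted-rotations construction is O(n² log n) and serves only to show the transform; production coders use suffix-array-based BWT.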

The paper proposes a model for estimating the waiting endurance times of electric two-wheelers at signalized intersections using survival analysis. Waiting duration times were collected by video cameras and were assigned as censored or uncensored data to distinguish between normal crossing and red-light-running behavior. A Cox proportional hazards model was introduced, and variables revealing personal characteristics and traffic conditions were defined as covariates to describe the effects of internal and external factors. Empirical results show that riders do not want to wait too long to cross intersections. As signal waiting time increases, electric two-wheelers become impatient and violate the traffic signal. Some 12.8% of electric two-wheelers crossed with negligible wait time, while 25.0% were generally non-risk-takers who obeyed the traffic rules even after waiting for 100 seconds. Half of electric two-wheelers could not endure 49.0 seconds or longer at the red-light phase. Red-phase time, motor vehicle volume, and conformity behavior have important effects on riders' waiting times. Waiting endurance times decrease with longer red-phase time, lower traffic volume, or a larger number of other riders running against the red light. The proposed model may be applicable to the design, management, and control of signalized intersections in other developing cities. PMID:24895659
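The censoring idea above can be sketched with a Kaplan-Meier estimator (a simpler nonparametric cousin of the Cox model used in the paper). The waiting times below are invented for illustration: red-light running is the observed event, and riders who waited out the red phase are right-censored.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve: events[i] = 1 if red-light running was
    observed, 0 if the waiting time is right-censored (rider waited it out)."""
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    survival = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        at_t = [e for tt, e in pairs if tt == t]   # all observations at time t
        deaths = sum(at_t)                          # observed events at time t
        if deaths:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= len(at_t)                      # leave the risk set
        i += len(at_t)
    return curve

# invented waiting times (seconds); event = 1 means the rider crossed on red
times = [5, 12, 20, 20, 33, 45, 49, 60, 60, 75]
events = [1, 1, 1, 0, 1, 0, 1, 1, 0, 0]
curve = kaplan_meier(times, events)
# median endurance: first time at which survival drops to 0.5 or below
median = next((t for t, s in curve if s <= 0.5), None)
print(curve)
print("median endurance (s):", median)
```

The Cox model additionally regresses the hazard on covariates (red-phase time, traffic volume, conformity behavior); the Kaplan-Meier curve shows only the baseline endurance distribution.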

In this paper, we consider the Wheeler-DeWitt equation modified by a deformation of the second-quantized canonical commutation relations. Such modified commutation relations are induced by a Generalized Uncertainty Principle. Since the Wheeler-DeWitt equation can be related to a Sturm-Liouville problem whose associated eigenvalue can be interpreted as the cosmological constant, it is possible to explicitly relate this eigenvalue to the deformation parameter of the corresponding Wheeler-DeWitt equation. The analysis is performed in a Mini-Superspace approach where the scale factor appears as the only degree of freedom. The deformation of the Wheeler-DeWitt equation gives rise to a Cosmological Constant even in the absence of matter fields. Since a Cosmological Constant cannot exist in the absence of matter fields in the undeformed Mini-Superspace approach, the existence of a non-vanishing Cosmological Constant is a direct consequence of the deformation by the Generalized Uncertainty Principle. In fact, we are able to demonstrate that a non-vanishing Cosmological Constant exists even in the deformed flat space. We also discuss the consequences of this deformation for the big bang singularity.

This article presents an interview with Elizabeth Boling, aka Noel Wheeler, aka Skater Owens, aka EXB, editor of "TechTrends" magazine. EXB has edited AECT's bi-monthly peer-reviewed magazine since January 2003. She is also an associate professor in the Instructional Systems Technology department at Indiana University Bloomington (IUB), and in her…

Background: Approximately 5% of all households in Sri Lanka operate a three-wheeler as their primary source of income. However, very little is known about the occupational health risks associated with driving these vehicles. Objectives: The aim of this study was to assess occupational risk factors, including the number of hours worked, associated with the 4-week prevalence of low back pain (LBP) among drivers of three-wheelers. Methods: Questionnaires were administered to 200 full-time drivers of three-wheelers from the Galle District in Sri Lanka. Occupational, psychological, socio-demographic, lifestyle, and anthropometric variables were collected. Univariate and multivariate analyses were used to investigate the association between occupational risk factors and the prevalence of LBP. Results: 15.5% of respondents reported experiencing LBP in the previous 4 weeks. Univariate analysis revealed that the number of hours worked per week, feeling pressure to compete with other drivers, and perceived stress scale scores were significantly associated with the 4-week prevalence of LBP. Multivariate analysis found that the number of hours worked per week and engine type were significantly associated with LBP. Conclusions: LBP is common among drivers of three-wheelers in Sri Lanka. Long work hours and two-stroke engines were significantly associated with LBP. Results from this study point towards a role for educational, behavioral health, and policy interventions to help prevent and reduce LBP among these drivers. PMID:25133353

Wheeler Ridge is an eastward-propagating and north-vergent fault-bend fold (10 km axis, 330 m relief) at the front of the Transverse Range thrust system, southern San Joaquin Valley, CA. The occurrence of wind and water gaps along tear faults, as well as elevated, distinct geomorphic surfaces along the fold axis, shows that Wheeler Ridge is actively growing over the kyr timescale. The soils, tectonic geomorphology, structural modeling, and slip rate estimations were research foci in the 1990s. Previous studies used total station surveys, contour maps, trenches (5 m deep), RASCAL (RAster SCanning Airborne Lidar) scanning (~1 pnt/1.54 m, ~0.8 km2) on the forelimb, and aerial photographs to map distinct geomorphic surfaces and features and to generate topographic profiles perpendicular and parallel to the fold axis. In September 2014, a ~40 km2 area of high-resolution (>4 pnts/m2) light detection and ranging (lidar) data will be collected at Wheeler Ridge by the National Center for Airborne Laser Mapping (NCALM) via a graduate student Seed grant. The new topographic data will enable us to build on previous studies of meter-scale topography on the 10 km fold and refine the geometric characterization of identified landforms that have been used to infer uplift, east fault tip propagation, and fold geometry in the last 250 kyr. These features include 5 geomorphic surfaces, stream channel geometry, and alluvial terraces on the forelimb, as well as wind and water gaps. This initial work will refine the representation of these features with high-resolution topography, apply additional morphometric techniques to the hillslopes and channels, examine the spatial variation of landforms with independently determined uplift rates, create an updated, detailed geomorphic map of the 10 km fold axis of Wheeler Ridge, and use previous subsurface modeling and these new constraints to inform an updated structural model for the growth history of Wheeler Ridge.

Lie symmetries are discussed for the Wheeler-DeWitt equation in Bianchi Class A cosmologies. In particular, we consider general relativity, minimally coupled scalar-field gravity, and hybrid gravity as paradigmatic examples of the approach. Several invariant solutions are determined and classified according to the form of the scalar-field potential. The approach gives rise to a suitable method for selecting classical solutions, and it is based on the existence of symmetries as a first principle.

Context: Concerns about road safety have been increasingly associated with two-wheeler riding and especially with young commuters in India. Aims: The study was designed to explore inclination to speeding and to profile the driving behaviors of two-wheeler riding young men and women who reported a tendency to ride faster than their peers. Design: A cross-sectional survey design was used. Materials and Methods: On the basis of three focus group discussions and a review of the literature, a survey was prepared to tap domains such as affect states associated with riding/speeding, factors contributing to speeding, inclination for competing, perceived speed and safety, etc. The study sample comprised 961 two-wheeler riding college-going young men and women in Bangalore. Statistical Analysis: Descriptive and inferential statistical procedures were used, including Chi-square, Spearman's rank correlation, and independent sample t-test. Results: The sample was divided into two subgroups on the basis of self-report of greater speeding than one's peers. A subgroup of 349 participants endorsed the item regarding inclination to ride faster than one's peers, whereas the remaining 612 participants did not endorse it. The profiles of these two subgroups were obtained in terms of sociodemographic variables, riding behaviors, and associated domains. Significant differences between the subgroups emerged on domains such as motives for riding fast, tendency for competing, perceived safety, and frequency of minor accidents while riding. Conclusions: Several correlates of the tendency to speeding among young two-wheeler riders emerged that have implications for enhancing safe riding. PMID:25788799

Drosophila buzzatii (Patterson & Wheeler), a typical cactophilic species of the repleta group, is recorded for the first time emerging from melon (Cucumis melo) in western Argentina. The analysis of inversion polymorphism and the genetic diversity of the mitochondrial cytochrome oxidase subunit I gene (mtCOI) provided additional evidence corroborating the presence of a high proportion of D. buzzatii among the flies that emerged from melon. This finding sets the stage for a broader range of possible hosts and host-related distribution and dispersion for this widespread species. PMID:26960546

Two-wheeler vehicles in Delhi, India--roughly 70% of the total vehicle fleet--are responsible for a significant portion of the city's vehicle emissions and petroleum consumption. An inspection and maintenance (I/M) program that ensures vehicle emission control systems are well maintained can complement other emission reduction strategies. This paper presents the initial findings of extensive data collected on vehicle characteristics and emissions for two-wheeler vehicles operating in Delhi in a series of I/M camps conducted by the Society of Indian Automobile Manufacturers and various partners in late 1999. The analysis shows that idle HC and CO emissions [measured in parts per million (ppm) and volume % (vol %), respectively] follow a slowly declining trend across subsequent model years, reflecting tighter emission standards and more advanced emission technologies. The I/M benefits--3 vol % and 39% reduction in idle and mass CO, respectively; 40 vol % and 22% reduction in idle and mass HC, respectively; and a 10-20% increase in fuel efficiency--were higher than those reported in the literature. Although these benefits are substantial, any implementation strategy needs to consider cost-effectiveness. In the present study, only 10% of vehicles--contributing 22% of the total vehicle emissions--failed the idle CO standard. Fleet emissions data variability necessitates a large sample size to develop a baseline for the vehicle fleet, but a smaller, scientifically designed sample and better data collection quality could periodically track the benefits at future camps. PMID:11686242

A recent study found that cutting shoots under water while xylem was under tension (which has been the standard protocol for the past few decades) could produce artefactual embolisms inside the xylem, overestimating hydraulic vulnerability relative to shoots cut under water after relaxing xylem tension (Wheeler et al. 2013). That study also raised the possibility that such a 'Wheeler effect' might occur in studies of leaf hydraulic vulnerability. We tested for such an effect for four species by applying a modified vacuum pump method to leaves with minor veins severed, to construct leaf xylem hydraulic vulnerability curves. We tested for an impact on leaf xylem hydraulic conductance (Kx) of cutting the petiole and minor veins under water for dehydrated leaves with xylem under tension compared with dehydrated leaves after previously relaxing xylem tension. Our results showed no significant 'cutting artefact' for leaf xylem. The lack of an effect for leaves could not be explained by narrower or shorter xylem conduits, and may be due to lesser mechanical stress imposed when cutting leaf petioles, and/or to rapid refilling of emboli in petioles. These findings provide the first validation of previous measurements of leaf hydraulic vulnerability against this potential artefact. PMID:25039813

For a few years, the use of powered two-wheelers has taken off in Paris. It therefore became critical for the City of Paris to understand both the mechanisms leading to traffic accidents involving at least one powered two-wheeler user and these users' perception of risk when riding in dense urban areas. To that end, two studies were carried out along similar lines so that their results could be compared. The first study focused on the perception of situations where accidents are most likely to occur. The second was an analysis of police reports of accidents involving at least one powered two-wheeler and the drawing-up of prototypical accident scenarios. Comparing the results of the two studies revealed a gap between the perceived and objective risks of these users. In fact, they most fear situations in which a car driver is changing lanes, while accidents involving them occur more often when a car driver turns (right, left, or U). Knowledge of this dissonance in awareness of road risks for powered two-wheelers, and equally for other road users, will give the City of Paris food for thought. The promising results of this study have encouraged the City of Paris to extend it to other types of users, such as cyclists or elderly pedestrians. PMID:23036388

Two brief guides offer suggestions for persons with physical disabilities who are considering the purchase of adaptive driving equipment, battery-powered scooters, or three wheelers. The first guide offers guidelines for individuals considering purchase of special hand controls or other modifications or a van lift to enhance their independence in…

In this research we present a new method for image compression that combines a lossy and a lossless image coder. In the lossy part we use a 3-level decomposition of the discrete wavelet transform with a hard-threshold technique. The lossless image coding part is the Burrows-Wheeler Transform (BWT), which has received considerable attention in recent years because of its simplicity and effectiveness. The BWT is used with additional image compression methods such as MTF, RLE, and entropy coding algorithms. The effectiveness of our method is presented and compared with JPEG and JPEG2000 for different images, considering criteria such as PSNR, compression ratio (CR), and image quality (HVS). In future work we will use the BWT with other source coding algorithms such as Lempel-Ziv and different mother functions for the DWT.
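The lossy stage can be illustrated with a 1-level 1-D Haar transform and hard thresholding (the paper uses a 3-level 2-D DWT; this simplified stand-in, with an invented test signal, shows how small detail coefficients are zeroed before the lossless BWT-based stage):

```python
def haar_forward(x):
    """One level of the Haar DWT: pairwise averages (approximation)
    and pairwise half-differences (detail)."""
    approx = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Exact inverse of haar_forward when no coefficients are altered."""
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])
    return out

def hard_threshold(coeffs, t):
    """Zero out coefficients with magnitude below t (the lossy step)."""
    return [c if abs(c) >= t else 0.0 for c in coeffs]

signal = [10, 12, 11, 9, 50, 52, 49, 51]
approx, detail = haar_forward(signal)
detail_t = hard_threshold(detail, 2.0)
reconstructed = haar_inverse(approx, detail_t)
# Small details (noise-like wiggles) are discarded; large structure survives,
# and the sparser coefficient stream compresses better in the lossless stage.
print(reconstructed)
```

A multi-level decomposition simply repeats `haar_forward` on the approximation coefficients; 2-D images apply the transform along rows and then columns.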

Power-Two-Wheelers (PTWs) constitute a vulnerable class of road users with increased frequency and severity of accidents. The present paper focuses on PTW accident risk factors and reviews existing literature with regard to PTW drivers' interactions with automobile drivers, as well as interactions with infrastructure elements and weather conditions. Several critical risk factors are revealed with different levels of influence on PTW accident likelihood and severity. A broad classification based on the magnitude of each risk factor and the need for further research is proposed. The paper concludes by discussing the importance of dealing with accident configurations, data quality and availability, the methods implemented to model risk and exposure, and risk identification, all of which are critical for a thorough understanding of the determinants of PTW safety. PMID:22579296

The Columbine-Hondo Wilderness Study Area and the Wheeler Peak Wilderness, near Taos, New Mexico, were studied. Two areas within the study area and wilderness have substantiated mineral-resource potential for molybdenum and are surrounded by an area of probable molybdenum potential. A small area on the north side of the Columbine-Hondo Wilderness Study Area also has probable molybdenum potential; this area is peripheral to the Red River mining area, which includes the Questa mine. Two other areas within the study area and wilderness have probable mineral-resource potential for copper, lead, and zinc, and four areas have probable mineral-resource potential for gold and silver, as well as base metals.

Wheeler Ridge is an asymmetric east-propagating anticline (10 km axis, 330 m relief) above a north-vergent blind thrust deforming Quaternary alluvial fan and shallow marine rocks at the northern front of the Transverse Ranges, San Joaquin Valley, CA. This area was a research focus in the 1990s, when the soils, U-series soil carbonate dating, and the subsurface structure of deformed strata identified from oil wells were used to create a kinematic model of deformation and estimates of fault slip, uplift, and lateral propagation rates. A recent collection of light detection and ranging (lidar) topographic data and optically stimulated luminescence (OSL) data allows us to complete meter-scale topographic analyses of the fluvial networks and hillslopes and to correlate geomorphic response to tectonics. We interpret these results using a detailed morphological map and observe drainage network and hillslope process transitions both along and across the fold axis. With lidar topography, we extract common morphometrics (e.g., channel steepness ksn, eroded volume, hillslope relief) to illustrate how the landscape is responding to variations in uplift rate along the fold axis and to show the asymmetry of the surface response on the forelimb and backlimb. The forelimb is dominated by large drainages with landslides initiating in the marine units at the core of the fold. Our topographic analysis shows that the stream channel index values on the forelimb increase along the fold axis, away from the propagation tip. The backlimb drainages are dominantly long and linear with broad ridgelines. Using lidar and fieldwork, we see that uplifted backlimb surfaces preserve the deformed fan surface. The preliminary OSL results from alluvial fan units improve the age control of previously defined surfaces, refining our understanding of the deposition and uplift of alluvial fan units preserved on the backlimb.

The interior of both Schwarzschild and Kantowski-Sachs black holes was studied via a non-canonical noncommutative extension of the Wheeler-DeWitt equation in the interesting work of Bastos et al. (Phys. Rev. D 84:024005, 2011) by neglecting the cubic and quartic terms arising in the Hamiltonian. Here, we consider the problem when these terms are present and provide a more realistic analytical solution.

We show that the solutions of the Wheeler-DeWitt equation in a homogeneous and isotropic universe are given by triconfluent Heun functions for the spatially closed, flat, and open geometries of the Friedmann-Robertson-Walker universe filled with different forms of energy. In a matter-dominated universe, we find the polynomial solution and the energy density spectrum. In the radiation-dominated and vacuum universes, we show that there are no polynomial solutions.
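For orientation, the minisuperspace equation behind this result can be written schematically as follows (operator-ordering ambiguities and overall constants are suppressed, and the sign conventions may differ from the paper's):

```latex
% Wheeler-DeWitt equation in FRW minisuperspace, schematic form.
% \psi(a) depends only on the scale factor a; k = +1, 0, -1 selects the
% closed, flat, or open geometry; w is the equation-of-state parameter
% of the energy content, entering through \rho \propto a^{-3(1+w)}.
\left[\frac{d^{2}}{da^{2}} - k\,a^{2} + \Lambda\,a^{4}
      + C_{w}\,a^{1-3w}\right]\psi(a) = 0
% Matter domination (w = 0) contributes a term linear in a, radiation
% (w = 1/3) a constant; a polynomial "potential" of at most quartic
% degree in a is the setting in which triconfluent Heun functions arise.
```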

An experimental approach to the verification of specific relations between thermodynamic properties, as predicted from the Griffiths-Wheeler theory of critical phenomena in multicomponent systems, is developed for the particular case of ordinary liquid-liquid critical points of binary mixtures. Densities ρ(T), isobaric heat capacities per unit volume Cp(T), and previously reported values of the slope of the critical line (dT/dp)c for five critical mixtures are used to check the thermodynamic consistency of Cp and ρ near the critical point. An appropriate treatment of the ρ(T) data is found to provide the key to this issue. In addition, various alternative treatments of the Cp(T) data provide values for both the critical exponent α and the ratio between the critical amplitudes of the heat capacity, A+/A-, that agree with their widely accepted counterparts, whereas two-scale-factor universality is successfully verified in one of the systems studied.
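The extraction of α from heat-capacity data can be sketched on synthetic data (the functional form is standard; α = 0.11 is roughly the accepted Ising-like value, while the amplitude, background, and reduced temperatures below are invented): near the critical point the singular part behaves as Cp ≈ A±|t|^(−α) + B with t = (T − Tc)/Tc, so a linear regression of log(Cp − B) against log|t| recovers −α as the slope and log A± as the intercept.

```python
import math

def linregress(xs, ys):
    """Ordinary least-squares slope and intercept for y = slope*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# synthetic data on the T > Tc branch: Cp = A+ * t^(-alpha) + B
alpha_true, A_plus, B = 0.11, 1.0, 5.0
ts = [10 ** (-k / 2) for k in range(2, 12)]          # reduced temperatures
cps = [A_plus * t ** (-alpha_true) + B for t in ts]  # synthetic heat capacity

# regress log(Cp - B) on log t: slope = -alpha, intercept = log A+
slope, intercept = linregress([math.log(t) for t in ts],
                              [math.log(c - B) for c in cps])
alpha_fit, A_fit = -slope, math.exp(intercept)
print(round(alpha_fit, 3), round(A_fit, 3))
```

With real data, B and Tc are not known in advance and must be fitted jointly (typically by nonlinear least squares on both branches), which is what makes recovering A+/A- delicate in practice.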

The paper proposes a methodology based on Bayesian Networks for identifying the power two-wheeler (PTW) driving patterns that arise at the emergence of a critical incident, based on high-resolution driving data (100 Hz) from a naturalistic PTW driving experiment. The proposed methodology aims at identifying the prevailing PTW drivers' actions at the beginning of and during critical incidents and associating the critical incidents with specific PTW driving patterns. Results using data from one PTW driver reveal three prevailing driving actions for describing the onset of an incident and an equal number of actions that a PTW driver executes during the course of an incident to avoid a crash. Furthermore, the proposed methodology efficiently relates the observed sets of actions to different types of incidents occurring during overtaking or due to the interactions of the rider with moving or stationary obstacles and the opposing traffic. The observed interrelations define several driving patterns that are characterized by different initial actions, as well as by different likelihoods of sequential actions during the incident. The proposed modeling may have significant implications for the efficient and less time-consuming analysis of naturalistic data, as well as for the development of custom-made PTW driver assistance systems. PMID:23375128

Wheeler's delayed-choice experiment highlights strange features of quantum theory such as pre-sensing of the experimental setup by the quantum object and the role of time. A recent proposal for such an experiment with an interferometer having a quantum beam splitter (QBS) [R. Ionicioiu and D. R. Terno, Phys. Rev. Lett. 107, 230406 (2011), 10.1103/PhysRevLett.107.230406] and its subsequent experimental implementations through photonics and NMR have produced results including the modification in the concept of complementarity. Here we propose a matter-wave Mach-Zehnder-Bragg cavity-QED interferometric setup with final QBS engineered through a cavity field that is taken initially in the superposition of zero and one photon. The setup operates through first-order off-resonant Bragg diffraction of the neutral atoms from the cavity fields with the matter wave's particle (wave) nature marked through the absence (presence) of a photon in the final cavity. The proposal, addressing the issue through atomic de Broglie waves, can be executed within the present cavity-QED experimental scenario with appreciable success probability and fidelity.

Powered two wheelers (PTWs) come in diverse forms and are used for a range of purposes in very different parts of the world. In many parts of the world, the forms and uses of PTWs are changing, influenced by social, economic and demographic changes. Most of the challenges associated with PTWs relate to safety, while the majority of the opportunities relate to mobility. The challenges for improving safety relate to the PTW user, other road users, the road environment, the vehicle, data and research, and socio-political dimensions. The relative importance of particular challenges varies between developed and developing countries, and among developing countries according to whether PTWs are largely used for recreation or for transport. PTWs present a range of psychological, transport, economic and environmental opportunities to individuals and societies. The fun and excitement of riding PTWs is a major motivator for their purchase and use for recreational purposes, both off-road and on-road. The transport and economic advantages to the individual also need to be considered. At a societal level, research has examined the potential for increasing PTW volumes to reduce fossil fuel use and traffic congestion in busy cities. The future of PTWs may differ greatly between countries and environmental and technological changes are leading to an evolution in the form of PTWs to encompass new modes of personal transportation. PMID:22062331

In recent years, the role of "sleepiness at the wheel" in the occurrence of accidents has been increasingly highlighted in several national and international public health campaigns based on consensual research publications. However, one aspect of this phenomenon is rarely taken into account: the risk of sleep-induced accidents while riding powered two-wheelers (PTWs). PTWs are indeed involved in a high percentage of fatal accidents, mostly with young male riders. The effects of sleepiness may differ between drivers and riders, partly because riders may be more stimulated by the road environment. But riders (unlike drivers) must also continuously balance their own stability against the need to follow the road, even while directly exposed to adverse weather conditions. We therefore gathered the limited scientific literature on this topic and analyzed how riders may be affected differently by sleepiness. Finally, we provide some suggestions as to how this question may be better approached in the future. PMID:26140871

The decade of the 1870s was a time of extensive exploration and surveying in the American West. The nation needed knowledge of the cultural features, topography, natural resources, and geology of this land to promote and aid the 'rapid development of an empire.' The need was particularly acute in the region that still was known in the early 1870s as Colorado Territory. There, cities and towns were springing up along the base of the Front Range, railroads were expanding, and in the mountains prospectors and miners were exploring the countryside seeking and extracting the region's abundant mineral resources. Also, recurring conflicts between the newcomers and Native Americans made it desirable to have accurate maps for military purposes. Four major government-sponsored scientific surveys formed the principal organized effort to provide critical knowledge of the land. Civilian scientists led three of these: John Wesley Powell ('Geographical and Topographical Survey of the Colorado River of the West'); Ferdinand V. Hayden ('Geological and Geographical Survey of the Territories'); and Clarence King ('Geological Exploration of the Fortieth Parallel'). Lt. George Montague Wheeler, a young graduate of West Point (Class of 1866) and a member of the U.S. Army Corps of Engineers, led the fourth and most ambitious project ('United States Geographical Surveys West of the One Hundredth Meridian').

PURPOSE. To provide an overview of associations between wheelchair propulsion biomechanics for both everyday and racing wheelchairs, wheeling-related upper limb injuries, and quality of life of manual wheelchair users through a synthesis of the available information. METHODS. A search of publications was carried out in the PubMed and SportsDiscus databases. Studies on wheelchair propulsion biomechanics, upper limb injuries associated with wheelchair propulsion, and quality of life of wheelchair users were identified. Relevant articles cited in identified articles but not indexed in PubMed or SportsDiscus were also included. RESULTS. Wheelchair sports participation has a positive impact on quality of life, and research in racing wheelchair biomechanics can indirectly promote the visibility of wheelchair sports. The impact of pushrim-activated power-assisted wheelchairs (a hybrid between manual and battery-powered wheelchairs) and geared manual wheels on wheelers' everyday life is discussed. CONCLUSIONS. The study of wheelchair propulsion biomechanics focuses on how a wheelchair user imparts power to the wheels to achieve mobility; the accumulated knowledge can help to improve wheelchair users' mobility, reduce the physical stress associated with wheelchair propulsion, and, as a result, enhance quality of life. PMID:20932232

This report focuses on lithium-ion (Li-ion) battery technology applications for two- and possibly three-wheeled vehicles. The author of this report visited the People's Republic of China (PRC or China) to assess the status of Li-ion battery technology there and to analyze Chinese policies, regulations, and incentives for using this technology and for using two- and three-wheeled vehicles. Another objective was to determine if the Li-ion batteries produced in China were available for benchmarking in the United States. The United States continues to lead the world in Li-ion technology research and development (R&D). Its strong R&D program is funded by the U.S. Department of Energy and other federal agencies, such as the National Institute of Standards and Technology and the U.S. Department of Defense. In Asia, too, developed countries like China, Korea, and Japan are commercializing and producing this technology. In China, more than 120 companies are involved in producing Li-ion batteries. There are more than 139 manufacturers of electric bicycles (also referred to as E-bicycles, electric bikes or E-bikes, and electric two-wheelers or ETWs in this report) and several hundred suppliers. Most E-bikes use lead acid batteries, but there is a push toward using Li-ion battery technology for two- and three-wheeled applications. Highlights and conclusions from this visit are provided in this report and summarized.

This study addresses the impact of an actual drive pattern on the sizing and cost of a battery pack for a plug-in hybrid electric two-wheeler. To estimate the daily average travel distance used in fixing the all-electric range of two-wheelers, a study conducted in Coimbatore city is presented. A MATLAB simulation model developed for estimating the energy and power requirements of an all-electric strategy using an Indian driving cycle (IDC) and a real-world driving pattern is discussed. The simulation results reveal the impact of the real-world driving pattern on energy consumption and the influence of the all-electric range on sizing of the battery pack. To validate the results, a plug-in hybrid electric two-wheeler developed by modifying a standard two-wheeler was tested on the road with the help of the IDC simulator kit. An annual battery cost comparison shows that nickel-metal-hydride batteries are more economical and suitable for use in plug-in hybrid electric two-wheelers.
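The sizing logic described above amounts to simple energy bookkeeping: the pack must store the energy for the all-electric range, scaled up by the allowed depth of discharge. The sketch below uses illustrative numbers (25 Wh/km, 30 km, and a 75% usable window are our assumptions, not values from the Coimbatore study):

```python
# Back-of-envelope battery-pack sizing for the all-electric mode of a
# plug-in hybrid two-wheeler. All numeric inputs are illustrative
# assumptions, not the study's measured values.

def pack_capacity_wh(all_electric_range_km, consumption_wh_per_km,
                     depth_of_discharge=0.75):
    # Energy needed for the range, scaled up so the pack is never
    # discharged below the allowed depth of discharge.
    return all_electric_range_km * consumption_wh_per_km / depth_of_discharge

# Example: 30 km daily range at 25 Wh/km with a 75% usable window.
capacity = pack_capacity_wh(30, 25)  # 1000.0 Wh
```

In practice the same bookkeeping runs in reverse to trade pack cost against all-electric range, which is the comparison the abstract makes between battery chemistries.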

Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

Next-generation sequencing technologies have led to the sequencing of more and more genomes, propelling related research into the era of big data. In this paper, we present ParaBWT, a parallelized Burrows-Wheeler transform (BWT) and suffix array construction algorithm for big genome data. In ParaBWT, we have investigated a progressive construction approach to constructing the BWT of single genome sequences in linear space complexity, but with a small constant factor. This approach has been further parallelized using multi-threading based on a master-slave coprocessing model. After gaining the BWT, the suffix array is constructed in a memory-efficient manner. The performance of ParaBWT has been evaluated using two sequences generated from two human genome assemblies: the Ensembl Homo sapiens assembly and the human reference genome. Our performance comparison to FMD-index and Bwt-disk reveals that on 12 CPU cores, ParaBWT runs up to 2.2× faster than FMD-index and up to 99.0× faster than Bwt-disk. BWT construction algorithms for very long genomic sequences are time consuming and (due to their incremental nature) inherently difficult to parallelize. Thus, their parallelization is challenging and even relatively small speedups like the ones of our method over FMD-index are of high importance to research. ParaBWT is written in C++, and is freely available at http://parabwt.sourceforge.net. PMID:27295644
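As an illustration of the data structure ParaBWT builds, here is a minimal, naive Burrows-Wheeler transform computed from a suffix array. This didactic O(n² log n) sketch is not ParaBWT's linear-space progressive algorithm, and the function names are ours:

```python
# Naive BWT construction for exposition only: sort suffixes, then read
# off the character preceding each sorted suffix.

def suffix_array(s):
    # Sort all suffix start positions by the suffix they begin.
    return sorted(range(len(s)), key=lambda i: s[i:])

def bwt(s, sentinel="$"):
    # Append a sentinel that sorts before every other character.
    s += sentinel
    sa = suffix_array(s)
    # The BWT is the character immediately preceding each sorted suffix.
    return "".join(s[i - 1] for i in sa)

print(bwt("banana"))  # -> "annb$aa"
```

Real constructors avoid materializing the suffixes (quadratic memory); the point here is only what the transform is.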

We present a computer simulation model that is a one-to-one copy of an experimental realization of Wheeler's delayed-choice experiment that employs a single-photon source and a Mach-Zehnder interferometer composed of a 50/50 input beam splitter and a variable output beam splitter with adjustable reflection coefficient R [V. Jacques, E. Wu, F. Grosshans, F. Treussart, P. Grangier, A. Aspect, J.-F. Roch, Phys. Rev. Lett. 100 (2008) 220402]. For 0⩽R⩽0.5, experimentally measured values of the interference visibility V and the path distinguishability D, a parameter quantifying the which-path information (WPI), are found to fulfill the complementarity relation V²+D²⩽1, thereby allowing partial WPI to be obtained while keeping interference with limited visibility. The simulation model, which is based solely on experimental facts, satisfies Einstein's criterion of local causality, and does not rely on any concept of quantum theory or of probability theory, reproduces quantitatively the averages calculated from quantum theory. Our results prove that it is possible to give a particle-only description of the experiment, and that one can have full WPI even if D=0 and V=1, and therefore that the relation V²+D²⩽1 cannot be regarded as quantifying the notion of complementarity.
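In the ideal lossless version of this interferometer, the textbook expressions V = 2√(R(1-R)) and D = |1-2R| saturate the bound V²+D² = 1. The sketch below checks that arithmetic numerically; it illustrates the standard result only and is not the paper's simulation model:

```python
import math

def visibility(R):
    # Ideal fringe visibility for a Mach-Zehnder with a 50/50 input
    # beam splitter and an output beam splitter of reflectivity R.
    return 2 * math.sqrt(R * (1 - R))

def distinguishability(R):
    # Which-path knowledge available from the unbalanced output splitter.
    return abs(1 - 2 * R)

for R in (0.0, 0.1, 0.25, 0.5):
    V, D = visibility(R), distinguishability(R)
    assert abs(V**2 + D**2 - 1.0) < 1e-9  # the ideal case saturates the bound
```

R = 0 (no output splitter) gives D = 1, V = 0, and R = 0.5 gives V = 1, D = 0, the two limiting behaviors discussed in the abstract.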

The Smirnoff-Wheeler (SW) pathway has been proven to be the only significant source of l-ascorbic acid (AsA; vitamin C) in seedlings of the model plant Arabidopsis thaliana. It is yet uncertain whether the same pathway holds for all other plants and their various organs, as AsA may also be synthesized through alternative pathways. In this study, we cloned some of the genes involved in the SW pathway from acerola (Malpighia glabra), a plant containing enormous amounts of AsA, and examined the expression patterns of these genes in the plant. The AsA content of acerola leaves was about 8-fold higher than that of Arabidopsis, with 5-700-fold higher mRNA abundance of the AsA-biosynthesizing genes. The unripe fruits had the highest AsA content, but accumulation was substantially repressed as the fruit transitioned to maturation. The mRNAs encoding these genes showed correlation in their expression with the AsA contents of the fruits. Although very little AsA was recorded in the seeds, the mRNAs encoding all the genes, with the exception of the mitochondrially located L-galactono-1,4-lactone dehydrogenase, were clearly detected in the seeds of the unripe fruits. In young leaves of acerola, the expression of most genes was repressed in the dark and induced by light. However, the expression of GDP-D-mannose pyrophosphorylase, similar to that encoded by A. thaliana VTC1, was induced in the dark. The expression of all the genes surged 24 h after wounding stress on the young leaves. These findings will advance the investigation into the molecular factors regulating the biosynthesis of abundant AsA in acerola. PMID:18952318

The report gives results of economic evaluations of two processes: the Rockwell International aqueous carbonate process (ACP) and the Wellman-Lord process, the latter applied to a sulfuric acid plant, the Foster Wheeler Resox process, and the Allied Chemical coal reduction process...

This dissertation examines the rise, present use, and future growth of the electric two-wheeler (E2W, a.k.a. e-scooter) in China, the world's most successful electric-drive vehicle. The E2W market has been experiencing tremendous growth, with over 30 million now in regular use on Chinese streets. The adoption of E2W technology is significant because, along with their air quality and energy (low-carbon) benefits compared to gasoline-powered motorcycles, E2Ws are driving the development of improved and lower-cost batteries and may lead to a shift toward larger three- and four-wheel electric vehicles (EVs). This dissertation explores three questions: why the E2W market grew so rapidly in China, what factors are driving and resisting its growth, and how future growth might impact the adoption of electric vehicles. In Chapter 1, the context for this analysis is built by describing China's transportation past, present, and future challenges. E2Ws are also introduced and compared with gasoline-powered motorcycles on several metrics, such as performance, air emissions, and energy use. In Chapter 2, data from the literature were collected and analyzed to understand the history of and important reasons for E2W growth in China. To supplement these data, the author and colleagues interviewed leaders of E2W and battery companies and toured several manufacturing plants. In Chapter 3, E2W and bicycle users were surveyed to understand how and why they use (or don't use) E2Ws. In Chapter 4, valve-regulated lead-acid (VRLA) batteries commonly used in today's E2Ws were laboratory tested to determine their performance characteristics. Data were also compiled on their cost, and on the cost and performance of Li-ion batteries. In Chapter 5, the future of E2Ws in China was assessed by integrating data from the previous three chapters and from the literature to create a force-field analysis of the E2W market. This chapter concludes by examining the spillover effects E2W market growth may

This study aimed to understand the non-exhaust (NE) emission of particles from wear of summer tires and concrete pavement, especially for two-wheelers and small cars. A fully enclosed laboratory-scale model was fabricated to simulate road-tire interaction, with a facility to collect particles of different sizes. A road was cast using the M-45 concrete mixture and the centrifugal casting method. It was observed that the emission of large-particle non-exhaust emissions (LPNE), as well as PM10 and PM2.5, increased with increasing load. The LPNE was 3.5 mg tire⁻¹ km⁻¹ for a two-wheeler and 6.4 mg tire⁻¹ km⁻¹ for a small car. The LPNE can lead to water pollution through run-off from the roads. The contribution of PM10 and PM2.5 was small compared to the LPNE particles (less than 0.1%). About 32 percent of the PM10 particle mass was below 1 μm. Both the number and mass size distributions for PM10 were bi-modal, with peaks at 0.3 μm and 4-5 μm. The NE emissions did not show any significant trend with change in tire pressure.
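The per-tire factors above translate into per-vehicle emission rates once a tire count is assumed; the counts below (2 for the two-wheeler, 4 for the small car) are our assumption, not stated in the study:

```python
# Per-vehicle large-particle non-exhaust (LPNE) emission implied by the
# per-tire factors reported above.

def vehicle_lpne_mg_per_km(per_tire_mg_km, n_tires):
    # Total wear emission scales linearly with the number of tires.
    return per_tire_mg_km * n_tires

two_wheeler = vehicle_lpne_mg_per_km(3.5, 2)  # ~7.0 mg/km
small_car = vehicle_lpne_mg_per_km(6.4, 4)    # ~25.6 mg/km
```

On these assumptions a small car sheds roughly 3.7 times the large-particle wear mass of a two-wheeler per kilometer.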

A modified semi-classical method is used to construct both ground and excited state solutions to the canonically quantized vacuum Bianchi IX (Mixmaster) cosmological models. Employing a modified form of the semi-classical Ansatz, we solve the relevant Wheeler-DeWitt equation asymptotically by integrating a set of linear transport equations along the flow of a suitably chosen solution to the corresponding Euclidean-signature Hamilton-Jacobi equation. For the Moncrief-Ryan (or ‘wormhole’) Hamilton-Jacobi solution, we compute the ground state quantum correction term associated with operator-ordering ambiguities and show how higher order correction terms can be computed. We also determine the explicit, leading order forms of a family of excited states and show how to compute their quantum corrections as smooth, globally defined functions on the Bianchi IX minisuperspace. These excited state solutions are peaked away from the minisuperspace origin and are labeled by a pair of positive integers that can be plausibly interpreted as graviton excitation numbers for the two independent anisotropy degrees of freedom. The Euclidean-signature semi-classical method used here is applicable to more general models, representing significant progress in the Wheeler-DeWitt approach to quantum gravity.

This paper examines Pfadt and Wheeler's (1995) suggestions that the methods of statistical process control (SPC) be incorporated into applied behavior analysis. The research strategies of SPC are examined and compared to those of applied behavior analysis. I argue that the statistical methods that are a part of SPC would likely reduce applied behavior analysts' intimate contacts with the problems with which they deal and would, therefore, likely yield poor treatment and research decisions. Examples of these kinds of results and decisions are drawn from the cases and data Pfadt and Wheeler present. This paper also describes and clarifies many common misconceptions about SPC, including W. Edwards Deming's involvement in its development, its relationship to total quality management, and its confusion with various other methods designed to detect sources of unwanted variability. PMID:7592156
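For context, a minimal sketch of the individuals (I-MR) control chart at the core of the SPC methods under discussion: control limits are the mean plus or minus 2.66 times the average moving range (2.66 is the standard I-MR constant); the data are purely illustrative:

```python
# Individuals control chart limits from an average moving range.
# Points outside [LCL, UCL] would be flagged as special-cause variation.

def control_limits(xs):
    mean = sum(xs) / len(xs)
    moving_ranges = [abs(b - a) for a, b in zip(xs, xs[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    half_width = 2.66 * mr_bar  # standard I-MR chart constant
    return mean - half_width, mean + half_width

data = [12, 14, 13, 15, 14, 13, 16, 14]  # illustrative observations
lcl, ucl = control_limits(data)
out_of_control = [x for x in data if not lcl <= x <= ucl]
```

The paper's argument is precisely that such aggregate limits can mask the case-level detail a behavior analyst needs; the sketch only shows what the method computes.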

The first committed step in the biosynthesis of l-ascorbate from d-glucose in plants requires conversion of GDP-l-galactose to l-galactose 1-phosphate by a previously unidentified enzyme. Here we show that the protein encoded by VTC2, a gene mutated in vitamin C-deficient Arabidopsis thaliana strains, is a member of the GalT/Apa1 branch of the histidine triad protein superfamily that catalyzes the conversion of GDP-l-galactose to l-galactose 1-phosphate in a reaction that consumes inorganic phosphate and produces GDP. In characterizing recombinant VTC2 from Arabidopsis thaliana as a specific GDP-l-galactose/GDP-d-glucose phosphorylase, we conclude that enzymes catalyzing each of the ten steps of the Smirnoff-Wheeler pathway from glucose to ascorbate have been identified. Finally, we identify VTC2 homologs in plants, invertebrates, and vertebrates, suggesting that a similar reaction is used widely in nature. PMID:17462988

In this paper, we consider an AdS5 bulk with k = -1 FRW branes, together with boson test particles, evolving in the 5D hyperspace. In the first part, we compute the wave function of the scalar fields in the bulk and the allowed mass spectrum for physically relevant cases. An important quantization law connecting the mass spectrum of the bosons on the brane and the bulk mass parameter is also written down. In the second part, in order to develop a quantization model, we use the Wheeler-DeWitt equation and solve its Schrödinger-like form, obtaining the wave function of the Universe. The solutions describe a universe emerging out of nothing, without tunneling. Lastly, using a mixture of states, we emphasize a smooth universe, with neither Bangs nor Crunches.
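For readers unfamiliar with the formalism, the "Schrödinger-like form" referred to here is, schematically, a zero-energy stationary equation on minisuperspace; the concrete kinetic operator and potential depend on the brane model and are not reproduced here:

```latex
% Schematic Wheeler-DeWitt constraint in minisuperspace:
% q denotes the surviving degrees of freedom (e.g. the scale factor),
% and U(q) collects curvature and matter contributions.
\[
  \hat{H}\,\Psi(q) \;=\; \Bigl[-\tfrac{\hbar^{2}}{2}\,\nabla_{q}^{2} \;+\; U(q)\Bigr]\Psi(q) \;=\; 0 .
\]
```

Treating this zero-energy constraint as a stationary Schrödinger problem is what yields the "wave function of the Universe" mentioned above.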

Motivation: With the development of high-throughput sequencing, the number of assembled genomes continues to rise. It is critical to organize and index the many assembled genomes well to support future genomics studies. The Burrows–Wheeler Transform (BWT) is an important data structure for genome indexing with many fundamental applications; however, it is still non-trivial to construct the BWT for a large collection of genomes, especially for highly similar or repetitive genomes. Moreover, the state-of-the-art approaches cannot support scalable parallel computing well owing to their incremental nature, which is a bottleneck to using modern computers to accelerate BWT construction. Results: We propose the de Bruijn branch-based BWT constructor (deBWT), a novel parallel BWT construction approach. DeBWT innovatively represents and organizes the suffixes of the input sequence with a novel data structure, the de Bruijn branch encoding. This data structure takes advantage of the de Bruijn graph to facilitate comparison between suffixes with long common prefixes, which breaks the bottleneck of BWT construction for repetitive genomic sequences. Meanwhile, deBWT also uses the structure of the de Bruijn graph to reduce unnecessary comparisons between suffixes. The benchmarking suggests that deBWT is efficient and scalable for constructing the BWT of large datasets by parallel computing. It is well-suited to indexing many genomes, such as a collection of individual human genomes, with multi-core servers or clusters. Availability and implementation: deBWT is implemented in the C language; the source code is available at https://github.com/hitbc/deBWT or https://github.com/DixianZhu/deBWT Contact: ydwang@hit.edu.cn Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27307614
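To illustrate why the BWT is worth constructing for indexing at all, here is a naive sketch of its invertibility, the property that BWT-based indexes build on. This O(n²) rotation-table reconstruction is purely didactic and unrelated to deBWT's construction algorithm:

```python
# Naive inverse BWT: repeatedly prepend the BWT column and re-sort to
# rebuild the table of sorted rotations, then read off the row that
# ends in the sentinel. Practical indexes get the same effect in O(n)
# via the LF mapping instead of rebuilding the table.

def inverse_bwt(bwt_string, sentinel="$"):
    table = [""] * len(bwt_string)
    for _ in range(len(bwt_string)):
        table = sorted(c + row for c, row in zip(bwt_string, table))
    row = next(r for r in table if r.endswith(sentinel))
    return row.rstrip(sentinel)

print(inverse_bwt("annb$aa"))  # -> "banana"
```

Because no information is lost, an index can store (a compressed form of) the BWT alone and still answer substring queries against the original genome.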

Powered two-wheeler (PTW) vehicles complying with recent European type-approval standards (stages Euro 2 and Euro 3) were tested on a chassis dynamometer in order to measure exhaust emissions of about 25 volatile organic compounds (VOCs) in the range C1-C7, including carcinogenic compounds such as benzene and 1,3-butadiene. The fleet consisted of a moped (engine capacity ≤ 50 cm³) and three fuel-injection motorcycles of different engine capacities (150, 300 and 400 cm³). Different driving conditions were tested (US FTP cycle, constant speed). Due to poor control of combustion and catalyst efficiency, the moped was the highest pollutant emitter. In fact, the fuel-injection strategy and a three-way catalyst with lambda sensor reduce motorcycles' VOC emissions by about one order of magnitude with respect to the moped. The cold-start effect, which is crucial for the assessment of the actual emissions of PTWs in urban areas, was significant: 30-51% extra emission for methane. In the investigated speed range, the moped showed a significant maximum of the VOC emission factor at minimum speed (10 km/h) and a slightly decreasing trend from 20 to 60 km/h; the motorcycles showed, on average, a less significant peak at 10 km/h, a minimum at 30-40 km/h, and then an increasing trend with a maximum emission factor at 90 km/h. Carcinogenic VOCs showed the same pattern as total VOCs. The Ozone Formation Potential (OFP) was estimated using the Maximum Incremental Reactivity scale. The greatest contribution to tropospheric ozone formation comes from the alkenes group, which accounts for 50-80% of the total OFP. The VOCs' contribution to the greenhouse effect is negligible with respect to the CO2 emitted. PMID:24095967
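The OFP estimation mentioned above is a weighted sum of species emission factors: OFP_i = E_i x MIR_i. The sketch below shows the arithmetic only; both the MIR coefficients and the emission values are illustrative assumptions, not the study's measurements:

```python
# Ozone Formation Potential via the Maximum Incremental Reactivity scale.
# MIR values (g O3 per g VOC) and emission factors (mg/km) are illustrative.

MIR = {"ethene": 9.0, "benzene": 0.72, "methane": 0.014}

def ofp(emissions_mg_km):
    # Weight each species' emission factor by its MIR coefficient.
    return {s: e * MIR[s] for s, e in emissions_mg_km.items()}

contributions = ofp({"ethene": 10.0, "benzene": 5.0, "methane": 50.0})
total = sum(contributions.values())
```

Even with these made-up numbers the alkene (ethene) term dominates the total, which mirrors the 50-80% alkene share reported in the abstract.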

This study aimed at determining the phytochemical constituents of Euphorbia golondrina L.C. Wheeler, an alien invasive medicinal herb that is used for the treatment of gastroenteritis related ailments, diabetes, conjunctivitis, gastritis, enterocolitis, tonsillitis, vaginitis, hemorrhoids, prostatism, warts and painful swellings by the Mundani people of the mount Bambouto Caldera in SouthWestern Cameroon, and to evaluate its in vitro antimicrobial and antioxidant activities. Susceptibility testing by agar well diffusion assay revealed good antibacterial activity with inhibition zone diameter of 20 ± 1.1 mm against Bacillus cereus followed by Staphylococcus aureus with inhibition zone diameter of 17 ± 1.6 mm which was significantly lower (P

The production of Pertussis Vaccine was re-evaluated at the Instituto Nacional de Higiene "Rafael Rangel" in order to optimize it in terms of vaccine yield, potency, specific toxicity and efficiency (cost per dose). Four different processes, using two culture media (Cohen-Wheeler and Fermentación Glutamato Prolina-1) and two types of bioreactors (a 25 L Fermentador Caracas and a 450 L industrial fermentor), were compared. Runs were started from freeze-dried strains (134 or 509) and continued until maximal yield was obtained. It was found that the combination of Fermentación Glutamato Prolina-1 with the industrial fermentor shortened the process to 40 hours while consistently yielding a vaccine of higher potency (7.91 +/- 2.56 IU/human dose) and lower specific toxicity in a mouse bioassay. In addition, the physical aspect of the preparation was rather homogeneous and free of dark aggregates. Most importantly, the biomass yield was more than double that of the Fermentador Caracas with either medium, and that of the industrial fermentor with the Cohen-Wheeler medium. Therefore, the cost per dose was substantially decreased. PMID:9279028

A process is described for minimizing the cracking tendency and uncontrolled dimensional change, and improving the strength, of a rammed plastic refractory reactor liner comprising phosphate-bonded silicon carbide or phosphate-bonded alumina. It consists of heating the reactor liner, placed or mounted in a reactor, prior to its first use, from ambient temperature up to a temperature of from about 490°C to about 510°C, the heating being carried out at a rate that produces a temperature increase of the liner not greater than about 6°C per hour.

This work provides an overview of our recent results in studying two of the most important and widely discussed quantum processes: electron-positron pair production off a probe photon propagating through a polarized, short-pulsed electromagnetic (e.g., laser) wave field, the generalized Breit-Wheeler process, and single-photon emission off an electron interacting with the laser pulse, so-called non-linear Compton scattering. We show that the probabilities of particle production in both processes are determined by the interplay of two dynamical effects: the first is related to the shape and duration of the pulse, and the second is the non-linear dynamics of the interaction of charged fermions with a strong electromagnetic field. We elaborate expressions for the production probabilities and cross sections convenient for studying the evolution of plasma in the presence of strong electromagnetic fields.

Young's double-slit experiment realized with particles sent one at a time through an interferometer is at the heart of quantum mechanics. The striking feature is that the phenomenon of interference, interpreted as a wave following two paths simultaneously, is incompatible with our common-sense representation of a particle following one route or the other but not both. The work described in this book is dedicated to the study of wave-particle duality for a single photon emitted by the triggered photoluminescence of a single NV color center in a diamond nanocrystal. We first present the realization of a single-photon interference experiment using a Fresnel biprism, in a scheme equivalent to the standard Young's double-slit textbook experiment. We then discuss the complementarity between interference and which-path information in this two-path interferometer. We finally describe the experimental realization of Wheeler's delayed-choice gedanken experiment, which is a fascinating and subtle illustration of wave-particle duality. In this experiment, the choice either to observe interference fringes, obviously associated with wave-like behavior, or to know which path of the interferometer has been followed, according to particle-like behavior, is made after the photon has already entered the interferometer. Furthermore, the choice is made by a quantum random number generator and is relativistically separated from the photon's entry into the interferometer. The results of that experiment show once again that no classical physical reality can be attributed to the photon independent of the measurement apparatus, as stated by the complementarity principle. Quantum theory requires abandoning certain classical pictures inherited from common sense. In particular, it stipulates a dual description of light and matter, simultaneously exhibiting the properties of a wave and of a particle, thus leading to repr...

Pressurized fluidization is a promising new technology for the clean and efficient combustion of coal. Its principle is to operate a coal combustor at high inlet gas velocity to increase the flow of reactants, at an elevated pressure to raise the overall efficiency of the process. Unfortunately, commercialization of large pressurized fluidized beds is inhibited by uncertainties in scaling up units from the current pilot plant levels. In this context, our objective is to conduct a study of the fluid dynamics and solid capture of a large pressurized coal-fired unit. The idea is to employ dimensional similitude to simulate in a cold laboratory model the flow in a pressurized circulating fluidized bed "pyrolyzer," which is part of a High Performance Power System (HIPPS) developed by Foster Wheeler Development Corporation (FWDC) under the DOE's Combustion 2000 program.

Foster Wheeler Development Corporation addressed the task of further broadening the application of the RESOX process for converting sulfur dioxide to elemental sulfur. The major effort to date had been concentrated on treating the off-gas from one specific front-end SO2 concentrator process, the Bergbau-Forschung dry adsorption system. By selecting two other concentrator processes, which furnish differing inlet compositions, and coupling these to coals with differing characteristics, the scope and flexibility of the RESOX process was evaluated. The front-end concentrator processes chosen were the Wellman-Lord (WL) and Chemico-Basic (CB); both are regenerative and employ wet scrubbing. WL is based on the chemistry of the sodium sulfite/bisulfite system, which can absorb and release SO2. CB accomplishes the same function with the magnesium oxide/sulfite system. WL can concentrate the SO2 to 85 vol %, while CB currently utilizes a direct-fired calciner, which limits the SO2 concentration to approximately 13%. Each front-end process was studied and adapted to the RESOX process. Each was then coupled to three different coals, selected as reductants through a preliminary screening procedure, for a series of tests at FWDC's 1200 ft³/h pilot plant. The coals chosen had ASTM rankings of Anthracite, High-Volatile C Bituminous, and Subbituminous A. The maximum sulfur yield realized within a series varied from 68.1 to 85.2%. The program demonstrated the ability of the RESOX process to handle a broader range of reducing agents and front-end gas compositions than heretofore tested. For each front-end process, a sulfur yield of approximately 80 wt % of the quantity available from its gas composition was realized with at least one of the reductants tested.

Just as automobiles need fuel to operate, so do nuclear reactors. When fossil fuels such as gasoline are burned to power an automobile, they are consumed immediately and nearly completely in the process. When the fuel is gone, energy production stops. Nuclear reactors are incapable of achieving this near complete burn-up because as the fuel (uranium) that powers them is burned through the process of nuclear fission, a variety of other elements are also created and become intimately associated with the uranium. Because they absorb neutrons, which energize the fission process, these accumulating fission products eventually poison the fuel by stopping the production of energy from it. The fission products may also damage the structural integrity of the fuel elements. Even though the uranium fuel is still present, sometimes in significant quantities, it is unburnable and will not power a reactor unless it is separated from the neutron-absorbing fission products by a method called fuel reprocessing. Construction of the Fuel Reprocessing Complex at the Chem Plant started in 1950 with the Bechtel Corporation serving as construction contractor and American Cyanamid Company as operating contractor. Although the Foster Wheeler Corporation assumed responsibility for the detailed working design of the overall plant, scientists at Oak Ridge designed all of the equipment that would be employed in the uranium separations process. After three years of construction activity and extensive testing, the plant was ready to handle its first load of irradiated fuel.

Presents an excerpt from the book entitled "Lonely Hearts of the Cosmos." Provides narration of behind-the-scenes events in the lives, the scientific debates, and the intellectual triumphs of the two physicists responsible for inventing the concept of the black hole. (JJK)

The wave-particle dual nature of light and matter and the fact that the choice of measurement determines which one of these two seemingly incompatible behaviours we observe are examples of the counterintuitive features of quantum mechanics. They are illustrated by Wheeler’s famous `delayed-choice’ experiment, recently demonstrated in a single-photon experiment. Here, we use a single ultracold metastable helium atom in a Mach-Zehnder interferometer to create an atomic analogue of Wheeler’s original proposal. Our experiment confirms Bohr’s view that it does not make sense to ascribe the wave or particle behaviour to a massive particle before the measurement takes place. This result is encouraging for current work towards entanglement and Bell’s theorem tests in macroscopic systems of massive particles.

We revise the taxonomy of the exclusively Neotropical Myrmicinae ant genus Blepharidatta (Attini), redescribing the known species (B. brasiliensis and B. conops), and describing two new species, B. delabiei sp. n. (Brazil: Bahia) and B. fernandezi sp. n. (Colombia: Amazonas). We also describe worker sting apparatuses, larvae, males, and ergatoid gynes of all species, except for B. fernandezi, known only from a few worker specimens; we provide a key for identifying workers, present distribution maps for all species, and summarize the knowledge on the biology of Blepharidatta species. PMID:26623844

A combined-cycle High Performance Power System (HIPPS) capable of overall cycle efficiencies approaching 50% has been proposed and designed by Foster Wheeler Development Corporation (FWDC). A pyrolyzer in the first stage of the HIPPS process converts a coal feedstock into fuel gas and char at an elevated pressure of 1.4 MPa (206 psia) and elevated temperature of 930 °C (1700 °F). The generated char serves as the feedstock for a Pulverized Coal (PC) boiler operating at atmospheric pressure, and the fuel gas is directly fired in a gas turbine. The hydrodynamic behavior of the pyrolyzer strongly influences the quality of both the fuel gas and the generated char, the energy split between the gas turbine and the steam turbine, and hence the overall efficiency of the system. By utilizing a simplified set of scaling parameters (Glicksman et al., 1993), a 4/7th-scale laboratory cold model of the pyrolyzer operating at ambient temperature and pressure was constructed and tested. The scaling parameters matched include solid-to-gas density ratio, Froude number, length-to-diameter ratio, dimensionless superficial gas velocity and solid recycle rate, particle sphericity, and particle size distribution (PSD).
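Matching the Froude number across scales fixes how the cold model's superficial velocity must shrink: with Fr = U²/(gL), a model at 4/7 length scale needs its velocity scaled by √(4/7). A minimal sketch, using hypothetical hot-unit numbers rather than values from the FWDC design:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def froude(u, length):
    """Froude number Fr = U^2 / (g * L), one of the simplified scaling parameters."""
    return u ** 2 / (G * length)

# Hypothetical hot-pyrolyzer values, for illustration only.
l_hot, u_hot = 7.0, 5.0            # characteristic length (m), superficial velocity (m/s)
scale = 4.0 / 7.0                  # geometric scale of the cold model
l_cold = l_hot * scale
u_cold = u_hot * math.sqrt(scale)  # velocity must scale with sqrt(L) to hold Fr fixed

print(math.isclose(froude(u_hot, l_hot), froude(u_cold, l_cold)))  # -> True
```

The other parameters in the matched set (density ratio, sphericity, PSD) are dimensionless already and carry over directly.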

This publication provides an introduction to meat processing for adult students in vocational and technical education programs. Organized in four chapters, the booklet provides a brief overview of the meat processing industry and the techniques of meat processing and butchering. The first chapter introduces the meat processing industry and…

This slide presentation details the shuttle processing flow, which starts with wheel stop and ends with launch. After landing, the orbiter is rolled into the Orbiter Processing Facility (OPF), where processing is performed; it is then rolled over to the Vehicle Assembly Building (VAB), where it is mated with the propellant tanks and the payloads are installed. A different flow is detailed if the weather at Kennedy Space Center requires a landing at Dryden.

A review of the literature published in 2014 on anaerobic processes. It is divided into the following sections: pretreatment, organic waste, multiple-stage co-digestion, and process methodology and technology. PMID:26420080

Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs.

This patent describes a process for manufacturing char and hydrocarbons from discarded used tires. The process consists of: introducing the substantially whole tires into a reactor; pyrolyzing the substantially whole tires in a reaction chamber continuously at a temperature and pressure and for a reaction time sufficient to cause the tires to dissociate into a vapor and a solid phase; the pyrolyzing step including directly heating the tires with a radiant heat source at temperatures of 1000° to 3000°F; producing char from the solid phase; and processing the vapor phase to produce hydrocarbons.

A review of the literature published in 2015 on topics relating to disinfection processes is presented. This review is divided into the following sections: disinfection methods, disinfection byproducts, and microbiology and microbial communities. PMID:27620087

The Gordon Research Conference (GRC) on MULTIPHOTON PROCESSES was held at Tilton School, Tilton, NH. Emphasis was placed on current unpublished research and discussion of the future target areas in this field.

The NCI Grants Process provides an overview of the end-to-end lifecycle of grant funding. Learn about the types of funding available and the basics for application, review, award, and on-going administration within the NCI.

Nature's enzymes are an ongoing source of inspiration for scientists. The complex processes behind their selectivity and efficiency are slowly being unraveled, and these findings have spawned many biomimetic catalysts. However, nearly all focus on the conversion of small molecular substrates. Nature itself is replete with inventive catalytic systems which modify, replicate, or decompose entire polymers, often in a processive fashion. Such processivity can, for example, enhance the rate of catalysis by clamping to the polymer substrate, which imparts a large effective molarity. Reviewed herein are the various strategies for processivity in nature's arsenal and their properties. An overview of what has been achieved by chemists aiming to mimic one of nature's greatest tricks is also included. PMID:25244684

Describes the kinds of computer equipment needed for a personal word processing system. The characteristics and capabilities of specific devices, including keyboards, printers, and disk drives, are discussed. (JL)

This anodizing process traces its origin to the 1960's when Reynolds Metals Company, under contract with Goddard Space Flight Center, developed a multipurpose anodizing electrolyte (MAE) process to produce a hard protective finish for spacecraft aluminum. MAE produces a high-density, abrasion-resistant film prior to the coloring step, in which the pores of the film are impregnated with a metallic form of salt. Tru-Color product applications include building fronts, railing, curtain walls, doors and windows.

Processing of electric power has been presented as a discipline that draws on almost every field of electrical engineering, including system and control theory, communications theory, electronic network design, and power component technology. The cost of power processing equipment, which often equals that of expensive, sophisticated, and unconventional sources of electrical energy, such as solar batteries, is a significant consideration in the choice of electric power systems.

Launch Weather Officers (LWOs) from the 45th Weather Squadron (45 WS) and forecasters from the National Weather Service (NWS) Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violating the Lightning Launch Commit Criteria (LLCC) (Krider et al. 2006; Space Shuttle Flight Rules (FR), NASA/JSC 2004). As a result, the Applied Meteorology Unit (AMU) developed a tool that creates an anvil threat corridor graphic that can be overlaid on satellite imagery using the Meteorological Interactive Data Display System (MIDDS; Short and Wheeler, 2002). The tool helps forecasters estimate the locations of thunderstorm anvils at one, two, and three hours into the future. It has been used extensively in launch and landing operations by both the 45 WS and SMG. The Advanced Weather Interactive Processing System (AWIPS) is now used along with MIDDS for weather analysis and display at SMG. In Phase I of this task, SMG tasked the AMU to transition the tool from MIDDS to AWIPS (Barrett et al., 2007). For Phase II, SMG requested the AMU make the Anvil Forecast Tool in AWIPS more configurable by creating the capability to read model gridded data from user-defined model files instead of hard-coded files. An NWS local AWIPS application called AGRID was used to accomplish this. In addition, SMG needed to be able to define the pressure levels for the model data, instead of hard-coding the bottom level as 300 mb and the top level as 150 mb. This paper describes the initial development of the Anvil Forecast Tool for MIDDS, followed by the migration of the tool to AWIPS in Phase I. It then gives a detailed presentation of the Phase II improvements to the AWIPS tool.
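At its core an anvil threat corridor is simple advection: project the storm location downwind at the upper-level average wind for one, two, and three hours. A rough flat-earth sketch of that idea, with made-up coordinates and wind values (the real tool works on gridded model data inside AWIPS):

```python
import math

def advect(lat, lon, wind_speed_ms, wind_from_deg, hours):
    """Project a point downwind using a flat-earth, small-distance approximation."""
    heading = math.radians((wind_from_deg + 180.0) % 360.0)  # direction of motion
    dist_km = wind_speed_ms * 3600.0 * hours / 1000.0
    dlat = dist_km * math.cos(heading) / 111.0               # ~111 km per degree latitude
    dlon = dist_km * math.sin(heading) / (111.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

# Hypothetical cell near the Cape with a 20 m/s wind from the west:
for h in (1, 2, 3):
    print(advect(28.5, -80.6, 20.0, 270.0, h))  # corridor drifts east
```

Connecting the one-, two-, and three-hour points with a width margin yields the corridor polygon drawn over the satellite image.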

The ChemScan UV-6100 is a spectrometry system originally developed by Biotronics Technologies, Inc. under a Small Business Innovation Research (SBIR) contract. It is marketed to the water and wastewater treatment industries, replacing "grab sampling" with on-line data collection. It analyzes the light absorbance characteristics of a water sample, simultaneously detects hundreds of individual wavelengths absorbed by chemical substances in a process solution, and quantifies the information. Spectral data are then processed by the ChemScan analyzer and compared with calibration files in the system's memory in order to calculate concentrations of chemical substances that cause UV light absorbance in specific patterns. Monitored substances can be analyzed for quality and quantity. Applications include detection of a variety of substances, and the information provided enables an operator to control a process more efficiently.
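Turning a multi-wavelength absorbance vector into concentrations is, at heart, a linear inversion of the Beer-Lambert law (A = E·c). A toy two-substance, two-wavelength version solved by Cramer's rule; the calibration matrix and absorbances are invented for illustration, not ChemScan's actual data:

```python
def concentrations(E, A):
    """Solve a 2-substance / 2-wavelength Beer-Lambert system A = E @ c.

    Each absorbance is a weighted sum of concentrations; E holds the
    molar absorptivities (the 'calibration file' of this toy model).
    """
    (e11, e12), (e21, e22) = E
    a1, a2 = A
    det = e11 * e22 - e12 * e21
    c1 = (a1 * e22 - e12 * a2) / det
    c2 = (e11 * a2 - a1 * e21) / det
    return c1, c2

# Hypothetical calibration matrix and the absorbances a 0.5/1.2 mixture produces:
E = [[0.90, 0.10],
     [0.20, 0.80]]
c_true = (0.5, 1.2)
A = [E[0][0] * c_true[0] + E[0][1] * c_true[1],
     E[1][0] * c_true[0] + E[1][1] * c_true[1]]
print(concentrations(E, A))  # recovers (0.5, 1.2)
```

With hundreds of wavelengths and several analytes the same inversion becomes a least-squares fit, but the principle is unchanged.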

A process for the fluorination of uranium metal is described. It is known that uranium will react with liquid chlorine trifluoride but the reaction proceeds at a slow rate. However, a mixture of a halogen trifluoride together with hydrogen fluoride reacts with uranium at a significantly faster rate than does a halogen trifluoride alone. Bromine trifluoride is suitable for use in the process, but chlorine trifluoride is preferred. Particularly suitable is a mixture of ClF3 and HF having a mole ratio (moles

The primary thrust of the semiconductor processing is outlined. The purpose is (1) to advance the theoretical basis for bulk growth of elemental and compound semiconductors in single crystal form, and (2) to develop new experimental approaches by which semiconductor matrices with significantly improved crystalline and chemical perfection can be obtained. The most advanced approaches to silicon crystal growth are studied. The projected research expansion, directed toward the capability of growing 4-inch-diameter silicon crystals, was implemented. Both intra- and interdepartmental programs are established in the areas of process metallurgy, heat transfer, mass transfer, and systems control. Solutal convection in melt growth systems is also studied.

A process is described for manufacturing carbon black and hydrocarbons from discarded tires, comprising: introducing the tires into a reactor; pyrolyzing the tires in a pyrolysis reaction vessel substantially in the absence of artificially introduced oil heating media at a temperature and pressure and for a reaction time sufficient to cause the tires to dissociate into a vapor phase and a solid phase; the pyrolyzing step including directly, internally heating the tires in the reaction vessel using microwave energy; producing carbon black from the solid phase; and processing the vapor phase to produce hydrocarbons.

A black chrome coating, originally developed for spacecraft solar cells, led to the development of an efficient flat plate solar collector. The coating, called Chromonyx, helps the collector absorb more heat. Olympic Solar Corporation was formed to electroplate the collector. The coating technique allows 95% of the sun's energy to be utilized. The process is widely used.

This version of Texaco's gasification process for high-ash-content solids is not extended to include the production of superheated steam, as described in US Patent 4,247,302. The hot, raw gas stream passes through fewer coolers, producing high-pressure steam instead of superheated steam.

This publication contains instructional materials for teacher and student use for a course in information processing. The materials are written in terms of student performance using measurable objectives. The course includes 10 units. Each instructional unit contains some or all of the basic components of a unit of instruction: performance…

A process for separating tetravalent plutonium from aqueous solutions and from niobium and zirconium by precipitation on lanthanum oxalate is described. The oxalate ions of the precipitate may be decomposed by heating in the presence of an oxidizing agent, forming a plutonium compound readily soluble in acid. (AEC)

In the accompanying photos, a laboratory technician is restoring the once-obliterated serial number of a revolver. The four-photo sequence shows the gradual progression from total invisibility to clear readability. The technician is using a new process developed in an applications engineering project conducted by NASA's Lewis Research Center in conjunction with Chicago State University. Serial numbers and other markings are frequently eliminated from metal objects to prevent tracing ownership of guns, motor vehicles, bicycles, cameras, appliances and jewelry. To restore obliterated numbers, crime laboratory investigators most often employ a chemical etching technique. It is effective, but it may cause metal corrosion and it requires extensive preparatory grinding and polishing. The NASA-Chicago State process is advantageous because it can be applied without variation to any kind of metal, it needs no preparatory work and number recovery can be accomplished without corrosive chemicals; the liquid used is water.

This library is used to get process information (e.g., memory and timing). By setting an environment variable, the runtime system loads libprocmon.so while loading your executable. This library causes the SIGPROF signal to be triggered at time intervals. The procmon signal handler calls various system routines (e.g., clock_gettime, mallinfo, getrusage, and ioctl, accessing the /proc filesystem) to gather information about the process. The information is then printed to a file which can be viewed graphically via procmon_plot.pl. This information is obtained via a sampling approach. As with any sampling approach, the information it gathers will not be completely accurate. For example, if you are looking at the memory high-water mark, the memory allocation and freeing could have occurred between samples and thus would not be "seen" by this program. See "Usage" below for environment variables that affect this monitor (e.g., time between sampling).
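The sampling idea can be sketched in a few lines of Python: hook SIGPROF to a handler that records statistics each time the interval timer fires. This is an illustrative analogue of what the library does, not its implementation; it uses resource.getrusage, where the real library also reads /proc. As the text notes, anything that happens between two samples goes unseen.

```python
import resource
import signal

class ProcMonitor:
    """Sketch of the sampling approach: a SIGPROF handler records process
    statistics; peaks between samples are missed, as described above."""

    def __init__(self):
        self.samples = []

    def sample(self, signum=None, frame=None):
        usage = resource.getrusage(resource.RUSAGE_SELF)
        self.samples.append({"max_rss_kb": usage.ru_maxrss,
                             "user_cpu_s": usage.ru_utime})

    def start(self, interval_s=0.1):
        signal.signal(signal.SIGPROF, self.sample)          # fires on CPU time
        signal.setitimer(signal.ITIMER_PROF, interval_s, interval_s)

    def stop(self):
        signal.setitimer(signal.ITIMER_PROF, 0)

mon = ProcMonitor()
mon.sample()  # take one sample by hand; no timer needed for the demo
print(mon.samples[0]["max_rss_kb"] > 0)
```

ITIMER_PROF counts CPU time rather than wall-clock time, which is why a busy process is sampled more often than an idle one.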

Solar radiation and the processes that control its deposition in the Earth's atmosphere are considered. The published data obtained since 1978 define a reference solar spectral irradiance for use in atmospheric chemical and dynamical studies, while long-term satellite measurements are now providing information on variations in the output of the Sun over a range of time scales. Concerning absorption of solar radiation in the atmosphere, new cross-section data for molecular oxygen and ozone are now available. Line-by-line calculations used to predict infrared flux divergences are examined, with regard both to the assumptions made in radiative transfer calculations and to the spectroscopic parameters used as inputs. Also examined are the influence of radiative processes on planetary-scale wave activity, photochemical acceleration of radiative damping, and the breakdown of local thermodynamic equilibrium at mesospheric altitudes.

Fermentation process consists essentially of fermenting a 10-45% w/w aqueous slurry of granular starch for the production of ethanol with an ethanol-producing microorganism in the presence of alpha-amylase and glucoamylase, the conduct of said fermentation being characterized by low levels of dextrin and fermentable sugars in solution in the fermentation broth throughout the fermentation, and thereafter recovering enzymes from the fermentation broth for use anew in fermentation of granular starch.

An improved process for producing a methane-enriched gas wherein a hydrogen-deficient carbonaceous material is treated with a hydrogen-containing pyrolysis gas at an elevated temperature and pressure to produce a product gas mixture including methane, carbon monoxide and hydrogen. The improvement comprises passing the product gas mixture sequentially through a water-gas shift reaction zone and a gas separation zone to provide separate gas streams of methane and of a recycle gas comprising hydrogen, carbon monoxide and methane for recycle to the process. A controlled amount of steam also is provided which when combined with the recycle gas provides a pyrolysis gas for treatment of additional hydrogen-deficient carbonaceous material. The amount of steam used and the conditions within the water-gas shift reaction zone and gas separation zone are controlled to obtain a steady-state composition of pyrolysis gas which will comprise hydrogen as the principal constituent and a minor amount of carbon monoxide, steam and methane so that no external source of hydrogen is needed to supply the hydrogen requirements of the process. In accordance with a particularly preferred embodiment, conditions are controlled such that there also is produced a significant quantity of benzene as a valuable coproduct.
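The recycle bookkeeping above hinges on the water-gas shift reaction, CO + H2O → CO2 + H2, which trades carbon monoxide for hydrogen ahead of the separation zone. A minimal mole-balance sketch with invented stream numbers (not figures from the patent):

```python
def water_gas_shift(moles, extent):
    """Apply CO + H2O -> CO2 + H2 to a gas mixture for a given extent (mol)."""
    out = dict(moles)
    out["CO"] = out["CO"] - extent            # one CO consumed per mole of extent
    out["H2O"] = out["H2O"] - extent          # one steam consumed
    out["CO2"] = out.get("CO2", 0.0) + extent # one CO2 produced
    out["H2"] = out.get("H2", 0.0) + extent   # one H2 produced
    return out

# Hypothetical product-gas stream (mol), for illustration only:
product_gas = {"CH4": 2.0, "CO": 1.0, "H2": 3.0, "H2O": 1.5}
shifted = water_gas_shift(product_gas, 0.8)
print(shifted)  # hydrogen-enriched, CO-depleted gas headed for separation
```

Tuning the extent (via steam addition and shift-zone conditions) is what lets the process reach a steady-state recycle gas with hydrogen as the principal constituent and no external hydrogen source.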

Ceramics represent a unique class of materials that are distinguished from common metals and plastics by their: (1) high hardness, stiffness, and good wear properties (i.e., abrasion resistance); (2) ability to withstand high temperatures (i.e., refractoriness); (3) chemical durability; and (4) electrical properties that allow them to be electrical insulators, semiconductors, or ionic conductors. Ceramics can be broken down into two general categories, traditional and advanced ceramics. Traditional ceramics include common household products such as clay pots, tiles, pipe, and bricks, porcelain china, sinks, and electrical insulators, and thermally insulating refractory bricks for ovens and fireplaces. Advanced ceramics, also referred to as ''high-tech'' ceramics, include products such as spark plug bodies, piston rings, catalyst supports, and water pump seals for automobiles, thermally insulating tiles for the space shuttle, sodium vapor lamp tubes in streetlights, and the capacitors, resistors, transducers, and varistors in the solid-state electronics we use daily. The major differences between traditional and advanced ceramics are in the processing tolerances and cost. Traditional ceramics are manufactured with inexpensive raw materials, are relatively tolerant of minor process deviations, and are relatively inexpensive. Advanced ceramics are typically made with more refined raw materials and processing to optimize a given property or combination of properties (e.g., mechanical, electrical, dielectric, optical, thermal, physical, and/or magnetic) for a given application. Advanced ceramics generally have improved performance and reliability over traditional ceramics, but are typically more expensive. Additionally, advanced ceramics are typically more sensitive to the chemical and physical defects present in the starting raw materials, or those that are introduced during manufacturing.

To implement the analysis techniques and to provide end-to-end processing, a system was designed with the following capabilities: receive and catalog data from many sources; organize the data on mass storage for rapid access; edit for reasonableness; create new data sets by sorting on parameter, averaging and merging; provide statistical analysis and display tools; and distribute data on demand. Consideration was given to developing a flexible system that could meet immediate workshop needs and respond to future requirements. System architecture and data set details implemented are discussed.

A new spinoff product was derived from Geospectra Corporation's expertise in processing LANDSAT data in a software package. Called ATOM (for Automatic Topographic Mapping), it's capable of digitally extracting elevation information from stereo photos taken by spaceborne cameras. ATOM offers a new dimension of realism in applications involving terrain simulations, producing extremely precise maps of an area's elevations at a lower cost than traditional methods. ATOM has a number of applications involving defense training simulations and offers utility in architecture, urban planning, forestry, petroleum and mineral exploration.
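Elevation extraction from a stereo pair rests on the classical parallax relation h = H·Δp/(b + Δp): the higher a feature stands, the larger its displacement between the two photos. A back-of-the-envelope sketch with made-up frame geometry (ATOM performs this photogrammetry digitally across entire scenes):

```python
def elevation_from_parallax(flying_height_m, base_parallax_mm, d_parallax_mm):
    """Height above datum from differential parallax: h = H * dp / (b + dp)."""
    return flying_height_m * d_parallax_mm / (base_parallax_mm + d_parallax_mm)

# Hypothetical values: 3000 m flying height, 90 mm photo-base parallax,
# and a 3 mm measured parallax difference for the feature of interest.
h = elevation_from_parallax(3000.0, 90.0, 3.0)
print(round(h, 1))  # -> 96.8 (metres)
```

Automating the measurement of Δp for every pixel via image matching is what turns this single-point formula into a full digital elevation model.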

This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The main objective was to improve understanding of the origin and evolution of the Earth's lithosphere by studying selected processes, such as deformation and magmatic intrusion during crustal extension, formation and extraction of mantle melts, fluid transport of heat and mass, and surface processes that respond to deep-seated events. Additional objectives were to promote and develop innovative techniques and to support relevant educational endeavors. Seismic studies suggest that underplating of crust by mantle melts is an important crustal-growth mechanism, that low-angle faults can be seismogenic, and that shear deformation creates mantle anisotropy near plate boundaries. Results of geochemical work determined that magmas from oceanic intraplate islands are derived from a uniform depth in the upper mantle, whereas melts erupted at mid-ocean ridges are mixed from a range of depths. The authors have determined the extent and style of fluid infiltration and trace-element distribution in natural magmatic systems, and, finally, investigated ²¹Ne as a tool for dating of surficial materials.

A liquid phase process is described for oligomerization of C4 and C5 isoolefins or the etherification thereof with C1 to C6 alcohols wherein the reactants are contacted in a reactor with a fixed bed acid cation exchange resin catalyst at an LHSV of 5 to 20, pressure of 0 to 400 psig and temperature of 120 to 300 °F, wherein the improvement is the operation of the reactor at a pressure to maintain the reaction mixture at its boiling point whereby at least a portion but less than all of the reaction mixture is vaporized. By operating at the boiling point and allowing a portion of the reaction mixture to vaporize, the exothermic heat of reaction is dissipated by the formation of more boil-up and the temperature in the reactor is controlled. 2 figs.

A method of joining metal parts for the preparation of relatively long, thin fuel element cores of uranium or alloys thereof for nuclear reactors is described. The process includes the steps of cleaning the surfaces to be joined, placing the surfaces together, and providing between and in contact with them a layer of a compound, in finely divided form, that is decomposable to metal by heat. The fuel element members are then heated at the contact zone and maintained under pressure during the heating to decompose the compound to metal and sinter the members and reduced metal together, producing a weld. The preferred class of decomposable compounds is the metal hydrides, such as uranium hydride, which release hydrogen, thus providing a reducing atmosphere in the vicinity of the welding operation.

Electronic Imagery, Inc.'s ImageScale Plus software, developed through a Small Business Innovation Research (SBIR) contract with Kennedy Space Flight Center for use on space shuttle Orbiter in 1991, enables astronauts to conduct image processing, prepare electronic still camera images in orbit, display them and downlink images to ground based scientists for evaluation. Electronic Imagery, Inc.'s ImageCount, a spin-off product of ImageScale Plus, is used to count trees in Florida orange groves. Other applications include x-ray and MRI imagery, textile designs and special effects for movies. As of 1/28/98, company could not be located, therefore contact/product information is no longer valid.

An improved crystallization process is disclosed for separating a crystallizable material and an excluded material which is at least partially excluded from the solid phase of the crystallizable material obtained upon freezing a liquid phase of the materials. The solid phase is more dense than the liquid phase, and it is separated therefrom by relative movement with the formation of a packed bed of solid phase. The packed bed is continuously formed adjacent its lower end and passed from the liquid phase into a countercurrent flow of backwash liquid. The packed bed extends through the level of the backwash liquid to provide a drained bed of solid phase adjacent its upper end which is melted by a condensing vapor.

A liquid phase process for oligomerization of C4 and C5 isoolefins or the etherification thereof with C1 to C6 alcohols wherein the reactants are contacted in a reactor with a fixed bed acid cation exchange resin catalyst at an LHSV of 5 to 20, pressure of 0 to 400 psig and temperature of 120 to 300 °F, wherein the improvement is the operation of the reactor at a pressure to maintain the reaction mixture at its boiling point whereby at least a portion but less than all of the reaction mixture is vaporized. By operating at the boiling point and allowing a portion of the reaction mixture to vaporize, the exothermic heat of reaction is dissipated by the formation of more boil-up and the temperature in the reactor is controlled.
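LHSV (liquid hourly space velocity) is simply feed volume per catalyst volume per hour, so the claimed 5 to 20 h⁻¹ window pins down the allowable feed rate for a given bed. A sketch with an invented bed size, just to make the units concrete:

```python
def lhsv(feed_m3_per_h, catalyst_bed_m3):
    """Liquid hourly space velocity (h^-1): feed volume per bed volume per hour."""
    return feed_m3_per_h / catalyst_bed_m3

bed = 2.0  # m^3, hypothetical fixed bed of acid cation exchange resin
print(lhsv(10.0, bed), lhsv(40.0, bed))  # -> 5.0 20.0, the ends of the claimed range
```

Equivalently, 1/LHSV is the nominal residence time of the liquid in the bed: 12 minutes at LHSV 5, 3 minutes at LHSV 20.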

The authors used geophysical, geochemical, and numerical modeling to study selected problems related to Earth's lithosphere. We interpreted seismic waves to better characterize the thickness and properties of the crust and lithosphere. In the southwestern US and the Tien Shan, crust of high elevation is dynamically supported above buoyant mantle. In California, mineral fabrics in the mantle correlate with regional strain history. Although plumes of buoyant mantle may explain surface deformation and magmatism, our geochemical work does not support this mechanism for Iberia. Generation and ascent of magmas remain puzzling. Our work in Hawaii constrains the residence time of magma beneath Hualalai to a few hundred to about 1000 years. In the crust, heat drives fluid and mass transport. Numerical modeling yielded robust and accurate predictions of these processes. This work is important fundamental science, and applies to mitigation of volcanic and earthquake hazards, Test Ban Treaties, nuclear waste storage, environmental remediation, and hydrothermal energy.

Under a NASA Small Business Innovation Research (SBIR) contract, Axiomatics Corporation developed a shunting dielectric sensor to determine the nutrient level and analyze plant nutrient solutions in the CELSS, NASA's space life support program. (CELSS is an experimental facility investigating closed-cycle plant growth and food processing for long-duration manned missions.) The DiComp system incorporates a shunt electrode and is especially sensitive to changes in the dielectric properties of materials, at levels much lower than conventional sensors can measure. The analyzer has exceptional capabilities for predicting the composition of liquid streams or reactions. It measures concentrations and solids content up to 100 percent in applications like agricultural products, petrochemicals, food and beverages. The sensor is easily installed; maintenance is low, and it can be calibrated on line. The software automates data collection and analysis.

A new area of biology has been opened up by nanoscale exploration of the living world. This has been made possible by technological progress, which has provided the tools needed to make devices that can measure things on such length and time scales. In a sense, this is a new window upon the living world, so rich and so diverse. Many of the investigative methods described in this book seek to obtain complementary physical, chemical, and biological data to understand the way it works and the way it is organised. At these length and time scales, only dedicated instrumentation could apprehend the relevant phenomena. There is no way for our senses to observe these things directly. One important field of application is molecular medicine, which aims to explain the mechanisms of life and disease by the presence and quantification of specific molecular entities. This involves combining information about genes, proteins, cells, and organs. This in turn requires the association of instruments for molecular diagnosis, either in vitro, e.g., the microarray or the lab-on-a-chip, or in vivo, e.g., probes for molecular biopsy, and tools for molecular imaging, used to localise molecular information in living organisms in a non-invasive way. These considerations concern both preclinical research for drug design and human medical applications. With the development of DNA and RNA chips [1], genomics has revolutionised investigative methods for cells and cell processes [2,3]. By sequencing the human genome, new ways have been found for understanding the fundamental mechanisms of life [4]. A revolution is currently under way with the analysis of the proteome [5-8], i.e., the complete set of proteins that can be found in some given biological medium, such as the blood plasma. The goal is to characterise certain diseases by recognisable signatures in the proteomic profile, as determined from a blood sample or a biopsy, for example [9-13]. What is at stake is the early detection of

A three-semester, 60-credit course package on the topic of Administrative Data Processing (ADP), offered in 1966 at Stockholm University (SU) and the Royal Institute of Technology (KTH), is described. The package had an information systems engineering orientation. The first semester focused on datalogical topics, while the second semester focused on infological topics. The third semester aimed to deepen the students’ knowledge of different parts of ADP and culminated in the writing of a bachelor thesis. The concluding section of this paper discusses various aspects of the department’s first course effort. The course package led to a concretisation of our discipline and gave our discipline an identity. Our education seemed modern, “just in time”, and well adapted to practical needs. The course package formed the first concrete activity of a group of young teachers and researchers. In a forty-year perspective, these people have further developed the department and the topic into an internationally well-reputed body of knowledge and research. The department has produced more than thirty professors and more than one hundred doctoral degrees.

What is Hydrothermal Circulation? Hydrothermal circulation occurs when seawater percolates downward through fractured ocean crust along the volcanic mid-ocean ridge (MOR) system. The seawater is first heated and then undergoes chemical modification through reaction with the host rock as it continues downward, reaching maximum temperatures that can exceed 400 °C. At these temperatures the fluids become extremely buoyant and rise rapidly back to the seafloor where they are expelled into the overlying water column. Seafloor hydrothermal circulation plays a significant role in the cycling of energy and mass between the solid earth and the oceans; the first identification of submarine hydrothermal vents and their accompanying chemosynthetically based communities in the late 1970s remains one of the most exciting discoveries in modern science. The existence of some form of hydrothermal circulation had been predicted almost as soon as the significance of ridges themselves was first recognized, with the emergence of plate tectonic theory. Magma wells up from the Earth's interior along "spreading centers" or "MORs" to produce fresh ocean crust at a rate of ˜20 km3 yr-1, forming new seafloor at a rate of ˜3.3 km2 yr-1 (Parsons, 1981; White et al., 1992). The young oceanic lithosphere formed in this way cools as it moves away from the ridge crest. Although much of this cooling occurs by upward conduction of heat through the lithosphere, early heat-flow studies quickly established that a significant proportion of the total heat flux must also occur via some additional convective process (Figure 1), i.e., through circulation of cold seawater within the upper ocean crust (Anderson and Silbeck, 1981). Figure 1. Oceanic heat flow versus age of ocean crust. Data from the Pacific, Atlantic, and Indian oceans, averaged over 2 Ma intervals (circles) depart from the theoretical cooling curve (solid line) indicating convective cooling of young ocean crust by circulating seawater

In 1998, the Foster Wheeler Environmental Corporation (FWENC) was awarded an 11-year contract to treat transuranic waste at the Oak Ridge National Laboratory, including Melton Valley Storage Tank (MVST) waste. Their baseline tank waste process consists of: (1) Separating the supernate from the sludge, (2) Washing the sludge with water and adding this wash water to the supernate, (3) Stabilizing the supernate/wash water or the washed sludge with additives if either is projected to fail Resource Conservation and Recovery Act (RCRA) Toxicity Characteristic Leaching Procedure (TCLP) criteria, and (4) Stabilizing both the washed sludge and supernate/wash water by vacuum evaporation. An "Optimum" treatment procedure consisted of adding a specified quantity of two stabilizers, ThioRed® and ET Soil Polymer®, and an "Alternate" treatment simply increased the amount of ThioRed® added. This report presents the results of a study funded by the Tanks Focus Area (TFA) to provide Oak Ridge Operations (ORO) with independent laboratory data on the performance of the baseline process for treating the sludges, including washing the sludge and treating the wash water (although supernates were not included in the wash water tests). Two surrogate and seven actual tank wastes were used in this evaluation. Surrogate work, as well as the initial work with actual tank sludge, was based on an existing sludge sample from Bethel Valley Evaporator Storage Tank (BVEST) W23. One surrogate was required to be based on a surrogate previously developed to mimic the weighted average chemical composition of the MVST-BVEST using a simple mix of reagent-grade chemicals and water, called the "Quick and Dirty" surrogate (QnD). The composition of this surrogate was adjusted toward the measured composition of W23 samples. The other surrogate was prepared to be more representative of the W23 sludge sample by precipitation of a nitrate solution at high pH, separating the solution

Electrical energy is being used to process foods. In conventional food processing plants, electricity drives mechanical devices and controls the degree of processing. In recent years, several processing technologies have been developed to process foods directly with electricity. Electrotechnologies use...

Reservoir sediments near the Decatur, Alabama, industrial waterfront were screened for acute (9-day) toxicity to 8-day-old freshwater mussels in 1990. Only two locations, designated as stations Alpha (TRM 303.4L, Dry Branch Embayment) and Delta (TRM 301.1L), were found to be toxic to mussels. Toxicity correlated strongly with unionized ammonia measured daily in porewater samples during the study. A definitive study was conducted in 1991 to determine the persistence of the toxicity observed in 1990 at the two locations and to determine the levels (magnitude) of toxicity and the role of sediment ammonia and/or other sediment contaminants. The 1991 TVA study, reported herein, found recurring (persistent) toxicity at both locations. Porewater was toxic at concentrations of 100, 50, and 25 percent. Toxicity was not observed in samples diluted to 12.5 percent.

The main objectives of the project are to provide performance and environmental data for the design of a PCFB demonstration project and to evaluate the Westinghouse advanced ceramic barrier filter system and candle materials. A total test duration of 1,000 to 1,500 hrs in three segments of 500 hrs each has been planned for evaluating the filter unit. A single-cluster Westinghouse hot gas candle filter is being tested. The filter system, which houses 112 ceramic candles in three plenums, takes the full flue gas flow from the PCFB combustor. At full load operation (10 MW load, 10 bar, 850 C), the nominal filtration velocity is 4.3 cm/s. FWEI and WEC have selected a set of advanced ceramic candle materials based on a state-of-the-art evaluation of the material characteristics in the WEC facilities and earlier test experience at many coal-fired test sites, including the 2,000-hour testing at the Karhula PCFB pilot plant. The selection comprises the following four types of advanced ceramic candles: Schumacher FT-20; 3M SiCoNeX; Pall 326; and Coors mullite. The ICB has supplied coal and the sorbent. Tests have been in progress since November 1995 and are scheduled for completion by the middle of 1996. The filter unit performance so far has been very satisfactory at the nominal design conditions--10 to 12 bar (150 to 175 psi), 800 to 850 C (1,500 to 1,575 F), and nearly 100% dust removal. There was no visible evidence of any dust carryover into the clean side. This paper describes the performance of the filter, including the pulse system and the mechanical package.

... review in the Federal Register published on April 11, 2000, (65 FR 19477-78). The petition, supporting... motorcycles do not fully comply with either paragraph S7.9.6.2(b) or paragraph S10.7.1.2.2 (depending on the... model M3W three-wheeled motorcycles manufactured during the period August 1, 2012 to August 14,...

The 1890s was the heyday for the bicycle in the United States. By 1896, bicycle manufacturing was a major industry with 300 established firms. Interest in bicycling, or "wheeling" as it was known then, grew rapidly into a national craze during the latter part of the 19th century. In 1890, American manufacturers produced nearly 30,000 bicycles, and…

... other resources. Methods that would be used to reduce tree density and hazardous fuels are: non... shrub and grass communities from fire exclusion; stand density and forest fuels adjacent to arterial... forest by reducing stand density; promote the development of large trees for eventual woody...

Production of electron-positron pairs in the collision of a high-energy photon with a high-intensity few-cycle laser pulse is studied. By utilizing the frameworks of laser-dressed spinor and scalar quantum electrodynamics, a comparison between the production of pairs of Dirac and Klein-Gordon particles is drawn. Positron energy spectra and angular distributions are presented for various laser parameters. We identify conditions under which predictions from Klein-Gordon theory either closely resemble or largely differ from those of the proper Dirac theory. In particular, we address the question to which extent the relevance of spin effects is influenced by the short duration of the laser pulse.

A technique for distributed packet processing includes sequentially passing packets associated with packet flows between a plurality of processing engines along a flow-through data bus linking the plurality of processing engines in series. At least one packet within a given packet flow is marked by a given processing engine to signify to the other processing engines that the given processing engine has claimed the given packet flow for processing. A processing function is applied to each of the packet flows within the processing engines, and the processed packets are output on a time-shared, arbitrated data bus coupled to the plurality of processing engines.
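The claiming scheme can be sketched as follows. This is a minimal illustration with invented names, not the patented implementation: each engine in the series marks a packet to claim an unclaimed flow, and thereafter processes only packets of the flows it owns, passing everything else through untouched.

```python
class Engine:
    """One processing engine on the flow-through bus (illustrative sketch)."""

    def __init__(self, name, process_fn):
        self.name = name
        self.process = process_fn
        self.mine = set()    # flows this engine has claimed
        self.theirs = set()  # flows seen marked by another engine

    def handle(self, packet):
        flow, mark = packet["flow"], packet.get("claimed_by")
        if mark is not None and mark != self.name:
            self.theirs.add(flow)             # another engine owns this flow
        elif mark is None and flow not in self.mine and flow not in self.theirs:
            packet["claimed_by"] = self.name  # mark one packet to claim the flow
            self.mine.add(flow)
        if flow in self.mine:
            packet["payload"] = self.process(packet["payload"])
        return packet


def run_bus(engines, packets):
    """Pass every packet through the engines in series (the flow-through bus)."""
    out = []
    for p in packets:
        for e in engines:
            p = e.handle(p)
        out.append(p)
    return out


engines = [Engine("A", str.upper), Engine("B", str.lower)]
packets = [{"flow": 1, "payload": "hi"}, {"flow": 1, "payload": "there"}]
done = run_bus(engines, packets)
```

Here the first engine claims every new flow it sees; a real deployment would add a load-sharing rule before the claim so that flows distribute across the chain. Note that only the first packet of a flow carries the mark; later packets of the same flow are processed because the owning engine remembers its claim, while downstream engines remember seeing the mark.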

The article takes up the notion of process and artmaking as an event to be understood neither as a singular moment of forces (e.g., artist, artwork, viewer, and/or site) coming together, nor as the "end" of a productive process that is then superseded by another event. Rather, the authors suggest that the artmaking process can be understood as an…

In an improved coal liquefaction process, including a critical solvent deashing stage, high value product recovery is improved and enhanced process-derived solvent is provided by recycling second separator underflow in the critical solvent deashing stage to the coal slurry mix, for inclusion in the process solvent pool.

A word processing system developed at the University of Kansas is described. A cost-benefit analysis of the system versus standard typewriter, mag-card, and other advanced word processing systems is developed. Intangible benefits such as worker satisfaction and reduced training and editing needs are discussed, and possible uses outlined. (MSE)

Blood Donation Process ... Donating blood is a safe, simple, ... this test, as well as during the donation process, is sterile, used only once and then disposed. ...

... Approvals The Drug Development Process ... public. More in The Drug Development Process: Step 1: Discovery and Development, Step 2: Preclinical ...

Different desalination processes are evaluated for feed, capacity, performance, energy requirements, and cost. These include distillation, reverse osmosis, and electrodialysis. Detailed information is given on distillation processes and membrane processes.

This report contains viewgraphs from the Special Parallel Processing Workshop. These viewgraphs deal with topics such as parallel processing performance, message passing, queue structure, and other basic concepts dealing with parallel processing.

An improved process is described for solvent refining lubricating oil base stocks from petroleum fractions containing both aromatic and nonaromatic constituents. The process utilizes n-methyl-2-pyrrolidone as a selective solvent for aromatic hydrocarbons wherein the refined oil fraction and the extract fraction are freed of final traces of solvent by stripping with gaseous ammonia. The process has several advantages over conventional processes including a savings in energy required for the solvent refining process, and reduced corrosion of the process equipment.

The three experiments reported here examined the process goal paradox, which has emerged from the literature on goal setting and conscious processing. We predicted that skilled but anxious performers who adopted a global movement focus using holistic process goals would outperform those who used part-oriented process goals. In line with the conscious processing hypothesis, we also predicted that performers using part process goals would experience performance impairment in test compared with baseline conditions. In all three experiments, participants performed motor tasks in baseline and test conditions. Cognitive state anxiety increased in all of the test conditions. The results confirmed our first prediction; however, we failed to find unequivocal evidence to support our second prediction. The consistent pattern of the results lends support to the suggestion that, for skilled athletes who perform under competitive pressure, using a holistic process goal that focuses attention on global aspects of a motor skill is a more effective attentional focus strategy than using a part process goal. PMID:20587818

In response to decreasing funding levels available to support activities at the Idaho Chemical Processing Plant (ICPP) and a desire to be cost competitive, the Department of Energy Idaho Operations Office (DOE-ID) and Lockheed Idaho Technologies Company have increased their emphasis on cost-saving measures. The ICPP Effectiveness Improvement Initiative involves many activities to improve cost effectiveness and competitiveness. This report documents the methodology and results of one of those cost cutting measures, the Process Efficiency Improvement Activity. The Process Efficiency Improvement Activity performed a systematic review of major work processes at the ICPP to increase productivity and to identify nonvalue-added requirements. A two-phase approach was selected for the activity to allow for near-term implementation of relatively easy process modifications in the first phase while obtaining long-term continuous improvement in the second phase and beyond. Phase I of the initiative included a concentrated review of processes that had a high potential for cost savings with the intent of realizing savings in Fiscal Year 1996 (FY-96). Phase II consists of implementing long-term strategies too complex for Phase I implementation and evaluation of processes not targeted for Phase I review. The Phase II effort is targeted for realizing cost savings in FY-97 and beyond.

To sustain gains from a process improvement initiative, healthcare organizations should: Explain to staff why a process improvement initiative is needed. Encourage leaders within the organization to champion the process improvement, and tie their evaluations to its outcomes. Ensure that both leaders and employees have the skills to help sustain the sought-after process improvements. PMID:24968631

Under the sponsorship of the United States Department of Energy, Foster Wheeler Development Corporation and its team members, Westinghouse, Gilbert/Commonwealth, and the Institute of Gas Technology, are developing second-generation pressurized fluidized bed combustion technology capable of achieving net plant efficiency in excess of 45 percent based on the higher heating value of the coal. A three-phase program entails design and costing of a 500 MWe power plant and identification of developments needed to commercialize this technology (Phase 1), testing of individual components (Phase 2), and finally testing these components in an integrated mode (Phase 3). This paper briefly describes the results of the first two phases as well as the progress on the third phase. Since other projects that use the same technology are under construction or in negotiation, namely the Power System Development Facility and the Four Rivers Energy Modernization Projects, brief descriptions of these are also included.

With the advent of the commercial 3D video card in the mid-1990s, we have seen an order of magnitude performance increase with each generation of new video cards. While these cards were designed primarily for visualization and video games, it became apparent after a short while that they could be used for scientific purposes. These Graphical Processing Units (GPUs) are rapidly being incorporated into data processing tasks usually reserved for general purpose computers. It has been found that many image processing problems scale well to modern GPU systems. We have implemented four popular hyperspectral processing algorithms (N-FINDR, linear unmixing, Principal Components, and the RX anomaly detection algorithm). These algorithms show an across-the-board speedup of at least a factor of 10, with some special cases showing extreme speedups of a hundred times or more.
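Of the four algorithms named, linear unmixing is the simplest to sketch: each pixel's spectrum is modeled as a linear combination of endmember spectra, and the abundances are recovered by least squares. The sketch below is plain NumPy on invented data, not the authors' GPU implementation; the per-pixel independence of the solve is exactly what makes the problem map well onto a GPU.

```python
import numpy as np

def linear_unmix(pixels, endmembers):
    """Unconstrained linear unmixing: each pixel spectrum is modeled as
    endmembers @ abundances, solved for all pixels at once by least squares.

    pixels: (n_bands, n_pixels); endmembers: (n_bands, n_endmembers).
    """
    abundances, *_ = np.linalg.lstsq(endmembers, pixels, rcond=None)
    return abundances

# two invented endmember spectra over 50 bands, mixed 70/30 (noiseless)
rng = np.random.default_rng(1)
E = rng.random((50, 2))
mixed = E @ np.array([[0.7], [0.3]])
a = linear_unmix(mixed, E)
```

On noiseless data the recovered abundances match the mixing coefficients exactly; real pipelines add non-negativity and sum-to-one constraints on top of this core solve.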

The laser cladding process, which consists of adding melted powder to a substrate in order to improve or change the behavior of the material against corrosion, fatigue, and so on, involves many parameters. To produce good tracks, some parameters need to be controlled during the process. The authors present here a low-cost system using two CCD matrix cameras. One camera provides surface temperature measurements, while the other gives information on the powder distribution or the geometric characteristics of the tracks. The surface temperature (via the Beer-Lambert law) enables one to detect variations in the mass feed rate. Using such a system, the authors are able to detect fluctuations of 2 to 3 g/min in the mass flow rate. The other camera gives information related to the powder distribution; a simple algorithm applied to the data acquired from the CCD matrix camera allows them to see very weak fluctuations in both gas flows (carrier or protection gas). During the process, this camera is also used to perform geometric measurements. The height and the width of the track are obtained in real time and enable the operator to infer information related to process parameters such as the processing speed and the mass flow rate. The authors present results from their system in order to enhance the efficiency of the laser cladding process. The conclusion is dedicated to a summary of the presented work and expectations for the future.

Thermochemical processes which lead to the production of hydrogen and oxygen from water without the consumption of any other material have a number of advantages when compared to other processes such as water electrolysis. It is possible to operate a sequence of chemical steps with net work requirements equal to zero at temperatures well below the temperature required for water dissociation in a single step. Various types of procedures are discussed, giving attention to halide processes, reverse Deacon processes, iron oxide and carbon oxide processes, and metal and alkali metal processes. Economical questions are also considered.

Digital image processing is now commonplace in radiology, nuclear medicine and sonography. This article outlines underlying principles and concepts of digital image processing. After completing this article, readers should be able to: List the limitations of film-based imaging. Identify major components of a digital imaging system. Describe the history and application areas of digital image processing. Discuss image representation and the fundamentals of digital image processing. Outline digital image processing techniques and processing operations used in selected imaging modalities. Explain the basic concepts and visualization tools used in 3-D and virtual reality imaging. Recognize medical imaging informatics as a new area of specialization for radiologic technologists. PMID:15352557

Ethylene and coproducts are produced from steam cracking of hydrocarbons such as ethane, propane, butane, naphtha and gas oil in tubular reactor coils installed in externally fired heaters. Due to increasing costs of raw materials and uncertainty of supply, there were flurries of research and development for alternative feedstocks and processes. Alternative processes include crude or residual oil cracking processes, ethanol dehydration processes and syngas based processes. In this series of articles the state-of-the-art for these processes is reviewed and they are assessed for their potential for commercialization.

The US Army Research Laboratory (ARL) has developed an acoustic signal processing toolbox (ASPT) for acoustic sensor array processing. The intent of this document is to describe the toolbox and its uses. The ASPT is a GUI-based software that is developed and runs under MATLAB. The current version, ASPT 3.0, requires MATLAB 6.0 and above. ASPT contains a variety of narrowband (NB) and incoherent and coherent wideband (WB) direction-of-arrival (DOA) estimation and beamforming algorithms that have been researched and developed at ARL. Currently, ASPT contains 16 DOA and beamforming algorithms. It contains several different NB and WB versions of the MVDR, MUSIC and ESPRIT algorithms. In addition, there are a variety of pre-processing, simulation and analysis tools available in the toolbox. The user can perform simulation or real data analysis for all algorithms with user-defined signal model parameters and array geometries.
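As an illustration of the kind of algorithm the toolbox bundles, here is a minimal narrowband MUSIC DOA estimator for a half-wavelength uniform linear array. It is written in NumPy rather than MATLAB, and the function and variable names are illustrative, not the ASPT API: the sample covariance is eigendecomposed, the noise subspace is taken from the smallest eigenvalues, and the pseudospectrum peaks where steering vectors are orthogonal to that subspace.

```python
import numpy as np

def music_doa(snapshots, n_sources, n_grid=1801):
    """Narrowband MUSIC for a half-wavelength uniform linear array.

    snapshots: (n_sensors, n_snapshots) complex matrix.
    Returns grid angles in degrees and the MUSIC pseudospectrum.
    """
    m, n = snapshots.shape
    R = snapshots @ snapshots.conj().T / n        # sample covariance
    _, v = np.linalg.eigh(R)                      # eigenvalues in ascending order
    En = v[:, : m - n_sources]                    # noise subspace
    angles = np.linspace(-90.0, 90.0, n_grid)
    k = np.arange(m)[:, None]
    A = np.exp(1j * np.pi * k * np.sin(np.deg2rad(angles)))  # steering vectors
    P = 1.0 / np.linalg.norm(En.conj().T @ A, axis=0) ** 2   # pseudospectrum
    return angles, P

# one source at +20 degrees on an 8-element array, light noise
rng = np.random.default_rng(0)
m, n = 8, 200
a = np.exp(1j * np.pi * np.arange(m) * np.sin(np.deg2rad(20.0)))
s = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
x = np.outer(a, s) + 0.05 * (rng.standard_normal((m, n))
                             + 1j * rng.standard_normal((m, n)))
angles, P = music_doa(x, n_sources=1)
est = angles[int(np.argmax(P))]
```

The peak of the pseudospectrum lands on the source bearing; wideband variants like those in ASPT typically run this per frequency bin and combine the results coherently or incoherently.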

A process for separating condensable organic components from gas streams. The process makes use of a membrane made from a polymer material that is glassy and that has an unusually high free volume within the polymer material.

The feasibility and possible advantages of processing materials in a nongravitational field are considered. Areas of investigation include biomedical applications, the processing of inorganic materials, and flight programs and funding.

Conservation of materials and energy is a major objective to the philosophy of sustainability. Where production processes can be intensified to assist these objectives, significant advances have been developed to assist conservation as well as cost. Process intensification (PI) h...

The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information. The process evaluation module utilizes a personal digital assistant (PDA). A data display module in communication with the database server, including a website for viewing collected process data in a desired metrics form, the data display module also for providing desired editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module, minimizes the requirement for manual input of the collected process data.

Cost and schedule overruns are often caused by poor requirements that are produced by people who do not understand the requirement process. This paper provides a high-level overview of the requirements discovery process.

Some elementary properties and examples of Markov processes are reviewed. It is shown that the definition of the Markov property naturally leads to a classification of Markov processes into linear and nonlinear ones.
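The Markov property is easy to demonstrate on a finite chain: the next state's distribution depends only on the current one, so evolving a distribution is nothing more than repeated multiplication by the transition matrix. A minimal sketch, with an invented two-state chain:

```python
import numpy as np

# Two-state Markov chain: P[i, j] = probability of jumping from state i to j.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

def evolve(dist, P, steps):
    """The Markov property in action: the next distribution depends only on
    the current one, so evolution is repeated right-multiplication by P."""
    for _ in range(steps):
        dist = dist @ P
    return dist

# starting surely in state 0, the chain forgets its start and converges
# to the stationary distribution pi = pi @ P, here (5/6, 1/6)
pi = evolve(np.array([1.0, 0.0]), P, 200)
```

This is the linear case; the nonlinear Markov processes mentioned above have transition rules that themselves depend on the current distribution, so the update is no longer a fixed matrix product.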

EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...

A study was made at pilot scale of a variety of processes for dewatering and stabilization of waste activated sludge from a pure oxygen activated sludge system. Processes evaluated included gravity thickening, dissolved air flotation thickening, basket centrifugation, scroll cent...

This chapter presents a novel architecture and software-hardware design system for materials processing techniques that are widely applicable to laser direct-write patterning tools. This new laser material processing approach has been crafted by association with the genome and genotype concepts, where predetermined and prescribed laser pulse scripts are synchronously linked with the tool path geometry, and each concatenated pulse sequence is intended to induce a specific material transformation event and thereby express a particular material attribute. While the experimental approach depends on the delivery of discrete amplitude modulated laser pulses to each focused volume element with high fidelity, the architecture is highly versatile and capable of more advanced functionality. The capabilities of this novel architecture fall short of the coherent spatial control techniques that are now emerging, but can be readily applied to fundamental investigations of complex laser-material interaction phenomena, and easily integrated into commercial and industrial laser material processing applications. Section 9.1 provides a brief overview of laser-based machining and materials processing, with particular emphasis on the advantages of controlling energy deposition in light-matter interactions to subtly affect a material's thermodynamic properties. This section also includes a brief discussion of conventional approaches to photon modulation and process control. Section 9.2 comprehensively describes the development and capabilities of our novel laser genotype pulse modulation technique that facilitates the controlled and precise delivery of photons to a host material during direct-write patterning. This section also reviews the experimental design setup and synchronized photon control scheme, along with performance tests and diagnostic results. Section 9.3 discusses selected applications of the new laser genotype processing technique, including optical property variations

A process for conditioning natural gas containing C.sub.3+ hydrocarbons and/or acid gas, so that it can be used as combustion fuel to run gas-powered equipment, including compressors, in the gas field or the gas processing plant. Compared with prior art processes, the invention creates lesser quantities of low-pressure gas per unit volume of fuel gas produced. Optionally, the process can also produce an NGL product.

The Plasma Hearth Process (PHP) is a high-temperature waste treatment process being developed by Science Applications International Corporation (SAIC) for the Department of Energy (DOE) that destroys hazardous organics while stabilizing radionuclides and hazardous metals in a vitreous slag waste form. The PHP has potential application for the treatment of a wide range of mixed waste types in both the low-level and transuranic (TRU) mixed waste categories. DOE, through the Office of Technology Development's Mixed Waste Integrated Program (MWIP), is conducting a three-phase development project to ready the PHP for implementation in the DOE complex.

While business instructors are still teaching spirit and stencil duplicating processes, most businesses now use copiers or offset printing processes. The article discusses offset and copier skills needed by office workers, pointing out that the processes being taught should be compatible with those used in business. (MF)

Discusses process analytical chemistry as a discipline designed to supply quantitative and qualitative information about a chemical process. Encourages academic institutions to examine this field for employment opportunities for students. Describes the five areas of process analytical chemistry, including off-line, at-line, on-line, in-line, and…

This month’s Processing column will continue the theme of “How Is It Processed?” The column will focus on tofu, which is sometimes called “the cheese of Asia.” It is a nutritious, protein-rich bean curd made by coagulating soy milk. There are many different types of tofu, and they are processed in a...

Mathematical modeling and simulation of semisolid filling processes remain a critical issue in understanding and optimizing the process. Semisolid slurries are non-Newtonian materials that exhibit complex rheological behavior. Therefore, the way these slurries flow in cavities is very different from the way liquid in classical casting fills cavities. In fact, filling in semisolid processing is often counterintuitive.

The purpose of this project was to explore the use of polar processing techniques in SIGINT-related signal processing applications. An investigation of ways to apply the CORDIC arithmetic algorithm to signal processing problems, and an application of the TMC2330 Coordinate Transformer chip in a coprocessor or accelerator board for a Sun workstation are covered.
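The CORDIC algorithm referenced above can be illustrated with a short vectoring-mode sketch that converts a pair (x, y) to polar form using only shifts, adds, and a small arctangent table. This is a generic textbook formulation, not the project's TMC2330 implementation:

```python
import math

def cordic_polar(x, y, iterations=32):
    """Vectoring-mode CORDIC: rotate (x, y) onto the x-axis with
    shift-and-add micro-rotations, accumulating the rotation angle.
    The arctangent table and the final gain are precomputed constants;
    in hardware the multiplications by 2**-i are just bit shifts."""
    angle = 0.0
    for i in range(iterations):
        d = -1.0 if y > 0 else 1.0  # rotate toward the x-axis
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        angle -= d * math.atan(2.0 ** -i)
    # CORDIC gain K = prod(sqrt(1 + 2**-2i)), approximately 1.6468
    gain = math.prod(math.sqrt(1.0 + 4.0 ** -i) for i in range(iterations))
    return x / gain, angle  # (magnitude, phase)

r, theta = cordic_polar(3.0, 4.0)
```

For the (3, 4) input the result converges to magnitude 5 and phase atan2(4, 3); each extra iteration adds roughly one bit of angular precision, which is why the algorithm suits fixed-point coprocessor hardware.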

This article discusses the constitutional amendment process. Although the process is not described in great detail, Article V of the United States Constitution allows for and provides instruction on amending the Constitution. While the amendment process currently consists of six steps, the Constitution is nevertheless quite difficult to change.…

This document contains four papers from a symposium on change processes in organizations. "Mid-stream Corrections: Decisions Leaders Make during Organizational Change Processes" (David W. Frantz) analyzes three organizational leaders to determine whether and how they take corrective actions or adapt their decision-making processes when…

This month’s Processing column on the theme of “How Is It Processed?” focuses on yogurt. Yogurt is known for its health-promoting properties. This column will provide a brief overview of the history of yogurt and the current market. It will also unveil both traditional and modern yogurt processing t...

The items in this compilation, all relating to metallurgical processing, are presented in two sections. The first section includes processes which are general in scope and applicable to a variety of metals or alloys. The second describes the processes that concern specific metals and their alloys.

In a process for separating insoluble red mud from Bayer process streams the improvement is described which comprises contacting and mixing a Bayer process stream with a tertiary polyamine having a molecular weight of at least about 10,000 in an amount effective to reduce the iron content thereof.

Focusing on the process of reading comprehension, this book contains chapters on some central topics relevant to understanding the processes associated with comprehending text. The articles and their authors are as follows: (1) "Comprehension Processes: Introduction" (K. Rayner); (2) "The Role of Meaning in Word Recognition" (D. A. Balota); (3)…

This report describes the processing of plutonium at Los Alamos National Laboratory (LANL), an operation illustrating concepts that may be applicable to the processing of lunar materials. The toxic nature of plutonium requires a highly closed system for processing, as would the processing of lunar surface materials.

Teacher-educator and researcher Daniel L. Kohut suggests in "Musical Performance: Learning Theory and Pedagogy" that there are many problems that result from the way music teachers often teach. Most teachers focus on the process, not the goal. The Natural Learning Process that Kohut advocates is the same process that young children use when they…

Discusses the term "process" using definitions from a small desk dictionary. Claims that communication theory reflects a process point of view. Describes how most models of the communication process are similar to that delineated by Aristotle. Discusses current business communication texts in light of these models. (JD)

The design course is an integral part of chemical engineering education. A novel approach to the design course was recently introduced at the University of the Witwatersrand, Johannesburg, South Africa. The course aimed to introduce students to systematic tools and techniques for setting and evaluating performance targets for processes, as well as…

The 1996 Summer Faculty Fellowship Program at Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control (SPC) for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized SPC course for the Safety and Mission Assurance Trends Analysis Group; the actual teaching of this course took place over several weeks. In addition, an Internet version of the same course, complete with animation and video excerpts from the sessions taught at KSC, was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed the evaluation of SPC use at KSC, both present and potential, in light of the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was provided, and SPC software was evaluated for future KSC use. Finally, the author's orientation to NASA changes, terminology, data formats, and new task definitions will allow future consultation as needs arise.
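The individuals-chart arithmetic at the heart of such an SPC course can be sketched in a few lines; the processing-time numbers and the 3-sigma rule below are illustrative assumptions, not actual KSC data:

```python
def control_limits(samples):
    """Center line and 3-sigma limits for an individuals control chart.

    Sigma is estimated as the average moving range divided by 1.128,
    the standard d2 constant for subgroups of size two.
    """
    n = len(samples)
    center = sum(samples) / n
    moving_ranges = [abs(samples[i] - samples[i - 1]) for i in range(1, n)]
    sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(samples, lcl, ucl):
    """Indices of samples falling outside the control limits."""
    return [i for i, x in enumerate(samples) if x < lcl or x > ucl]

# Hypothetical processing-time data (hours); the spike at index 5 is injected.
times = [4.1, 3.9, 4.0, 4.2, 3.8, 9.5, 4.0, 4.1]
lcl, center, ucl = control_limits(times)
print(out_of_control(times, lcl, ucl))  # -> [5]
```

Only the injected spike falls outside the limits; routine run-to-run variation stays inside the band, which is the distinction an SPC course trains analysts to make.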

An alternative to the immersion process for the electrodeposition of chromium from aqueous solutions on the inside diameter (ID) of long tubes is described. The Vessel Plating Process eliminates the need for deep processing tanks, large volumes of solutions, and associated safety and environmental concerns. Vessel Plating allows the process to be monitored and controlled by computer, thus increasing reliability, flexibility, and quality. Elimination of the trivalent chromium accumulation normally associated with ID plating is intrinsic to the Vessel Plating Process. The construction and operation of a prototype Vessel Plating Facility are described, with emphasis on materials of construction, engineered and operational safety, and a unique system for rinse water recovery.

Objective of the AISI Direct Steelmaking Program is to develop a process for producing steel directly from ore and coal; the process should be less capital intensive, consume less energy, and have higher productivity. A task force was formed to examine available processes: trough, posthearth, IRSID, Electric Arc Furnace, energy optimizing furnace. It is concluded that there is insufficient incentive to replace a working BOF with any of these processes to refine hot metal; however, if new steelmaking capacity is required, IRSID and EOF should be considered. A fully continuous process should not be considered until direct ironmaking and continuous refining are perfected.

There is an increasing demand for an ironmaking process with lower capital cost, energy consumption, and emissions than a blast furnace. It is the hypothesis of the present work that an optimized combination of two reasonably proven technologies will greatly enhance the overall process. An example is a rotary hearth furnace (RHF) linked to a smelter (e.g., AISI, HIsmelt). The objective of this research is to select promising process combinations, develop energy balance, materials balance, and productivity models for the individual processes, conduct a limited amount of basic research on the processes, and evaluate the process combinations. Three process combinations were selected with input from the industrial partners. The energy-materials and productivity models for the RHF, smelter, submerged arc furnace, and CIRCOFER were developed. Since utilization of volatiles in coal is critical for energy and CO2 emission reduction, basic research on this topic was also conducted. The process models are a major product of this research and can be used for process evaluation by the industry. The process combinations of an RHF-smelter and a simplified CIRCOFER-smelter appear to be promising: energy consumption is reduced and productivity increased. Work on this project is continuing using funds from other sources.
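As a toy illustration of the kind of materials balance such models resolve (and not one of the project's actual models), the stoichiometric carbon requirement for hematite reduction can be computed directly; the constants are standard atomic weights:

```python
# Illustrative stoichiometric balance for hematite reduction,
#   Fe2O3 + 3 CO -> 2 Fe + 3 CO2,
# a simplified stand-in for the balances the RHF and smelter models
# resolve (real models add volatiles, slag chemistry, and heat balances).
M_FE = 55.85   # atomic weight of iron, g/mol
M_C = 12.011   # atomic weight of carbon, g/mol

def carbon_per_tonne_iron():
    """kg of carbon (delivered as CO) per tonne of iron produced,
    assuming complete reduction and no excess reductant."""
    mol_fe = 1_000_000 / M_FE        # mol of Fe in one tonne
    mol_co = mol_fe * 3 / 2          # 3 mol CO per 2 mol Fe
    return mol_co * M_C / 1000       # kg of carbon

print(round(carbon_per_tonne_iron(), 1))  # -> 322.6
```

Real processes consume far more carbon than this stoichiometric floor, which is exactly why utilization of coal volatiles matters for the energy and CO2 accounting described above.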

The commercialization of solar power generation necessitates the development of a low-cost manufacturing method for silicon suitable for solar cells. The manufacturing methods for semiconductor-grade silicon (SEG-Si) and the development of solar-grade silicon (SOG-Si) in foreign countries were investigated. It was concluded that the most efficient route to such materials was the hydrogen reduction of trichlorosilane (TCS) in a fluidized bed reactor. The low-cost production of polysilicon requires cost reductions in raw materials, energy, labor, and capital; these conditions were carefully reviewed. The overall conclusion was that a development program should be based on the TCS-FBR process and that the experimental program should be conducted in test facilities capable of producing 10 tons of silicon granules per year.

This essay is a discussion of the concept of eigenform, due to Heinz von Foerster, and its relationship with discrete physics and quantum mechanics. We interpret the square root of minus one as a simple oscillatory process - a clock, and as an eigenform. By taking a generalization of this identification of i as a clock and eigenform, we show how quantum mechanics emerges from discrete physics.
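A minimal sketch of the identification described above, under the usual eigenform reading (a fixed point x = F(x)): taking F(x) = -1/x, the fixed points are the imaginary units, while iterating F yields the oscillating clock:

```latex
% An eigenform is a fixed point of an operator: x = F(x).
% Taking F(x) = -1/x, any fixed point satisfies
x = -\frac{1}{x}
\;\Longrightarrow\;
x^{2} = -1
\;\Longrightarrow\;
x = \pm i ,
% while iterating F from a real seed yields the period-two clock
x_{0} = 1,\quad x_{1} = -1,\quad x_{2} = 1,\quad \dots
```

The fixed point (the eigenform i) and the oscillation (the clock) are thus two readings of the same recursion, which is the identification the essay generalizes.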

The Spitzer Telemetry Processing System (SirtfTlmProc) was designed to address objectives of JPL's Multi-mission Image Processing Lab (MIPL) in processing spacecraft telemetry and distributing the resulting data to the science community. To minimize costs and maximize operability, the software design focused on automated error recovery, performance, and information management. The system processes telemetry from the Spitzer spacecraft and delivers Level 0 products to the Spitzer Science Center. SirtfTlmProc is a unique system with automated error notification and recovery, offering a real-time continuous service that can go quiescent after periods of inactivity. The software can process 2 GB of telemetry and deliver Level 0 science products to the end user in four hours. It provides analysis tools so the operator can manage the system and troubleshoot problems, and it automates telemetry processing in order to reduce staffing costs.

Optical processing techniques are used to transform, manipulate, or transmit information. The Soviet Union has vigorously pursued optical processing since the 1960s. This report summarizes Soviet capabilities in hardware, particularly in materials and devices, as well as their capability in applications such as image processing and signal processing/computing. Soviet work in optical signal processing may be characterized as follows: good in terms of fundamental science of materials; capable of producing good materials (often on a par with the West); curious lack of activity with ferroelectric liquid crystals; unique capability in biochrome materials; good capabilities in waveguide devices; good research on spatial light modulators using electro-optic materials; lacking in fabrication techniques for devices; good in terms of statistical analysis of expected system performance; lacking in microelectronic support capabilities; and general lack of innovation for new signal processing architectures. 400 refs., 14 figs., 7 tabs.

A detailed survey on the motivations, design, applications, current status, and limitations of computers designed for symbolic processing is provided. Symbolic processing computations are performed at the word, relation, or meaning levels, and the knowledge used in symbolic applications may be fuzzy, uncertain, indeterminate, and ill represented. Various techniques for knowledge representation and processing are discussed from both the designers' and users' points of view. The design and choice of a suitable language for symbolic processing and the mapping of applications into a software architecture are then considered. The process of refining the application requirements into hardware and software architectures is treated, and state-of-the-art sequential and parallel computers designed for symbolic processing are discussed.

Pultrusion is a process through which high-modulus, lightweight composite structural members such as beams, truss components, stiffeners, etc., are manufactured. The pultrusion process, though a well-developed processing art, lacks a fundamental scientific understanding. The objective here was to determine, both experimentally and analytically, the process parameters most important in characterizing and optimizing the pultrusion of uniaxial fibers. The effects of process parameter interactions were experimentally examined as a function of the pultruded product properties. A numerical description based on these experimental results was developed. An analytical model of the pultrusion process was also developed. The objective of the modeling effort was the formulation of a two-dimensional heat transfer model and development of solutions for the governing differential equations using the finite element method.
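The model described is a two-dimensional finite-element one; as a far simpler stand-in for the governing heat-conduction step, a one-dimensional explicit finite-difference sketch is shown below, with all material values invented for illustration:

```python
def step_temperature(T, alpha, dx, dt):
    """One explicit finite-difference step of the 1D heat equation
    dT/dt = alpha * d2T/dx2, with fixed (die-wall) end temperatures.
    Stable only when alpha * dt / dx**2 <= 0.5."""
    new = T[:]
    for i in range(1, len(T) - 1):
        new[i] = T[i] + alpha * dt / dx ** 2 * (T[i - 1] - 2 * T[i] + T[i + 1])
    return new

# Invented values: a cold composite section between hot die walls.
T = [150.0] + [25.0] * 9 + [150.0]   # deg C across the section
alpha, dx, dt = 1e-7, 1e-3, 1.0      # diffusivity m^2/s, grid m, time step s
for _ in range(100):
    T = step_temperature(T, alpha, dx, dt)
# Nodes nearest the walls warm first; the wall temperatures stay fixed.
```

A finite-element solver of the kind the paper develops handles the same governing equation on an unstructured 2D mesh, but the heat-up of the section interior follows this pattern.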

This presentation describes the development of the proposed Process Flow Diagram (PFD) for the Tokamak Exhaust Processing System (TEP) of ITER. A brief review of design efforts leading up to the PFD is followed by a description of the hydrogen-like, air-like, and water-like processes. Two new design values are described: the most-common and most-demanding design values. The proposed PFD is shown to meet specifications under the most-common and most-demanding design values.

Fjords are a major feature of coasts and provide geologists and oceanographers with an excellent environment for studying and modeling coastal processes and products. This book brings together and integrates an enormous amount of information on fjords and provides the reader with a thorough, interdisciplinary account of current research with emphasis on sedimentary processes. The processes demonstrated in fjords are often relevant to the estuarine or open ocean environment.

The chapters of this volume represent the invited papers delivered at the conference. They are arranged according to thematic proximity, beginning with atoms and continuing with molecules and surfaces. Section headings include multiphoton processes in atoms; field fluctuations and collisions in multiphoton processes; and multiphoton processes in molecules and surfaces. Abstracts of individual items from the conference were prepared separately for the data base. (GHT)

Research on acoustic levitation, air-jet levitation, and heat transfer from molten samples is reported. The goal was to obtain a better understanding of containerless processing systems and to improve their quality. These systems are applied to the processing of materials in situations in which contact with a container must be avoided, and have potential application in both ground-based and orbiting laboratories. Containerless processing is reviewed, and the development of glasses from materials which normally crystallize upon cooling is studied.

A system and method are disclosed for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy. 96 figs.

A system and method for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy.
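The monitoring loop described (learning states of normal operation, generating expected values, and alarming on deviation) can be sketched minimally as follows; the channel data and the 3-sigma deviation threshold are illustrative assumptions, not the patented method:

```python
def learn_normal(training):
    """Learn per-channel mean and spread from known-normal operation."""
    means, spreads = [], []
    for channel in zip(*training):
        m = sum(channel) / len(channel)
        s = (sum((x - m) ** 2 for x in channel) / len(channel)) ** 0.5
        means.append(m)
        spreads.append(s or 1e-9)  # guard against zero spread
    return means, spreads

def deviates(sample, means, spreads, k=3.0):
    """True when any channel is more than k spreads from its learned state."""
    return any(abs(x - m) > k * s for x, m, s in zip(sample, means, spreads))

# Hypothetical two-channel sensor data from normal operation.
normal = [[10.0, 5.0], [10.2, 5.1], [9.8, 4.9], [10.1, 5.0]]
means, spreads = learn_normal(normal)
print(deviates([10.0, 5.05], means, spreads))  # -> False
print(deviates([10.0, 9.00], means, spreads))  # -> True  (alarm)
```

The patented system adds time correlation and pattern identification on modeled data; this sketch shows only the learn-compare-alarm core.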

A software system was developed which processes digitized scatterometer data from the 13.3 GHz, 1.6 GHz, and 400 MHz scatterometer systems. In addition, the hardware capability has been developed to recover the raw analog radar signals and the aircraft parameters from an ADAS data stream in a digital format for processing by the software package. Software for the preparation of data reports and chart presentation of scattering-coefficient time histories has also been developed. This report documents the development of the software, describes key components of the processing system, and presents examples of the processed data and the procedure for software operation.

A new concept for processing spacecraft type wastes has been evaluated. The feasibility of reacting various waste materials with steam at temperatures of 538 - 760 C in both a continuous and batch reactor with residence times from 3 to 60 seconds has been established. Essentially complete gasification is achieved. Product gases are primarily hydrogen, carbon dioxide, methane, and carbon monoxide. Water soluble synthetic wastes are readily processed in a continuous tubular reactor at concentrations up to 20 weight percent. The batch reactor is able to process wet and dry wastes at steam to waste weight ratios from 2 to 20. Feces, urine, and synthetic wastes have been successfully processed in the batch reactor.
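The product slate reported (hydrogen, carbon dioxide, methane, carbon monoxide) is consistent with the standard steam-gasification chemistry; these are textbook stoichiometries, not equations taken from the study itself:

```latex
% Representative steam-gasification reactions for carbonaceous waste:
\mathrm{C + H_2O \longrightarrow CO + H_2}       \quad \text{(steam gasification)}
\mathrm{CO + H_2O \longrightarrow CO_2 + H_2}    \quad \text{(water-gas shift)}
\mathrm{C + 2\,H_2 \longrightarrow CH_4}         \quad \text{(methanation)}
```

At the reported 538 to 760 C, the balance among these reactions sets the final gas composition.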

Diesel Engine Combustion Processes guides the engineer and research technician toward engine designs which will give the "best payoff" in terms of emissions and fuel economy. Contents include: Three-dimensional modeling of soot and NO in a direct-injection diesel engine; Prechamber for lean burn for low NOx; Modeling and identification of a diesel combustion process with the downhill gradient search method; The droplet group micro-explosions in W/O diesel fuel emulsion sprays; Combustion process of diesel spray in high temperature air; Combustion process of diesel engines at regions with different altitude; and more.

Processes for the living polymerization of olefin monomers with terminal carbon-carbon double bonds are disclosed. The processes employ initiators that include a metal atom and a ligand having two group 15 atoms and a group 16 atom or three group 15 atoms. The ligand is bonded to the metal atom through two anionic or covalent bonds and a dative bond. The initiators are particularly stable under reaction conditions in the absence of olefin monomer. The processes provide polymers having low polydispersities, especially block copolymers having low polydispersities. It is an additional advantage of these processes that, during block copolymer synthesis, a relatively small amount of homopolymer is formed.

A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

Decision makers conduct biased predecision processing when they restructure their mental representation of the decision environment to favor one alternative before making their choice. The question of whether biased predecision processing occurs has been controversial since L. Festinger (1957) maintained that it does not occur. The author reviews relevant research in sections on theories of cognitive dissonance, decision conflict, choice certainty, action control, action phases, dominance structuring, differentiation and consolidation, constructive processing, motivated reasoning, and groupthink. Some studies did not find evidence of biased predecision processing, but many did. In the Discussion section, the moderators are summarized and used to assess the theories. PMID:12848220

The design process is one of the sources used to produce requirements for a computer system to integrate and manage product design data, program management information, and technical computation and engineering data management activities of the aerospace design process. Design activities were grouped chronologically and explored for activity type, activity interface, data quantity, and data flow. The work was based on analysis of the design process of several typical aerospace products, including both conventional and supersonic airplanes and a hydrofoil design. Activities examined included research, preliminary design, detail design, manufacturing interface, product verification, and product support. The design process was then described in an IPAD environment--the future.

A solvent refining process is disclosed utilizing n-methyl-2-pyrrolidone as solvent in which primary extract from the extraction zone is cooled to form a secondary raffinate and secondary extract and the secondary and primary raffinates are blended to produce an increased yield of product of desired quality. In a preferred embodiment of the process, the lubricating oil feedstock to the process is first contacted with a stripping medium previously used in the process for the recovery of solvent from at least one of the product streams whereby solvent contained in said stripping medium is recovered therefrom.

Traditionally, the production of high quality Synthetic Aperture Radar imagery has been an area where a potential user would have to expend large amounts of money either in the bespoke development of a processing chain dedicated to his requirements or in the purchase of a dedicated hardware platform adapted using accelerator boards and enhanced memory management. Whichever option the user adopted, there were limitations based on the desire for a realistic throughput in data load and time. The user had a choice, made early in the purchase, between a system that adopted innovative algorithmic manipulation to limit the processing time, or the purchase of expensive hardware. The former limits the quality of the product, while the latter excludes the user from any visibility into the processing chain. Clearly there was a need for a SAR processing architecture that gave the user a choice of the methodology to be adopted for a particular processing sequence, allowing him to decide on either a quick (lower quality) product or a detailed, slower (high quality) product, without having to change the algorithmic base of his processor or the hardware platform. The European Commission, through the Advanced Techniques unit of the Joint Research Centre (JRC) Institute for Remote Sensing at Ispra in Italy, realizing the limitations of current processing abilities, initiated its own program to build airborne SAR and Electro-Optical (EO) sensor systems. This program is called the European Airborne Remote Sensing Capabilities (EARSEC) program. This paper describes the processing system developed for the airborne SAR sensor system. The paper considers the requirements for the system and the design of the EARSEC Airborne SAR Processing System. It highlights the development of an open SAR processing architecture where users have full access to intermediate products that arise from each of the major processing stages. It also describes the main processing stages in the overall

The Office of Worker Health and Safety (EH-5), under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE), has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs and should not be considered a complete resource on PrHA methods. Likewise, to determine whether a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule; nowhere have requirements been added beyond what is specifically required by the Rule.

In response to the problem facing college faculties of choosing textbooks that are both "readable" by students and adequate in content coverage, a text selection process has been developed that can be used with or without the aid of a reading specialist. The first step in the process, a preliminary check, examines each proposed text's publication…

This document represents the roadmap for Processing Technology Research in the US Mining Industry. It was developed based on the results of a Processing Technology Roadmap Workshop sponsored by the National Mining Association in conjunction with the US Department of Energy, Office of Energy Efficiency and Renewable Energy, Office of Industrial Technologies. The Workshop was held January 24 - 25, 2000.

A process is described for separating condensable organic components from gas streams. The process makes use of a membrane made from a polymer material that is glassy and that has an unusually high free volume within the polymer material. 6 figures.

I will briefly explain the definition and advantages of hybrid quantum information processing, which is the hybridization of qubit and continuous-variable technologies. The final goal would be realization of universal gate sets for both qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

Literature published in 2015 and early 2016 related to food processing wastes treatment for industrial applications are reviewed. This review is a subsection of the Treatment Systems section of the annual Water Environment Federation literature review and covers the following food processing industries and applications: general, meat and poultry, fruits and vegetables, dairy and beverage, and miscellaneous treatment of food wastes. PMID:27620095

An overview is given of the processing of plastic materials from the handling of polymers in the pellet and powder form to manufacturing of a plastic fabricated product. Various types of equipment used and melt processing ranges of various polymer formulations to make the myriad of plastic products that are commercially available are discussed. PMID:1175556

This invention comprises a process for dissolving spent high efficiency particulate air (HEPA) filters and then combining the complexed filter solution with other radioactive wastes prior to calcining the mixed and blended waste feed. The process is an alternate to a prior method of acid leaching the spent filters, which is an inefficient method of treating spent HEPA filters for disposal.

Education has been slow in fully utilizing electronic data processing (EDP) equipment. Educator confidence in EDP has grown, however, as a result of the success of electronic data processing in science, industry, and other professions. The development of solid-state transistorized computers has made powerful desk-size computers a reality and…

In this paper we propose a model for neural processing that addresses both the evolutionary and functional aspects of neural systems that are observed in nature, from the simplest neural collections to dense large scale associations such as human brains. We propose both an architecture and a process in which these components interact to create the emergent behavior that we define as the 'mind'.

This article examines relationships among process, product, and playmaking in a southeastern playwriting and performance program for teen girls, Playmaking for Girls (PFG). The authors have chosen to focus on tensions between process and product. Such tensions are present in the challenges teachers experience when privileging student-centered…

A process for dissolution of spent high efficiency particulate air (HEPA) filters and then combining the complexed filter solution with other radioactive wastes prior to calcining the mixed and blended waste feed. The process is an alternate to a prior method of acid leaching the spent filters which is an inefficient method of treating spent HEPA filters for disposal.

This article provides insights into the "Applying Mathematical Processes" resources, developed by the Nuffield Foundation. It features Nuffield AMP activities--and related ones from Bowland Maths--that were designed to support the teaching and assessment of key processes in mathematics--representing a situation mathematically, analysing,…

A process is described for obtaining a closely bonded coating of steel or iron on uranium. The process consists of providing, between the steel and uranium, a layer of silver, and then pressure rolling the assembly at about 600 deg C until a reduction of from 10 to 50% has been obtained.

The planning technique or device, regardless of its degree of sophistication, is only a tool and cannot be substituted for effective managers. The planning process must be an integral part of the entire management process, which often evolves over many generations of trial and error. The function of planning and management entails the continuous,…

This month’s column will continue the theme of “How Is It Processed?” The food to be discussed this month is hummus. Hummus is known for its healthfulness. This column will provide a brief overview of the history of hummus and the current market. It will also unveil hummus processing techniques....

Recent work in the areas of microwave processing and joining of ceramics is briefly reviewed. Advantages and disadvantages of microwave processing as well as some of the current issues in the field are discussed. Current state and potential for future commercialization of this technology is also addressed.

Food processing can have many beneficial effects. However, processing may also alter the allergenic properties of food proteins. A wide variety of processing methods is available and their use depends largely on the food to be processed. In this review the impact of processing (heat and non-heat treatment) on the allergenic potential of proteins, and on the antigenic (IgG-binding) and allergenic (IgE-binding) properties of proteins has been considered. A variety of allergenic foods (peanuts, tree nuts, cows' milk, hens' eggs, soy, wheat and mustard) have been reviewed. The overall conclusion drawn is that processing does not completely abolish the allergenic potential of allergens. Currently, only fermentation and hydrolysis may have potential to reduce allergenicity to such an extent that symptoms will not be elicited, while other methods might be promising but need more data. Literature on the effect of processing on allergenic potential and the ability to induce sensitisation is scarce. This is an important issue since processing may impact on the ability of proteins to cause the acquisition of allergic sensitisation, and the subject should be a focus of future research. Also, there remains a need to develop robust and integrated methods for the risk assessment of food allergenicity. PMID:25778347

A new publication service abstracts, indexes, and prepares microfiche of environmental impact statements (EIS). This new service is designed to streamline the EIS process by reducing the cost and time of preparation, by eliminating redundancy of similar statements, and by working with the government to standardize the preparation process. (MA)

A process is described for dissolution of spent high efficiency particulate air (HEPA) filters and then combining the complexed filter solution with other radioactive wastes prior to calcining the mixed and blended waste feed. The process is an alternate to a prior method of acid leaching the spent filters which is an inefficient method of treating spent HEPA filters for disposal. 4 figures.

The various mechanisms by which ablation of materials can be induced with lasers are discussed in this paper. The various ablation processes and potential applications are reviewed from the threshold for ablation up to fluxes of about 10¹³ W/cm², with emphasis on three particular processes: front-surface spallation, two-dimensional blowoff, and contained vaporization.

Hyperspectral image processing refers to the use of computer algorithms to extract, store and manipulate both spatial and spectral information contained in hyperspectral images across the visible and near-infrared portion of the electromagnetic spectrum. A typical hyperspectral image processing work...
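The two basic access patterns behind the spatial-plus-spectral description above can be pictured with a tiny sketch. Everything below (the 2x2x3 hypercube, the reflectance values, the function names) is invented for illustration and is not from the record:

```python
# A hyperspectral "hypercube" indexed as cube[row][col][band]: each pixel
# carries a full spectrum, and each band is a spatial image. Values are
# hypothetical reflectances.
cube = [
    [[0.1, 0.4, 0.7], [0.2, 0.5, 0.8]],
    [[0.3, 0.6, 0.9], [0.4, 0.7, 1.0]],
]

def spectrum(cube, row, col):
    """Spectral signature of one pixel across all bands."""
    return cube[row][col]

def band_image(cube, band):
    """Spatial image at a single wavelength band."""
    return [[pixel[band] for pixel in row] for row in cube]

print(spectrum(cube, 0, 1))   # -> [0.2, 0.5, 0.8]
print(band_image(cube, 2))    # -> [[0.7, 0.8], [0.9, 1.0]]
```

Real hyperspectral pipelines operate on arrays with hundreds of bands, but the spectrum-per-pixel and image-per-band views are the same.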

The intent is to provide an introduction to the image processing capabilities available at the Langley Research Center (LaRC) Central Scientific Computing Complex (CSCC). Various image processing software components are described. Information is given concerning the use of these components in the Data Visualization and Animation Laboratory at LaRC.

Various aspects and applications of microsystem process networks are described. The design of many types of microsystems can be improved by ortho-cascading mass, heat, or other unit process operations. Microsystems having energetically efficient microchannel heat exchangers are also described. Detailed descriptions of numerous design features in microcomponent systems are also provided.

Various aspects and applications of microsystem process networks are described. The design of many types of microsystems can be improved by ortho-cascading mass, heat, or other unit process operations. Microsystems having energetically efficient microchannel heat exchangers are also described. Detailed descriptions of numerous design features in microcomponent systems are also provided.

Various aspects and applications of microsystem process networks are described. The design of many types of microsystems can be improved by ortho-cascading mass, heat, or other unit process operations. Microsystems having exergetically efficient microchannel heat exchangers are also described. Detailed descriptions of numerous design features in microcomponent systems are also provided.

A process for the preparation of technetium-99m labeled pharmaceuticals is disclosed. The process comprises initially isolating technetium-99m pertechnetate by adsorption upon an adsorbent packing in a chromatographic column. The technetium-99m is then eluted from the packing with a biological compound to form a radiopharmaceutical.

An associative list processing unit and method comprising employing a plurality of prioritized cell blocks and permitting inserts to occur in a single clock cycle if all of the cell blocks are not full. Also, an associative list processing unit and method comprising employing a plurality of prioritized cell blocks and using a tree of prioritized multiplexers descending from the plurality of cell blocks.
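The single-cycle-insert behavior described above can be mimicked in a software model. The sketch below is a hypothetical illustration (class name, block count, and block size are invented, and the priority scan stands in for the tree of prioritized multiplexers):

```python
# Software model of an associative list built from prioritized cell blocks:
# an insert succeeds in one "cycle" as long as not every block is full, and
# lookups resolve in priority order, blocks[0] being the highest priority.
class AssociativeList:
    def __init__(self, num_blocks=4, block_size=2):
        self.blocks = [dict() for _ in range(num_blocks)]
        self.block_size = block_size

    def insert(self, key, value):
        """One-cycle insert: place the entry in the first non-full block."""
        for block in self.blocks:
            if len(block) < self.block_size:
                block[key] = value
                return True
        return False  # all blocks full: the insert stalls

    def lookup(self, key):
        """Priority lookup: the highest-priority block holding the key wins."""
        for block in self.blocks:
            if key in block:
                return block[key]
        return None

unit = AssociativeList()
unit.insert("a", 1)
unit.insert("b", 2)
print(unit.lookup("a"))  # -> 1
```

In hardware the per-block checks happen in parallel within the clock cycle; the sequential loop here is only a functional stand-in.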

This special bibliography lists 724 articles, papers, and reports which discuss various aspects of the use of the space environment for materials science research or for commercial enterprise. The potentialities of space processing and the improved materials processes that are made possible by the unique aspects of the space environment are emphasized. References identified in April, 1978 are cited.

CO₂ laser processing of plastics has been studied experimentally and theoretically. Welding of cylindrical parts made from polycarbonate and polypropylene, cutting of polymethyl methacrylate plates, and drilling of holes in polypropylene are presented as examples. Good agreement between theoretical and experimental results has been found in the case of laser welding. Some practical aspects of laser processing of plastics have also been given.

This book consists of 18 selected papers that focus on the broad topic of the educational process. All the papers were originally presented at the Fourteenth Annual Conference of the Australian College of Education, which was held in May 1973. Titles of the papers include "The Educational Process: The Raw Material,""Towards a Sociology of…

Discusses how to build a successful cleaning process in order to most effectively maintain school facilities, explaining that the cleaning processes used plays a critical role in productivity. Focuses on: developing a standardized system; making sure that employees have the right tools for the work they perform; training employees; tracking and…

Shows how olefin isomerization and the exotic olefin metathesis reaction can be harnessed in industrial processes. Indicates that the Shell Higher Olefins Process makes use of organometallic catalysts to manufacture alpha-olefins and internal carbon-11 through carbon-14 alkenes in a flexible fashion that can be adjusted to market needs. (JN)

The protein family of kinesins contains processive motor proteins that move stepwise along microtubules. This mechanism requires the precise coupling of the catalytic steps in the two heads, and their precise mechanical coordination. Here we show that these functionalities can be uncoupled in chimeras of processive and non-processive kinesins. A chimera with the motor domain of Kinesin-1 and the dimerization domain of a non-processive Kinesin-3 motor behaves qualitatively like conventional kinesin and moves processively in TIRF and bead motility assays, suggesting that spatial proximity of two Kinesin-1 motor domains is sufficient for processive behavior. In the reverse chimera, the non-processive motor domains are unable to step along microtubules, despite the presence of the Kinesin-1 neck coiled coil. Still, ATP-binding to one head of these chimeras induces ADP-release from the partner head, a characteristic feature of alternating site catalysis. These results show that processive movement of kinesin dimers requires elements in the motor head that respond to ADP-release and induce stepping, in addition to a proper spacing of the motor heads via the neck coiled coil. PMID:19242550

Describes research on in situ processing to develop necessary theory and understanding of the underground process to facilitate commercialization of a wide range of mineral deposits. Goal is to produce laboratory and computer-based tools to allow site evaluation based on field and laboratory measurements of mineral and associated overburdens.…

The DOE will soon choose between treating contaminated nickel scrap as a legacy waste and developing high-volume nickel decontamination processes. In addition to reducing the volume of legacy wastes, a decontamination process could make 200,000 tons of this strategic metal available for domestic use. Contaminants in DOE nickel scrap include ²³⁴Th, ²³⁴Pa, ¹³⁷Cs, ²³⁹Pu (trace), ⁶⁰Co, U, ⁹⁹Tc, and ²³⁷Np (trace). This report reviews several industrial-scale processes -- electrorefining, electrowinning, vapor metallurgy, and leaching -- used for the purification of nickel. Conventional nickel electrolysis processes are particularly attractive because they use side-stream purification of process solutions to improve the purity of nickel metal. Additionally, nickel purification by electrolysis is effective in a variety of electrolyte systems, including sulfate, chloride, and nitrate. Conventional electrorefining processes typically use a mixed electrolyte which includes sulfate, chloride, and borate. The use of an electrorefining or electrowinning system for scrap nickel recovery could be combined effectively with a variety of processes, including cementation, solvent extraction, ion exchange, complex formation, and surface sorption, developed for uranium and transuranic purification. Selected processes were reviewed and evaluated for use in nickel side-stream purification. 80 refs.

An overview of the nature, benefits, and steps involved in Colorado's Guaranteed Graduate Program, a process that assures that high school graduates have the knowledge and skills considered essential for entry into employment and postsecondary education, begins this document. A discussion of the portfolio process follows, along with descriptions…

A study of the speech process was conducted. The process is described as one closely linked to the one involved in the problem of the serial order in behavior. It is pointed out that in the speech of young children the grammatical relations that are properties of elementary underlying sentences appear in the grammatical meanings. Six examples of…

The industrial future of lasers in material processing lies in the combination of the laser with automatic machinery. One possible form of such a combination is an intelligent workstation which monitors the process as it occurs and adjusts itself accordingly, either by self-teaching or by comparison to a process data bank or algorithm. In order to achieve this attractive goal, in-process signals are required. Two devices are described in this paper. One is the Laser Beam Analyser, which is now maturing into a second generation with computerised output. The other is the Acoustic Mirror, a totally novel analytic technique, not yet fully understood, but which nevertheless can act as a very effective process monitor.

Techniques are being developed to provide lower cost polysilicon material for solar cells. Existing technology which normally provides semiconductor industry polysilicon material is undergoing changes and also being used to provide polysilicon material for solar cells. Economics of new and existing technologies are presented for producing polysilicon. The economics are primarily based on the preliminary process design of a plant producing 1,000 metric tons/year of silicon. The polysilicon processes include: Siemens process (hydrogen reduction of trichlorosilane); Union Carbide process (silane decomposition); and Hemlock Semiconductor process (hydrogen reduction of dichlorosilane). The economics include cost estimates of capital investment and product cost to produce polysilicon via the technology. Sensitivity analysis results are also presented to disclose the effect of major parameters such as utilities, labor, raw materials and capital investment.
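The kind of sensitivity analysis described above can be illustrated with a toy calculation. All cost figures below are invented for illustration; they are not the report's estimates:

```python
# Toy product-cost sensitivity for a plant producing 1,000 metric tons/year
# (1,000,000 kg/year) of silicon: perturb one annual cost component and see
# the effect on product cost in $/kg. Figures are hypothetical.
PLANT_OUTPUT_KG = 1_000_000

base_costs = {           # hypothetical annual costs, $/year
    "raw_materials": 4.0e6,
    "utilities":     6.0e6,
    "labor":         2.0e6,
    "capital":       8.0e6,  # annualized capital charge
}

def product_cost(costs):
    """Product cost in $/kg = total annual cost / annual output."""
    return sum(costs.values()) / PLANT_OUTPUT_KG

def sensitivity(component, change):
    """Relative change in product cost for a relative change in one component."""
    perturbed = dict(base_costs)
    perturbed[component] *= (1 + change)
    return product_cost(perturbed) / product_cost(base_costs) - 1

print(product_cost(base_costs))        # base product cost, $/kg
print(sensitivity("utilities", 0.10))  # effect of a 10% rise in utilities
```

With these numbers, utilities are 30% of total cost, so a 10% utilities increase moves the product cost by 3%; the report's sensitivities rank real parameters the same way.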

The Spacelab Data Processing Facility (SLDPF) processes, monitors, and accounts for the payload data from Spacelab and other Shuttle missions and forwards relevant data to various user facilities worldwide. The SLDPF is divided into the Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS). The SIPS division demultiplexes, synchronizes, time-tags, quality-checks, accounts for the data, and formats the data onto tapes. The SOPS division further edits, blocks, formats, and records the data on tape for shipment to users. User experiments must conform to the Spacelab's onboard High Rate Multiplexer (HRM) format for maximum processability. Audio, analog, instrumentation, high density, experiment data, input/output data, quality control and accounting, and experimental channel tapes, along with a variety of Spacelab ancillary tapes, are provided to the user by the SLDPF.

The recent progress of Business Process Management (BPM) is reflected by the figures of the related industry. Wintergreen Research estimates that the international market for BPM-related software and services accounted for more than USD 1 billion in 2005 with a tendency towards rapid growth in the subsequent couple of years [457]. The relevance of business process modeling to general management initiatives has been previously studied in the 1990s [28]. Today, Gartner finds that organizations that had the best results in implementing business process management spent more than 40 percent of the total project time on discovery and construction of their initial process model [265]. As a consequence, Gartner considers Business Process Modeling to be among the Top 10 Strategic Technologies for 2008.

The intent of this report is to document a procedure used at LANL for HIP bonding aluminum cladding to U-10Mo fuel foils using a formed HIP can for the Domestic Reactor Conversion program in the NNSA Office of Material Management and Minimization, and to provide some details that may not have been published elsewhere. The HIP process is based on the procedures that have been used to develop the formed HIP can process, including the baseline process developed at Idaho National Laboratory (INL). The HIP bonding cladding process development is summarized in the listed references. Further iterations with Babcock & Wilcox (B&W) to refine the process to meet production and facility requirements are expected.

Processing-refining of raw materials from extraterrestrial sources is detailed for a space materials handling facility. The discussion is constrained to those steps necessary to separate desired components from raw or altered input ores, semi-purified feedstocks, or process scrap and convert the material into elements, alloys, and consumables. The materials are regarded as originating from dead satellites and boosters, lunar materials, and asteroids. Strong attention will be given to recycling reagent substances to avoid the necessity of transporting replacements. It is assumed that since no aqueous processes exist on the moon, the distribution of minerals will be homogeneous. The processing-refining scenario will include hydrochemical, pyrochemical, electrochemical, and physical techniques selected for the output mass rate/unit plant mass ratio. Flow charts of the various materials processing operations which could be performed with lunar materials are provided, noting the necessity of delivering several alloying elements from the earth due to scarcities on the moon.

An improved catalytic reduction process for the direct recovery of elemental sulfur from various SO₂-containing industrial gas streams. The catalytic process provides combined high activity and selectivity for the reduction of SO₂ to elemental sulfur product with carbon monoxide or other reducing gases. The reaction of sulfur dioxide and reducing gas takes place over certain catalyst formulations based on cerium oxide. The process is a single-stage, catalytic sulfur recovery process in conjunction with regenerators, such as those used in dry, regenerative flue gas desulfurization or other processes, involving direct reduction of the SO₂ in the regenerator off-gas stream to elemental sulfur in the presence of a catalyst.
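The reduction of SO2 with carbon monoxide named above can be summarized by the idealized stoichiometry SO2 + 2 CO -> 1/2 S2 + 2 CO2, i.e. two moles of CO per mole of SO2. A quick requirement estimate follows; the 100% conversion and selectivity assumed here are an idealization, not a claim about the patented catalyst:

```python
# Idealized reducing-gas requirement: 2 mol CO per mol SO2.
M_SO2 = 64.07  # g/mol
M_CO = 28.01   # g/mol

def co_required_kg(so2_kg):
    """kg of CO needed to reduce a given mass of SO2 at a 2:1 molar ratio,
    assuming complete conversion and selectivity to elemental sulfur."""
    return so2_kg / M_SO2 * 2 * M_CO

print(co_required_kg(100.0))  # kg CO per 100 kg SO2 treated
```

Roughly 0.87 kg of CO is needed per kg of SO2 in this idealized limit; real selectivity and excess-gas requirements would shift the figure.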

An improved catalytic reduction process for the direct recovery of elemental sulfur from various SO₂-containing industrial gas streams. The catalytic process provides combined high activity and selectivity for the reduction of SO₂ to elemental sulfur product with carbon monoxide or other reducing gases. The reaction of sulfur dioxide and reducing gas takes place over certain catalyst formulations based on cerium oxide. The process is a single-stage, catalytic sulfur recovery process in conjunction with regenerators, such as those used in dry, regenerative flue gas desulfurization or other processes, involving direct reduction of the SO₂ in the regenerator off-gas stream to elemental sulfur in the presence of a catalyst. 4 figures.

There has been tremendous development within measurement science and technology over the past couple of decades. New sensor technologies and compact versatile signal recovery electronics are continuously expanding the limits of what can be measured and the accuracy with which this can be done. Miniaturization of sensors and the use of nanotechnology push these limits further. Also, thanks to powerful and cost-effective computer systems, sophisticated measurement and reconstruction algorithms previously only accessible in advanced laboratories are now available for in situ online measurement systems. The process industries increasingly require more process-related information, motivated by key issues such as improved process control, process utilization and process yields, ultimately driven by cost-effectiveness, quality assurance, environmental and safety demands. Industrial process tomography methods have taken advantage of the general progress in measurement science, and aim at providing more information, both quantitatively and qualitatively, on multiphase systems and their dynamics. The typical approach for such systems has been to carry out one local or bulk measurement and assume that this is representative of the whole system. In some cases, this is sufficient. However, there are many complex systems where the component distribution varies continuously and often unpredictably in space and time. The foundation of industrial tomography is to conduct several measurements around the periphery of a multiphase process, and use these measurements to unravel the cross-sectional distribution of the process components in time and space. This information is used in the design and optimization of industrial processes and process equipment, and also to improve the accuracy of multiphase system measurements in general. In this issue we are proud to present a selection of the 145 papers presented at the 5th World Congress on Industrial Process Tomography in Bergen.

In a sequence of papers on the topic of message construction for interstellar communication by means of a cosmic language, representations of various kinds of concepts of reality were developed in a Lingua Cosmica system [1]. Those studied were logic relations of a static character. The present contribution contains an important, fundamental extension: groundwork is done for the purpose of interpreting (dynamic) processes of various sorts in the linguistic system. Individual processes are abstracted in a logic sense and provided with basic properties such as termination and communication functions. They can be combined into kinds of processes: sequential and parallel ones, represented by only one inductive definition in logic. Based on concepts from the so-called process algebra, processes are provided with channels mapping them to their states. State vectors are introduced to represent states of conglomerates of processes. Communication between processes (locally or globally) is effectuated by means of state transitions. Together with a programmed arbitration function, state vectors play a crucial role in representing communication. With these ingredients, possibilities for general interpretations of a wide range of processes in the Lingua Cosmica system come into view.

Recent development of a simple single-solvent technology goes far to meet complete gas processing needs. The use of methanol, as practiced in the IFPEXOL process, where it serves not only as a hydrate inhibitor and antifreeze agent but as an acid gas extraction solvent, makes the complete gas processing scheme simple and probably the most cost-effective as well. This paper presents several gas processing applications where water, hydrocarbon liquids, and acid gases are removed from natural wellhead production gases. Water and hydrocarbon liquids removal is achieved to the extent necessary to produce a pipeline-transportable gas or to meet downstream cryogenic processing demands. These are illustrated with recent applications of the IFPEX-1 process successfully operating today in North America and the Far East. A recent North Sea offshore project is highlighted, showing the particular advantages in offshore applications. For the removal of water and hydrocarbon liquids together with a substantial quantity of not only CO₂ but H₂S, the most complete methanol use scheme is presented. This is illustrated with the development of an advanced version of the IFPEX-2 process containing some innovative but simple equipment concepts which yields high-pressure dry acid gases for reinjection or a high-quality acid gas destined for Claus-type sulfur recovery.

This paper describes a new method for determining, improving, and controlling the measurement process errors (or measurement uncertainty) of a measurement system used to monitor product as it is manufactured. The method is called the Process Measurement Assurance Program (PMAP). It integrates metrology early into the product realization process and is a step beyond statistical process control (SPC), which monitors only the product. In this method, a control standard is used to continuously monitor the status of the measurement system. Analysis of the control standard data allow the determination of the measurement error inherent in the product data and allow one to separate the variability in the manufacturing process from variability in the measurement process. These errors can be then associated with either the measurement equipment, variability of the measurement process, operator bias, or local environmental effects. Another goal of PMAP is to determine appropriate re-calibration intervals for the measurement system, which may be significantly longer or shorter than the interval typically assigned by the calibration organization.
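The core idea above, using a control standard to separate measurement variability from manufacturing variability, can be sketched numerically. This is a minimal illustration of the variance-separation idea, not the PMAP procedure itself, and all readings below are invented:

```python
# Repeated measurements of a control standard estimate the measurement-
# process variance; subtracting it from the variance seen in product data
# estimates the true manufacturing-process variance. A nonzero mean offset
# on the control standard would indicate operator/equipment bias.
from statistics import mean, variance

NOMINAL = 10.00  # certified value of the control standard (hypothetical)

control_readings = [10.02, 9.98, 10.01, 9.99, 10.00]  # control standard
product_readings = [10.40, 9.60, 10.20, 9.85, 10.15]  # product data

meas_var = variance(control_readings)         # measurement-system variance
total_var = variance(product_readings)        # product variance incl. measurement
process_var = max(total_var - meas_var, 0.0)  # manufacturing variance estimate

bias = mean(control_readings) - NOMINAL       # systematic measurement bias
print(process_var, bias)
```

Drift in the control-standard statistics over time is what would trigger a shortened (or permit a lengthened) recalibration interval in the scheme described above.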

Process monitoring, which can be defined as the measurement of process variables with the smallest possible delay, is combined with process models to form the basis for successful process control. Minimizing the measurement delay leads inevitably to employing online, in situ sensors where possible, preferably using noninvasive measurement methods with stable, low-cost sensors. Microalgal processes have similarities to traditional bioprocesses but also have unique monitoring requirements. In general, variables to be monitored in microalgal processes can be categorized as physical, chemical, and biological, and they are measured in gaseous, liquid, and solid (biological) phases. Physical and chemical process variables can usually be monitored online using standard industrial sensors. The monitoring of biological process variables, however, relies mostly on sensors developed and validated using laboratory-scale systems or uses offline methods because of difficulties in developing suitable online sensors. Here, we review current technologies for online, in situ monitoring of all types of process parameters of microalgal cultivations, with a focus on monitoring of biological parameters. We discuss newly introduced methods for measuring biological parameters that could possibly be adapted for routine online use, are preferably noninvasive, and are based on approaches that have been proven in other bioprocesses. New sensor types for measuring physicochemical parameters using optical methods or ion-specific field effect transistor (ISFET) sensors are also discussed. Reviewed methods with online implementation or online potential include measurement of irradiance, biomass concentration by optical density and image analysis, cell count, chlorophyll fluorescence, growth rate, lipid concentration by infrared spectrophotometry, dielectric scattering, and nuclear magnetic resonance. Future perspectives are discussed, especially in the field of image analysis using in situ

The Video Image Stabilization And Registration (VISAR) process is an award-winning video image processing software developed at NASA's Marshall Space Flight Center. VISAR has a wide variety of application areas where the refinement of digital video is needed. It is used to correct jitter, rotation, and zoom effects by registering and processing the individual image captures that make up normal video. Its most prominent uses were the 1996 Olympic Bombing case and in identifying Saddam Hussein during the Iraq war. Based on first-hand knowledge, this paper describes the VISAR process, which consists of several steps designed to refine digital video using VISAR software. The process determines the differences between two video images so that one, or both, of the images can be changed in ways that make them match as well as possible. Corrections include changes in position (horizontal and vertical image shifts), changes in orientation (image rotation), and changes in magnification (image zoom). While much of the VISAR process is automated, in its current embodiment it requires the user to initially identify the area of interest and to reset a threshold parameter if the default gives unacceptable results. The basic process that is used is an old, tried-and-true method that determines how well the two images match. This process is called cross-correlation. It gives a single number, the correlation coefficient, that is equal to 1.0 if the images are perfectly matched, is equal to 0.0 if the images have nothing in common, and is equal to -1.0 if one image is the negative of the other. This basic process is used by many image stabilization methods. With VISAR we use it in a manner that provides statistical information needed to best determine orientation and magnification.
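The correlation coefficient described above can be computed directly. The sketch below treats two small grayscale images as flat lists of pixel values (the images and function name are illustrative, and VISAR's actual statistics are of course more elaborate):

```python
# Normalized cross-correlation of two equal-size images: +1 for a perfect
# match, near 0 for unrelated images, -1 when one is the negative of the other.
from math import sqrt

def correlation(a, b):
    """Pearson correlation coefficient of two pixel sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den

img = [10, 50, 90, 130, 170, 210]
print(correlation(img, img))                     # -> 1.0
print(correlation(img, [255 - p for p in img]))  # -> -1.0
```

A stabilizer evaluates this coefficient over candidate shifts, rotations, and zooms of one image against the other and keeps the transform that maximizes it.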

The project identified and quantified ways to reduce the cost of hydrogen liquefaction, and reduce the cost of hydrogen distribution. The goal was to reduce the power consumption by 20% and then to reduce the capital cost. Optimizing the process, improving process equipment, and improving ortho-para conversion significantly reduced the power consumption of liquefaction, but by less than 20%. Because the efficiency improvement was less than the target, the program was stopped before the capital cost was addressed. These efficiency improvements could provide a benefit to the public to improve the design of future hydrogen liquefiers. The project increased the understanding of hydrogen liquefaction by modeling different processes and thoroughly examining ortho-para separation and conversion. The process modeling provided a benefit to the public because the project incorporated para hydrogen into the process modeling software, so liquefaction processes can be modeled more accurately than using only normal hydrogen. Adding catalyst to the first heat exchanger, a simple method to reduce liquefaction power, was identified, analyzed, and quantified. The demonstrated performance of ortho-para separation is sufficient for at least one identified process concept to show reduced power cost when compared to hydrogen liquefaction processes using conventional ortho-para conversion. The impact of improved ortho-para conversion can be significant because ortho-para conversion uses about 20-25% of the total liquefaction power, but performance improvement is necessary to realize a substantial benefit. Most of the energy used in liquefaction is for gas compression. Improvements in hydrogen compression will have a significant impact on overall liquefier efficiency. Improvements to turbines, heat exchangers, and other process equipment will have less impact.

This final report documents the development and installation of software and hardware for Robotic Welding Process Control. Primary emphasis is on serial communications between the CYRO 750 robotic welder, Heurikon minicomputer running Hunter & Ready VRTX, and an IBM PC/AT, for offline programming and control and closed-loop welding control. The requirements for completion of the implementation of the Rocketdyne weld tracking control are discussed. The procedure for downloading programs from the Intergraph, over the network, is discussed. Conclusions are made on the results of this task, and recommendations are made for efficient implementation of communications, weld process control development, and advanced process control procedures using the Heurikon.

Neural Analog Information Processing (NAIP) is an effort to develop general purpose pattern classification architectures based upon biological information processing principles. This paper gives an overview of NAIP and its relationship to the previous work in neural modeling from which its fundamental principles are derived. It also presents a theorem concerning the stability of response of a slab (a two dimensional array of identical simple processing units) to time-invariant (spatial) patterns. An experiment (via computer emulation) demonstrating classification of a spatial pattern by a simple, but complete NAIP architecture is described. A concept for hardware implementation of NAIP architectures is briefly discussed.
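The slab construct above, a two-dimensional array of identical simple processing units, can be pictured with a toy model. The unit function below (a thresholded sum over a small receptive field) is invented for illustration; the NAIP units themselves are not specified in this abstract:

```python
# A "slab": every position applies the same local unit function to the
# input pattern. Here each unit sums its own input plus its left/right
# neighbors and fires (outputs 1) if the sum reaches a threshold.
def slab_response(pattern, threshold=1.0):
    out = []
    for row in pattern:
        row_out = []
        for c in range(len(row)):
            s = sum(row[max(0, c - 1):c + 2])  # unit's local receptive field
            row_out.append(1 if s >= threshold else 0)
        out.append(row_out)
    return out

pattern = [[0, 1, 0],
           [0, 0, 0]]
print(slab_response(pattern))  # -> [[1, 1, 1], [0, 0, 0]]
```

Because every unit computes a fixed function of a time-invariant input, re-applying the slab to the same spatial pattern reproduces the same response, which is the stability property the paper's theorem concerns (here trivially, since this toy slab has no internal dynamics).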

A new powdered-carbon treatment process is being developed for the elimination of the present problems, associated with the disposal of biologically active sewage waste solids, and with water reuse. This counter-current flow process produces an activated carbon, which is obtained from the pyrolysis of the sewage solids, and utilizes this material to remove the adulterating materials from the water. Additional advantages of the process are the elimination of odors, the removal of heavy metals, and the potential for energy conservation.

Robot contour processes include those with contact force, like car body grinding or deburring of complex castings, as well as those with little or no contact force, like inspection. This paper describes ways of characterizing, identifying, and estimating contours and robot trajectories. Contour and robot are modeled as stochastic processes in order to emphasize that both successive robot cycles and successive industrial workpieces are similar but not exactly the same. The stochastic models can be used to identify the state of a workpiece or process, or to design a filter to estimate workpiece shape and robot position from robot-based measurements.
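
The filter-design idea in this record can be sketched with a scalar Kalman filter: the contour is modeled as a slowly drifting stochastic process and fused with noisy robot-based measurements. This is a minimal illustration of the approach, not the paper's actual model; the random-walk process model and the noise variances are assumptions.

```python
import numpy as np

def kalman_contour_estimate(measurements, process_var=1e-4, meas_var=1e-2):
    """Estimate a slowly varying contour height from noisy robot-based
    measurements with a scalar Kalman filter (random-walk process model)."""
    x, p = measurements[0], 1.0          # initial state and covariance
    estimates = []
    for z in measurements:
        p += process_var                 # predict: the contour drifts slowly
        k = p / (p + meas_var)           # Kalman gain
        x += k * (z - x)                 # correct with the measurement residual
        p *= 1.0 - k
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(0)
noisy = 5.0 + rng.normal(0, 0.1, 200)    # a flat contour seen through sensor noise
est = kalman_contour_estimate(noisy)
print(est[-1])                           # settles near the true height of 5.0
```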

This invention relates to apparatus for processing municipal waste, and more particularly to vibrating mesh screen conveyor systems for removing grit, glass, and other noncombustible materials from dry municipal waste. Municipal waste must be properly processed and disposed of so that it does not create health risks to the community. Generally, municipal waste, which may be collected in garbage trucks, dumpsters, or the like, is deposited in processing areas such as landfills. Land and environmental controls imposed on landfill operators by governmental bodies have increased in recent years, however, making landfill disposal of solid waste materials more expensive. 6 figs.

A heat sealing process was developed by SEBRA based on technology that originated in work with NASA's Jet Propulsion Laboratory. The project involved connecting and transferring blood and fluids between sterile plastic containers while maintaining a closed system. SEBRA markets the PIRF Process to manufacturers of medical catheters. It is a precisely controlled method of heating thermoplastic materials in a mold to form or weld catheters and other products. The process offers advantages in fast, precise welding or shape forming of catheters as well as applications in a variety of other industries.

There is described an improved coal liquefaction quenching process which prevents the formation of coke with a minimum reduction of thermal efficiency of the coal liquefaction process. In the process, the rapid cooling of the liquid/solid products of the coal liquefaction reaction is performed without the cooling of the associated vapor stream to thereby prevent formation of coke and the occurrence of retrograde reactions. The rapid cooling is achieved by recycling a subcooled portion of the liquid/solid mixture to the lower section of a phase separator that separates the vapor from the liquid/solid products leaving the coal reactor.

The purpose of this project was to determine the impact of a new breakthrough technology, ultrasonic processing, on various industries, including steel, aluminum, metal casting, and forging. The specific goals of the project were to evaluate core principles and establish quantitative bases for the ultrasonic processing of materials, and to demonstrate key applications in the areas of grain refinement of alloys during solidification and degassing of alloy melts. This study focused on two classes of materials - aluminum alloys and steels - and demonstrated the application of ultrasonic processing during ingot casting.

Chemical processes presented in this document include cleaning, pickling, surface finishes, chemical milling, plating, dry film lubricants, and polishing. All types of chemical processes applicable to aluminum, for example, are to be found in the aluminum alloy section. There is a separate section for each category of metallic alloy plus a section for non-metals, such as plastics. The refractories, super-alloys, and titanium are prime candidates for the space shuttle; therefore, the chemical processes applicable to these alloys are contained in individual sections of this manual.

"Peen Plating," a NASA developed process for applying molybdenum disulfide, is the key element of Techniblast Co.'s SURFGUARD process for applying high strength solid lubricants. The process requires two machines -- one for cleaning and one for coating. The cleaning step allows the coating to be bonded directly to the substrate to provide a better "anchor." The coating machine applies a half a micron thick coating. Then, a blast gun, using various pressures to vary peening intensities for different applications, fires high velocity "media" -- peening hammers -- ranging from plastic pellets to steel shot. Techniblast was assisted by Rural Enterprises, Inc. Coating service can be performed at either Techniblast's or a customer's facility.

This paper describes work using historical film material, including what is believed to be the world's first feature length film. The digital processing of historical film material permits many new facilities: digital restoration, electronic storage, automated indexing, and electronic delivery to name a few. Although the work aims ultimately to support all of the previously mentioned facilities, this paper concentrated upon automatic scene change detection, brightness correction, and frame registration. These processes are fundamental to a more complete and complex processing system, but, by themselves, could be immediately used in computer-assisted film cataloging.
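
Of the three processes named above, automatic scene change detection is the most self-contained. The paper's algorithm is not given here; a basic histogram-difference cut detector, a common baseline for this task, can be sketched as:

```python
import numpy as np

def scene_changes(frames, threshold=0.5):
    """Flag frame indices where the grey-level histogram changes sharply,
    a simple cut detector of the kind used in film cataloguing."""
    cuts, prev_hist = [], None
    for i, frame in enumerate(frames):
        hist, _ = np.histogram(frame, bins=16, range=(0, 256))
        hist = hist / hist.sum()
        if prev_hist is not None:
            # L1 distance between normalised histograms lies in [0, 2]
            if np.abs(hist - prev_hist).sum() > threshold:
                cuts.append(i)
        prev_hist = hist
    return cuts

# Synthetic clip: five dark frames, then a hard cut to five bright frames
dark = [np.full((8, 8), 20) for _ in range(5)]
bright = [np.full((8, 8), 220) for _ in range(5)]
print(scene_changes(dark + bright))  # the cut is reported at frame 5
```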

In this paper, we propose a color image processing method by combining modern signal processing techniques with knowledge about the properties of the human color vision system. Color signals are processed differently according to their visual importance. The emphasis of the technique is on preserving the total visual quality of the image while simultaneously taking computational efficiency into account. A specific color image enhancement technique, termed Hybrid Vector Median Filtering, is presented. Computer simulations have been performed to demonstrate that the new approach is technically sound and results are comparable to or better than traditional methods.
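
The Hybrid Vector Median Filter itself is not specified in this abstract; the plain vector median filter it builds on, which treats each RGB pixel as a vector and rejects impulsive color noise, can be sketched as:

```python
import numpy as np

def vector_median(window):
    """Return the pixel in `window` (N x 3 RGB vectors) whose summed
    Euclidean distance to all the others is smallest - the vector median."""
    d = np.linalg.norm(window[:, None, :] - window[None, :, :], axis=2)
    return window[d.sum(axis=1).argmin()]

# An impulse-corrupted 3x3 neighbourhood: eight red pixels, one noise pixel
window = np.array([[200, 0, 0]] * 8 + [[0, 255, 0]], dtype=float)
print(vector_median(window))  # the green outlier is rejected
```

Unlike filtering each channel separately, the vector median always returns one of the input pixels, so no new (possibly unnatural) colors are introduced.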

This paper reviews innovative industrial materials processes that have the potential for significant improvements in energy use, yet require long-term research to achieve that potential. Potential revolutionary alternatives are reviewed for the following industries: iron and steel; aluminum; petroleum refining; paper and pulp; food and kindred products; stone, clay and glass; textiles; and chemicals. In total, 45 candidate processes were identified. Examples of these processes include direct steelmaking and ore-to-powder systems that potentially require 30% and 40% less energy, respectively, than conventional steelmaking systems; membrane separations and freeze crystallization that offer up to 90% reductions in energy use when compared with distillation; cold processing of cement that offers a 50% reduction in energy requirements; and dry forming of paper that offers a 25% reduction in the energy needed for papermaking.

Describes the diversity of plants. Outlines novel developmental and complex genetic processes that are specific to plants. Identifies approaches that can be used to solve problems in plant biology. Cites the advantages of using higher plants for experimental systems. (RT)

Evaporation has been an established technology in the metal finishing industry for many years. In this process, wastewaters containing reusable materials, such as copper, nickel, or chromium compounds are heated, producing a water vapor that is continuously removed and condensed....

Presents a literature review of the petroleum processing wastes, covering publications of 1977. This review covers studies such as the use of activated carbon in petroleum and petrochemical waste treatment. A list of 15 references is also presented. (HM)

Advances in electronics and computer science have enabled industries (pulp/paper, iron/steel, petroleum/chemical) to attain better control of their processes with resulting increases in quality, productivity, profitability, and compliance with government regulations. (JN)

An overview of three main types of simulation approach (explanatory, abstraction, and estimation) is presented, along with a discussion of their capabilities, limitations, and the steps required for their validation. A process model being developed through the Forest Response Prog...

LASER is an acronym for Light Amplification by Stimulated Emission of Radiation. Laser light has unique properties that distinguish it from ordinary light: it is highly coherent and monochromatic, has negligible divergence and scattering loss, and forms an intense beam of electromagnetic radiation. Lasers also span a wide range of wavelengths/frequencies (from ultraviolet to infrared), energies/powers, and beam modes/configurations. Because of these properties, lasers find wide application in ceramic processing for industrial manufacturing and in the fabrication of electronic circuits, for tasks such as marking, serializing, engraving, cutting, and micro-structuring: the laser produces only localized heating, with no contact and no thermal stress on the part during processing, so there is no risk of the fracturing that occurs during mechanical sawing, and the cost of processing is also reduced. The discussion in this paper highlights the applications of lasers in ceramics processing.

Integration of Advanced Technologies will Update Ethylene Plants. Nearly 93 million tons of ethylene are produced annually in chemical plants worldwide, using an energy intensive process that consumes 2.5 quadrillion Btu per year.

Irradiation of high-energy ultrasonic vibration in metals and alloys generates oscillating strain and stress fields in solids, and introduces nonlinear effects such as cavitation, acoustic streaming, and radiation pressure in molten materials. These nonlinear effects can be utilized to assist conventional material processing processes. This article describes recent research at Oak Ridge National Laboratory and Purdue University on using high-intensity ultrasonic vibrations for degassing molten aluminum, processing particulate-reinforced metal matrix composites, refining metals and alloys during solidification and welding, and producing bulk nanostructures in solid metals and alloys. Research results suggest that high-intensity ultrasonic vibration is capable of degassing and dispersing small particles in molten alloys, reducing grain size during alloy solidification, and inducing nanostructures in solid metals.

An associative list processing unit and method comprising employing a plurality of prioritized cell blocks and permitting inserts to occur in a single clock cycle if all of the cell blocks are not full.
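
A software analogue of the claimed structure can illustrate the idea: inserts go to the highest-priority cell block with free space, and lookups examine all blocks associatively. The hardware single-clock-cycle detail is abstracted away, and the class name, block count, and block size are assumptions for illustration.

```python
class PrioritizedCellBlocks:
    """Software analogue of an associative list unit built from a
    plurality of prioritized, fixed-capacity cell blocks."""
    def __init__(self, n_blocks=4, block_size=4):
        self.blocks = [[] for _ in range(n_blocks)]
        self.block_size = block_size

    def insert(self, key, value):
        for block in self.blocks:          # blocks scanned in priority order
            if len(block) < self.block_size:
                block.append((key, value))
                return True
        return False                       # every block full: insert rejected

    def search(self, key):
        # associative lookup: all blocks examined for a matching key
        for block in self.blocks:
            for k, v in block:
                if k == key:
                    return v
        return None

unit = PrioritizedCellBlocks()
unit.insert("x", 1)
unit.insert("y", 2)
print(unit.search("y"))  # -> 2
```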

Cooperative processing for the 1990's using client-server technology is addressed. The main theme is concepts of downsizing from mainframes and minicomputers to workstations on a local area network (LAN). This document is presented in view graph form.

Two types of space processing operations may be considered economically justified; they are manufacturing operations that make profits and experiment operations that provide needed applied research results at lower costs than those of alternative methods. Some examples from the Skylab experiments suggest that applied research should become cost effective soon after the space shuttle and Spacelab become operational. In space manufacturing, the total cost of space operations required to process materials must be repaid by the value added to the materials by the processing. Accurate estimates of profitability are not yet possible because shuttle operational costs are not firmly established and the markets for future products are difficult to estimate. However, approximate calculations show that semiconductor products and biological preparations may be processed on a scale consistent with market requirements and at costs that are at least compatible with profitability using the Shuttle/Spacelab system.
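
The repayment criterion above reduces to simple break-even arithmetic. The figures below are hypothetical, chosen only to illustrate the calculation, and are not taken from the Skylab or Shuttle cost studies:

```python
# Hypothetical break-even sketch for the "value added must repay the cost
# of space operations" criterion; all figures are assumed for illustration.
def breakeven_value_added(flight_cost, payload_kg, yield_fraction):
    """Minimum value added per kg of product for a flight to break even."""
    return flight_cost / (payload_kg * yield_fraction)

# A $20M flight processing 100 kg of feedstock at 80% usable yield
print(breakeven_value_added(20e6, 100, 0.8))  # -> 250000.0 ($/kg)
```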

An improved multistep liquefaction process for organic carbonaceous matter which produces a virtually completely solvent-soluble carbonaceous liquid product. The solubilized product may be more amenable to further processing than liquid products produced by current methods. In the initial processing step, the finely divided organic carbonaceous material is treated with a hydrocarbonaceous pasting solvent containing from 10% to 100% by weight process-derived phenolic species at a temperature within the range of 300 C to 400 C, typically for 2 minutes to 120 minutes, in the presence of a carbon monoxide reductant and an optional hydrogen sulfide reaction promoter in an amount ranging from 0 to 10% by weight of the moisture- and ash-free organic carbonaceous material fed to the system. As a result, hydrogen is generated via the water/gas shift reaction at a rate necessary to prevent condensation reactions. In a second step, the reaction product of the first step is hydrogenated.

Background information and exercises are provided to: (1) establish or expand understanding of the concepts, methods, and terminology of computer processing of image producing data; (2) develop insight into the advantages of computer based image processing compared with the photointerpretation approach for processing, classifying, interpreting, and applying remote sensing data; (3) foster a broad perspective on the principles of the main techniques for image enhancement, pattern recognition, and thematic classification; (4) appreciate the pros and cons of batch and interactive modes of image analysis; (5) examine and evaluate some specific computer generated products for subscenes in Pennsylvania and New Jersey; and (6) interrelate these particular examples of output with more theoretical explanations of computer processing strategies and procedures.

A process for removing phenols from an aqueous solution is provided, which comprises the steps of contacting a mixture comprising the solution and a metal oxide, forming a phenol metal oxide complex, and removing the complex from the mixture.

George is a fast and flexible library, implemented in C++ with Python bindings, for Gaussian Process regression useful for accounting for correlated noise in astronomical datasets, including those for transiting exoplanet discovery and characterization and stellar population modeling.
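
The core computation George accelerates, Gaussian Process posterior-mean prediction with a squared-exponential kernel, can be sketched in plain NumPy. This is a simplified stand-in, not George's actual API; the kernel choice and hyperparameter values are assumptions.

```python
import numpy as np

def gp_predict(x_train, y_train, x_test, length=1.0, noise=0.1):
    """Gaussian Process regression posterior mean with a squared-exponential
    kernel - a minimal NumPy sketch of the computation George speeds up in C++."""
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)
    K = k(x_train, x_train) + noise ** 2 * np.eye(len(x_train))
    alpha = np.linalg.solve(K, y_train)   # K^{-1} y, solved rather than inverted
    return k(x_test, x_train) @ alpha     # posterior mean at the test points

x = np.linspace(0, 2 * np.pi, 30)
y = np.sin(x) + 0.05 * np.random.default_rng(1).normal(size=30)
mean = gp_predict(x, y, np.array([np.pi / 2]))
print(mean[0])  # close to sin(pi/2) = 1, despite the correlated noise model
```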

... noisy) listening situations and may have difficulties with reading, spelling, attention, and language problems. APD is common in older adults, particularly when hearing loss is present. It is likely that many processes and problems contribute to APD in children. In ...

Described is a graduate level engineering course offered at the University of Southern California on coal liquefaction processes. Lecture topics and course requirements are discussed. A 64-item bibliography of papers used in place of a textbook is included. (BT)

Hydrocracking processes convert aromatic gas oils into high quality gasoline, diesel, and turbine stocks. They operate at high hydrogen pressures, typically greater than 1500 psig. Operating temperatures range from 600-700 °F (315-382 °C). Commercial catalysts vary in activity and selectivity, allowing process designers to emphasize middle distillates, naphtha, or both. Catalysts are quite stable in use, with two year unit run lengths typical. A pretreatment step to remove nitrogen compounds is usually part of the same process unit. These HDN units operate integrally with the hydrocracking. The hydrogenation reactions are strongly exothermic, while the cracking is roughly thermally neutral. This combination can lead to temperature runaways. To avoid this, cold hydrogen is injected at several points in hydrocracking reactors. The mechanics of mixing this hydrogen with the oil and redistributing the mixture over the catalyst bed are very important in controlling process operation and ensuring long catalyst life.
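
The cold-hydrogen quench can be illustrated with a simple sensible-heat balance on one injection point. Every number below (flow rates, heat capacities, temperatures) is assumed for illustration and is not taken from this abstract:

```python
# Illustrative quench-hydrogen balance for one hydrocracker bed outlet:
# cold H2 absorbs the sensible heat needed to pull the mixed stream back
# down to the next bed's inlet temperature. All values are assumed.
def quench_h2_rate(oil_rate_kg_s, cp_oil, t_bed_out, t_target, cp_h2, t_h2):
    """Cold-hydrogen mass rate (kg/s) that cools the reactor effluent
    from t_bed_out back to t_target by simple adiabatic mixing."""
    heat_to_remove = oil_rate_kg_s * cp_oil * (t_bed_out - t_target)  # kW
    heat_per_kg_h2 = cp_h2 * (t_target - t_h2)                        # kJ/kg
    return heat_to_remove / heat_per_kg_h2

# 50 kg/s of oil (cp ~ 2.3 kJ/kg.K) at 382 C, cooled to 345 C with 60 C hydrogen
rate = quench_h2_rate(50, 2.3, 382, 345, 14.3, 60)
print(round(rate, 2), "kg/s of quench hydrogen")
```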

The Mobil Olefin to Gasoline and Distillate (MOGD) process is described, in which light olefinic compounds can be converted to high quality gasoline and distillate. This process, now ready for commercialization, is based on a unique synthetic zeolite catalyst, which shape-selectively oligomerizes light olefins to higher molecular weight iso-olefins. The highly flexible process can be designed to produce distillate/gasoline ratios of 0/100 to 90/10 for a commercial plant, depending on market requirements. MOGD is applicable to a wide range of feed streams ranging from ethylene to 400 °F end-point olefinic naphtha. The process has been tested using commercially produced catalyst in refinery-scale equipment.

Some conclusions of this presentation are: (1) Radiation-assisted nanotechnology applications will continue to grow; (2) The APPF will provide a unique focus for radiolytic processing of nanomaterials in support of DOE-DP, other DOE and advanced manufacturing initiatives; (3) Gamma, X-ray, e-beam and ion beam processing will increasingly be applied for 'green' manufacturing of nanomaterials and nanocomposites; and (4) Biomedical science and engineering may ultimately be the biggest application area for radiation-assisted nanotechnology development.

A treatment process for a hydrogen-containing off-gas stream from a refinery, petrochemical plant or the like. The process includes three separation steps: condensation, membrane separation and hydrocarbon fraction separation. The membrane separation step is characterized in that it is carried out under conditions at which the membrane exhibits a selectivity in favor of methane over hydrogen of at least about 2.5.

Partly-digital, partly-optical 'hybrid' image processing attempts to use the properties of each domain to synergistic advantage: while Fourier optics furnishes speed, digital processing allows the use of much greater algorithmic complexity. The video-rate image-coordinate transformation used is a critical technology for real-time hybrid image-pattern recognition. Attention is given to the separation of pose variables, image registration, and both single- and multiple-frame registration.
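
Frame registration, one of the steps named above, can be sketched digitally with phase correlation. This is an assumed, classical FFT-based method for illustration; the paper's video-rate hybrid optical implementation differs:

```python
import numpy as np

def register_translation(a, b):
    """Estimate the integer (row, col) shift between two frames by phase
    correlation: the normalised cross-power spectrum peaks at the offset."""
    F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    idx = np.unravel_index(corr.argmax(), corr.shape)
    # wrap shifts larger than half the frame into negative offsets
    return tuple(i if i <= s // 2 else i - s for i, s in zip(idx, a.shape))

rng = np.random.default_rng(0)
frame = rng.random((32, 32))
shifted = np.roll(frame, (3, -5), axis=(0, 1))  # circular shift of the frame
print(register_translation(shifted, frame))     # recovers (3, -5)
```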

Barnett Banks of Florida, Inc. operates 150 banking offices in 80 Florida cities. Banking offices have computerized systems for processing deposits or withdrawals in checking/savings accounts, and for handling commercial and installment loan transactions. In developing a network engineering design for the terminals used in record processing, an affiliate, Barnett Computing Company, used COSMIC's STATCOM program. This program provided a reliable network design tool and avoided the cost of developing new software.

An improved coal liquefaction process is provided which enables conversion of a coal-oil slurry to a synthetic crude refinable to produce larger yields of gasoline and diesel oil. The process is characterized by a two-step operation applied to the slurry prior to catalytic desulfurization and hydrogenation in which the slurry undergoes partial hydrogenation to crack and hydrogenate asphaltenes and the partially hydrogenated slurry is filtered to remove minerals prior to subsequent catalytic hydrogenation.

A zero gravity processing furnace system was designed that will allow acquisition of photographic or other visual information while the sample is being processed. A low temperature (30 to 400 C) test model with a flat specimen heated by quartz-halide lamps was constructed. A high temperature (400 to 1000 C) test model heated by resistance heaters, utilizing a cylindrical specimen and optics, was also built. Each of the test models is discussed in detail. Recommendations are given.

This invention relates to an improved process for the production of liquid carbonaceous fuels and solvents from carbonaceous solid fuels, especially coal. The claimed improved process includes the hydrocracking of the light SRC mixed with a suitable hydrocracker solvent. The recycle of the resulting hydrocracked product, after separation and distillation, is used to produce a solvent for the hydrocracking of the light solvent refined coal.

To convert raw data into environmental products, the National Weather Service and other organizations use the Global 9000 image processing system marketed by Global Imaging, Inc. The company's GAE software package is an enhanced version of the TAE, developed by Goddard Space Flight Center to support remote sensing and image processing applications. The system can be operated in three modes and is combined with HP Apollo workstation hardware.

A process of preparing spherical high bulk density nitroguanidine by dissolving low bulk density nitroguanidine in N-methyl pyrrolidone at elevated temperatures and then cooling the solution to lower temperatures as a liquid characterized as a nonsolvent for the nitroguanidine is provided. The process is enhanced by inclusion in the solution of from about 1 ppm up to about 250 ppm of a metal salt such as nickel nitrate, zinc nitrate or chromium nitrate, preferably from about 20 to about 50 ppm.

A multi-dimensional model of the Resin Transfer Molding (RTM) process was developed for the prediction of the infiltration behavior of a resin into an anisotropic fiber preform. Frequency dependent electromagnetic sensing (FDEMS) was developed for in-situ monitoring of the RTM process. Flow visualization and mold filling experiments were conducted to verify sensor measurements and model predictions. Test results indicated good agreement between model predictions, sensor readings, and experimental data.
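
The infiltration behavior such models predict follows Darcy's law; for a 1-D preform under constant injection pressure the fill time has a simple closed form. The property values below are assumed, order-of-magnitude RTM numbers, not figures from this study:

```python
def fill_time_1d(length_m, permeability, porosity, viscosity, delta_p):
    """Time for a resin front to traverse a 1-D preform at constant
    injection pressure, from Darcy's law: t = phi * mu * L^2 / (2 * K * dP)."""
    return porosity * viscosity * length_m ** 2 / (2 * permeability * delta_p)

# Assumed values: 0.5 m flow length, K = 1e-10 m^2 preform permeability,
# porosity 0.5, resin viscosity 0.1 Pa.s, 2 bar injection pressure
t = fill_time_1d(0.5, 1e-10, 0.5, 0.1, 2e5)
print(round(t, 1), "s")  # the sqrt(t) front kinetics give ~312.5 s
```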

The information contained in this report is intended to supplement the original Environmental Impact Statement (EIS) for the Defense Waste Processing Facility (DWPF). Since the original EIS in 1982, alterations have been made to the conceptual process that reduce the impact to the groundwater. This reduced impact is documented in this report along with an update of the understanding of seismology and geology of the Savannah River Site. 6 refs., 2 figs., 2 tabs.

Addressing the increasing importance for firms to have a thorough knowledge of statistically based quality control procedures, this book presents the fundamentals of statistical process control (SPC) in a non-mathematical, practical way. It provides real-life examples and data drawn from a wide variety of industries. The foundations of good quality management and process control, and control of conformance and consistency during production are given. Offers clear guidance to those who wish to understand and implement modern SPC techniques.
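
The X-bar chart at the heart of SPC can be sketched in a few lines. This simplified version estimates the standard error of the subgroup mean directly, rather than using the textbook A2 or c4 chart constants:

```python
import numpy as np

def xbar_limits(samples):
    """Shewhart X-bar chart limits from rational subgroups: centre line
    plus/minus three standard errors of the subgroup mean."""
    means = samples.mean(axis=1)
    grand = means.mean()                                   # centre line
    se = samples.std(axis=1, ddof=1).mean() / np.sqrt(samples.shape[1])
    return grand - 3 * se, grand, grand + 3 * se

rng = np.random.default_rng(2)
subgroups = rng.normal(10.0, 0.2, size=(25, 5))  # 25 subgroups of 5 parts
lcl, cl, ucl = xbar_limits(subgroups)
print(lcl < 10.0 < ucl)  # an in-control process mean sits inside the limits
```

Subgroup means that fall outside (lcl, ucl) signal an assignable cause worth investigating, which is the "control of conformance and consistency during production" the book covers.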

A software system design is proposed and demonstrated with pilot-project software. The system permits the Apple II microcomputer to be used for personalized computer-assisted instruction in the digital image processing of LANDSAT images. The programs provide data input, menu selection, graphic and hard-copy displays, and both general and detailed instructions. The pilot-project results are considered to be successful indicators of the capabilities and limits of microcomputers for digital image processing education.

Meteorologists at NASA's Goddard Space Flight Center are conducting an extensive program of research in weather and climate related phenomena. This paper focuses on meteorological image processing applications directed toward gaining a detailed understanding of severe weather phenomena. In addition, the paper discusses the ground data handling and image processing systems used at the Goddard Space Flight Center to support severe weather research activities and describes three specific meteorological studies which utilized these facilities.

Gaia is the European Space Agency's (ESA's) ambitious space astrometry mission with a main objective to map astrometrically and spectro-photometrically not less than 1000 million celestial objects in our galaxy with unprecedented accuracy. The announcement of opportunity (AO) for the data processing will be issued by ESA late in 2006. The Gaia Data Processing and Analysis Consortium (DPAC) has been formed recently and is preparing an answer to this AO. The satellite will downlink around 100 TB of raw telemetry data over a mission duration of 5--6 years. To achieve its required astrometric accuracy of a few tens of microarcseconds, a highly involved processing of this data is required. In addition to the main astrometric instrument Gaia will host a radial-velocity spectrometer and two low-resolution dispersers for multi-color photometry. All instrument modules share a common focal plane consisting of a CCD mosaic about 1 m^2 in size and featuring close to 10^9 pixels. Each of the various instruments requires relatively complex processing while at the same time being interdependent. We describe the composition and structure of the DPAC and the envisaged overall architecture of the system. We shall delve further into the core processing---one of the nine so-called coordination units comprising the Gaia processing system.

In many facilities, energy management is simply a matter of managing the energy required for lighting and space conditioning. In many others, however, energy management is much more complex and involves large motors and controls, industrial insulation, complex combustion monitoring, unique steam distribution problems, significant amounts of waste heat, etc. Typical facilities offering large energy management opportunities include industrial facilities, large office and commercial operations, government institutions such as schools, hospitals and prisons. Such facilities generally have specialized industrial, commercial or institutional processes that incorporate many of the concepts covered in other chapters. These processes require thorough analytical evaluations to determine the appropriate energy-saving measures. This chapter provides some examples. In this chapter the authors present a suggested procedure for process energy improvement. Then, motors and controls are discussed since they form an integral part of most processes. Next, some sample case studies of process energy management opportunities are provided. Finally, the authors outline some common process activities where better energy management can be practiced. Air compressors are also discussed.

Synroc is a titanate-based ceramic material currently being developed for immobilizing high-level nuclear reactor wastes in solid form. Synroc D is a unique variation of Synroc. It can contain the high-level defense wastes, particularly those in storage at the Savannah River Plant. In this report, we review the early development of the initial Synroc process, discuss modification and other options that simplify it overall, and recommend the future direction of research and development in the processing area. A reference Synroc process is described briefly and contrasted with the Savannah River Laboratory glass-based reference case. Preliminary engineering layouts show Synroc to be a more complex processing operation and, thus, more expensive than the glass-based process. However, we believe that simplifications, which will significantly reduce the cost difference, are possible. Further research and development will continue in the areas of slurry processing, fluidized bed calcination, and mineralization. This last will use sintering, hot uniaxial pressing, or hot isostatic pressing.

This paper is a summary of the activities performed for the process development of laser thermal forming sheet metal parts in support of rapid prototyping. A 400 watt pulsed Nd:YAG laser and a 50 watt desktop CO2 laser were used during initial process development. Several tool-assisted laser forming approaches were conceived during the development of the process, and simple fixtures for process development/understanding were used throughout all testing. Much of the actual forming was performed with the base material in an unfixtured state. CRES (304) was used for baseline development, but the effort was directed toward forming titanium (e.g., 6Al-4V, 15V-3Cr-3Sn-3Al). Several DOE (i.e., Design of Experiment) techniques were employed during development and a Neural Net Computer Model was conceived for process control. This program was a joint effort in cooperation with the American Welding Society under contract with the Defense Advanced Research Projects Agency (DARPA). A synopsis of the laser forming process development, future opportunities, and applications is presented.

Studsvik has completed over four years of operation at its Erwin, TN facility. During this time period Studsvik processed over 3.3 million pounds (1.5 million kgs) of radioactive ion exchange bead resin, powdered filter media, and activated carbon, which comprised a cumulative total activity of 18,852.5 Ci (6.98E+08 MBq). To date, the highest radiation level for an incoming resin container has been 395 R/hr (3.95 Sv/h). The Studsvik Processing Facility (SPF) has the capability to safely and efficiently receive and process a wide variety of solid and liquid Low Level Radioactive Waste (LLRW) streams including: Ion Exchange Resins (IER), activated carbon (charcoal), graphite, oils, solvents, and cleaning solutions with contact radiation levels of up to 400 R/hr (4.0 Sv/h). The licensed and heavily shielded SPF can receive and process liquid and solid LLRWs with high water and/or organic content. This paper provides an overview of the last four years of commercial operations processing radioactive LLRW from commercial nuclear power plants. Process improvements and lessons learned will be discussed.

A unique process cycle and apparatus design separates the consumer (cryogenic) load return flow from most of the recycle return flow of a refrigerator and/or liquefier process cycle. The refrigerator and/or liquefier process recycle return flow is recompressed by a multi-stage compressor set and the consumer load return flow is recompressed by an independent consumer load compressor set that maintains a desirable constant suction pressure using a consumer load bypass control valve and the consumer load return pressure control valve that controls the consumer load compressor's suction pressure. The discharge pressure of this consumer load compressor is thereby allowed to float at the intermediate pressure in between the first and second stage recycle compressor sets. Utilizing the unique gas management valve regulation, the unique process cycle and apparatus design in which the consumer load return flow is separate from the recycle return flow, the pressure ratios of each recycle compressor stage and all main pressures associated with the recycle return flow are allowed to vary naturally, thus providing a naturally regulated and balanced floating pressure process cycle that maintains optimal efficiency at design and off-design process cycle capacity and conditions automatically.

A process for forming large-grain polycrystalline films from amorphous films for use as photovoltaic devices. The process operates on the amorphous film and uses the driving force inherent to the transition from the amorphous state to the crystalline state as the force which drives the grain growth process. The resultant polycrystalline film is characterized by a grain size that is greater than the thickness of the film. A thin amorphous film is deposited on a substrate. The formation of a plurality of crystalline embryos is induced in the amorphous film at predetermined spaced apart locations and nucleation is inhibited elsewhere in the film. The crystalline embryos are caused to grow in the amorphous film, without further nucleation occurring in the film, until the growth of the embryos is halted by impingement on adjacently growing embryos. The process is applicable to both batch and continuous processing techniques. In either type of process, the thin amorphous film is sequentially doped with p and n type dopants. Doping is effected either before or after the formation and growth of the crystalline embryos in the amorphous film, or during a continuously proceeding crystallization step.

The use of a 'process of care' is well established in several health professions, most evidently within the field of nursing. Now ingrained within methods of care delivery, it offers a logical approach to problem solving and ensures an appropriate delivery of interventions that are specifically suited to the individual patient. Paramedicine is a rapidly advancing profession despite a wide acknowledgement of limited research provisions. This frequently results in the borrowing of evidence from other disciplines. While this has often been useful, there are many concerns relating to the acceptable limit of evidence transcription between professions. To date, there is no formally recognised 'process of care'-defining activity within the pre-hospital arena. With much current focus on the professional classification of paramedic work, it is considered timely to formally define a process of the kind that underpins other professional roles such as nursing. It is hypothesised that defined processes of care, particularly the nursing process, may have features that would readily translate to pre-hospital practice. The literature analysed was obtained through systematic searches of a range of databases, including Ovid MEDLINE and the Cumulative Index to Nursing and Allied Health Literature (CINAHL). The results demonstrated that the defined process of care provides nursing with more than just a structure for practice, but also has implications for education, clinical governance and professional standing. The current nursing process does not directly articulate to the complex and often unstructured role of the paramedic; however, it has many principles that offer value to the paramedic in their practice. Expanding the nursing process model to include the stages of Dispatch Considerations, Scene Assessment, First Impressions, Patient History, Physical Examination, Clinical Decision-Making, Interventions, Re-evaluation, Transport Decisions, Handover and Reflection would provide an appropriate model for pre-hospital practice.

The Savannah River Site's HB-Line Facility completed a campaign in which fifty-nine cans of neptunium oxide were produced and shipped to the Idaho National Laboratory in the 9975 shipping container. The neptunium campaign was divided into two parts: Part 1, which consisted of oxide made from H-Canyon neptunium solution which did not require any processing prior to conversion into an oxide, and Part 2, which consisted of oxide made from additional H-Canyon neptunium solutions which required processing to purify the solution prior to conversion into an oxide. The neptunium was received as a nitrate solution and converted to oxide through ion-exchange column extraction, precipitation, and calcination. Numerous processing challenges were encountered in order to make a final neptunium oxide product that could be shipped in a 9975 shipping container. Among the challenges overcome was the issue of scale: translating lab scale production into full facility production. The balance between processing efficiency and product quality assurance was addressed during this campaign. Lessons learned from these challenges are applicable to other processing projects.

Thermal spray processing has been used for a number of years to cost-effectively apply TBCs for a wide range of heat engine applications. In particular, bond coats are applied by plasma spray and HVOF techniques and partially-stabilized zirconia top coats are applied by plasma spray methods. Thermal spray involves melting and rapid transport of the molten particles to the substrate, where high-rate solidification and coating build-up occur. It is the very nature of this melt processing that leads to the unique layered microstructure, as well as the apparent imperfections, so readily identified with thermal spray. Modeling the process, process-induced residual stresses, and thermal conductivity will be discussed in light of a new understanding of porosity and its anisotropy. Microcracking can be understood using new approaches, allowing a fuller view of the processing-performance connection. Detailed electron microscopic, novel neutron diffraction and fracture analysis of the deposits can lead to a better understanding of how overall microstructure can be controlled to influence critical properties of the deposited TBC system.

In order to simulate and optimize the microwave sintering of silicon nitride and tungsten carbide/cobalt toolbits, a microwave sintering process model has been built. A cylindrical sintering furnace was used containing a heat insulating layer, a susceptor layer, and an alumina tube containing the green toolbit parts between parallel, electrically conductive, graphite plates. Dielectric and absorption properties of the silicon nitride green parts, the tungsten carbide/cobalt green parts, and an oxidizable susceptor material were measured using perturbation and waveguide transmission methods. Microwave absorption data were measured over a temperature range from 20 degrees C to 800 degrees C. These data were then used in the microwave process model, which assumed plane wave propagation along the radial direction and included the microwave reflection at each interface between the materials and the microwave absorption in the bulk materials. Heat transfer between the components inside the cylindrical sintering furnace was also included in the model. The simulated heating process data for both silicon nitride and tungsten carbide/cobalt samples closely follow the experimental data. By varying the physical parameters of the sintering furnace model, such as the thickness of the susceptor layer, the thickness of the alumina tube wall, the sample load volume and the graphite plate mass, the model predicts their effects, which are helpful in optimizing those parameters in the industrial sintering process. PMID:15323110
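The plane-wave treatment the model assumes, Fresnel reflection at each material interface plus bulk absorption in each layer, can be sketched as follows. The refractive indices, absorption coefficients, and thicknesses below are placeholder values, not the measured dielectric data from the study:

```python
import math

def transmitted_fraction(layers, n_outside=1.0):
    """Fraction of incident plane-wave power surviving passage through a
    stack of layers, counting normal-incidence Fresnel reflection at each
    interface and Beer-Lambert absorption in each bulk layer.
    layers: list of (refractive_index, absorption_coeff_per_m, thickness_m)."""
    power = 1.0
    n_prev = n_outside
    for n, alpha, d in layers:
        r = ((n_prev - n) / (n_prev + n)) ** 2   # power reflection coefficient
        power *= (1.0 - r)                        # fraction entering the layer
        power *= math.exp(-alpha * d)             # absorbed along the path
        n_prev = n
    # exit interface back into the surrounding medium
    r_exit = ((n_prev - n_outside) / (n_prev + n_outside)) ** 2
    return power * (1.0 - r_exit)

# illustrative stack: insulation, susceptor, alumina tube wall
stack = [(1.5, 2.0, 0.02), (3.0, 40.0, 0.005), (3.1, 5.0, 0.004)]
frac = transmitted_fraction(stack)
```

The power absorbed in each layer (the quantity feeding the heat-transfer part of such a model) is the difference between what enters and what leaves it.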

This basic premise--that environmental policy results from a multitude of responses to a variety of problems--explains to some extent the general lack of coordination between major environmental protection programs. It also suggests why there has been so much new legislation and regulation in the last decade and a half. Solutions have been proposed and adopted to address a myriad of problems, from leaking underground storage tanks to catastrophic releases of toxic materials. In the absence of broader environmental goals or policy, these individual solutions often gain a life of their own and their passage becomes a high priority for their sponsors. Given this situation, the best way an individual can influence environmental policy is to become involved in solving a problem--in making one's voice heard in the decision-making process. But influencing the outcome of a decision-making process can be difficult at best, and is impossible without an understanding of the process itself. In addition to knowing the process, it is also important to understand the role played by the professionals involved in the process, the way in which decision makers view the world, and the ways in which a position or opinion on a particular issue can be best brought to the attention of decision makers.

Range Process Simulation Tool (RPST) is a computer program that assists managers in rapidly predicting and quantitatively assessing the operational effects of proposed technological additions to, and/or upgrades of, complex facilities and engineering systems such as the Eastern Test Range. Originally designed for application to space transportation systems, RPST is also suitable for assessing effects of proposed changes in industrial facilities and large organizations. RPST follows a model-based approach that includes finite-capacity schedule analysis and discrete-event process simulation. A component-based, scalable, open architecture makes RPST easily and rapidly tailorable for diverse applications. Specific RPST functions include: (1) definition of analysis objectives and performance metrics; (2) selection of process templates from a process-template library; (3) configuration of process models for detailed simulation and schedule analysis; (4) design of operations-analysis experiments; (5) schedule and simulation-based process analysis; and (6) optimization of performance by use of genetic algorithms and simulated annealing. The main benefits afforded by RPST are provision of information that can be used to reduce costs of operation and maintenance, and the capability for affordable, accurate, and reliable prediction and exploration of the consequences of many alternative proposed decisions.
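The finite-capacity, discrete-event style of analysis RPST performs can be sketched with the standard library alone. The station, arrival times, and process time below are invented for illustration:

```python
import heapq

def simulate(jobs, capacity, proc_time):
    """Discrete-event simulation of one finite-capacity station:
    `jobs` are arrival times, `capacity` is the number of parallel servers,
    `proc_time` is a fixed service time. Returns the makespan."""
    free_at = [0.0] * capacity        # min-heap of server-free times
    heapq.heapify(free_at)
    finish = 0.0
    for arrival in sorted(jobs):
        start = max(arrival, heapq.heappop(free_at))  # wait for a free server
        done = start + proc_time
        heapq.heappush(free_at, done)
        finish = max(finish, done)
    return finish

makespan = simulate(jobs=[0, 0, 1, 2, 5], capacity=2, proc_time=3.0)
```

Chaining several such stations, and searching their parameters with an optimizer such as a genetic algorithm, is the essence of the schedule-analysis-plus-optimization loop the abstract describes.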

This slide presentation reviews the payload processing functions at Kennedy Space Center. It details some of the payloads processed at KSC, the typical processing tasks, the facilities available for processing payloads, and the capabilities and customer services that are available.

Combining accurate fluid property databases with a commercial equation-solving software package running on a desktop computer allows simulation of cryogenic processes without extensive computer programming. Computer simulation can be a powerful tool for process development or optimization. Most engineering simulations to date have required extensive programming skills in languages such as Fortran, Pascal, etc. Authors of simulation code have also usually been responsible for choosing and writing the particular solution algorithm. This paper describes a method of simulating cryogenic processes with a commercial software package on a desktop personal computer that does not require these traditional programming tasks. Applications include modeling of cryogenic refrigerators, heat exchangers, vapor-cooled power leads, vapor pressure thermometers, and various other engineering problems.
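The declarative style described, stating the governing relations and letting a solver find the operating point rather than hand-coding a solution algorithm, can be mimicked with stdlib root finding. The sketch below inverts a vapor-pressure correlation to act as a vapor pressure thermometer; the Antoine-style coefficients are invented, not real fluid-property data:

```python
def antoine_pressure(T, A=3.74, B=80.0, C=-1.5):
    """Illustrative Antoine-style correlation, log10(P) = A - B/(T + C).
    Coefficients are invented; real work would use a fluid-property database."""
    return 10 ** (A - B / (T + C))

def vapor_pressure_thermometer(P_measured, lo=5.0, hi=300.0, tol=1e-9):
    """Invert the correlation by bisection: find T such that P(T) = P_measured.
    Valid because P is monotonically increasing in T over [lo, hi]."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if antoine_pressure(mid) < P_measured:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

T = vapor_pressure_thermometer(antoine_pressure(77.0))  # recovers T = 77 K
```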

The current worldwide demand for lubricating oils has increased research into new technologies to obtain products with better quality, using processes less complicated than the current ones while at the same time decreasing process costs. The most familiar general process to obtain lubricating oils is aromatic extraction with solvent. However, this stage carries elevated cost from raw material consumption; for that reason, the study of new catalytic technologies to substitute this step has increased. In this work we show the latest advances obtained by IMP developments in the application of catalytic hydrogenation of aromatic compounds in lubricating oils, using a catalyst containing molybdenum as the active metal and nickel and/or phosphorus as promoters, supported on gamma alumina with different concentrations of metals. These catalysts have been evaluated in a pilot plant unit using several feeds of lubricating oils at different operating conditions, obtaining products with better quality than those produced by solvent extraction.

A method for the recovery of uranium from sulfuric acid solutions is described. In the present process, sulfuric acid is added to the uranium bearing solution to bring the pH to between 1 and 1.8, preferably to about 1.4, and aluminum metal is then used as a reducing agent to convert hexavalent uranium to the tetravalent state. As the reaction proceeds, the pH rises and a selective precipitation of uranium occurs, resulting in a high grade precipitate. This process is an improvement over the process using metallic iron, in that metallic aluminum reacts less readily than metallic iron with sulfuric acid, thus avoiding consumption of the reducing agent and a raising of the pH without accomplishing the desired reduction of the hexavalent uranium in the solution. Another disadvantage to the use of iron is that positive ferric ions will precipitate with negative phosphate and arsenate ions at the pH range employed.
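The electron bookkeeping behind the aluminum reduction step can be checked with a short stoichiometry calculation, treating U(VI) to U(IV) as a two-electron reduction and Al to Al(III) as a three-electron oxidation; the quantity used is illustrative:

```python
def aluminum_required(mol_uranium):
    """Moles of Al metal needed to reduce a given amount of U(VI) to U(IV):
    each U takes up 2 electrons, each Al supplies 3, so 2/3 mol Al per mol U
    (ignoring any Al consumed by side reaction with the acid)."""
    electrons_needed = 2 * mol_uranium
    return electrons_needed / 3

al = aluminum_required(1.5)  # 1.5 mol U(VI) -> 1.0 mol Al
```

The comment on side reactions is exactly the advantage claimed over iron: less reducing agent is wasted reacting with the sulfuric acid itself.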

A process of growing a material on a substrate, particularly growing a Group II-VI or Group III-V material, by a vapor-phase growth technique where the growth process eliminates the need for utilization of a mask or removal of the substrate from the reactor at any time during the processing. A nucleation layer is first grown upon which a middle layer is grown to provide surfaces for subsequent lateral cantilever growth. The lateral growth rate is controlled by altering the reactor temperature, pressure, reactant concentrations or reactant flow rates. Semiconductor materials, such as GaN, can be produced with dislocation densities less than 10^7/cm^2.

A process for concentrating fission-product-containing waste solutions from fuel element processing is described. The process comprises the addition of sugar to the solution, preferably after it is made alkaline; spraying the solution into a heated space whereby a dry powder is formed; heating the powder to at least 220 deg C in the presence of oxygen whereby the powder ignites, the sugar is converted to carbon, and the salts are decomposed by the carbon; melting the powder at between 800 and 900 deg C; and cooling the melt. (AEC)

The software program is called Computer Managed Process Planning (CMPP). This system employs computer technology to speed and simplify process planning of cylindrical parts. It is aimed at the machining of expensive materials requiring tight tolerances and complex manufacturing processes. Thus far, the program has resulted in a 75 percent reduction in manpower and a 70 percent increase in productivity. The software performs four tasks: (1) it manipulates the manufacturing database, accepting input information from a central CAD system or from an interactive display; (2) it finds the dimensioning reference surfaces, clamping surfaces, and locating surfaces for each machining operation; (3) it analyzes dimensions, tolerances, and stock removals in all cuts for each operation to ensure that blueprint specifications can be achieved; and (4) it prints a summary of operations or a routing sheet.
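Task (3), verifying that dimensions and tolerances across a chain of cuts still meet the blueprint, amounts to a tolerance-stack calculation. The sketch below uses worst-case stacking with invented numbers:

```python
def worst_case_stack(links):
    """Worst-case tolerance stack for a chain of dimensions.
    links: list of (nominal, plus_minus_tolerance).
    Returns (nominal_sum, total_tolerance)."""
    nominal = sum(n for n, _ in links)
    tol = sum(t for _, t in links)
    return nominal, tol

def meets_blueprint(links, target, target_tol):
    """True if the stacked chain stays within the blueprint band even when
    every link is at its worst-case extreme."""
    nominal, tol = worst_case_stack(links)
    return abs(nominal - target) + tol <= target_tol

# three chained cuts (mm) against a 65.0 +/- 0.1 mm blueprint dimension
ok = meets_blueprint([(20.0, 0.02), (35.0, 0.03), (10.0, 0.01)],
                     target=65.0, target_tol=0.1)
```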

Comparatively little is known about chemosensory processing during sleep. Earlier studies with significant methodological limitations investigated whether olfactory stimulation is processed during sleep at all. The scantness of available data is explained by physiological aspects and methodological difficulties (e.g. rapid adaptation, co-stimulation, etc.). Chemosensory processing during sleep can be assessed by means of event-related potentials, induced arousals or awakenings or by assessing effects on psychological functions. Chemosensory event-related potentials could be demonstrated in 2006. Recent studies with improved methodology have shown that isolated olfactory stimulation does not lead to arousals or awakenings. Finally, the impact of nocturnal olfactory stimulation on learning and emotional dream content could be described. PMID:20480129

Engineering design of the third distillation column in the process was accomplished. The initial design is based on a 94.35% recovery of dichlorosilane in the distillate and a 99.9% recovery of trichlorosilane in the bottoms. The specified separation is achieved at a reflux ratio of 15 with 20 trays (equilibrium stages). Additional specifications and results are reported, including equipment size, temperatures and pressure. Specific raw material requirements necessary to produce the silicon in the process are presented. The primary raw materials include metallurgical grade silicon, silicon tetrachloride, hydrogen, copper (catalyst) and lime (waste treatment). Hydrogen chloride is produced as a by-product in the silicon deposition. Cost analysis of the process was initiated during this reporting period.
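As a rough cross-check of the stage count, the Fenske equation gives the minimum number of equilibrium stages at total reflux from the two recovery specifications. The relative volatility used below (alpha = 2 for the dichlorosilane/trichlorosilane split) is an assumed round number, not a design value from the report:

```python
import math

def fenske_min_stages(rec_light_dist, rec_heavy_bottoms, alpha):
    """Fenske minimum equilibrium stages at total reflux, written in terms
    of fractional recoveries: light key overhead, heavy key in bottoms."""
    light_ratio = rec_light_dist / (1.0 - rec_light_dist)        # d_LK / b_LK
    heavy_ratio = rec_heavy_bottoms / (1.0 - rec_heavy_bottoms)  # b_HK / d_HK
    return math.log(light_ratio * heavy_ratio) / math.log(alpha)

n_min = fenske_min_stages(0.9435, 0.999, alpha=2.0)
```

Under that assumed volatility the minimum comes out near 14 stages, so the reported 20 trays at a finite reflux ratio of 15 is of a plausible magnitude.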

The isothermal processes of membrane separation, supercritical extraction and chromatography were examined using availability analysis. The general approach was to derive equations that identified where energy is consumed in these processes and how they compare with conventional separation methods. These separation methods are characterized by pure work inputs, chiefly in the form of a pressure drop which supplies the required energy. Equations were derived for the energy requirement in terms of regular solution theory. This approach is believed to accurately predict the work of separation in terms of the heat of solution and the entropy of mixing. It can form the basis of a convenient calculation method for optimizing membrane and solvent properties for particular applications. Calculations were made on the energy requirements for a membrane process separating air into its components.
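The entropy-of-mixing term that sets the floor on separation work can be evaluated directly: for an ideal mixture, the minimum isothermal work of complete separation per mole of feed is W = -RT * sum(x_i * ln x_i). Taking air at 298 K as 79% N2 / 21% O2 (a simplification that ignores argon and non-ideality):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def min_separation_work(x_fracs, T):
    """Ideal minimum isothermal work (J per mole of feed) to fully separate
    a mixture into its pure components: W = -R*T*sum(x * ln x)."""
    return -R * T * sum(x * math.log(x) for x in x_fracs)

w_air = min_separation_work([0.79, 0.21], T=298.0)  # roughly 1.3 kJ/mol
```

Any real membrane process, supplied with work as a pressure drop, must consume at least this much; the ratio of actual to minimum work is the natural efficiency metric for the availability analysis described.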

Vacuum condensers play a critical role in supporting vacuum processing operations. Although they may appear similar to atmospheric units, vacuum condensers have their own special designs, considerations and installation needs. By adding vacuum condensers, precondensers and intercondensers, system cost efficiency can be optimized. Vacuum-condensing systems permit reclamation of high-value product by use of a precondenser, or reduce operating costs with intercondensers. A precondenser placed between the vacuum vessel and ejector system will recover valuable process vapors and reduce vapor load to an ejector system--minimizing the system's capital and operating costs. Similarly, an intercondenser positioned between ejector stages can condense motive steam and process vapors and reduce vapor load to downstream ejectors as well as lower capital and operating costs. The paper describes vacuum condenser systems, types of vacuum condensers, shellside condensing, tubeside condensing, noncondensable gases, precondenser pressure drop, system interdependency, equipment installation, and equipment layout.

Disclosed is a single chamber ultra-high vacuum processing system for the production of hermetically sealed quartz resonators wherein electrode metallization and sealing are carried out along with cleaning and bake-out without any air exposure between the processing steps. The system includes a common vacuum chamber in which is located a rotatable wheel-like member which is adapted to move a plurality of individual component sets of a flat pack resonator unit past discretely located processing stations in said chamber whereupon electrode deposition takes place followed by the placement of ceramic covers over a frame containing a resonator element and then to a sealing stage where a pair of hydraulic rams including heating elements effect a metallized bonding of the covers to the frame.

Disclosed are: (1) a process comprising spray drying a powder-containing slurry, the slurry containing a powder constituent susceptible of oxidizing under the temperature conditions of the spray drying, while reducing the tendency for oxidation of the constituent by including as a liquid constituent of the slurry an organic liquid; (2) a process comprising spray drying a powder-containing slurry, the powder having been pretreated to reduce content of a powder constituent susceptible of oxidizing under the temperature conditions of the spray drying, the pretreating comprising heating the powder to react the constituent; and (3) a process comprising reacting ceramic powder, grinding the reacted powder, slurrying the ground powder, spray drying the slurried powder, and blending the dried powder with metal powder. 2 figs.

Recent monkey studies provide intriguing information for an open question whether face processing is a special perceptual process and is organized as such at birth, or has its origin in a more general system that becomes specialized with experience. Before seeing any faces or face-like objects, macaque monkeys showed a preference for faces rather than nonface objects. Furthermore, they showed remarkable face processing abilities both for human and monkey faces. It was also shown that macaque newborns are able to imitate human facial gestures, indicating the ability to match their own facial movements to observed facial gestures. Taken together, it seems very likely that newborns can acquire the knowledge about the basic structure of their own face, presumably through proprioception, so that facial structure would become a familiar and attractive visual object without the experience of the face itself. PMID:19339171

In this paper, I discuss the techniques and processes of timbral organization I developed while writing my chamber work, Afterimage. I compare my techniques with illustrative examples by other composers to place my work in historical context. I examine three elements of my composition process. The first is the process of indexing and cataloging basic sonic materials. The second consists of the techniques and mechanics of manipulating and assembling these collections into larger scale phrases, textures, and overall form in a musical work. The third element is the more elusive, and often extra-musical, source of inspiration and motivation. The evocative power of tone color is both immediately evident yet difficult to explain. What is timbre? This question cannot be answered solely in scientific terms; subjective factors affect our perception of it.

Overcoming the disadvantages of equidistant discretization of continuous actions, we introduce an approach that separates time into slices of varying length bordered by certain events. Such events are points in time at which the equations describing the system's behavior (that is, the equations which specify the ongoing processes) change. Between two events the system's parameters stay continuous. A high-level semantics for drawing logical conclusions about dynamic systems with continuous processes is presented, and we have developed an adequate calculus to automate this reasoning process. In doing this, we have combined deduction and numerical calculus, offering logical reasoning about precise, quantitative system information. The scenario of multiple balls moving in 1-dimensional space interacting with a pendulum serves as demonstration example of our method.
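The event-based discretization described, advancing directly to the next point where the governing equations change instead of stepping on a fixed grid, can be sketched for a single ball bouncing elastically between two walls. Between wall events the motion is exactly linear, so each slice is handled in closed form (this is a one-ball illustration, not the paper's multi-ball-and-pendulum scenario):

```python
def simulate_ball(x0, v, t_end, left=0.0, right=10.0):
    """Advance a ball with constant speed between elastic wall collisions.
    Each event (wall hit) is computed analytically; between events the state
    evolves continuously, so no fixed time step is needed. Assumes v != 0."""
    x, t = x0, 0.0
    while True:
        # time of the next event: hitting whichever wall lies ahead
        wall = right if v > 0 else left
        t_event = t + (wall - x) / v
        if t_event >= t_end:               # no further event before t_end
            return x + v * (t_end - t)
        x, t = wall, t_event               # jump straight to the event...
        v = -v                             # ...where the governing equation changes

pos = simulate_ball(x0=2.0, v=3.0, t_end=5.0)  # -> 3.0
```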

Today's need for rapid software development has generated a great interest in employing Commercial-Off-The-Shelf (COTS) software products as a way of managing cost, development time, and effort. With an abundance of COTS software packages to choose from, the problem now is how to systematically evaluate, rank, and select a COTS product that best meets the software project requirements and at the same time can leverage off the current corporate information technology architectural environment. This paper describes a systematic process for decision support in evaluating and ranking COTS software. Performed right after the requirements analysis, this process provides the evaluators with more concise, structural, and step-by-step activities for determining the best COTS software product with manageable risk. In addition, the process is presented in phases that are flexible to allow for customization or tailoring to meet various projects' requirements.
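The evaluate-and-rank step of such a process commonly reduces to a weighted scoring matrix. The criteria, weights, candidate names, and scores below are illustrative placeholders, not values from the paper:

```python
def rank_candidates(weights, scores):
    """Rank COTS candidates by weighted sum of per-criterion scores.
    weights: {criterion: weight}; scores: {candidate: {criterion: score}}."""
    totals = {
        name: sum(weights[c] * s for c, s in crit.items())
        for name, crit in scores.items()
    }
    return sorted(totals, key=totals.get, reverse=True)

weights = {"fit": 0.5, "cost": 0.3, "support": 0.2}
scores = {
    "PackageA": {"fit": 8, "cost": 6, "support": 9},   # totals: 7.6
    "PackageB": {"fit": 9, "cost": 4, "support": 6},   # totals: 6.9
}
ranking = rank_candidates(weights, scores)
```

Weight elicitation (how "fit" comes to count for half the decision) is where structured methods of the kind the paper describes earn their keep; the arithmetic itself is trivial.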

The phenomenon of process damping as a stabilising effect in milling has been encountered by machinists since milling and turning began. It is of great importance when milling aerospace alloys where maximum surface speed is limited by excessive tool wear and high speed stability lobes cannot be attained. Much of the established research into regenerative chatter and chatter avoidance has focussed on stability lobe theory with different analytical and time domain models developed to expand on the theory first developed by Tlusty and Tobias. Process damping is a stabilising effect that occurs when the surface speed is low relative to the dominant natural frequency of the system and has been less successfully modelled and understood. Process damping is believed to be influenced by the interference of the relief face of the cutting tool with the waveform traced on the cut surface, with material properties and the relief geometry of the tool believed to be key factors governing performance. This study combines experimental trials with Finite Element (FE) simulation in an attempt to identify and understand the key factors influencing process damping performance in titanium milling. Rake angle, relief angle and chip thickness are the variables considered experimentally with the FE study looking at average radial and tangential forces and surface compressive stress. For the experimental study a technique is developed to identify the critical process damping wavelength as a means of measuring process damping performance. For the range of parameters studied, chip thickness is found to be the dominant factor with maximum stable parameters increased by a factor of 17 in the best case. Within the range studied, relief angle was found to have a lesser effect than expected whilst rake angle had an influence.
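The process damping wavelength referred to above is the length of one chatter undulation traced on the cut surface: surface speed divided by vibration frequency. The numbers below are illustrative, not the study's measured values:

```python
def process_damping_wavelength(surface_speed_m_per_min, chatter_freq_hz):
    """Wavelength (mm) of the vibration waveform left on the cut surface:
    lambda = v / f, with v converted from m/min to mm/s."""
    v_mm_per_s = surface_speed_m_per_min * 1000.0 / 60.0
    return v_mm_per_s / chatter_freq_hz

lam = process_damping_wavelength(30.0, chatter_freq_hz=500.0)  # -> 1.0 mm
```

Short wavelengths, i.e. low surface speed relative to the dominant natural frequency, are exactly the regime where the tool's relief face interferes with the surface waveform and process damping becomes significant, as the abstract states.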

The Aerosol Robotic Network (AERONET) database has evolved in measurement accuracy, data quality products, and availability to the scientific community over the course of 21 years with the support of NASA, PHOTONS, and all federated partners. This evolution is periodically manifested as a new data version release, by carefully reprocessing the entire database with the most current algorithms that fundamentally change the database and ultimately the data products used by the community. The newest processing, Version 3 (V3), will be released in 2015 after the entire database is reprocessed and real-time data processing becomes operational. All V3 algorithms have been developed and individually vetted, and represent four main categories: aerosol optical depth (AOD) processing, inversion processing, database management, and new products. The primary trigger for release of V3 lies with cloud screening of the direct sun observations and computation of AOD, which will fundamentally change all data available for analysis and all subsequent retrieval products. This presentation will illustrate the innovative approach used for cloud screening and assess the elements of V3 AOD relative to the current version. We will also present the advances in the inversion product processing with emphasis on the random and systematic uncertainty estimates. This processing will be applied to the new hybrid measurement scenario intended to provide inversion retrievals for all solar zenith angles. We will introduce automatic quality assurance criteria that will allow near real-time quality-assured aerosol products necessary for real-time satellite and model validation and assimilation. Lastly, we will introduce the new management structure that will improve access to the database. The current Version 2 will be supported for at least two years after the initial release of V3 to maintain continuity for ongoing investigations.
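At the heart of direct-sun cloud screening is a temporal-variability test: clouds make the apparent AOD fluctuate far faster than aerosol does. A minimal sketch of a triplet-variability check follows; the threshold is illustrative and not the actual V3 criterion:

```python
def triplet_is_cloud_free(aod_triplet, max_range=0.02):
    """Flag a triplet of AOD measurements (taken seconds to tens of seconds
    apart) as cloud-free when their spread is small; passing clouds produce
    rapid AOD fluctuations. The 0.02 threshold is illustrative only."""
    return max(aod_triplet) - min(aod_triplet) <= max_range

clean = triplet_is_cloud_free([0.151, 0.149, 0.150])   # stable -> cloud-free
cloudy = triplet_is_cloud_free([0.15, 0.42, 0.20])     # spike -> flagged
```

Real screening layers several further checks (smoothness over the day, angstrom exponent behavior, neighbor comparisons) on top of this basic variability test.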

Orchestrator is a software application infrastructure for telemetry monitoring, logging, processing, and distribution. The architecture has been applied to support operations of a variety of planetary rovers. Built in Java with the Eclipse Rich Client Platform, Orchestrator can run on most commonly used operating systems. The pipeline supports configurable parallel processing that can significantly reduce the time needed to process a large volume of data products. Processors in the pipeline implement a simple Java interface and declare their required input from upstream processors. Orchestrator is programmatically constructed by specifying a list of Java processor classes that are instantiated at runtime to form the pipeline. Input dependencies are checked at runtime. Fault tolerance can be configured to attempt continuation of processing in the event of an error or failed input dependency if possible, or to abort further processing when an error is detected. This innovation also provides support for Java Message Service broadcasts of telemetry objects to clients and provides file system and relational database logging of telemetry. Orchestrator supports remote monitoring and control of the pipeline using browser-based JMX controls and provides several integration paths for pre-compiled legacy data processors. At the time of this reporting, the Orchestrator architecture has been used by four NASA customers to build telemetry pipelines to support field operations. Example applications include high-volume stereo image capture and processing, and simultaneous data monitoring and logging from multiple vehicles. Example telemetry processors used in field test operations support include vehicle position, attitude, articulation, GPS location, power, and stereo images.
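The pipeline idea described above — processors that declare the upstream products they require, runtime dependency checking, and configurable fault tolerance — can be sketched as follows. This is an illustrative Python sketch, not the actual Java Orchestrator API; all class, method, and product names here are hypothetical.

```python
# Illustrative sketch of a declared-dependency pipeline (hypothetical names,
# not the real Orchestrator interface).

class Processor:
    requires = ()    # names of upstream products this processor needs
    provides = None  # name of the product this processor emits

    def process(self, inputs):
        raise NotImplementedError


class Pipeline:
    def __init__(self, processors, abort_on_error=True):
        self.processors = processors
        self.abort_on_error = abort_on_error  # configurable fault tolerance

    def run(self, telemetry):
        products = {"telemetry": telemetry}
        for p in self.processors:
            # Input dependencies are checked at runtime.
            missing = [r for r in p.requires if r not in products]
            if missing:
                if self.abort_on_error:
                    raise RuntimeError(f"{type(p).__name__} missing {missing}")
                continue  # skip this processor and attempt to continue
            p_inputs = {r: products[r] for r in p.requires}
            products[p.provides] = p.process(p_inputs)
        return products


class Decode(Processor):
    requires = ("telemetry",)
    provides = "frame"

    def process(self, inputs):
        return inputs["telemetry"].upper()


class Log(Processor):
    requires = ("frame",)
    provides = "logged"

    def process(self, inputs):
        return f"logged:{inputs['frame']}"


# The pipeline is constructed from a list of processor instances.
result = Pipeline([Decode(), Log()]).run("pos=1.0")
```

Because each processor only names its inputs, the framework, rather than the processors, decides ordering and error handling, which is what makes parallel execution and legacy-processor integration tractable.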

The systems engineering process for the concept definition phase of the program involves requirements definition, system definition, and consistent concept definition. The requirements definition process involves obtaining a complete understanding of the system requirements based on customer needs, mission scenarios, and nuclear thermal propulsion (NTP) operating characteristics. A system functional analysis is performed to provide comprehensive traceability and verification of top-level requirements down to detailed system specifications and provides significant insight into the measures of system effectiveness to be utilized in system evaluation. The second key element in the process is the definition of system concepts to meet the requirements. This part of the process involves engine system and reactor contractor teams to develop alternative NTP system concepts that can be evaluated against specific attributes, as well as a reference configuration against which to compare system benefits and merits. Quality function deployment (QFD), as an excellent tool within Total Quality Management (TQM) techniques, can provide the required structure and a link to the voice of the customer in establishing critical system qualities and their relationships. The third element of the process is the consistent performance comparison. The comparison process involves validating developed concept data and quantifying system merits through analysis, computer modeling, simulation, and rapid prototyping of the proposed high risk NTP subsystems. The maximum amount possible of quantitative data will be developed and/or validated to be utilized in the QFD evaluation matrix. If, upon evaluation, a new concept or its associated subsystems are determined to have substantial merit, those features will be incorporated into the reference configuration for subsequent system definition and comparison efforts.

We give a purely probabilistic demonstration that all effects of non-random (external, conservative) forces on the diffusion process can be encoded in the Nelson ansatz for the second Newton law. Each random path of the process carries both a probabilistic weight and a complex-valued phase-accumulation weight. Summation (integration) of these weights over random paths leads, respectively, to the transition probability density and the transition amplitude between two spatial points in a given time interval. The Bohm-Vigier, Fenyes-Nelson-Guerra, and Feynman descriptions of quantum particle behaviour are in fact equivalent.

A cyclic process for controlling environmental emissions of volatile organic compounds (VOC) from vapor recovery in storage and dispensing operations of liquids maintains a vacuum in the storage tank ullage. In one part of the two-part cyclic process, ullage vapor is discharged through a vapor recovery system in which VOC are stripped from vented gas with a selectively gas permeable membrane. In the other part, the membrane is inoperative while gas pressure rises in the ullage. Ambient air is charged to the membrane separation unit during the latter part of the cycle.

The first flight of the Advanced Thin Ionization Calorimeter (ATIC) experiment from McMurdo, Antarctica lasted for 16 days, starting on December 28, 2000. The ATIC instrument consists of a fully active 320-crystal, 960-channel Bismuth Germanate (BGO) calorimeter, 202 scintillator strips (808 channels) in 3 hodoscopes, interleaved with graphite target layers, and a 4480-pixel silicon matrix charge detector. We have developed an object-oriented data processing package based on ROOT. In this paper, we describe the data processing scheme used in handling the accumulated 45 GB of flight data. We discuss calibration issues, particularly the time-dependence of housekeeping information.

A method of reversibly brazing surfaces together. An interface is affixed to each surface. The interfaces can be affixed by processes such as mechanical joining, welding, or brazing. The two interfaces are then brazed together using a brazing process that does not defeat the surface-to-interface joint. Interfaces of materials such as Ni-200 can be affixed to metallic surfaces by welding or by brazing with a first braze alloy. The Ni-200 interfaces can then be brazed together using a second braze alloy. The second braze alloy can be chosen so that it minimally alters the properties of the interfaces, allowing multiple braze, heat-and-disassemble, and rebraze cycles.

The aim of the assessment reported is to candidly examine the contribution that solar industrial process heat (SIPH) is realistically able to make in the near- and long-term energy futures of the United States. The performance history of government and privately funded SIPH demonstration programs, 15 of which are briefly summarized, and the present status of SIPH technology are discussed. The technical and performance characteristics of SIPH plants and equipment are reviewed, along with an evaluation of how the operating experience of over a dozen SIPH demonstration projects is influencing institutional acceptance and economic projections. Implications for domestic energy policy and international implications are briefly discussed. (LEW)

This is the first book to be devoted completely to array signal processing, a subject that has become increasingly important in recent years. The book consists of six chapters. Chapter 1, which is introductory, reviews some basic concepts in wave propagation. The remaining five chapters deal with the theory and applications of array signal processing in (a) exploration seismology, (b) passive sonar, (c) radar, (d) radio astronomy, and (e) tomographic imaging. The various chapters of the book are self-contained. The book is written by a team of five active researchers, who are specialists in the individual fields covered by the pertinent chapters.

A process of converting an actinide metal such as thorium, uranium, or plutonium to an actinide oxide material by admixing the actinide metal in an aqueous medium with a hypochlorite as an oxidizing agent for sufficient time to form the actinide oxide material and recovering the actinide oxide material is provided together with a low temperature process of preparing an actinide oxide nitrate such as uranyl nitrate. Additionally, a composition of matter comprising the reaction product of uranium metal and sodium hypochlorite is provided, the reaction product being an essentially insoluble uranium oxide material suitable for disposal or long term storage.

Mallinckrodt Institute of Radiology (MIR) is using a digital image processing system which employs NASA-developed technology. MIR's computer system is the largest radiology system in the world. It is used in diagnostic imaging. Blood vessels are injected with x-ray dye, and the images which are produced indicate whether arteries are hardened or blocked. A computer program developed by Jet Propulsion Laboratory known as Mini-VICAR/IBIS was supplied to MIR by COSMIC. The program provides the basis for developing the computer imaging routines for data processing, contrast enhancement and picture display.

A coal liquefaction process and apparatus therefor are disclosed. According to this invention, a finely divided coal slurry and a solvent are contacted with molecular hydrogen in the presence of a catalyst, the slurry is separated into a gaseous component, a liquid component and a solid residue, the solid residue (which is the liquefaction residue) is then supplied to a molten metal bath together with oxygen gas to generate a gas entraining fine powdery solids, and the thus recovered fine powdery solids are returned to the liquefaction process as a catalyst.

Most of the power in the signals from microcalorimeters occurs at relatively low frequencies. At these frequencies, typical amplifiers will have significant amounts of 1/f noise. Our laboratory systems can also suffer from pickup at several harmonics of the AC power line, and from microphonic pickup at frequencies that vary with the configuration of the apparatus. We have developed some optimal signal processing techniques in order to construct the best possible estimates of our pulse heights in the presence of these non-ideal effects. In addition to a discussion of our laboratory systems, we present our plans for providing this kind of signal processing in flight experiments.
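Pulse-height estimation in the presence of 1/f noise, line harmonics, and microphonics is commonly done with a frequency-domain optimal (matched) filter, which down-weights spectral bins where the measured noise power is large. The following is a minimal numpy sketch under the assumption of a known pulse template and a measured noise power spectrum; it is illustrative and not the authors' actual laboratory or flight code.

```python
import numpy as np

def optimal_pulse_height(data, template, noise_psd):
    """Frequency-domain optimal-filter estimate of pulse height.

    Bins where noise_psd is large (1/f noise at low frequency, power-line
    harmonics, microphonic pickup) contribute little to the estimate.
    """
    D = np.fft.rfft(data)
    S = np.fft.rfft(template)
    w = np.conj(S) / noise_psd                  # per-bin optimal weights
    return np.real(np.sum(w * D) / np.sum(w * S))

# Toy demonstration: a two-exponential pulse of true height 3.2 in white
# noise, with a flat noise spectrum (assumed, for illustration only).
rng = np.random.default_rng(0)
n = 1024
t = np.arange(n)
template = np.exp(-t / 100.0) - np.exp(-t / 10.0)
data = 3.2 * template + 0.01 * rng.standard_normal(n)
noise_psd = np.full(n // 2 + 1, 1.0)            # rfft output length is n//2+1
h = optimal_pulse_height(data, template, noise_psd)
```

With a realistic (non-flat) noise spectrum measured from pulse-free records, the same weighting automatically suppresses the 1/f and line-harmonic bins described above.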

Image Processing Library computer program, IPLIB, is a collection of subroutines facilitating use of the COMTAL image-processing system driven by an HP 1000 computer. Functions include addition or subtraction of two images with or without scaling, display of color or monochrome images, digitization of an image from a television camera, display of a test pattern, manipulation of bits, and clearing of the screen. Provides the capability to read or write points, lines, and pixels from an image; read or write at the location of the cursor; and read or write an array of integers into COMTAL memory. Written in FORTRAN 77.

A process for converting an actinide metal such as thorium, uranium, or plutonium to an actinide oxide material by admixing the actinide metal in an aqueous medium with a hypochlorite as an oxidizing agent for sufficient time to form the actinide oxide material and recovering the actinide oxide material is described together with a low temperature process for preparing an actinide oxide nitrate such as uranyl nitrate. Additionally, a composition of matter comprising the reaction product of uranium metal and sodium hypochlorite is provided, the reaction product being an essentially insoluble uranium oxide material suitable for disposal or long term storage.

This invention comprises a process of converting an actinide metal such as thorium, uranium, or plutonium to an actinide oxide material by admixing the actinide metal in an aqueous medium with a hypochlorite as an oxidizing agent for sufficient time to form the actinide oxide material and recovering the actinide oxide material, together with a low temperature process of preparing an actinide oxide nitrate such as uranyl nitrate. Additionally, a composition of matter comprising the reaction product of uranium metal and sodium hypochlorite is provided, the reaction product being an essentially insoluble uranium oxide material suitable for disposal or long term storage.

A process is presented for recovering uranium values from calutron deposits. The process consists in treating such deposits to produce an oxidized acidic solution containing uranium together with the following impurities: Cu, Fe, Cr, Ni, Mn, Zn. The uranium is recovered from such an impurity-bearing solution by adjusting the pH of the solution to the range 1.5 to 3.0 and then treating the solution with hydrogen peroxide. This results in the precipitation of uranium peroxide which is substantially free of the metal impurities in the solution. The peroxide precipitate is then separated from the solution, washed, and calcined to produce uranium trioxide.

This presentation describes the process used to collect, review, integrate, and assess research requirements desired to be a part of research and payload activities conducted on the ISS. The presentation provides a description of: where the requirements originate, to whom they are submitted, how they are integrated into a requirements plan, and how that integrated plan is formulated and approved. It is hoped that, upon completing the review of this presentation, one will gain an understanding of the planning process that formulates payload requirements into an integrated plan used for specifying research activities to take place on the ISS.

A welding method is provided for forming a weld joint between first and second elements of a workpiece. The method includes heating the first and second elements to form an interface of material in a plasticized or melted state between the elements. The interface material is then allowed to cool to a plasticized state if previously in a melted state. The interface material, while in the plasticized state, is then mixed, for example, using a grinding/extruding process, to remove any dendritic-type weld microstructures introduced into the interface material during the heating process.

An improved coking process for normally solid carbonaceous materials wherein the yield of liquid product from the coker is increased by adding ammonia or an ammonia precursor to the coker. The invention is particularly useful in a process wherein coal liquefaction bottoms are coked to produce both a liquid and a gaseous product. Broadly, ammonia or an ammonia precursor is added to the coker in an amount ranging from about 1 to about 60 weight percent based on normally solid carbonaceous material, and is preferably added in an amount from about 2 to about 15 weight percent.

In the prior art processing of uranium ores, the ore is first digested with nitric acid and filtered, and the uranium values are then extracted from the filtrate by contacting with an organic solvent. The insoluble residue has been processed separately in order to recover any uranium which it might contain. The improvement consists in contacting a slurry, composed of both solution and residue, with the organic solvent prior to filtration. The result is that uranium values contained in the residue are extracted along with the uranium values contained in the solution in one step.

In many areas of computer science entities can “reproduce”, “replicate”, or “create new instances”. Paramount examples are threads in multithreaded programs, processes in operating systems, and computer viruses, but many others exist: procedure calls create new incarnations of the callees, web crawlers discover new pages to be explored (and so “create” new tasks), divide-and-conquer procedures split a problem into subproblems, and leaves of tree-based data structures become internal nodes with children. For lack of a better name, I use the generic term systems with process creation to refer to all these entities.
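The divide-and-conquer case mentioned above makes the notion concrete: a procedure that splits a problem of size n into two subproblems creates new process instances according to the recurrence c(n) = 1 + c(⌈n/2⌉) + c(⌊n/2⌋). The following toy Python sketch (illustrative names, not from any particular system) counts the instances created.

```python
# Toy illustration of "process creation" in a divide-and-conquer procedure:
# every call is one created instance, and each call on a problem of size
# n > 1 creates two child instances.

def solve(n, created=None):
    if created is None:
        created = [0]
    created[0] += 1                 # this call is one created instance
    if n > 1:
        solve(n // 2, created)      # first subproblem
        solve(n - n // 2, created)  # second subproblem
    return created[0]

count = solve(8)  # binary recursion tree with 8 leaves: 15 instances in all
```

The same counting view applies uniformly to the other examples in the paragraph: threads, operating-system processes, procedure incarnations, and crawler tasks all differ in mechanism but share this creation structure.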

This patent describes a process for preparing DNA from cellular materials for use in genetic studies of eukaryotic systems, in particular a process for isolating DNA fragments from the proteolytic digestion products and detergent products left in solution with the DNA fragments after stripping undesired cellular constituents from the DNA. It comprises the step of dialyzing a solution containing the DNA fragments, detergent products, and proteolytic digestion products against a solution containing PEG for a time effective to yield DNA sufficiently pure for the genetic studies.

Research related to bioenergy is a major focus in the U.S. as science agencies, universities, and commercial labs seek to create new energy-efficient fuels. The Biomass Processing Project is one of the funded projects of the joint USDA-DOE Biomass Research and Development Initiative. The Biomass Processing Photolibrary has numerous images, but there are no accompanying abstracts to explain what you are seeing. The project website, however, makes available the full text of presentations and publications and also includes an exhaustive biomass glossary that is being developed into an ASAE Standard.

Statecharts is a visual language for specifying the behavior of reactive systems. The language extends finite-state machines with concepts of hierarchy, concurrency, and priority. Despite its popularity as a design notation for embedded systems, precisely defining its semantics has proved extremely challenging. In this paper, a simple process algebra, called Statecharts Process Language (SPL), is presented, which is expressive enough for encoding Statecharts in a structure-preserving and semantics-preserving manner. It is established that the behavioral relation of bisimulation, when applied to SPL, preserves Statecharts semantics.

The silicon web process takes advantage of natural crystallographic stabilizing forces to grow long, thin single crystal ribbons directly from liquid silicon. The ribbon, or web, is formed by the solidification of a liquid film supported by surface tension between two silicon filaments, called dendrites, which border the edges of the growing strip. The ribbon can be propagated indefinitely by replenishing the liquid silicon as it is transformed to crystal. The dendritic web process has several advantages for achieving low cost, high efficiency solar cells. These advantages are discussed.

A process and apparatus for the separation of complex mixtures of carbonaceous material by sequential elution with successively stronger solvents. In the process, a column containing glass beads is maintained in a fluidized state by a rapidly flowing stream of a weak solvent, and the sample is injected into this flowing stream such that a portion of the sample is dissolved therein and the remainder of the sample is precipitated therein and collected as a uniform deposit on the glass beads. Successively stronger solvents are then passed through the column to sequentially elute less soluble materials.

A process and apparatus are disclosed for the separation of complex mixtures of carbonaceous material by sequential elution with successively stronger solvents. In the process, a column containing glass beads is maintained in a fluidized state by a rapidly flowing stream of a weak solvent, and the sample is injected into this flowing stream such that a portion of the sample is dissolved therein and the remainder of the sample is precipitated therein and collected as a uniform deposit on the glass beads. Successively stronger solvents are then passed through the column to sequentially elute less soluble materials. 1 fig.

An improved process is described for the treatment of metallic uranium surfaces preparatory to being given hot dip coatings. The process consists in first pickling the uranium surface with aqueous 50% to 70% nitric acid, at 60 to 70 deg C, for about 5 minutes, rinsing the acid solution from the uranium article, promptly drying, and then passing it through a molten alkali-metal halide flux consisting of 42% LiCl, 53% KCl, and 5% NaCl into a molten metal bath consisting of 85 parts by weight of zinc and 15 parts by weight of aluminum.

A dual solvent refining process is claimed for solvent refining petroleum based lubricating oil stocks with n-methyl-2-pyrrolidone as selective solvent for aromatic oils wherein a highly paraffinic oil having a narrow boiling range approximating the boiling point of n-methyl-2-pyrrolidone is employed as a backwash solvent. The process of the invention results in an increased yield of refined lubricating oil stock of a predetermined quality and simplifies separation of the solvents from the extract and raffinate oil fractions.

A cyclic process for controlling environmental emissions of volatile organic compounds (VOC) from vapor recovery in storage and dispensing operations of liquids maintains a vacuum in the storage tank ullage. In the first part of a two-part cyclic process ullage vapor is discharged through a vapor recovery system in which VOC are stripped from vented gas with a selectively gas permeable membrane. In the second part, the membrane is inoperative while gas pressure rises in the ullage. In one aspect of this invention, a vacuum is drawn in the membrane separation unit thus reducing overall VOC emissions.

Without pictures, it is difficult to give students a feeling for wave propagation, transmission, and reflection. Even with pictures, wave propagation is still static to many. However, when students use and modify scripts that generate wavefronts and rays through a geologic model that they have modified themselves, we find that students gain a real feeling for wave propagation. To facilitate teaching 2-D seismic reflection data processing (from acquisition through migration) to our undergraduate and graduate Reflection Seismology students, we use Seismic Un*x (SU) software. SU is maintained and distributed by Colorado School of Mines, and it is freely available (at www.cwp.mines.edu/cwpcodes). Our approach includes use of synthetic and real seismic data, processing scripts, and detailed explanation of the scripts. Our real data were provided by Gregory F. Moore of the University of Hawaii. This approach can be used by any school at virtually no expense for either software or data, and can provide students with a sound introduction to techniques used in processing of reflection seismic data. The same software can be used for other purposes, such as research, with no additional expense. Students who have completed a course using SU are well equipped to begin using it for research, as well. Scripts for each processing step are supplied and explained to the students. Our detailed description of the scripts means students do not have to know anything about SU to start. Experience with the Unix operating system is preferable but not necessary -- our notes include Computer Hints to help the beginner work with the Unix operating system. We include several examples of synthetic model building, acquiring shot gathers through synthetic models, sorting shot gathers to CMP gathers, gain, 1-D frequency filtering, f-k filtering, deconvolution, semblance displays and velocity analysis, flattening data (NMO), stacking the CMPs, and migration. We use two real (marine) data sets. One

A method has been found for enhancing the melt flow of thermoplastic polyimides during processing. A high molecular weight 422 copoly(amic acid) or copolyimide was fused with approximately 0.05 to 5 pct by weight of a low molecular weight amic acid or imide additive, and this melt was studied by capillary rheometry. Excellent flow and improved composite properties on graphite resulted from the addition of a PMDA-aniline additive to LARC-TPI. Solution viscosity studies imply that amic acid additives temporarily lower molecular weight and, hence, enlarge the processing window. Thus, compositions containing the additive have a lower melt viscosity for a longer time than those unmodified.

The overall objective of the program is to demonstrate and commercialize the IGT two-phase BIOGAS Process for optimized methane production from, and simultaneous stabilization of, municipal solid waste (MSW). The specific objective of the current program is to conduct a laboratory-scale investigation of simple, cost-effective feed pretreatment techniques and selected digestion reactor designs to optimize methane production from MSW-sludge blends, and to select the best pretreatment and digestion conditions for testing during the subsequent program for process development unit (PDU) operation. A significant portion of the program efforts to date has been directed at evaluating and/or developing feeding, mixing and discharging systems for handling high concentration, large particle size RDF slurries for anaerobic digestion processes. The performance of such processes depends significantly on the operational success of these subsystems. The results of the subsystem testing have been implemented in the design and operation of the 10-L, 20-L, and 125-L digesters. These results will also be utilized to design the CSTR and the upflow digesters of a large two-phase system. Data collected during the initial phase of this research showed in general that methane production from RDF decreased as the loading rate was increased. Thermophilic digestion did not appear to be significantly better than mesophilic digestion. 9 figures, 3 tables.

The separation of uranium from an aqueous solution containing a water soluble uranyl salt is described. The process involves adding an alkali thiocyanate to the aqueous solution, contacting the resulting solution with methyl isobutyl ketone, and separating the resulting aqueous and organic phases. The uranium is extracted into the organic phase as UO2(SCN)2.

A non-lead frit paste is evaluated. A two-step process is discussed in which the bulk of the metallization is Mo/Sn but a small ohmic pad is silver. A new matrix of paste formulations is developed. A variety of tests are performed on paste samples to determine electrical, thermal, and structural properties.

A well-designed and executed field trip experience serves not only to enrich and supplement course content, but also creates opportunities to build basic science process skills. The National Science Education Standards call for science teachers "to design and manage learning environments that provide students with the time, space, and resources…

The applications of computers and data processing to astronomy are discussed. Among the topics covered are the emerging national information infrastructure, workstations and supercomputers, supertelescopes, digital astronomy, astrophysics in a numerical laboratory, community software, archiving of ground-based observations, dynamical simulations of complex systems, plasma astrophysics, and the remote control of fourth dimension supercomputers.

Describes how the advanced cracking reactor (ACR) process, which is ready for a logical commercial application, offers total liquids feedstock flexibility from light naphthas through vacuum gas oils in the same production unit. Several processes are presently being developed which are aimed at maintaining olefin selectivity when cracking the heaviest feeds. Addresses the problems posed by such heavy feedstocks. The following trends favor the ACR process in the 1980s: natural gas price decontrol; limited natural gas reserves; few new domestic LPG-based ethylene plants will be built; an economic recovery will create the need for more ethylene capacity; modest increases in "real" crude oil prices; plentiful supplies of vacuum gas oil at prices making it an attractive ethylene feedstock; and increasing supplies of light naphtha at prices making it an attractive ethylene feedstock as well. Predicts that these factors will swing the preferred feedstocks for ethylene manufacture back to crude oil distillates before the end of the decade. Argues that in this environment, the ACR process can deliver the lowest cost ethylene in the industry. ACR has full-range feedstock flexibility, high selectivity to ethylene, and less sensitivity to feedstock costs and co-product credits.

Describes the key features that distinguish standards development organizations (SDOs) and analyzes these features in light of recent work in political economy. It is concluded that many of the features that lead to a slower process may be interpreted as an efficient institutional response to problems posed by industry standardization. (24…

Hazard analysis and critical control programs (HACCP) will eventually be required for commercial shell egg processing plants. Sanitation is an essential prerequisite program for HACCP and is based upon current Good Manufacturing Practices (cGMPs) as listed in the Code of Federal Regulations. Good ...

The report examines process alternatives for the optimal use of natural gas and biomass for production of fuel-cell vehicle fuel, emphasizing maximum displacement of petroleum and maximum reduction of overall fuel-cycle carbon dioxide (CO2) emissions at least cost. Three routes a...

These papers are related to the basic comprehensive research and development plan of the Eastern Regional Institute for Education (ERIE). The first paper, Improving Process Education: A Comprehensive Plan by Burton G. Andreas, describes the comprehensive plan and introduces the succeeding papers. The goals of the program are to improve process…

A variety of products made from lunar resources will be required for a lunar outpost. These products might be made by adapting existing processing techniques to the lunar environment, or by developing new techniques unique to the moon. In either case, processing techniques used on the moon will have to have a firm basis in basic principles of materials science and engineering, which can be used to understand the relationships between composition, processing, and properties of lunar-derived materials. These principles can also be used to optimize the properties of a product, once a more detailed knowledge of the lunar regolith is obtained. Using three types of ceramics (monolithic glasses, glass fibers, and glass-ceramics) produced from lunar simulants, we show that the application of materials science and engineering principles is useful in understanding and optimizing the mechanical properties of ceramics on the moon. We also demonstrate that changes in composition and/or processing can have a significant effect on the strength of these materials.

Broad and major concerns dealing with reading are set forth in this monograph to provoke discussion and examination by both researchers and practitioners. In Part 1, Kenneth S. Goodman presents a psycholinguistic view of language and reading (within a transformational-generative framework) as essentially a set of processes of recoding, decoding,…

A waterflood process is claimed wherein a slug of biopolymer is injected into a formation, followed by a slug of synthetic polymer. The biopolymer slug protects the synthetic polymer from degradation due to presence of salts or surfactants in the formation.

In learning genetics, many students misunderstand and misinterpret what "dominance" means. Understanding is easier if students realize that dominance is not a mechanism, but rather a consequence of underlying cellular processes. For example, metabolic pathways are often little affected by changes in enzyme concentration. This means that…

Suggested program of material processing experiments in space described in 81 page report. For each experiment, report discusses influence of such gravitational effects as convection, buoyancy, sedimentation, and hydrostatic pressure. Report contains estimates of power and mission duration required for each experiment. Lists necessary equipment and appropriate spacecraft.

A new method of teaching writing to students of English as a foreign language is presented. The author defines learning to write, which is essentially different from the other language skills (oral production, grammar, and reading), as a process that requires active thought in the necessary selection and organization of experience. Writing…

Accented speech conveys important nonverbal information about the speaker as well as presenting the brain with the problem of decoding a non-canonical auditory signal. The processing of non-native accents has seldom been studied in neurodegenerative disease and its brain basis remains poorly understood. Here we investigated the processing of non-native international and regional accents of English in cohorts of patients with Alzheimer's disease (AD; n=20) and progressive nonfluent aphasia (PNFA; n=6) in relation to healthy older control subjects (n=35). A novel battery was designed to assess accent comprehension and recognition and all subjects had a general neuropsychological assessment. Neuroanatomical associations of accent processing performance were assessed using voxel-based morphometry on MR brain images within the larger AD group. Compared with healthy controls, both the AD and PNFA groups showed deficits of non-native accent recognition and the PNFA group showed reduced comprehension of words spoken in international accents compared with a Southern English accent. At individual subject level deficits were observed more consistently in the PNFA group, and the disease groups showed different patterns of accent comprehension impairment (generally more marked for sentences in AD and for single words in PNFA). Within the AD group, grey matter associations of accent comprehension and recognition were identified in the anterior superior temporal lobe. The findings suggest that accent processing deficits may constitute signatures of neurodegenerative disease with potentially broader implications for understanding how these diseases affect vocal communication under challenging listening conditions. PMID:22664324

The use of relevant data is at the core of the school review process approach. Various forms of relevant data are gathered, including demographic data, student achievement data, community perception data, and data on the current program (e.g., curriculum, instructional strategies, school climate factors, etc.). Perception questionnaires are…

The use of femtosecond pulses for materials processing results in very precise cutting and drilling with high efficiency. Energy deposited in the electrons is not coupled into the bulk during the pulse, resulting in negligible shock or thermal loading to adjacent areas.

A process is described for calcining green coke in at least three heating stages, which comprises preheating the green coke in the first stage, preliminarily calcining the coke in the second stage, cooling the coke; and calcining the coke in the third stage, volatile matter from the second stage being burned during the third stage. The product coke is suitable for preparing graphite electrodes.

Historically, education employees have been hired after a process that consists of these steps: Determining the need for a position, posting the vacancy, paper-screening applications, an interview with a panel or committee, background check, reference calling, and finally the selection of a candidate. This is a very time-consuming and costly…

A process is described for the liquefaction of coal in a hydrogen donor solvent in the presence of hydrogen and a co-catalyst combination of iron and a Group VI or Group VIII non-ferrous metal or compounds of the catalysts.

Examines the tendency to construct genograms in an affective vacuum--a phenomenon consistent with Bowen theory, yet potentially problematic in the ongoing process of treatment. Offers alternatives to nonprocess-oriented genogram construction in an effort to enhance the experience for the client and to broaden the therapist's diagnostic…

The Image Processing for Teaching project provides a powerful medium to excite students about science and mathematics, especially children from minority groups and others whose needs have not been met by traditional teaching. Using professional-quality software on microcomputers, students explore a variety of scientific data sets, including…

Academic/administrative data processing is centralized in one computer facility at East Tennessee State University. Students check computer printouts listing course availability, fill out one schedule card, present it to an operator at a computer terminal, and are able to register for classes in less than a minute. (Author/MLF)

Our research group has been interested in the processing-structure-property relationships in semicrystalline polymers and blends for many years. In situ real time x ray scattering at elevated temperatures is being used to monitor the development of structure. An ongoing collaboration with Dr. Malcolm Capel at the Brookhaven National Synchrotron Light Source allows the performance of real time wide and small angle x ray scattering to study the phase transformations in semicrystalline polymers. The first part of my presentation will be about our recent use of x ray scattering to study blends of poly(ethylene terephthalate) with polyarylates. The purpose of the next portion of the presentation is to show how we may study effects of self-deformation of polymers during processing in the gravity environment, using real time x ray scattering. In this way we have learned how processing stresses alter the microstructure of semicrystalline polymers, and we hope ultimately to develop microgravity processing strategies that will result in more uniform morphology in these polymers.

Radio frequency (RF) heating is a commonly used food processing technology that has been applied for drying and baking as well as thawing of frozen foods. Its use in pasteurization, as well as for sterilization and disinfection of foods, is more limited. This column will review various RF heating ap...

The data processing in two telecommunication market investigations is described. One of the studies concerns the office applications of communication and the other the experiences with a videotex terminal. Statistical factorial analysis was performed on a large mass of data. A comparison between utilization intentions and effective utilization is made. Extensive rewriting of statistical analysis computer programs was required.

Backprojection has long been applied to SAR image formation. It has equal utility in forming the range-velocity maps for Ground Moving Target Indicator (GMTI) radar processing. In particular, it overcomes the problem of targets migrating through range resolution cells.
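
The range-migration point can be illustrated with a toy simulation. Everything below (pulse count, bin geometry, the noise model, the velocity grid) is a hypothetical sketch, not the paper's actual processing chain: a target that drifts through range cells defeats a per-bin sum across pulses, but backprojection integrates each pulse's range profile along every hypothesized (range, velocity) trajectory and refocuses the migrating target.

```python
import numpy as np

# Synthetic data: one target whose return migrates through range bins.
n_pulses, n_bins = 64, 128
r0, v0 = 40.0, 0.5                 # true start bin, migration rate (bins/pulse)

rng = np.random.default_rng(0)
profiles = 0.1 * rng.standard_normal((n_pulses, n_bins))
for p in range(n_pulses):          # target echo at bin r0 + v0 * p
    profiles[p, int(round(r0 + v0 * p))] += 1.0

# Backprojection onto a range-velocity map: for each hypothesis (r, v),
# sum the profiles along the trajectory r + v * p.
velocities = np.linspace(-1.0, 1.0, 41)
rv_map = np.zeros((n_bins, velocities.size))
for j, v in enumerate(velocities):
    for p in range(n_pulses):
        bins = np.clip(np.round(np.arange(n_bins) + v * p).astype(int),
                       0, n_bins - 1)
        rv_map[:, j] += profiles[p, bins]

r_hat, j_hat = np.unravel_index(rv_map.argmax(), rv_map.shape)
print(r_hat, velocities[j_hat])    # peak near (40, 0.5) despite cell migration
```

Only the matched hypothesis accumulates the echo coherently across all 64 pulses; mismatched velocities pick up the target in only a few pulses, which is why cell migration stops being a problem.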

This article presents the results of an exploratory study asking faculty in the first-year writing program and instruction librarians about their research process focusing on results specifically related to serendipity. Steps to prepare for serendipity are highlighted as well as a model for incorporating serendipity into a first-year writing…

Four Native American poets in easy narrative style tell about some of the aesthetic judgments they make in their work and, in the process, shed some light upon the traditions from which their poetry emerges. Joy Harjo discusses how she wrote "The Woman Hanging from the Thirteenth Floor Window," her use of repetition influenced by music and…

To force students--at the very beginning of the writing process--to be aware of audience and to gain insight into their own writing, in-class writing and sharing exercises can be invaluable. For example, students can present to the class their subject for an upcoming paper, with the class responding on paper to such questions as: (1) What do you…

This document contains four papers from a symposium on the performance evaluation process and human resource development (HRD). "Assessing the Effectiveness of OJT (On the Job Training): A Case Study Approach" (Julie Furst-Bowe, Debra Gates) is a case study of the effectiveness of OJT in one of a high-tech manufacturing company's product lines.…

The magnetization process, hysteresis (the difference in the path of magnetization for an increasing and decreasing magnetic field), hysteresis loops, and hard magnetic materials are discussed. The fabrication of classroom projects for demonstrating hysteresis and the hysteresis of common magnetic materials is described in detail.

The process of evaluating computer software is discussed, and a procedure is described which focuses upon ease of use, program content, instructional design and kind of incentive, value conflicts, input/output capacity, teacher management options, built-in authoring systems, and special learning needs. (CL)

Process for the removal of plutonium polymer and ionic actinides from aqueous solutions by absorption onto a solid extractant loaded on a solid inert support such as polystyrene-divinylbenzene. The absorbed actinides can then be recovered by incineration, by stripping with organic solvents, or by acid digestion. Preferred solid extractants are trioctylphosphine oxide and octylphenyl-N,N-diisobutylcarbamoylmethylphosphine oxide and the like.

Findings from studies of attention, semantic memory, and the pragmatics of language are reviewed and implications for intervention with children whose language is disordered are discussed. Selectivity and resource allocation are the attention topics considered while schemata, frames, inferences, and narrative discourse processing are addressed…

The technology of the automatic information processing field has progressed dramatically in the past few years and has created a problem in common term usage. As a solution, "Datamation" Magazine offers this glossary which was compiled by the U.S. Bureau of the Budget as an official reference. The terms appear in a single alphabetic sequence,…

This document contains four papers from a symposium on performance improvement processes. In "Never the Twain Shall Meet?: A Glimpse into High Performance Work Practices and Downsizing" (Laurie J. Bassi, Mark E. Van Buren) evidence from a national cross-industry survey of more than 200 establishments is used to demonstrate that high-performance work…

The structure of art as a symbol system is composed of three dimensions: observer, process, and product. Each dimension is described, discussed, and its application to art therapy illustrated through the case study of a 12-year-old boy suffering from a progressive neurological disorder. (LSR)

The market for tire-derived materials is growing rapidly, with the largest market being tire-derived fuels. There is therefore a growing demand for higher quality products. This paper describes the processing and removal of steel from scrap tires.

IMAGEP manipulates digital image data to effect various processing, analysis, and enhancement functions. It is a keyboard-driven program organized into nine subroutines. Within subroutines are sub-subroutines also selected via keyboard. Algorithm has possible scientific, industrial, and biomedical applications in study of flows in materials, analysis of steels and ores, and pathology, respectively.

Software process assessments (SPA's) are part of an ongoing program of continuous quality improvements in AT&T. Their use was found to be very beneficial by software development organizations in identifying the issues facing the organization and the actions required to increase both quality and productivity in the organization.

This paper proposes a process for community colleges to engage and direct discontinuous change in the face of the coming millennium. Described are several characteristics of change envisioned in the near future, including a complete break with the past, major reconstruction of nearly every element of the organization, and modification of the…

Identifies types and distributions of errors in text produced by optical character recognition (OCR) and proposes a process using machine learning techniques to recognize and correct errors in OCR texts. Results of experiments indicating that this strategy can reduce human interaction required for error correction are reported. (25 references)…

The objective of this project is to facilitate deployment of enzyme-based biomass conversion technology. The immediate goal is to explore integration issues that impact process performance and to demonstrate improved performance of the lower-cost enzymes being developed by Genencor and Novozymes.

A laser material processing system and method are provided. A further aspect of the present invention employs a laser for micromachining. In another aspect of the present invention, the system uses a hollow waveguide. In another aspect of the present invention, a laser beam pulse is given broad bandwidth for workpiece modification.

This module provides instructional materials that are designed to help teachers train students in job skills for entry-level jobs as instrumentation technicians. This text addresses the basics of troubleshooting control loops, and the transducers, transmitters, signal conditioners, control valves, and controllers that enable process systems to…

Written to help couples prepare for parenthood and to improve the effectiveness of parents, this book provides extensive guidelines and background information for accomplishing the basic tasks of parenting. Chapter One depicts parenting as a process, delineates parents' tasks and describes how parents learn to be parents. Based on Erikson's theory…

Fundamental methods of signal processing used in normal mesosphere stratosphere troposphere (MST) radar observations are described. Complex time series of received signals obtained in each range gate are converted into Doppler spectra, from which the mean Doppler shift, spectral width and signal-to-noise ratio (SNR) are estimated. These spectral parameters are further utilized to study characteristics of scatterers and atmospheric motions.
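
The moment-estimation chain described above can be sketched with synthetic data. All parameters (sample count, PRF, echo frequency, SNR) are illustrative assumptions, not values from the record; the steps, however, follow the standard recipe: FFT the complex time series from one range gate, subtract a noise floor, and take spectral moments.

```python
import numpy as np

# Synthetic range-gate time series: one narrowband echo plus complex white noise.
n, prf = 512, 1000.0                  # samples, pulse repetition frequency (Hz)
f_true, snr_lin = 125.0, 100.0        # echo Doppler shift (on an FFT bin), linear SNR

rng = np.random.default_rng(2)
t = np.arange(n) / prf
noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
x = np.sqrt(snr_lin) * np.exp(2j * np.pi * f_true * t) + noise

# Doppler power spectrum for this gate.
power = np.abs(np.fft.fftshift(np.fft.fft(x))) ** 2 / n
freqs = np.fft.fftshift(np.fft.fftfreq(n, d=1 / prf))

# Robust noise level: median of an exponential distribution is mean * ln 2.
noise_floor = np.median(power) / np.log(2)
sig = np.clip(power - noise_floor, 0.0, None)

f_mean = np.sum(freqs * sig) / np.sum(sig)                    # mean Doppler shift
width = np.sqrt(np.sum((freqs - f_mean) ** 2 * sig) / np.sum(sig))  # spectral width
snr_est = np.sum(sig) / (noise_floor * n)                     # signal-to-noise ratio

print(f_mean, snr_est)   # close to the assumed 125 Hz and linear SNR of 100
```

For a pure tone the intrinsic width is near zero, so the `width` estimate here is dominated by clipped noise residue; with a real atmospheric echo the Gaussian-shaped spectrum makes it the meaningful turbulence proxy.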

An improved process for catalytic solvent refining or hydroliquefaction of non-anthracitic coal at elevated temperatures under hydrogen pressure in a solvent comprises using as catalyst a mixture of a 1,2- or 1,4-quinone and an alkaline compound, selected from ammonium, alkali metal, and alkaline earth metal oxides, hydroxides or salts of weak acids. 1 fig.

Processes and apparatus for providing improved catalytic cracking, specifically improved recovery of olefins, LPG or hydrogen from catalytic crackers. The improvement is achieved by passing part of the wet gas stream across membranes selective in favor of light hydrocarbons over hydrogen.

The invention is a pervaporation process and pervaporation equipment, using a series of membrane modules, and including inter-module reheating of the feed solution under treatment. The inter-module heating is achieved within the tube or vessel in which the modules are housed, thereby avoiding the need to repeatedly extract the feed solution from the membrane module train.

This paper presents a neuropsychosocial model of information processing to explain a victimization experience, specifically child sexual abuse. It surveys the relation of sensation, perception, and cognition as a systematic way to provide a framework for studying human behavior and describing human response to traumatic events. (Author/JDD)

The Association of New York Libraries for Technical Services (ANYLTS) is established to develop and run a centralized book processing facility for the public library systems in New York State. ANYLTS plans to receive book orders from the 22 library systems, transmit orders to publishers, receive the volumes from the publishers, print and attach…

Outlines the themes and purposes of humanistic education in the instruction of English-as-a-Second-Language, from the perspective of teacher, trainer, student, colleague, parent, and observer, focusing on the processes, values, and attitudes that underpin humanistic education and that are drawn from humanistic psychology. (Author/CB)

A process for the production of a mono-olefin from a gaseous paraffinic hydrocarbon having at least two carbon atoms or mixtures thereof comprising reacting said hydrocarbons and molecular oxygen in the presence of a platinum catalyst. The catalyst consists essentially of platinum supported on alumina or zirconia monolith, preferably zirconia and more preferably in the absence of palladium, rhodium and gold.

This paper examines the goals of education which should include the transmission of knowledge, training of skills necessary in a technological society, training of intellectual/abstractive capacities, autonomy and flexibility, and healthy personal and interpersonal functioning. The author argues that a process-oriented classroom approach is more…

We introduce and explore the asymmetric inclusion process (ASIP), an exactly solvable bosonic counterpart of the fermionic asymmetric exclusion process (ASEP). In both processes, random events cause particles to propagate unidirectionally along a one-dimensional lattice of n sites. In the ASEP, particles are subject to exclusion interactions, whereas in the ASIP, particles are subject to inclusion interactions that coalesce them into inseparable clusters. We study the dynamics of the ASIP, derive evolution equations for the mean and probability generating function (PGF) of the sites’ occupancy vector, obtain explicit results for the above mean at steady state, and describe an iterative scheme for the computation of the PGF at steady state. We further obtain explicit results for the load distribution in steady state, with the load being the total number of particles present in all lattice sites. Finally, we address the problem of load optimization, and solve it under various criteria. The ASIP model establishes bridges between statistical physics and queueing theory as it represents a tandem array of queueing systems with (unlimited) batch service, and a tandem array of growth-collapse processes.
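
The cluster dynamics described above can be sketched in a short Monte Carlo simulation. The lattice size and rates below are illustrative assumptions: particles arrive at site 1 at rate λ, each site's "gate" opens at rate μ, and an opening gate moves the site's entire cluster to the next site (coalescing it with whatever is there) or out of the system from the last site. A flux-balance argument, using the fact that gate openings are Poisson and independent of the state, gives μ·E[X_k] = λ at every site, so the time-averaged occupancies should all hover near λ/μ.

```python
import random

def simulate_asip(n=5, lam=1.0, mu=1.0, t_end=10_000.0, seed=3):
    """Monte Carlo ASIP on n sites; returns time-averaged site occupancies."""
    rng = random.Random(seed)
    occ = [0] * n                    # occupancy vector
    samples = [0.0] * n              # time-weighted occupancy accumulators
    t = 0.0
    while t < t_end:
        total = lam + n * mu         # superposed exponential clocks
        dt = rng.expovariate(total)
        for k in range(n):
            samples[k] += occ[k] * dt
        t += dt
        u = rng.random() * total
        if u < lam:
            occ[0] += 1              # arrival: particle joins cluster at site 1
        else:
            k = int((u - lam) / mu)  # gate k opens: whole cluster hops
            if k < n - 1:
                occ[k + 1] += occ[k]  # inclusion: clusters coalesce
            occ[k] = 0               # site emptied (exits system if k = n - 1)
    return [s / t for s in samples]

means = simulate_asip()
print([round(m, 2) for m in means])  # each near lam/mu = 1.0
```

Contrast with the ASEP: there, exclusion caps each site at a single particle; here, inclusion lets clusters grow without bound, so the means stay flat at λ/μ while the fluctuations grow along the lattice.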

Modified compression molding process produces plastic molding compounds that are strong, homogeneous, free of residual stresses, and have improved ablative characteristics. The conventional method is modified by applying a vacuum to the mold during the molding cycle, using a volatile sink, and exercising precise control of the mold closure limits.

This talk will examine some mathematical methods for image processing and the solution of underdetermined, linear inverse problems. The talk will have a tutorial flavor, mostly accessible to undergraduates, while still presenting research results. The primary approach is the use of optimization problems. We will find that relaxing the usual assumption of convexity will give us much better results.
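
A minimal sketch of the optimization approach, with synthetic data: recover a sparse signal from an underdetermined system b = Ax by solving min 0.5·‖Ax − b‖² + λ‖x‖₁ with ISTA (proximal gradient). This is the standard convex relaxation, used here only to show why optimization succeeds where plain least squares cannot; the talk's further step of relaxing convexity (e.g., ℓᵖ penalties with p < 1) is not implemented, and all dimensions and parameters below are illustrative assumptions.

```python
import numpy as np

# Underdetermined synthetic problem: 40 equations, 100 unknowns, 4 nonzeros.
rng = np.random.default_rng(4)
m, n, k = 40, 100, 4
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.choice([-1.0, 1.0], k)
b = A @ x_true

# ISTA: gradient step on the smooth term, soft-threshold for the l1 term.
lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1/L, L = Lipschitz const. of gradient
x = np.zeros(n)
for _ in range(2000):
    z = x - step * (A.T @ (A @ x - b))      # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # prox of lam*||.||_1

print(np.linalg.norm(x - x_true))           # small: sparse solution recovered
```

The minimum-norm least-squares solution `np.linalg.pinv(A) @ b` would spread energy over all 100 unknowns; the ℓ₁-penalized problem instead picks out the 4-sparse generator, which is the tutorial point of the talk.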

The effects of materials processing on the properties and behavior of high temperature yttrium barium copper oxide (YBCO) superconductors were investigated. Electrical, magnetic, and structural characteristics of thin film (300 nm) YBa2Cu3O(delta) structures grown by pulsed laser deposition on LaAlO3 and SrTiO3 substrates were used to evaluate processing. Pole projection and thin film diffraction measurements were used to establish grain orientation and verify structural integrity of the samples. Susceptibility, magnetization, and transport measurements were used to evaluate the magnetic and electrical transport properties of the samples. Our results verified that an unfortunate consequence of processing is inherent changes to the internal structure of the material. This effect translates into modifications in the properties of the materials, an undesired feature that makes it very difficult to consistently predict material behavior. The results show that processing evaluation must incorporate a comprehensive understanding of the properties of the materials. Future studies will emphasize microstructural characteristics of the materials, in particular, those microscopic properties that map macroscopic behavior.

In recent years, there has been strong demand for the development of novel devices and equipment that support advanced industries including IT/semiconductors, the environment, energy and aerospace along with the achievement of higher efficiency and reduced environmental impact. Many studies have been conducted on the fabrication of innovative inorganic materials with novel individual properties and/or multifunctional properties including electrical, dielectric, thermal, optical, chemical and mechanical properties through the development of particle processing. The fundamental technologies that are key to realizing such materials are (i) the synthesis of nanoparticles with uniform composition and controlled crystallite size, (ii) the arrangement/assembly and controlled dispersion of nanoparticles with controlled particle size, (iii) the precise structural control at all levels from micrometer to nanometer order and (iv) the nanostructural design based on theoretical/experimental studies of the correlation between the local structure and the functions of interest. In particular, it is now understood that the application of an external stimulus, such as magnetic energy, electrical energy and/or stress, to a reaction field is effective in realizing advanced particle processing [1-3]. This special issue comprises 12 papers including three review papers. Among them, seven papers are concerned with phosphor particles, such as silicon, metals, Si3N4-related nitrides, rare-earth oxides, garnet oxides, rare-earth sulfur oxides and rare-earth hydroxides. In these papers, the effects of particle size, morphology, dispersion, surface states, dopant concentration and other factors on the optical properties of phosphor particles and their applications are discussed. These nanoparticles are classified as zero-dimensional materials. Carbon nanotubes (CNT) and graphene are well-known one-dimensional (1D) and two-dimensional (2D) materials, respectively. This special issue also

The mission design for Cassini-Huygens calls for a four-year orbital survey of the Saturnian system and the descent into the Titan atmosphere and eventual soft-landing of the Huygens probe. The Cassini orbiter tour consists of 76 orbits around Saturn with 44 close Titan flybys and 8 targeted icy satellite flybys. The Cassini orbiter spacecraft carries twelve scientific instruments that will perform a wide range of observations on a multitude of designated targets. The science opportunities, frequency of encounters, the length of the Tour, and the use of distributed operations pose significant challenges for developing the science plan for the orbiter mission. The Cassini Science Planning Process is the process used to develop and integrate the science and engineering plan that incorporates an acceptable level of science required to meet the primary mission objectives for the orbiter. The bulk of the integrated science and engineering plan will be developed prior to Saturn Orbit Insertion (SOI). The Science Planning Process consists of three elements: 1) the creation of the Tour Atlas, which identifies the science opportunities in the tour, 2) the development of the Science Operations Plan (SOP), which is the conflict-free timeline of all science observations and engineering activities, a constraint-checked spacecraft pointing profile, and data volume allocations to the science instruments, and 3) an Aftermarket and SOP Update process, which is used to update the SOP while in tour with the latest information on spacecraft performance, science opportunities, and ephemerides. This paper will discuss the various elements of the Science Planning Process used on the Cassini Mission to integrate, implement, and adapt the science and engineering activity plans for Tour.

The rapid prototyping of application specific signal processors (RASSP) program is an ARPA/tri-service effort to dramatically improve the process by which complex digital systems, particularly embedded signal processors, are specified, designed, documented, manufactured, and supported. The domain of embedded signal processing was chosen because it is important to a variety of military and commercial applications as well as for the challenge it presents in terms of complexity and performance demands. The principal effort is being performed by two major contractors, Lockheed Sanders (Nashua, NH) and Martin Marietta (Camden, NJ). For both, improvements in methodology are to be exercised and refined through the performance of individual 'Demonstration' efforts. The Lockheed Sanders' Demonstration effort is to develop an infrared search and track (IRST) processor. In addition, both contractors' results are being measured by a series of externally administered (by Lincoln Labs) six-month Benchmark programs that measure process improvement as a function of time. The first two Benchmark programs are designing and implementing a synthetic aperture radar (SAR) processor. Our demonstration team is using commercially available VME modules from Mercury Computer to assemble a multiprocessor system scalable from one to hundreds of Intel i860 microprocessors. Custom modules for the sensor interface and display driver are also being developed. This system implements either proprietary or Navy owned algorithms to perform the compute-intensive IRST function in real time in an avionics environment. Our Benchmark team is designing custom modules using commercially available processor chip sets, communication submodules, and reconfigurable logic devices. One of the modules contains multiple vector processors optimized for fast Fourier transform processing. Another module is a fiberoptic interface that accepts high-rate input data from the sensors and provides video-rate output data to a

To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

Many of the aspects of the climate system that are of the greatest interest (e.g., the sensitivity of the system to external forcings) are emergent properties that arise via the complex interplay between disparate processes. This is also true for climate models: most diagnostics are not a function of an isolated portion of source code, but rather are affected by multiple components and procedures. Thus any model-observation mismatch is hard to attribute to any specific piece of code or imperfection in a specific model assumption. An alternative approach is to identify diagnostics that are more closely tied to specific processes -- implying that if a mismatch is found, it should be much easier to identify and address specific algorithmic choices that will improve the simulation. However, this approach requires looking at model output and observational data in a more sophisticated way than the more traditional production of monthly or annual mean quantities. The data must instead be filtered in time and space for examples of the specific process being targeted. We are developing a data analysis environment called PROcess-Based Explorer (PROBE) that seeks to enable efficient and systematic computation of process-based diagnostics on very large sets of data. In this environment, investigators can define arbitrarily complex filters and then seamlessly perform computations in parallel on the filtered output from their model. The same analysis can be performed on additional related data sets (e.g., reanalyses) thereby enabling routine comparisons between model and observational data. PROBE also incorporates workflow technology to automatically update computed diagnostics for subsequent executions of a model. In this presentation, we will discuss the design and current status of PROBE as well as share results from some preliminary use cases.

The present work is aimed at understanding and explaining some of the aspects of visual signal processing at the retinal level while exploiting the same towards the development of some simple techniques in the domain of digital image processing. Classical studies on retinal physiology revealed the nature of contrast sensitivity of the receptive field of bipolar or ganglion cells, which lie in the outer and inner plexiform layers of the retina. To explain these observations, a difference of Gaussian (DOG) filter was suggested, which was subsequently modified to a Laplacian of Gaussian (LOG) filter for computational ease in handling two-dimensional retinal inputs. To date, almost all image processing algorithms used in various branches of science and engineering have followed LOG or one of its variants. Recent observations in retinal physiology however, indicate that the retinal ganglion cells receive input from a larger area than the classical receptive fields. We have proposed an isotropic model for the non-classical receptive field of the retinal ganglion cells, corroborated from these recent observations, by introducing higher order derivatives of Gaussian expressed as linear combination of Gaussians only. In digital image processing, this provides a new mechanism of edge detection on one hand and image half-toning on the other. It has also been found that living systems may sometimes prefer to "perceive" the external scenario by adding noise to the received signals in the pre-processing level for arriving at better information on light and shade in the edge map. The proposed model also provides explanation to many brightness-contrast illusions hitherto unexplained not only by the classical isotropic model but also by some other Gestalt and Constructivist models or by non-isotropic multi-scale models. The proposed model is easy to implement both in the analog and digital domain. A scheme for implementation in the analog domain generates a new silicon retina
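
The classical DOG mechanism mentioned above can be sketched in one dimension. The kernel widths and the synthetic step edge below are assumed parameters, not the paper's proposed higher-order model; the sketch shows only the textbook baseline the paper builds on: a center-minus-surround (difference of Gaussians) filter whose zero crossings localize an edge, i.e. Marr-Hildreth edge detection with DOG approximating LOG.

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    """Normalized 1-D Gaussian kernel on [-radius, radius]."""
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-x ** 2 / (2 * sigma ** 2))
    return g / g.sum()

# Center-surround receptive field: narrow excitatory center minus wide
# inhibitory surround (assumed sigmas of 1 and 2).
radius = 8
dog = gaussian_kernel(1.0, radius) - gaussian_kernel(2.0, radius)

signal = np.concatenate([np.zeros(32), np.ones(32)])   # step edge at index 32
response = np.convolve(signal, dog, mode="same")

# Zero crossings of the response (with non-negligible slope) mark the edge.
zc = [i for i in range(1, len(response))
      if response[i - 1] * response[i] < 0
      and abs(response[i] - response[i - 1]) > 1e-3]
print(zc)   # a single crossing at the step
```

Because the DOG kernel is zero-mean, the response vanishes in uniform regions and changes sign exactly where the luminance steps, which is the contrast-sensitivity behavior the classical receptive-field studies reported.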

Examines the history of process and post-process in composition studies, focusing on ways in which terms, such as "current-traditional rhetoric," "process," and "post-process" have contributed to the discursive construction of reality. Argues that use of the term post-process in the context of second language writing needs to be guided by a…

... TSP received a document that purports to be a legal process requiring payment from the participant's account, the account will be unfrozen: (i) Upon payment pursuant to a qualifying legal process; or (ii) As... process. (i) The TSP will hold in abeyance the processing of a payment required by legal process if...

The evolution of vaccines (e.g., live attenuated, recombinant) and vaccine production methods (e.g., in ovo, cell culture) are intimately tied to each other. As vaccine technology has advanced, the methods to produce the vaccine have advanced and new vaccine opportunities have been created. These technologies will continue to evolve as we strive for safer and more immunogenic vaccines and as our understanding of biology improves. The evolution of vaccine process technology has occurred in parallel to the remarkable growth in the development of therapeutic proteins as products; therefore, recent vaccine innovations can leverage the progress made in the broader biotechnology industry. Numerous important legacy vaccines are still in use today despite their traditional manufacturing processes, with further development focusing on improving stability (e.g., novel excipients) and updating formulation (e.g., combination vaccines) and delivery methods (e.g., skin patches). Modern vaccine development is currently exploiting a wide array of novel technologies to create safer and more efficacious vaccines including: viral vectors produced in animal cells, virus-like particles produced in yeast or insect cells, polysaccharide conjugation to carrier proteins, DNA plasmids produced in E. coli, and therapeutic cancer vaccines created by in vitro activation of patient leukocytes. Purification advances (e.g., membrane adsorption, precipitation) are increasing efficiency, while innovative analytical methods (e.g., microsphere-based multiplex assays, RNA microarrays) are improving process understanding. Novel adjuvants such as monophosphoryl lipid A, which acts on antigen presenting cell toll-like receptors, are expanding the previously conservative list of widely accepted vaccine adjuvants. As in other areas of biotechnology, process characterization by sophisticated analysis is critical not only to improve yields, but also to determine the final product quality. From a regulatory

Apparatus for processing coal to prevent the creation of extreme fines and to extract pyrites from the principal coal fractions. The apparatus has two air-circulating circuits whose processing components cooperate so that substantial extraction of fines occurs initially in the first circuit, which releases the principal granulated coal fractions and pyrites to the second circuit, where specific-gravity separation of the pyrites and principal coal fractions occurs. The apparatus includes a source of drying heat added to the air moving in the circuits and delivered where surface-moisture drying is most effective. Furthermore, the apparatus is operated so as to reduce coal to a desired size without creating an excessive volume of extreme fines, to separate pyrites and hard-to-grind components by specific gravity in a region where fines are not present, and to use the extreme fines as a source of fuel to generate drying heat.

Disclosed herein is a coated substrate and a process for forming films on substrates and for providing a particularly smooth film on a substrate. The method of this invention involves subjecting a surface of a substrate to contact with a stream of ions of an inert gas having sufficient force and energy to substantially change the surface characteristics of said substrate, and then exposing a film-forming material to a stream of ions of an inert gas having sufficient energy to vaporize the atoms of said film-forming material and to transmit the vaporized atoms to the substrate surface with sufficient force to form a film bonded to the substrate. This process is particularly useful commercially because it forms strong bonds at room temperature. This invention is particularly useful for adhering a gold film to diamond and forming ohmic electrodes on diamond, but also can be used to bond other films to substrates.

Design and performance details of the Advanced Information Processing System (AIPS) for fault- and damage-tolerant data processing on aircraft and spacecraft are presented. AIPS comprises several computers distributed throughout the vehicle and linked by a damage-tolerant data bus. Most I/O functions are available to all the computers, which run in a TDMA mode. Each computer performs separate specific tasks in normal operation and assumes other tasks in degraded modes. Redundant software ensures that all fault monitoring, logging, and reporting are automated, together with control functions. Redundant duplex links and damage-spread limitation provide the fault tolerance. Details of an advanced design of a laboratory-scale proof-of-concept system are described, including functional operations.
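The degraded-mode behavior described above, where each computer owns specific tasks and inherits others when a node fails, can be sketched as a static primary/backup assignment table. The node names, task names, and inheritance order here are illustrative assumptions, not details of the AIPS design.

```python
# Illustrative sketch of degraded-mode task reassignment: each node owns
# primary tasks, and a fixed backup table says who inherits them on failure.
PRIMARY = {
    "node_a": ["flight_control"],
    "node_b": ["navigation"],
    "node_c": ["displays"],
}
BACKUP = {  # inheritance order for each node's tasks if that node fails
    "node_a": ["node_b", "node_c"],
    "node_b": ["node_c", "node_a"],
    "node_c": ["node_a", "node_b"],
}

def assign_tasks(healthy):
    """Return a task -> node map given the set of healthy nodes.

    A failed node's tasks go to the first healthy node in its backup
    list; a task is dropped only if no backup survives.
    """
    assignment = {}
    for node, tasks in PRIMARY.items():
        owner = node if node in healthy else next(
            (b for b in BACKUP[node] if b in healthy), None)
        for task in tasks:
            if owner is not None:
                assignment[task] = owner
    return assignment
```

With all three nodes healthy every task stays on its primary; losing node_b moves navigation to node_c while the other tasks are unaffected, mirroring the "assumes other tasks in degraded modes" behavior in the abstract.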

My talk is concerned with a review, not necessarily of the latest theoretical developments, but rather of an old idea which has contributed to recent theoretical activities. By soft pion processes I mean processes in which low-energy pions are emitted, absorbed, or scattered, just as we use the word soft photon in a similar context. Speaking more quantitatively, we may call a pion soft if its energy is small compared to a natural scale in the reaction. This scale is determined by the particular dynamics of pion interaction, and one may roughly say that a pion is soft if its energy is small compared to the energies of the other individual particles that participate in the reaction. It is important to note at this point that the pion is by far the lightest of all the hadrons, and much of the success of the soft pion formulas depends on this fact.

Solcoseryl® is a protein-free haemodialysate, containing a broad spectrum of low molecular components of cellular mass and blood serum obtained from veal calves. Solcoseryl® improves the transport of oxygen and glucose to cells that are under hypoxic conditions. It increases the synthesis of intracellular ATP and contributes to an increase in the level of aerobic glycolysis and oxidative phosphorylation. It activates the reparative and regenerative processes in tissues by stimulating fibroblast proliferation and repair of the collagen vascular wall. The formulations of Solcoseryl® are infusion, injection, gel and ointment, and it is also available as a dental paste for inflammatory processes of the mouth cavity, gums and lips. PMID:26179567

The first flight of the Advanced Thin Ionization Calorimeter (ATIC) experiment from McMurdo, Antarctica lasted for 16 days, starting in December 2000. The ATIC instrument consists of a fully active 320-crystal, 960-channel Bismuth Germanate (BGO) calorimeter, 202 scintillator strips in 3 hodoscopes interleaved with a graphite target, and a 4480-pixel silicon matrix charge detector. We have developed an Object Oriented data processing package based on ROOT. In this paper, we describe the data processing scheme used in handling the accumulated 45 GB of flight data. We also discuss trigger issues, by comparing the measured energy-dependent trigger efficiency with its simulation, and calibration issues, by considering the time-dependence of housekeeping information, etc.
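A measured energy-dependent trigger efficiency of the kind compared against simulation above is conventionally built by binning events in deposited energy and taking the fraction that fired the trigger in each bin. The binning, the binomial uncertainty, and the toy event list below are a generic sketch, not the ATIC collaboration's actual analysis code.

```python
import math

def trigger_efficiency(energies, fired, bin_edges):
    """Energy-dependent trigger efficiency.

    For each energy bin [edge_i, edge_{i+1}), return the fraction of
    events whose trigger fired, with a simple binomial uncertainty
    sqrt(eff * (1 - eff) / n).  Empty bins yield None.
    """
    n_bins = len(bin_edges) - 1
    total = [0] * n_bins
    passed = [0] * n_bins
    for e, f in zip(energies, fired):
        for i in range(n_bins):
            if bin_edges[i] <= e < bin_edges[i + 1]:
                total[i] += 1
                passed[i] += 1 if f else 0
                break
    eff, err = [], []
    for p, n in zip(passed, total):
        if n == 0:
            eff.append(None)
            err.append(None)
        else:
            frac = p / n
            eff.append(frac)
            err.append(math.sqrt(frac * (1 - frac) / n))
    return eff, err

# Toy events: deposited energy (arbitrary units) and trigger flag.
eff, err = trigger_efficiency(
    energies=[5, 15, 15, 25, 25],
    fired=[False, True, False, True, True],
    bin_edges=[0, 10, 20, 30],
)
```

The same per-bin ratio computed on simulated events gives the simulated efficiency curve, and overlaying the two is the comparison the abstract refers to.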

The application of artificial intelligence principles to the processing of radar signals is considered theoretically. The main capabilities required are learning and adaptation in a changing environment, processing and modeling information (especially dynamics and uncertainty), and decision-making based on all available information (taking its reliability into account). For the application to combat-aircraft radar systems, the tasks include the combination of data from different types of sensors, reacting to electronic counter-countermeasures, evaluation of how much data should be acquired (energy and radiation management), control of the radar, tracking, and identification. Also discussed are related uses such as monitoring the avionics systems, supporting pilot decisions with respect to the radar system, and general applications in radar-system R&D.

This introduction to the industrial primary aluminum production process presents a short description of the electrolytic reduction technology, the history of aluminum, and the importance of this metal and its production process to modern society. Aluminum's special qualities have enabled advances in technologies coupled with energy and cost savings. Aircraft capabilities have been greatly enhanced, and increases in size and capacity are made possible by advances in aluminum technology. The metal's flexibility for shaping and extruding has led to architectural advances in energy-saving building construction. The high strength-to-weight ratio has meant a substantial reduction in energy consumption for trucks and other vehicles. The aluminum industry is therefore a pivotal one for ecological sustainability and strategic for technological development. PMID:24806722

A two-step process for dissolving plutonium metal, in which the two steps can be carried out sequentially or simultaneously. Plutonium metal is exposed to a first mixture containing approximately 1.0 M to 1.67 M sulfamic acid and 0.0025 M to 0.1 M fluoride, the mixture having been heated to a temperature between 45 °C and 70 °C. The mixture will dissolve a first portion of the plutonium metal but leave a portion of the plutonium in an oxide residue. Then, a mineral acid and additional fluoride are added to dissolve the residue. Alternatively, nitric acid in a concentration between approximately 0.05 M and 0.067 M is added to the first mixture to dissolve the residue as it is produced. Hydrogen released during the dissolution process is diluted with nitrogen.

This article, the second in a series of articles on Leading Better Care, describes the actions undertaken in recent years in NHS Lanarkshire to improve selection processes for nursing, midwifery and allied health professional (NMAHP) posts. This is an area of significant interest to these professions, management colleagues and patients given the pivotal importance of NMAHPs to patient care and experience. In recent times the importance of selecting staff not only with the right qualifications but also with the right attributes has been highlighted to ensure patients are well cared for in a safe, effective and compassionate manner. The article focuses on NMAHP selection processes, tracking local, collaborative development work undertaken to date. It presents an overview of some of the work being implemented, highlights a range of important factors, outlines how evaluation is progressing and concludes by recommending further empirical research. PMID:25370266