Sample records for segas process

Astronaut Ronald M. Sega floats in weightlessness in the Space Shuttle Discovery's crew cabin as the Remote Manipulator System (RMS) arm holds the Wake Shield Facility (WSF) aloft. The mission specialist is co-principal investigator on the WSF project. Note the University of Colorado, Colorado Springs banner above his head.

STS-76 Payload Commander Ronald M. Sega is donning his launch/entry suit in the Operations and Checkout Building with assistance from a suit technician. The third docking between the Russian Space Station Mir and the U.S. Space Shuttle marks the second trip into space for Sega, who recently served a five-month assignment in Russia as operations director for NASA activities there. Once suit-up activities are completed, the six-member STS-76 flight crew will depart for Launch Pad 39B, where the Space Shuttle Atlantis is undergoing final preparations for liftoff during an approximately seven-minute launch window opening around 3:13 a.m. EST, March 22.

The Wake Shield Facility is displayed on a test stand at JSC. Astronaut Ronald M. Sega, mission specialist for STS-60, is seen with the facility during a break in testing in the acoustic and vibration facility at JSC.

The experiments were carried out to assess the effect of fertilizer application on the indigenous medicinal plant Andrographis paniculata Nees (Sega-gyi), measuring yield components such as plant height (cm), fresh weight of the whole plant (g), dry weight of the whole plant (g), dry weight of leaves per plant (g), mineral element contents of the leaves (N, P, K, Ca and Mg) and the medicinally active compound andrographolide of the leaves, in a greenhouse experiment. The methods applied in growing A. paniculata Nees (Sega-gyi) comprised the dripping (dropwise) and spraying methods of applying the prepared blue-green alga (BGA) Spirulina, a composite mixture of prepared BGA + soil, mineral fertilizer + soil, and soil itself as control. Of all the fertilizer treatments, the dripping (dropwise) method using the BGA biofertilizer gave rise to the highest growth of 100 cm, with an average fresh weight of the whole plant of 440 g. Andrographolide crystals were isolated, identified and confirmed by chromatographic techniques. A single standard HPLC peak by UV detection (225 nm) indicating a retention time of 4.36 min and a melting point of 232 °C were found to correspond to the literature values. Analytical results for the leaves of Sega-gyi grown by the dripping (dropwise) method indicated the presence of 2.12% andrographolide, together with mineral elements with the composition N (22.78), P (1.93), K (16.15), Ca (23.70) and Mg (4.85) mg/g. Although the mechanism of micro-algal plant growth regulatory action has not yet been studied, this research showed that the BGA biofertilizer promotes plant growth, improves the soil physical condition, and also enhances the yield of the medicinally active compound andrographolide.

Rattan is one of the natural resources of Peninsular Malaysia, Indonesia and neighbouring regions. The lack of information on the properties of rattan is one reason this material is not recognised as an engineering material. The flexibility, or plasticity, of rattan is actually a strong point for developing it as an engineering material, for example as cement reinforcement for earthquake resistance or as a spring. This paper therefore presents the properties of the rattan Calamus caesius (rattan sega) and its application as a spring. The properties of the rattan were determined according to ASTM standards with suitable modifications. This research shows that the plasticity of the rattan allows it to retain its spring form. The stiffness coefficient of the spring was measured from the relation between force and displacement. The stiffness value obtained from measurement was compared with that from the analytical method, which is valid in the elastic region.
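The stiffness measurement described in the abstract can be sketched numerically: given force-displacement pairs (the values below are purely illustrative, not taken from the paper), the stiffness coefficient follows from a least-squares fit of Hooke's law F = kx through the origin.

```python
import numpy as np

# Hypothetical force (N) vs. displacement (m) readings for a rattan spring;
# the numbers are illustrative, not taken from the paper.
force = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
displacement = np.array([0.0, 0.01, 0.02, 0.03, 0.04])

# In the elastic region Hooke's law F = k * x holds; the stiffness
# coefficient k is the least-squares slope of the fit through the origin.
k = np.sum(force * displacement) / np.sum(displacement ** 2)
print(f"stiffness k = {k:.1f} N/m")  # 200.0 N/m for these sample data
```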

Different hand thinning treatments were conducted on the Segae date palm cultivar to study their effect on bunch yield and fruit quality. Five thinning treatments were investigated at Deirab, Riyadh, Saudi Arabia: control, i.e. no thinning (A); removing 10 cm of strand length per bunch (B); removing 20 cm of strand length per bunch (C); removing the middle of the bunch (D); and removing the middle of the bunch plus 10 cm of strand length per bunch (E). Fruit thinning substantially decreased bunch yield and increased fruit weight, flesh weight, fruit size and fruit dimensions in both seasons as compared with the control (no thinning) treatment. Fruit thinning had a significant effect on fruit acidity, total soluble solids and total sugars in both seasons. Thinning treatments had no effect on seed weight, reducing sugars, non-reducing sugars or moisture content in the two seasons. It can be recommended that removing the middle of the bunch together with 10 cm of strand length per bunch (treatment E) is the most appropriate thinning practice, as it gave the highest bunch yield with the best fruit quality compared with the other applied treatments. Principal component analysis extracted three components, which explained 82.92% and 82.11% of the total variance in the first and second seasons, respectively. The first component (50.98% and 43.20%) was strongly influenced by fruit length, fruit diameter, fruit weight, fruit volume, seed weight and flesh weight in the first and second seasons, respectively. The second component (19.69% and 24.95%) was strongly affected by total sugars, non-reducing sugars and bunch weight, and by total sugars and non-reducing sugars, in the first and second seasons, respectively. The third component (12.24% and 13.97%) was strongly affected by total soluble solids and moisture content in the first and second seasons, respectively. This information can be used for future studies and can be used in
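The principal component analysis step can be illustrated with a minimal numpy sketch on synthetic data standing in for the fruit-quality table (the matrix below is random, not the study's measurements); it shows how explained-variance fractions like those reported above are obtained from the eigenvalues of the covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
# Random stand-in for the fruit-quality table: rows are bunches, columns
# are traits (fruit length, diameter, weight, volume, seed and flesh weight).
X = rng.normal(size=(30, 6))

# PCA: centre the data, form the covariance matrix, eigendecompose it.
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
order = np.argsort(eigvals)[::-1]            # largest variance first
explained = eigvals[order] / eigvals.sum()   # fraction of total variance
print(explained[:3])                         # leading three components
```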

The gaseous, liquid and solid radioactive effluents generated by fuel reprocessing cannot be released into the environment. They have to be treated in order to respect the limits of the pollution regulations. These treatment processes are detailed and discussed in this technical paper. A second part is devoted to the SPIN research programme on the separation of long-lived radionuclides in order to reduce the radioactive waste storage volume. (A.L.B.)

This publication provides an introduction to meat processing for adult students in vocational and technical education programs. Organized in four chapters, the booklet provides a brief overview of the meat processing industry and the techniques of meat processing and butchering. The first chapter introduces the meat processing industry and…

Effluents from multi-industrial activities influence the inland water system directly, which subsequently affects groundwater quality and human health. Some previous reports indicated that inadequate treatment of the discharged effluent of the Dhaka Export Processing Zone (DEPZ) increased the concentrations of pollutants in the surface water system and deteriorated the fishing and agricultural systems around the DEPZ and its connected area. Therefore, the present study was conducted to investigate whether the concentrations of selected metals, viz. Li, V, Cr, Co, Ni, Cu, Zn, Ga, As, Se, Rb, Sr, Ag, Cd, Cs, Ba, Pb and U, in two types of groundwater sources were within the permissible guidelines or influenced by the DEPZ multi-industrial activities. The concentrations of the metals were determined using Inductively Coupled Plasma Mass Spectrometry (ICP-MS). The mean concentrations of the elements in both types of groundwater were within their permissible guidelines, except for Ni (12.91 µg/L), Ga (0.48 µg/L), Sr (90.26 µg/L) and Cs (0.07 µg/L) in groundwater inside the DEPZ, which were 1.30, 5.00, 1.50 and 1.40 times higher than the maximum permissible limits (MPL) of 10 µg/L, 0.09 µg/L, 60 µg/L and 0.05 µg/L, respectively. The mean concentrations of Li (6.85 µg/L), Zn (268 µg/L), Ga (0.12 µg/L), Sr (131 µg/L) and Cs (0.07 µg/L) were 3.43, 1.34, 1.33, 2.18 and 1.40 times higher than the MPL of 2 µg/L, 200 µg/L, 0.09 µg/L, 60 µg/L and 0.05 µg/L, respectively, in groundwater around the DEPZ. Comparatively, Zn and Sr showed higher concentrations, and Cs and U lower concentrations, in both types of groundwater sources. The elements were distributed in a homogeneous and a heterogeneous manner among the source points for deep tubewells (DTWs) and shallow tubewells (STWs), respectively. Significant positive correlations were found between the elements, viz. Co-V (0.85), Ni-Sr (0.70), Co-Cd (0.86), As-Se (0.99), Cs-Zn (0.95), Li-U (0.71), Zn-U (0

The item 'process development' comprises the production of acetone/butanol with C. acetobutylicum and the fermentation of potato waste. The target is to increase productivity by taking the following measures: optimization of media, on-line process analysis, reaction analysis, mathematical modelling and parameter identification, process simulation, development of a state estimator with the help of the on-line process analysis and the model, and optimization and adaptive control.

The Poisson process is a stochastic counting process that arises naturally in a large variety of daily life situations. We present a few definitions of the Poisson process and discuss several properties as well as relations to some well-known probability distributions. We further briefly discuss the
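A minimal simulation makes the counting-process definition concrete: inter-arrival times of a homogeneous Poisson process are i.i.d. exponential with mean 1/rate, so a sample path can be generated by cumulating exponentials (the rate and horizon below are arbitrary choices).

```python
import numpy as np

rng = np.random.default_rng(42)
rate, horizon = 3.0, 1000.0   # events per unit time; observation window

# Inter-arrival times of a homogeneous Poisson process are i.i.d.
# exponential with mean 1/rate; cumulative sums give the arrival times.
inter_arrivals = rng.exponential(1.0 / rate, size=int(2 * rate * horizon))
arrival_times = np.cumsum(inter_arrivals)

# N(horizon): number of arrivals in [0, horizon]; N(t)/t tends to rate.
count = int(np.searchsorted(arrival_times, horizon))
print(count / horizon)   # close to the rate 3.0
```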

Data Processing discusses the principles, practices, and associated tools in data processing. The book is comprised of 17 chapters that are organized into three parts. The first part covers the characteristics, systems, and methods of data processing. Part 2 deals with the data processing practice; this part discusses the data input, output, and storage. The last part discusses topics related to systems and software in data processing, which include checks and controls, computer language and programs, and program elements and structures. The text will be useful to practitioners of computer-rel

Well-written and accessible, this classic introduction to stochastic processes and related mathematics is appropriate for advanced undergraduate students of mathematics with a knowledge of calculus and continuous probability theory. The treatment offers examples of the wide variety of empirical phenomena for which stochastic processes provide mathematical models, and it develops the methods of probability model-building.Chapter 1 presents precise definitions of the notions of a random variable and a stochastic process and introduces the Wiener and Poisson processes. Subsequent chapters examine
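The Wiener process mentioned above can be sampled on a time grid by cumulating independent Gaussian increments of variance dt; the following sketch (step count and seed are arbitrary) illustrates the construction.

```python
import numpy as np

rng = np.random.default_rng(1)
n_steps, dt = 10_000, 1e-3   # grid resolution; arbitrary choices

# A Wiener process has W(0) = 0 and independent Gaussian increments
# with mean 0 and variance dt over each step of length dt.
increments = rng.normal(0.0, np.sqrt(dt), size=n_steps)
W = np.concatenate([[0.0], np.cumsum(increments)])

# W(t) is N(0, t) distributed; here t = n_steps * dt = 10.
print(W[-1])
```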

The 1988 progress report of the Data Processing Laboratory (Polytechnic School, France) is presented. The laboratory's research fields are: semantics, testing and semantic analysis of codes, formal calculus, software applications, algorithms, neural networks and VLSI (Very Large Scale Integration). The investigations concerning polynomial rings are performed by means of the standard basis approach. Among the research topics are Pascal codes, parallel processing, the combinatorial, statistical and asymptotic properties of fundamental data processing tools, signal processing and pattern recognition. The published papers, conference communications and theses are also listed.

Suppurative processes in bronchiectatic disease, abscess and gangrene of the lungs are described. Characteristic signs of the roentgenologic picture of the above-mentioned diseases are considered. It is shown that in most cases roentgenologic studies make it possible to establish a high-quality diagnosis of suppurative processes.

Inspiration for most research and optimisation of design processes still seems to come from within the narrow field of traditional design practice. The focus in this study turns to businesses associated with the design professions in order to learn from their development processes. Through interviews...... and emerging production methods....

Process development: The paper describes the organization and laboratory facilities of the group working on radioactive ore processing studies. It contains a review of the research carried out and plans for the near future. A list of published reports is also presented.

The majority of foods that are consumed in our developed society have been processed. Processing promotes a non-enzymatic reaction between proteins and sugars, the Maillard reaction (MR). Maillard reaction products (MRPs) contribute to the taste, smell and color of many food products, and thus

Membrane processes play an important role in industrial separation processes. These technologies can be found in all industrial areas such as food, beverages, metallurgy, pulp and paper, textile, pharmaceutical, automotive, biotechnology and the chemical industry, as well as in water treatment for domestic and industrial application. Although these processes have been known since the twentieth century, there are still many studies that focus on the testing of new membrane materials and on determining the conditions for optimal selectivity, i.e. the optimum transmembrane pressure (TMP) or permeate flux that minimizes fouling. Moreover, researchers have proposed calculation methods to predict the properties of membrane processes. In this article, laboratory-scale experiments on membrane separation techniques, as well as their validation by calculation methods, are presented. Because the membrane is the "heart" of the process, experimental and computational methods for its characterization are also described.
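The flux-TMP relation mentioned above can be illustrated with Darcy's law for a clean membrane, J = TMP / (mu * R_m); all numerical values below are assumed for illustration, not taken from the article.

```python
# Darcy's law for a clean membrane: permeate flux J = TMP / (mu * R_m).
# All values are assumed for illustration, not taken from the article.
TMP = 2.0e5    # transmembrane pressure, Pa
mu = 1.0e-3    # permeate (water) viscosity, Pa*s
R_m = 1.0e12   # hydraulic resistance of the membrane, 1/m

J = TMP / (mu * R_m)   # flux in m^3/(m^2*s), i.e. m/s
print(J)               # 2e-4 m/s
print(J * 3.6e6)       # the same flux in L/(m^2*h): ~720
```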

Process mining includes the automated discovery of processes from event logs. Based on observed events (e.g., activities being executed or messages being exchanged) a process model is constructed. One of the essential problems in process mining is that one cannot assume to have seen all possible...... behavior. At best, one has seen a representative subset. Therefore, classical synthesis techniques are not suitable as they aim at finding a model that is able to exactly reproduce the log. Existing process mining techniques try to avoid such “overfitting” by generalizing the model to allow for more...... support for it). None of the existing techniques enables the user to control the balance between “overfitting” and “underfitting”. To address this, we propose a two-step approach. First, using a configurable approach, a transition system is constructed. Then, using the “theory of regions”, the model...

This discussion paper considers the possibility of applying to the recycle of plutonium in thermal reactors a particular method of partial processing based on the PUREX process but named CIVEX to emphasise the differences. The CIVEX process is based primarily on the retention of short-lived fission products. The paper suggests: (1) the recycle of fission products with uranium and plutonium in thermal reactor fuel would be technically feasible; (2) it would, however, take ten years or more to develop the CIVEX process to the point where it could be launched on a commercial scale; (3) since the majority of spent fuel to be reprocessed this century will have been in storage for ten years or more, the recycling of short-lived fission products with the U-Pu would not provide an effective means of making refabricated fuel "inaccessible", because the radioactivity associated with the fission products would have decayed. There would therefore be no advantage in partial processing.

Many of the measurements and observations made in a nuclear processing facility to monitor processes and product quality can also be used to monitor the location and movements of nuclear materials. In this session information is presented on how to use process monitoring data to enhance nuclear material control and accounting (MC and A). It will be seen that SNM losses can generally be detected with greater sensitivity and timeliness and point of loss localized more closely than by conventional MC and A systems if process monitoring data are applied. The purpose of this session is to enable the participants to: (1) identify process unit operations that could improve control units for monitoring SNM losses; (2) choose key measurement points and formulate a loss indicator for each control unit; and (3) describe how the sensitivities and timeliness of loss detection could be determined for each loss indicator
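A loss indicator for a single control unit can be as simple as the inventory difference over a balance period; the sketch below uses hypothetical quantities to show the arithmetic (it is not a description of any specific facility's accounting system).

```python
# Inventory difference for one control unit over a balance period:
# ID = (beginning inventory + receipts) - (removals + ending inventory).
# Quantities (kg of nuclear material) are hypothetical.
beginning, receipts = 120.0, 35.0
removals, ending = 40.0, 114.6

ID = (beginning + receipts) - (removals + ending)
print(round(ID, 2))  # 0.4 kg unaccounted for in this period
```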

Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task…

Since the first edition was published over a decade ago, advancements have been made in the design, operation, and maintenance of sewer systems, and new problems have emerged. For example, sewer processes are now integrated in computer models, and simultaneously, odor and corrosion problems caused...... by hydrogen sulfide and other volatile organic compounds, as well as other potential health issues, have caused environmental concerns to rise. Reflecting the most current developments, Sewer Processes: Microbial and Chemical Process Engineering of Sewer Networks, Second Edition, offers the reader updated...... and valuable information on the sewer as a chemical and biological reactor. It focuses on how to predict critical impacts and control adverse effects. It also provides an integrated description of sewer processes in modeling terms. This second edition is full of illustrative examples and figures, includes...

This review contains more than 100 observations and 224 references on the dissolution phenomenon. The dissolution processes are grouped into three categories: methods of aqueous attack, fusion methods, and miscellaneous observations on phenomena related to dissolution problems

This monograph serves as an introductory text to classical renewal theory and some of its applications for graduate students and researchers in mathematics and probability theory. Renewal processes play an important part in modeling many phenomena in insurance, finance, queuing systems, inventory control and other areas. In this book, an overview of univariate renewal theory is given and renewal processes in the non-lattice and lattice case are discussed. A pre-requisite is a basic knowledge of probability theory.

The technical and economic viability of the fast breeder reactor as an electricity generating system depends not only upon the reactor performance but also on a capability to recycle plutonium efficiently, reliably and economically through the reactor and fuel cycle facilities. Thus the fuel cycle is an integral and essential part of the system. Fuel cycle research and development has focused on demonstrating that the challenging technical requirements of processing plutonium fuel could be met and that the sometimes conflicting requirements of the fuel developer, fuel fabricator and fuel reprocessor could be reconciled. Pilot plant operation and development and design studies have established both the technical and economic feasibility of the fuel cycle but scope for further improvement exists through process intensification and flowsheet optimization. These objectives and the increasing processing demands made by the continuing improvement to fuel design and irradiation performance provide an incentive for continuing fuel cycle development work. (author)

This paper invites to discuss the processes of individualization and organizing being carried out under what we might see as an emerging regime of change. The underlying argumentation is that in certain processes of change, competence becomes questionable at all times. The hazy characteristics...... of this regime of change are pursued through a discussion of competencies as opposed to qualifications illustrated by distinct cases from the Danish public sector in the search for repetitive mechanisms. The cases are put into a general perspective by drawing upon experiences from similar change processes...... in MNCs. The paper concludes by asking whether we can escape from a regime of competence in a world defined by a rhetoric of change and create a more promising world in which doubt and search serve as a strategy for gaining knowledge and professionalism that improve on our capability for mutualism....

The final chapter of this book gives a basic introduction to welding processes. A good radiographer must know something about welding processes in order to judge which welds must be rejected. Almost all of the exposure techniques mentioned in earlier chapters are applicable in this field, because an uninspected welding process is a critical problem. This chapter therefore discusses the discontinuities that usually appear; discontinuities that are less important and have little impact when found are not described here. On top of that, the decision to accept or reject is based on the code, standard and specification agreed by both parties, to make sure that the decision is correct and meaningful.

Digital images are used as an information carrier in different sciences and technologies. There is a drive to increase the number of bits per image pixel in order to obtain more information. In this paper, some methods of compression and contour detection based on a two-dimensional Markov chain are offered. Increasing the number of bits per pixel allows fine object details to be resolved more precisely, but it significantly complicates image processing. The proposed methods do not concede efficiency to well-known analogues, but surpass them in processing speed. An image is separated into binary images, each of which is processed in parallel, so that processing does not slow down as the number of bits per pixel increases. One more advantage of the methods is low consumption of energy resources: only logical procedures are used and there are no arithmetic operations. The methods can be useful for processing images of any class and assignment in systems with limited time and energy resources.
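The separation into binary images described above corresponds to a bit-plane decomposition; a short sketch (the image here is random, for illustration only) shows the decomposition and verifies it is lossless.

```python
import numpy as np

rng = np.random.default_rng(7)
image = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)

# Separate the 8-bit image into eight binary images (bit planes); each
# plane can then be processed independently, e.g. in parallel, using only
# logical operations as the abstract suggests.
planes = [(image >> b) & 1 for b in range(8)]

# Exact reconstruction from the planes shows the decomposition is lossless.
reconstructed = sum(p.astype(np.uint8) << b for b, p in enumerate(planes))
assert np.array_equal(reconstructed, image)
```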

Film processing is performed not only to show what is on the film but also to produce a radiograph of high quality, in which the information gathered truly represents the quality of the inspected object. Besides that, a good procedure yields film of good quality that can be kept for a long time for reference. Here, the function and design of the darkroom are described in more detail. The proper procedure for processing the film is thus discussed in detail in this chapter, from entering the darkroom to leaving it.

A process is described for extracting at least two desired constituents from a mineral, using a liquid reagent which produces the constituents, or compounds thereof, in separable form and independently extracting those constituents, or compounds. The process is especially valuable for the extraction of phosphoric acid and metal values from acidulated phosphate rock, the slurry being contacted with selective extractants for phosphoric acid and metal (e.g. uranium) values. In an example, uranium values are oxidized to uranyl form and extracted using an ion exchange resin. (U.K.)

The search for an optimal design of a heavy water plant is carried out by means of a simulation model for the mass and enthalpy balances of the H2S-H2O exchange process. A simplified model was used for the simulation diagram, in which the entire plant is represented by a single tray tower with recycles and heat and mass feeds/extractions. The tower is simulated by the method developed by Tomich, with the convergence part given by the algorithm of Broyden. The concluding part of the work centres on setting the design parameters (flowrates, heat exchange rates, number of plates) which give the desired process operating conditions. (author)
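The Broyden convergence step can be sketched on a toy two-equation balance (the paper's actual flowsheet equations are not reproduced here): a quasi-Newton iteration solves the linearized system and applies a rank-one correction to the approximate Jacobian after each step.

```python
import numpy as np

# Toy two-equation balance standing in for the tray-tower equations
# (illustrative only; not the paper's flowsheet model).
def residuals(x):
    return np.array([x[0] + 0.5 * x[1] - 1.0,
                     0.5 * x[0] ** 2 + x[1] - 1.0])

# Broyden's quasi-Newton method: solve B*dx = -f, step, then apply the
# rank-one ("good Broyden") update to the approximate Jacobian B.
x, B = np.zeros(2), np.eye(2)
f = residuals(x)
for _ in range(50):
    dx = np.linalg.solve(B, -f)
    x_new = x + dx
    f_new = residuals(x_new)
    B += np.outer(f_new - f - B @ dx, dx) / np.dot(dx, dx)
    x, f = x_new, f_new
    if np.linalg.norm(f) < 1e-12:
        break
print(x)  # root near [0.586, 0.828]
```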

-terminal of the scissile bond, leaving C-terminal fusions with non-native C-termini after processing. A solution yielding native C-termini would allow novel expression and purification systems for therapeutic proteins and peptides. The peptidyl-Lys metallopeptidase (LysN) of the fungus Armillaria mellea (Am) is one...... of few known proteases to have substrate specificity for the C-terminal side of the scissile bond. LysN exhibits specificity for lysine, and has primarily been used to complement trypsin in proteomic studies. A working hypothesis during this study was the potential of LysN as a processing protease...

Angled and forked wood, a desired material until the 19th century, was swept away by industrialization and its standardization of processes and materials. Contemporary information technology has the potential for the capturing and recognition of individual geometries through laser scanning...

Bentonite has a wide variety of uses. A special use of bentonite, where its absorbing properties are employed to provide water-tight sealing, is for an underground repository in granites. In this paper, bentonite processing and beneficiation are described.

Purpose of this report: This report was prepared for the RISO team involved in the design of the innovation system. The report provides an innovation methodology to establish a common understanding of the process concepts and related terminology. The report does not include RISO- or Denmark-specific cultural, econom...

I propose that the course of development in first and second language acquisition is shaped by two types of processing pressures--internal efficiency-related factors relevant to easing the burden on working memory and external input-related factors such as frequency of occurrence. In an attempt to document the role of internal factors, I consider…

The process of treating bituminiferous solid materials such as shale or the like to obtain valuable products therefrom, which comprises digesting a mixture of such material in comminuted condition with a suitable digestion liquid, such as an oil, recovering products vaporized in the digestion, and separating residual solid matter from the digestion liquid by centrifuging.

A gold and uranium ore is heap leached in accordance with the process comprising initial agglomeration of fines in the feed by means of a binding agent and cyanide solution. The lixiviant comprises a compatible mixture of sodium cyanide and sodium bicarbonate.

Signal processing techniques, extensively used nowadays to maximize the performance of audio and video equipment, have been a key part in the design of hardware and software for high energy physics detectors since pioneering applications in the UA1 experiment at CERN in 1979.

Process validation concerns the establishment of the irradiation conditions that will lead to the desired changes of the irradiated product. Process validation therefore establishes the link between absorbed dose and the characteristics of the product, such as degree of crosslinking in a polyethylene tube, prolongation of shelf life of a food product, or degree of sterility of the medical device. Detailed international standards are written for the documentation of radiation sterilization, such as EN 552 and ISO 11137, and the steps of process validation that are described in these standards are discussed in this paper. They include material testing for the documentation of the correct functioning of the product, microbiological testing for selection of the minimum required dose and dose mapping for documentation of attainment of the required dose in all parts of the product. The process validation must be maintained by reviews and repeated measurements as necessary. This paper presents recommendations and guidance for the execution of these components of process validation. (author)

This book provides a rigorous yet accessible introduction to the theory of stochastic processes. A significant part of the book is devoted to the classic theory of stochastic processes. In turn, it also presents proofs of well-known results, sometimes together with new approaches. Moreover, the book explores topics not previously covered elsewhere, such as distributions of functionals of diffusions stopped at different random times, the Brownian local time, diffusions with jumps, and an invariance principle for random walks and local times. Supported by carefully selected material, the book showcases a wealth of examples that demonstrate how to solve concrete problems by applying theoretical results. It addresses a broad range of applications, focusing on concrete computational techniques rather than on abstract theory. The content presented here is largely self-contained, making it suitable for researchers and graduate students alike.

The purpose of this chapter is to contribute to the knowledge on how production offshoring and international operations management vary across cultural contexts. The chapter attempts to shed light on how companies approach the process of offshoring in different cultural contexts. In order ... of globalisation. Yet there are clear differences in how offshoring is conducted in Denmark and Japan. The main differences are outlined in a framework and explained employing cultural variables. The findings lead to a number of propositions suggesting that the process of offshoring is not simply a uniform technical-rational calculation of the most efficient organisation of activities across national borders, but is rather specific to the parent companies' national contexts.

Photobiomodulation (PBM) is a modulation of biosystems by laser irradiation or monochromatic light (LI). There is little research on PBM dynamics, although its phenomena and mechanism have been widely studied. In this paper, PBM is discussed from a dynamic viewpoint. It was found that the primary process of cellular PBM might be the key process of cellular PBM, so that the transition rate of cellular molecules can be extended to discuss the dose relationship of PBM. There may be a dose zone in which low-intensity LI (LIL) at different doses has similar biological effects, so that a biological information model of PBM might hold. LIL may self-adaptively modulate a chronic stress until it becomes successful.

The main features of multiphoton processes are described on a somewhat elementary basis. The emphasis is put on multiphoton ionization of atoms, where the influence of resonance effects is illustrated through typical examples. The coherence of light is shown to have a very dramatic influence on multiphoton absorption. Different observations concerning molecules, electrons, and solid surfaces illustrate the generality of these highly nonlinear interactions between light and matter.

If solar process heat is to find a market, then the decision makers in industrial companies need to be aware that it actually exists. This was one of the main goals of the So-Pro project, which officially drew to a close in April 2012. (orig.)

The VDE system developed had the capability of recognizing up to 248 separate words in syntactic structures. The two systems described are isolated... Contents include: ...AND SPEAKER RECOGNITION by M.J. Hunt; ASSESSMENT OF SPEECH SYSTEMS by R.K. Moore; A SURVEY OF CURRENT EQUIPMENT AND RESEARCH by J.S. Bridle; VOICE TECHNOLOGY IN NAVY TRAINING SYSTEMS by R. Breaux, M. Blind and R. Lynchard; GENERAL REVIEW OF MILITARY APPLICATIONS OF VOICE PROCESSING by Dr. Bruno.

A gas suitable for use in containers or motor-vehicles, etc., and consisting mainly of methane, is obtained by distilling at a temperature not exceeding 500/sup 0/C bastard cannel coal, lignite, wood, peat, shale, etc., in a horizontal or vertical retort, through which the material is continuously fed in a thin layer or column by means of a screw conveyor or the like. Cracking or dissociation of the gaseous products is prevented by introducing into the retort part of the gas which is the result of the process and which is compressed to a pressure of at least 15 atmospheres and allowed to expand into the retort to cool and carry away the gaseous products produced. These are then passed through condensers for extracting liquid hydrocarbons, and other hydrocarbons are extracted by passage through washing-oils. The gas is then compressed by a water-cooled pump to a pressure of 15 atmospheres, whereby a spirit similar to petrol is formed, and a stable gas is left which is mainly methane, part of the gas being used to carry out the process described above.

A liquid phase process is described for oligomerization of C[sub 4] and C[sub 5] isoolefins or the etherification thereof with C[sub 1] to C[sub 6] alcohols wherein the reactants are contacted in a reactor with a fixed bed acid cation exchange resin catalyst at an LHSV of 5 to 20, pressure of 0 to 400 psig and temperature of 120 to 300 F wherein the improvement is the operation of the reactor at a pressure to maintain the reaction mixture at its boiling point whereby at least a portion but less than all of the reaction mixture is vaporized. By operating at the boiling point and allowing a portion of the reaction mixture to vaporize, the exothermic heat of reaction is dissipated by the formation of more boil up and the temperature in the reactor is controlled. 2 figs.

The authors used geophysical, geochemical, and numerical modeling to study selected problems related to Earth's lithosphere. We interpreted seismic waves to better characterize the thickness and properties of the crust and lithosphere. In the southwestern US and the Tien Shan, crust of high elevation is dynamically supported above buoyant mantle. In California, mineral fabrics in the mantle correlate with regional strain history. Although plumes of buoyant mantle may explain surface deformation and magmatism, our geochemical work does not support this mechanism for Iberia. Generation and ascent of magmas remains puzzling. Our work in Hawaii constrains the residence time of magma beneath Hualalai to be a few hundred to about 1000 years. In the crust, heat drives fluid and mass transport. Numerical modeling yielded robust and accurate predictions of these processes. This work is important fundamental science, and applies to mitigation of volcanic and earthquake hazards, Test Ban Treaties, nuclear waste storage, environmental remediation, and hydrothermal energy.

In the downward distillation of coal, shale, lignite, or the like, the heat is generated by the combustion of liquid or gaseous fuel above the charge, the zone of carbonization thus initiated travelling downwards through the charge. The combustible gases employed are preferably those resulting from the process, but gases such as natural gas may be employed. The charge is in a moistened and pervious state, the lower parts being maintained at a temperature not above 212/sup 0/F until influenced by contact with the carbonization zone, and steam may be admitted to increase the yield of ammonia. The combustible gases may be supplied with insufficient air so as to impart to them a reducing effect.

A method of joining metal parts for the preparation of relatively long, thin fuel element cores of uranium or alloys thereof for nuclear reactors is described. The process includes the steps of cleaning the surfaces to be joined, placing the surfaces together, and providing between and in contact with them a layer of a compound in finely divided form that is decomposable to metal by heat. The fuel element members are then heated at the contact zone and maintained under pressure during the heating to decompose the compound to metal and sinter the members and reduced metal together, producing a weld. The preferred class of decomposable compounds is the metal hydrides, such as uranium hydride, which release hydrogen, thus providing a reducing atmosphere in the vicinity of the welding operation.

This Article argues that the practice of holding so many adjudicative proceedings related to disability in private settings (e.g., guardianship, special education due process, civil commitment, and social security) relative to our strong normative presumption of public access to adjudication may cultivate and perpetuate stigma in contravention of the goals of inclusion and enhanced agency set forth in antidiscrimination laws. Descriptively, the law has a complicated history with disability--initially rendering disability invisible; later, underwriting particular narratives of disability synonymous with incapacity; and, in recent history, promoting the full socio-economic visibility of people with disabilities. The Americans with Disabilities Act (ADA), the marquee civil rights legislation for people with disabilities (about to enter its twenty-fifth year), expresses a national approach to disability that recognizes the role of society in its construction, maintenance, and potential remedy. However, the ADA’s mission is incomplete. It has not generated the types of interactions between people with disabilities and nondisabled people empirically shown to deconstruct deeply entrenched social stigma. Prescriptively, procedural design can act as an "antistigma agent" to resist and mitigate disability stigma. This Article focuses on one element of institutional design--public access to adjudication--as a potential tool to construct and disseminate counter-narratives of disability. The unique substantive focus in disability adjudication on questions of agency provides a potential public space for the negotiation of nuanced definitions of disability and capacity more reflective of the human condition.

...increasing competitiveness, which came to a head as an embroiled dispute resulting from differences in scientific and science-policy views. In the process a battle was fought over research resources, so that what was at first an apparently personal quarrel affected the course of research promotion at an institutional level in the area of life sciences in the GDR. Despite several attempts at mediation, old age finally forced the adversaries to put aside their differences.

A new area of biology has been opened up by nanoscale exploration of the living world. This has been made possible by technological progress, which has provided the tools needed to make devices that can measure things on such length and time scales. In a sense, this is a new window upon the living world, so rich and so diverse. Many of the investigative methods described in this book seek to obtain complementary physical, chemical, and biological data to understand the way it works and the way it is organised. At these length and time scales, only dedicated instrumentation could apprehend the relevant phenomena. There is no way for our senses to observe these things directly. One important field of application is molecular medicine, which aims to explain the mechanisms of life and disease by the presence and quantification of specific molecular entities. This involves combining information about genes, proteins, cells, and organs. This in turn requires the association of instruments for molecular diagnosis, either in vitro, e.g., the microarray or the lab-on-a-chip, or in vivo, e.g., probes for molecular biopsy, and tools for molecular imaging, used to localise molecular information in living organisms in a non-invasive way. These considerations concern both preclinical research for drug design and human medical applications. With the development of DNA and RNA chips [1], genomics has revolutionised investigative methods for cells and cell processes [2,3]. By sequencing the human genome, new ways have been found for understanding the fundamental mechanisms of life [4]. A revolution is currently under way with the analysis of the proteome [5-8], i.e., the complete set of proteins that can be found in some given biological medium, such as the blood plasma. The goal is to characterise certain diseases by recognisable signatures in the proteomic profile, as determined from a blood sample or a biopsy, for example [9-13]. What is at stake is the early detection of

What is Hydrothermal Circulation? Hydrothermal circulation occurs when seawater percolates downward through fractured ocean crust along the volcanic mid-ocean ridge (MOR) system. The seawater is first heated and then undergoes chemical modification through reaction with the host rock as it continues downward, reaching maximum temperatures that can exceed 400 °C. At these temperatures the fluids become extremely buoyant and rise rapidly back to the seafloor where they are expelled into the overlying water column. Seafloor hydrothermal circulation plays a significant role in the cycling of energy and mass between the solid earth and the oceans; the first identification of submarine hydrothermal venting and the accompanying chemosynthetically based communities in the late 1970s remains one of the most exciting discoveries in modern science. The existence of some form of hydrothermal circulation had been predicted almost as soon as the significance of ridges themselves was first recognized, with the emergence of plate tectonic theory. Magma wells up from the Earth's interior along "spreading centers" or "MORs" to produce fresh ocean crust at a rate of ˜20 km3 yr-1, forming new seafloor at a rate of ˜3.3 km2 yr-1 (Parsons, 1981; White et al., 1992). The young oceanic lithosphere formed in this way cools as it moves away from the ridge crest. Although much of this cooling occurs by upward conduction of heat through the lithosphere, early heat-flow studies quickly established that a significant proportion of the total heat flux must also occur via some additional convective process (Figure 1), i.e., through circulation of cold seawater within the upper ocean crust (Anderson and Silbeck, 1981). Figure 1. Oceanic heat flow versus age of ocean crust. Data from the Pacific, Atlantic, and Indian oceans, averaged over 2 Ma intervals (circles) depart from the theoretical cooling curve (solid line), indicating convective cooling of young ocean crust by circulating seawater.

Typically represented in event logs, business process data describe the execution of process events over time. Business process intelligence (BPI) techniques such as process mining can be applied to get strategic insight into business processes. Process discovery, conformance checking and

Biological wastewater treatment is not an effective treatment method if raw wastewater contains toxic and refractory organics. Advanced oxidation processes are applied before or after biological treatment for the detoxification and reclamation of such wastewaters. Advanced oxidation processes are based on the formation of powerful hydroxyl radicals. Among them, the Fenton process is one of the most promising methods, because its application is simple and cost-effective and the reaction occurs in a short time period. The Fenton process is applied for many different purposes. In this study, the Fenton process was evaluated as an advanced oxidation process in wastewater treatment.

In many industries, numerous high-precision parts are produced from hard-to-machine, scarce materials. Such parts can be formed only by non-contact processing, or with minimal mechanical effort, as is possible, for example, with electrochemical machining. At the present stage of development of metal-working processes, the management of electrochemical machining and its automation are important issues. This article presents some indicators and factors of the electrochemical machining process.

A technique for distributed packet processing includes sequentially passing packets associated with packet flows between a plurality of processing engines along a flow through data bus linking the plurality of processing engines in series. At least one packet within a given packet flow is marked by a given processing engine to signify by the given processing engine to the other processing engines that the given processing engine has claimed the given packet flow for processing. A processing function is applied to each of the packet flows within the processing engines and the processed packets are output on a time-shared, arbitered data bus coupled to the plurality of processing engines.
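The claim-by-marking scheme described above can be sketched in a few lines. This is a minimal illustration, not the patented design: the capacity-based claiming rule, the class names, and the plain list standing in for the arbitered output bus are all assumptions made for the sketch.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Packet:
    flow_id: str
    payload: str
    mark: Optional[int] = None   # set by the engine that claims the flow

class Engine:
    def __init__(self, engine_id, capacity):
        self.engine_id = engine_id
        self.capacity = capacity   # flows this engine will take (illustrative rule)
        self.mine = set()          # flows claimed by this engine
        self.taken = set()         # flows observed to be claimed by another engine

    def handle(self, pkt, output_bus):
        if pkt.mark is not None and pkt.mark != self.engine_id:
            self.taken.add(pkt.flow_id)        # a marked packet signals the claim
        if pkt.flow_id in self.mine:
            output_bus.append((self.engine_id, pkt.payload))  # process own flow
        elif (pkt.mark is None and pkt.flow_id not in self.taken
              and len(self.mine) < self.capacity):
            pkt.mark = self.engine_id          # claim the flow by marking this packet
            self.mine.add(pkt.flow_id)
            output_bus.append((self.engine_id, pkt.payload))
        # otherwise the packet passes through unchanged

def run(engines, packets):
    out = []                                   # stands in for the arbitered data bus
    for pkt in packets:                        # flow-through bus: engines in series
        for eng in engines:
            eng.handle(pkt, out)
    return out
```

With two engines of capacity one and packets from flows A and B interleaved, engine 1 claims flow A and engine 2 claims flow B, so every packet is processed by exactly one engine.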

Process view technology is catching more attentions in modern business process management, as it enables the customisation of business process representation. This capability helps improve the privacy protection, authority control, flexible display, etc., in business process modelling. One of

This book covers laser processing: laser principles, laser history, laser beam properties, types of lasers, foundations of laser processing such as laser oscillation, characteristics of laser processing, lasers for processing and their characteristics, laser hole drilling (including the concept of laser hole drilling for each material and hole drilling of metal materials), laser cutting and its practice, laser welding, laser surface hardening, application cases of special processing, and laser safety measures.

In several standards, guidelines and publications, organic food processing is strongly associated with "minimal processing" and "careful processing". The term "minimal processing" is nowadays often used in the general food processing industry and described in literature. The term "careful processing" is used more specifically within organic food processing but is not yet clearly defined. The concept of carefulness seems to fit very well with the processing of organic foods, especially if it i...

In this paper we describe methods for randomly thinning certain classes of spatial point processes. In the case of a Markov point process, the proposed method involves a dependent thinning of a spatial birth-and-death process, where clans of ancestors associated with the original points are identified, and where we simulate backwards and forwards in order to obtain the thinned process. In the case of a Cox process, a simple independent thinning technique is proposed. In both cases, the thinning results in a Poisson process if and only if the true Papangelou conditional intensity is used, and, thus, can be used as a graphical exploratory tool for inspecting the goodness-of-fit of a spatial point process model. Several examples, including clustered and inhibitive point processes, are considered.
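The independent-thinning case can be illustrated with a minimal sketch (the interval, rate, and retention function below are assumptions for illustration, not taken from the paper): each point of a homogeneous Poisson process is kept independently with a location-dependent probability, which again yields a Poisson process, now inhomogeneous.

```python
import math
import random

def poisson_count(mean, rng):
    """Sample a Poisson-distributed count (Knuth's multiplication method)."""
    limit, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def homogeneous_poisson(rate, length, rng):
    """Points of a homogeneous Poisson process on [0, length]."""
    n = poisson_count(rate * length, rng)
    return sorted(rng.uniform(0, length) for _ in range(n))

def independent_thinning(points, retain_prob, rng):
    """Keep each point x independently with probability retain_prob(x).

    Thinning a Poisson process of rate r this way gives an inhomogeneous
    Poisson process with intensity r * retain_prob(x)."""
    return [x for x in points if rng.random() < retain_prob(x)]

rng = random.Random(42)
pts = homogeneous_poisson(rate=20.0, length=10.0, rng=rng)
thinned = independent_thinning(pts, lambda x: x / 10.0, rng)
```

Here the retention probability grows linearly across the interval, so the thinned pattern becomes denser towards the right end.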

Existing solutions for the analysis and optimization of manufacturing processes, such as online analytical processing or statistical calculations, have shortcomings that limit continuous process improvement. In particular, they lack means of storing and integrating the results of analysis. As a consequence, valuable information that could be used for process optimization is used only once and then discarded. The goal of the Advanced Manufacturing Analytics (AdMA) research project is to design an integrate...

As organizations increasingly work in process-oriented manner, the number of business process models that they develop and have to maintain increases. As a consequence, it has become common for organizations to have collections of hundreds or even thousands of business process models. When a

We extend the boson process first to a large class of Cox processes and second to an even larger class of infinitely divisible point processes. Density and moment results are studied in detail. These results are obtained in closed form as weighted permanents, so the extension is called a permanental process. Temporal extensions and a particularly tractable case of the permanental process are also studied. Extensions of the fermion process along similar lines, leading to so-called determinantal processes, are discussed.
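Since the densities here are expressed as (weighted) permanents, a small sketch of the combinatorial object involved may help. The following computes a matrix permanent via Ryser's inclusion-exclusion formula; it illustrates the permanent itself, not the paper's estimators.

```python
from itertools import combinations

def permanent(A):
    """Matrix permanent via Ryser's formula: O(2^n * n^2) instead of O(n * n!).

    perm(A) = (-1)^n * sum over nonempty column subsets S of
              (-1)^|S| * product over rows i of sum_{j in S} A[i][j]."""
    n = len(A)
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = 1.0
            for row in A:
                prod *= sum(row[j] for j in cols)
            total += (-1) ** k * prod
    return (-1) ** n * total

# For [[1, 2], [3, 4]] the permanent is 1*4 + 2*3 = 10 (the determinant is -2):
# same expansion as the determinant, but with all signs positive.
```

Unlike the determinant, the permanent has no known polynomial-time algorithm, which is why closed-form permanent expressions such as those above are notable.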

There are many explosive processes in nucleosynthesis: big bang nucleosynthesis, the rp-process, the γ-process, the ν-process, and the r-process. However, I will discuss just the rp-process and the r-process in detail, primarily because both seem to have been very active research areas of late, and because they have great potential for studies with radioactive nuclear beams. I will also discuss briefly the γ-process because of its inevitability in conjunction with the rp-process. (orig.)

In recent years, process intensification (PI) has attracted considerable academic interest as a potential means for process improvement, to meet the increasing demands for sustainable production. A variety of intensified operations developed in academia and industry creates a large number of options to potentially improve the process, but to identify the set of feasible solutions for PI in which the optimal can be found takes considerable resources. Hence, a process synthesis tool to achieve PI would potentially assist in the generation and evaluation of PI options. Currently, several process design tools with a clear focus on specific PI tasks exist. Therefore, in this paper, the concept of a general systematic framework for synthesis and design of PI options in hierarchical steps, through analyzing an existing process, generating PI options in a superstructure, and evaluating intensified...

Business processes in dentistry are quickly evolving towards "digital dentistry". This means that many steps in the dental process will increasingly deal with computerized information or computerized half products. A complicating factor in the improvement of process performance in dentistry,

This research takes a step forward in real-time machine vision processing. It investigates techniques for implementing a real-time stereovision processing system using two miniature color cameras...

One of the important application of service composition techniques lies in the field of business process management. Essentially a business process can be considered as a composition of services, which is usually prepared by domain experts, and many tasks still have to be performed manually. These

A licensed pharmaceutical process is required to be executed within the validated ranges throughout the lifetime of product manufacturing. Changes to the process usually require the manufacturer to demonstrate that the safety and efficacy of the product remains unchanged. Recent changes in the

In response to decreasing funding levels available to support activities at the Idaho Chemical Processing Plant (ICPP) and a desire to be cost competitive, the Department of Energy Idaho Operations Office (DOE-ID) and Lockheed Idaho Technologies Company have increased their emphasis on cost-saving measures. The ICPP Effectiveness Improvement Initiative involves many activities to improve cost effectiveness and competitiveness. This report documents the methodology and results of one of those cost-cutting measures, the Process Efficiency Improvement Activity. The Process Efficiency Improvement Activity performed a systematic review of major work processes at the ICPP to increase productivity and to identify nonvalue-added requirements. A two-phase approach was selected for the activity to allow for near-term implementation of relatively easy process modifications in the first phase while obtaining long-term continuous improvement in the second phase and beyond. Phase I of the initiative included a concentrated review of processes that had a high potential for cost savings with the intent of realizing savings in Fiscal Year 1996 (FY-96). Phase II consists of implementing long-term strategies too complex for Phase I implementation and evaluation of processes not targeted for Phase I review. The Phase II effort is targeted for realizing cost savings in FY-97 and beyond.

To sustain gains from a process improvement initiative, healthcare organizations should: Explain to staff why a process improvement initiative is needed. Encourage leaders within the organization to champion the process improvement, and tie their evaluations to its outcomes. Ensure that both leaders and employees have the skills to help sustain the sought-after process improvements.

In this paper, we propose a stochastic process W_H(t), H ∈ (1/2, 1), which we call a fractional Poisson process. The process W_H(t) is self-similar in the wide sense, displays long-range dependence, and has a fatter tail than a Gaussian process. In addition, it converges to fractional Brownian motion in distribution.

The topic of process mining has attracted the attention of both researchers and tool vendors in the Business Process Management (BPM) space. The goal of process mining is to discover process models from event logs, i.e., events logged by some information system are used to extract information about
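A minimal flavor of this discovery step, on an assumed toy event log, is counting the directly-follows relation: how often one activity is immediately followed by another within a case. This relation is the raw material from which many discovery algorithms (e.g., the Alpha algorithm) build a process model; the activity names below are hypothetical.

```python
from collections import Counter

def directly_follows(log):
    """Count how often activity a is directly followed by activity b.

    `log` is a list of traces; each trace is the ordered sequence of
    activity names recorded for one case in the event log."""
    dfg = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

# Hypothetical event log with three cases:
log = [
    ["register", "check", "decide", "pay"],
    ["register", "check", "decide", "reject"],
    ["register", "decide", "pay"],
]
dfg = directly_follows(log)
```

The resulting counts form a directly-follows graph, which discovery algorithms then abstract into a model that balances fitness, precision, and simplicity.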

Process modeling is a central element in any approach to Business Process Management (BPM). However, what hinders both practitioners and academics is the lack of support for assessing the quality of process models, let alone realizing high-quality process models. Existing frameworks are

The presentation, on which 17 slides/overheads are included in the papers, explained the principles of the Shell coal gasification process and the methods incorporated for control of sulfur dioxide, nitrogen oxides, particulates and mercury. The economics of the process were discussed. The differences between gasification and burning, and the differences between the Shell process and other processes were discussed.

In process mining, one of the main challenges is to discover a process model, while balancing several quality criteria. This often requires repeatedly setting parameters, discovering a map and evaluating it, which we refer to as process exploration. Commercial process mining tools like Disco,

Process mining aims at discovering process models from data logs in order to offer insight into the real use of information systems. Most of the existing process mining algorithms fail to discover complex constructs or have problems dealing with noise and infrequent behavior. The genetic process

Accurate radiation dosimetry can provide quality assurance in radiation processing. Considerable relevant experience in dosimetry at the SSDL-MINT has necessitated the development of methods making measurements at the gamma plant traceable to the national standard. This involves the establishment of a proper calibration procedure and the selection of an appropriate transfer system/technique to assure adequate traceability to a primary radiation standard. The effort forms the basis for irradiation process control, the legal approval of the process by the public health authorities (medical product sterilization and food preservation), and the safety and acceptance of the product.

Remarkable advances have been made in recent years in the science and technology of thin film processes for deposition and etching. It is the purpose of this book to bring together tutorial reviews of selected film deposition and etching processes from a process viewpoint. Emphasis is placed on the practical use of the processes to provide working guidelines for their implementation, a guide to the literature, and an overview of each process.

The Campbell process is a stationary random process which can have various correlation functions, according to the choice of an elementary response function. The statistical properties of this process are presented. A numerical algorithm and a subroutine for generating such a process are built up and tested for the physically interesting case of a Campbell process with Gaussian correlations. The (non-Gaussian) probability distribution appears to be similar to the Gamma distribution.
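A minimal sketch of such a generator is shown below (the response function, rate, and time grid are illustrative assumptions, not the paper's subroutine): a Campbell, or shot-noise, process superposes a response h(t - t_i) at each point t_i of a Poisson process, and a Gaussian response gives the Gaussian-correlation case discussed.

```python
import math
import random

def campbell_path(rate, response, horizon, t_grid, rng):
    """Sample path of a Campbell (shot-noise) process X(t) = sum_i h(t - t_i),
    where the shot times t_i form a Poisson process of the given rate."""
    # Draw the Poisson number of shots on [0, horizon] (Knuth's method)...
    limit, n, p = math.exp(-rate * horizon), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            break
        n += 1
    # ...then place them uniformly and superpose the responses.
    shots = [rng.uniform(0, horizon) for _ in range(n)]
    return [sum(response(t - ti) for ti in shots) for t in t_grid]

# Gaussian response: the Campbell process with Gaussian correlations.
tau = 0.5
h = lambda t: math.exp(-t * t / (2.0 * tau * tau))
rng = random.Random(7)
path = campbell_path(rate=5.0, response=h, horizon=10.0,
                     t_grid=[0.1 * i for i in range(100)], rng=rng)
# By Campbell's theorem, the expected level is rate * integral of h
# = rate * tau * sqrt(2*pi), here about 6.27.
```

Varying the response function h changes the correlation function of the process, which is exactly the flexibility the record describes.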

This book presents a framework through transformation and explains how business goals can be translated into realistic plans that are tangible and yield real results in terms of the top line and the bottom line. Process Transformation is like a tangram puzzle, which has multiple solutions yet is essentially composed of seven 'tans' that hold it together. Based on practical experience and intensive research into existing material, 'Process Tangram' is a simple yet powerful framework that proposes Process Transformation as a program. The seven 'tans' are: the transformation program itself, triggers, goals, tools and techniques, culture, communication and success factors. With its segregation into tans and division into core elements, this framework makes it possible to use 'pick and choose' to quickly and easily map an organization's specific requirements. Change management and process modeling are covered in detail. In addition, the book approaches managed services as a model of service delivery, which it ex...

Various dry processes have been studied and more or less developed, particularly in order to reduce waste quantities, but none of them has replaced the PUREX process, for reasons ranging from policy errors and inappropriate demonstration examples to late development, although realistic and efficient dry processes, such as processes based on the selective volatility of fluorides, have been demonstrated in France (CLOVIS, ATILA) and would be ten times cheaper than the PUREX process. Dry processes could regain interest in case of a nuclear revival (following global warming fears) or thermal waste over-production. In the near future, dry processes could be introduced as a complement to the PUREX process, especially at the end of the process cycle, for more efficient recycling and safer storage (inactivation).

This article considers the relevance of honing processes for creating high-quality mechanical engineering products. The features of the honing process are revealed and such important concepts as the task for optimization of honing operations, the optimal structure of the honing working cycles, stepped and stepless honing cycles, simulation of processing and its purpose are emphasized. It is noted that the reliability of the mathematical model determines the quality parameters of the honing process control. An algorithm for continuous control of the honing process is proposed. The process model reliably describes the machining of a workpiece in a sufficiently wide area and can be used to operate the CNC machine CC743.

Sodium tetraphenylborate and sodium titanate are used to assist in the concentration of soluble radionuclides in the Savannah River Plant's high-level waste. In the Defense Waste Processing Facility, concentrated tetraphenylborate/sodium titanate slurry containing cesium-137, strontium-90 and traces of plutonium from the waste tank farm is hydrolyzed in the Salt Processing Cell, forming organic and aqueous phases. The two phases are then separated and the organic phase is decontaminated for incineration outside the DWPF building. The aqueous phase, containing the radionuclides and less than 10% of the original organic, is blended with the insoluble radionuclides in the high-level waste sludge and is fed to the glass melter for vitrification into borosilicate glass. During the Savannah River Laboratory's development of this process, copper (II) was found to act as a catalyst during the hydrolysis reactions, which improved the organic removal and simplified the design of the reactor.

Described is an activity which demonstrates an organic-based reprographic method that is used extensively for the duplication of microfilm and engineering drawings. Discussed are the chemistry of the process and how to demonstrate the process for students. (CW)

Waste processing, and preparing waste to support waste processing, rely heavily on ventilation. Ventilation is used at the Hanford Site on the waste storage tanks to provide confinement, cooling, and removal of flammable gases.

This text moves from the basics of laser physics to detailed treatments of all major materials processing techniques for which lasers are now essential. New chapters cover laser physics, drilling, micro- and nanomanufacturing and biomedical laser processing.

Office of Personnel Management — Inventory of maps and descriptions of the business processes of the U.S. Office of Personnel Management (OPM), with an emphasis on the processes of the Office of the...

The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information; it utilizes a personal digital assistant (PDA). The data display module is in communication with the database server and includes a website for viewing collected process data in a desired metrics form; it also provides for editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module minimizes the requirement for manual input of the collected process data.
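The module wiring described in this record can be illustrated with a minimal Python sketch; every class and field name below is an illustrative assumption, not taken from the patent. The point is only that the three modules communicate solely through the shared database server, so collected data never needs manual re-entry.

```python
class DatabaseServer:
    """Shared state that all three modules read and write."""
    def __init__(self):
        self.criteria = {}       # observation criteria, keyed by study
        self.process_data = []   # collected process observations

class AdministrationModule:
    def __init__(self, db):
        self.db = db
    def set_criteria(self, study, criteria):
        self.db.criteria[study] = criteria   # push criteria to the server

class ProcessEvaluationModule:
    """Runs on a handheld (e.g. a PDA) at the point of observation."""
    def __init__(self, db):
        self.db = db
    def collect(self, study, observation):
        wanted = self.db.criteria[study]     # pull criteria from the server
        record = {k: observation[k] for k in wanted}
        self.db.process_data.append({"study": study, **record})

class DataDisplayModule:
    def __init__(self, db):
        self.db = db
    def metrics(self, study, field):
        vals = [r[field] for r in self.db.process_data if r["study"] == study]
        return sum(vals) / len(vals)         # a trivial "desired metric"

db = DatabaseServer()
AdministrationModule(db).set_criteria("line-1", ["cycle_time"])
pem = ProcessEvaluationModule(db)
pem.collect("line-1", {"cycle_time": 40, "operator": "A"})
pem.collect("line-1", {"cycle_time": 50, "operator": "B"})
print(DataDisplayModule(db).metrics("line-1", "cycle_time"))  # → 45.0
```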

Introducing the notion of Group Decision Process Support Systems (GDPSS) to traditional decision-support theorists.

Automatic cognitive processing helps us navigate the world. However, if the emotional and cognitive interplay becomes skewed, those cognitive processes can become maladaptive and result in psychopathology. Although biases are present in most mental disorders, different disorders are characterized by

logistics processes in hospitals and aims to provide theoretically and empirically based evidence for improving these processes, both to expand the knowledge base of healthcare logistics and to provide a decision tool for hospital logistics managers to improve their processes. Case studies were conducted … processes. Furthermore, a method for benchmarking healthcare logistics processes was developed. Finally, a theoretically and empirically founded framework was developed to support managers in making an informed decision on how to improve healthcare logistics processes. This study contributes to the limited literature concerned with the improvement of logistics processes in hospitals. Furthermore, the developed framework provides guidance for logistics managers in hospitals on how to improve their processes given the circumstances in which they operate.

The flavor changing lepton processes, or in other words the lepton flavor changing processes, are described with emphasis on the updated theoretical motivations and the ongoing experimental progress on a new high-intensity muon source. (author)

to create a new methodology for developing and exploring process models and applications. The paper outlines the process innovation laboratory as a new approach to BPI. The process innovation laboratory is a comprehensive framework and a collaborative workspace for experimenting with process models. The process innovation laboratory facilitates innovation by using an integrated action learning approach to process modelling in a controlled environment. The study is based on design science, and the paper also discusses the implications for EIS research and practice. … Most organizations today are required not only to operate effective business processes but also to allow for changing business conditions at an increasing rate. Today nearly every business relies on its enterprise information systems (EIS) for process integration, and future generations of EIS…

This technical note summarizes hydrologic and hydraulic (H&H) processes and the related terminology that will likely be encountered during an evaluation of the effect of ground-water processes on wetland function...

Flue gas desulphurization processes are discussed. These processes can be grouped into non-regenerable and regenerable systems. The non-regenerable systems produce a product which is either disposed of as waste or sold as a by-product, e.g. the lime/limestone process. In the regenerable systems, e.g. the Wellman-Lord process, the SO2 is regenerated from the sorbent (sodium sulphite), which is returned to absorb more SO2. A newer technology for flue gas desulphurization is also discussed: the Ispra process uses bromine as oxidant, producing HBr, from which bromine is regenerated by electrolysis. The only by-products of this process are sulphuric acid and hydrogen, both valuable products, and no waste products are produced. Modifications to the process are suggested, based on experimental investigations, to improve its efficiency and reduce its costs.

The Plasma Hearth Process (PHP) is a high-temperature waste treatment process being developed by Science Applications International Corporation (SAIC) for the Department of Energy (DOE) that destroys hazardous organics while stabilizing radionuclides and hazardous metals in a vitreous slag waste form. The PHP has potential application for the treatment of a wide range of mixed waste types in both the low-level and transuranic (TRU) mixed waste categories. DOE, through the Office of Technology Development's Mixed Waste Integrated Program (MWIP), is conducting a three-phase development project to ready the PHP for implementation in the DOE complex.

A method is described for processing liquid radioactive wastes. It includes heating the liquid wastes so that the contained liquids are evaporated and a practically anhydrous mass of solid particles, smaller in volume than the wastes introduced, is formed; the solid particles are then transformed into a monolithic structure. This transformation includes compressing the particles and sintering or fusion. The solidifying agent is a mixture of polyethylene and paraffin wax or a styrene copolymer and a polyester resin. The device used for processing the radioactive liquid wastes is also described.

This book presents the grind-hardening process and the main studies published since it was introduced in the 1990s. The modelling of the various aspects of the process, such as the process forces, temperature profile developed, hardness profiles, residual stresses, etc., is described in detail. The book is of interest to the research community working with mathematical modelling and optimization of manufacturing processes.

This paper argues in favor of the development of explanatory theory on software process improvement. The last one or two decades' commitment to prescriptive approaches in software process improvement theory may contribute to the emergence of a gulf dividing theorists and practitioners. It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice, with a focus on the software process policymaking and process control aspects of improvement efforts.

"Modeling Multiphase Materials Processes: Gas-Liquid Systems" describes the methodology and application of physical and mathematical modeling to multi-phase flow phenomena in materials processing. The book focuses on systems involving gas-liquid interaction, the most prevalent in current metallurgical processes. The performance characteristics of these processes are largely dependent on transport phenomena. This volume covers the inherent characteristics that complicate the modeling of transport phenomena in such systems, including complex multiphase structure, intense turbulence, opacity of

During the past few years significant advances have taken place in the different areas of dosimetry for radiation processing, mainly stimulated by the increased interest in radiation for food preservation, plastic processing and sterilization of medical products. Reference services both … and sterilization dosimetry, optichromic dosimeters in the shape of small tubes for food processing, and ESR spectroscopy of alanine for reference dosimetry. In this paper the special features of radiation processing dosimetry are discussed, several commonly used dosimeters are reviewed, and factors leading…

Business process management plays an important role in the management of organizations. More and more organizations describe their operations as business processes. It is common for organizations to have collections of thousands of business processes, but for reasons of confidentiality these

The chapters of this volume represent the invited papers delivered at the conference. They are arranged according to thematic proximity, beginning with atoms and continuing with molecules and surfaces. Section headings include multiphoton processes in atoms, field fluctuations and collisions in multiphoton processes, and multiphoton processes in molecules and surfaces. Abstracts of individual items from the conference were prepared separately for the data base.

Mathematical modeling and simulation of semisolid filling processes remain a critical issue in understanding and optimizing the process. Semisolid slurries are non-Newtonian materials that exhibit complex rheological behavior; the way these slurries flow in cavities is therefore very different from the way liquid fills cavities in classical casting. Indeed, filling in semisolid processing is often counterintuitive.
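One standard way to capture this non-Newtonian behavior is a power-law (Ostwald-de-Waele) model, which is an illustrative choice here rather than the model used in the record. The sketch below, with made-up constants rather than measured slurry data, shows why a shear-thinning slurry (n < 1) fills a cavity differently from a Newtonian liquid (n = 1), whose viscosity is independent of shear rate.

```python
def apparent_viscosity(gamma_dot, K=100.0, n=0.3):
    """Ostwald-de-Waele power-law viscosity eta = K * gamma_dot**(n-1),
    in Pa.s; K and n are invented illustrative values, not measured
    semisolid-slurry constants."""
    return K * gamma_dot ** (n - 1.0)

# Apparent viscosity collapses as the local shear rate rises, so fast-
# sheared regions of the cavity flow far more easily than slow ones.
for gd in (0.1, 1.0, 10.0, 100.0):
    print(f"shear rate {gd:6.1f} 1/s -> viscosity "
          f"{apparent_viscosity(gd):8.1f} Pa.s")
```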

This month’s Processing column on the theme of “How Is It Processed?” focuses on yogurt. Yogurt is known for its health-promoting properties. This column will provide a brief overview of the history of yogurt and the current market. It will also unveil both traditional and modern yogurt processing t...

i.e. local guidelines. From a knowledge management point of view, this externalization of generalized processes gives the opportunity to learn from, evaluate and optimize the processes. "Clinical Process Intelligence" (CPI) will denote the goal of getting generalized insight into patient-centered health...

The following major processes involved in the production of crystalline-silicon solar cells were discussed: surface preparation, junction formation, metallization, and assembly. The status of each of these processes, and the sequence in which these processes are applied, were described as they were in 1975, as they were in 1985, and what they might be in the future.

This article discusses the constitutional amendment process. Although the process is not described in great detail, Article V of the United States Constitution allows for and provides instruction on amending the Constitution. While the amendment process currently consists of six steps, the Constitution is nevertheless quite difficult to change.…

The design course is an integral part of chemical engineering education. A novel approach to the design course was recently introduced at the University of the Witwatersrand, Johannesburg, South Africa. The course aimed to introduce students to systematic tools and techniques for setting and evaluating performance targets for processes, as well as…

The 1996 Summer Faculty Fellowship Program at Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized statistical process control (SPC) course for the Safety and Mission Assurance Trends Analysis Group; the actual teaching of this course took place over several weeks. In addition, an Internet version of the same course, complete with animation and video excerpts from the course as taught at KSC, was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed the evaluation of SPC use at KSC, both present and potential, in light of the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was provided, as well as an evaluation of SPC software for future KSC use. Finally, the author's orientation to NASA changes, terminology, data formats, and new NASA task definitions will allow future consultation as needs arise.
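A basic calculation an introductory SPC course of this kind typically covers is the individuals (XmR) control chart. The sketch below uses hypothetical processing durations, not KSC data:

```python
import statistics

def xmr_limits(samples):
    """Control limits for an individuals (XmR) chart, a basic SPC tool:
    centre line at the mean, limits at mean +/- 2.66 * average moving
    range (2.66 = 3 / d2, with d2 = 1.128 for subgroups of size 2)."""
    mr = [abs(b - a) for a, b in zip(samples, samples[1:])]
    centre = statistics.fmean(samples)
    mr_bar = statistics.fmean(mr)
    return centre - 2.66 * mr_bar, centre, centre + 2.66 * mr_bar

# Hypothetical processing durations (hours) for successive flows.
durations = [82, 79, 85, 81, 78, 84, 80, 83]
lcl, cl, ucl = xmr_limits(durations)
print(round(lcl, 2), round(cl, 2), round(ucl, 2))
```

A point outside the computed limits would signal a special cause worth investigating rather than routine process variation.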

Understanding how quality improvement affects costs is important. Unfortunately, low-cost, reliable ways of measuring direct costs are scarce. This article builds on the principles of process improvement to develop a costing strategy that meets both criteria. Process-based costing has 4 steps: developing a flowchart, estimating resource use, valuing resources, and calculating direct costs. To illustrate the technique, this article uses it to cost the care planning process in 3 long-term care facilities. We conclude that process-based costing is easy to implement; generates reliable, valid data; and allows nursing managers to assess the costs of new or modified processes.
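The four costing steps can be walked through with a toy calculation; every step name, time, and wage rate below is invented for illustration, not taken from the study's facilities.

```python
# (1) flowchart: list the steps of the process;
# (2) estimate resource use: minutes of each role per step;
# (3) value resources: hourly wage rates;
# (4) direct cost = sum over steps of (minutes / 60) * hourly rate.

care_planning = [            # step, staff role, minutes per resident
    ("assessment",      "RN",  45),
    ("team conference", "RN",  30),
    ("team conference", "LPN", 30),
    ("write care plan", "RN",  25),
]
hourly_rate = {"RN": 36.0, "LPN": 24.0}   # hypothetical wage rates

direct_cost = sum(minutes / 60 * hourly_rate[role]
                  for _step, role, minutes in care_planning)
print(f"direct cost per resident: ${direct_cost:.2f}")  # → $72.00
```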

This book deals with colloidal systems in technical processes and the influence of technical processes on colloidal systems. It explores how new measurement capabilities can offer the potential for dynamic development in science and engineering, and examines the origin of colloidal systems and their use for new products. The future challenges in colloidal process engineering are the development of appropriate equipment and processes for the production of multi-phase structures and energetic interactions in market-relevant quantities. The book explores the relevant processes for controlled production and how they can be used across all scales.

A high-temperature process utilizing molten salt extraction from molten metal alloys has been developed for purification of spent power reactor fuels. Experiments with laboratory-scale processing operations show that purification and throughput parameters comparable to the Barnwell Purex process can be achieved by pyrochemical processing in equipment one-tenth the size, with all wastes being discharged as stable metal alloys at greatly reduced volume and disposal cost. This basic technology can be developed for large-scale processing of spent reactor fuels. 13 references, 4 figures

Data Processing: Made Simple, Second Edition presents discussions of a number of trends and developments in the world of commercial data processing. The book covers the rapid growth of micro- and mini-computers for both home and office use; word processing and the 'automated office'; the advent of distributed data processing; and the continued growth of database-oriented systems. The text also discusses modern digital computers; fundamental computer concepts; information and data processing requirements of commercial organizations; and the historical perspective of the computer industry. The

This sequel to the 1978 classic, Thin Film Processes, gives a clear, practical exposition of important thin film deposition and etching processes that have not yet been adequately reviewed. It discusses selected processes in tutorial overviews with implementation guidelines and an introduction to the literature. Though edited to stand alone, when taken together, Thin Film Processes II and its predecessor present a thorough grounding in modern thin film techniques. Key features: provides an all-new sequel to the 1978 classic, Thin Film Processes; introduces new topics, and sever

The design and operation of the two radioactive gas processing systems at the Fast Flux Test Facility (FFTF) exemplifies the concept that will be used in the first generation of Liquid Metal Fast Breeder Reactors (LMFBRs). The two systems, the Radioactive Argon Processing System (RAPS) and the Cell Atmosphere Processing System (CAPS), process the argon and nitrogen used in the FFTF for cover gas on liquid metal systems and as inert atmospheres in steel-lined cells housing sodium equipment. The RAPS specifically processes the argon cover gas from the reactor coolant system, providing for decontamination and eventual reuse. The CAPS processes radioactive gases from inerted cells and other liquid metal cover gas systems, providing for decontamination and ultimate discharge to the atmosphere. The cryogenic processing of waste gas by both systems is described.

We investigate the statistical properties of a special branching point process. The initial process is assumed to be a homogeneous Poisson point process (HPP). The initiating events at each branching stage are carried forward to the following stage. In addition, each initiating event independently contributes a nonstationary Poisson point process (whose rate is a specified function) located at that point. The additional contributions from all points of a given stage constitute a doubly stochastic Poisson point process (DSPP) whose rate is a filtered version of the initiating point process at that stage. The process studied is a generalization of a Poisson branching process in which random time delays are permitted in the generation of events. Particular attention is given to the limit in which the number of branching stages is infinite while the average number of added events per event of the previous stage is infinitesimal. In the special case when the branching is instantaneous this limit of continuous branching corresponds to the well-known Yule--Furry process with an initial Poisson population. The Poisson branching point process provides a useful description for many problems in various scientific disciplines, such as the behavior of electron multipliers, neutron chain reactions, and cosmic ray showers
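The staged construction described above can be sketched in simulation if we track only event counts and ignore event times (a simplification of the process studied, with illustrative parameter values): stage 0 is homogeneous Poisson, and at each later stage every carried-forward event independently spawns a Poisson number of additional events.

```python
import numpy as np

def branching_poisson_counts(rate, t_max, mu, n_stages, rng=None):
    """Count events in a branching Poisson process (times ignored).
    Stage 0 is homogeneous Poisson with the given rate on [0, t_max];
    at each subsequent stage every event is carried forward and
    independently contributes Poisson(mu) extra events, so the
    expected count after k stages is rate * t_max * (1 + mu)**k."""
    rng = np.random.default_rng(rng)
    n = rng.poisson(rate * t_max)            # initial HPP events
    for _ in range(n_stages):
        n += rng.poisson(mu, size=n).sum()   # additions of this stage
    return n

# Compare the simulated mean count with the theoretical value.
sim_mean = np.mean([branching_poisson_counts(2.0, 50.0, 0.1, 5, rng=s)
                    for s in range(200)])
print(sim_mean, 2.0 * 50.0 * 1.1 ** 5)
```

The continuous-branching limit in the abstract (many stages, infinitesimal mu per stage) is not simulated here; this sketch only illustrates the finite-stage counting structure.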

Plasma etching of UO2 using a fluorine-containing gas plasma is studied as a secondary fuel removal process for the DUPIC (Direct Use of PWR spent fuel Into Candu) process, which is under consideration as a potential future fuel cycle in Korea. A CF4/O2 gas mixture is chosen as the reactant gas and the etching rates of UO2 by the gas plasma are investigated as functions of CF4/O2 ratio, plasma power, substrate temperature, and plasma gas pressure. It is found that the optimum CF4/O2 ratio is around 4:1 at all temperatures up to 400 °C and that the etching rate increases with increasing r.f. power and substrate temperature. Under 150 W r.f. power the etching rate reaches 1100 monolayers/min at 400 °C, which is equivalent to about 0.5 µm/min. (author).
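The reported linear rate can be cross-checked with a back-of-the-envelope conversion. The monolayer thickness used below (about 0.45 nm for UO2) is an assumption for illustration, not a value given in the record.

```python
# Convert an etching rate in monolayers/min into a linear rate in um/min.
MONOLAYER_NM = 0.45              # assumed UO2 monolayer thickness, nm
rate_monolayers_per_min = 1100   # reported etching rate

rate_um_per_min = rate_monolayers_per_min * MONOLAYER_NM / 1000.0
print(f"{rate_um_per_min:.1f} um/min")
```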

The SILVA laser isotope separation process is based on the laser selective photo-ionization of uranium atomic vapour; the process is presently under development by CEA and COGEMA in France, with the aim of reducing the cost of uranium enrichment by a factor of three. The two main components of a SILVA process plant are the lasers (copper vapour lasers and dye lasers) and the separator for the vaporization (with a high-energy electron beam), ionization and separation operations. Research on the SILVA process started in 1985 and its technical and economical feasibility is to be demonstrated in 1997. The progress of rival and other processes is discussed, and the remaining research stages and themes of the SILVA program are presented.

Enterprise Systems Management (ESM) and Business Process Management (BPM), although highly correlated, have evolved as alternative and mutually exclusive approaches to corporate infrastructure. As a result, companies struggle to find the right balance between technology and process factors in infrastructure implementation projects. The purpose of this paper is to articulate a need and a direction to mediate between the process-driven and the technology-driven approaches. Using a cross-case analysis, we gain insight into two examples of systems and process implementation. We highlight the differences between them using strategic alignment, Enterprise Systems and Business Process Management theories. We argue that the insights from these cases can lead to a better alignment between process and technology. Implications for practice include the direction towards a closer integration of process...

The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.

Featuring contributions from prominent thinkers and researchers, this volume in the ""Advances in Management Information Systems"" series provides a rich set of conceptual, empirical, and introspective studies that epitomize fundamental knowledge in the area of Business Process Transformation. Processes are interpreted broadly to include operational and managerial processes within and between organizations, as well as those involved in knowledge generation. Transformation includes radical and incremental change, its conduct, management, and outcome. The editors and contributing authors pay clo

In this German seminar paper, written in 2011 at the University of Duisburg for a Bachelor colloquium in applied computer science, we give a brief overview of the Rational Unified Process (RUP). Interested students, and people generally interested in software development, thus gain a first impression of RUP. The paper includes a survey and overview of the underlying process structure, the phases of the process, and its workflows, and describes the always by the RUP developers pos...

In the textbooks, procedural due process is a strictly judicial enterprise; although substantive entitlements are created by legislative and executive action, it is for courts to decide independently what process the Constitution requires. The notion that procedural due process might be committed primarily to the discretion of the agencies themselves is almost entirely absent from the academic literature. The facts on the ground are very different. Thanks to converging strands of caselaw ...

Process Improvement Essentials combines the foundation needed to understand process improvement theory with the best practices to help individuals implement process improvement initiatives in their organization. The three leading programs: ISO 9001:2000, CMMI, and Six Sigma--amidst the buzz and hype--tend to get lumped together under a common label. This book delivers a combined guide to all three programs, compares their applicability, and then sets the foundation for further exploration.

This presentation describes the development of the proposed Process Flow Diagram (PFD) for the Tokamak Exhaust Processing System (TEP) of ITER. A brief review of design efforts leading up to the PFD is followed by a description of the hydrogen-like, air-like, and water-like processes. Two new design values are described: the most-common and the most-demanding design values. The proposed PFD is shown to meet specifications under both the most-common and the most-demanding design values.

This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

The paper systematizes several theoretical viewpoints on scientific information processing skill and decomposes the processing skill into sub-skills. Several methods, such as analysis, synthesis, induction, deduction and document analysis, were used to build up a theoretical framework. Interviews, a survey of professionals being trained, and a case study were carried out to evaluate the results. All professionals in the sample improved their performance in scientific information processing.

renovation to be overcome. The homeowners were better integrated and their preferences and immaterial values were better taken into account. To keep the decision-making process economically viable and timely, the process as known today still needs to be improved, and new tools need to be developed. This paper presents a new scheme: the integrated renovation process. One successful case study is introduced, and recommendations for future developments needed in the field are provided.

The Transuranium Processing Plant (TRU) is a remotely operated, hot-cell, chemical processing facility of advanced design. The heart of TRU is a battery of nine heavily shielded process cells housed in a two-story building. Each cell, with its 54-inch-thick walls of a special high-density concrete, has enough shielding to stop the neutrons and gamma radiation from 1 gram of 252Cf and associated fission products. Four cells contain chemical processing equipment, three contain equipment for the preparation and inspection of HFIR targets, and two cells are used for analytical chemistry operations. In addition, there are eight laboratories used for process development, for part of the process-control analyses, and for product finishing operations. Although the Transuranium Processing Plant was built for the purpose of recovering transuranium elements from targets irradiated in the High Flux Isotope Reactor (HFIR), it is also a highly versatile facility which has extensive provisions for changing and modifying equipment. Thus, it was a relatively simple matter to install a Solvent Extraction Test Facility (SETF) in one of the TRU chemical processing cells for use in the evaluation and demonstration of solvent extraction flowsheets for the recovery of fissile and fertile materials from irradiated reactor fuels. The equipment in the SETF has been designed for process development and demonstrations, and the particular type of mixer-settler contactors was chosen because it is easy to observe and sample.

The meaning and mathematical consequences of linearity (managing without a presumed ability to copy) are studied for a path-based model of processes which is also a model of affine-linear logic. This connection yields an affine-linear language for processes, automatically respecting open-map bisimulation, in which a range of process operations can be expressed. An operational semantics is provided for the tensor fragment of the language. Different ways to make assemblies of processes lead to different choices of exponential, some of which respect bisimulation.

Following a brief historical introduction to general refractometry, the first section deals with the limiting-angle refractometer and the second with the differential refractometer; process engineering information on this measuring method is also given. An extensive, practice-oriented description attempts to introduce planners and technicians to this physical measuring method in process engineering so that they can use it themselves if necessary. When properly applied, it can be a valuable aid to process control in compliance with process automation.
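The principle behind the limiting-angle (critical-angle) refractometer mentioned in the first section can be sketched in a few lines: at the prism/sample interface total internal reflection sets in at sin(theta_c) = n_sample / n_prism, so measuring the limiting angle gives the sample's refractive index directly. The prism index and measured angle below are illustrative assumptions.

```python
import math

def refractive_index(theta_c_deg, n_prism=1.72):
    """Recover the sample index from the measured limiting angle via
    n_sample = n_prism * sin(theta_c); n_prism = 1.72 stands in for a
    typical high-index measuring prism (an assumed value)."""
    return n_prism * math.sin(math.radians(theta_c_deg))

theta = 50.77                      # hypothetical measured limiting angle
print(round(refractive_index(theta), 3))
```

For the hypothetical angle above the recovered index comes out near 1.33, the order of magnitude expected for an aqueous process stream.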

Process Modeling Style focuses on aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

within the Danish tradition of architecture and construction. The objective of the research presented in this paper is to compare the different design processes behind the making of passive houses in a Danish context. We evaluated the process with regard to the integrated and traditional design process. Data analysis showed that the majority of the consortiums worked in an integrated manner, though there was room for improvement. Additionally, the paper discusses the challenges of implementing the integrated design process in practice and suggests ways of overcoming some of the barriers. In doing so...

The facility of the present invention comprises a radioactive liquid storage vessel, an exhaust gas dehumidifying device for dehumidifying gases exhausted from the vessel, and an exhaust gas processing device for reducing radioactive materials in the exhaust gases. A purified gas line is disposed to the radioactive liquid storage vessel for purging exhaust gases generated from the radioactive liquid; the dehumidified and condensed liquid is recovered, and exhaust gases are discharged through an exhaust gas pipe disposed downstream of the exhaust gas processing device. With such procedures, the scale of the exhaust gas processing facility can be reduced and exhaust gases can be processed efficiently. (T.M.)

The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

, categories and triggers involved in this process. By applying grounded theory to the analysis of the responses, a process-oriented player engagement framework was developed and four main components consisting of objectives, activities, accomplishments and affects as well as the corresponding categories...

This document represents the roadmap for Processing Technology Research in the US Mining Industry. It was developed based on the results of a Processing Technology Roadmap Workshop sponsored by the National Mining Association in conjunction with the US Department of Energy, Office of Energy Efficiency and Renewable Energy, Office of Industrial Technologies. The Workshop was held January 24 - 25, 2000.

This paper details the development and implementation of a "Process Control Program" at Duke Power's three nuclear stations - Oconee, McGuire, and Catawba. Each station is required by Technical Specification to have a "Process Control Program" (PCP) to control all dewatering and/or solidification activities for radioactive wastes

The value of the National Association for Developmental Education (NADE) accreditation process is far-reaching. Not only do students and programs benefit from the process, but also the entire institution. Through data collection of student performance, analysis, and resulting action plans, faculty and administrators can work cohesively towards…

ANAV decided to implement process-oriented management by adopting the U.S. NEI (Nuclear Energy Institute) model. The article describes the initial phases of the project, its current status and future prospects. The project has been considered an improvement in the areas of organization and human factors. Recently, IAEA standard drafts have been including processes as an accepted management model. (Author)

Furusawa, Akira [Department of Applied Physics, School of Engineering, The University of Tokyo (Japan)

2014-12-04

I will briefly explain the definition and advantages of hybrid quantum information processing, which is the hybridization of qubit and continuous-variable technologies. The final goal would be the realization of universal gate sets both for qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

Separation processes on an industrial scale account for well over half of capital and operating costs. They are part of the core curriculum of every chemical engineering and process engineering course of study. This book provides comprehensive and fundamental knowledge at the level of university teaching in this discipline...

A process is described for obtaining a closely bonded coating of steel or iron on uranium. The process consists of providing, between the steel and uranium, a layer of silver, and then pressure rolling the assembly at about 600 deg C until a reduction of from 10 to 50% has been obtained.

This book presents a process-oriented business modeling framework based on semantic technologies. The framework consists of modeling languages, methods, and tools that allow for semantic modeling of business motivation, business policies and rules, and business processes. Quality of the proposed modeling framework is evaluated based on the modeling content of SAP Solution Composer and several real-world business scenarios.

Affine processes possess the property that expectations of exponential affine transformations are given by a set of Riccati differential equations, which is the main feature of this popular class of processes. In this paper we generalise these results for expectations of more general transformati...
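
The Riccati characterization mentioned in this abstract can be sketched in standard notation (a textbook formulation, not taken from the paper itself): for an affine process X, conditional exponential moments are exponentially affine in the current state,

```latex
\mathbb{E}\!\left[ e^{\langle u, X_T \rangle} \,\middle|\, \mathcal{F}_t \right]
  = \exp\!\left( \phi(T-t, u) + \langle \psi(T-t, u), X_t \rangle \right),
```

where the coefficient functions solve generalized Riccati ordinary differential equations,

```latex
\partial_\tau \phi(\tau, u) = F\!\left(\psi(\tau, u)\right), \quad \phi(0, u) = 0,
\qquad
\partial_\tau \psi(\tau, u) = R\!\left(\psi(\tau, u)\right), \quad \psi(0, u) = u,
```

with F and R determined by the admissible parameters of the process.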

Literature published in 2016 and early 2017 related to food processing wastes treatment for industrial applications are reviewed. This review is a subsection of the Treatment Systems section of the annual Water Environment Federation literature review and covers the following food processing industries and applications: general, meat and poultry, fruits and vegetables, dairy and beverage, and miscellaneous treatment of food wastes.

Business Process Intelligence (BPI) is an emerging area that is getting increasingly popular for enterprises. The need to improve business process efficiency, to react quickly to changes and to meet regulatory compliance is among the main drivers for BPI. BPI refers to the application of Business

A new notion of correctness for concurrent processes is introduced and investigated. It is a relationship P sat S between process terms P built up from operators of CCS [Mi 80], CSP [Ho 85] and COSY [LTS 79] and logical formulas S specifying sets of finite communication sequences as in

Covers a broad spectrum of topics and applications that deal with uranium processing and the properties of uranium. Offers extensive coverage of both new and established practices for dealing with uranium supplies in nuclear engineering. Promotes the documentation of the state-of-the-art processing techniques utilized for uranium and other specialty metals.

The DOE will soon choose between treating contaminated nickel scrap as a legacy waste and developing high-volume nickel decontamination processes. In addition to reducing the volume of legacy wastes, a decontamination process could make 200,000 tons of this strategic metal available for domestic use. Contaminants in DOE nickel scrap include Th-234, Pa-234, Cs-137, Pu-239 (trace), Co-60, U, Tc-99, and Np-237 (trace). This report reviews several industrial-scale processes -- electrorefining, electrowinning, vapor metallurgy, and leaching -- used for the purification of nickel. Conventional nickel electrolysis processes are particularly attractive because they use side-stream purification of process solutions to improve the purity of nickel metal. Additionally, nickel purification by electrolysis is effective in a variety of electrolyte systems, including sulfate, chloride, and nitrate. Conventional electrorefining processes typically use a mixed electrolyte which includes sulfate, chloride, and borate. The use of an electrorefining or electrowinning system for scrap nickel recovery could be combined effectively with a variety of processes, including cementation, solvent extraction, ion exchange, complex formation, and surface sorption, developed for uranium and transuranic purification. Selected processes were reviewed and evaluated for use in nickel side-stream purification. 80 refs

This book presents a synopsis of the VE (value engineering) production process: object selection methods and target setting, collection of object information, function design, writing improvement suggestions, evaluating improvement suggestions, the various worksheets of the VE production process, and explanations of IE and PERT.

Innovation is often thought of as an outcome. In this chapter, we review the literatures on innovation processes pertaining to the invention, development, and implementation of ideas. In particular, we explore how these processes unfold within firms, across multi-party networks, and within

Hyperspectral image processing refers to the use of computer algorithms to extract, store and manipulate both spatial and spectral information contained in hyperspectral images across the visible and near-infrared portion of the electromagnetic spectrum. A typical hyperspectral image processing work...
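
As a minimal illustration of the spatial and spectral manipulations described above (the cube dimensions, values and wavelength grid below are invented for this sketch, not taken from the source):

```python
import numpy as np

# Hypothetical cube: 4 x 5 pixels, 100 bands (band axis last); values and
# the wavelength grid are invented for the sketch.
rng = np.random.default_rng(0)
cube = rng.random((4, 5, 100))
wavelengths = np.linspace(400.0, 1000.0, 100)   # nm, visible to near-infrared

# Spatial access: the full reflectance spectrum of one pixel.
spectrum = cube[2, 3, :]                        # shape (100,)

# Spectral access: the single-band image nearest a chosen wavelength.
band_850 = int(np.argmin(np.abs(wavelengths - 850.0)))
image_850 = cube[:, :, band_850]                # shape (4, 5)

# Combined manipulation: a per-pixel band-ratio index (NDVI-like).
band_670 = int(np.argmin(np.abs(wavelengths - 670.0)))
nir, red = cube[:, :, band_850], cube[:, :, band_670]
index = (nir - red) / (nir + red + 1e-12)

print(spectrum.shape, image_850.shape, index.shape)
```

The same slicing pattern extends to real cubes loaded from sensor files; only the array shape and wavelength metadata change.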

Works are presented on automation systems for editing and publishing operations by methods of processing symbol information and information contained in training selection (ranking of objectives by promise, classification algorithm of tones and noise). The book will be of interest to specialists in the automation of processing textural information, programming, and pattern recognition.

A process for the preparation of technetium-99m labeled pharmaceuticals is disclosed. The process comprises initially isolating technetium-99m pertechnetate by adsorption upon an adsorbent packing in a chromatographic column. The technetium-99m is then eluted from the packing with a biological compound to form a radiopharmaceutical

This paper describes the design of a microcontroller-based emulator for a conventional industrial process. The emulator is built around a microcontroller and is used for testing and evaluating the performance of industrial regulators. The parameters of the emulated process are fully customizable online and downloadable through a serial link from a personal computer.
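
The core of such an emulator can be sketched in a few lines; the paper gives no code, so the first-order-plus-dead-time model, parameter names and values below are this sketch's assumptions, chosen only to illustrate run-time-tunable process emulation:

```python
from collections import deque

class FirstOrderProcessEmulator:
    """Discrete first-order lag with dead time: T*dy/dt = K*u(t - L) - y.
    Gain, time constant and dead time can be changed between samples,
    mirroring the 'fully customizable online' parameters in the paper.
    (Model structure and values are assumptions of this sketch.)"""

    def __init__(self, gain=2.0, time_constant=5.0, dead_time_steps=3, dt=0.1):
        self.gain, self.time_constant, self.dt = gain, time_constant, dt
        n = max(1, dead_time_steps)
        self.delay = deque([0.0] * n, maxlen=n)   # input delay line

        self.y = 0.0

    def step(self, u):
        """Advance one sample with input u; return the emulated output."""
        u_delayed = self.delay[0]
        self.delay.append(u)
        # Forward-Euler integration of the first-order lag.
        self.y += self.dt * (self.gain * u_delayed - self.y) / self.time_constant
        return self.y

emu = FirstOrderProcessEmulator()
outputs = [emu.step(1.0) for _ in range(500)]  # unit-step response
print(round(outputs[-1], 3))                   # settles near the gain K = 2.0
```

On a microcontroller the same update would run in a timer interrupt, with the serial link writing new values of gain, time constant and dead time between samples.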

In the cementation process study, the design of the waste treatment simulator was completed in 1984 as the first step. Through the design of the cementation process, we can gain experience not only with the operation of the solidification system but also with the design and construction of the coming large-scale plant. (Author)

Apparatus is described for monitoring the processes of a nuclear reactor to detect off-normal operation of any process and for testing the monitoring apparatus. The processes are evaluated through their parameters, such as temperature, pressure, etc. The apparatus includes a pair of monitoring paths, or signal-processing units. Each unit includes facilities for receiving, on a time-sharing basis, a status binary word made up of digits each indicating the status of a process, whether normal or off-normal, and test-signal binary words simulating the status binary words. The status words and test words are processed in succession during successive cycles. During each cycle, the two units receive the same status word and the same test word. The test words simulate the status words both when they indicate normal operation and when they indicate off-normal operation. Each signal-processing unit includes a pair of memories. Each memory receives a status word or a test word, as the case may be, and converts the received word into a converted status word or a converted test word. The memories of each signal-processing unit feed a non-coincidence circuit, which signals when the converted word from one memory is not identical to the converted word from the other memory of the same unit
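
The dual-memory comparison can be sketched as follows; the word width, table contents and the "conversion" are invented here purely for illustration, only the comparison principle comes from the text:

```python
# Dual-memory comparison sketch: word width, table contents and the
# "conversion" are invented; only the comparison principle is from the text.
WORD_BITS = 8

def make_memory(offset=0):
    # Lookup table converting an 8-bit status/test word; both memories of a
    # healthy unit hold identical tables, 'offset' injects a simulated fault.
    return [((word ^ 0xA5) + offset) & 0xFF for word in range(2 ** WORD_BITS)]

def process_word(word, mem_a, mem_b):
    """Convert 'word' through both memories of one signal-processing unit
    and flag non-coincidence of the two converted words."""
    a, b = mem_a[word], mem_b[word]
    return a, b, a != b

mem_a, mem_b = make_memory(), make_memory()
_, _, alarm = process_word(0x3C, mem_a, mem_b)
print(alarm)    # identical conversions: no non-coincidence

_, _, alarm = process_word(0x3C, mem_a, make_memory(offset=1))
print(alarm)    # differing conversion: non-coincidence signalled
```

Feeding both memories the same status or test word each cycle, as the apparatus does, means any single-memory fault shows up as a non-coincidence.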

This article outlines some cognitive process models of writing composition. Possible reasons why students' writing capabilities do not match their abilities in some other school subjects are explored. Research findings on the efficacy of process approaches to teaching writing are presented and potential shortcomings are discussed. Product-based…

Over time, recommended best practices for crude unit materials selection have evolved to accommodate new operating requirements, feed qualities, and product qualities. The shift to heavier oil processing is one of the major changes in crude feed quality occurring over the last 20 years. The three major types of crude unit corrosion are sulfidation attack, naphthenic acid attack, and corrosion resulting from hydrolyzable chlorides. Heavy oil processing makes all three worse: heavy oils have higher sulfur content, higher naphthenic acid content, and are more difficult to desalt, leading to higher chloride corrosion rates. Materials selection involves two major criteria: meeting required safety standards and optimizing the economics of the overall plant. Proper materials selection is only one component of a plant integrity approach; materials selection cannot eliminate all corrosion, and it requires appropriate support from other elements of an integrity protection program. The elements of integrity preservation include: materials selection (type and corrosion allowance); management limits on allowed operating conditions; feed quality control; chemical additives for corrosion reduction; and preventive maintenance and inspection (PMI). The following discussion must be taken in the context of the application of the required supporting work in all the other areas. Within that context, specific materials recommendations are made to minimize corrosion due to the most common causes in the crude unit. (author)

In a system for the application of high-temperature heat from the HTR, one must distinguish between electricity generation and the use of process heat. In this respect it is important that electricity can be generated by dual-purpose power plants. The process heat is used as sensible heat, as vaporisation heat and as chemical energy in chemical conversions for the conversion of raw materials, the refinement of fossil primary energy carriers and, finally, cycle processes for the splitting of water. These processes supply the market for heat, fuels, motor fuels and basic materials. Fifteen examples of HTR heat processes from various projects and programmes are presented in the form of energy balances, albeit in condensed form. (orig./DG)

In this paper, we propose a class of non-Gaussian stationary increment processes, named nonhomogeneous fractional Poisson processes W_H^(j)(t), which permit the study of the effects of long-range dependence in a large number of fields including quantum physics and finance. The processes W_H^(j)(t) are self-similar in a wide sense, exhibit fatter tails than Gaussian processes, and converge to Gaussian processes in distribution in some cases. In addition, we also show that the intensity function λ(t) strongly influences the existence of the highest finite moment of W_H^(j)(t) and the behaviour of the tail probability of W_H^(j)(t)
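
The role of an intensity function λ(t) can be illustrated by simulating an ordinary nonhomogeneous Poisson process via Lewis-Shedler thinning (a standard technique, not the fractional construction of the paper; the intensity below is invented):

```python
import math
import random

def simulate_nhpp(intensity, t_max, lam_max, rng):
    """Lewis-Shedler thinning: event times of a nonhomogeneous Poisson
    process with intensity(t) <= lam_max on [0, t_max]."""
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)               # candidate from a rate-lam_max HPP
        if t > t_max:
            return events
        if rng.random() < intensity(t) / lam_max:   # keep with probability λ(t)/λ_max
            events.append(t)

rng = random.Random(42)
lam = lambda t: 5.0 + 3.0 * math.sin(t)             # hypothetical smooth intensity
events = simulate_nhpp(lam, t_max=100.0, lam_max=8.0, rng=rng)
print(len(events))   # on average ≈ the integral of λ over [0, 100], about 500
```

Changing λ(t) directly changes where events cluster, which is the mechanism through which the intensity shapes moments and tail behaviour in constructions built on such processes.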

The introduction of on-line sensors for monitoring nutrient salt concentrations at wastewater treatment plants with nutrient removal opens a wide new area of modelling wastewater processes. The subject of this thesis is the formulation of operational dynamic models based on time series...... of ammonia, nitrate, and phosphate concentrations, which are measured in the aeration tanks of the biological nutrient removal system. The alternating operation modes of the BIO-DENITRO and BIO-DENIPHO processes are of particular interest. Time series models of the hydraulic and biological processes are very......-known theory of the processes with the significant effects found in data. These models are called grey box models, and they contain rate expressions for the processes of influent load of nutrients, transport of nutrients between the aeration tanks, hydrolysis and growth of biomass, nitrification...

This is the second of two volumes containing the revised and completed notes of lectures given at the school "Quantum Independent Increment Processes: Structure and Applications to Physics". This school was held at the Alfried-Krupp-Wissenschaftskolleg in Greifswald in March, 2003, and supported by the Volkswagen Foundation. The school gave an introduction to current research on quantum independent increment processes aimed at graduate students and non-specialists working in classical and quantum probability, operator algebras, and mathematical physics. The present second volume contains the following lectures: "Random Walks on Finite Quantum Groups" by Uwe Franz and Rolf Gohm, "Quantum Markov Processes and Applications in Physics" by Burkhard Kümmerer, "Classical and Free Infinite Divisibility and Lévy Processes" by Ole E. Barndorff-Nielsen and Steen Thorbjørnsen, and "Lévy Processes on Quantum Groups and Dual Groups" by Uwe Franz.

Second and third generation bioethanol and biodiesel are more environmentally friendly fuels than gasoline and petrodiesel, and more sustainable than first generation biofuels. However, their production processes are more complex and more expensive. In this chapter, we describe a two-stage synthesis......% used for bioethanol process), and steam and electricity from combustion (54% used as electricity) in the bioethanol and biodiesel processes. In the second stage, we saved about 5% in equipment costs and 12% in utility costs for bioethanol separation. This dual synthesis methodology, consisting of a top......-level screening task followed by a down-level intensification task, proved to be an efficient methodology for integrated biofuel process synthesis. The case study illustrates and provides important insights into the optimal synthesis and intensification of biofuel production processes with the proposed synthesis...

This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines...... the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides...... essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

This book introduces beryllium; its history, its chemical, mechanical, and physical properties including nuclear properties. The 29 chapters include the mineralogy of beryllium and the preferred global sources of ore bodies. The identification and specifics of the industrial metallurgical processes used to form oxide from the ore and then metal from the oxide are thoroughly described. The special features of beryllium chemistry are introduced, including analytical chemical practices. Beryllium compounds of industrial interest are identified and discussed. Alloying, casting, powder processing, forming, metal removal, joining and other manufacturing processes are covered. The effect of composition and process on the mechanical and physical properties of beryllium alloys assists the reader in material selection. The physical metallurgy chapter brings conformity between chemical and physical metallurgical processing of beryllium, metal, alloys, and compounds. The environmental degradation of beryllium and its all...

This paper is about the in-house development of business support software. The developed applications are used to support two business processes: one of them is the transportation of gas and the other is natural gas processing. This software has interfaces with the ERP SAP, SCADA software and on-line gas transportation simulation software. The main functionalities of the applications are: on-line, real-time entry of clients' transport nominations, transport programming, allocation of the clients' transport nominations, transport control, measurements, pipeline balancing, allocation of gas volumes to the gas processing plants, and calculation of the product tons processed in each plant and the product tons distributed to clients. All the developed software generates information for the internal staff, regulatory authorities and clients. (author)

This book provides a theoretical background of branching processes and discusses their biological applications. Branching processes are a well-developed and powerful set of tools in the field of applied probability. The range of applications considered includes molecular biology, cellular biology, human evolution and medicine. The branching processes discussed include Galton-Watson, Markov, Bellman-Harris, Multitype, and General Processes. As an aid to understanding specific examples, two introductory chapters, and two glossaries are included that provide background material in mathematics and in biology. The book will be of interest to scientists who work in quantitative modeling of biological systems, particularly probabilists, mathematical biologists, biostatisticians, cell biologists, molecular biologists, and bioinformaticians. The authors are a mathematician and cell biologist who have collaborated for more than a decade in the field of branching processes in biology for this new edition. This second ex...
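
As a minimal illustration of the Galton-Watson process mentioned above (the Poisson offspring distribution and the parameter values are invented for this sketch):

```python
import math
import random

def poisson(mean, rng):
    # Knuth's multiplicative method; adequate for small means.
    limit, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def galton_watson_generation_sizes(offspring_mean, n_generations, rng):
    """Simulate one Galton-Watson lineage with Poisson(offspring_mean)
    offspring per individual; return the size of each generation."""
    sizes = [1]                          # a single ancestor
    for _ in range(n_generations):
        children = sum(poisson(offspring_mean, rng) for _ in range(sizes[-1]))
        sizes.append(children)
        if children == 0:                # lineage has died out
            break
    return sizes

rng = random.Random(1)
subcritical = galton_watson_generation_sizes(0.8, 50, rng)    # mean < 1: extinction a.s.
supercritical = galton_watson_generation_sizes(1.5, 20, rng)  # mean > 1: may survive
print(len(subcritical), subcritical[-1])
```

Running many such lineages reproduces the classical dichotomy: subcritical lineages go extinct almost surely, while supercritical ones survive with positive probability.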

There has been tremendous development within measurement science and technology over the past couple of decades. New sensor technologies and compact versatile signal recovery electronics are continuously expanding the limits of what can be measured and the accuracy with which this can be done. Miniaturization of sensors and the use of nanotechnology push these limits further. Also, thanks to powerful and cost-effective computer systems, sophisticated measurement and reconstruction algorithms previously only accessible in advanced laboratories are now available for in situ online measurement systems. The process industries increasingly require more process-related information, motivated by key issues such as improved process control, process utilization and process yields, ultimately driven by cost-effectiveness, quality assurance, environmental and safety demands. Industrial process tomography methods have taken advantage of the general progress in measurement science, and aim at providing more information, both quantitatively and qualitatively, on multiphase systems and their dynamics. The typical approach for such systems has been to carry out one local or bulk measurement and assume that this is representative of the whole system. In some cases, this is sufficient. However, there are many complex systems where the component distribution varies continuously and often unpredictably in space and time. The foundation of industrial tomography is to conduct several measurements around the periphery of a multiphase process, and use these measurements to unravel the cross-sectional distribution of the process components in time and space. This information is used in the design and optimization of industrial processes and process equipment, and also to improve the accuracy of multiphase system measurements in general. In this issue we are proud to present a selection of the 145 papers presented at the 5th World Congress on Industrial Process Tomography in Bergen

Formal reasoning about distributed algorithms (like Consensus) typically requires analyzing global states in a traditional state-based style. This is in contrast to the traditional action-based reasoning of process calculi. Nevertheless, we use domain-specific variants of the latter, as they are convenient modeling languages in which the local code of processes can be programmed explicitly, with the local state information usually managed via parameter lists of process constants. However, domain-specific process calculi are often equipped with (unlabeled) reduction semantics, building upon a rich and convenient notion of structural congruence. Unfortunately, the price for this convenience is that the analysis is cumbersome: the set of reachable states is considered modulo structural congruence, and the processes' state information is very hard to identify. We extract from congruence classes of reachable states individual state-informative representatives that we supply with a proper formal semantics. As a result, we can now freely switch between the process calculus terms and their representatives, and we can use the stateful representatives to perform assertional reasoning on process calculus models.

Deals with the main commercially significant and commonly used welding processes. This title takes the student or novice welder through the individual steps involved in each process in an easily understood way. It covers many of the requirements referred to in European Standards, including EN 719, EN 729 and EN 287. Welding Processes Handbook is a concise, explanatory guide to the main commercially significant and commonly used welding processes. It takes the novice welder or student through the individual steps involved in each process in a clear and easily understood way. It is intended to provide an up-to-date reference to the major applications of welding as they are used in industry. The contents have been arranged so that it can be used as a textbook for European welding courses in accordance with guidelines from the European Welding Federation. Welding processes and equipment necessary for each process are described so that they can be applied to all instruction levels required by the EWF and th...

Nuclear activities produce organic waste compatible with thermal processes designed to obtain a significant weight and volume reduction as well as to stabilize the inorganic residue in a form suitable for various interim storage or disposal routes. Several processes may be implemented (e.g. excess air, plasma, fluidized bed or rotating furnace) depending on the nature of the waste and the desired objectives. The authors focus on the IRIS rotating-kiln process, which was used for the first time with radioactive materials during the first half of 1999. IRIS is capable of processing highly chlorinated and α-contaminated waste at a rate of several kilograms per hour, while limiting corrosion due to chlorine as well as mechanical entrainment of radioactive particles in the off-gas stream. Although operated industrially, the process is under continual development to improve its performance and adapt it to a wider range of industrial applications. The main focus of attention today is on adapting the pyrolytic processes to waste with highly variable compositions and to enhance the efficiency of the off-gas purification systems. These subjects are of considerable interest for a large number of heat treatment processes (including all off-gas treatment systems) for which extremely durable, high-performance and low-flow electrostatic precipitators are now being developed. (author)

In most large companies, the IT project prioritization process is designed based on principles of evidence-based management. We investigate a case of IT project prioritization in a financial institution and, in particular, how managers practice evidence-based management during this process. We use...... a rich dataset built from a longitudinal study of the prioritization process for the IT projects. Our findings indicate that managers reach a decision not only by using evidence but from the interplay between the evidence and the judgment devices that managers employ. The interplay between evidence...

One of the most important goals of an introductory programming course is that the students learn a systematic approach to the development of computer programs. Revealing the programming process is an important part of this; however, textbooks do not address the issue -- probably because...... the textbook medium is static and therefore ill-suited to expose the process of programming. We have found that process recordings in the form of captured narrated programming sessions are a simple, cheap, and efficient way of providing the revelation.We identify seven different elements of the programming...

Performance assessment is the process used to evaluate the environmental consequences of disposal of radioactive waste in the biosphere. An introductory review of the subject is presented. Emphasis is placed on the process of performance assessment from the standpoint of defining the process. Performance assessment, from evolving experience at DOE sites, has short-term and long-term subprograms, the components of which are discussed. The role of mathematical modeling in performance assessment is addressed, including the pros and cons of current approaches. Finally, the system/site/technology issues as the focal point of this symposium are reviewed.

Semi-Markov Processes: Applications in System Reliability and Maintenance is a modern view of discrete state space and continuous time semi-Markov processes and their applications in reliability and maintenance. The book explains how to construct semi-Markov models and discusses the different reliability parameters and characteristics that can be obtained from those models. The book is a useful resource for mathematicians, engineering practitioners, and PhD and MSc students who want to understand the basic concepts and results of semi-Markov process theory. Clearly defines the properties and
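The defining feature mentioned in the blurb above — holding times that need not be exponential — can be illustrated with a toy simulation. The sketch below is not an example from the book: it is a minimal two-state (up/down) repairable unit with exponential time-to-failure and uniform repair time, with all parameter values invented for illustration.

```python
import random

# Two-state semi-Markov sketch: "up" has exponential holding time, "down"
# has uniform holding time (the non-exponential dwell makes it semi-Markov
# rather than a plain continuous-time Markov chain). Values are illustrative.

def simulate_availability(mttf=100.0, repair_lo=5.0, repair_hi=15.0,
                          horizon=1e6, seed=42):
    """Estimate long-run availability by simulating up/down cycles."""
    rng = random.Random(seed)
    t, up_time = 0.0, 0.0
    while t < horizon:
        dwell = rng.expovariate(1.0 / mttf)     # holding time in "up"
        up_time += min(dwell, horizon - t)
        t += dwell
        if t >= horizon:
            break
        t += rng.uniform(repair_lo, repair_hi)  # holding time in "down"
    return up_time / horizon

avail = simulate_availability()
# Long-run availability should approach MTTF / (MTTF + mean repair time)
#   = 100 / (100 + 10) ≈ 0.909
print(round(avail, 3))
```

The renewal-reward argument behind the comment — availability equals mean up-time over mean cycle length — is one of the standard results such models deliver.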

Kinetic Theory, Volume 2: Irreversible Processes deals with the kinetic theory of gases and the irreversible processes they undergo. It includes the two papers by James Clerk Maxwell and Ludwig Boltzmann in which the basic equations for transport processes in gases are formulated, together with the first derivation of Boltzmann's "H-theorem" and a discussion of this theorem, along with the problem of irreversibility. Comprised of 10 chapters, this volume begins with an introduction to the fundamental nature of heat and of gases, along with Boltzmann's work on the kinetic theory of gases and s

Described herein are processes for converting a biomass starting material (such as lignocellulosic materials) into a low oxygen containing, stable liquid intermediate that can be refined to make liquid hydrocarbon fuels. More specifically, the process can be a catalytic biomass pyrolysis process wherein an oxygen removing catalyst is employed in the reactor while the biomass is subjected to pyrolysis conditions. The stream exiting the pyrolysis reactor comprises bio-oil having a low oxygen content, and such stream may be subjected to further steps, such as separation and/or condensation to isolate the bio-oil.

The Defense Waste Processing Facility (DWPF) for waste vitrification at the Savannah River Plant (SRP) is in the final design stage. Instrumentation to provide the parameter sensing required to assure the quality of the two-foot-diameter, ten-foot-high waste canister is in the final stage of development. All steps of the process and instrumentation are now operating as nearly full-scale prototypes at SRP. Quality will be maintained by assuring that only the intended material enters the canisters, and by sensing the resultant condition of the filled canisters. Primary emphasis will be on instrumentation of the process

Three advanced uranium enrichment processes are dealt with in the report: AVLIS (Atomic Vapour Laser Isotope Separation), MLIS (Molecular Laser Isotope Separation) and PSP (Plasma Separation Process). The description of the physical and technical features of the processes constitutes a major part of the report. It further presents comparisons with existing industrially used enrichment technologies, gives information on actual development programmes and budgets, and ends with a chapter on perspectives and conclusions. An extensive bibliography of the relevant open literature is added to the different subjects discussed. The report was drawn up by the Nuclear Research Centre (CEA) at Saclay on behalf of the Commission of the European Communities.

VLSI Electronics: Microstructure Science, Volume 8: Plasma Processing for VLSI (Very Large Scale Integration) discusses the utilization of plasmas for general semiconductor processing. It also includes expositions on advanced deposition of materials for metallization, lithographic methods that use plasmas as exposure sources and for multiple resist patterning, and device structures made possible by anisotropic etching. This volume is divided into four sections. It begins with the history of plasma processing, a discussion of some of the early developments and trends for VLSI. The second section

When granular materials comprising radioactive wastes containing phosphorus are processed at first in a fluidized bed type furnace, if the granular materials are phosphorus-containing activated carbon, granular materials comprising an alkali compound such as calcium hydroxide or barium hydroxide are used as fluidizing media. Even granular materials with a slow burning speed can be burnt stably in a fluidized state by the high temperature heat of the fluidizing media, thereby allowing a long burning time. Accordingly, radioactive activated carbon wastes can be processed by burning treatment. (T.M.)

This paper analyses and compares the transnational learning processes in the employment field in the European Union and among the Nordic countries. Based theoretically on a social constructivist model of learning and methodologically on a questionnaire distributed to the relevant participants, a number of hypotheses concerning transnational learning processes are tested. The paper closes with a number of suggestions regarding an optimal institutional setting for facilitating transnational learning processes. Key words: Transnational learning, Open Method of Coordination, Learning, Employment, European Employment Strategy, European Union, Nordic countries.

Learn computer programming the easy way with Processing, a simple language that lets you use code to create drawings, animation, and interactive graphics. Programming courses usually start with theory, but this book lets you jump right into creative and fun projects. It's ideal for anyone who wants to learn basic programming, and serves as a simple introduction to graphics for people with some programming skills. Written by the founders of Processing, this book takes you through the learning process one step at a time to help you grasp core programming concepts. You'll learn how to sketch wi

Quantum processing and communication is emerging as a challenging technique at the beginning of the new millennium. This is an up-to-date insight into the current research of quantum superposition, entanglement, and the quantum measurement process - the key ingredients of quantum information processing. The authors further address quantum protocols and algorithms. Complementary to similar programmes in other countries and at the European level, the German Research Foundation (DFG) started a focused research program on quantum information in 1999. The contributions - written by leading experts - bring together the latest results in quantum information as well as addressing all the relevant questions

This final report documents the development and installation of software and hardware for Robotic Welding Process Control. Primary emphasis is on serial communications between the CYRO 750 robotic welder, Heurikon minicomputer running Hunter & Ready VRTX, and an IBM PC/AT, for offline programming and control and closed-loop welding control. The requirements for completion of the implementation of the Rocketdyne weld tracking control are discussed. The procedure for downloading programs from the Intergraph, over the network, is discussed. Conclusions are made on the results of this task, and recommendations are made for efficient implementation of communications, weld process control development, and advanced process control procedures using the Heurikon.

The high cost of laser energy is the crucial issue in any potential laser-processing application. It is expensive relative to other forms of energy and to most bulk chemicals. We show those factors that have previously frustrated attempts to find commercially viable laser-induced processes for the production of materials. Having identified the general criteria to be satisfied by an economically successful laser process and shown how these imply the laser-system requirements, we present a status report on the uranium laser isotope separation (LIS) program at the Lawrence Livermore National Laboratory

During the past few years significant advances have taken place in the different areas of dosimetry for radiation processing, mainly stimulated by the increased interest in radiation for food preservation, plastic processing and sterilization of medical products. Reference services both by international organizations (IAEA) and national laboratories have helped to improve the reliability of dose measurements. In this paper the special features of radiation processing dosimetry are discussed, several commonly used dosimeters are reviewed, and factors leading to traceable and reliable dosimetry are discussed. (author)

The SILVA isotopic laser separation process of atomic uranium vapor requires the use of specific high power visible light laser devices and systems for uranium evaporation and management (separation modules). The CEA, in collaboration with industrialists, has developed these components and built some demonstration plants. The scientific and technological challenges raised by this process are now surmounted. The principle of the SILVA process is the selective photo-ionization of uranium isotopes using laser photon beams tuned to the exact excitation frequency of the isotope electron layers. This paper describes the principle of the SILVA process (lasers and separator), the technical feasibility and actual progress of the program and its future steps, its economical stakes, and the results obtained so far. (J.S.). 2 figs., 2 photos

The purpose of this research is to identify and develop cognitive information processing systems and algorithms that can be implemented with novel architectures and devices with the goal of achieving...

Department of Homeland Security — This map layer includes nonferrous metal processing plants in the United States. The data represent commodities covered by the Minerals Information Team (MIT) of the...

Building upon the previous editions, this textbook is a first course in stochastic processes taken by undergraduate and graduate students (MS and PhD students from math, statistics, economics, computer science, engineering, and finance departments) who have had a course in probability theory. It covers Markov chains in discrete and continuous time, Poisson processes, renewal processes, martingales, and option pricing. One can only learn a subject by seeing it in action, so there are a large number of examples and more than 300 carefully chosen exercises to deepen the reader’s understanding. Drawing from teaching experience and student feedback, there are many new examples and problems with solutions that use TI-83 to eliminate the tedious details of solving linear equations by hand, and the collection of exercises is much improved, with many more biological examples. Originally included in previous editions, material too advanced for this first course in stochastic processes has been eliminated while treatm...
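One of the textbook's core topics, the Poisson process, rests on a construction that fits in a few lines: arrival times are cumulative sums of i.i.d. exponential inter-arrival gaps. The sketch below is mine, not the book's; the function name and parameters are illustrative.

```python
import random

def poisson_arrivals(lam, t_end, seed=0):
    """Return arrival times of a rate-`lam` Poisson process on [0, t_end]."""
    rng = random.Random(seed)
    arrivals, t = [], 0.0
    while True:
        t += rng.expovariate(lam)   # exponential gap with mean 1/lam
        if t > t_end:
            return arrivals
        arrivals.append(t)

# Sanity check: N(t) ~ Poisson(lam * t), so counts over [0, 50] at rate 2
# should average close to 100.
counts = [len(poisson_arrivals(2.0, 50.0, seed=s)) for s in range(200)]
mean_count = sum(counts) / len(counts)
print(round(mean_count, 1))
```

Seeing the exponential-gap construction in code makes the memorylessness of the process concrete: each gap is drawn independently of everything that came before.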

This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models... These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice, followed by a series of case studies drawn from a variety... to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision-making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners.

The hyperspectral digital imagery collection experiment (HYDICE) sensor records instrument counts for scene data, in-flight spectral and radiometric calibration sequences, and dark current levels onto an AMPEX DCRsi data tape. Following flight, the HYDICE ground data processing subsystem (GDPS) transforms selected scene data from digital numbers (DN) to calibrated radiance levels at the sensor aperture. This processing includes: dark current correction, spectral and radiometric calibration, conversion to radiance, and replacement of bad detector elements. A description of the algorithms for post-flight data processing is presented. A brief analysis of the original radiometric calibration procedure is given, along with a description of the development of the modified procedure currently used. Example data collected during the 1995 flight season, both uncorrected and processed, are shown to demonstrate the removal of apparent sensor artifacts (e.g., non-uniformities in detector response over the array) as a result of this transformation.

This dissertation presents our investigation on how to efficiently exploit reconfigurable hardware to design flexible, high performance, and power efficient network devices capable of adapting to varying processing requirements of network applications and traffic. The proposed reconfigurable network

In this paper the researchers discuss a number of structural problems that are faced when designing a machine-oriented controlled natural language for Afrikaans, taking the underlying principles of Attempto Controlled English (ACE) and Processable...

State of electric discharge is detected based on a gas pressure in a sealed container and a discharging current flowing between both of electrodes. When electric arc discharges occur, introduction of gases to be processed is stopped and a voltage applied to both of the electrodes is interrupted. Then, when the gas pressure in the sealed container is lowered to a predetermined value, a power source voltage is applied again to both of the electrodes to recover glow discharges, and the introduction of the gas to be processed is started. With such steps, even if electric arc discharges occur, they are eliminated automatically and, accordingly, normal glow discharges can be recovered, to prevent failures of the device due to electric arc discharges. The glow discharges are recovered automatically without stopping the operation of the gas processing device, and gas injection and solidification processing can be conducted continuously and stably. (T.M.)
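The arc-recovery sequence described above (stop the gas feed and cut the voltage when an arc is detected; reapply power and resume gas when the pressure falls to a safe value) is a small state machine. The following is a hypothetical sketch of that sequence, not the patent's implementation: the I/O object, function names, and thresholds are all stand-ins for real hardware interfaces.

```python
# Hypothetical controller for the glow-discharge recovery sequence above.
# Sensor/actuator calls are stand-ins; thresholds are illustrative.

GLOW, RECOVERING = "glow", "recovering"

def step(state, pressure, current, io, arc_current=5.0, safe_pressure=0.1):
    """Advance the controller one sensor sample; return the new state."""
    if state == GLOW and current > arc_current:
        io.stop_gas_feed()           # stop introducing gas to be processed
        io.cut_electrode_voltage()   # interrupt voltage to quench the arc
        return RECOVERING
    if state == RECOVERING and pressure < safe_pressure:
        io.apply_electrode_voltage() # reapply power: glow discharge recovers
        io.start_gas_feed()          # resume gas introduction
        return GLOW
    return state

class FakeIO:
    """Records actuator calls so the sequence can be demonstrated."""
    def __init__(self):
        self.log = []
    def __getattr__(self, name):
        return lambda: self.log.append(name)

io = FakeIO()
s = step(GLOW, pressure=0.5, current=8.0, io=io)  # arc detected
s = step(s, pressure=0.05, current=0.0, io=io)    # pressure has dropped
print(s, io.log)
```

The point of the state machine is the one the abstract makes: recovery happens automatically, without stopping the overall operation of the gas processing device.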

Cooperative processing for the 1990's using client-server technology is addressed. The main theme is concepts of downsizing from mainframes and minicomputers to workstations on a local area network (LAN). This document is presented in view graph form.

Economic scale of radiation application in the field of industry, agriculture and medicine in Japan in 1997 was investigated to compare its economic impacts with that of nuclear energy industry. Total production value of radiation application accounted for 54% of nuclear industry including nuclear energy industry and radiation applications in three fields above. Industrial radiation applications were further divided into five groups, namely nondestructive test, RI instruments, radiation facilities, radiation processing and ion beam processing. More than 70% of the total production value was brought about by ion beam processing for use with IC and semiconductors. Future economic prospect of radiation processing of polymers, for example cross-linking, EB curing, graft polymerization and degradation, is reviewed. Particular attention was paid to radiation vulcanization of natural rubber latex and also to degradation of natural polymers. (S. Ohno)

Purpose – The purpose of this paper is to learn more about logistics innovation processes and their implications for the focal organization as well as the supply chain, especially suppliers. Design/methodology/approach – The empirical basis of the study is a longitudinal action research project that was triggered by the practical needs of new ways of handling material flows of a hospital. This approach made it possible to revisit theory on the logistics innovation process. Findings – Apart from the tangible benefits reported to the case hospital, five findings can be extracted from this study: the logistics innovation process model may include not just customers but also suppliers; logistics innovation in buyer-supplier relations may serve as an alternative to outsourcing; logistics innovation processes are dynamic and may improve supplier partnerships; logistics innovations in the supply chain are as dependent...

A timely and important book addressing a variety of acoustic signal processing problems under multiple-input multiple-output (MIMO) scenarios. It uniquely investigates these problems within a unified framework offering a novel and penetrating analysis.

Radiation processing is a relatively young industry with broad applications and considerable commercial success. Dosimetry provides an independent and effective way of developing and controlling many industrial processes. In the sterilization of medical devices and in food irradiation, where the radiation treatment impacts directly on public health, the measurements of dose provide the official means of regulating and approving its use. In this respect, dosimetry provides the operator with a means of characterizing the facility, of proving that products are treated within acceptable dose limits and of controlling the routine operation. This book presents an up-to-date review of the theory, data and measurement techniques for radiation processing dosimetry in a practical and useful way. It is hoped that this book will lead to improved measurement procedures, more accurate and precise dosimetry and a greater appreciation of the necessity of dosimetry for radiation processing. (author)

The article deals with the theoretical basis of simulation. The study shows that the simulation of logistic processes in industrial countries is an integral part of many economic projects aimed at the creation or improvement of logistics systems. The paper used the Beer Game model for the management of logistics processes in the enterprise. The simulation model is implemented in the AnyLogic package. The AnyLogic product allows us to consider the logistics processes as an integrated system, which allows reaching better solutions. Logistics process management involves pooling the sales market, production and distribution to ensure the required level of customer service at the lowest overall cost. This made it possible to conduct experiments and to determine the optimal size of the warehouse at the lowest cost.
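The core dynamic the Beer Game demonstrates — order variability amplified by delays and stock corrections (the bullwhip effect) — fits in a few lines. The sketch below is my own simplification for illustration, not the paper's AnyLogic model; the ordering rule, target stock, and delay are invented parameters.

```python
# Toy single-echelon Beer-Game-style model: order observed demand plus a
# correction toward a target stock, with a fixed shipping delay.
# All parameters are illustrative assumptions.

def simulate(demand, target=12, delay=2):
    """Return the order stream produced by the replenishment rule."""
    stock, pipeline, orders = target, [0] * delay, []
    for d in demand:
        stock += pipeline.pop(0)              # receive delayed shipment
        stock -= d                            # serve customer demand
        order = max(0, d + (target - stock))  # replenish toward target
        pipeline.append(order)
        orders.append(order)
    return orders

demand = [4] * 5 + [8] * 10   # a single step up in customer demand...
orders = simulate(demand)
print(max(orders) > max(demand))  # ...yields orders exceeding the new level
```

Even this minimal rule overshoots and oscillates after a demand step, which is exactly the behavior full supply-chain simulations are used to diagnose and dampen.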

Completely self-contained-and heavily illustrated-this introduction to basic concepts and methodologies for digital image processing is written at a level that truly is suitable for seniors and first...

Optical time lenses have proven to be very versatile for advanced optical signal processing. Based on a controlled interplay between dispersion and phase-modulation by e.g. four-wave mixing, the processing is phase-preserving, and hence useful for all types of data signals including coherent multi-level modulation formats. This has enabled processing of phase-modulated spectrally efficient data signals, such as orthogonal frequency division multiplexed (OFDM) signals. In that case, a spectral telescope system was used, using two time lenses with different focal lengths (chirp rates), yielding a spectral... regeneration. These operations require a broad bandwidth nonlinear platform, and novel photonic integrated nonlinear platforms like aluminum gallium arsenide nano-waveguides used for 1.28 Tbaud optical signal processing will be described.

This chapter presents an overview of the compounding and processing techniques of natural rubber compounds. The introductory portion deals with different types of rubbers and the principles of rubber compounding. The primary and secondary fillers used...

Reasons of the development of desalination processes, the modern desalination technologies, such as multi-stage flash evaporation, multi-effect distillation, reverse osmosis, and the prospects of using nuclear power for desalination purposes are discussed. 9 refs

This short course is divided into three sections devoted respectively to the physics of the process, some practical problems raised by the design of a centrifuge and the present situation of centrifugation in the World. 31 figs., 18 refs

National Oceanic and Atmospheric Administration, Department of Commerce — Collection of annual data on processed seafood products. The Division provides authoritative advice, coordination and guidance on matters related to the collection,...

National Aeronautics and Space Administration — The MPC (Model Process Control) language enables the capture, communication and preservation of a simulation instance, with sufficient detail that it can be...

...-year graduate students in almost any technical discipline. The leading textbook in its field for more than twenty years, it continues its cutting-edge focus on contemporary developments in all mainstream areas of image processing-e.g...

Evaporation has been an established technology in the metal finishing industry for many years. In this process, wastewaters containing reusable materials, such as copper, nickel, or chromium compounds are heated, producing a water vapor that is continuously removed and condensed....

Continuous Cooling Transformation diagram (CCT diagram): When an IT diagram is used in heat process modelling, we suppose that a sudden cooling (instantaneous...) processes. CE chooses instead to study thermo-mechanical properties referring to a CCT diagram. This is thought to be more reliable in giving a true... This determination is however based on the following approximations: i) A CCT diagram is valid only for the

To ensure proper radioactive drug use (such as quality, diagnostic improvement, and minimal radioactive exposure), the Food and Drug Administration evaluates new drugs with respect to safety, effectiveness, and accuracy and adequacy of the labeling. The IND or NDA process is used for this purpose. A brief description of the process, including the Chemical Classification System and the therapeutic potential classification, is presented as it applies to radiopharmaceuticals. Also, the status of the IND or NDA review of radiopharmaceuticals is given

A conditioning process is qualified by the PTB if the execution of pre-treatment and conditioning occurs so that a safe and orderly final storage of the products and waste containers produced can be assumed. All the relevant operating conditions for the plant are laid down by the producer/conditioner of the waste in a handbook. The elements of product inspection by process qualification are shown in tabular form. (DG)

A treatment process for a hydrogen-containing off-gas stream from a refinery, petrochemical plant or the like. The process includes three separation steps: condensation, membrane separation and hydrocarbon fraction separation. The membrane separation step is characterized in that it is carried out under conditions at which the membrane exhibits a selectivity in favor of methane over hydrogen of at least about 2.5.
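The selectivity criterion quoted above is a simple ratio of permeances. The back-of-the-envelope sketch below illustrates the check; the permeance values are illustrative placeholders, not figures from the patent.

```python
# Membrane selectivity for methane over hydrogen = ratio of permeances.
# Permeance values below are assumed for illustration only.

def selectivity(permeance_ch4, permeance_h2):
    """CH4/H2 selectivity as a dimensionless permeance ratio."""
    return permeance_ch4 / permeance_h2

# E.g. a rubbery membrane favoring the larger, more condensable methane:
s = selectivity(permeance_ch4=250.0, permeance_h2=80.0)
print(s, s >= 2.5)   # meets the "at least about 2.5" condition
```

Selectivity in favor of methane over hydrogen is the unusual feature here: most membranes pass small, fast-diffusing hydrogen preferentially, so the criterion constrains the choice of membrane material.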

This paper describes the development of a design and prototype production system for novel structural use of networked small components of wood deploying elastic and plastic bending. The design process engaged with a significant number of different overlapping and interrelated design criteria... simulation and feedback. The outcome was amplified through carrying out the research over a series of workshops with distinct foci and participation. Two full scale demonstrators have so far been constructed and exhibited as outputs of the process.

Embodiments of the present disclosure provide for methods of using a catalytic system to chemically transform a compound (e.g., a hydrocarbon). In an embodiment, the method does not employ grafting the catalyst prior to catalysis. In particular, embodiments of the present disclosure provide for a process of hydrocarbon (e.g., C1 to C20 hydrocarbon) metathesis (e.g., alkane, olefin, or alkyne metathesis) transformation, where the process can be conducted without employing grafting prior to catalysis.

We discuss strategic renewal from a competence perspective. We argue that the management of speed and timing in this process is viewed distinctively when perceived through a cognitive lens. Managers need more firmly grounded process-understanding. The key idea of this paper is to dynamically conceptualize key activities of strategic renewal, and possible sources of break-down as they relate to the management of speed and timing. Based on a case from the media industry, we identi...

The goals of this first Gedepeon workshop on hydrogen production processes are: to stimulate the information exchange about research programs and research advances in the domain of hydrogen production processes, to indicate the domains of interest of these processes and the potentialities linked with the coupling of a nuclear reactor, and to establish the actions of common interest for the CEA, the CNRS, and eventually EDF, that can be funded in the framework of the Gedepeon research group. This document gathers the slides of the 17 presentations given at this workshop, dealing with: the H2 question and the international research programs (Lucchese P.); the CEA's research program (Lucchese P., Anzieu P.); processes based on the iodine/sulfur cycle: efficiency of a facility - flow-sheets, efficiencies, hard points (Borgard J.M.), R and D about the I/S cycle: Bunsen reaction (Colette S.), R and D about the I/S cycle: the HI/I2/H2O system (Doizi D.), demonstration loop/chemical engineering (Duhamet J.), materials and corrosion (Terlain A.); other processes under study: the Westinghouse cycle (Eysseric C.), other processes under study at the CEA (UT3, plasma,...) (Lemort F.), database about thermochemical cycles (Abanades S.), Zn/ZnO cycle (Broust F.), H2 production by cracking, high temperature reforming with carbon trapping (Flamant G.), membrane technology (De Lamare J.); high-temperature electrolysis: SOFC used as electrolyzers (Grastien R.); generic aspects linked with hydrogen production: technical-economical evaluation of processes (Werkoff F.), thermodynamic tools (Neveu P.), the reactor-process coupling (Aujollet P.). (J.S.)

This book gives descriptions of the qualifying subjects and test scope: production planning and control, economic feasibility, process management, quality management and operations research; industrial economics such as materials and marketing management; production management such as the meaning and goals of process management and production planning and control; basic economic concepts such as official interest, equivalence, and depreciation; and OR concepts such as network analysis, PERT/CPM and simulation.

This book gives descriptions of processing harmful waste, including the relevant law and definition of harmful waste, current conditions and generation of harmful waste in Korea, the international situation of harmful waste, minimization of harmful waste generation, and treatment and storage. It also covers basic science for harmful waste disposal (physics, chemistry, combustion engineering, microbiology) and disposal techniques such as physical, chemical and biological processes, stabilization and solidification, incineration, and waste landfill.

Natural information processing systems such as biological evolution and human cognition organize information used to govern the activities of natural entities. When dealing with biologically secondary information, these systems can be specified by five common principles that we propose underlie natural information processing systems. The principles equate: (1) human long-term memory with a genome; (2) learning from other humans with biological reproduction; (3) problem solving through random ...

An overview of the recent progress in the area of digital processing of binary images in the context of document processing is presented here. The topics covered include input scan, adaptive thresholding, halftoning, scaling and resolution conversion, data compression, character recognition, electronic mail, digital typography, and output scan. Emphasis has been placed on illustrating the basic principles rather than descriptions of a particular system. Recent technology advances and research in this field are also mentioned.

Experience from the prior successful sale of many companies in different business activities tells us that it is necessary to create an approach system flexible to different buyers and environments. The basis of this system is the belief that salesmen can stimulate big buyers to make buying decisions if the selling process is done well. Emphasis is placed on practical selling techniques which are used throughout the selling process.

Synroc is a titanate-based ceramic material currently being developed for immobilizing high-level nuclear reactor wastes in solid form. Synroc D is a unique variation of Synroc. It can contain the high-level defense wastes, particularly those in storage at the Savannah River Plant. In this report, we review the early development of the initial Synroc process, discuss modification and other options that simplify it overall, and recommend the future direction of research and development in the processing area. A reference Synroc process is described briefly and contrasted with the Savannah River Laboratory glass-based reference case. Preliminary engineering layouts show Synroc to be a more complex processing operation and, thus, more expensive than the glass-based process. However, we believe that simplifications, which will significantly reduce the cost difference, are possible. Further research and development will continue in the areas of slurry processing, fluidized bed calcination, and mineralization. This last will use sintering, hot uniaxial pressing, or hot isostatic pressing

The use of ultrasound to promote chemical reactions, or sonochemistry, is a field of chemistry which involves the process of acoustic cavitation, i.e. the collapse of microscopic bubbles in a liquid. There are two essential components for the application of sonochemistry: a liquid medium and a source of high-energy vibrations. The liquid medium is necessary because sonochemistry is driven by acoustic cavitation, which can only occur in liquids. The source of the vibrational energy is the transducer. The chemical effects of ultrasound include the enhancement of reaction rates at ambient temperatures and striking advancements in stoichiometric and catalytic reactions. In some cases, ultrasonic irradiation can increase reactivities by nearly a million fold. Ultrasound has a large number of applications, not only in improving old chemical processes but also in developing new synthetic strategies. Ultrasound enhances many chemical and physical processes, e.g., crystallization, vitamin synthesis, preparation of catalysts, dissolution of chemicals, organometallic reactions, electrochemical processes, etc. High-power ultrasonics is a new, powerful technology that is not only safe and environmentally friendly in its application but is also efficient and economical. It can be applied to existing processes to eliminate the need for chemicals and/or heat application in a variety of industrial processes. (author)

Innovation leadership has traditionally been focused on leading the company's product development fast, cost-effectively and with an optimal performance, driven by technological inventions or by customers' needs. To improve the efficiency of the product development process, focus has been on different types of organisational setup for the product development model and process. Globalization and increasingly competitive markets are, however, changing the innovation game and the challenge to innovation leadership: excellent product development innovation and leadership seem no longer to be enough. This paper presents another outlook on future innovation leadership, Customer Innovation Process Leadership (CIP-leadership). CIP-leadership moves the company's innovation process closer to the customer's innovation process and discusses how companies can be involved and innovate in customers' future needs and lead...

In the era of research infrastructures and big data, sophisticated data management practices are becoming essential building blocks of successful science. Most practices follow a data-centric approach, which does not take into account the processes that created, analysed and presented the data. This fact limits the possibilities for reliable verification of results. Furthermore, it does not guarantee the reuse of research, which is one of the key aspects of credible data-driven science. For that reason, we propose the introduction of the new concept of Process Management Plans, which focus on the identification, description, sharing and preservation of entire scientific processes. They enable verification and later reuse of the result data and processes of scientific experiments. In this paper we describe the structure and explain the novelty of Process Management Plans by showing in what way they complement existing Data Management Plans. We also highlight key differences and major advantages, as well as references to tools and solutions that can facilitate the introduction of Process Management Plans.

Pennsylvania is committed to finding a site for a low-level radioactive waste (LLRW) disposal facility through an innovative voluntary process. The Pennsylvania Department of Environmental Protection (DEP) and Chem-Nuclear Systems, Inc. (CNSI) developed the Community Partnering Plan with extensive public participation. The Community Partnering Plan outlines a voluntary process that empowers municipalities to evaluate the advantages and disadvantages of hosting the facility. DEP and CNSI began developing the Community Partnering Plan in July 1995. Before then, CNSI was using a screening process prescribed by state law and regulations to find a location for the facility. So far, approximately 78 percent of the Commonwealth has been identified as disqualified as a site for the LLRW disposal facility. The siting effort will now focus on identifying volunteer host municipalities in the remaining 22 percent of the state. This combination of technical screening and voluntary consideration makes Pennsylvania's process unique. A volunteered site will have to meet the same tough requirements for protecting people and the environment as a site chosen through the screening process. Protection of public health and safety continues to be the foundation of the state's siting efforts. The Community Partnering Plan offers a window of opportunity. If Pennsylvania does not find volunteer municipalities with suitable sites by the end of 1997, it probably will return to a technical screening process.

In distributed business process support environments, process interference from multiple stakeholders may cause erroneous process outcomes. Existing solutions detect and correct interference at runtime by means of formal verification and the automatic generation of intervention processes.

Advanced materials will require improved processing methods due to high melting points, low toughness or ductility values, high reactivity with air or ceramics, and typically complex crystal structures with significant anisotropy in flow and/or fracture stress. Materials for structural applications at elevated temperature in critical systems will require processing with a high degree of control. This requires an improved understanding of the relationship between process variables and microstructure, to enable control systems to achieve consistently high quality. One avenue to the required level of understanding is computer simulation. Past attempts at process modeling have been hampered by incomplete data regarding thermophysical or mechanical material behavior. Some of the required data can be calculated. Owing to advances in software and hardware, the accuracy and cost of such calculations are now comparable to those of acquiring experimental data. Such calculations can, for example, be done at an atomic level to compute lattice energy, fault energies, density of states and charge densities. These can lead to fundamental information about the competition between slip and fracture, anisotropy of bond strength (and therefore cleavage strength), cohesive strength, adhesive strength, elastic modulus, thermal expansion and possibly other quantities which are difficult (and therefore expensive) to measure. Some of these quantities can be fed into a process model. It is probable that temperature dependencies can be derived numerically as well. Examples are given of the beginnings of such an approach for Ni₃Al and MoSi₂. Solidification problems exemplify the state of the art of process modeling and adequately demonstrate the need for extensive input data. Such processes can be monitored in terms of interfacial position vs. time, cooling rate and thermal gradient.

Recently, many investigations have focused on the development of novel mild food processing techniques with the aim of obtaining high-quality food products. It is presumed that they could also substitute for some of the traditional processes in the food industry. The investigations are primarily directed at the use of high hydrostatic pressure, ultrasound, tribomechanical micronization, microwaves and pulsed electric fields. The results of scientific research indicate that the application of some of these processes in a particular food industry can yield many benefits: significant energy savings, shortened process duration, mild thermal conditions, and food products with better sensory characteristics and higher nutritional value. As some of these techniques also act at the molecular level, changing the conformation, structure and electrical potential of organic as well as inorganic materials, improvement of some functional properties of these components may occur. Common characteristics of all of these techniques are treatment at ambient or insignificantly higher temperatures and short processing times (1 to 10 minutes). High hydrostatic pressure applied to various foodstuffs can destroy some microorganisms, successfully modify molecular conformation and consequently improve the functional properties of foods. At the same time it acts positively on food products intended for freezing. Tribomechanical treatment causes micronization of various solid materials, resulting in nanoparticles and changes in the structure and electrical potential of molecules; significant improvement of some rheological and functional properties of the materials can therefore occur. Ultrasound treatment has proved to be a potentially very successful food processing technique. It can be used as a pretreatment to drying (it decreases drying time and improves the functional properties of food) and as an extraction process for various components

Radiation processing is a very convenient tool for imparting desirable effects in polymeric materials, and it has been an area of enormous interest in the last few decades. The success of radiation technology for processing of synthetic polymers can be attributed to two reasons: their ease of processing in various shapes and sizes, and the fact that most of these polymers undergo crosslinking reactions upon exposure to radiation. In recent years, natural polymers are being looked at with renewed interest because of their unique characteristics, such as inherent biocompatibility, biodegradability and easy availability. Traditionally, the commercial exploitation of natural polymers like carrageenans, alginates or starch has been based, to a large extent, on empirical knowledge. But now, applications of natural polymers are being sought in knowledge-demanding areas such as pharmacy and biotechnology, which is acting as a locomotive for further scientific research into their structure-function relationships. Selected success stories concerning radiation-processed natural polymers and the application of their derivatives in the health care products industry and agriculture are reported. This publication will be of interest to individuals at nuclear institutions worldwide that have programmes of R and D and applications in radiation processing technologies. New developments in radiation processing of polymers and other natural raw materials give insight into converting them into useful products for everyday life, human health and environmental remediation. The book will also be of interest to other specialists, including managers and decision makers in industry (health care, food and agriculture), helping them to understand the important role of radiation processing technology for polysaccharides.

Health behavior theories focus on the role of conscious, reflective factors (e.g., behavioral intentions, risk perceptions) in predicting and changing behavior. Dual-process models, on the other hand, propose that health actions are guided not only by a conscious, reflective, rule-based system but also by a nonconscious, impulsive, associative system. This article argues that research on health decisions, actions, and outcomes will be enriched by greater consideration of nonconscious processes. A narrative review is presented that delineates research on implicit cognition, implicit affect, and implicit motivation. In each case, we describe the key ideas, how they have been taken up in health psychology, and the possibilities for behavior change interventions, before outlining directions that might profitably be taken in future research. Correlational research on implicit cognitive and affective processes (attentional bias and implicit attitudes) has recently been supplemented by intervention studies using implementation intentions and practice-based training that show promising effects. Studies of implicit motivation (health goal priming) have also observed encouraging findings. There is considerable scope for further investigations of implicit affect control, unconscious thought, and the automatization of striving for health goals. Research on nonconscious processes holds significant potential that can and should be developed by health psychologists. Consideration of impulsive as well as reflective processes will engender new targets for intervention and should ultimately enhance the effectiveness of behavior change efforts. PsycINFO Database Record (c) 2013 APA, all rights reserved.

Organisations are currently challenged by demands for increased collaborative innovation internally as well as with external and new entities, e.g. across the value chain. The authors seek to develop new approaches to managing collaborative innovative processes in the context of open innovation and public private innovation partnerships. Based on a case study of a collaborative design process in a large electronics company, the paper points to the key importance of staging and navigation of the collaborative innovation process. Staging and navigation is presented as a combined activity: 1) to translate the diverse matters of concern into a coherent product or service concept, and 2) in the same process move these diverse holders of the matters of concern into a translated actor network which carries or supports the concept.

Range Process Simulation Tool (RPST) is a computer program that assists managers in rapidly predicting and quantitatively assessing the operational effects of proposed technological additions to, and/or upgrades of, complex facilities and engineering systems such as the Eastern Test Range. Originally designed for application to space transportation systems, RPST is also suitable for assessing the effects of proposed changes in industrial facilities and large organizations. RPST follows a model-based approach that includes finite-capacity schedule analysis and discrete-event process simulation. A component-based, scalable, open architecture makes RPST easily and rapidly tailorable for diverse applications. Specific RPST functions include: (1) definition of analysis objectives and performance metrics; (2) selection of process templates from a process-template library; (3) configuration of process models for detailed simulation and schedule analysis; (4) design of operations-analysis experiments; (5) schedule- and simulation-based process analysis; and (6) optimization of performance by use of genetic algorithms and simulated annealing. The main benefits afforded by RPST are the provision of information that can be used to reduce the costs of operation and maintenance, and the capability for affordable, accurate, and reliable prediction and exploration of the consequences of many alternative proposed decisions.
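
As a hedged illustration of the finite-capacity schedule analysis the record mentions (not RPST's actual implementation; all names here are invented), a few lines of Python can show how jobs queue through sequential single-capacity processing steps:

```python
def simulate(release_times, step_durations):
    """Finite-capacity schedule sketch: each job visits every processing
    step in order, and each step handles one job at a time.
    Returns the completion time of each job."""
    free_at = [0] * len(step_durations)  # when each step's resource frees up
    completions = []
    for release in sorted(release_times):
        t = release
        for i, d in enumerate(step_durations):
            start = max(t, free_at[i])  # wait until the step is free
            t = start + d               # occupy the step for its duration
            free_at[i] = t
        completions.append(t)
    return completions

# Three jobs released at t=0, 1, 2 through two steps taking 2 and 3 time units:
print(simulate([0, 1, 2], [2, 3]))  # → [5, 8, 11]
```

Even this toy version exposes the kind of question a schedule analysis answers: the second step is the bottleneck, so later jobs finish at 3-unit intervals regardless of release spacing.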

This book is the first in a set of forthcoming books focussed on state-of-the-art development in the VLSI Signal Processing area. It is a response to the tremendous research activities taking place in that field. These activities have been driven by two factors: the dramatic increase in demand for high-speed signal processing, especially in consumer electronics, and the evolving microelectronic technologies. The available technology has always been one of the main factors in determining algorithms, architectures, and design strategies to be followed. With every new technology, signal processing systems go through many changes in concepts, design methods, and implementation. The goal of this book is to introduce the reader to the main features of VLSI Signal Processing and the ongoing developments in this area. The focus of this book is on: • Current developments in Digital Signal Processing (DSP) processors and architectures - several examples and case studies of existing DSP chips are discussed in...

Since operation of the Transuranium Processing Plant began, process changes have been made to counteract problems caused by equipment corrosion, to satisfy new processing requirements, and to utilize improved processes. The new processes, equipment, and techniques have been incorporated into a sequence of steps which satisfies all required processing functions.

This research program consists of the development of acoustic containerless processing systems with applications in materials science research, as well as the production of new materials: solid forms with novel and unusual microstructures, fusion target spheres, and improved optical fibers. Efforts have been focused on containerless processing at high temperatures for producing new kinds of glasses. Some development has also occurred in containerlessly supporting liquids at room temperature, with applications in studies of fluid dynamics, potential undercooling of liquids, etc. The high-temperature area holds the greatest promise for producing new kinds of glasses and ceramics, new alloys, and possibly unusual structural shapes, such as very uniform hollow glass shells for fusion target applications. High-temperature acoustic levitation required for containerless processing has been demonstrated in low-g environments as well as in ground-based experiments. Future activities include continued development of the single-axis acoustic levitator.

The complete sequence used to manufacture complementary metal oxide semiconductor (CMOS) integrated circuits is described. The fixed-gate array concept is presented as a means of obtaining CMOS integrated circuits in a fast and reliable fashion. Examples of CMOS circuits fabricated by both the conventional method and the fixed-gate array method are included. The electrical parameter specifications and characteristics are given along with typical values used to produce CMOS circuits. Temperature-bias stressing data illustrating the thermal stability of devices manufactured by this process are presented. Results of a preliminary study on the radiation sensitivity of circuits manufactured by this process are discussed. Some process modifications are given which have improved the radiation hardness of our CMOS devices. A formula description of the chemicals and gases along with the gas flow rates is also included.

The purpose of this book is twofold: to provide a brief, simple introduction to the theory of radiation and its application in astrophysics, and to serve as a reference manual for researchers. The first part of the book consists of a discussion of the basic formulas and concepts that underlie the classical and quantum descriptions of radiation processes. The rest of the book is concerned with applications. The spirit of the discussion is to present simple derivations that will provide some insight into the basic physics involved and then to state the exact results in a form useful for applications. The reader is referred to the original literature and to reviews for rigorous derivations. The wide range of topics covered is illustrated by the following table of contents: Basic Formulas for Classical Radiation Processes; Basic Formulas for Quantum Radiation Processes; Cyclotron and Synchrotron Radiation; Electron Scattering; Bremsstrahlung and Collision Losses; Radiative Recombination; The Photoelectric Effect; a...

The essence of the nursing process can be summed up in this quotation by Sir Francis Bacon: “Human knowledge and human powers meet in one; for where the cause is not known the effect cannot be produced.” Arriving at a concise, accurate definition of the nursing process was, for me, an impossible task. It is altogether too vast and too personal a topic to contract down into a nifty-looking, we-pay-lip-service-to-it cliché. So what I propose to do is to present my understanding of the nursing process throughout this essay, and then to leave the reader with some overall, general impression of what it all entails.

This is the revised and enlarged 2nd edition of the authors’ original text, which was intended to be a modest complement to Grenander's fundamental memoir on stochastic processes and related inference theory. The present volume gives a substantial account of regression analysis, both for stochastic processes and measures, and includes recent material on Ridge regression with some unexpected applications, for example in econometrics. The first three chapters can be used for a quarter or semester graduate course on inference on stochastic processes. The remaining chapters provide more advanced material on stochastic analysis suitable for graduate seminars and discussions, leading to dissertation or research work. In general, the book will be of interest to researchers in probability theory, mathematical statistics and electrical and information theory.

Enzyme catalysts have the potential to improve both the process economics and the environmental profile of many oxidation reactions, especially in the fine- and specialty-chemical industry, due to their exquisite ability to perform stereo-, regio- and chemo-selective oxidations at ambient conditions. Examples include the oxidation of alcohols to aldehydes and ketones, oxyfunctionalization of C-H bonds, and epoxidation of C-C double bonds. Although oxygen-dependent biocatalysis offers many possibilities, there are numerous challenges to be overcome before an enzyme can be implemented in an industrial process. These challenges require a combined effort; notably, many of these enzymes operate far below their potential maximum catalytic rate at industrially relevant oxygen concentrations. Detailed knowledge of the enzyme kinetics is therefore required in order to determine the best operating conditions and design the oxygen supply to minimize processing costs. This is enabled...

Combining accurate fluid property databases with a commercial equation-solving software package running on a desktop computer allows simulation of cryogenic processes without extensive computer programming. Computer simulation can be a powerful tool for process development or optimization. Most engineering simulations to date have required extensive programming skills in languages such as Fortran, Pascal, etc. Authors of simulation code have also usually been responsible for choosing and writing the particular solution algorithm. This paper describes a method of simulating cryogenic processes with a commercial software package on a desktop personal computer that does not require these traditional programming tasks. Applications include modeling of cryogenic refrigerators, heat exchangers, vapor-cooled power leads, vapor pressure thermometers, and various other engineering problems.
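
The equation-solving approach can be sketched in a few lines: state the governing relation and let a root-finder do the rest. The example below inverts an Antoine-type vapor-pressure correlation by bisection, as a vapor-pressure thermometer does; the coefficients are illustrative placeholders, not real fluid-property data.

```python
def antoine_pressure(T, A=6.8, B=405.0, C=-5.0):
    """Antoine-type vapor-pressure correlation, log10(P) = A - B/(T + C).
    Coefficients are illustrative placeholders, not real fluid data."""
    return 10 ** (A - B / (T + C))

def invert_for_temperature(p_measured, lo=20.0, hi=120.0, tol=1e-9):
    """Bisection: find T such that antoine_pressure(T) == p_measured,
    i.e. the reading of a vapor-pressure thermometer."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if antoine_pressure(mid) < p_measured:
            lo = mid  # pressure rises with T, so T must be higher
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

With a real property database substituted for the placeholder correlation, the same pattern solves heat-exchanger energy balances or power-lead boil-off without hand-coding a solution algorithm.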

A Total Process Surveillance system is under development which can provide, in real time, additional process information from a limited number of raw measurement signals. This is achieved by using a robust model-based observer to generate estimates of the process's internal states. The observer utilises the analytical redundancy among a diverse range of transducers and can thus accommodate off-normal conditions which lead to transducer loss or damage. The modular hierarchical structure of the system enables the maximum amount of information to be assimilated from the available instrument signals, no matter how diverse. This structure also constitutes a data reduction path, reducing operator cognitive overload from a large number of varying, and possibly contradictory, raw plant signals. (orig.)
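
A minimal sketch of such a model-based observer (a discrete-time Luenberger observer for a hypothetical two-state process; the matrices and gains below are invented for illustration, not taken from the system described):

```python
def luenberger_step(x_hat, u, y_meas, A, B, C, L):
    """One step of a discrete-time Luenberger observer:
    x_hat' = A x_hat + B u + L (y_meas - C x_hat).
    The correction term exploits the redundancy between the model
    prediction and the actual transducer reading.
    Two states, scalar input/output, plain lists for clarity."""
    y_pred = C[0] * x_hat[0] + C[1] * x_hat[1]
    innov = y_meas - y_pred  # innovation: measurement minus prediction
    return [
        A[0][0] * x_hat[0] + A[0][1] * x_hat[1] + B[0] * u + L[0] * innov,
        A[1][0] * x_hat[0] + A[1][1] * x_hat[1] + B[1] * u + L[1] * innov,
    ]
```

The gain L trades noise sensitivity against convergence speed; if a transducer is lost, its gain entry can be set to zero and the corresponding estimate coasts on the model alone, which is the accommodation behaviour the abstract describes.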

In repositories for nuclear waste there are many processes which will be instrumental in corroding the canisters and releasing the nuclides. Based on experiences from studies on the performance of repositories and on an actual design the major mechanisms influencing the integrity and performance of a repository are described and discussed. The paper addresses only conditions in crystalline rock repositories. The low water flow rate in fractures and channels plays a dominant role in limiting the interaction between water and waste. Molecular diffusion in the backfill and rock matrix as well as in the mobile water is an important transport process but actually limits the exchange rate because diffusive transport is slow. Solubility limits of both waste matrix and of individual nuclides are also important. Complicating processes include gas generation by iron corrosion and alpha-radiolysis. (au) (19 refs., 2 figs.)

Signal processing is the discipline of extracting information from collections of measurements. To be effective, the measurements must be organized and then filtered, detected, or transformed to expose the desired information. Distortions caused by uncertainty, noise, and clutter degrade the performance of practical signal processing systems. In aggressively uncertain situations, the full truth about an underlying signal cannot be known. This book develops the theory and practice of signal processing systems for these situations that extract useful, qualitative information using the mathematics of topology -- the study of spaces under continuous transformations. Since the collection of continuous transformations is large and varied, tools which are topologically-motivated are automatically insensitive to substantial distortion. The target audience comprises practitioners as well as researchers, but the book may also be beneficial for graduate students.

...as a major neurotransmitter in the central and peripheral nervous systems. The tissue-specific expression of the hormones is regulated at the transcriptional level, but the posttranslational phase is also decisive, and is highly complex in order to ensure accurate maturation of the prohormones in a cell ... picomolar concentrations, whereas gastrin is expressed at higher levels, and accordingly gastrin circulates in 10-20-fold higher concentrations. Both common cancers and the less frequent neuroendocrine tumors express the gastrin gene and prohormone. But the posttranslational ..., as are structural features of progastrin that are involved in the precursor activation process. Thus, the review describes how the processing depends on the cell-specific expression of the processing enzymes and kinetics in the secretory pathway.

This volume is the first of two volumes containing the revised and completed notes of lectures given at the school "Quantum Independent Increment Processes: Structure and Applications to Physics". This school was held at the Alfried-Krupp-Wissenschaftskolleg in Greifswald during the period March 9 – 22, 2003, and supported by the Volkswagen Foundation. The school gave an introduction to current research on quantum independent increment processes aimed at graduate students and non-specialists working in classical and quantum probability, operator algebras, and mathematical physics. The present first volume contains the following lectures: "Lévy Processes in Euclidean Spaces and Groups" by David Applebaum, "Locally Compact Quantum Groups" by Johan Kustermans, "Quantum Stochastic Analysis" by J. Martin Lindsay, and "Dilations, Cocycles and Product Systems" by B.V. Rajarama Bhat.

It is anticipated that the coupled utilization of coal and nuclear energy will achieve great importance in the future, the coal serving mainly as raw material and nuclear energy more as primary energy. Prerequisite for this development is the availability of high-temperature reactors, the state of development of which is described here. Raw materials for coupled use with nuclear process heat are petroleum, natural gas, coal, lignite, and water. Steam reformers heated by nuclear process heat, which are suitable for numerous processes, are expected to find wide application. The article describes several individual methods, all based on the transport of gas in pipelines, which could be utilized for the long distance transport of 'nuclear energy'.

This book gives an overview of the fundamentals and applications of laser-matter interactions, in particular with regard to laser material processing. Special attention is given to laser-induced physical and chemical processes at gas-solid, liquid-solid, and solid-solid interfaces. Starting with the background physics, the book proceeds to examine applications of lasers in “standard” laser machining and laser chemical processing (LCP), including the patterning, coating, and modification of material surfaces. This fourth edition has been enlarged to cover the rapid advances in the understanding of the dynamics of materials under the action of ultrashort laser pulses, and to include a number of new topics, in particular the increasing importance of lasers in various different fields of surface functionalizations and nanotechnology. In two additional chapters, recent developments in biotechnology, medicine, art conservation and restoration are summarized. Graduate students, physicists, chemists, engineers, a...

The growth of commercial radiation processing has been largely dependent on achievements in the production of reliable and less expensive radiation facilities, as well as on research and development efforts for new applications. Although world statistics on this growth are not available, Figure 20-1 shows steady growth in the number of EBAs installed in Japan for various purposes. The growth rate of Co-60 sources supplied by AECL (Atomic Energy of Canada Limited), which supplies approximately 80% of the world market, is approximately 10% per year, including future growth estimates. Potential applications of radiation processing under development are in environmental conservation (e.g., treatment of sewage sludge, waste water, and exhaust gases) and bioengineering (e.g., immobilization of bioactive materials). The authors plan to introduce here the characteristics of radiation processing, examples of its industrial applications, the status of its research and development activities, and an economic analysis.

CRC Press author R. Mohan Sankaran is the winner of the 2011 Peter Mark Memorial Award "… for the development of a tandem plasma synthesis method to grow carbon nanotubes with unprecedented control over the nanotube properties and chirality." -2011 AVS Awards Committee"Readers who want to learn about how nanomaterials are processed, using the most recent methods, will benefit greatly from this book. It contains very recent technical details on plasma processing and synthesis methods used by current researchers developing new nano-based materials, with all the major plasma-based processing techniques used today being thoroughly discussed."-John J. Shea, IEEE Electrical Insulation Magazine, May/June 2013, Vol. 29, No. 3.

The present thesis considers numerical modeling of activated sludge tanks at municipal wastewater treatment plants. Focus is aimed at integrated modeling, where the detailed microbiological model, the Activated Sludge Model 3 (ASM3), is combined with a detailed hydrodynamic model based on a numerical solution of the Navier-Stokes equations in a multiphase scheme. After a general introduction to the activated sludge tank as a system, the activated sludge tank model is gradually set up in separate stages. The individual sub-processes that often occur in activated sludge tanks are initially ... hydrofoil-shaped propellers. These two sub-processes deliver the main part of the supplied energy to the activated sludge tank, and for this reason they are important for the mixing conditions in the tank. For other important processes occurring in the activated sludge tank, existing models and measurements...

The radionuclides ⁴⁰K, ⁸¹Kr, ⁸⁷Rb, ⁹³Zr, ¹⁰⁷Pd, ¹⁴⁷Sm, ¹⁷⁶Lu and ²⁰⁵Pb are built up totally or partially by the s-process. Due to their long half-lives they are potential chronometers for the age and the development of the s-process. The usefulness of the various nuclei is discussed. For the determination of the mean age of s-process synthesis, and with it the age of the galaxy, ¹⁷⁶Lu is best suited. It is demonstrated that this age can be calculated solely from measured cross-section and abundance ratios. Various effects which can limit the usefulness of ¹⁷⁶Lu as a clock are discussed. (orig.)
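
The decay-clock logic behind such chronometers can be sketched in a few lines. This is a single-production-event simplification, not the full mean-age analysis the record refers to, and the half-life in the example is only approximate:

```python
import math

def decay_age(produced_over_surviving, half_life):
    """Radioactive decay clock: N(t) = N0 * 2**(-t / half_life),
    so the elapsed time is t = half_life * log2(N0 / N)."""
    return half_life * math.log2(produced_over_surviving)

# If s-process systematics fixed the produced abundance at twice what
# survives today, one half-life (roughly 3.8e10 yr for 176Lu) has elapsed:
print(decay_age(2.0, 3.8e10))  # → 3.8e10
```

The point of the abstract is that the ratio N0/N need not be measured directly: it follows from cross-section and abundance ratios, because for s-only nuclides the product of neutron-capture cross section and produced abundance is locally smooth.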

...and a constructivist multiple-criteria decision-making analysis method is selected for developing the work further. The method is introduced and applied to the renovation of a multi-residential historic building. Furthermore, a new scheme, the Integrated Renovation Process, is presented. Finally, the methodology is applied to two single-family homes. In practice, such a scheme allowed most informational barriers to sustainable home renovation to be overcome. The homeowners were better integrated, and their preferences and immaterial values were better taken into account. They assimilated the multiple benefits ... To keep the decision-making process economically viable and timely, the process still needs to be improved and new tools need to be developed.

This comprehensive and engaging textbook introduces the basic principles and techniques of signal processing, from the fundamental ideas of signals and systems theory to real-world applications. Students are introduced to the powerful foundations of modern signal processing, including the basic geometry of Hilbert space, the mathematics of Fourier transforms, and essentials of sampling, interpolation, approximation and compression. The authors discuss real-world issues and hurdles to using these tools, and ways of adapting them to overcome problems of finiteness and localisation, the limitations of uncertainty and computational costs. Standard engineering notation is used throughout, making mathematical examples easy for students to follow, understand and apply. It includes over 150 homework problems and over 180 worked examples, specifically designed to test and expand students' understanding of the fundamentals of signal processing, and is accompanied by extensive online materials designed to aid learning, ...
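As a toy illustration of the sampling and Fourier ideas the book covers, the sketch below samples a sinusoid well above its Nyquist rate and locates its spectral peak with a naive O(N^2) DFT; the frequencies and lengths are illustrative choices, not examples from the book.

```python
import cmath
import math

def dft(x):
    """Naive O(N^2) discrete Fourier transform -- fine for a short demo."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

# Sample a 5 Hz sinusoid at 64 Hz for one second: comfortably above the
# 10 Hz Nyquist rate, so the tone lands cleanly in bin k = 5.
fs, f0, N = 64, 5, 64
x = [math.sin(2 * math.pi * f0 * n / fs) for n in range(N)]
X = dft(x)
peak_bin = max(range(1, N // 2), key=lambda k: abs(X[k]))  # strongest positive-frequency bin
```

Because exactly five periods fit in the window, the peak sits at bin 5 with magnitude N/2.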

The phenomenon of process damping as a stabilising effect in milling has been encountered by machinists since milling and turning began. It is of great importance when milling aerospace alloys, where maximum surface speed is limited by excessive tool wear and high-speed stability lobes cannot be attained. Much of the established research into regenerative chatter and chatter avoidance has focussed on stability lobe theory, with different analytical and time domain models developed to expand on the theory first developed by Tlusty and Tobias. Process damping is a stabilising effect that occurs when the surface speed is low relative to the dominant natural frequency of the system, and it has been less successfully modelled and understood. Process damping is believed to be influenced by the interference of the relief face of the cutting tool with the waveform traced on the cut surface, with material properties and the relief geometry of the tool believed to be key factors governing performance. This study combines experimental trials with Finite Element (FE) simulation in an attempt to identify and understand the key factors influencing process damping performance in titanium milling. Rake angle, relief angle and chip thickness are the variables considered experimentally, with the FE study looking at average radial and tangential forces and surface compressive stress. For the experimental study a technique is developed to identify the critical process damping wavelength as a means of measuring process damping performance. For the range of parameters studied, chip thickness is found to be the dominant factor, with maximum stable parameters increased by a factor of 17 in the best case. Within the range studied, relief angle was found to have a lesser effect than expected whilst rake angle had an influence.
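To first order, the surface wavelength relevant to process damping is the surface speed divided by the dominant chatter frequency; a minimal sketch of that relation, with units and numbers that are illustrative assumptions, not the study's data:

```python
def surface_wavelength_mm(surface_speed_m_min, frequency_hz):
    """Wavelength of the waveform traced on the cut surface at a given
    chatter frequency: lambda = surface speed / frequency. Short
    wavelengths (low speed, high frequency) favour process damping."""
    v_mm_s = surface_speed_m_min * 1000.0 / 60.0  # m/min -> mm/s
    return v_mm_s / frequency_hz

# Illustrative numbers only: 30 m/min surface speed against a 1 kHz
# dominant mode traces a 0.5 mm surface wavelength.
wl = surface_wavelength_mm(30.0, 1000.0)
```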

The systems engineering process for the concept definition phase of the program involves requirements definition, system definition, and consistent concept definition. The requirements definition process involves obtaining a complete understanding of the system requirements based on customer needs, mission scenarios, and nuclear thermal propulsion (NTP) operating characteristics. A system functional analysis is performed to provide comprehensive traceability and verification of top-level requirements down to detailed system specifications, and provides significant insight into the measures of system effectiveness to be utilized in system evaluation. The second key element in the process is the definition of system concepts to meet the requirements. This part of the process involves engine system and reactor contractor teams developing alternative NTP system concepts that can be evaluated against specific attributes, as well as a reference configuration against which to compare system benefits and merits. Quality function deployment (QFD), as an excellent tool within Total Quality Management (TQM) techniques, can provide the required structure and a link to the voice of the customer in establishing critical system qualities and their relationships. The third element of the process is the consistent performance comparison. The comparison process involves validating developed concept data and quantifying system merits through analysis, computer modeling, simulation, and rapid prototyping of the proposed high-risk NTP subsystems. The maximum possible amount of quantitative data will be developed and/or validated for use in the QFD evaluation matrix. If, upon evaluation, a new concept or its associated subsystems are determined to have substantial merit, those features will be incorporated into the reference configuration for subsequent system definition and comparison efforts.

This textbook covers the entire Business Process Management (BPM) lifecycle, from process identification to process monitoring, covering along the way process modelling, analysis, redesign and automation. Concepts, methods and tools from business management, computer science and industrial

A welding method is provided for forming a weld joint between first and second elements of a workpiece. The method includes heating the first and second elements to form an interface of material in a plasticized or melted state between the elements. The interface material is then allowed to cool to a plasticized state if previously in a melted state. The interface material, while in the plasticized state, is then mixed, for example using a grinding/extruding process, to remove any dendritic-type weld microstructures introduced into the interface material during the heating process.

The sudden death of Pablo Picasso's closest friend Carlos Casagemas in 1901 came as a great shock to the young Picasso. From a young age, Picasso had ruminated on life and death; however, this was his first experience of bereavement. Following the death of Casagemas, Picasso's paintings can be seen as a diary of his grieving process, and they clearly illustrate the five stages of grief as outlined by Kübler-Ross in 'On Death and Dying' (1969). Published by the BMJ Publishing Group Limited.

A process for converting an actinide metal such as thorium, uranium, or plutonium to an actinide oxide material, by admixing the actinide metal in an aqueous medium with a hypochlorite as an oxidizing agent for sufficient time to form the actinide oxide material and then recovering it, is described together with a low-temperature process for preparing an actinide oxide nitrate such as uranyl nitrate. Additionally, a composition of matter comprising the reaction product of uranium metal and sodium hypochlorite is provided, the reaction product being an essentially insoluble uranium oxide material suitable for disposal or long-term storage.

In modern medicine, imaging is the most effective tool for diagnostics, treatment planning and therapy. Almost all modalities have moved to directly digital acquisition techniques, and processing of this image data has become an important option for future health care. This book is written by a team of internationally recognized experts from all over the world. It provides a brief but complete overview of medical image processing and analysis, highlighting recent advances made in academia. Color figures are used extensively to illustrate the methods and help the reader to understand the complex topics.

Complex dynamic processes exhibit many complicated patterns of evolution. How can all these patterns be recognized using only output (observational, experimental) data, without prior knowledge of the equations of motion? A powerful method for doing this is based on symbolic dynamics: (1) present the output data in symbolic form (a trial language); (2) construct topological and metric entropies; (3) develop algorithms for computer optimization of the entropies; (4) by maximizing the entropies, find the most appropriate symbolic language for the purpose of pattern recognition; (5) test the method on a variety of dynamical models from nonlinear science. The authors are in the process of applying this method to the analysis of MHD fluctuations in tokamaks.
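Steps (1)-(4) above can be sketched in a few lines: binarize the output data with a threshold partition, estimate a block entropy over short words, and search for the threshold that maximizes it. The threshold grid, block length and toy signal below are illustrative choices, not the authors' algorithm.

```python
import math
from collections import Counter

def symbolize(series, threshold):
    """Step (1): a two-symbol trial language from a threshold partition."""
    return ''.join('1' if v > threshold else '0' for v in series)

def block_entropy(symbols, L):
    """Step (2): Shannon entropy (bits) of length-L words, a finite-size
    estimate of the metric entropy of the symbolic sequence."""
    words = [symbols[i:i + L] for i in range(len(symbols) - L + 1)]
    counts = Counter(words)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# Steps (3)-(4): toy data plus a brute-force search over thresholds for
# the maximum-entropy partition.
series = [math.sin(0.7 * n) for n in range(500)]
best = max((t / 10 for t in range(-9, 10)),
           key=lambda t: block_entropy(symbolize(series, t), 3))
h = block_entropy(symbolize(series, best), 3)
```

For two symbols and block length 3, the entropy is bounded by 3 bits, so the maximized value is easy to sanity-check.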

A method of reversibly brazing surfaces together. An interface is affixed to each surface. The interfaces can be affixed by processes such as mechanical joining, welding, or brazing. The two interfaces are then brazed together using a brazing process that does not defeat the surface to interface joint. Interfaces of materials such as Ni-200 can be affixed to metallic surfaces by welding or by brazing with a first braze alloy. The Ni-200 interfaces can then be brazed together using a second braze alloy. The second braze alloy can be chosen so that it minimally alters the properties of the interfaces to allow multiple braze, heat and disassemble, rebraze cycles.

The concept of instrumentation and control in nuclear power plants incorporates the use of process computers for tasks which are on-line with respect to real-time requirements but not closed-loop with respect to closed-loop control. The general scope of tasks is: - alarm annunciation on CRTs - data logging - data recording for post-trip reviews and plant behaviour analysis - nuclear data computation - graphic displays. Process computers are additionally used for dedicated tasks such as the aeroball measuring system and the turbine stress evaluator. Further applications are personnel dose supervision and access monitoring. (orig.)

A status report on the uranium Laser Isotope Separation (LIS) Program at the Lawrence Livermore National Laboratory is presented. Before the status report, a process economic analysis is presented so as to show how the unique properties of laser photons can best be utilized in the production of materials and components despite the high cost of laser energy. The characteristics of potential applications that are necessary for success are identified, and the factors that have so far frustrated attempts to find commercially viable laser-induced chemical and physical processes for the production of new or existing materials are pointed out.

Signals occurring in microdosimetric measurements cover a dynamic range of 100 dB at a counting rate which normally stays below 10^4 but could increase significantly in case of an accident. The need for high resolution at low energies, non-linear signal processing to accommodate the specified dynamic range, easy calibration and thermal stability are conflicting requirements which pose formidable design problems. These problems are reviewed, and a practical approach to their solution employing a single processing channel is given. (author)
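One generic way to fit a 100 dB input dynamic range into a single channel is logarithmic compression. The sketch below is an illustrative transfer function under that assumption, not the instrument's actual nonlinear processing.

```python
import math

def log_compress(amplitude, full_scale=1.0, floor_db=-100.0):
    """Map a 100 dB input dynamic range onto a linear 0..1 output:
    0 at the noise floor, 1 at full scale. A generic stand-in for the
    nonlinear channel, not the published design."""
    level = max(abs(amplitude) / full_scale, 10 ** (floor_db / 20.0))
    db = 20.0 * math.log10(level)
    return (db - floor_db) / -floor_db

lo = log_compress(1e-5)   # at the -100 dB floor
hi = log_compress(1.0)    # at full scale
mid = log_compress(1e-2)  # -40 dB, i.e. 60% of the compressed range
```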

The present paper is situated in the discussion about how sustainable measures are integrated into the design process by architectural offices. It presents results from interviews with four leading Danish architectural offices working with sustainable architecture and their experiences with it, as well...... as the requirements they meet in terms of how to approach the design process – especially focused on the early stages, such as a competition. The interviews focus on their experiences with working in multidisciplinary teams and using digital tools to support their work on sustainable issues. The interviews show...... the environmental measures cannot be discarded due to extra costs....

A process is presented for recovering uranium values from calutron deposits. The process consists of treating such deposits to produce an oxidized acidic solution containing uranium together with the following impurities: Cu, Fe, Cr, Ni, Mn and Zn. The uranium is recovered from this impurity-bearing solution by adjusting the pH of the solution to the range 1.5 to 3.0 and then treating the solution with hydrogen peroxide. This results in the precipitation of uranium peroxide which is substantially free of the metal impurities in the solution. The peroxide precipitate is then separated from the solution, washed, and calcined to produce uranium trioxide.

Hard exclusive processes in high-energy electron-proton scattering offer the opportunity to access a new generation of parton distributions, the so-called generalized parton distributions (GPDs). These functions provide more detailed information about the structure of the nucleon than the usual PDFs obtained from DIS. In this work we present a detailed analysis of exclusive processes, especially of hard exclusive meson production. We investigated the influence of exclusively produced mesons on the semi-inclusive production of mesons at fixed-target experiments like HERMES. Furthermore, we give a detailed analysis of higher-order corrections (NLO) for the exclusive production of mesons over a very broad range of kinematics. (orig.)

Growing a technology-based new venture is a complex process because these ventures are embedded in turbulent environments that require fast organisational and managerial transformation. This chapter addresses the evolutionary process of such ventures. It seeks to provide insight into the link...... between organisational growth and managerial role transformation in technology-based new ventures. The chapter begins by reviewing existing literature on organisational growth patterns and establishing a link to managerial roles in order to elucidate the basic premises of the study. The chapter...... for understanding the link between organisational growth and managerial role transformation....

A process for purifying graphite comprising: comminuting graphite containing mineral matter to liberate at least a portion of the graphite particles from the mineral matter; mixing the comminuted graphite particles containing mineral matter with water and hydrocarbon oil to form a fluid slurry; separating a water phase containing mineral matter and a hydrocarbon oil phase containing graphite particles; and separating the graphite particles from the hydrocarbon oil to obtain graphite particles reduced in mineral matter. Depending upon the purity of the graphite desired, steps of the process can be repeated one or more times to provide a progressively purer graphite.

In this article, we introduce the so-called stochastic conditional intensity (SCI) model by extending Russell's (1999) autoregressive conditional intensity (ACI) model by a latent common dynamic factor that jointly drives the individual intensity components. We show by simulations that the proposed...... model allows for a wide range of (cross-)autocorrelation structures in multivariate point processes. The model is estimated by simulated maximum likelihood (SML) using the efficient importance sampling (EIS) technique. By modeling price intensities based on NYSE trading, we provide significant evidence...... for a joint latent factor and show that its inclusion allows for an improved and more parsimonious specification of the multivariate intensity process...

Process Energy Reduction (PER) is a demand-side energy reduction approach which complements and often supplants other traditional energy reduction methods such as conservation and heat recovery. Because the application of PER is less obvious than the traditional methods, it takes some time to learn the steps as well as practice to become proficient in its use. However, the benefit is significant, often far outweighing the traditional energy reduction approaches. Furthermore, the method usually results in a better process having less waste and pollution along with improved yields, increased capacity, and lower operating costs

Purpose - The purpose of this paper is to present insights into operations strategy (OS) in practice. It outlines a conceptualization and model of OS processes and, based on findings from an in-depth and longitudinal case study, contributes to further development of extant OS models and methods......; taking place in five dimensions of change - technical-rational, cultural, political, project management, and facilitation; and typically unfolding as a sequential and parallel, ordered and disordered, planned and emergent as well as top-down and bottom-up process. The proposed OS conceptualization...

Some conclusions of this presentation are: (1) radiation-assisted nanotechnology applications will continue to grow; (2) the APPF will provide a unique focus for radiolytic processing of nanomaterials in support of DOE-DP, other DOE and advanced manufacturing initiatives; (3) γ, X-ray, e-beam and ion beam processing will increasingly be applied for 'green' manufacturing of nanomaterials and nanocomposites; and (4) biomedical science and engineering may ultimately be the biggest application area for radiation-assisted nanotechnology development.

The invention concerns improvements to processes for the pyrogenation of carbonaceous materials in the presence of hydrocarbon wetting agents, notably processes for solid fossil fuels. The improvements consist chiefly in adding to the wetting agent and/or the carbonaceous materials to be heated a catalyst constituted by at least one mineral material derived from a polyvalent element such as vanadium, molybdenum, iron, manganese, nickel, cobalt or tin, or by a derivative of such an element, the catalyst being added in proportions from a few ten-thousandths to a few hundredths by weight, for example up to five hundredths.

This presentation describes the process used to collect, review, integrate, and assess the research requirements for research and payload activities conducted on the ISS. The presentation describes where the requirements originate, to whom they are submitted, how they are integrated into a requirements plan, and how that integrated plan is formulated and approved. After reviewing this presentation, one should have an understanding of the planning process that turns payload requirements into an integrated plan specifying the research activities to take place on the ISS.

This paper discusses two classes of distributions, and stochastic processes derived from them: modified stable (MS) laws and normal modified stable (NMS) laws. This extends corresponding results for the generalised inverse Gaussian (GIG) and generalised hyperbolic (GH) or normal generalised inverse...... Gaussian (NGIG) laws. The wider framework thus established provides, in particular, for added flexibility in the modelling of the dynamics of financial time series, of importance especially as regards OU based stochastic volatility models for equities. In the special case of the tempered stable OU process......

Roughly a decade ago, power consumption and heat dissipation concerns forced the semiconductor industry to radically change its course, shifting from sequential to parallel computing. Unfortunately, improving performance of applications has now become much more difficult than in the good old days of frequency scaling. This is also affecting databases and data processing applications in general, and has led to the popularity of so-called data appliances: specialized data processing engines where software and hardware are sold together in a closed box. Field-programmable gate arrays (FPGAs) incr

A radioactive waste processing container, used for processing radioactive wastes into solidified products suitable for disposal such as underground burial or ocean dumping, is made using cement. As the cement, a calcium sulfoaluminate clinker mainly comprising the calcium sulfoaluminate compound 3CaO·3Al2O3·CaSO4, Portland cement and granulated blast furnace slag is used, for instance. Calcium hydroxide formed from the Portland cement is consumed in the hydration of the calcium sulfoaluminate clinker. Accordingly, calcium hydroxide is substantially eliminated in the cement constituent layer of the container. With such a constitution, damage such as cracking and peeling is less likely, improving durability and safety. (I.N.)

The silicon web process takes advantage of natural crystallographic stabilizing forces to grow long, thin single crystal ribbons directly from liquid silicon. The ribbon, or web, is formed by the solidification of a liquid film supported by surface tension between two silicon filaments, called dendrites, which border the edges of the growing strip. The ribbon can be propagated indefinitely by replenishing the liquid silicon as it is transformed to crystal. The dendritic web process has several advantages for achieving low cost, high efficiency solar cells. These advantages are discussed.

The aim of this research was to determine whether the introduction of e-recruitment has an impact on the process and underlying tasks, subtasks and activities of recruitment. Three large organizations with well-established e-recruitment practices were included in the study. The three case studies......, which were carried out in Denmark in 2008-2009 using qualitative research methods, revealed changes in the sequence, divisibility and repetitiveness of a number of recruitment tasks and subtasks. The new recruitment process design was identified and presented in the paper. The study concluded......

This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.

Genomic signal processing (GSP) can be defined as the analysis, processing, and use of genomic signals to gain biological knowledge, and the translation of that knowledge into systems-based applications that can be used to diagnose and treat genetic diseases. Situated at the crossroads of engineering, biology, mathematics, statistics, and computer science, GSP requires the development of both nonlinear dynamical models that adequately represent genomic regulation, and diagnostic and therapeutic tools based on these models. This book facilitates these developments by providing rigorous mathema

This book presents the flow diagram of water and wastewater processing, covering devices for water processing and for wastewater processing; water quality properties such as the measurement of wastewater pollution, theoretical oxygen demand and chemical oxygen demand; processing kinetics such as zero-order reactions and enzyme reactions; physical processing of water and wastewater; chemical processing of water and wastewater, including neutralization and buffering effects; biological processing of wastewater; ammonia removal; and sludge processing.
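The zero-order and enzyme kinetics mentioned above are linked by the Michaelis-Menten rate law, which reduces to a constant (zero-order) rate when substrate is abundant. A minimal sketch, with hypothetical parameters:

```python
def michaelis_menten_rate(s, vmax, km):
    """Substrate removal rate v = Vmax*S/(Km+S): near-constant
    (zero-order) when S >> Km, roughly first-order when S << Km."""
    return vmax * s / (km + s)

# Hypothetical parameters, for illustration only.
vmax, km = 2.0, 0.5
r_high = michaelis_menten_rate(100.0, vmax, km)  # close to vmax (zero-order regime)
r_half = michaelis_menten_rate(km, vmax, km)     # exactly vmax / 2 at s == km
```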

Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent. The theoret......
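As a minimal sketch of the adsorption-model selection step, the Langmuir isotherm is a common first choice for describing binding in a porous adsorbent; the capacity and affinity parameters below are hypothetical, not taken from the abstract.

```python
def langmuir_q(c, qmax, k):
    """Equilibrium adsorbed concentration under the Langmuir isotherm:
    q* = qmax * K * c / (1 + K * c)."""
    return qmax * k * c / (1.0 + k * c)

# Hypothetical parameters: qmax = 50 (capacity), K = 2 (affinity).
q_dilute = langmuir_q(0.01, 50.0, 2.0)  # near-linear (Henry) region
q_sat = langmuir_q(100.0, 50.0, 2.0)    # approaches qmax at saturation
```

In dilute conditions the isotherm is nearly linear with slope qmax*K, while at high concentration the bound phase saturates at qmax; both regimes matter when choosing a model for scale-up.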

When we look at successful sales processes occurring in practice, we find they combine two techniques which have been studied separately in the literature. Recommender systems are used to suggest additional products or accessories to include in the bundle under consideration, and

The Process Experimental Pilot Plant (PREPP) at the Idaho National Engineering Laboratory (INEL) was built to convert transuranic contaminated solid waste into a form acceptable for disposal at the Waste Isolation Pilot Plant (WIPP), located near Carlsbad, New Mexico. There are about 2.0 million cubic ft of transuranic waste stored at the Transuranic Storage Area of the INEL's Radioactive Waste Management Complex (RWMC). The Stored Waste Examination Pilot Plant (SWEPP) located at the RWMC will examine this stored transuranic waste to determine if the waste is acceptable for direct shipment to and storage at WIPP, or if it requires shipment to PREPP for processing before shipment to WIPP. The PREPP process shreds the waste, incinerates the shredded waste, and cements (grouts) the shredded incinerated waste in new 55-gal drums. Unshreddable items are repackaged and returned to SWEPP. The process off-gas is cleaned prior to its discharge to the atmosphere, and complies with the effluent standards of the State of Idaho, EPA, and DOE. Waste liquid generated is used in the grouting operation.

The invention is a pervaporation process and pervaporation equipment, using a series of membrane modules, and including inter-module reheating of the feed solution under treatment. The inter-module heating is achieved within the tube or vessel in which the modules are housed, thereby avoiding the need to repeatedly extract the feed solution from the membrane module train.

Process algebra is a theoretical framework for the modelling and analysis of the behaviour of concurrent discrete event systems that has been developed within computer science over the past quarter century. It has generated a deeper understanding of the nature of concepts such as observable behaviour in

Understanding how students translate between mathematical representations is of both practical and theoretical importance. This study examined students' processes in their generation of symbolic and graphic representations of given polynomial functions. The purpose was to investigate how students perform these translations. The result of the study…

In this paper we introduce an instance of the well-known Neyman-Scott cluster process model with clusters having long-tailed behaviour. In our model the offspring points are distributed around the parent points according to a circular Cauchy distribution. Using a modified Cramér-von Mises test...
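A planar Neyman-Scott process with heavy-tailed offspring can be simulated in a few lines. The sketch below scatters Poisson-distributed offspring around uniform parents using a half-Cauchy radius with a uniform angle: one simple way to obtain a heavy-tailed circular offspring law, and an assumption on our part rather than the paper's exact construction.

```python
import math
import random

def neyman_scott_cauchy(n_parents, mean_offspring, scale, window=1.0, seed=1):
    """Sketch of a planar Neyman-Scott cluster process: uniform parents,
    Poisson offspring counts, offspring scattered at a half-Cauchy
    distance (heavy-tailed) in a uniformly random direction."""
    rng = random.Random(seed)
    points = []
    for _ in range(n_parents):
        px, py = rng.uniform(0, window), rng.uniform(0, window)
        # Poisson count by CDF inversion (adequate for small means).
        k, p, target = 0, math.exp(-mean_offspring), rng.random()
        cum = p
        while cum < target:
            k += 1
            p *= mean_offspring / k
            cum += p
        for _ in range(k):
            r = abs(scale * math.tan(math.pi * (rng.random() - 0.5)))  # half-Cauchy radius
            theta = rng.uniform(0, 2 * math.pi)
            points.append((px + r * math.cos(theta), py + r * math.sin(theta)))
    return points

pts = neyman_scott_cauchy(n_parents=20, mean_offspring=5, scale=0.01)
```

Because the Cauchy radius has no finite mean, occasional offspring land far outside the parent window, which is exactly the long-tail behaviour the model is meant to exhibit.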

Computer simulation attempts to "mimic" real-life or hypothetical behavior on a computer to see how processes or systems can be improved and to predict their performance under different circumstances. Simulation has been successfully applied in many disciplines and is considered to be a relevant and

Similarity underlies fundamental cognitive capabilities such as memory, categorization, decision making, problem solving, and reasoning. Although recent approaches to similarity appreciate the structure of mental representations, they differ in the processes posited to operate over these representations. We present an experiment that…

.... The result is a process that continues to lack the ability to clarify objectives, chains of command, and policy implementation plans. Insights from organizational behavior theory reveal that some of the IAP's sub-optimal performance and irrational behavior are rooted in bureaucratic bargaining and decisions.

Web services have a potential to enhance B2B ecommerce over the Internet by allowing companies and organizations to publish their business processes on service directories where potential trading partners can find them. This can give rise to new business paradigms based on ad-hoc trading relations

For thermonuclear power reactors based on the continuous fusion of deuterium and tritium the principal fuel processing problems occur in maintaining desired compositions in the primary fuel cycled through the reactor, in the recovery of tritium bred in the blanket surrounding the reactor, and in the prevention of tritium loss to the environment. Since all fuel recycled through the reactor must be cooled to cryogenic conditions for reinjection into the reactor, cryogenic fractional distillation is a likely process for controlling the primary fuel stream composition. Another practical possibility is the permeation of the hydrogen isotopes through thin metal membranes. The removal of tritium from the ash discharged from the power system would be accomplished by chemical procedures to assure physiologically safe concentration levels. The recovery process for tritium from the breeder blanket depends on the nature of the blanket fluids. For molten lithium the only practicable possibility appears to be permeation from the liquid phase. For molten salts the process would involve stripping with inert gas followed by chemical recovery. In either case extremely low concentrations of tritium in the melts would be desirable to maintain low tritium inventories, and to minimize escape of tritium through unwanted permeation, and to avoid embrittlement of metal walls. 21 refs

This chapter aims to provide concrete guidance in redesigning business processes. Two alternative methods are described, both of them suitable to boost business performance. The first one is based on a collection of best practices, as applied in various redesign projects. These best practices all

The application of biotechnological principles in mineral processing, especially in hydrometallurgy, has created new opportunities and challenges for these industries. During the 1950s and 60s, mining wastes and unused complex mineral resources were successfully treated in bacterially assisted heap and dump leaching processes for copper and uranium. The interest in bioleaching processes is a consequence of the economic advantages associated with these techniques. For example, copper can be produced from mining wastes for about one-third to one-half of the cost of producing copper by the conventional smelting process from high-grade sulfide concentrates. The economic viability of bioleaching technology led to its worldwide acceptance by the extractive industries. During the 1970s this technology grew into a more structured discipline called 'biohydrometallurgy'. Currently, bioleaching techniques are ready to be used, in addition to copper and uranium, for the extraction of cobalt, nickel, zinc and precious metals, and for the desulfurization of high-sulfur pyritic coals. As a developing technology, the microbiological leaching of the less common and rare metals has yet to reach commercial maturity. However, research in this area is very active. In addition, in the foreseeable future biotechnological methods may also be applied to the treatment of high-grade ores and mineral concentrates using adapted native and/or genetically engineered microorganisms. (author)

Radio frequency (RF) heating is a commonly used food processing technology that has been applied for drying and baking as well as thawing of frozen foods. Its use in pasteurization, as well as for sterilization and disinfection of foods, is more limited. This column will review various RF heating ap...

Video processing source code for algorithms and tools used in software media pipelines (e.g. image scalers, colour converters, etc.). The currently available source code is written in C++, with the associated libraries and DirectShow filters.

We present a process algebra for modeling and reasoning about Mobile Ad hoc Networks (MANETs) and their protocols. In our algebra we model the essential concepts of ad hoc networks, i.e. local broadcast, connectivity of nodes, and connectivity changes. Connectivity and

Field or frame memories are often used in television receivers for video signal processing functions, such as noise reduction and/or flicker reduction. Television receivers also have graphic features such as teletext, menu-driven control systems, multilingual subtitling, an electronic TV-Guide, etc.

Electron beam processing systems (EPS) are useful and powerful tools in many industrial application fields, such as the production of cross-linked wire, rubber tires, heat-shrinkable film and tubing, curing, degradation of polymers, sterilization, and environmental applications. In this paper, the features and application fields, the selection of machine ratings, and the safety measures of EPS will be described. (author)

In the current economy, a shift can be seen from stand-alone business organizations to networks of tightly collaborating business organizations. To allow this tight collaboration, business process management in these collaborative networks is becoming increasingly important. This paper discusses

This paper presents a neuropsychosocial model of information processing to explain a victimization experience, specifically child sexual abuse. It surveys the relation of sensation, perception, and cognition as a systematic way to provide a framework for studying human behavior and describing human response to traumatic events. (Author/JDD)

At present, innovation is spoken of as an engine of the world economy because innovations are transforming not only individual business entities but whole industries. Innovation has become a necessity for business entities in order to survive in volatile, challenging markets; in this way, innovation is a driving force of company performance. The problem that arises is how to measure an innovation's effect on the financial performance of a company, or how to select between two or more possible variants of an innovation's realization. Authors who focus on innovation processes divide into two groups in their attitudes toward the influence of innovation on the financial performance of companies. One group holds that no reliable measurement is possible or efficient. The second group presents methods theoretically applicable to this measurement, but bases its approaches mostly on methods for measuring investment effectiveness, or suggests indicators or ratios that are not clearly connected with the outcome of the innovation process. The aim of the submitted article is to compare different approaches to the evaluation of innovation processes. The authors compare various approaches and, by use of analysis and synthesis, determine their own method for measuring the outcome of an innovation process.

We extend the Gaussian process (GP) framework for bounded regression by introducing two bounded likelihood functions that model the noise on the dependent variable explicitly. This is fundamentally different from the implicit noise assumption in the previously suggested warped GP framework. We ... with the proposed explicit noise-model extension.
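For readers unfamiliar with the GP framework being extended, a minimal sketch of standard GP regression (the unbounded baseline, not the abstract's bounded-likelihood extension) follows; the RBF kernel, noise level, and sine data are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    """Squared-exponential covariance between two 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise_std=0.1):
    """Posterior mean of a zero-mean GP with Gaussian observation noise."""
    K = rbf_kernel(x_train, x_train) + noise_std**2 * np.eye(len(x_train))
    K_star = rbf_kernel(x_test, x_train)
    return K_star @ np.linalg.solve(K, y_train)

# Fit sine data and predict at one test point.
x = np.linspace(0.0, 5.0, 20)
y = np.sin(x)
mean = gp_posterior_mean(x, y, np.array([2.5]))
```

The bounded extension referenced above replaces the implicit Gaussian noise with an explicit bounded likelihood, which changes the (no longer closed-form) posterior but not this overall structure.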

Deep inelastic (hard) processes are now at the epicenter of modern high-energy physics. These processes are governed by short-distance dynamics, which reveals the intrinsic structure of elementary particles. The theory of deep inelastic processes is now sufficiently well settled. The authors' aim was to give an effective tool to theoreticians and experimentalists who are engaged in high-energy physics. This book is intended primarily for physicists who are only beginning to study the field. To read the book, one should be acquainted with the Feynman diagram technique and with some particular topics from elementary particle theory (symmetries, dispersion relations, Regge pole theory, etc.). Theoretical consideration of deep inelastic processes is now based on quantum chromodynamics (QCD). At the same time, analysis of relevant physical phenomena demands a synthesis of QCD notions (quarks, gluons) with certain empirical characteristics. Therefore, the phenomenological approaches presented are a necessary stage in a study of this range of phenomena which should undoubtedly be followed by a detailed description based on QCD and electroweak theory. The authors were naturally unable to dwell on experimental data accumulated during the past decade of intensive investigations. Priority was given to results which allow a direct comparison with theoretical predictions. (Auth.)

The data processing in two telecommunication market investigations is described. One of the studies concerns the office applications of communication and the other the experiences with a videotex terminal. Statistical factorial analysis was performed on a large mass of data. A comparison between utilization intentions and effective utilization is made. Extensive rewriting of statistical analysis computer programs was required.

The final assessment of a course must reflect its goals and contents. An important goal of our introductory programming course is that the students learn a systematic approach to the development of computer programs. Having the programming process as a learning objective naturally raises the ques...

This article presents the results of an exploratory study asking faculty in the first-year writing program and instruction librarians about their research process focusing on results specifically related to serendipity. Steps to prepare for serendipity are highlighted as well as a model for incorporating serendipity into a first-year writing…

Microwaves are a common appliance in many households. In the United States, microwave heating is the third most popular domestic method of heating food. Microwave heating is also a commercial food processing technology that has been applied for cooking, drying, and tempering foods. Its use in ...

Certain problems in physics and chemistry lead to the definition of a class of stochastic processes. Although they are not Markovian they can be treated explicitly to some extent. In particular, the probability distribution for large times can be found. It is shown to obey a master equation. This

We compare r-process calculations with recent astronomical observations from the solar system and from ultra-metal-poor, neutron-capture-rich halo stars. These measurements include elemental as well as isotopic r-abundances. We deduce astrophysical conditions under which the observed r-patterns can be obtained, and derive criteria to determine Th/U chronometric ages. (orig.)
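A Th/U chronometric age follows from comparing a predicted r-process production ratio with an observed abundance ratio under two-clock radioactive decay. The sketch below uses the standard 232Th and 238U half-lives; the production and observed ratios are made-up illustrative inputs, not values from the paper.

```python
import math

# Half-lives in Gyr (standard values for 232Th and 238U).
T_HALF_TH232 = 14.05
T_HALF_U238 = 4.468

def thorium_uranium_age(ratio_initial, ratio_observed):
    """Age (Gyr) inferred from the decay of an initial Th/U abundance ratio.

    (Th/U)_obs = (Th/U)_0 * exp(-(lambda_Th - lambda_U) * t)
    Because 238U decays faster than 232Th, the ratio grows with time.
    """
    lam_th = math.log(2) / T_HALF_TH232
    lam_u = math.log(2) / T_HALF_U238
    return math.log(ratio_initial / ratio_observed) / (lam_th - lam_u)

# Hypothetical numbers: production ratio 0.9, observed ratio 2.0.
age = thorium_uranium_age(0.9, 2.0)  # roughly 7.5 Gyr for these inputs
```

Real chronometry must fold in the uncertainty of the production ratio, which dominates the error budget; this sketch shows only the arithmetic.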

This process comprises centrifuging in the presence of a solvent which dissolves the oil and precipitates the paraffins, with the addition of an auxiliary liquid whose surface tension with respect to the oil solution does not exceed 50 dynes/cm.

This paper demonstrates, how cross-functional business processes may be aligned with product specification systems in an intra-organizational environment by integrating planning systems and expert systems, thereby providing an end-to-end integrated and an automated solution to the “build-to-order...

The area of application of radiation techniques is very wide. This paper relates only to the applications of radiation techniques in industry, including the radiation chemical industry, radiation processing of foods, and environmental protection by radiation; nuclear instruments and radiation instrumentation are outside the scope of our study. (author)

The awareness of the ideas characterized by Communicating Processes Architecture and their adoption by industry beyond their traditional base in safety-critical systems and security is growing. The complexity of modern computing systems has become so great that no one person – maybe not even a small

The analytical treatment of hadronic decay cascades within the framework of the statistical bootstrap model is demonstrated on the basis of a simple variant. Selected problems for a more comprehensive formulation of the model such as angular momentum conservation, quantum statistical effects, and the immediate applicability to particle production processes at high energies are discussed in detail

... many processes and problems contribute to APD in children. In adults, neurological disorders such as stroke, tumors, degenerative disease (such as multiple sclerosis), and head trauma can contribute to APD. APD in children and adults often is best managed by a ...

A process control device comprises a memory device for storing a plant operation target, a plant state, or a state of mutually related equipment as control data; a read-only memory device for storing programs; a plant instrumentation control device or other process control devices; an input/output device for interacting with an operator; and a processing device which conducts processing in accordance with the program and sends a control demand or a display demand to the input/output device. The program reads out control data relevant to a predetermined operation target, compares and verifies them against actual values, and reads out further control data serving as practice premise conditions where necessary, thereby automatically controlling the plant or requesting and displaying input. Practice premise conditions for the operation target can thus be examined successively in accordance with the program, without constructing complicated logic diagrams and AND/OR graphs. (N.H.)
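The read-compare-verify cycle described in this record can be sketched in outline. Every name, value, and tolerance below is hypothetical, since the actual logic is defined by the device's stored program; the point is the flow: check premise conditions against actual values, then emit either a control demand or a display (operator-input) demand.

```python
def check_premises(control_data, actual_values, tolerance=0.05):
    """True if every actual value is within a relative tolerance of its target."""
    return all(
        abs(actual_values[key] - target) <= tolerance * abs(target)
        for key, target in control_data.items()
    )

def control_cycle(control_data, actual_values):
    """One pass of the cycle: verify premises, then demand control or display."""
    if check_premises(control_data, actual_values):
        return ("control_demand", control_data)      # proceed automatically
    return ("display_demand", "operator input required")

# Hypothetical operation target and measured plant state.
demand, payload = control_cycle(
    {"drum_level_m": 2.0, "pressure_mpa": 7.0},
    {"drum_level_m": 1.98, "pressure_mpa": 7.1},
)
```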

... to calculate the requirements of heat processing. Our goal is to put food engineering into a production context. Other courses teach food chemistry, food microbiology and food technology. These topics are of great importance, and all have to be seen in the broader context of producing good and safe food at large scale...

The use of femtosecond pulses for materials processing results in very precise cutting and drilling with high efficiency. Energy deposited in the electrons is not coupled into the bulk during the pulse, resulting in negligible shock or thermal loading of adjacent areas

Office of Scientific Research Grant AFOSR F49620 82 C 0009. Period: 1 November 1981 through 31 October 1982. Title: Research in Stochastic Processes. Co... STAMATIS CAMBANIS. The work briefly described here was developed in connection with problems arising from and related to statistical communication

Obsolescence, defined as the process of declining performance of buildings, is a serious threat for the value, the usefulness and the life span of housing properties. Thomsen and van der Flier (2011) developed a model in which obsolescence is categorised on the basis of two distinctions, namely

In learning genetics, many students misunderstand and misinterpret what "dominance" means. Understanding is easier if students realize that dominance is not a mechanism, but rather a consequence of underlying cellular processes. For example, metabolic pathways are often little affected by changes in enzyme concentration. This means that…

Historically, education employees have been hired after a process that consists of these steps: determining the need for a position, posting the vacancy, paper-screening applications, interviewing with a panel or committee, checking backgrounds, calling references, and finally selecting a candidate. This is a very time-consuming and costly…

The real time process algebra of Baeten and Bergstra [Formal Aspects of Computing, 3, 142-188 (1991)] is extended to real space by requiring the presence of spatial coordinates for each atomic action, in addition to the required temporal attribute. It is found that asynchronous communication

Although simulation is typically considered as relevant and highly applicable, in reality the use of simulation is limited. Many organizations have tried to use simulation to analyze their business processes at some stage. However, few are using simulation in a structured and effective manner. This

Teaches the need for refrigerant mixtures, the type of mixtures that can be used for different refrigeration and liquefaction applications, the different processes that can be used and the methods to be adopted for choosing the components of a mixture and their concentration for different applications.

The ionizing radiations available for food processing are defined, their mode of action and principal effects are described. Toxicological studies (animal tests, radiochemistry) concerning irradiated food are reviewed. The characteristics of the irradiation procedure and the prospects of its industrial development in France are presented [fr

... It is concluded that the ISP model does not fully comply with group members' problem-solving process and the involved information-seeking behavior. Further, complex academic problem solving seems to be even more complex when it is performed in a group-based setting. The study contributes a new conceptual...

... by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in business model studies (e.g. definitions, configurations, classifications), we adopted the analytical induction...

A dual layer metallization process is studied using a tungsten-10% titanium/molybdenum sandwich (TiW/Mo) first metal with an Al/0.5% Cu second metal. This metallization process has: (1) very reliable shallow junction contacts without junction spiking, (2) very high electromigration resistance, and (3) a very smooth, defect-free surface throughout the process. Contact resistances of 50 and 30 ohm-um2 for P- and N-type silicon, respectively, are achieved. The TiW/Mo film stress is studied and an optimum condition for low compressive stress is defined. The TiW/Mo is etched using a corrosion-free etch process. Electromigration data are presented showing TiW/Mo to be at least an order of magnitude better than Al/Si. The intermetal oxide layer is a planarized sandwich of LTO/SOG/LTO providing a smooth, positive-slope surface for Metal 2. Metal 1/Metal 2 via resistances are studied, with values of 1.25 ohm-um2 obtained
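Specific contact resistivity quoted in ohm-um2, as in this record, converts to an actual contact resistance by dividing by contact area (R = rho_c / A, neglecting current crowding). A small sketch, where the 1 um x 1 um contact size is an illustrative assumption, not a dimension from the paper:

```python
def contact_resistance(rho_c_ohm_um2, width_um, length_um):
    """Contact resistance (ohms) from specific contact resistivity and area.

    R = rho_c / A; valid when current crowding in the contact is negligible.
    """
    return rho_c_ohm_um2 / (width_um * length_um)

# Quoted resistivities: 50 ohm-um^2 (P-type), 30 ohm-um^2 (N-type).
# For a hypothetical 1 um x 1 um contact these give 50 and 30 ohms.
r_p = contact_resistance(50.0, 1.0, 1.0)
r_n = contact_resistance(30.0, 1.0, 1.0)
```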

How does moving across geographical borders affect the relationships of diaspora members both here – in the country of residence – and there – in the country of origin? The article delineates some of these processes through the gendered experiences of young adults perceived as active actors based...

This chapter discusses the application of nuclear technology in the agriculture sector. Nuclear technology has helped agriculture and food processing to develop tremendously. Two techniques widely used in both clusters are ionizing radiation and radioisotopes. Among the techniques using ionizing radiation are plant mutation breeding, SIT (sterile insect technique), and food preservation. Meanwhile, radioisotopes are used as tracers for animal research, plant-soil relations, and water sedimentology

Ionizing radiation is now showing new promise of contributing to the feeding of hungry populations, solving solid and liquid waste disposal problems, providing safer medical supplies, pharmaceuticals, and other commodities, and at the same time reducing energy consumption in industrial processing in general. (author)

This four-volume set, edited by leading experts in soil science, brings together in one collection a series of papers that have been fundamental to the development of soil science as a defined discipline. This volume 2, on Soil Properties and Processes, covers: - Soil physics - Soil (bio)chemistry -

The growing field of applications of plasma as deposition, etching, surface modification and chemical conversion has stimulated a renewed interest in plasma science in the atomic physical chemistry regime. The necessity to optimize the various plasma processing techniques in terms of rates, and

A system of heat exchangers is disclosed for cooling process fluids. The system is particularly applicable to cooling steam generator blowdown fluid in a nuclear plant prior to chemical purification of the fluid; it minimizes the potential for boiling of the plant cooling water that cools the blowdown fluid

Business process reengineering changes organizational functions from an orientation focused on operations to a multidimensional approach. Former employees who were mere executors are now expected to take their own decisions, and as a result the functional departments lose their reason to exist. Managers no longer act as supervisors but mainly as mentors, while employees focus more attention on customer needs and less on those of their superiors. Under these conditions, new organizational paradigms are required, the most important being that of the learning organization. Information technology plays a decisive role in implementing a reengineering of economic processes and in promoting a new organizational paradigm. The article presents some results obtained in an ANSTI research theme funded by contract no. 501/2000. Economic and financial analysis is performed in order to know the current situation and achieve better results in the future. One of its objectives is production, analyzed as a labour process together with the interacting elements of that process. The indicators investigated in the analysis of the financial and economic activity of production reflect the directions of development and the means and resources for accomplishing predetermined objectives, and they express the results and effectiveness of what is expected.

The importance of Canada's natural gas industry to remain competitive on a global level was discussed. Third party processing is a tool that the Canadian gas industry can use to overcome the relative disadvantage of smaller, and therefore more expensive, gas processing plants in Canada, and to maintain, and even improve, its competitive position vis-a-vis its US counterparts. The principal role of a third party processor is to provide midstream services such as raw gas gathering, field compression, gas processing, sales gas transmission, and natural gas liquids fractionation. Some third party processors also provide marketing services. Third party processors add value to the gas producer by reducing risk, reducing cost, improving reliability, and improving netbacks. The many variables involved in determining the economic viability of third party processing, including the quantity and deliverability of the raw resource, facility capacity, capital investment, operating costs, technology, fee structures, operational reliability, and speed, among others, were examined and the significance of each variable was explained
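The claim that third party processors "improve netbacks" rests on simple arithmetic: the producer's netback is the sales price less the midstream charges. A sketch follows; all fee levels are hypothetical illustrations, not industry figures.

```python
def producer_netback(sales_price, processing_fee, gathering_fee, transport_fee):
    """Netback per unit of gas: sales price less midstream charges.

    All figures are per GJ; the numbers used below are purely illustrative.
    """
    return sales_price - processing_fee - gathering_fee - transport_fee

# Hypothetical example: $4.00/GJ sales price against $1.25/GJ of total
# midstream charges leaves a netback of $2.75/GJ.
nb = producer_netback(4.00, 0.60, 0.25, 0.40)
```

A third party processor improves the netback when its combined fees are lower than the producer's cost of owning and operating equivalent (smaller, more expensive) facilities.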

To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

Fragmented medial coronoid process (FCP) is often considered to be part of the osteochondrosis dissecans complex, but trauma and growth discrepancies between the radius and ulna have been proposed as causes. There is little to clinically differentiate FCP from osteochondrosis dissecans (OCD) of the elbow. Pain on flexion-extension of the elbow and lateral rotation of the paw is a little more consistent in FCP. Radiographic examination of the elbow is important despite the fact that radiographic signs of FCP are often nonspecific. Excessive osteoarthrosis and superimposition of the radial head and coronoid process make identification of the FCP difficult. Craniocaudal, flexed mediolateral and 25-degree craniocaudal-lateromedial views are necessary for diagnosis. Osteophyte production is more dramatic with FCP than with OCD and therefore suggests the occurrence of FCP in many cases. Although the detached process may be seen on any view, the oblique projection offers the least obstructed view. Exposure of the joint is identical to that for OCD, i.e. a medial approach with osteotomy of the epicondyle. In most cases the process is loose enough to be readily apparent, but in some it is necessary to exert force on the process in order to find the cleavage plane. It is necessary to remove the osteophytes as well, and to inspect and irrigate the joint carefully to remove cartilage fragments before closure. Confinement is advisable for 4 weeks before returning the dog to normal activity. The outlook for function is good if the FCP is removed before secondary degenerative joint disease is well established

The evolution of vaccines (e.g., live attenuated, recombinant) and vaccine production methods (e.g., in ovo, cell culture) are intimately tied to each other. As vaccine technology has advanced, the methods to produce the vaccine have advanced and new vaccine opportunities have been created. These technologies will continue to evolve as we strive for safer and more immunogenic vaccines and as our understanding of biology improves. The evolution of vaccine process technology has occurred in parallel to the remarkable growth in the development of therapeutic proteins as products; therefore, recent vaccine innovations can leverage the progress made in the broader biotechnology industry. Numerous important legacy vaccines are still in use today despite their traditional manufacturing processes, with further development focusing on improving stability (e.g., novel excipients) and updating formulation (e.g., combination vaccines) and delivery methods (e.g., skin patches). Modern vaccine development is currently exploiting a wide array of novel technologies to create safer and more efficacious vaccines including: viral vectors produced in animal cells, virus-like particles produced in yeast or insect cells, polysaccharide conjugation to carrier proteins, DNA plasmids produced in E. coli, and therapeutic cancer vaccines created by in vitro activation of patient leukocytes. Purification advances (e.g., membrane adsorption, precipitation) are increasing efficiency, while innovative analytical methods (e.g., microsphere-based multiplex assays, RNA microarrays) are improving process understanding. Novel adjuvants such as monophosphoryl lipid A, which acts on antigen presenting cell toll-like receptors, are expanding the previously conservative list of widely accepted vaccine adjuvants. As in other areas of biotechnology, process characterization by sophisticated analysis is critical not only to improve yields, but also to determine the final product quality. From a regulatory

When designing process-aware information systems, often variants of the same process have to be specified. Each variant then constitutes an adjustment of a particular process to specific requirements building the process context. Current Business Process Management (BPM) tools do not adequately

In the quest for knowledge about how to make good process models, recent research focus is shifting from studying the quality of process models to studying the process of process modeling (often abbreviated as PPM) itself. This paper reports on our efforts to visualize this specific process in such

This chapter gives an overview of the fundamentals of process intensification from a process systems engineering point of view. The concept of process intensification, including process integration, is explained together with the drivers for applying process intensification, which can be achieved...

A combined electrolysis catalytic exchange (CECE) facility employing a liquid phase catalytic exchange (LPCE) column for water detritiation is currently being studied at the tritium treatment laboratory at KAERI; the same approach is adopted for water detritiation at JET, ITER, and TLK (Tritium Laboratory Karlsruhe). The CECE facility is needed for process performance studies, to support the design of the Water Detritiation System for the nuclear industry, such as nuclear reactor operations, radioisotope production, and medical research. The design and commissioning of this demonstration-scale facility to investigate the achievable tritium decontamination factor are ongoing. A simple schematic of the CECE process is shown in Figure 1. The key elements of the facility are an electrolyser for conversion of tritiated water to gaseous hydrogen, and an LPCE column where this hydrogen is detritiated via isotopic exchange with liquid water
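The tritium decontamination factor (DF) such a facility is designed to investigate is simply the ratio of inlet to outlet activity. A common idealization treats the LPCE column as a cascade of equilibrium stages whose per-stage separation factor compounds multiplicatively; the sketch below uses that idealization with hypothetical numbers (a real column's per-stage factor depends on catalyst, temperature, and flow ratios).

```python
import math

def decontamination_factor(activity_in, activity_out):
    """Overall DF: ratio of tritium activity entering to leaving the process."""
    return activity_in / activity_out

def stages_required(df_target, stage_separation_factor):
    """Ideal equilibrium-stage estimate, assuming DF = alpha ** N per column."""
    return math.ceil(math.log(df_target) / math.log(stage_separation_factor))

# Hypothetical figures: a target DF of 1000 with a per-stage factor of 1.5
# calls for 18 ideal stages under this idealization.
n_stages = stages_required(1000.0, 1.5)
```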

Modern computing, as well as the historical development of computing, has been dominated by sequential monoprocessing. Yet there is the alternative of parallelism, where several processes may be in concurrent execution. This alternative is discussed in a series of lectures, in which the main developments involving parallelism are considered, both from the standpoint of computing systems and that of applications that can exploit such systems. The lectures seek to discuss parallelism in a historical context, and to identify all the main aspects of concurrency in computation right up to the present time. Included will be consideration of the important question as to what use parallelism might be in the field of data processing. (orig.)
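The sequential-versus-parallel contrast these lectures discuss can be shown with a toy data-processing example from Python's standard library: the same summation split into chunks executed by worker threads. (In CPython the GIL limits true CPU parallelism for threads; the concurrent structure, not the speedup, is the point here, and the example is an illustration rather than anything from the lectures.)

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    """Sum the integers in the half-open range [lo, hi)."""
    lo, hi = bounds
    return sum(range(lo, hi))

# Split one large sum into four independent chunks.
chunks = [(0, 250_000), (250_000, 500_000),
          (500_000, 750_000), (750_000, 1_000_000)]

# The chunks may be processed concurrently; the result equals the
# sequential sum(range(1_000_000)) regardless of execution order.
with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(partial_sum, chunks))
```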

Utilizing the comment section of patient satisfaction surveys, Clark Memorial Hospital in Jeffersonville, IN went through a thoughtful process to arrive at an experience that patients said they wanted. Two Lean Six Sigma tools were used--the Voice of the Customer (VoC) and the Affinity Diagram. Even when using these tools, a facility will not be able to accomplish everything the patient may want. Guidelines were set and rules were established for the Process Improvement Team in order to lessen frustration, increase focus, and ultimately be successful. The project's success is driven by the team members carrying its message back to their areas. It's about ensuring that everyone is striving to improve the patients' experience by listening to what they say is being done right and what they say can be done better. And then acting on it.

The Electric Power Research Institute (EPRI) manages research for its sponsoring electric utilities in the United States. Research in the area of low level radioactive waste (LLRW) from light water reactors focuses primarily on waste processing within the nuclear power plants, monitoring of the waste packages, and assessments of disposal technologies. Accompanying these areas and complementary to them is the determination and evaluation of the sources of nuclear power plant radioactive waste. This paper focuses on source characterization of nuclear power plant waste, LLRW processing within nuclear power plants, and the monitoring of these wastes. EPRI's work in waste disposal technology is described in another paper in these proceedings by the same author. 1 reference, 5 figures

... as a major neurotransmitter in the central and peripheral nervous systems. The tissue-specific expression of the hormones is regulated at the transcriptional level, but the posttranslational phase is also decisive, and is highly complex in order to ensure accurate maturation of the prohormones in a cell-specific manner. Despite the structural similarities of gastrin and CCK, there are decisive differences in the posttranslational processing and secretion schemes, suggesting that specific features in the processing may have evolved to serve specific purposes. For instance, CCK peptides circulate in low picomolar concentrations, whereas gastrin is expressed at higher levels, and accordingly gastrin circulates in 10-20-fold higher concentrations. Both common cancers and the less frequent neuroendocrine tumors express the gastrin gene and prohormone. But the posttranslational...

Processing of Serbian inflected verbs was investigated in two lexical decision experiments. In the first experiment, subjects were presented with five forms of the future tense, while in the second experiment the same verbs were presented in three forms of the present and future tenses. The outcome of the first experiment indicates that processing of an inflected verb is determined by the amount of information derived from the average probability per congruent personal pronoun of a particular verb form. This implies that the cognitive system is not sensitive to verb person per se, nor to the gender of the congruent personal pronoun. Results of the second experiment show that for verb forms of different tenses presented in the same experiment, the amount of information has to be additionally modulated by tense probability. Such an outcome speaks in favor of the cognitive relevance of verb tense.
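The "amount of information" derived from a probability is conventionally the surprisal, -log2 p, and surprisals of independent events add, which matches the "additionally modulated by tense probability" finding. Assuming that is the measure intended, a sketch with made-up probabilities:

```python
import math

def surprisal_bits(probability):
    """Amount of information (in bits) carried by an event of given probability."""
    return -math.log2(probability)

# Hypothetical example: a verb form whose congruent-pronoun probability is
# 0.25 carries 2 bits of information; modulating by a tense probability of
# 0.5 adds 1 more bit, because surprisals of independent events add.
info_form = surprisal_bits(0.25)
info_with_tense = info_form + surprisal_bits(0.5)
```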

Design and performance details of the advanced information processing system (AIPS) for fault and damage tolerant data processing on aircraft and spacecraft are presented. AIPS comprises several computers distributed throughout the vehicle and linked by a damage tolerant data bus. Most I/O functions are available to all the computers, which run in a TDMA mode. Each computer performs separate specific tasks in normal operation and assumes other tasks in degraded modes. Redundant software assures that all fault monitoring, logging and reporting are automated, together with control functions. Redundant duplex links and damage-spread limitation provide the fault tolerance. Details of an advanced design of a laboratory-scale proof-of-concept system are described, including functional operations.

This paper describes the improvements that have been implemented to enhance communications of a large Department of Energy site with state and local agencies. Through an aggressive off-site communications program, and with constant feedback from stakeholders, the site has established a clear line of communication that provides off-site agencies with timely and accurate information regarding its activities. This paper will discuss the implementation of the courtesy notification process, which takes into consideration the potential for media or public interest, and quarterly facility tours and briefings. This paper will include a historical timeline of events and incidents that have resulted in establishing the Off-site Communications Process and the demonstrated success in opening lines of communication with these off-site agencies. (author)

The last few years have seen a significant increase in the use of ionising radiation in industrial processes and in international trade in irradiated products. With this, the demand for internationally accepted dosimetric techniques, accredited to international standards, has also increased, further stimulated by the emergence of the ISO-9000 series of standards in industry. The present paper describes some of the important dosimetric techniques used in radiation processing, the role of the IAEA in evolving internationally accepted standards, and work carried out at the Defence Laboratories, Jodhpur on the development of a cheap, broad-dose-range and simple dosimeter for routine dosimetry. For this, the polyhydroxy alcohols mannitol, sorbitol and inositol were studied using a spectrophotometric readout method. Of the alcohols studied, mannitol was found to be the most promising, covering a dose range of 0.01 kGy - 100 kGy. (author). 26 refs., 3 figs., 1 tab

This paper presents the Logical Process Calculus (LPC), a formalism that supports heterogeneous system specifications containing both operational and declarative subspecifications. Syntactically, LPC extends Milner's Calculus of Communicating Systems with operators from the alternation-free linear-time mu-calculus (LT(mu)). Semantically, LPC is equipped with a behavioral preorder that generalizes Hennessy's and DeNicola's must-testing preorder as well as LT(mu)'s satisfaction relation, while being compositional for all LPC operators. From a technical point of view, the new calculus is distinguished by the inclusion of: (1) both minimal and maximal fixed-point operators and (2) an unimplementability predicate on process terms, which tags inconsistent specifications. The utility of LPC is demonstrated by means of an example highlighting the benefits of heterogeneous system specification.

A process is claimed for isotopic separation applied to isotopes of elements that can be put in at least one physicochemical form in which the isotopic atoms or the molecules containing these atoms can be easily displaced, and for which there are selective radiations preferentially absorbed by the isotopes of a certain type or by the molecules containing them, said absorption substantially increasing the probability of ionization of said atoms or molecules relative to the atoms or molecules that did not absorb the radiation. The process consists of placing the isotopic mixture in such a form, subjecting it in a separation zone to the selective radiations and to an electrical field that produces migration of positive ions toward the negative electrodes and negative ions toward the positive electrodes, and withdrawing from certain such zones the fractions thus enriched in certain isotopes

An under-researched issue in work within the `knowledge movement' is the relation between organizational issues and knowledge processes (i.e., sharing and creating knowledge). We argue that managers can shape formal organization structure and organization forms and can influence the more informal organizational practices in order to foster knowledge sharing and creation. Theoretically, we unfold this argument by relying on key ideas of organizational economics and organizational behaviour studies. We put forward a number of refutable propositions derived from this reasoning. Acknowledgments: We are grateful to Anna Grandori for numerous excellent comments on an earlier draft. The standard disclaimer applies. Keywords: knowledge creation, knowledge sharing, governance, organizational economics, organizational behavior.

In 2007 EU Regulation (EC) 834/2007 introduced principles and criteria for organic food processing. These regulations have been analysed and discussed in several scientific publications and research project reports. Recently, organic food quality was described by principles, aspects and criteria. These principles from organic agriculture were verified and adapted for organic food processing, and different levels for evaluation were suggested. In another document, underlying paradigms and consumer perception of organic food were reviewed against functional food, identifying integral product identity as the underlying paradigm and a holistic quality view connected to naturalness as consumers' perception of organic food quality. In a European study, the quality concept was applied to the organic food chain, revealing a problem: clear principles and related criteria were missing...

Darwin's principle of evolution by natural selection is readily cast into a mathematical formalism. Molecular biology revealed the mechanism of mutation and provides the basis for a kinetic theory of evolution that models correct reproduction and mutation as parallel chemical reaction channels. A result of the kinetic theory is the existence of a phase transition in evolution occurring at a critical mutation rate, which represents a localization threshold for the population in sequence space. Occurrence and nature of such phase transitions depend critically on fitness landscapes. The fitness landscape, being tantamount to a mapping from sequence or genotype space into phenotype space, is identified as the true source of complexity in evolution. Modeling evolution as a stochastic process is discussed, and neutrality with respect to selection is shown to provide a major challenge for understanding evolutionary processes (author)
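The critical mutation rate mentioned above can be illustrated with the standard single-peak ("error threshold") result from quasispecies theory. A minimal sketch, with illustrative values for the fitness superiority sigma and sequence length L (both are assumptions, not values from the text):

```python
import math

def error_threshold(sigma, L):
    """Critical per-site mutation rate on a single-peak fitness landscape:
    the master sequence stays localized while (1 - u)**L * sigma > 1,
    so the threshold is u_c = 1 - sigma**(-1/L)."""
    return 1.0 - sigma ** (-1.0 / L)

# Illustrative values: superiority sigma = 10, sequence length L = 100.
u_c = error_threshold(10.0, 100)
print(f"u_c = {u_c:.5f} (approximation ln(sigma)/L = {math.log(10.0) / 100:.5f})")
```

Above u_c the population delocalizes in sequence space, which is the phase transition the abstract refers to; the familiar approximation u_c ≈ ln(sigma)/L follows for long sequences.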

As sensor and computer technology continues to improve, it has become commonplace to confront high-dimensional data sets. As in many areas of industrial statistics, this brings various challenges to statistical process control (SPC) and monitoring, where the aim is to identify the “out-of-control” state of a process using control charts in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling’s T2. For high-dimensional data with an excessive amount of cross-correlation, practitioners are often recommended to use latent structure methods such as Principal Component Analysis to summarize the data in only a few linear combinations of the original variables that capture most of the variation in the data. Applications of these control charts...
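As a sketch of the monitoring statistic named above: Hotelling's T2 is the Mahalanobis distance of an observation from the in-control mean. The data, dimensions, and mean shift below are simulated assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# In-control reference data: n observations of p cross-correlated variables.
n, p = 200, 5
cov = 0.6 * np.ones((p, p)) + 0.4 * np.eye(p)   # heavy cross-correlation
X = rng.multivariate_normal(np.zeros(p), cov, size=n)

xbar = X.mean(axis=0)
S_inv = np.linalg.inv(np.cov(X, rowvar=False))

def hotelling_t2(x):
    """Hotelling's T2 distance of a new observation from the in-control mean."""
    d = x - xbar
    return float(d @ S_inv @ d)

in_control = np.zeros(p)        # on-target observation
shifted = np.full(p, 2.0)       # mean shift from an assignable cause
print(hotelling_t2(in_control), hotelling_t2(shifted))
```

An observation is flagged "out of control" when its T2 exceeds a control limit derived from the statistic's reference distribution; the shifted observation yields a far larger T2 than the on-target one.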

Purpose: To simplify the structure of a gas processing system, which has hitherto been much complicated by the recyclic use of molecular sieve regeneration gas, by enabling the regeneration gas to be released to the outside in a once-through manner. Constitution: The system comprises a cooler for receiving and cooling gases to be processed containing radioactive rare gases, moisture-removing pipelines each connected in parallel to the exit of the cooler and having switching valves and a moisture-removing column disposed between the valves, and a charcoal absorber in communication with the moisture-removing pipelines. Pipelines for flowing regeneration heating gases are separately connected to the moisture-removing columns, and molecular sieve is charged in the moisture-removing column in an amount depending on the types of the radioactive rare gases. (Aizawa, K.)

A naphtha fraction is subjected to a catalytic reforming process in several series-connected reactors. The first reactor is equipped with a moving catalyst bed containing not more than 30% by volume of the total catalyst amount. The other reactors are designed as packed-bed systems. The content of coke deposited on the catalyst of the first reactor owing to the reforming process is maintained below 1% by weight. This is effected by periodic removal of a proportion of the contaminated catalyst from the bottom part of the bed, its regeneration, and its re-feeding to the top part of the bed. This results in a prolonged service life of the catalyst and a simultaneous improvement of the anti-knock value of the product.

Nuclear technology such as γ-ray, electron beam and ion beam irradiation is widely used in the industrial, medical and agricultural fields. Radiation applications aim at increasing the welfare and quality of our life. Radiation technology applied to medical care is widely known through X-ray diagnosis, but the contribution of radiation processing to our daily life is not well known even though it is effectively used in industry and agriculture. The main radiation processing in industry is the modification of polymers, i.e. heat-shrinkable tube, radial tire, plastic foam, etc. in a car; heat-resistant wire and cable, semiconductor, floppy disk, etc. in a computer; and the sterilization of medical devices. In agriculture, radiation has been used in various fields such as food irradiation, the sterile insect technique and mutation breeding, contributing to the food supply and a sustainable environment. (author)

A new rapidly processable radiographic silver halide material is described for use in mammography and non-destructive testing of industrial materials. The radiographic material is used for direct exposure to penetrating radiation without the use of fluorescent-intensifying screens. It consists of a transparent support with a layer of hydrophilic colloid silver halide emulsion on one or both sides. Examples of the preparation of three different silver halide emulsions are given including the use of different chemical sensitizers. These new radiographic materials have good resistance to the formation of pressure marks in rapid processing apparatus and they have improved sensitivity for direct exposure to penetrating radiation compared to conventional radiographic emulsions. (U.K.)

This patent describes a process for purifying molybdenum containing arsenic and phosphorus. The process comprises: adding to an acidic slurry of molybdenum trioxide, a source of magnesium ions in a solid form, with the amount of magnesium and the magnesium ion concentration in the subsequently formed ammonium molybdate solution being sufficient to subsequently form insoluble compounds containing greater than about 80% by weight of the arsenic and greater than about 80% by weight of the phosphorus, and ammonia in an amount sufficient to subsequently dissolve the molybdenum and subsequently form the insoluble compounds, with the source of magnesium ions being added prior to the addition of the ammonia; digesting the resulting ammoniated slurry at a temperature sufficient to dissolve the molybdenum and form an ammonium molybdate solution while the pH is maintained at from about 9 to about 10 to form a solid containing the insoluble compounds; and separating the solid from the ammonium molybdate solution

The article deals with the combustion process for lump wood in low-power fireplaces (units to dozens of kW). Such a combustion process is cyclical in nature, and what combustion facility users are most interested in is the frequency at which fuel needs to be stoked into the fireplace. The paper defines basic terms such as the burnout curve and the burning rate curve, which are closely related to the stoking frequency. The fuel burning rate is directly dependent on the instantaneous thermal power of the fireplace. This is also related to the temperature achieved in the fireplace, the magnitude of flue gas losses, and the ability to generate conditions favouring the full burnout of the fuel's combustible component, which at once ensures the minimum production of combustible pollutants. Another part of the paper describes experiments conducted in traditional fireplaces with a grate, in which well-dried lump wood was combusted.

A pulse height analyzer terminal system, PHATS, has been developed for online data processing via the JAERI-TOKAI computer network. The system is controlled by a micro-computer, MICRO-8, which was developed for the JAERI-TOKAI network. The system program consists of two subprograms: the online control system ONLCS and the pulse height analyzer control system PHACS. ONLCS links the terminal with the conversational programming system of the FACOM 230/75 through the JAERI-TOKAI network and controls data processing in TSS and remote batch modes. PHACS is used to control the input/output of data between the pulse height analyzer and cassette-MT or typewriter. This report describes the hardware configuration and the system program in detail. In the appendix, the real-time monitor, message types, and the PEX-to-PEX and Host-to-Host protocols required for the system programming are explained. (author)

Over the past twenty years, astronomers have identified hundreds of extrasolar planets--planets orbiting stars other than the sun. Recent research in this burgeoning field has made it possible to observe and measure the atmospheres of these exoplanets. This is the first textbook to describe the basic physical processes--including radiative transfer, molecular absorption, and chemical processes--common to all planetary atmospheres, as well as the transit, eclipse, and thermal phase variation observations that are unique to exoplanets. In each chapter, Sara Seager offers a conceptual introduction, examples that combine the relevant physics equations with real data, and exercises. Topics range from foundational knowledge, such as the origin of atmospheric composition and planetary spectra, to more advanced concepts, such as solutions to the radiative transfer equation, polarization, and molecular and condensate opacities. Since planets vary widely in their atmospheric properties, Seager emphasizes the major p...

The results of the various aspects of this study indicate that the encapsulation process is not only capable of reducing the percent of Radon-222 emanation but also reduces the possibility of the leaching of toxic elements. Radon-222 emanation after solidification showed a 93.51% reduction from the slurry. The Gamma Spectral Analyses of short-lived Radon daughters supported the above findings. Leach studies on solidified refinery waste and transformer oils indicate there is a significant reduction in the possibility of toxic substances leaching out of the solidified samples. Further studies are needed to confirm the results of this investigation; however, the present findings indicate that the process could substantially reduce Radon-222 exhalation into the environment from uranium tailings ponds and reduce toxic leachates from hazardous waste materials

, and is essential to achieve successful speech communication, correct orientation in our full environment, and eventually survival. These adaptive processes may differ in individuals with hearing loss, whose auditory system may cope via ‘‘readapting’’ itself over a longer time scale to the changes in sensory input induced by hearing impairment and the compensation provided by hearing devices. These devices themselves are now able to adapt to the listener’s individual environment, attentional state, and behavior. These topics related to auditory adaptation, in the broad sense of the term, were central to the 6th International Symposium on Auditory and Audiological Research held in Nyborg, Denmark, in August 2017. The symposium addressed adaptive processes in hearing from different angles, together with a wide variety of other auditory and audiological topics. The papers in this special issue result from some...

Microelectronics and the classical information technologies transformed the physics of semiconductors. Photonics has given optical materials a new direction. Quantum information technologies, we believe, will have immense impact on condensed matter physics. The novel systems of quantum information processing need to be designed and made. Their behaviours must be manipulated in ways that are intrinsically quantal and generally nanoscale. Both in this special issue and in previous issues (see e.g., Spiller T P and Munro W J 2006 J. Phys.: Condens. Matter 18 V1-10) we see the emergence of new ideas that link the fundamentals of science to the pragmatism of market-led industry. We hope these papers will be followed by many others on quantum information processing in the Journal of Physics: Condensed Matter.

Air quality sampling was conducted. Data for air quality parameters, recorded on written forms, punched cards or magnetic tape, are available for 1972 through 1975. Computer software was developed to (1) calculate several daily statistical measures of location, (2) plot time histories of data or the calculated daily statistics, (3) calculate simple correlation coefficients, and (4) plot scatter diagrams. Computer software was developed for processing air quality data to include time series analysis and goodness of fit tests. Computer software was developed to (1) calculate a larger number of daily statistical measures of location, and a number of daily monthly and yearly measures of location, dispersion, skewness and kurtosis, (2) decompose the extended time series model and (3) perform some goodness of fit tests. The computer program is described, documented and illustrated by examples. Recommendations are made for continuation of the development of research on processing air quality data.

This thesis addresses the difficulties encountered in managing large amounts of data in supervisory control of complex systems. Some previous alarm and disturbance analysis concepts are reviewed and a method for improving the supervision of complex systems is presented. The method, called multivariate supervision, is based on adding low level intelligence to the process control system. By using several measured variables linked together by means of deductive logic, the system can take into account the overall state of the supervised system. Thus, it can present to the operators fewer messages with higher information content than the conventional control systems which are based on independent processing of each variable. In addition, the multivariate method contains a special information presentation concept for improving the man-machine interface. (author)

This introduction to the industrial primary aluminum production process presents a short description of the electrolytic reduction technology, the history of aluminum, and the importance of this metal and its production process to modern society. Aluminum's special qualities have enabled advances in technologies coupled with energy and cost savings. Aircraft capabilities have been greatly enhanced, and increases in size and capacity are made possible by advances in aluminum technology. The metal's flexibility for shaping and extruding has led to architectural advances in energy-saving building construction. The high strength-to-weight ratio has meant a substantial reduction in energy consumption for trucks and other vehicles. The aluminum industry is therefore a pivotal one for ecological sustainability and strategic for technological development.

A process for treating radioactive waste solutions prior to disposal is described. A water-soluble phosphate, borate, and/or silicate is added. The solution is sprayed with steam into a space heated from 325 to 400 deg C whereby a powder is formed. The powder is melted and calcined at from 800 to 1000 deg C. Water vapor and gaseous products are separated from the glass formed. (AEC)

This viewgraph presentation reviews the known and possible geologic processes of Europa. It shows slides of Europa with different terrains (ridged plains and mottled plains), and a possible view of the interior. Europa's eccentric orbit is reviewed. The presentation also reviews Europa's composition. The possible reasons for Europa's geology are reviewed. Also the possibility that life exists on Europa is raised. The planned Europa Geophysical Explorer mission is also reviewed.

This book is a basic, step-by-step tutorial that will help readers take advantage of all that Spark has to offer. Fast Data Processing with Spark is for software developers who want to learn how to write distributed programs with Spark. It will help developers who have had problems that were too large to be handled on a single computer. No previous experience with distributed programming is necessary. This book assumes knowledge of either Java, Scala, or Python.

The connection between language processing and combinatorics on words is natural. Historically, linguists actually played a part in the beginning of the construction of theoretical combinatorics on words. Some of the terms in current use originate from linguistics: word, prefix, suffix, grammar, syntactic monoid... However, interpenetration between the formal world of computer theory and the intuitive world of linguistics is still a love story with ups and downs. We will encounter in this cha...

The objective of this dissertation is to explore the hypothesis that there is a strong linkage between gas atomization processing conditions, as-atomized aluminum powder characteristics, and the consolidation methodology required to make components from aluminum powder. The hypothesis was tested with pure aluminum powders produced by commercial air atomization, commercial inert gas atomization, and gas atomization reaction synthesis (GARS). A comparison of the GARS aluminum powders with the commercial aluminum powders showed the former to exhibit superior powder characteristics. The powders were compared in terms of size and shape, bulk chemistry, surface oxide chemistry and structure, and oxide film thickness. Minimum explosive concentration measurements assessed the dependence of explosibility hazard on surface area, oxide film thickness, and gas atomization processing conditions. The GARS aluminum powders were exposed to different relative humidity levels, demonstrating the effect of atmospheric conditions on post-atomization oxidation of aluminum powder. An Al-Ti-Y GARS alloy exposed in ambient air at different temperatures revealed the effect of reactive alloy elements on post-atomization powder oxidation. The pure aluminum powders were consolidated by two different routes, a conventional consolidation process for fabricating aerospace components with aluminum powder and a proposed alternative. The consolidation procedures were compared by evaluating the consolidated microstructures and the corresponding mechanical properties. A low temperature solid state sintering experiment demonstrated that tap densified GARS aluminum powders can form sintering necks between contacting powder particles, unlike the total resistance to sintering of commercial air atomization aluminum powder.

The Thermox process is a process developed by AB Atomenergi for the decladding and dissolution of irradiated Zircaloy-2 clad uranium dioxide fuel elements and consists of the following stages: 1. Decladding by means of thermal oxidation of the Zircaloy-2 with oxygen and water vapour at 825 C using nitrogen as a catalyst. 2. Oxidation of the uranium dioxide pellets with air and oxygen to U3O8 at a temperature of 450 - 650 C. 3. Dissolving and leaching the uranium oxides with dilute nitric acid leaving the insoluble zirconium oxide as a residue. 4. Filtering the solution and washing the residues of the cladding. The work has included the following parts: the laboratory scale investigation of the conditions for the oxidation of Zircaloy-2 in various gas mixtures and of the conditions for oxidizing and dissolving sintered UO2 pellets; the development on a pilot plant scale of suitable apparatus and process techniques for the safe and reproducible treatment of half-length inactive fuel elements; and studies of some special operation and handling problems which have to be solved before the method can be applied at full scale. Five half-length fuel elements have been treated, and the results have been satisfactory. The pilot plant experiments have proved that inactive fuel elements can be decanned, oxidized and dissolved by means of the Thermox process. Solutions and canning residues are easy to filter, separate, and handle and are free from corroding agents. The uranium losses can be kept very low. The zirconium dioxide is obtained in a form suitable for permanent disposal.

Consumer decision making has been a focal interest in consumer research, and consideration of current marketplace trends ( e.g., technological change, an information explosion) indicates that this topic will continue to be critically important. We argue that consumer choice is inherently constructive. Due to limited processing capacity, consumers often do not have well-defined existing preferences, but construct them using a variety of strategies contingent on task demands. After describing c...

In the process for deparaffination of petroleum products (NP) by treatment with carbamide (KA) in the presence of an activator (AK) and a solvent, with subsequent separation of the forming paraffin-carbamide complex (PKC), the NP is mixed with the AK at a temperature of 5-40° in an NP:AK ratio of 1:0.01-0.03, with subsequent addition of portions of KA at a temperature of 5-20°. The advantages of the process in comparison with the known one consist in the fact that complex formation proceeds practically without an induction period, and the paraffin yield is increased by 27% on the potential with a simultaneous decrease in reagent consumption: KA by 20% and MeOH by 35%. Example: 100 g of diesel fuel (a 200-360° fraction) is mixed with MeOH (1.3% on the feedstock), which in the obtained mixture is in a liquid droplet state. To the mixture 40 g of KA is added (2/3 of the total amount of KA to be added). The complex formation temperature is held at 35°. To the practically instantly forming PKC, 260 g of solvent (an 85-120° gasoline fraction) and then 20 g of KA are added. The temperature is held at 20°. The forming suspension of the complex is distinguished by uniformity and the absence of coarse conglomerates. The paraffin yield amounts to 11.8% on the feedstock, or 74% of the potential. The melting point of the paraffin is 22°. In the described deparaffination scheme, complex formation is carried out in two steps; in a third step the suspension of KA and centrifuged paraffin is separated; the KA is returned to the complex formation reactor, and the centrifuged paraffin is sent for solvent regeneration.
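The reported yields can be cross-checked arithmetically: 11.8 g of paraffin per 100 g of feedstock, stated to be 74% of the potential, fixes the implied potential paraffin content of the fraction. A minimal sketch:

```python
# Consistency check of the reported yields: 11.8 g paraffin per 100 g
# feedstock is stated to be 74% of the potential paraffin content.
feedstock_g = 100.0
paraffin_g = 11.8
fraction_of_potential = 0.74

potential_pct = paraffin_g / feedstock_g / fraction_of_potential * 100
print(f"implied potential paraffin content: {potential_pct:.1f}% of feedstock")
```

The implied potential of roughly 16% is a plausible paraffin content for a 200-360° diesel fraction, so the two reported figures are mutually consistent.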

In accordance with the Nuclear Energy Act, the use of nuclear energy constitutes an operation subject to license. The licensing process and the conditions for granting a license are defined in the legislation. The licenses are applied for from and granted by the Government. This paper briefly discusses the licensing process in Finland and the roles and responsibilities of the main stakeholders in licensing. Licensing of a nuclear power plant in Finland has three steps. The first step is the Decision in Principle (DiP); its goal is to decide whether using nuclear power is for the overall good of Finnish society. The second step is the Construction License (CL); the goal of the CL phase is to determine whether the design of the proposed plant is safe and whether the participating organisations are capable of constructing the plant to meet safety goals. The third step is the Operating License (OL); the goal of the OL phase is to determine whether the plant operates safely and whether the licensee is capable of operating the plant safely. The main stakeholders in the licensing process in Finland are the utility (licensee) interested in using nuclear power in Finland, the Ministry of Employment and the Economy (MEE), the Government, the Parliament, STUK, the municipality siting the plant and the general public. The Government grants all licenses, and the Parliament has to ratify the Government's Decision in Principle. STUK has to assess the safety of the license applications at each step and give a statement to the Ministry. The municipality has to agree to site the plant. Both STUK and the municipality have a veto right in the licensing process


A metallic fuel alloy, which is a key element of the IFR fuel cycle, permits the use of pyrochemical processing of the spent fuel. Electrorefining with molten salt electrolytes is a key step in the pyroprocess because in this operation the actinides are recovered, separated from the fission products present in the spent fuel, and then recycled for use as fuel. Chemical and electrochemical aspects of the electrorefining method are described. (author)

We consider a stochastic process in which independent identically distributed random matrices are multiplied and where the Lyapunov exponent of the product is positive. We continue multiplying the random matrices as long as the norm, ɛ, of the product is less than unity. If the norm is greater than unity we reset the matrix to a multiple of the identity and then continue the multiplication. We address the problem of determining the probability density function of the norm, ɛ.
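
The resetting scheme described above can be sketched numerically; the matrix distribution (2×2 iid standard Gaussian entries) and the reset constant are illustrative assumptions, not the paper's choices:

```python
import math
import random

def frob(M):
    """Frobenius norm of a 2x2 matrix."""
    return math.sqrt(sum(x * x for row in M for x in row))

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def random_matrix(rng):
    # iid standard Gaussian entries; the norm of the running product
    # grows on average (positive Lyapunov exponent)
    return [[rng.gauss(0.0, 1.0) for _ in range(2)] for _ in range(2)]

def simulate(n_steps, c=0.1, seed=0):
    """Multiply iid random matrices; whenever the norm of the running
    product exceeds unity, reset it to c times the identity."""
    rng = random.Random(seed)
    P = [[c, 0.0], [0.0, c]]
    norms = []
    for _ in range(n_steps):
        P = matmul(random_matrix(rng), P)
        if frob(P) > 1.0:
            P = [[c, 0.0], [0.0, c]]   # reset to a multiple of the identity
        norms.append(frob(P))
    return norms

norms = simulate(10000)
# a histogram of `norms` estimates the probability density of the norm
```
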

Thanks to the innovations in the technology for the processing of medical images, to the high development of better and cheaper computers, and, additionally, to the advances in the systems of communications of medical images, the acquisition, storage and handling of digital images has acquired great importance in all the branches of the medicine. It is sought in this article to introduce some fundamental ideas of prosecution of digital images that include such aspects as their representation, storage, improvement, visualization and understanding

Experimental data and models of a new type of material sputtering by ions of relatively high energies due to inelastic (electronic) processes are reviewed. This area of investigation has developed intensively in recent years. New experimental data of the authors on the differential characteristics of sputtering of ultradisperse gold and americium dioxide layers by fission fragments are given as well. Practical applications of the new sputtering type are considered, as well as the setup of possible experiments at heavy multiply charged ion accelerators

Business Process Outsourcing (BPO) is gaining widespread acceptance throughout the US, Europe, South America and Asia Pacific as the top executives of leading multinationals turn to outsourcing as a strategic management tool for improving corporate performance, profitability and shareholder value. BPO started to emerge a few years ago as a follow-on to IT outsourcing. The concept is not new; BPO is the contracting out of a specific business task. Outsourcing focuses on adding value, typically to non...

A process for withdrawing gaseous UF 6 from a first system and directing same into a second system for converting the gas to liquid UF 6 at an elevated temperature, additionally including the step of withdrawing the resulting liquid UF 6 from the second system, subjecting it to a specified sequence of flash-evaporation, cooling and solidification operations, and storing it as a solid in a plurality of storage vessels. (author)

Five different inelastic light scattering processes will be denoted by: ordinary Raman scattering (ORS), resonance Raman scattering (RRS), off-resonance fluorescence (ORF), resonance fluorescence (RF), and broad fluorescence (BF). A distinction between fluorescence (including ORF and RF) and Raman scattering (including ORS and RRS) will be made in terms of the number of intermediate molecular states which contribute significantly to the scattered amplitude, and not in terms of excited state lifetimes or virtual versus real processes. The theory of these processes will be reviewed, including the effects of pressure, laser wavelength, and laser spectral distribution on the scattered intensity. The application of these processes to the remote sensing of atmospheric pollutants will be discussed briefly. It will be pointed out that the poor sensitivity of the ORS technique cannot be increased by going toward resonance without also compromising the advantages it has over the RF technique. Experimental results on inelastic light scattering from I(sub 2) vapor will be presented. As a single longitudinal mode 5145 Å argon-ion laser line was tuned away from an I(sub 2) absorption line, the scattering was observed to change from RF to ORF. The basis of the distinction is the different pressure dependence of the scattered intensity. Nearly three orders of magnitude enhancement of the scattered intensity was measured in going from ORF to RF. Forty-seven overtones were observed and their relative intensities measured. The ORF cross section of I(sub 2) compared to the ORS cross section of N2 was found to be 3 x 10(exp 6), with I(sub 2) at its room temperature vapor pressure.

This invention discloses a process and apparatus for pyrolyzing particulate coal by heating with a particulate solid heating media in a transport reactor. The invention tends to dampen fluctuations in the flow of heating media upstream of the pyrolysis zone, and by so doing forms a substantially continuous and substantially uniform annular column of heating media flowing downwardly along the inside diameter of the reactor. The invention is particularly useful for bituminous or agglomerative type coals.

Among hard electromagnetic processes, I will use the most recent data and focus on quantitative tests of QCD. More specifically, I will retain two items: hadroproduction of direct photons, and Drell-Yan production. In addition, I will briefly discuss a recent analysis of ISR data obtained with the AFS (Axial Field Spectrometer), which sheds new light on the e/π puzzle at low P_T.

A process is disclosed for the refining of hydrocarbons or other mixtures through treatment in vapor form with metal catalysts, characterized in that the catalysts used are metals obtained by reduction of the oxides of minerals of the iron group, and in that the hydrocarbon vapours, in the presence of water vapour, are led over these catalysts at temperatures from 200 to 300 °C.

In this paper we describe a new approach for learning dialog act processing. In this approach we integrate a symbolic semantic segmentation parser with a learning dialog act network. In order to cope with the unforeseeable errors and variations of spoken language, we have concentrated on robust data-driven learning. This approach already compares favorably with the statistical average plausibility method, and produces a segmentation and dialog act assignment for all utterances in a robust manner,...

The purpose of this work is to explore the feasibility of several advanced microwave processing concepts to develop new energy-efficient materials and processes. The project includes two tasks: (1) commercialization of the variable-frequency microwave furnace; and (2) microwave curing of polymer composites. The variable frequency microwave furnace, whose initial conception and design were funded by the AIC Materials Program, will allow us, for the first time, to conduct microwave processing studies over a wide frequency range. This novel design uses a high-power traveling wave tube (TWT) originally developed for electronic warfare. By using this microwave source, one can not only select individual microwave frequencies for particular experiments, but also achieve uniform power densities over a large area by the superposition of many different frequencies. Microwave curing of thermoset resins will be studied because it holds the potential for in-situ curing of continuous-fiber composites for strong, lightweight components. Microwave heating can shorten curing times, provided issues of scaleup, uniformity, and thermal management can be adequately addressed.

Using the specific case study of the Caroline interrogatory process, an example is given of how an effective communications and public involvement program can re-establish trust and credibility levels within a community after an incident. The public is nervous about sour gas, especially about blowouts of gas from a pipeline. The post-approval period was marked by high expectations and a community consultation program which included a community advisory board, an emergency planning committee, socio-economic factors, and environmental monitoring and studies. Information and education efforts involve newspaper articles, newsletters, tours, public consultation meetings, and weekly e-mail. Mercury was detected as a potential hazard at the site, and company actions are illustrated. Overall lessons learned included: starting early paid off, face to face resident contacts were the most effective, the willingness to make changes was the key to success, the community helped, knowing all the answers is not essential, and there is a need for empathy. The interrogatory process includes a hybrid technique that comprises four stages: 1) process review and public input, 2) identification and clarification of issues, 3) responses by industry and government, and 4) a public forum and follow-up action.

Gaseous waste recombiners 'A' and 'B' are connected in series, and three-way valves are disposed upstream and downstream of recombiners A and B; bypass lines are provided for recombiners A and B, respectively. An opening/closing controller for the three-way valves is interlocked with a hydrogen densitometer disposed in a hydrogen injection line. Hydrogen gas and oxygen gas generated by radiolysis in the reactor are extracted from a main condenser and flow into the gaseous waste processing system. Upon hydrogen injection, the gaseous wastes are introduced, together with superheated steam, to recombiner A. Both bypass lines of recombiners A and B are closed, and the recombination reaction for the increased hydrogen gas is handled by recombiners A and B connected in series. In an operation mode without hydrogen injection, the gas passes through the bypass line of recombiner A and is processed by recombiner B alone. With such procedures, the increase of gaseous wastes due to hydrogen injection can be handled with existing facilities. (I.N.)
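
The valve interlock logic described above can be sketched as a simple mode table; the state labels below are hypothetical names for illustration, not terms from the patent:

```python
def valve_lineup(hydrogen_injection: bool) -> dict:
    """Return which recombiners process the off-gas and the bypass-line
    states, following the interlock described above (sketch only)."""
    if hydrogen_injection:
        # both bypasses closed: A and B recombine in series
        return {"recombiner_A": "in service", "recombiner_B": "in service",
                "bypass_A": "closed", "bypass_B": "closed"}
    # no injection: off-gas bypasses A and is processed by B alone
    return {"recombiner_A": "bypassed", "recombiner_B": "in service",
            "bypass_A": "open", "bypass_B": "closed"}
```
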

This paper surveys the semantic ramifications of extending traditional process algebras with notions of priority that allow for some transitions to be given precedence over others. These enriched formalisms allow one to model system features such as interrupts, prioritized choice, or real-time behavior. Approaches to priority in process algebras can be classified according to whether the induced notion of pre-emption on transitions is global or local and whether priorities are static or dynamic. Early work in the area concentrated on global pre-emption and static priorities and led to formalisms for modeling interrupts and aspects of real-time, such as maximal progress, in centralized computing environments. More recent research has investigated localized notions of pre-emption in which the distribution of systems is taken into account, as well as dynamic priority approaches, i.e., those where priority values may change as systems evolve. The latter allows one to model behavioral phenomena such as scheduling algorithms and also enables the efficient encoding of real-time semantics. Technically, this paper studies the different models of priorities by presenting extensions of Milner's Calculus of Communicating Systems (CCS) with static and dynamic priority as well as with notions of global and local pre-emption. In each case the operational semantics of CCS is modified appropriately, behavioral theories based on strong and weak bisimulation are given, and related approaches for different process-algebraic settings are discussed.
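
Global pre-emption with static priorities can be illustrated with a minimal sketch: among the currently enabled transitions, only those of maximal priority may fire. This is a toy illustration of the pre-emption rule, not an implementation of CCS:

```python
def enabled(transitions):
    """Global static priority: a transition may fire only if no enabled
    transition carries a strictly higher priority (pre-emption).
    `transitions` is a list of (action, priority) pairs."""
    if not transitions:
        return []
    top = max(p for _, p in transitions)
    return [(a, p) for a, p in transitions if p == top]

# a high-priority interrupt pre-empts ordinary work
ready = enabled([("interrupt", 2), ("work", 1)])
```
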

In a liquid waste processing device for processing domestic waste water discharged from nuclear power plant facilities through a filtration vessel and a sampling vessel, the filtration layer disposed in the filtration vessel is divided into a plurality of layers along planes perpendicular to the direction of flow, and the size of the filter material in each of the divided layers is made finer toward the downstream side. Further, the thickness of the filter material in each of the divided layers is also reduced toward the downstream side. The filter material is packed such that the porosity in each of the divided layers is substantially identical. Further, the filter material is packed in mesh-like bags partitioned to a desired size and laid with no gaps along the planes perpendicular to the direction of flow. Thus, liquid wastes such as domestic waste water can be processed easily and simply so as to satisfy the applicable criteria, without undesired effects on separation performance and service life, and with easy replacement of the filter. (T.M.)

An electroplating process for preparing a monolith metal layer over a polycrystalline base metal, and the plated monolith product. A monolith layer has a variable thickness of one crystal. The process is typically carried out in molten salt electrolytes, such as the halide salts, under an inert atmosphere at an elevated temperature, over deposition time periods and film thicknesses sufficient to sinter and completely recrystallize the nucleating metal particles into one single crystal, or crystals having very large grains. In the process, a close-packed film of submicron particles (20) is formed on a suitable substrate at an elevated temperature. The elevated temperature serves to anneal the particles as they are formed, and substrates on which the particles can populate are desirable. As the packed bed thickens, the submicron particles develop necks (21), and as they merge into each other shrinkage (22) occurs. Then, as micropores also close (23) by surface tension, full metal density is reached and the film consists of unstable metal grains (24) that at a high enough temperature recrystallize (25), and the recrystallized grains grow into an annealed single crystal over the electroplating time span. While cadmium was used in the experimental work, other soft metals may be used.

The efficiency of thermoelectric technology today is limited by the properties of available thermoelectric materials and a wide variety of new approaches to developing better materials have recently been suggested. The key goal is to find a material with a large ZT, the dimensionless thermoelectric figure of merit. However, if an analogy is drawn between thermoelectric technology and gas-cycle engines then selecting different materials for the thermoelements is analogous to selecting a different working gas for the mechanical engine. And an attempt to improve ZT is analogous to an attempt to improve certain thermodynamic properties of the working-gas. An alternative approach is to focus on the thermoelectric process itself (rather than on ZT), which is analogous to considering alternate cycles such as Stirling vs. Brayton vs. Rankine etc., rather than merely considering alternative gases. Focusing on the process is a radically different approach compared to previous studies focusing on ZT. Aspects of the thermoelectric process and alternative approaches to efficient thermoelectric conversion are discussed.
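
For reference, the standard textbook expression for the maximum efficiency of an ideal thermoelectric generator in terms of ZT (a general formula, not a result of this paper) can be computed as follows:

```python
import math

def carnot(t_hot, t_cold):
    """Carnot efficiency between two reservoir temperatures (K)."""
    return 1.0 - t_cold / t_hot

def te_efficiency(zt, t_hot, t_cold):
    """Maximum efficiency of an ideal thermoelectric generator with
    dimensionless figure of merit ZT (evaluated at the mean temperature):
    eta = (dT/T_h) * (sqrt(1+ZT) - 1) / (sqrt(1+ZT) + T_c/T_h)."""
    m = math.sqrt(1.0 + zt)
    return carnot(t_hot, t_cold) * (m - 1.0) / (m + t_cold / t_hot)

# e.g. ZT = 1 between 500 K and 300 K recovers only a fraction of Carnot
eta = te_efficiency(1.0, 500.0, 300.0)
```

The formula makes the paper's point concrete: for any finite ZT the efficiency stays well below the Carnot limit, which motivates looking beyond ZT at the conversion process itself.
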

Advances in nanoparticle synthesis are opening new opportunities for a broad variety of technologies that exploit the special properties of matter at the nanoscale. To realize this potential will require the development of new technologies for processing nanoparticles, so as to utilize them in a manufacturing context. Two important classes of such processing technologies include the controlled deposition of nanoparticles onto surfaces, and the application of chemically specific coatings onto individual nanoparticles, so as to either passivate or functionalize their surfaces. This paper provides an overview of three technologies related to these objectives, with an emphasis on aerosol-based methods: first, the deposition of nanoparticles by hypersonic impaction, so as to spray-coat large areas with nanoparticles; second, the use of aerodynamic lenses to produce focused beams of nanoparticles, with beam widths of a few tens of microns, so as to integrate nanoparticle-based structures into microelectromechanical systems; and third, the coating of individual nanoparticles by means of photoinduced chemical vapor deposition (photo-CVD), driven by excimer lamps. We also discuss the combination of these technologies, so that nanoparticle synthesis, together with multiple processing steps, can be accomplished in a single flow stream.

The basic methods for processing ore to recover the contained uranium have not changed significantly since the 1954-62 period. Improvements in mill operations have been the result of better or less expensive reagents, changes in equipment, and the successful resolution of many environmental matters. There is also an apparent trend toward large mills that can profitably process lower grade ores. The major thrust in the near future will not be on process technology but on the remaining environmental constraints associated with milling. At this time the main ''spotlight'' is on tailings dam and impoundment area construction and reclamation. Plans must provide for an adequate safety factor for stability, no surface or groundwater contamination, and minimal discharge of radionuclides to unrestricted areas, as may be required by law. Solution mining methods must also provide for plans to restore the groundwater back to its original condition as defined by local groundwater regulations. Basic flowsheets (each to finished product) plus modified versions of the basic types are shown.

There are an estimated 100,000 genes in the human genome, of which 97% is non-coding. On the other hand, bacteria have little or no non-coding DNA. The non-coding region includes introns, ALU sequences, satellite DNA, and other segments not expressed as proteins. Why does it exist? Why has nature kept non-coding DNA during the long evolutionary period if it has no role in the development of complex life forms? Is the complexity of a species somehow correlated to the existence of apparently useless sequences? What kind of capability is encoded within such nucleotide sequences that is a necessary, but not a sufficient, condition for the evolution of complex life forms, keeping in mind the C-value paradox and the omnipresence of non-coding segments in higher eukaryotes and also in many archaea and prokaryotes? The physico-chemical description of biological processes is hardware oriented and does not highlight the algorithmic or information-processing aspect. However, an algorithm without its hardware implementation is as useless as hardware without the capability to run an algorithm. The nature and type of computation an information-processing hardware can perform depend only on its algorithm and the architecture that reflects the algorithm. Given that enormously difficult tasks such as high-fidelity replication, transcription, editing and regulation are all achieved within a long linear sequence, it is natural to think that some parts of a genome are involved in these tasks. If some complex algorithms are encoded within these parts, then it is natural to think that non-coding regions contain information-processing algorithms. A comparison between well-known automatic sequences and sequences constructed out of motifs found in all species proves the point: non-coding regions are a sort of ''hardwired'' program, i.e., they are linear representations of information-processing machines. Thus in our model, a non-coding region, e.g., an intron, contains a program (or equivalently, it is
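
As an aside on the "well-known automatic sequences" mentioned above, a minimal sketch: the Thue-Morse sequence is a classic automatic sequence, generated by a two-state finite automaton reading the binary digits of the index (this is a standard example, not one taken from the abstract):

```python
def thue_morse(n):
    """First n terms of the Thue-Morse sequence: t(k) is the parity of
    the number of 1-bits in k -- an 'automatic' sequence computable by a
    2-state finite automaton reading the binary expansion of k."""
    return [bin(k).count("1") % 2 for k in range(n)]

# the sequence is famously cube-free: no block ever repeats three times
prefix = thue_morse(8)
```
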

The goal of the Mars Aqueous Processing System (MAPS) is to establish a flexible process that generates multiple products that are useful for human habitation. Selectively extracting useful components into an aqueous solution, and then sequentially recovering individual constituents, can obtain a suite of refined or semi-refined products. Similarities in the bulk composition (although not necessarily of the mineralogy) of Martian and Lunar soils potentially make MAPS widely applicable. Similar process steps can be conducted on both Mars and Lunar soils while tailoring the reaction extents and recoveries to the specifics of each location. The MAPS closed-loop process selectively extracts, and then recovers, constituents from soils using acids and bases. The emphasis on Mars involves the production of useful materials such as iron, silica, alumina, magnesia, and concrete with recovery of oxygen as a byproduct. On the Moon, similar chemistry is applied with emphasis on oxygen production. This innovation has been demonstrated to produce high-grade materials, such as metallic iron, aluminum oxide, magnesium oxide, and calcium oxide, from lunar and Martian soil simulants. Most of the target products exhibited purities of 80 to 90 percent or more, allowing direct use for many potential applications. Up to one-fourth of the feed soil mass was converted to metal, metal oxide, and oxygen products. The soil residue contained elevated silica content, allowing for potential additional refining and extraction for recovery of materials needed for photovoltaic, semiconductor, and glass applications. A high-grade iron oxide concentrate derived from lunar soil simulant was used to produce a metallic iron component using a novel, combined hydrogen reduction/metal sintering technique. The part was subsequently machined and found to be structurally sound. The behavior of the lunar-simulant-derived iron product was very similar to that produced using the same methods on a Michigan iron

The dynamic nature of organisations has increased demand for business process agility leading to the adoption of continuous Business Process Improvement (BPI). Success of BPI projects calls for continuous process analysis and exploration of several improvement alternatives. These activities are

Companies continuously strive to improve their processes to increase productivity and delivered quality against lower costs. With Business Process Redesign (BPR) projects such improvement goals can be achieved. BPR involves the restructuring of business processes, stimulated by the application of

In this paper we give a brief introduction to Ornstein-Uhlenbeck processes and their simulation methods. Ornstein-Uhlenbeck processes were proposed as a model to describe volatility in finance by Barndorff-Nielsen and Shephard (2001); these processes are based on Lévy processes. Simulation of Lévy processes may be found in [1, 2].
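
A minimal simulation sketch for the classical Gaussian Ornstein-Uhlenbeck process, using its exact discretisation (the Barndorff-Nielsen-Shephard volatility model instead drives the OU process with a Lévy subordinator; the parameters below are illustrative):

```python
import math
import random

def simulate_ou(n, dt, theta, mu, sigma, x0=0.0, seed=42):
    """Exact discretisation of the Gaussian Ornstein-Uhlenbeck process
    dX_t = theta*(mu - X_t) dt + sigma dW_t. Each step draws from the
    exact transition law, so no Euler discretisation error is incurred."""
    rng = random.Random(seed)
    a = math.exp(-theta * dt)                          # mean-reversion factor
    sd = sigma * math.sqrt((1.0 - a * a) / (2.0 * theta))  # per-step std dev
    xs = [x0]
    for _ in range(n):
        xs.append(mu + a * (xs[-1] - mu) + sd * rng.gauss(0.0, 1.0))
    return xs

path = simulate_ou(5000, 0.01, theta=2.0, mu=1.0, sigma=0.5)
# the path reverts toward mu = 1.0 with stationary std sigma/sqrt(2*theta)
```
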

The purpose of the article is to define the essence of the project scope management process and its components, and to develop an algorithm for project scope management in pharmaceutical production. Methodology. Available information sources on project management standards in general, and on elements of project scope management in particular, are analysed. Methods of system and structural analysis and logical generalization are used to study the totality of subprocesses of project scope management and their input and output documents. Methods of network planning are used to construct a precedence diagram of the project scope management process. Results of the research showed that the components of project scope management are managing the scope of the project product and managing the content of project work; it is the second component that is investigated in the presented work. Accordingly, project scope management is defined as the process of substantiating and bringing to realization the amount of work that ensures the successful implementation of the project (achievement of its goal and of the objectives of individual project participants). It is also determined that managing the project scope encompasses planning, definition of the project scope, creation of the structure of project work, confirmation of the scope, and control of the project scope. Participants in these subprocesses are: the customer, the investor, and other project participants (external contractor organizations); the project review committee; and the project manager and project team. The key element of planning the project scope is the formation of the structure of project work, the justification of the number of works, and the sequence of their implementation. It is recommended to use the following sequence of stages for creating the structure of project work
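
The precedence-diagram idea can be sketched with a topological sort over hypothetical subprocess names (the work-package labels below are illustrative, not taken from the article):

```python
from collections import deque

def schedule(precedences):
    """Order work packages so every predecessor comes first (Kahn's
    topological sort over a precedence diagram). `precedences` maps each
    work package to the packages that must finish before it starts."""
    indeg = {t: len(pre) for t, pre in precedences.items()}
    succ = {t: [] for t in precedences}
    for t, pre in precedences.items():
        for p in pre:
            succ[p].append(t)
    ready = deque(sorted(t for t, d in indeg.items() if d == 0))
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for s in succ[t]:
            indeg[s] -= 1
            if indeg[s] == 0:
                ready.append(s)
    if len(order) != len(precedences):
        raise ValueError("cycle in precedence diagram")
    return order

# hypothetical subprocesses mirroring the sequence described above
wbs = {"plan scope": [], "define scope": ["plan scope"],
       "create WBS": ["define scope"], "confirm scope": ["create WBS"],
       "control scope": ["confirm scope"]}
order = schedule(wbs)
```
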

The three great myths, which form a sort of triumvirate of misunderstanding, are the Eureka! myth, the hypothesis myth, and the measurement myth. These myths are prevalent among scientists as well as among observers of science. The Eureka! myth asserts that discovery occurs as a flash of insight, and as such is not subject to investigation. This leads to the perception that discovery or deriving a hypothesis is a moment or event rather than a process. Events are singular and not subject to description. The hypothesis myth asserts that proper science is motivated by testing hypotheses, and that if something is not experimentally testable then it is not scientific. This myth leads to absurd posturing by some workers conducting empirical descriptive studies, who dress up their study with a "hypothesis" to obtain funding or get it published. Methods papers are often rejected because they do not address a specific scientific problem. The fact is that many of the great breakthroughs in science involve methods and not hypotheses, or arise from largely descriptive studies. Those captured by this myth also try to block funding for those developing methods. The third myth is the measurement myth, which holds that determining what to measure is straightforward, so one doesn't need a lot of introspection to do science. As one ecologist put it to me, "Don't give me any of that philosophy junk, just let me out in the field. I know what to measure." These myths lead to difficulties for scientists who must face peer review to obtain funding and to get published. These myths also inhibit the study of science as a process. Finally, these myths inhibit creativity and suppress innovation. In this paper I first explore these myths in more detail and then propose a new model of discovery that opens the supposedly miraculous process of discovery to closer scrutiny.

This paper outlines the current understanding of solar flares, mainly focused on the magnetohydrodynamic (MHD) processes responsible for producing a flare. Observations show that flares are among the most explosive phenomena in the atmosphere of the Sun, releasing a huge amount of energy, up to about 10^32 erg, on a timescale of hours. Flares involve the heating of plasma, mass ejection, and particle acceleration that generates high-energy particles. The key physical processes for producing a flare are: the emergence of magnetic field from the solar interior to the solar atmosphere (flux emergence), local enhancement of electric current in the corona (formation of a current sheet), and rapid dissipation of electric current (magnetic reconnection) that causes shock heating, mass ejection, and particle acceleration. The evolution toward the onset of a flare is rather quasi-static while free energy is accumulated in the form of coronal electric current (field-aligned current, more precisely), whereas the dissipation of coronal current proceeds rapidly, producing various dynamic events that affect the lower atmosphere, such as the chromosphere and photosphere. Flares manifest such rapid dissipation of coronal current, and their theoretical modeling has been developed in accordance with observations, in which numerical simulations have proved to be a strong tool for reproducing the time-dependent, nonlinear evolution of a flare. We review the models proposed to explain the physical mechanism of flares, giving a comprehensive explanation of the key processes mentioned above. We start with the basic properties of flares, then go into the details of energy build-up, release and transport in flares, where magnetic reconnection works as the central engine producing a flare.
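
The 10^32 erg energy scale quoted above can be checked with a back-of-the-envelope estimate of the free magnetic energy, E ~ (B²/8π)·L³ in CGS units; the field strength and length scale below are typical illustrative values, not numbers from the review:

```python
import math

def flare_energy_erg(b_gauss, l_cm):
    """Order-of-magnitude free magnetic energy in an active region:
    magnetic energy density B^2/(8*pi) [erg/cm^3] times volume L^3."""
    energy_density = b_gauss ** 2 / (8.0 * math.pi)
    return energy_density * l_cm ** 3

# ~300 G coronal field over a ~30,000 km region
E = flare_energy_erg(300.0, 3.0e9)
# E comes out of order 10^32 erg, matching the quoted flare energy scale
```
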

Information on the microbiology of the deep subsurface is necessary in order to understand the factors controlling the rate and extent of the microbially catalyzed redox reactions that influence the geophysical properties of these environments. Furthermore, there is an increasing threat that deep aquifers, an important drinking water resource, may be contaminated by man's activities, and there is a need to predict the extent to which microbial activity may remediate such contamination. Metabolically active microorganisms can be recovered from a diversity of deep subsurface environments. The available evidence suggests that these microorganisms are responsible for catalyzing the oxidation of organic matter coupled to a variety of electron acceptors, just as microorganisms do in surface sediments, but at much slower rates. The technical difficulties in aseptically sampling deep subsurface sediments, and the fact that microbial processes in laboratory incubations of deep subsurface material often do not mimic in situ processes, frequently necessitate that microbial activity in the deep subsurface be inferred through nonmicrobiological analyses of ground water. These approaches include measurements of dissolved H2, which can predict the predominant microbially catalyzed redox reactions in aquifers, as well as geochemical and groundwater flow modeling, which can be used to estimate the rates of microbial processes. Microorganisms recovered from the deep subsurface have the potential to affect the fate of toxic organics and inorganic contaminants in groundwater. Microbial activity also greatly influences the chemistry of many pristine groundwaters and contributes to such phenomena as porosity development in carbonate aquifers, accumulation of undesirably high concentrations of dissolved iron, and production of methane and hydrogen sulfide. Although the last decade has seen a dramatic increase in interest in deep subsurface microbiology, in comparison with the study of

To test whether atypical number development may affect other types of quantity processing, we investigated temporal discrimination in adults with developmental dyscalculia (DD). This also allowed us to test whether (1) number and time may be sub-served by a common quantity system or decision mechanism, in which case they may both be impaired, or (2) number and time are distinct, and therefore may dissociate. Participants judged which of two successively presented horizontal lines was longer in duration, the first line being preceded by either a small or a large number prime (‘1’ or ‘9’) or by a neutral symbol (‘#’), or in a third task decided which of two Arabic numbers (‘1’, ‘5’, or ‘9’) lasted longer. Results showed that (i) DD’s temporal discriminability was normal as long as numbers were not part of the experimental design, even as task-irrelevant stimuli; however, (ii) task-irrelevant numbers dramatically disrupted DD’s temporal discriminability, the more so as their salience increased, though the actual magnitude of the numbers had no effect; and in contrast (iii) controls’ time perception was robust to the presence of numbers but modulated by numerical quantity, such that small number primes or numerical stimuli made durations appear shorter than veridical, and the opposite held for larger number primes or numerical stimuli. This study is the first to investigate a continuous quantity such as time in a population with a congenital number impairment and to show that atypical development of numerical competence leaves continuous quantity processing spared. Our data support the idea of a partially shared quantity system across numerical and temporal dimensions, which allows dissociations and interactions among dimensions; furthermore, they suggest that impaired number in DD is unlikely to originate from systems initially dedicated to continuous quantity processing, such as time.

The contents of this book are: the purpose of the investigation, the system of the investigation, a summary of the investigation, the characteristics of this investigation, how to read the result tables of the prediction investigation, the objects of the investigation, important research and investigation fields, the period of realizable prediction, the causes of obstacles to realization, ways to propel study and development, means of policy, a comparison of identical and similar tasks with the previous investigation, an illustration of the future world in 2025, the results of the investigation on materials, and the results of the investigation on chemistry and process.

In a waste processing device for solidification, pellets formed by condensing radioactive liquid wastes generated from a nuclear power plant are mixed with sodium chloride, sodium hydroxide, or sodium nitrate upon solidification with a solidification agent. In particular, since sodium sulfate in resin-regenerating liquid wastes absorbs water in the cement upon cement solidification and increases in volume by expansion, there is a risk of breaking the cement solidification products. This reaction can be prevented by the addition of sodium chloride and the like. Accordingly, the integrity of the solidification products can be maintained for a long period of time. (T.M.)

Based on the authors’ research, this book introduces the main processing techniques in hyperspectral imaging. In this context, SVM-based classification, distance comparison-based endmember extraction, SVM-based spectral unmixing, spatial attraction model-based sub-pixel mapping, and MAP/POCS-based super-resolution reconstruction are discussed in depth. Readers will gain a comprehensive understanding of these cutting-edge hyperspectral imaging techniques. Researchers and graduate students in fields such as remote sensing, surveying and mapping, geosciences and information systems will benefit from this valuable resource.
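As a minimal illustration of the SVM-based classification discussed in the book, the sketch below trains a support vector classifier on synthetic per-pixel spectra; the band count, class means, and noise level are arbitrary assumptions standing in for a real hyperspectral data cube.

```python
# Sketch: SVM-based per-pixel classification of hyperspectral data, using
# synthetic spectra in place of a real image cube. All numbers here
# (50 bands, two "materials", Gaussian noise) are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_bands, n_per_class = 50, 100

# Two synthetic "materials" with different mean reflectance plus noise
class0 = rng.normal(0.3, 0.05, (n_per_class, n_bands))
class1 = rng.normal(0.6, 0.05, (n_per_class, n_bands))
X = np.vstack([class0, class1])
y = np.array([0] * n_per_class + [1] * n_per_class)

# Train an RBF-kernel support vector classifier on the labelled spectra
clf = SVC(kernel="rbf", C=10.0).fit(X, y)

# Classify a new "pixel" whose spectrum resembles material 1
pixel = rng.normal(0.6, 0.05, (1, n_bands))
print(clf.predict(pixel))
```

In a real workflow the rows of X would be training pixels drawn from a labelled hyperspectral scene, typically after atmospheric correction and band selection.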

The auscultation method is an important diagnostic indicator for hemodynamic anomalies. Heart sound classification and analysis play an important role in auscultative diagnosis. The term phonocardiography refers to the tracing technique of heart sounds and the recording of cardiac acoustic vibrations by means of a microphone-transducer. Understanding the nature and source of this signal is therefore important, as it guides the development of competent tools for further analysis and processing that can enhance and optimize the cardiac clinical diagnostic approach. This book gives the …
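A common first step in heart sound analysis of the kind described above is envelope extraction. The sketch below computes a normalized Shannon-energy envelope of a synthetic two-burst signal standing in for a real phonocardiogram; the sampling rate, burst timing, and smoothing-window length are assumptions for illustration.

```python
# Sketch: normalized Shannon-energy envelope of a phonocardiogram frame,
# a common pre-processing step before segmenting S1/S2 heart sounds.
# The synthetic signal below is an assumption standing in for a real
# microphone-transducer recording.
import numpy as np

fs = 1000                                  # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)

# Two short Gaussian-windowed bursts mimic the S1 and S2 sounds
s = (np.exp(-((t - 0.2) ** 2) / 2e-4) +
     np.exp(-((t - 0.55) ** 2) / 2e-4)) * np.sin(2 * np.pi * 60 * t)

x = s / np.max(np.abs(s))                  # amplitude normalization to [-1, 1]
shannon = -x**2 * np.log(x**2 + 1e-12)     # per-sample Shannon energy

# Smooth with a 50 ms moving average to obtain the envelope
win = np.ones(50) / 50
envelope = np.convolve(shannon, win, mode="same")

print(f"envelope near S1 (~0.20 s): {envelope[200]:.3f}")
print(f"envelope between sounds (~0.40 s): {envelope[400]:.4f}")
```

The envelope peaks around the two bursts and stays near zero between them, which is what a segmentation stage would exploit to locate S1 and S2.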

Two prominent frugal innovation cases are used in this study. One innovation was developed by individuals from the USA, and another was developed by an individual from Gujarat, India. Using effectuation theory, we find that there are some distinct differences between the two categories regarding finance, access to science and technology, the motivation of innovators, the options they have, the actions they take, and related processes. Second, we show how individuals have very different understandings of frugal innovations, as well as of the capacities and resources needed for the development of frugal innovations.

A method for fabricating masks and reticles useful for projection lithography systems. An absorber layer is conventionally patterned using a pattern-and-etch process. Following the patterning step, the entire surface of the remaining top patterning photoresist layer, as well as the portion of an underlying protective photoresist layer where absorber material has been etched away, is exposed to UV radiation. The UV-exposed regions of the protective photoresist layer and the top patterning photoresist layer are then removed by solution development, thereby eliminating the need for an oxygen plasma etch and strip, and with it the chance of damaging the surface of the substrate or its coatings.

The invention relates to a process for separating a given material into two or more parts, in each of which the abundances of the isotopes of a given element differ from the abundances of the isotopes of the same element in the said material. More particularly, the invention relates to a method for the isotopically selective excitation of gas-phase UF6 by infrared photon absorption, followed by selective reaction of the excited UF6 with atomic chlorine, bromine, or iodine to form a product which may be separated by means known in the art.
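As a small numerical aside, the infrared wavenumber targeted in such a scheme converts directly to wavelength and photon energy. The 628 cm^-1 figure below is an assumed, approximate position for the UF6 absorption band addressed in molecular laser isotope separation, used here purely for illustration.

```python
# Sketch: convert an infrared absorption wavenumber to wavelength and photon
# energy via E = h * c * nu_bar. The 628 cm^-1 value is an assumed,
# approximate band position, not a figure taken from the patent itself.
h = 6.626e-34        # Planck constant, J*s
c = 2.998e10         # speed of light, cm/s

wavenumber = 628.0                   # cm^-1 (assumed band position)
wavelength_um = 1e4 / wavenumber     # wavelength in micrometres
energy_J = h * c * wavenumber        # photon energy in joules
energy_meV = energy_J / 1.602e-19 * 1e3

print(f"wavelength ~ {wavelength_um:.1f} um, "
      f"photon energy ~ {energy_meV:.0f} meV")
```

This places the excitation in the ~16 micrometre region, i.e. a photon energy of well under 0.1 eV, which is why the scheme relies on isotopically selective absorption followed by chemical reaction rather than direct photodissociation.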

A silicon etch process wherein an area of a silicon crystal surface is passivated by radiation damage, and a non-planar structure is produced by subsequent anisotropic etching. The surface may be passivated by exposure to an energetic particle flux, for example an ion beam from an arsenic, boron, phosphorus, silicon or hydrogen source, or an electron beam. Radiation damage may be used for pattern definition and/or as an etch stop. Ethylenediamine-pyrocatechol or aqueous potassium hydroxide anisotropic etchants may be used. The radiation damage may be removed after etching by thermal annealing. (author)

An Introduction to Information Processing provides an informal introduction to the computer field. This book introduces computer hardware, which is the actual computing equipment. Organized into three parts encompassing 12 chapters, this book begins with an overview of the evolution of personal computing and includes detailed case studies on two of the most essential personal computers of the 1980s, namely, the IBM Personal Computer and Apple's Macintosh. This text then traces the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters …

Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including computer hardware, computer programs or software, and computer application systems. Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of …