Sample records for sophisticated scientific instruments

Philosophers speak of science in terms of theory and experiment, yet when they speak of the progress of scientific knowledge they speak in terms of theory alone. In this article it is claimed that scientific knowledge consists of, among other things, scientific instruments and instrumental techniques, and not simply of some kind of justified beliefs. It is argued that one aspect of scientific progress can be characterized relatively straightforwardly: the accumulation of new scientific instruments. The development of the cyclotron is taken to illustrate this point. Eight different activities which promoted the successful completion of the cyclotron are recognised. The importance lies in the machine rather than in the experiments which could be run on it, and the focus is on how the cyclotron came into being, not how it was subsequently used. The completed instrument is seen as a useful unit of scientific progress in its own right. (UK)

The software environment RSYST was originally used to solve problems of reactor physics. The consideration of advanced scientific simulation requirements and the strict application of modern software design principles led to a system which is well suited to solving problems in various complex scientific problem domains. Starting with a review of the early days of RSYST, we describe the steady evolution driven by the need for a software environment which combines the advantages of a high-performance database system with the capability to integrate sophisticated scientific-technical applications. The RSYST architecture is presented and the data modelling capabilities are described. To demonstrate the powerful possibilities and flexibility of the RSYST environment, we describe a wide range of RSYST applications, e.g., mechanical simulations of multibody systems, which are used in biomechanical research, civil engineering and robotics. In addition, a hypermedia system which is used for scientific-technical training and documentation is presented. (orig.)

In recent years Colombia has increased the use of nuclear techniques, instruments and equipment in ambitious health programs, as well as in research centers, industry and education; this has resulted in numerous maintenance problems. As an alternative solution, IAN has established a Central Maintenance Laboratory for nuclear instruments within an International Atomic Energy Agency program for eight Latin American and nine Asian countries. Established strategies and some results are detailed in this paper.

One of the most obvious ways in which the natural sciences depend on technology is through the use of instruments. This chapter presents a philosophical analysis of the role of technological instruments in science. Two roles of technological instruments in scientific practices are distinguished:

A sophisticated player is an individual who takes the actions of the opponents, in a strategic situation, as determined by the decisions of rational opponents, and acts accordingly. A sophisticated agent is rational in the choice of his action, but ignores the fact that he is part of a strategic

The primary measurement output from the Thermo Scientific Ozone Analyzer is the concentration of the analyte (O3) reported at 1-s resolution in units of ppbv in ambient air. Note that because of internal pneumatic switching limitations the instrument only makes an independent measurement every 4 seconds. Thus, the same concentration number is repeated roughly 4 times at the uniform, monotonic 1-s time base used in the AOS systems. Accompanying instrument outputs include sample temperatures, flows, chamber pressure, lamp intensities and a multiplicity of housekeeping information. There is also a field for operator comments made at any time while data is being collected.
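Because the analyzer repeats each independent 4-s measurement about four times on the 1-s time base, downstream analysis often wants one record per true measurement. A minimal sketch of that reduction, with hypothetical field layout; it assumes a run of identical consecutive values corresponds to one measurement, whereas a real pipeline would instead key on the instrument's 4-s switching cycle:

```python
def independent_measurements(times, ppbv):
    """Keep only the first sample of each run of identical readings.
    Caveat: two adjacent true measurements with exactly equal values
    would be merged, which is why keying on the switching cycle is safer."""
    out = []
    prev = None
    for t, c in zip(times, ppbv):
        if c != prev:
            out.append((t, c))
            prev = c
    return out

# Eight 1-s samples covering two independent 4-s measurements
times = [0, 1, 2, 3, 4, 5, 6, 7]
ppbv = [31.2, 31.2, 31.2, 31.2, 30.8, 30.8, 30.8, 30.8]
print(independent_measurements(times, ppbv))  # [(0, 31.2), (4, 30.8)]
```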

Nanotechnology has reached a level where almost every new development, and even every new product, uses features of the nanoscopic properties of materials. As a consequence, an enormous number of scientific instruments are used to synthesize and analyze new structures and materials. Due to the surface sensitivity of such materials, many of these instruments require ultrahigh vacuum (UHV) that has to be provided under extreme conditions such as very high voltages. In this book, Yoshimura provides a review of UHV-related developments during the last decades. His very broad design experience enables him to present this detailed reference. After a general description of how to design UHV systems, he covers all important issues in detail, such as pumps, outgassing, gauges, and electrodes for high voltages. Thus, this book serves as a reference for everybody using UHV in their scientific equipment.

NASA has chosen to explore (16) Psyche with its 14th Discovery-class mission. Psyche is a 226-km-diameter metallic asteroid hypothesized to be the exposed core of a planetesimal that was stripped of its rocky mantle by multiple hit-and-run collisions in the early solar system. The spacecraft launch is planned for 2022, with arrival at the asteroid in 2026 for 21 months of operations. The Psyche investigation has five primary scientific objectives: A. Determine whether Psyche is a core, or if it is unmelted material. B. Determine the relative ages of regions of Psyche's surface. C. Determine whether small metal bodies incorporate the same light elements as are expected in the Earth's high-pressure core. D. Determine whether Psyche was formed under conditions more oxidizing or more reducing than Earth's core. E. Characterize Psyche's topography. The mission's task was to select the appropriate instruments to meet these objectives. However, exploring a metal world, rather than one made of ice, rock, or gas, requires the development of new scientific models for Psyche to support the selection of the appropriate instruments for the payload. If Psyche is indeed a planetary core, we expect that it should have a detectable magnetic field. However, the strength of the magnetic field can vary by orders of magnitude depending on the formational history of Psyche. The implications of both the extreme low-end and high-end predictions impact the magnetometer and mission design. For the imaging experiment, what can the team expect for the morphology of a heavily impacted metal body? Efforts are underway to further investigate the differences in crater morphology between high-velocity impacts into metal and rock, to be prepared to interpret the images of Psyche when they are returned. Finally, elemental composition measurements at Psyche using nuclear spectroscopy encompass a new and unexplored phase space of gamma-ray and neutron measurements. We will present some end

The proceedings of the Scientific Meeting on Nuclear Instrumentation Engineering, held on Nov. 30, 2010 by the Centre for Nuclear Instrumentation Engineering - National Nuclear Energy Agency. The proceedings contain 40 papers on nuclear instrumentation engineering for industry, the environment, and nuclear facilities. (PPIKSN)

The content-addressable-memory feature of a new system designed in these laboratories for non-destructive testing of nuclear reactor pressure vessels based on acoustic emission analysis is presented. The content addressable memory is divided into two parts: the first selects the most frequent events among incoming ones (FES: Frequent Event Selection memory), the second stores the frequent events singled out (FEM: Frequent Event Memory). The statistical behaviour of FES is analyzed, and experimental results are compared with theoretical ones; the model presented proved to be a useful tool in dimensioning the instrument store capacity. (Auth.)
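The FES stage, which retains the most frequent event types in a bounded store, can be illustrated in software by a standard streaming-frequency algorithm. This is only an illustrative analogue of the idea, not the actual hardware design described above: a Misra-Gries summary keeps candidates for the most frequent events using a fixed number of counters.

```python
from collections import Counter

def frequent_event_selection(events, capacity):
    """Misra-Gries summary: track candidates for the most frequent event
    types using at most `capacity` counters. In the paper's terms, this
    plays the role of the FES stage; the survivors would then be stored
    by the FEM stage. Counts are lower bounds, not exact frequencies."""
    counters = Counter()
    for e in events:
        if e in counters or len(counters) < capacity:
            counters[e] += 1
        else:
            # No free counter: decrement all, dropping any that reach zero
            for k in list(counters):
                counters[k] -= 1
                if counters[k] == 0:
                    del counters[k]
    return dict(counters)

stream = ["a"] * 6 + ["b"] * 4 + ["c", "d", "e"]
print(frequent_event_selection(stream, capacity=2))  # {'a': 3, 'b': 1}
```

The guarantee is that any event occurring more than n/(capacity+1) times in a stream of length n survives in the summary, which is the sense in which it "selects the most frequent events among incoming ones".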

This dissertation describes the research concerning the construction of a new educational and scientific instrument. This instrument, Math Garden, is a web application in which children can practice arithmetic by playing math games in which items are tailored to their ability level. At the same

The purpose of this study was to develop and validate two survey instruments to evaluate high school students' scientific epistemic beliefs and goal orientations in learning science. The initial relationships between the sampled students' scientific epistemic beliefs and goal orientations in learning science were also investigated. A final valid sample of 600 volunteer Taiwanese high school students participated in this survey by responding to the Scientific Epistemic Beliefs Instrument (SEBI) and the Goal Orientations in Learning Science Instrument (GOLSI). Through both exploratory and confirmatory factor analyses, the SEBI and GOLSI were proven to be valid and reliable for assessing the participants' scientific epistemic beliefs and goal orientations in learning science. The path analysis results indicated that, by and large, the students with more sophisticated epistemic beliefs in various dimensions such as Development of Knowledge, Justification for Knowing, and Purpose of Knowing tended to adopt both Mastery-approach and Mastery-avoidance goals. Some interesting results were also found. For example, the students tended to set a learning goal to outperform others or merely demonstrate competence (Performance-approach) if they had more informed epistemic beliefs in the dimensions of Multiplicity of Knowledge, Uncertainty of Knowledge, and Purpose of Knowing.

A technical description is presented of the low-energy ion and electron (LION) instrument on the SOHO spacecraft, and its scientific goals are discussed. LION forms part of the comprehensive suprathermal and energetic particle analyzer (COSTEP), which is, in turn, a subset of the COSTEP/ERNE particle analyser collaboration (CEPAC).

Microgravity is a unique environment for materials and biotechnology processing. Microgravity minimizes or eliminates some of the effects that occur in one g. This can lead to the production of new materials or crystal structures. It is important to understand the processes that create these new materials. Thus, experiments are designed so that optical data collection can take place during the formation of the material. This presentation will discuss scientific application of optical instruments at MSFC. These instruments include a near-field scanning optical microscope, a miniaturized holographic system, and a phase-shifting interferometer.

Applications of some of the new forms of carbon in the development of scientific instrumentation are discussed in this work. A short review is presented of the remaining technical problems in the application of diamond thin films as active semiconductor elements; heat sinks; X-ray, UV and particle detectors; surface acoustic wave devices, etc. Some advances in the improvement of the surface quality and textural design of diamond films are also presented. On the other hand, the possible implications of carbon nanotubes for scientific instrumentation are also discussed, mainly for the development of electronic nanodevices. Finally, other promising applications of carbon nanotubes as nanotips for atomic force and scanning tunneling microscopes, as well as nano-host structures for the synthesis of metallic carbide and nitride nanowires, are also presented. (Author)

... Scientific Instruments Pursuant to Section 6(c) of the Educational, Scientific and Cultural Materials... invite comments on the question of whether instruments of equivalent scientific value, for the purposes... conformational change of assemblies involved in biological processes such as ATP production, signal transduction...

... Scientific Instruments Pursuant to Section 6(c) of the Educational, Scientific and Cultural Materials... invite comments on the question of whether instruments of equivalent scientific value, for the purposes... materials for energy production. The experiments will involve structural and chemical analyses of materials...

Discusses the thinking of the Greek Sophist philosophers, particularly Gorgias and Protagoras, and their importance and relevance for contemporary English instructors. Considers the problem of language as signs of reality in the context of Sophist philosophy. (HB)

The purpose of this study is to develop a scientific literacy evaluation instrument, testing its validity, reliability, and characteristics, to measure students' scientific literacy skills using four scientific literacy categories, as follows: science as a body of knowledge (category A), science as a way of thinking (category B), science as a…

The brand image is fundamental for a scientific institution. In a highly competitive market the brand may have a significant impact on students' decisions, but also on the decisions made by other stakeholders. The usage of online tools in creating brands and brand awareness is standard nowadays and is present in many business strategies. The aim of the article is to present the problem on the example of scientific institutions. Based on a literature review and an empirical study, the authors describe the usage of the most popular online tools in creating scientific institutions' brands.

Fjaestad, M. [Royal Inst. of Tech., Stockholm (Sweden). Dept. of History of Science and Technology

2001-01-01

The first Swedish reactor R1, constructed at the Royal Inst. of Technology in Stockholm, went critical in July 1954. This report presents historical aspects of the reactor, in particular the reactor as a research instrument and a centre for physical science. The tensions between its role as a prototype and a step in the development of power reactors and its role as a scientific instrument are given particular focus.

The instrumental concerto is one of the oldest genres of European instrumental music, whose history began about three centuries ago and continues at present, demonstrating an extraordinary variety of forms and diversity of variants of its treatment. Approaching the concerto genre, researchers try to establish the main features characteristic of this genre, to reveal its nature and to formulate a „pattern” or genetic canon of the concerto. The article analyses the works of many musicologists from different countries dedicated to the historical-stylistic evolution of the concerto genre and to the study of the most relevant samples, from their origin (the 16th-17th centuries) up to the present time. In this context, of special interest are the studies by A. Veinus, M. Roeder, M. Steinberg, S. Keefe, I. Grebneva, E. Dolinscaia and other musicologists. The article was written on the basis of the materials of the doctoral thesis „Concerto for Viola and Orchestra: Methods of Study”, worked out under the supervision of the late scientist and university professor Vladimir Axionov.

... invite comments on the question of whether instruments of equivalent scientific value, for the purposes...: Projekt Messtechnik, Germany. Intended Use: The SPSx will be used to monitor the water-solid interaction... instrument monitors water-solid interactions by taking gravimetric measurement of samples continuously using...

Recent improvements in industrial vision technology and products, together with the increasing need for high-performance, cost-efficient technical detectors for astronomical instrumentation, have led ESO, with the contribution of INAF, to evaluate this trend and elaborate ad-hoc solutions which are interoperable and compatible with the evolution of VLT standards. The ESPRESSO spectrograph shall be the first instrument deploying this technology. ESO's Technical CCD (hereafter TCCD) requirements are extensive and demanding. A lightweight, low-maintenance, rugged and high-performance TCCD camera product, or family of products, is required which can operate in the extreme environmental conditions present at ESO's observatories with minimum maintenance and minimal downtime. In addition, the camera solution needs to be interchangeable between different technical roles, e.g. slit viewing, pupil and field stabilization, with excellent performance characteristics under a wide range of observing conditions together with ease of use for the end user. Interoperability is enhanced by conformance to recognized electrical, mechanical and software standards. Technical requirements and evaluation criteria for the TCCD solution are discussed in more detail. A software architecture has been adopted which facilitates easy integration with TCCDs from different vendors. The communication with the devices is implemented by means of dedicated adapters, allowing usage of the same core framework (business logic). Preference has been given to cameras with an Ethernet interface, using standard TCP/IP based communication. While the preferred protocol is the industrial standard GigE Vision, not all vendors supply cameras with this interface, hence proprietary socket-based protocols are also acceptable with the provision of a validated Linux-compliant API. A fundamental requirement of the TCCD software is that it shall allow for a seamless integration with the existing VLT software framework
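The dedicated-adapter architecture described above can be sketched as follows. Class and method names here are hypothetical illustrations of the pattern (one vendor-neutral interface that the core framework programs against, one concrete adapter per camera protocol), not ESO's actual API:

```python
from abc import ABC, abstractmethod

class TCCDAdapter(ABC):
    """Vendor-neutral interface the core framework (business logic)
    depends on; each camera protocol supplies its own concrete adapter."""
    @abstractmethod
    def connect(self, address: str) -> None: ...
    @abstractmethod
    def acquire_frame(self) -> bytes: ...

class GigEVisionAdapter(TCCDAdapter):
    """Adapter for cameras speaking the industrial GigE Vision protocol."""
    def connect(self, address: str) -> None:
        self.address = address  # would open the GigE Vision control channel

    def acquire_frame(self) -> bytes:
        return b"\x00" * 16  # placeholder frame payload

class ProprietarySocketAdapter(TCCDAdapter):
    """Adapter wrapping a vendor's validated Linux API over raw sockets."""
    def connect(self, address: str) -> None:
        self.address = address

    def acquire_frame(self) -> bytes:
        return b"\x00" * 16

def grab(adapter: TCCDAdapter, address: str) -> bytes:
    """Core-framework code: identical for every vendor."""
    adapter.connect(address)
    return adapter.acquire_frame()
```

The design choice is that only the adapters know protocol details; swapping a camera vendor means registering a different `TCCDAdapter` subclass while the business logic stays unchanged.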

Recent instruments developed to fulfill radiation monitoring needs at Los Alamos Scientific Laboratory are described. Laboratory instruments that measure tritium gas effluents alone, or in the presence of activated air from D-T fusion reactors are discussed. Fully portable systems for gamma, x-ray, and alpha analyses in the field are described. Also included are descriptions of survey instruments that measure low levels of transuranic contaminants and that measure pulsed-neutron dose rates

Machina & Schmeidler (Econometrica, 60, 1992) gave preference conditions for probabilistic sophistication, i.e. decision making where uncertainty can be expressed in terms of (subjective) probabilities without commitment to expected utility maximization. This note shows that simpler and more general

Reviews the history of intellectual views on the Greek sophists in three phases: (1) their disparagement by Plato and Aristotle as the morally disgraceful "other"; (2) nineteenth century British positivists' reappraisal of these relativists as ethically and scientifically superior; and (3) twentieth century versions of the sophists as…

Architectural stress is the inability of a system design to respond to new market demands. It is an important yet often concealed issue in high-tech systems. In From scientific instrument to industrial machine, we look at the phenomenon of architectural stress in embedded systems in the context of a transmission electron microscope system built by FEI Company. Traditionally, transmission electron microscopes are manually operated scientific instruments, but they also have enormous potential for use in industrial applications. However, this new market has quite different characteristics. There are strong demands for cost-effective analysis, accurate and precise measurements, and ease of use. These demands can be translated into new system qualities, e.g. reliability, predictability and high throughput, as well as new functions, e.g. automation of electron microscopic analyses and automated focusing and positioning functions. From scientific instrument to industrial machine takes a pragmatic approach to the problem…

The Los Alamos Scientific Laboratory is developing assay instrumentation for the quantitative analysis of transuranic materials found in bulk solid wastes generated by Department of Energy facilities and by the commercial nuclear power industry. This also includes wastes generated in the decontamination and decommissioning of facilities and wastes generated during burial ground exhumation. The assay instrumentation will have a detection capability for the transuranics of less than 10 nCi of activity per gram of waste whenever practicable.

While genetics has remained as one key topic in school science, it continues to be conceptually and linguistically difficult for students with the concomitant debates as to what should be taught in the age of biotechnology. This article documents the development and implementation of a two-tier multiple-choice instrument for diagnosing grades 10 and 12 students' understanding of genetics in terms of reasoning. The pretest and posttest forms of the diagnostic instrument were used alongside other methods in evaluating students' understanding of genetics in a case-based qualitative study on teaching and learning with multiple representations in three Western Australian secondary schools. Previous studies have shown that a two-tier diagnostic instrument is useful in probing students' understanding or misunderstanding of scientific concepts and ideas. The diagnostic instrument in this study was designed and then progressively refined, improved, and implemented to evaluate student understanding of genetics in three case schools. The final version of the instrument had Cronbach's alpha reliability of 0.75 and 0.64, respectively, for its pretest and the posttest forms when it was administered to a group of grade 12 students (n = 17). This two-tier diagnostic instrument complemented other qualitative data collection methods in this research in generating a more holistic picture of student conceptual learning of genetics in terms of scientific reasoning. Implications of the findings of this study using the diagnostic instrument are discussed.
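The reliability figures quoted above (0.75 and 0.64) are Cronbach's alpha, which relates the sum of the item variances to the variance of the total score: alpha = k/(k-1) * (1 - sum of item variances / variance of totals). A minimal computation, using population variances; real analyses would typically use a statistics package:

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for internal consistency.
    scores: list of respondents, each a list of k item scores."""
    k = len(scores[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Perfectly correlated items yield alpha = 1.0
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # 1.0
```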

This paper discusses the scientific instruments made and used by the microscopist Antony van Leeuwenhoek (1632–1723). The immediate cause of our study was the discovery of an overlooked document from the Delft archive: an inventory of the possessions that were left in 1745 after the death of

Scientific realism holds that scientific theories are approximations of universal truths about reality, whereas scientific instrumentalism posits that scientific theories are intellectual structures that provide adequate predictions of what is observed and useful frameworks for answering questions and solving problems in a given domain. These philosophical perspectives have different strengths and weaknesses and have been regarded as incommensurate: Scientific realism fosters theoretical rigor, verifiability, parsimony, and debate, whereas scientific instrumentalism fosters theoretical innovation, synthesis, generativeness, and scope. The authors review the evolution of scientific realism and instrumentalism in psychology and propose that the categorical distinction between the two is overstated as a prescription for scientific practice. The authors propose that the iterative deployment of these two perspectives, just as the iterative application of inductive and deductive reasoning in science, may promote more rigorous, integrative, cumulative, and useful scientific theories.

We study the relationship between the cognitive load manipulation and strategic sophistication. The cognitive load manipulation is designed to reduce the subject's cognitive resources that are available for deliberation on a choice. In our experiment, subjects are placed under a large cognitive load (given a difficult number to remember) or a low cognitive load (given a number which is not difficult to remember). Subsequently, the subjects play a one-shot game, and then they are asked to recall...

Remote sensing instruments on today's space missions deliver a large amount of data, which is typically evaluated on the ground. Especially for deep space missions the telemetry downlink is very limited, which creates the need for scientific evaluation, and thereby a reduction of data volume, already on board the spacecraft. A demanding example is the Polarimetric and Helioseismic Imager (PHI) instrument on Solar Orbiter. To enable on-board offline processing for data reduction, the instrument has to be equipped with a high-capacity memory module. The module is based on non-volatile NAND flash technology, which requires more advanced operation than volatile DRAM. Unlike classical mass memories, the module is integrated into the instrument and allows readback of data for processing. The architecture and safe operation of such a memory module are described in this paper.

This article develops a conceptualization and measure of cognitive health sophistication--the complexity of an individual's conceptual knowledge about health. Study 1 provides initial validity evidence for the measure--the Healthy-Unhealthy Other Instrument--by showing its association with other cognitive health constructs indicative of higher health sophistication. Study 2 presents data from a sample of low-income adults to provide evidence that the measure does not depend heavily on health-related vocabulary or ethnicity. Results from both studies suggest that the Healthy-Unhealthy Other Instrument can be used to capture variability in the sophistication or complexity of an individual's health-related schematic structures on the basis of responses to two simple open-ended questions. Methodological advantages of the Healthy-Unhealthy Other Instrument and suggestions for future research are highlighted in the discussion.

This research describes the design of an instrument for affective assessment in English language teaching. The focus of the design was only an observation sheet to be used by English teachers during the teaching and learning process. The instrument was designed based on the scientific approach, which has five stages, namely observing, questioning, experimenting, associating, and communicating. In the design process, the ADDIE model was used as the research method. The design took into account the gap between current practice and the teachers' needs. The result showed that the design also attended to the affective taxonomy of receiving, responding, valuing, organization, and characterization. Three key phrases were used as indicators of the five levels of the affective taxonomy: seriously, voluntarily, and without being asked by the teacher. Furthermore, eighteen affective traits, such as religiosity, honesty, responsibility, discipline, hard work, self-confidence, logical thinking, critical thinking, creativity, innovativeness, independence, curiosity, love of knowledge, respect, politeness, democracy, emotional intelligence, and pluralism, were assigned to each stage of the scientific approach. It is hoped that the instrument can be implemented in all contexts of English language teaching at schools and can assess students' affective development comprehensively.

The art of the direct method of Liapunov for determining system stability is to construct a suitable Liapunov or V function, where V is to be positive definite (PD), shrinking to a center which may be conveniently chosen as the origin, and where its time derivative dV/dt along trajectories is to be negative definite (ND). One aid to the art is to solve an approximation to the system equations in order to provide a candidate V function. It can happen, however, that dV/dt is not strictly ND but vanishes at a finite number of isolated points. Naively, one anticipates that stability has been demonstrated, since the trajectory of the system at such points is only momentarily tangential and immediately enters a region of inward-directed trajectories. To demonstrate stability rigorously requires the construction of a sophisticated Liapunov function from what can be called the naive original choice. In this paper, the authors demonstrate the method of perturbing the naive function in the context of the well-known second-order oscillator and then apply the method to a more complicated problem based on a prompt jump model for a nuclear fission reactor
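For the second-order oscillator the perturbation step can be made concrete. The following is a standard textbook sketch of the technique, not necessarily the authors' specific construction:

```latex
% Damped oscillator \ddot{x} + \dot{x} + x = 0 as a first-order system:
\dot{x}_1 = x_2, \qquad \dot{x}_2 = -x_1 - x_2 .
% Naive choice: V is positive definite, but its derivative is only
% negative semidefinite, vanishing on the whole line x_2 = 0:
V = \tfrac{1}{2}\left(x_1^2 + x_2^2\right), \qquad
\dot{V} = x_1 x_2 + x_2(-x_1 - x_2) = -x_2^2 .
% Perturbing V with a small cross term removes the degeneracy:
V_\varepsilon = \tfrac{1}{2}\left(x_1^2 + x_2^2\right)
              + \varepsilon x_1 x_2 , \qquad
\dot{V}_\varepsilon = -\varepsilon x_1^2 - \varepsilon x_1 x_2
                      - (1 - \varepsilon) x_2^2 .
% \dot{V}_\varepsilon is strictly negative definite for
% 0 < \varepsilon < 4/5 (discriminant condition
% \varepsilon(1-\varepsilon) > \varepsilon^2/4), while
% V_\varepsilon stays positive definite for \varepsilon < 1.
```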

A thermal control system is being developed for scientific instruments placed on the lunar surface. This thermal control system, the Lunar Mission Survival Module (MSM), was designed for scientific instruments that are planned to be operated for over a year in the future Japanese lunar landing mission SELENE-2. For long-term operations, the lunar surface is a severe environment, because the soil (regolith) temperature varies widely, from approximately -200 degC at night to 100 degC in the daytime, in which space electronics can hardly survive. The MSM has a tent of multi-layered insulators and uses a "regolith mound". The temperature of internal devices is less variable, just as in the lunar underground layers. The insulators retain heat in the regolith soil in the daylight, and it can keep the devices warm at night. We conducted the concept design of the lunar survival module and estimated its potential with a thermal mathematical model, on the assumption of using a lunar seismometer designed for SELENE-2. Thermal vacuum tests were also conducted using a thermal evaluation model in order to estimate the validity of some thermal parameters assumed in the computed thermal model. The numerical and experimental results indicated a sufficient survivability potential of the concept of our thermal control system.
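The claim that an insulated module tracks a heavily damped version of the surface temperature swing can be illustrated with a lumped-capacitance model. All numbers here (thermal time constant, temperature amplitude, starting temperature) are assumed purely for illustration and are not SELENE-2 design values:

```python
import math

def simulate(days=3, dt=60.0, tau_hours=200.0):
    """Euler integration of dT/dt = (T_regolith - T) / tau for a single
    device node coupled to the regolith through insulation. tau is an
    assumed thermal time constant, not a measured one."""
    period = 29.5 * 24 * 3600  # lunar synodic day, seconds
    tau = tau_hours * 3600.0
    T = 20.0                   # assumed device start temperature, degC
    t, hist = 0.0, []
    while t < days * 24 * 3600:
        # Idealized sinusoidal regolith temperature, degC
        T_reg = -50.0 + 150.0 * math.sin(2 * math.pi * t / period)
        T += dt * (T_reg - T) / tau
        hist.append(T)
        t += dt
    return min(hist), max(hist)

lo, hi = simulate()
# The insulated node's swing is far smaller than the surface swing
print(lo, hi)
```

The point of the sketch is qualitative: with a time constant long compared to the forcing, the internal temperature excursion shrinks drastically, which is the effect the regolith mound and insulation tent exploit.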

A thermal control system is being developed for scientificinstruments placed on the lunar surface. This thermal control system, Lunar Mission Survival Module (MSM), was designed for scientificinstruments that are planned to be operated for over a year in the future Japanese lunar landing mission SELENE-2. For the long-term operations, the lunar surface is a severe environment because the soil (regolith) temperature varies widely from nighttime −200 degC to daytime 100 degC approximately in which space electronics can hardly survive. The MSM has a tent of multi-layered insulators and performs a “regolith mound”. Temperature of internal devices is less variable just like in the lunar underground layers. The insulators retain heat in the regolith soil in the daylight, and it can keep the device warm in the night. We conducted the concept design of the lunar survival module, and estimated its potential by a thermal mathematical model on the assumption of using a lunar seismometer designed for SELENE-2. Thermal vacuum tests were also conducted by using a thermal evaluation model in order to estimate the validity of some thermal parameters assumed in the computed thermal model. The numerical and experimental results indicated a sufficient survivability potential of the concept of our thermal control system.

Investigation and development of new techniques for instrumented surgery of the spine is not free of conflicts of interest. The influence of financial forces on the development of new technologies, and their immediate application to spine surgery, shows the relationship between published results and industry support. Even authors who have eagerly defended fusion techniques have been shown to be deeply involved in the review of new articles for publication and in the approval process of new spinal technologies. When we analyze the published results of spine surgery, we must bear in mind what has been called on the American stock exchange "the bubble of spine surgery". The scientific literature shows no clear evidence, in cost-benefit studies, that most instrumented surgical interventions of the spine are superior to conservative treatment. It has not yet been demonstrated that fusion surgery and disc replacement are better options than conservative treatment. It should be pointed out that, at present, "there are relationships between industry and back pain, and there is also an industry of back pain". Nonetheless, the "market of spine surgery" keeps growing because patients demand solutions for their back problems. The tide of scientific evidence seems to run against spinal fusion for degenerative disc disease, discogenic pain and nonspecific back pain. After decades of advances in this field, the results of spinal fusion remain mediocre. New epidemiological studies show that "spinal fusion must be regarded as an unproven or experimental method for the treatment of back pain". The surgical literature on spinal fusion published in the last 20 years, reviewed following the Cochrane method, establishes that: (1) it is at best incomplete, unreliable and careless; (2) instrumentation appears to slightly increase the fusion rate; (3) instrumentation does not, in general, improve clinical results, lacking

The use of the scientific inquiry method of teaching science was investigated in one district's elementary schools. The study generated data directly from Albuquerque Public Schools fourth- and fifth-grade teachers through a mail-out survey and through observation. Two forms of an inquiry evaluation research instrument (Elementary Science Inquiry Survey - ESIS) were created. The ESIS-A is a classroom observation tool. The ESIS-B is a survey questionnaire designed to collect information from teachers. The study was designed first to establish reliability and validity for both forms of the instrument. The study made use of multiple regression and exploratory factor analysis. Sources used to establish the instruments' reliability and validity included: (1) input from an international panel (qualitative analysis of comments sent by raters and quantitative analysis of numerical ratings sent by raters); (2) Cronbach's alpha; (3) results of factor analysis; (4) survey respondents' comments (qualitative analysis); (5) teacher observation data. Cronbach's alpha for the data set was 0.8955. Inquiry practices were reported to occur between twice and three times per week. Teachers' comments regarding inquiry were reported. The ESIS was used to collect inquiry self-report data and teacher background data. The teacher background data included teacher science knowledge and information about their standards awareness and implementation. The following teacher knowledge factors were positively correlated with inquiry use: semesters of college science, science workshops taken, having conducted scientific research, and SIMSE (NSF institute) participation. The following standards awareness and implementation factors were positively correlated with inquiry use: familiarity with the National Science Education Standards, familiarity with New Mexico science standards, state or national standards as a curriculum selection factor, student interest as a curriculum selection factor, and "no

The MICROSCOPE mission is fully dedicated to the in-orbit test of the universality of free fall, the so-called Weak Equivalence Principle (WEP). Based on a CNES Myriade microsatellite launched on 25 April 2016, MICROSCOPE is a CNES-ESA-ONERA-CNRS-OCA mission whose scientific objective is to test the Equivalence Principle with an extraordinary accuracy, at the level of 10^-15. The measurement will be obtained from the T-SAGE (Twin Space Accelerometer for Gravitational Experimentation) instrument, constituted by two ultrasensitive differential accelerometers. One differential electrostatic accelerometer, labeled SU-EP, contains at its center two proof masses made of titanium and platinum and is used for the test. The twin accelerometer, labeled SU-REF, contains two platinum proof masses and is used as a reference instrument. Separated by a 17 cm arm, they are mounted in a very stable and soft environment on board a satellite equipped with a drag-free control system, orbiting on a sun-synchronous circular orbit at 710 km above the Earth. In addition to the WEP test, this configuration can be interesting for various applications, and one of the proposed ideas is to use MICROSCOPE data for the measurement of Earth's gravitational gradient. Considering the gradiometer formed by the inner platinum proof masses of the two differential accelerometers and the arm along the Y-axis of the instrument, which is perpendicular to the orbital plane, 3 components of the gradient can possibly be measured: Txy, Tyy and Tzy. Preliminary studies suggest that the errors can be lower than 10 mE. Taking advantage of its higher altitude with respect to GOCE, the low-frequency signature of Earth's potential seen by MICROSCOPE could provide an additional observable in gradiometry to discriminate between different models describing the large scales of the mass distribution in the Earth's deep mantle. The poster will briefly present the MICROSCOPE mission.
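
To first order, the gradiometry idea amounts to dividing the differential acceleration between the two proof masses by their baseline, T_ij ~ delta_a_i / d. The sketch below (illustrative numbers, not mission data) shows the conversion to eotvos units and an order-of-magnitude check of the vertical gradient at 710 km altitude.

```python
EOTVOS = 1e-9  # 1 E = 1e-9 s^-2

def gradient_component(delta_a, baseline=0.17):
    """Differential acceleration [m/s^2] over a baseline [m] -> gradient [E]."""
    return (delta_a / baseline) / EOTVOS

# Order-of-magnitude check: the dominant vertical gradient of Earth's
# field at orbital radius r is roughly 2*GM/r^3.
GM = 3.986004418e14          # Earth's gravitational parameter [m^3/s^2]
r = 6371e3 + 710e3           # orbital radius [m] (mean Earth radius + altitude)
Tzz = 2 * GM / r**3 / EOTVOS
print(round(Tzz))            # on the order of a few thousand eotvos
```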

Rover-based 2012 Moon and Mars Analog Mission Activities (MMAMA) were recently completed on Mauna Kea Volcano, Hawaii. Scientific investigations, scientific input, and operational constraints were tested in the context of existing projects and protocols for the field activities designed to help NASA achieve the Vision for Space Exploration [1]. Several investigations were conducted with the rover-mounted instruments to determine key geophysical and geochemical properties of the site, as well as to capture the geological context of the area and of the samples investigated. The rover traverse and associated science investigations were conducted over a three-day period on the southeast flank of Mauna Kea Volcano, Hawaii. The test area was at an elevation of 11,500 feet and is known as "Apollo Valley" (Fig. 1). Here we report the integration and operation of the rover-mounted instruments, as well as the scientific investigations that were conducted.

From earliest pre-history, with the dawning understanding of fire and its many uses, up to the astonishing advances of the twenty-first century, Thomas Crump traces the ever more sophisticated means employed in our attempts to understand the universe. The result is a vigorous and readable account of how our curious nature has continually pushed forward the frontiers of science and, as a consequence, human civilization.

... Technology, 771 Ferst Drive, NW., School of Materials Science and Engineering, Atlanta, GA 30332-0245... components of the instrument are necessary to elicit information from core-shell nanoparticles. Justification... enhanced by extending the resolution using phase-plate technology with this instrument. The instrument is...

We assess the predictive accuracies of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 444 multivariate models that differ in their specification. In addition to investigating the value of model sophistication in terms of dollar losses directly, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performances.
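
The dollar-loss criterion can be illustrated with a minimal sketch: price the same option under competing volatility forecasts and measure the absolute pricing error in dollars against an observed market price. Black-Scholes stands in here for the pricing map, and all numbers (spot, strike, rates, volatilities) are hypothetical.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

S, K, T, r = 100.0, 100.0, 0.25, 0.02
market_price = bs_call(S, K, T, r, 0.20)   # pretend the market prices at 20% vol

# Two competing volatility forecasts (illustrative "models"):
forecasts = {"model_A": 0.19, "model_B": 0.30}
dollar_losses = {name: abs(bs_call(S, K, T, r, sig) - market_price)
                 for name, sig in forecasts.items()}
best = min(dollar_losses, key=dollar_losses.get)
print(best, dollar_losses)   # the forecast closer to 20% vol loses fewer dollars
```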

Applications of some of the new forms of carbon in the development of scientific instrumentation are discussed in this work. A short review is presented of the remaining technical problems in the application of diamond thin films as active semiconductor elements; heat sinks; X-ray, UV and particle detectors; surface acoustic wave devices, etc. Some advances in the improvement of the surface quality and textural design of diamond films are also presented. On the other hand, the possible implications of carbon nanotubes for scientific instrumentation are also discussed, mainly for the development of electronic nanodevices. Finally, other promising applications of carbon nanotubes as nanotips for atomic force and scanning tunneling microscopes, as well as nano-host structures for the synthesis of metallic carbide and nitride nanowires, are also presented. (Author)

.... Manufacturer: FEI Company, The Netherlands. Intended Use: The instrument will be used for NIH-funded basic... Applied Life Sciences, Austria. Intended Use: The instrument is a highly specialized system for studying a wide range of materials used in very high cycle, high temperature applications, such as light metals...

... the cells, proteins of the tissues and metallic nanostructures will be analyzed with the instrument... humans, and provide new avenues toward therapies for myelin repair and prevention of axonal damage after...

.... In particular, the instrument will be applied during the last step of the synthesis--the calcination... study high temperature combustion occurring in a rotary type engine called a ``wave disk engine.'' The...

...: The instrument will be used to study biomaterials such as starches, lignin, and proteins, and compare... as transition metals and examining their chemical states and chemical reactivity before and after...

... resolution and scanning mode will enable the investigation of the chemical structure, morphology and... study the crystal structure, defect characteristics, and elemental distribution/ segregation of single crystals, interfacial voids, polymers, and composites. The instrument will also be used for the...

... before March 21, 2011. Address written comments to Statutory Import Programs Staff, Room 3720, U.S... instrument will be used to examine tissue specimens to identify and characterize pathologic tissue changes...

This second chapter on instrumentation gives a few general considerations on the history and classification of instrumentation, followed by two specific states of the art. The first one concerns NMR (block diagram of the instrumentation chain, with details on the magnets, gradients, probes and reception unit). The second one concerns precision instrumentation (optical fibre gyrometer and scanning electron microscope) and its data processing tools (programmability, the VXI standard and its history). The chapter ends with future trends on smart sensors and Field Emission Displays. (D.L.). Refs., figs

In the Laser Fusion Program at Lawrence Livermore National Laboratory, a single laser fusion experiment lasts only a billionth of a second, but in this time high-speed instrumentation collects data that, when digitized, creates a data bank of several megabytes. This first level of data must be processed in several stages to put it in a form useful for the interpretation of the experiments. One stage involves unfolding the source characteristics from the data and the response of the instrument. This involves calculating the response of the instrument from the characteristics of each of its components. It is in this calculation that the ORACLE DBMS has become an invaluable tool for the manipulation and archiving of the component data.
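
The unfolding stage mentioned above can be sketched as a regularized least-squares inversion: if the recorded data are the source convolved with the instrument response, the source is recovered by inverting the response matrix. The Gaussian response and the test signal below are illustrative assumptions, not the Laboratory's actual instrument model.

```python
import numpy as np

n = 64
t = np.arange(n)

# Instrument response: a Gaussian blur, arranged as a (circulant) matrix R
# whose row i is the response to a unit source at position i.
kernel = np.exp(-0.5 * ((t - n // 2) / 3.0) ** 2)
kernel /= kernel.sum()
R = np.array([np.roll(kernel, i - n // 2) for i in range(n)])

# A source with two features of different strength.
true_source = np.zeros(n)
true_source[20] = 1.0
true_source[40] = 0.5

data = R @ true_source  # what the instrumentation would record

# Tikhonov-regularized unfold: minimize ||R s - data||^2 + lam ||s||^2
lam = 1e-6
s_hat = np.linalg.solve(R.T @ R + lam * np.eye(n), R.T @ data)

print(np.argmax(s_hat))  # the dominant feature is recovered near index 20
```

The regularization term keeps the inversion stable where the response strongly attenuates the signal, at the cost of slightly smoothing the recovered source.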

SCK-CEN's research and development programme on instrumentation aims at evaluating the potentials of new instrumentation technologies under the severe constraints of a nuclear application. It focuses on the tolerance of sensors to high radiation doses, including optical fibre sensors, and on the related intelligent data processing needed to cope with the nuclear constraints. Main achievements in these domains in 1999 are summarised.

SCK-CEN's research and development programme on instrumentation involves the assessment and the development of sensitive measurement systems used within a radiation environment. Particular emphasis is on the assessment of optical fibre components and their adaptability to radiation environments. The evaluation of ageing processes of instrumentation in fission plants, the development of specific data evaluation strategies to compensate for ageing induced degradation of sensors and cable performance form part of these activities. In 2000, particular emphasis was on in-core reactor instrumentation applied to fusion, accelerator driven and water-cooled fission reactors. This involved the development of high performance instrumentation for irradiation experiments in the BR2 reactor in support of new instrumentation needs for MYRRHA, and for diagnostic systems for the ITER reactor.

... tissues to diagnose diseases, especially those caused by viral infections. For some diseases, electron... instrument will be used to evaluate tissue looking for ultrastructural indicators of disease, as well as other experiments including cell culture morphology, transplant and host tissue interactions, and...

... for which the instruments shown below are intended to be used, are being manufactured in the United... pump a dye laser to generate ultra-violet light which can be used to rack chemical species during... light (283 nm) will then fluoresce and can be detected using an intensified CCD camera. The key...

While genetics has remained as one key topic in school science, it continues to be conceptually and linguistically difficult for students with the concomitant debates as to what should be taught in the age of biotechnology. This article documents the development and implementation of a two-tier multiple-choice instrument for diagnosing grades 10…

... DEPARTMENT OF COMMERCE International Trade Administration Application(s) for Duty-Free Entry of..., School of Earth Sciences, 275 Mendenhall Laboratory, 125 South Oval Mall, Columbus, OH 43210. Instrument... and high-contrast images, a stage that is easy to move, a focus that does not change with changing...

This meeting provided an excellent overview of the state-of-the-art in perfusion imaging from the viewpoints of mathematical data analysis, radiochemical synthesis and evaluation, and instrumentation physics. The participants and audience had an opportunity to see how each of these aspects is essential for continued progress in this field

... States. Application accepted by Commissioner of Customs: March 29, 2012. Docket Number: 12-018. Applicant... general category manufactured in the United States. Application accepted by Commissioner of Customs: March...: The instrument will be used to investigate the genes and proteins that underlie normal and pathologic...

We report an overview of the soft X-ray scientific instruments and X-ray optics at the free electron laser (FEL) of the Pohang Accelerator Laboratory, with selected first-commissioning results. The FEL exhibited a pulse energy of 200 μJ/pulse and a pulse width of ...; a resolving power of 10 500 was achieved. The estimated total time resolution between optical laser and X-ray pulses was <270 fs. A resonant inelastic X-ray scattering spectrometer was set up; its commissioning results are also reported.

Many people have heard of CERN, the European Organisation for Nuclear Research, and its enormous scientific masterpiece LEP, the Large Electron Positron collider. This is a 27-kilometer long particle accelerator designed to peek deeply inside the structure of matter in the framework of fundamental research. Despite the millions of Internet users, few of them know that the World Wide Web was invented at CERN in 1989, the same year that LEP was commissioned. Even fewer people know that CERN was among the first European organisations to have purchased the Oracle RDBMS back in 1983 and effectively put it in use for mission critical data management applications. Since that date, Oracle databases have been used extensively at CERN and in particular for technical and scientific data. This paper gives an overview of the use of Oracle throughout the lifecycle of CERN's flagship: the construction, exploitation and dismantling of LEP.

SCK-CEN's R and D programme on instrumentation involves the development of advanced instrumentation systems for nuclear applications as well as the assessment of the performance of these instruments in a radiation environment. Particular emphasis is on the use of optical fibres as umbilical links of a remote handling unit for use during maintenance of a fusion reactor; studies on the radiation hardening of plasma diagnostic systems; investigations on new instrumentation for the future MYRRHA accelerator driven system; space applications related to radiation-hardened lenses; the development of new approaches for dose, temperature and strain measurements; the assessment of radiation-hardened sensors and motors for remote handling tasks; and studies of dose measurement systems, including the use of optical fibres. Progress and achievements in these areas for 2001 are described.

Technologically underdeveloped countries bear the hard consequences of the vicious circle of dependence: they are dependent because their technology lags behind, and their technology lags behind because they are dependent. Only with massive investments in school education and in scientific and technological research can they break this vicious circle, which is largely responsible for their social wounds, and build a new social structure in which most people, if not the entire population, can in fact exercise their rights as citizens, putting an end to social exclusion.

Most countries agree on the necessity of burying high- and medium-level wastes in geological layers situated a few hundred meters below ground level. The advantages and disadvantages of different types of rock, such as salt, clay, granite and volcanic material, are examined. Sophisticated studies are being carried out to determine the best geological confinement, but questions arise about the time for which safety must be ensured. France has chosen 3 possible sites. These sites are geologically described in the article. The final site will be proposed after a testing phase of about 5 years in an underground facility. (A.C.)

A proper measurement of the relevant single- and two-phase flow parameters is the basis for the understanding of many complex thermal-hydraulic processes. Reliable instrumentation is therefore necessary for the interaction between analysis and experiment, especially in the field of nuclear safety research, where postulated accident scenarios have to be simulated in experimental facilities and predicted by complex computer code systems. The so-called conventional instrumentation for the measurement of e.g. pressures, temperatures, pressure differences and single-phase flow velocities is still a solid basis for the investigation and interpretation of many phenomena, and especially for the understanding of the overall system behavior. Measurement data from such instrumentation still serve in many cases as a database for thermal-hydraulic system codes. However, some special instrumentation, such as online concentration measurement for boric acid in the water phase or for non-condensables in a steam atmosphere, as well as flow visualization techniques, has been further developed and successfully applied in recent years. Concerning the modeling needs of advanced thermal-hydraulic codes, significant advances have been accomplished in the last few years in local instrumentation technology for two-phase flow through the application of new sensor techniques, optical or beam methods and electronic technology. This paper will give insight into the current state of instrumentation technology for safety-related thermal-hydraulic experiments. Advantages and limitations of some measurement processes and systems will be indicated, as well as trends and possibilities for further development. Aspects of instrumentation in operating reactors will also be mentioned.

The DAPNIA is a department of CEA; its main characteristic is to manage scientific teams working on astrophysics, nuclear physics, elementary particles and instrumentation. Every 2 years DAPNIA's activities are submitted to an evaluation made by a scientific committee whose members are experts independent from CEA. This committee reviews the work done, gives an opinion about the options chosen for the projects to come and writes out a report. In 1997 the committee had a very positive opinion of the work done by DAPNIA teams. The contributions to various important national and international programs have been successful; we can quote the Ulysses mission, SOHO, ISO and INTEGRAL for space programs; ALEPH, DELPHI, H1 at HERA, ATLAS, CMS, NA48, NOMAD, BaBar and ANTARES for particle physics; and SPIRAL, SMC and COMPASS for nuclear physics. The committee advises DAPNIA to favour more contacts between the theoreticians and the experimentalists who work on quantum chromodynamics and hadron physics. The committee shows its concern about improving the balance between the means dedicated to instrumentation design and those dedicated to the analysis and interpretation of the experimental data collected. (A.C.)

Full Text Available This study aims to develop an authentic assessment instrument to measure critical thinking skills in global warming learning and to describe the suitability, ease of use, and usefulness of the developed instruments based on teachers' opinions. The development design follows the Borg & Gall (2003) development model, conducted in seven stages: information gathering, planning, product development, product testing, product revision, field trial, and final product. The test subjects are students and teachers in SMA Lampung Tengah, selected by purposive sampling. Global warming learning using authentic assessment consists of a series of learning activities, including observing, discussing, exploring, associating and communicating. The results show that the authentic assessment techniques for global warming learning, intended to measure and cultivate critical thinking skills, consist of written tests, performance, portfolios, projects, and attitudes. The developed assessment model meets content and construct validity, effectively improves students' critical thinking skills, and has a high level of suitability, ease of use, and usefulness. The assessment techniques used in global warming learning are performance, portfolio, project, product, and attitude assessments, which together contribute to the improvement of critical thinking skills in 97.4% of global warming learning.

The present paper provides a basic knowledge of the most commonly used experimental techniques. We discuss the principles and concepts necessary to understand what one is doing when performing an experiment on a given instrument. (author) 29 figs., 1 tab., refs

This article provides an overview of several software control applications developed for NASA using LabVIEW. The applications covered here include (1) an Ultrasonic Measurement System for nondestructive evaluation of advanced structural materials, (2) an X-ray Spectral Mapping System for characterizing the quality and uniformity of developing photon detector materials, (3) a Life Testing System for these same materials, and (4) the instrument panel for an aircraft-mounted Cloud Absorption Radiometer that measures the light scattered by clouds in multiple spectral bands. Many of the software interface concepts employed are explained. Panel layout and block diagram (code) strategies for each application are described. In particular, some of the more unique features of the applications' interfaces and source code are highlighted. This article assumes that the reader has a beginner-to-intermediate understanding of LabVIEW methods.

This chapter reviews the parameters which are important to positron-imaging instruments. It summarizes the options which various groups have explored in designing tomographs and the methods which have been developed to overcome some of the limitations inherent in the technique as well as in present instruments. The chapter is not presented as a defense of positron imaging versus single-photon or other imaging modality, neither does it contain a description of various existing instruments, but rather stresses their common properties and problems. Design parameters which are considered are resolution, sampling requirements, sensitivity, methods of eliminating scattered radiation, random coincidences and attenuation. The implementation of these parameters is considered, with special reference to sampling, choice of detector material, detector ring diameter and shielding and variations in point spread function. Quantitation problems discussed are normalization, and attenuation and random corrections. Present developments mentioned are noise reduction through time-of-flight-assisted tomography and signal to noise improvements through high intrinsic resolution. Extensive bibliography. (U.K.)
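
The random-coincidence contribution mentioned above is conventionally estimated from the detector singles rates and the coincidence window, R_random = 2*tau*N1*N2. The rates and window width below are illustrative values, not figures from any particular tomograph.

```python
def randoms_rate(tau_ns, n1_cps, n2_cps):
    """Accidental coincidence rate [cps] between two detectors.

    tau_ns : coincidence window half-width [ns]
    n1_cps, n2_cps : singles rates of the two detectors [counts/s]
    """
    tau_s = tau_ns * 1e-9
    return 2.0 * tau_s * n1_cps * n2_cps

# Halving the coincidence window halves the randoms, which is one reason
# fast detector materials and good timing resolution improve signal-to-noise.
print(randoms_rate(6.0, 2.0e5, 2.0e5))   # 12 ns window, 200 kcps singles
print(randoms_rate(3.0, 2.0e5, 2.0e5))   # half the window, half the randoms
```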

Eating, nourishment and nutrition circulate in our culture as synonyms and thus fail to account for the changes that occur in nourishment which, intended or not, follow a pattern of hybridization representing a change of rules and food preferences. This paper aims to take these common-sense conceptions as analytic categories for analyzing and interpreting research in the Humanities and Health Sciences from a theoretical perspective, through conceptualization. Food is associated with a natural (biological) function, a conception in which nature is opposed to culture, while nourishment takes on cultural (symbolic) meanings, expressing the division of labor and wealth, and a historical and cultural creation through which one can study a society. Nutrition is attributed a sense of rational action, derived from the constitution of this science in modernity, inserted in a historical process of scientific rationalization of eating and nourishing. We believe that through the practice of conceptualization in interdisciplinary research, which involves a shared space of knowledge, we can be less constrained by a unified theoretical model of learning and freer to think about life issues.

The Mars Science Laboratory Curiosity landed in Gale crater in August 2012 with the goal to identify and characterize habitable environments on Mars. Curiosity has been studying a series of sedimentary rocks primarily deposited in fluviolacustrine environments approximately 3.5 Ga. Minerals in the rocks and soils on Mars can help place further constraints on these ancient aqueous environments, including pH, salinity, and relative duration of liquid water. The Chemistry and Mineralogy (CheMin) X-ray diffraction and X-ray fluorescence instrument on Curiosity uses a Co X-ray source and charge-coupled device detector in transmission geometry to collect 2D Debye-Scherrer ring patterns of the less than 150 micron size fraction of drilled rock powders or scooped sediments. With an angular range of approximately 2-52 deg 2θ and a 2θ resolution of approximately 0.3 deg, mineral abundances can be quantified with a detection limit of approximately 1-2 wt. %. CheMin has returned quantitative mineral abundances from 16 mudstone, sandstone, and aeolian sand samples so far. The mineralogy of these samples is incredibly diverse, suggesting a variety of depositional and diagenetic environments and different source regions for the sediments. Results from CheMin have been essential for reconstructing the geologic history of Gale crater and addressing the question of habitability on ancient Mars.
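
Converting a measured diffraction angle into a lattice d-spacing follows Bragg's law, n*lambda = 2*d*sin(theta). The Co K-alpha wavelength below is a standard tabulated value; the example angle is illustrative, not CheMin data.

```python
import math

CO_KALPHA = 1.7903  # angstroms (weighted-average Co K-alpha wavelength)

def d_spacing(two_theta_deg, wavelength=CO_KALPHA, order=1):
    """d-spacing [angstrom] from a 2-theta angle [deg], first order by default."""
    theta = math.radians(two_theta_deg / 2.0)
    return order * wavelength / (2.0 * math.sin(theta))

# Example: quartz's strongest reflection (d ~ 3.34 angstrom) falls near
# 31 deg 2-theta with Co radiation.
print(round(d_spacing(31.06), 2))
```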

Existing scientific ballooning solutions for multi-hour flights in the upper troposphere/lower stratosphere are expensive and/or technically challenging. In contrast, solar hot air balloons are inexpensive and simple to construct. These balloons, which rely solely on sunlight striking a darkened envelope, can deliver payloads to 22 km altitude and maintain level flight until sunset. We describe an experimental campaign in which five solar hot air balloons launched in 45 minutes created a free-flying infrasound (low frequency sound) microphone network that remained in the air for over 12 hours. We discuss the balloons' trajectory, maximum altitude, and stability, and present results from the infrasound observations. We assess the performance and limitations of this design for lightweight atmospheric instrumentation deployments that require multi-hour flight times. Finally, we address the possibilities of multi-day flights during the polar summer and on other planets.

Understanding the way in which animals diversified and radiated during their early evolutionary history remains one of the most captivating of scientific challenges. Integral to this is the 'Cambrian explosion', which records the rapid emergence of most animal phyla, and for which the triggering and accelerating factors, whether environmental or biological, are still unclear. Here we describe exceptionally well-preserved complex digestive organs in early arthropods from the early Cambrian of China and Greenland with functional similarities to certain modern crustaceans and trace these structures through the early evolutionary lineage of fossil arthropods. These digestive structures are assumed to have allowed for more efficient digestion and metabolism, promoting carnivory and macrophagy in early arthropods via predation or scavenging. This key innovation may have been of critical importance in the radiation and ecological success of Arthropoda, which has been the most diverse and abundant invertebrate phylum since the Cambrian.

Musical skills and expertise vary greatly in Western societies. Individuals can differ in their repertoire of musical behaviours as well as in the level of skill they display for any single musical behaviour. The types of musical behaviours we refer to here are broad, ranging from performance on an instrument and listening expertise, to the ability to employ music in functional settings or to communicate about music. In this paper, we first describe the concept of 'musical sophistication' which can be used to describe the multi-faceted nature of musical expertise. Next, we develop a novel measurement instrument, the Goldsmiths Musical Sophistication Index (Gold-MSI), to assess self-reported musical skills and behaviours on multiple dimensions in the general population using a large Internet sample (n = 147,636). Thirdly, we report results from several lab studies, demonstrating that the Gold-MSI possesses good psychometric properties, and that self-reported musical sophistication is associated with performance on two listening tasks. Finally, we identify occupation, occupational status, age, gender, and wealth as the main socio-demographic factors associated with musical sophistication. Results are discussed in terms of theoretical accounts of implicit and statistical music learning and with regard to social conditions of sophisticated musical engagement.

This paper examines whether the sophistication of market investors influences management's strategy on discretionary accounting choice, and thus changes the persistence of discretionary accruals. The results show that the persistence of discretionary accruals for firms faced with naive investors is lower than that for firms faced with sophisticated investors. The results also demonstrate that sophisticated investors indeed incorporate the implications of current earnings components into future ...

This study aims to investigate scientific literacy among 12th grade science students in SMA Negeri 2 Karanganyar. The instrument used is a four-tier wave diagnostic instrument, originally developed to diagnose students' conceptions about the nature and propagation of waves. The study uses a quantitative descriptive method. Based on the students' dominant answers, the diagnostic results show a lack-of-knowledge percentage of 14.3%-77.1%, an alternative-conceptions percentage of 0%-60%, and a scientific-conceptions percentage of 0%-65.7%. Lack of knowledge is indicated when there is doubt about at least one tier of a student's answer. The results show that the students' dominant scientific literacy is in the nominal literacy category, with a percentage of 22.9%-91.4%; functional literacy ranges over 2.86%-28.6% and conceptual/procedural literacy over 0%-65.7%. At the nominal literacy level, in the context of the current study, students hold alternative conceptions and lack knowledge: they recognize scientific terms but are not able to justify them.
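The tier logic described above (doubt on any confidence tier indicates lack of knowledge; confident, correct answers indicate scientific conceptions) can be sketched as a small classifier. The rule set and labels here are a simplified illustration, not the study's exact rubric:

```python
def classify_response(answer_correct: bool, sure_answer: bool,
                      reason_correct: bool, sure_reason: bool) -> str:
    """Classify one four-tier item response (simplified illustrative rules)."""
    if not (sure_answer and sure_reason):
        return "lack of knowledge"       # doubt on at least one confidence tier
    if answer_correct and reason_correct:
        return "scientific conception"   # confident and correct on answer and reason
    return "alternative conception"      # confident but wrong on answer and/or reason

print(classify_response(True, True, True, True))
print(classify_response(True, False, True, True))
print(classify_response(False, True, False, True))
```

Aggregating these per-item labels over a class yields the category percentages reported in such studies.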

This study explores the construct of lexical sophistication and its applications for measuring second language lexical and speaking proficiency. In doing so, the study introduces the Tool for the Automatic Analysis of LExical Sophistication (TAALES), which calculates text scores for 135 classic and newly developed lexical indices related to word…

The influence of a financial sophistication scale on adjustable-rate mortgage (ARM) borrowing is explored. Descriptive statistics and regression analysis using recent data from the Survey of Consumer Finances reveal that ARM borrowing is driven by both the least and most financially sophisticated households but for different reasons. Less…

Organizations are designing more sophisticated accounting information systems to meet the strategic goals and enhance their performance. This study examines the effect of accounting information system design on the performance of organizations pursuing different strategic priorities. The alignment between sophisticated accounting information systems and organizational strategy is analyzed. The enabling effect of the accounting information system on performance is also examined. Relationships ...

We study the interplay of probabilistic sophistication, second order stochastic dominance, and uncertainty aversion, three fundamental notions in choice under uncertainty. In particular, our main result, Theorem 2, characterizes uncertainty averse preferences that satisfy second order stochastic dominance, as well as uncertainty averse preferences that are probabilistically sophisticated.
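For reference, the second order stochastic dominance relation used in this literature can be written in terms of integrated distribution functions; this is the standard textbook statement, not taken from the abstract itself:

```latex
% F dominates G in the second order iff the integrated CDF of F
% lies weakly below that of G at every point:
F \succeq_{SSD} G
\iff
\int_{-\infty}^{x} F(t)\,dt \;\le\; \int_{-\infty}^{x} G(t)\,dt
\quad \text{for all } x \in \mathbb{R}.
```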

This paper deals with the organizational structure of ground-based receiving, processing, and dissemination of scientific information created by the Astrophysics Institute of the Scientific Research Nuclear University, Moscow Engineering Physics Institute. The hardware structure and software features are described. The principles are given for forming sets of control commands for scientific equipment (SE) devices, and statistical data are presented on the operation of the facility during flight tests of the spacecraft (SC) over the course of one year.

The proceeding contains papers presented at the Scientific Meeting and Presentation on Basic Research of Nuclear Science and Technology, held in Yogyakarta, 25-27 April 1995. This proceeding is part one of two books published for the meeting and contains papers on physics, reactor physics and nuclear instrumentation, as results of research activities in the National Atomic Energy Agency. There are 39 papers, indexed individually. (ID)

The Scientific Meeting and Presentation on Basic Research in Nuclear Science and Technology is a routine activity held by PPNY BATAN to monitor the research achievements of BATAN. The proceedings contain papers on basic research in physics, reactor physics and nuclear instrumentation. This volume is the first of two parts published in the series. There are 33 articles, each indexed separately.

DARE (Dedicated Aerosol Retrieval Experiment) is a study to design an instrument for accurate remote sensing of aerosol properties from space. DARE combines useful properties of several existing instruments like TOMS, GOME, ATSR and POLDER. It has a large wavelength range, 330 to 1000 nm, to

use of modern rhetorical theories but analyses the letter in terms of the clas- ..... If a critical reader would have had the traditional anti-sophistic arsenal ..... pressions and that 'rhetoric' is mainly a matter of communicating these thoughts.

Nonprice competition is increasingly important in world food markets. Recently, the expression 'export sophistication' has been introduced in the economic literature to refer to a wide set of attributes that increase product value. An index has been proposed to measure sophistication in an indirect way through the per capita GDP of exporting countries (Lall et al., 2006; Haussmann et al., 2007). The paper applies the sophistication measure to the Italian food export sector, moving from an analysis of trends and performance of Italian food exports. An original way to disentangle different components in the temporal variation of the sophistication index is also proposed. Results show that the sophistication index offers original insights on recent trends in world food exports and with respect to Italian core food exports.
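The index referred to weights each exporter's per capita GDP by revealed export shares, in two standard steps: a PRODY score per product, then an EXPY score per country. A toy sketch; the countries, products and numbers below are invented purely for illustration:

```python
# Toy sketch of the export-sophistication index (PRODY/EXPY style);
# all data below are invented for illustration.
exports = {  # exports[country][product], in common currency units
    "A": {"wine": 80, "wheat": 20},
    "B": {"wine": 10, "wheat": 90},
}
gdp_pc = {"A": 40000, "B": 8000}  # per capita GDP of each exporter

def prody(product):
    # Export-share-weighted average of exporters' per capita GDP for one product
    shares = {c: exports[c][product] / sum(exports[c].values()) for c in exports}
    total = sum(shares.values())
    return sum(shares[c] / total * gdp_pc[c] for c in exports)

def expy(country):
    # Export-share-weighted average PRODY of the country's export basket
    x_total = sum(exports[country].values())
    return sum(v / x_total * prody(p) for p, v in exports[country].items())

print(round(expy("A")), round(expy("B")))
```

Because rich country A's basket is tilted toward the product that rich countries export, its EXPY exceeds B's, which is the sense in which the index proxies sophistication through exporters' GDP.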

Investor sophistication has lagged behind the growing complexity of retail financial markets. To explore this, we develop a dynamic model to study the interaction between obfuscation and investor sophistication in mutual fund markets. Taking into account different learning mechanisms within the investor population, we characterize the optimal timing of obfuscation for financial institutions who offer retail products. We show that educational initiatives that are directed to facilitate learnin...

The European Space Agency’s Rosetta mission will be getting under way in February 2004. The Rosetta spacecraft will be pairing up with Comet 67P/Churyumov-Gerasimenko and accompanying it on its journey, investigating the comet’s composition and the dynamic processes at work as it flies sunwards. The spacecraft will even deposit a lander on the comet. “This will be our first direct contact with the surface of a comet,” said Dr Manfred Warhaut, Operations Manager for the Rosetta mission at ESA's European Space Operations Centre (ESOC) in Darmstadt, Germany. The trip is certainly not short: Rosetta will need ten years just to reach the comet. This places extreme demands on its hardware; when the probe meets up with the comet, all instruments must be fully operational, especially since it will have been in “hibernation” for two and a half years of its journey. During this ‘big sleep’, all systems, scientific instruments included, are turned off. Only the on-board computer remains active. Twelve cubic metres of technical wizardry: Rosetta’s hardware fits into a sort of aluminium box measuring just 12 cubic metres. The scientific payload is mounted in the upper part, while the subsystems - on-board computer, transmitter and propulsion system - are housed below. The lander is fixed to the opposite side of the probe from the steerable antenna. As the spacecraft orbits the comet, the scientific instruments will at all times be pointed towards its surface; the antenna and solar panels will point towards the Earth and Sun respectively. For trajectory and attitude control and for the major braking manoeuvres, Rosetta is equipped with 24 thrusters each delivering 10 N. That corresponds to the force needed here on Earth to hold a bag containing 10 apples. Rosetta sets off with 1650 kg of propellant on board, accounting for more than half its mass at lift-off. Just 20% of total mass is available for scientific purposes. So when developing the research instruments
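The propellant figures quoted (1650 kg, more than half of the lift-off mass) translate into a rough delta-v budget via the Tsiolkovsky rocket equation. In this sketch the lift-off mass and specific impulse are assumptions chosen for illustration, not values from the text:

```python
import math

# 1650 kg of propellant, "more than half" of lift-off mass, is from the text;
# the lift-off mass of ~3000 kg and Isp of 290 s are assumptions for illustration.
M0 = 3000.0          # assumed lift-off mass, kg
M_PROP = 1650.0      # propellant mass, kg (from the text)
ISP = 290.0          # assumed specific impulse of the bipropellant thrusters, s
G0 = 9.80665         # standard gravity, m/s^2

# Tsiolkovsky: delta-v = Isp * g0 * ln(initial mass / dry-of-propellant mass)
delta_v = ISP * G0 * math.log(M0 / (M0 - M_PROP))
print(f"rough delta-v budget: {delta_v:.0f} m/s")
```

Under these assumptions the budget comes out in the low-thousands of m/s range, which is the order needed for the deep-space manoeuvres and comet rendezvous braking described above.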

James Gregory, inventor of the reflecting telescope and Fellow of the Royal Society, was the first Regius Professor of Mathematics of the University of St Andrews, 1668–74. He attempted to establish in St Andrews what would, if completed, have been the first purpose-built observatory in the British Isles. He travelled to London in 1673 to purchase instruments for equipping the observatory and improving the teaching and study of natural philosophy and mathematics in the university, seeking the advice of John Flamsteed, later the first Astronomer Royal. This paper considers the observatory initiative and the early acquisition of instruments at the University of St Andrews, with reference to Gregory's correspondence, inventories made ca. 1699–ca. 1718 and extant instruments themselves, some of which predate Gregory's time. It examines the structure and fate of the university observatory, the legacy of Gregory's teaching and endeavours, and the meridian line laid down in 1748 in the University Library.

Microspectrometers, miniature spectrometers, portable spectrometers, and fiber optic spectrometers are some of the names typically given to the class of small spectrometers built from simple, fixed optics and low-cost detector arrays. The author will use these terms interchangeably. This class of instrument has been available for over 18 years, gaining industry acceptance with each year. From a very basic optical platform to sophisticated instrumentation for scientific investigation and process control, this class of instrument has evolved substantially since its introduction to the market. For instance, it is now possible to cover the range from 200-2,500 nm utilizing only two channels of spectrometers with either synchronous or asynchronous channel control. On-board processing and memory have enabled the instruments to become fully automated, stand-alone sensors communicating with their environment via analog, digital, USB2 and even wireless protocols. New detectors have entered the market, enabling solutions "tuned" to the demands of specific applications.

Using a special-purpose module implemented in the Health and Retirement Study, we evaluate financial sophistication in the American population over the age of 50. We combine several financial literacy questions into an overall index to highlight which questions best capture financial sophistication and examine the sensitivity of financial literacy responses to framing effects. Results show that many older respondents are not financially sophisticated: they fail to grasp essential aspects of risk diversification, asset valuation, portfolio choice, and investment fees. Subgroups with notable deficits include women, the least educated, non-Whites, and those over age 75. In view of the fact that retirees increasingly must take on responsibility for their own retirement security, such meager levels of knowledge have potentially serious and negative implications.

Scientific measurements of sound generation and radiation by musical instruments are surprisingly hard to correlate with the subtle and complex judgments of instrumental quality made by expert musicians.

Fund raising, it is argued, needs sophisticated prospect research. Professional prospect researchers play an important role in helping to identify prospective donors and also in helping to stimulate interest in gift giving. A sample of an individual work-up on a donor and a bibliography are provided. (MLW)

Procles, cited by Pausanias (in the imperfect tense) about a display in Rome and for an opinion about Pyrrhus of Epirus, probably was not a historian of Hellenistic date, but a contemporary sophist whom Pausanias encountered in person in Rome.

Recent electronics technology advances make it possible to design sophisticated instruments in small packages for convenient field implementation. An inspector-instrument interface design that allows communication of procedures, responses, and results between the instrument and user is presented. This capability has been incorporated into new spent-fuel instrumentation and a battery-powered multichannel analyzer.

Recent electronics technology advances make it possible to design sophisticated instruments in small packages for convenient field implementation. This report describes an inspector-instrument interface design which allows communication of procedures, responses, and results between the instrument and user. The interface has been incorporated into new spent-fuel instrumentation and a battery-powered multichannel analyzer.

Many performing musicians, as well as instrument builders, are coming to realize the importance of understanding the science of musical instruments. This book explains how string instruments produce sound. It presents basic ideas in simple language, and it also translates some more sophisticated ideas in non-technical language. It should be of interest to performers, researchers, and instrument makers alike.

This paper investigates the effectiveness of development strategies for tourism destinations. We argue that resource investments unambiguously increase tourism revenues, and that increasing the degree of tourism sophistication, that is, increasing the variety of tourism-related goods and services, increases tourism activity but decreases the perceived quality of the destination's resource endowment, leading to an ambiguous effect on tourism revenues. We disentangle these two effects and charact...

It has long been recognized that the physical form of materials can mediate their toxicity—the health impacts of asbestiform materials, industrial aerosols, and ambient particulate matter are prime examples. Yet over the past 20 years, toxicology research has suggested complex and previously unrecognized associations between material physicochemistry at the nanoscale and biological interactions. With the rapid rise of the field of nanotechnology and the design and production of increasingly complex nanoscale materials, it has become ever more important to understand how the physical form and chemical composition of these materials interact synergistically to determine toxicity. As a result, a new field of research has emerged—nanotoxicology. Research within this field is highlighting the importance of material physicochemical properties in how dose is understood, how materials are characterized in a manner that enables quantitative data interpretation and comparison, and how materials move within, interact with, and are transformed by biological systems. Yet many of the substances that are the focus of current nanotoxicology studies are relatively simple materials that are at the vanguard of a new era of complex materials. Over the next 50 years, there will be a need to understand the toxicology of increasingly sophisticated materials that exhibit novel, dynamic and multifaceted functionality. If the toxicology community is to meet the challenge of ensuring the safe use of this new generation of substances, it will need to move beyond “nano” toxicology and toward a new toxicology of sophisticated materials. Here, we present a brief overview of the current state of the science on the toxicology of nanoscale materials and focus on three emerging toxicology-based challenges presented by sophisticated materials that will become increasingly important over the next 50 years: identifying relevant materials for study, physicochemical characterization, and

Many important decisions require strategic sophistication. We examine experimentally whether teams act more strategically than individuals. We let individuals and teams make choices in simple games, and also elicit first- and second-order beliefs. We find that teams play the Nash equilibrium strategy significantly more often, and their choices are more often a best response to stated first order beliefs. Distributional preferences make equilibrium play less likely. Using a mixture model, the estimated probability to play strategically is 62% for teams, but only 40% for individuals. A model of noisy introspection reveals that teams differ from individuals in higher order beliefs.

Two classes of two-dimensional Euclidean chiral field theories are singled out: 1) the field phi(x) takes values in a compact Hermitian symmetric space; 2) the field phi(x) takes values in an orbit of the adjoint representation of a compact Lie group. The theories have sophisticated topological and rich analytical structures. They are considered with the help of topological invariants (topological charges). Explicit formulae for the topological charges are indicated, and a lower-bound estimate for the action is given.

This article compares capital budgeting techniques employed in listed and unlisted companies in Brazil. We surveyed the Chief Financial Officers (CFOs) of 398 listed companies and 300 large unlisted companies. Based on 91 respondents, the results suggest that CFOs of listed companies tend to use less simplistic methods more often (for example, NPV and CAPM), and that CFOs of unlisted companies are less likely to estimate the cost of equity, despite being large companies. These findings indicate that stock exchange listing may require greater sophistication of the capital budgeting process.

The future trends of experimental plasma physics in outer space demand ever more exact and sophisticated scientific instrumentation. Moreover, the situation is complicated by the constant reduction of financial support for scientific research, even in leading countries. This has resulted in the development of mini-, micro- and nanosatellites with low cost and short preparation time, and consequently provoked the creation of a new generation of scientific instruments with reduced weight and power consumption but improved metrological parameters. The recent state of development of electromagnetic (EM) sensors for microsatellites is reported. For flux-gate magnetometers (FGM), the reduction of weight as well as power consumption was achieved not only through the use of new electronic components but also through the development of a new operation mode. Scientific and technological study allowed FGM noise to be decreased: the typical noise figure is now about 10 picotesla rms at 1 Hz and the record one is below 1 picotesla. A super-light version of the search-coil magnetometer (SCM) was created as the result of intensive research. These new SCMs can have about six decades of operational frequency band with an upper limit of ~1 MHz and a noise level of a few femtotesla, with a total weight of about 75 grams, including electronics. A new instrument, the wave probe (WP), which combines three independent sensors in one body (an SCM, a split Langmuir probe and an electric potential sensor), was created. The developed theory confirms that the WP can directly measure wave vector components in space plasmas.

The paper deals with routing control algorithms for a new conception of tram vehicle bogie. The main goal of these research activities is to reduce wear on rail wheels and tracks, reduce traction energy losses, and increase running comfort. An experimental tram vehicle with a special bogie construction, powered by a traction battery, is used for these purposes. This vehicle has a rotary bogie with independently rotating wheels driven by permanent-magnet synchronous motors, and a solid axle. The wheel forces in the bogie are measured by a large number of sensors placed on the experimental tram vehicle. The designed control algorithms are currently implemented in the vehicle superset control system. Traction requirements and track characteristics affect these control algorithms. This control, including sophisticated routing, brings further improvements, which are verified and corrected according to individual traction and driving characteristics, and opens new possibilities.

The GSECARS (APS sector 13) scientific program will provide fundamental new information on the deep structure and composition of the Earth and other planets, the formation of economic mineral deposits, the cycles and fate of toxic metals in the environment, and the mechanisms of nutrient uptake and disease in plants. In the four experimental stations (2 per beamline), scientists will have access to three main x-ray techniques: diffraction (microcrystal, powder, diamond anvil cell, and large volume press), fluorescence microprobe, and spectroscopy (conventional, microbeam, liquid and solid surfaces). The high pressure facilities will be capable of x-ray crystallography at P ≳ 360 GPa and T ~ 6000 K with the diamond anvil cell and P ~ 25 GPa and T ~ 2500 °C with the large volume press. Diffractometers will allow study of 1 micrometer crystals and micro-powders. The microprobe (1 micrometer focused beam) will be capable of chemical analyses in the sub-ppm range using wavelength and energy dispersive detectors. Spectroscopy instrumentation will be available for XANES and EXAFS with microbeams as well as high sensitivity conventional XAS and studies of liquid and solid interfaces. Visiting scientists will be able to set up, calibrate, and test experiments in off-line laboratories with equipment such as micromanipulators, optical microscopes, clean benches, glove boxes, and high powered optical and Raman spectrometers. © 1996 American Institute of Physics.

By means of the combined use of X-ray photoelectron spectroscopy (XPS), optical microscopy (OM) and scanning electron microscopy (SEM) coupled with energy dispersive X-ray spectroscopy (EDS), the surface and subsurface chemical and metallurgical features of silver counterfeited Roman Republican coins are investigated to decipher some aspects of the manufacturing methods and to evaluate the technological ability of the Roman metallurgists to produce thin silver coatings. The results demonstrate that over 2000 years ago important advances in the technology of thin layer deposition on metal substrates were attained by the Romans. The ancient metallurgists produced counterfeited coins by combining sophisticated micro-plating methods and tailored surface chemical modification based on the mercury-silvering process. The results reveal that the Romans were able to chemically and metallurgically manipulate alloys systematically at a micro scale to produce adherent precious metal layers with a uniform thickness of up to a few micrometers. The results converge to reveal that the production of forgeries was aimed firstly at saving expensive metals as much as possible, allowing profitable large-scale production at a lower cost. The driving forces could have been a lack of precious metals, an unexpected need to circulate coins for trade, and/or a combination of social, political and economic factors that required a change in money supply. Finally, information on corrosion products has been obtained that is useful for selecting materials and methods for the conservation of these important witnesses of technology and economy.

Intraspecific communication in frogs plays an important role in the recognition of conspecifics in general and of potential rivals or mates in particular and therefore with relevant consequences for pre-zygotic reproductive isolation. We investigate intraspecific communication in Hylodes japi, an endemic Brazilian torrent frog with territorial males and an elaborate courtship behavior. We describe its repertoire of acoustic signals as well as one of the most complex repertoires of visual displays known in anurans, including five new visual displays. Previously unknown in frogs, we also describe a bimodal inter-sexual communication system where the female stimulates the male to emit a courtship call. As another novelty for frogs, we show that in addition to choosing which limb to signal with, males choose which of their two vocal sacs will be used for visual signaling. We explain how and why this is accomplished. Control of inflation also provides additional evidence that vocal sac movement and color must be important for visual communication, even while producing sound. Through the current knowledge on visual signaling in Neotropical torrent frogs (i.e. hylodids), we discuss and highlight the behavioral diversity in the family Hylodidae. Our findings indicate that communication in species of Hylodes is undoubtedly more sophisticated than we expected and that visual communication in anurans is more widespread than previously thought. This is especially true in tropical regions, most likely due to the higher number of species and phylogenetic groups and/or to ecological factors, such as higher microhabitat diversity.

In this paper we present a compact library for the analysis of nuclear spectra. The library consists of sophisticated functions for background elimination, smoothing, peak searching, deconvolution, and peak fitting. The functions can process one- and two-dimensional spectra. The software described in the paper comprises a number of conventional as well as newly developed methods needed to analyze experimental data.
Program summary
Program title: SpecAnalysLib 1.1
Catalogue identifier: AEDZ_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEDZ_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 42 154
No. of bytes in distributed program, including test data, etc.: 2 379 437
Distribution format: tar.gz
Programming language: C++
Computer: Pentium 3 PC 2.4 GHz or higher, Borland C++ Builder v. 6. A precompiled Windows version is included in the distribution package
Operating system: Windows, 32-bit versions
RAM: 10 MB
Word size: 32 bits
Classification: 17.6
Nature of problem: The demand for advanced, highly effective experimental data analysis functions is enormous. The library package represents one approach to giving physicists the possibility to use the advanced routines simply by calling them from their own programs. SpecAnalysLib is a collection of functions for the analysis of one- and two-parameter γ-ray spectra, but they can be used for other types of data as well. The library consists of sophisticated functions for background elimination, smoothing, peak searching, deconvolution, and peak fitting.
Solution method: The algorithms of background estimation are based on the Sensitive Non-linear Iterative Peak (SNIP) clipping algorithm. The smoothing algorithms are based on the convolution of the original data with several types of filters and on algorithms based on discrete
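The SNIP clipping approach named under "Solution method" can be illustrated with a short sketch. This is a simplified one-dimensional variant written for illustration only, not the library's actual C++ implementation:

```python
import numpy as np

def snip_background(spectrum, iterations=20):
    """Simplified 1-D SNIP background estimate: at each window size p,
    every channel is replaced by the minimum of its current value and
    the average of its neighbours p channels away, so peaks are
    progressively clipped while the smooth baseline is preserved."""
    y = np.asarray(spectrum, dtype=float).copy()
    n = len(y)
    for p in range(1, iterations + 1):
        clipped = y.copy()
        for i in range(p, n - p):
            clipped[i] = min(y[i], 0.5 * (y[i - p] + y[i + p]))
        y = clipped
    return y

# Toy spectrum: flat baseline of 10 counts with one narrow peak on top.
spec = np.full(100, 10.0)
spec[50] += 200.0
background = snip_background(spec)
```

Subtracting `background` from `spec` leaves essentially only the peak; real implementations vary the clipping window and direction to control which structures survive.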

The N-Reactor confinement system release dose to the public in a postulated accident is reduced by washing the confinement atmosphere with fog sprays. This allows a low-pressure release of confinement atmosphere containing fission products through filters and out an elevated stack. The current accident analysis required revision of the CORRAL code and of other codes such as CONTEMPT to properly model the N Reactor confinement as a system of multiple fog-sprayed compartments. In revising these codes, more sophisticated models for the fog sprays and iodine plateout were incorporated, removing some of the conservatism in the steam condensing rate, fission product washout, and iodine plateout assumptions used in previous studies. The CORRAL code, which was used to describe the transport and deposition of airborne fission products in LWR containment systems for the Rasmussen Study, was revised to describe fog spray removal of molecular iodine (I2) and particulates in multiple compartments, for sprays having individual characteristics of on-off times, flow rates, fall heights, and drop sizes in changing containment atmospheres. During postulated accidents, the code determined the fission product removal rates internally rather than from input decontamination factors. A discussion is given of how the calculated plateout and washout rates vary with time throughout the analysis. The results of the accident analyses indicated that more credit could be given to fission product washout and plateout. An important finding was that the release of fission products to the atmosphere and the adsorption of fission products on the filters were significantly lower than previous studies had indicated.
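At its core, spray washout of an airborne species in a well-mixed compartment is first-order decay with a removal rate derived from the spray characteristics. A minimal sketch with a hypothetical constant rate (CORRAL computes time-varying rates internally from flow rate, fall height, and drop size):

```python
import math

def airborne_fraction(t_seconds, removal_rate_per_s):
    """Fraction of the initial airborne inventory remaining at time t,
    assuming a well-mixed compartment and a constant first-order
    removal rate (spray washout plus plateout lumped together)."""
    return math.exp(-removal_rate_per_s * t_seconds)

# With an assumed removal rate of 0.01 per second, roughly half the
# airborne iodine remains after ln(2)/0.01, about 69 seconds.
remaining = airborne_fraction(69.3, 0.01)
```

Multi-compartment models chain such terms per compartment with inter-compartment flows, which is what the revised code does numerically.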

This book discusses the principles of physics through applications of state-of-the-art technologies and advanced instruments. The authors use diagrams, sketches, and graphs coupled with equations and mathematical analysis to enhance the reader's understanding of modern devices. Readers will learn to identify common underlying physical principles that govern several types of devices, while gaining an understanding of the performance trade-off imposed by the physical limitations of various processing methods. The topics discussed in the book assume readers have taken an introductory physics course, college algebra, and have a basic understanding of calculus. * Describes the basic physics behind a large number of devices encountered in everyday life, from the air conditioner to Blu-ray discs * Covers state-of-the-art devices such as spectrographs, photoelectric image sensors, spacecraft systems, astronomical and planetary observatories, biomedical imaging instruments, particle accelerators, and jet engines * Inc...

The ADS provides access to over 940,000 references from astronomy and planetary sciences publications and 1.5 million records from physics publications. It is funded by NASA and provides free access to these references, as well as to 2.4 million scanned pages from the astronomical literature. These include most of the major astronomy and several planetary sciences journals, as well as many historical observatory publications. The references now include the abstracts from all volumes of the Journal of Geophysical Research (JGR) since the beginning of 2002. We get these abstracts on a regular basis. The Kluwer journal Solar Physics has been scanned back to volume 1 and is available through the ADS. We have extracted the reference lists from this and many other journals and included them in the reference and citation database of the ADS. We have recently scanned Earth, Moon and Planets, another Kluwer journal, and will scan other Kluwer journals in the future as well. We plan on extracting references from these journals in the near future. The ADS has many sophisticated query features that allow the user to formulate complex queries. Using results lists to get further information about the selected articles provides the means to quickly find important and relevant articles in the database. Three advanced feedback queries are available from the bottom of the ADS results list (in addition to the regular feedback queries already available from the abstract page and from the bottom of the results list): 1. Get reference list for selected articles: This query returns all known references for the selected articles (or for all articles in the first list). The resulting list will be ranked according to how often each article is referred to, showing the most referenced articles in the field of study that created the first list; it presumably shows the most important articles in that field. 2. Get citation list for selected articles: This returns all known articles
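The first feedback query described above reduces to counting how often each reference recurs across the selected articles and sorting by that count. A minimal sketch with made-up article identifiers and reference lists:

```python
from collections import Counter

# Hypothetical article IDs mapped to the references each article cites.
references = {
    "A": ["X", "Y"],
    "B": ["X", "Z"],
    "C": ["X", "Y"],
}

def ranked_references(selected, refs):
    """Return all references of the selected articles, most-cited first."""
    counts = Counter(r for article in selected for r in refs.get(article, []))
    return [ref for ref, _ in counts.most_common()]

ranking = ranked_references(["A", "B", "C"], references)
```

Here "X" is cited by all three selected articles and therefore ranks first, mirroring how the ADS surfaces the most-referenced (presumably most important) papers in a field.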

In space instrumentation, there is currently no instrument dedicated to susceptibility or complete magnetization measurements of rocks. Magnetic field instrument suites are generally vector (or scalar) magnetometers, which locally measure the magnetic field. When mounted on board rovers, the electromagnetic perturbations associated with motors and other elements make it difficult to reap the benefits of including such instruments. However, magnetic characterization is essential to understanding key aspects of the present and past history of planetary objects. The work presented here overcomes the limitations currently existing in space instrumentation by developing a new portable and compact multi-sensor instrument for groundbreaking high-resolution magnetic characterization of planetary surfaces and sub-surfaces. This technology introduces for the first time magnetic susceptometry (real and imaginary parts) as a complement to existing compact vector magnetometers for planetary exploration, for use in space science and planetary exploration to address some of the open questions on crustal and, more generally, planetary evolution within the Solar System.

In the respective departments in charge of basic planning, design, manufacture, inspection and construction of nuclear power plants, efficient work practices have been advanced through the positive utilization of CAD/CAE systems. This time, a plant-integrated CAE system was developed which heightens the functions of these individual systems and makes work more efficient and advanced by unifying and integrating them. This system is composed of a newly developed application system and a database system, which enables the unified management of engineering data and high-speed data conversion, in addition to the CAD system for three-dimensional plant layout planning. On the basis of rich experience and designers' proposals for improvement from the application of the CAD system for three-dimensional plant layout planning to actual machines, the automation and speed-up of processing and the visualization of input and output through a graphical user interface (GUI) were made feasible in the respective applications. As advancements of the plant CAE system, a scenic engineering system, an integrated layout CAE system, an electric instrumentation design CAE system and a construction planning CAE system are described. As for the integration of plant CAE systems, the integrated engineering database, the combination of plant CAE systems, and operation management in a dispersed network environment are reported. At present, Hitachi Ltd. is working on the construction of an atomic energy product information integrated management system as the second stage of integration. (K.I.)

The Raman Laser Spectrometer (RLS) is one of the Pasteur Payload instruments on the ExoMars mission, within ESA's Aurora Exploration Programme. Raman spectroscopy is based on the analysis of spectral fingerprints due to the inelastic scattering of light when interacting with matter. RLS is composed of three units: the SPU (Spectrometer Unit), the iOH (Internal Optical Head), and the ICEU (Instrument Control and Excitation Unit), plus the harnesses (EH and OH). The iOH focuses the excitation laser on the sample and collects the Raman emission, which is dispersed in the SPU onto a CCD; the analog video data are received, digitized, and transmitted to the processor module in the ICEU. The main sources of noise arise from the sample, the background, and the instrument (laser, CCD, focus, acquisition parameters, operation control). In this last case the sources are mainly perturbations from the optics, dark signal, and readout noise. Flicker noise arising from laser emission fluctuations can also be considered instrument noise. In order to evaluate the SNR of a Raman instrument in a practical manner, it is useful to perform end-to-end measurements on given standard samples. These measurements have to be compared with radiometric simulations using Raman efficiency values from the literature and taking into account the different instrumental contributions to the SNR. The RLS EQM instrument performance results and functionalities have been demonstrated to be in accordance with the science expectations. The SNR performance obtained with the RLS EQM will be compared, experimentally and via analysis, with the Instrument Radiometric Model tool. The characterization process for SNR optimization is still ongoing. The operational parameters and RLS algorithms (fluorescence removal and acquisition parameter estimation) will be improved in future models (EQM-2) until the FM model delivery.

In considering the appropriate use of the terms "science" and "scientific instrument," tracing the history of "mathematical instruments" in the early modern period is offered as an illuminating alternative to the historian's natural instinct to follow the guiding lights of originality and innovation, even if the trail transgresses contemporary boundaries. The mathematical instrument was a well-defined category, shared across the academic, artisanal, and commercial aspects of instrumentation, and its narrative from the sixteenth to the eighteenth century was largely independent from other classes of device, in a period when a "scientific" instrument was unheard of.

In analogy to political sophistication, it is imperative that citizens have a certain level of economic sophistication, especially in times of heated debates about the economy. This study examines the impact of different influences (media, interpersonal communication and personal experiences) on

With the recent interest in the fifth century B.C. theories of Protagoras and Gorgias come assumptions about the philosophical affinity of the Greek educator Isocrates to this pair of older sophists. Isocratean education in discourse, with its emphasis on collaborative political discourse, falls within recent definitions of a sophist curriculum.…

While Aristotle's philosophical views are more foundational than those of many of the Older Sophists, Aristotle's rhetorical theories inherit and incorporate many of the central tenets ascribed to Sophistic rhetoric, albeit in a more systematic fashion, as represented in the "Rhetoric." However, Aristotle was more than just a rhetorical…

The aim of the paper is to make a critical reading of ecocentrism and its meta-scientific use of ecology. First, basic assumptions of ecocentrism will be examined, which involve nature's intrinsic value, postmodern and modern positions in ecocentrism, and the subject-object dichotomy under the lenses of ecocentrism. Then, we will discuss…

Scientific equipment today is very important in modern research and development. Research agencies usually own a large amount of sophisticated and expensive scientific equipment; Nuclear Malaysia is one example. When an organization has a lot of expensive equipment, things are sometimes overlooked that contribute to damage to the equipment, or the equipment is used minimally compared to its price. An audit committee was established to examine the equipment. The purpose of this audit was to determine the current status of the condition, use and frequency of use of scientific equipment at Nuclear Malaysia. The committee was composed of a number of research officers and senior technical assistants, divided into 8 audit teams consisting of 2 officers and a technician each. The audit covered only equipment purchased through quotations or tenders (for example, costing more than RM50,000). The audit process was performed from July 2010 to May 2011. A total of 62 scientific instruments were successfully audited. Results showed that 1.6% of the instruments were fully damaged and 6.5% were under repair, while 90.3% were functioning well. Most of the damaged scientific equipment was attributable to a lack of regular maintenance. Awareness of the importance of regular maintenance and of the use of log books among the officials responsible is still low. The willingness to change attitudes and take more responsibility for scientific equipment poses a challenge to the organization. (author)

In May of 2006, a committee was formed to assess the fundamental needs and opportunities in scientific software for x-ray data reduction, analysis, modeling, and simulation. This committee held a series of discussions throughout the summer, conducted a poll of the members of the x-ray community, and held a workshop. This report details the findings and recommendations of the committee. Each experiment performed at the APS requires three crucial ingredients: the powerful x-ray source, an optimized instrument to perform measurements, and computer software to acquire, visualize, and analyze the experimental observations. While the APS has invested significant resources in the accelerator, investment in other areas such as scientific software for data analysis and visualization has lagged behind. This has led to the adoption of a wide variety of software with variable levels of usability. In order to maximize the scientific output of the APS, it is essential to support the broad development of real-time analysis and data visualization software. As scientists attack problems of increasing sophistication and deal with larger and more complex data sets, software is playing an ever more important role. Furthermore, our need for excellent and flexible scientific software can only be expected to increase, as the upgrade of the APS facility and the implementation of advanced detectors create a host of new measurement capabilities. New software analysis tools must be developed to take full advantage of these capabilities. It is critical that the APS take the lead in software development and in translating theory into software to ensure the continued success of this facility. The topics described in this report are relevant to the APS today and critical for the APS upgrade plan. Implementing these recommendations will have a positive impact on the scientific productivity of the APS today and will be even more critical in the future.

We use citation data of scientific articles produced by individual nations in different scientific domains to determine the structure and efficiency of national research systems. We characterize the scientific fitness of each nation, that is, the competitiveness of its research system, and the complexity of each scientific domain by means of a non-linear iterative algorithm able to assess quantitatively the advantage of scientific diversification. We find that technological leading nations, beyond having the largest production of scientific papers and the largest number of citations, do not specialize in a few scientific domains. Rather, they diversify their research system as much as possible. On the other hand, less developed nations are competitive only in scientific domains where many other nations are also present. Diversification thus represents the key element that correlates with scientific and technological competitiveness. A remarkable implication of this structure of the scientific competition is that the scientific domains playing the role of "markers" of national scientific competitiveness are not necessarily those of high technological requirements, but rather those addressing the most "sophisticated" needs of the society.
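The non-linear iterative algorithm is not spelled out in the abstract; the sketch below is an illustrative fitness/complexity map in the spirit of such algorithms, using a hypothetical binary nation-by-domain competitiveness matrix:

```python
import numpy as np

def fitness_complexity(M, n_iter=100):
    """Illustrative fitness/complexity iteration: a nation's fitness grows
    with the complexity of the domains it is competitive in, while a
    domain's complexity is suppressed whenever low-fitness nations are
    also competitive in it. Both vectors are renormalized each step."""
    n_nations, n_domains = M.shape
    F = np.ones(n_nations)   # nation fitness
    Q = np.ones(n_domains)   # domain complexity
    for _ in range(n_iter):
        F_new = M @ Q                      # sum of complexities of held domains
        Q_new = 1.0 / (M.T @ (1.0 / F))    # penalize domains held by unfit nations
        F = F_new / F_new.mean()
        Q = Q_new / Q_new.mean()
    return F, Q

# Toy matrix: nation 0 is diversified (competitive in every domain),
# nation 1 is competitive only in the domain shared by all nations.
M = np.array([[1, 1, 1],
              [1, 0, 0]], dtype=float)
F, Q = fitness_complexity(M)
```

The diversified nation ends up with the higher fitness, and the domain shared by every nation gets the lowest complexity, which matches the qualitative picture the abstract describes.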

Full Text Available Theory of Mind (ToM), i.e. the ability to understand others' mental states, endows humans with highly adaptive social skills such as teaching or deceiving. Candidate evolutionary explanations have been proposed for the unique sophistication of human ToM among primates. For example, the Machiavellian intelligence hypothesis states that the increasing complexity of social networks may have induced a demand for sophisticated ToM. This type of scenario ignores neurocognitive constraints that may eventually be crucial limiting factors for ToM evolution. In contradistinction, the cognitive scaffolding hypothesis asserts that a species' opportunity to develop sophisticated ToM is mostly determined by its general cognitive capacity (on which ToM is scaffolded). However, the actual relationships between ToM sophistication and either brain volume (a proxy for general cognitive capacity) or social group size (a proxy for social network complexity) are unclear. Here, we let 39 individuals sampled from seven non-human primate species (lemurs, macaques, mangabeys, orangutans, gorillas and chimpanzees) engage in simple dyadic games against artificial ToM players (via a familiar human caregiver). Using computational analyses of primates' choice sequences, we found that the probability of exhibiting a ToM-compatible learning style is mainly driven by species' brain volume (rather than by social group size). Moreover, primates' social cognitive sophistication culminates in a precursor form of ToM, which still falls short of human fully-developed ToM abilities.

Full Text Available This paper provides an overview of the history of scientific instrument collections in Spanish secondary schools, focusing especially on the period around their establishment in the mid-nineteenth century. It describes their most important features as well as their promoters and main users. Attention is also paid to the teaching practices which encouraged different uses of scientific instruments in nineteenth-century classrooms, and to the reasons that led to the progressive abandonment of the nineteenth-century collections with the advent of new pedagogical ideas. First, we briefly describe the collections created at the end of the 18th century. Then we evaluate the mid-nineteenth-century situation, when the Spanish Government supported several projects to provide the new secondary schools with comprehensive physics and chemistry cabinets. Finally, we offer a general overview of the current state of the collections and of several projects and proposals aimed at their use as historical sources, pedagogical tools and objects of great patrimonial and museum value.

The present report documents the deliberations of a large group of experts in neutron scattering and fundamental physics on the need for new neutron sources of greater intensity and more sophisticated instrumentation than those currently available. An additional aspect of the Workshop was a comparison between steady-state (reactor) and pulsed (spallation) sources. The main conclusions were: (1) the case for a new higher-flux neutron source is extremely strong and such a facility will lead to qualitatively new advances in condensed matter science and fundamental physics; (2) to a large extent the future needs of the scientific community could be met with either a 5 x 10^15 n cm^-2 s^-1 steady-state source or a 10^17 n cm^-2 s^-1 peak-flux spallation source; and (3) the findings of this Workshop are consistent with the recommendations of the Major Materials Facilities Committee.

Children born very low birth weight (VLBW) are at risk for delays in the development of self-regulation and effective functional skills, and play serves as an important avenue of early intervention. The current study investigated associations between maternal flexibility and toddler play sophistication in Caucasian, Spanish-speaking Hispanic, English-speaking Hispanic, and Native American toddlers (18-22 months adjusted age) in a cross-sectional cohort of 73 toddlers born VLBW and their mothers. We found that the association between maternal flexibility and toddler play sophistication differed by ethnicity (F(3,65) = 3.34, p = .02). In particular, Spanish-speaking Hispanic dyads evidenced a significant positive association between maternal flexibility and play sophistication of medium effect size. Results for Native Americans were parallel to those of Spanish-speaking Hispanic dyads: the relationship between flexibility and play sophistication was positive and of small-medium effect size. Findings indicate that for Caucasians and English-speaking Hispanics, flexibility evidenced a non-significant (negative and small effect size) association with toddler play sophistication. Significant follow-up contrasts revealed that the associations for Caucasian and English-speaking Hispanic dyads were significantly different from those of the other two ethnic groups. Results remained unchanged after adjusting for the amount of maternal language, an index of maternal engagement and stimulation; and after adjusting for birth weight, gestational age, gender, test age, cognitive ability, as well as maternal age, education, and income. Our results provide preliminary evidence that ethnicity and acculturation may mediate the association between maternal interactive behavior such as flexibility and toddler developmental outcomes, as indexed by play sophistication. Addressing these association differences is particularly important in children born VLBW because interventions targeting parent interaction strategies such as

We assess the predictive accuracy of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 248 multivariate models that differ ... a Laplace innovation assumption improves the pricing in a smaller way. Apart from investigating directly the value of model sophistication in terms of dollar losses, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performance.

This article examines the impact of cognitive ability on ideological voting. We find, using a US sample and a Danish sample, that the effect of cognitive ability rivals that of the traditionally strongest predictor of ideological voting: political sophistication. Furthermore, the results are consistent with the effect of cognitive ability being partly mediated by political sophistication. Much of the effect of cognitive ability remains, however, and is not explained by differences in education or Openness to Experience either. The implications of these results for democratic theory are discussed.

Background: Particularly in higher education, not only a view of science as a means of finding absolute truths (absolutism), but also a view of science as generally tentative (multiplicism) can be unsophisticated and obstructive for learning. Most quantitative epistemic belief inventories neglect this and understand epistemic sophistication as…

Full Text Available A strong link has been established between operational excellence and the degree of sophistication of logistics organization, a function of factors such as performance monitoring, investment in Information Technology [IT] and the formalization of logistics organization, as proposed in the Bowersox, Daugherty, Dröge, Germain and Rogers (1992) Leading Edge model. At the same time, shippers have been increasingly outsourcing their logistics activities to third party providers. This paper, based on a survey of large Brazilian shippers, addresses a gap in the literature by investigating the relationship between dimensions of logistics organization sophistication and drivers of logistics outsourcing. To this end, the dimensions behind the logistics sophistication construct were first investigated. Results from factor analysis led to the identification of six dimensions of logistics sophistication. By means of multivariate logistic regression analyses it was possible to relate some of these dimensions, such as the formalization of the logistics organization, to certain drivers of the outsourcing of logistics activities of Brazilian shippers, such as cost savings. These results indicate the possibility of segmenting shippers according to characteristics of their logistics organization, which may be particularly useful to logistics service providers.

is founded on politically sophisticated individuals having a greater comprehension of news and other mass-mediated sources, which makes them less likely to rely on neighborhood cues as sources of information relevant for political attitudes. Based on a unique panel data set with fine-grained information...

Claims that teaching ethics is particularly important to technical writing. Outlines a classical, sophistic approach to ethics based on the theories and pedagogies of Protagoras, Gorgias, and Isocrates, which emphasizes the Greek concept of "nomos," internal and external deliberation, and responsible action. Discusses problems and…

This article looks at the work of contemporary Dutch fashion designer Alexander van Slobbe (1959) and examines how, since the 1990s, his fashion practices have consistently and consciously put forward a unique reflection on questions related to materiality, sophisticated archaism, luxury,

Full Text Available This longitudinal case study explored Iranian EFL learners’ lexical complexity (LC) through the lenses of Dynamic Systems Theory (DST). Fifty independent essays written by five intermediate to advanced female EFL learners in a TOEFL iBT preparation course over six months constituted the corpus of this study. Three Coh-Metrix indices (Graesser, McNamara, Louwerse, & Cai, 2004; McNamara & Graesser, 2012), three Lexical Complexity Analyzer indices (Lu, 2010, 2012; Lu & Ai, 2011), and four Vocabprofile indices (Cobb, 2000) were selected to measure different dimensions of LC. Results of repeated measures analysis of variance (RM ANOVA) indicated an improvement with regard to only lexical sophistication. Positive and significant relationships were found between time and mean values in the Academic Word List and Beyond-2000 as indicators of lexical sophistication. The remaining seven indices of LC, falling short of significance, tended to flatten over the course of this writing program. Correlation analyses among LC indices indicated that lexical density enjoyed positive correlations with lexical sophistication. However, lexical diversity revealed no significant correlations with either lexical density or lexical sophistication. This study suggests that a DST perspective provides a viable foundation for analyzing lexical complexity.
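Two of the LC dimensions mentioned, lexical diversity and lexical density, can be sketched in a few lines. The content-word set below is a toy stand-in for the part-of-speech tagging that real analyzers such as the Lexical Complexity Analyzer perform:

```python
# Toy stand-in for POS-tagged content words (nouns, verbs, adjectives, adverbs).
CONTENT_WORDS = {"learners", "wrote", "longer", "essays", "rich", "vocabulary"}

def type_token_ratio(tokens):
    """Lexical diversity: distinct word types over total tokens."""
    return len(set(tokens)) / len(tokens)

def lexical_density(tokens):
    """Lexical density: proportion of tokens that are content words."""
    return sum(t in CONTENT_WORDS for t in tokens) / len(tokens)

tokens = "the learners wrote longer essays with rich vocabulary".split()
ttr = type_token_ratio(tokens)      # 8 types / 8 tokens
density = lexical_density(tokens)   # 6 content words / 8 tokens
```

Lexical sophistication, the dimension that improved in the study, would additionally require frequency-band word lists (e.g. the Academic Word List) to classify each type as common or advanced.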

The dependency between the accuracy/uncertainty of storm erosion exceedance estimates obtained via a probabilistic model and the level of sophistication of the structural function (storm erosion model) embedded in the probabilistic model is assessed via the application of Callaghan et al.'s (2008)

Meanwhile, the attainment of such a sophisticated status in Western scientific research has been facilitated by its experimental methodology which has made possible the transfer of knowledge from one generation to another. However, other non- Western forms of knowledge that lack these characteristics are regarded as ...

Full Text Available The world’s most numerous bird, the domestic chicken, and its wild ancestor, the red junglefowl, have long been used as model species for animal behaviour research. Recently, this research has advanced our understanding of the social behaviour, personality, and cognition of fowl, and demonstrated their sophisticated behaviour and cognitive skills. Here, we overview some of this research, starting by describing research investigating the well-developed senses of fowl, before presenting how socially and cognitively complex they can be. The realisation that domestic chickens, our most abundant production animal, are behaviourally and cognitively sophisticated should encourage an increase in general appreciation of and fascination with them. In turn, this should inspire increased use of them as both research and hobby animals, as well as improvements in their unfortunately often poor welfare.

The paper analyses the advantages and disadvantages of the second generation improved technologies and the third generation technologies mainly developed in China in terms of safety and economy. The paper also discusses the maturity of the second generation improved technologies and the sophistication of the third generation technologies, respectively. Meanwhile, the paper proposes that the advantages and disadvantages of the second generation improved technologies and the third generation technologies should be carefully weighed, and that the relationship between maturity and sophistication should be properly handled at the current stage. A two-step strategy should be adopted to solve the problem of insufficient nuclear power capacity while tracking and developing the third generation technologies, so as to ensure the sound and fast development of nuclear power. (authors)

Advances in the field of computing and communications are leading to the development of smart embedded nuclear instruments. These instruments feature highly sophisticated signal-processing algorithms based on FPGAs and ASICs, present-day connectivity provisions, and user interfaces. Developments in connectivity, standards and bus technologies have made it possible to access these instruments over LAN and WAN with suitable reliability and security. To do away with wires, i.e. to access these instruments from any place without wiring, wireless technology has evolved and become an integral part of day-to-day activities. Environmental monitoring can be done remotely if smart antennas are incorporated into these instruments.

The welfare cost of anticipated inflation is quantified in a calibrated model of the U.S. economy that exhibits tractable equilibrium dispersion in wealth and earnings. Inflation does not generate large losses in societal welfare, yet its impact varies noticeably across segments of society depending also on the financial sophistication of the economy. If money is the only asset, then inflation hurts mostly the wealthier and more productive agents, while those poorer and less productive may ev...

According to my experience, the Western world hopelessly fails to understand the Russian mentality, or misinterprets it. In my analysis of the Russian way of thinking, I devoted special attention to the examination of military mentality. I have connected the issue of the Russian way of thinking to the contemporary imperial policies of Putin's Russia, and have attempted to demonstrate the level of sophistication of both. I hope that a better understanding of both the Russian mentality and imperi...

Musical experiences and native language are both known to affect auditory processing. The present work aims to disentangle the influences of native language phonology and musicality on behavioral and subcortical sound feature processing in a population of musically diverse Finnish speakers, and to investigate the specificity of enhancement from musical training. Finnish speakers are highly sensitive to duration cues, since in Finnish vowel and consonant duration determine word meaning. Using a correlational approach with a set of behavioral sound feature discrimination tasks, brainstem recordings, and a musical sophistication questionnaire, we find no evidence for an association between musical sophistication and more precise duration processing in Finnish speakers, either in the auditory brainstem response or in behavioral tasks; musically sophisticated speakers do, however, show enhanced pitch discrimination compared to Finnish speakers with less musical experience, and greater duration modulation in a complex task. These results are consistent with a ceiling effect for certain sound features, set by the phonology of the native language, leaving an opportunity for music experience-based enhancement of sound features not explicitly encoded in the language (such as pitch, which is not explicitly encoded in Finnish). Finally, the pattern of duration modulation in more musically sophisticated Finnish speakers suggests integrated feature processing for greater efficiency in a real-world musical situation. These results have implications for research into the specificity of plasticity in the auditory system, as well as for the interaction of specific language features with musical experiences. PMID:28450829


The National Symposium on Advanced Instrumentation for Nuclear Research was held in Bombay during January 27-29, 1993 at BARC. Progress in modern nuclear research is closely tied to the availability of state-of-the-art instruments and systems. With advancements in experimental techniques and sophisticated detector developments, performance specifications have become more stringent. State-of-the-art techniques and diverse applications of sophisticated nuclear instrumentation systems are discussed, along with indigenous efforts to meet the specific instrumentation needs of research programs in the nuclear sciences. Papers of relevance to nuclear science and technology are indexed separately. (original)

The Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) is a sounding-rocket instrument developed at the National Astronomical Observatory of Japan (NAOJ) as part of an international collaboration. The instrument's main scientific goal is to achieve polarization measurement of the Lyman-alpha line at 121.56 nm, emitted from the solar upper chromosphere and transition region, with an unprecedented 0.1% accuracy. For this purpose, the optics are composed of a Cassegrain telescope coated with a "cold mirror" coating optimized for UV reflection and a dual-channel spectrograph allowing for simultaneous observation of the two orthogonal states of polarization. Although the polarization sensitivity is the most important aspect of the instrument, its spatial and spectral resolutions are also crucial to observe the chromospheric features and resolve the Lyman-alpha profiles. A precise alignment of the optics is required to ensure these resolutions, but experiments under vacuum conditions are needed since Lyman-alpha is absorbed by air, making the alignment experiments difficult. To bypass this issue, we developed methods to align the telescope and the spectrograph separately in visible light. We will explain these methods and present the results of the optical alignment of the CLASP telescope and spectrograph. We will then discuss the combined performance of both parts to derive the expected resolutions of the instrument, and compare them with the flight observations performed on September 3rd, 2015.

The field of health status and quality of life (QoL) measurement, as a formal discipline with a cohesive theoretical framework, accepted methods, and diverse applications, has been evolving for the better part of 30 years. To identify health status and QoL instruments and review them against rigorous criteria as a precursor to creating an instrument library for later dissemination, the Medical Outcomes Trust in 1994 created an independently functioning Scientific Advisory Committee (SAC). In the mid-1990s, the SAC defined a set of attributes and criteria to carry out instrument assessments; 5 years later, it updated and revised these materials to take account of the expanding theories and technologies upon which such instruments were being developed. This paper offers the SAC's current conceptualization of eight key attributes of health status and QoL instruments (i.e., conceptual and measurement model; reliability; validity; responsiveness; interpretability; respondent and administrative burden; alternate forms; and cultural and language adaptations) and the criteria by which instruments would be reviewed on each of those attributes. These are suggested guidelines for the field to consider and debate; as measurement techniques become both more familiar and more sophisticated, we expect that experts will wish to update and refine these criteria accordingly.

The Los Alamos Scientific Laboratory (LASL) has an extensive program for the development of nondestructive assay instrumentation for the quantitative analysis of transuranic (TRU) materials found in bulk solid wastes generated by Department of Energy facilities and by the commercial nuclear power industry. Included are wastes generated in decontamination and decommissioning of outdated nuclear facilities as well as wastes from old waste burial ground exhumation programs. The assay instrumentation is designed to have detection limits below 10 nCi/g wherever practicable. Because of the topic of this workshop, only the assay instrumentation applied specifically to soil monitoring will be discussed here. Four types of soil monitors are described


As part of WFIRST Science Center preparations, the IPAC Science Operations Center (ISOC) maintains a repository of 1) WFIRST data and instrument simulations, 2) tools to facilitate scientific performance and feasibility studies with WFIRST, and 3) parameters summarizing the current design and predicted performance of the WFIRST telescope and instruments. The simulation repository provides access for the science community to simulation code, tools, and resulting analyses. Examples of simulation code with ISOC-built web-based interfaces include EXOSIMS (for estimating exoplanet yields in CGI surveys) and the Galaxy Survey Exposure Time Calculator. In the future, the repository will provide an interface for users to run custom simulations of a wide range of coronagraph instrument (CGI) observations, as well as sophisticated tools for designing microlensing experiments. We encourage those who are generating simulations or writing tools for exoplanet observations with WFIRST to contact the ISOC team so we can work with you to bring these to the attention of the broader astronomical community as we prepare for the exciting science that will be enabled by WFIRST.

This study examines the impact of uncertainty on the sophistication of capital budgeting practices. While the theoretical applications of sophisticated capital budgeting practices (defined as the use of real option reasoning and/or game theory decision rules) have been well documented, empirical

This article analyses the three epideictic orations of Isocrates, which are in themselves precious testimony to the quality of intellectual life at the close of the fourth century before Christ. To this period also belong the Socratics, who are generally seen as an important link between Socrates and Plato. The author proposes a more productive approach to the study of Antisthenes, Euclid of Megara and the other so-called Socratics, revealing them not as independent thinkers but rather as adherents of the sophistic school, and also as teachers, thereby including them among those who took part in the educative activity of their time.

CTF3 (CLIC Test Facility 3), currently under construction at CERN, is a test facility designed to demonstrate the key feasibility issues of the CLIC (Compact LInear Collider) two-beam scheme. When completed, this facility will consist of a 150 MeV linac followed by two rings for bunch interleaving, and a test stand where 30 GHz power will be generated. In this paper, the work carried out on the linac's low-power RF system is described. This includes, in particular, a sophisticated phase-control system for the RF pulse compressor to produce a flat-top rectangular pulse over 1.4 µs.

International Series of Monographs in Nuclear Energy, Volume 107: Radioisotope Instruments, Part 1 focuses on the design and applications of instruments based on the radiation released by radioactive substances. The book first offers information on the physical basis of radioisotope instruments; technical and economic advantages of radioisotope instruments; and radiation hazard. The manuscript then discusses commercial radioisotope instruments, including radiation sources and detectors, computing and control units, and measuring heads. The text describes the applications of radioisotop

The situation regarding photogrammetric instrumentation has changed quite dramatically over the last 2 or 3 years with the withdrawal of most analogue stereo-plotting machines from the marketplace and their replacement by analytically based instrumentation. While there have been few new developments in the field of comparators, there has been an explosive development in the area of small, relatively inexpensive analytical stereo-plotters based on the use of microcomputers. In particular, a number of new instruments have been introduced by manufacturers who mostly have not previously been associated with photogrammetry. Several innovative concepts have been introduced in these small but capable instruments, many of which are aimed at specialised applications, e.g. in close-range photogrammetry (using small-format cameras); for thematic mapping (by organisations engaged in environmental monitoring or resources exploitation); for map revision, etc. Another innovative and possibly significant development has been the production of conversion kits to convert suitable analogue stereo-plotting machines such as the Topocart, PG-2 and B-8 into fully fledged analytical plotters. The larger and more sophisticated analytical stereo-plotters are mostly being produced by the traditional mainstream photogrammetric systems suppliers, with several new instruments and developments being introduced at the top end of the market. These include the use of enlarged photo stages to handle images up to 25 × 50 cm format; the complete integration of graphics workstations into the analytical plotter design; the introduction of graphics superimposition and stereo-superimposition; the addition of correlators for the automatic measurement of height, etc. The software associated with this new analytical instrumentation is now undergoing extensive re-development with the need to supply photogrammetric data as input to the more sophisticated G.I.S. systems now being installed by clients, instead

This study developed detailed estimates, using a comprehensive sensitivity-analysis program, of the reliability of TRU waste repository concepts in crystalline rock. We examined each component and the groundwater scenario of the geological repository, and prepared systematic bases for examining reliability from the standpoint of comprehensiveness. Models and data were refined to examine reliability. Based on an existing TRU waste repository concept, the effects of parameters on nuclide migration were quantitatively classified. These parameters, which are to be decided quantitatively, include the site characteristics of the natural barrier and the design specifications of the engineered barriers. Considering the feasibility of these specifications, reliability was re-examined for combinations of these parameters within a practical range. Future issues are: comprehensive representation of a hybrid geosphere model including both fractured and permeable matrix media, and refinement of the tools used to develop reliable combinations of parameters. It is important to continue this study, because the disposal concepts and specifications for TRU-nuclide-bearing waste at various sites shall be determined rationally and safely through these studies. (author)

Progress in biomedical research is largely driven by improvements, innovations, and breakthroughs in technology, accelerating the research process, and an increasingly complex collaboration of both clinical and basic science. This increasing sophistication has driven the need for centralized shared resource cores ("cores") to serve the scientific community. From a biomedical research enterprise perspective, centralized resource cores are essential to increased scientific, operational, and cost effectiveness; however, the concentration of instrumentation and resources in the cores may render them highly vulnerable to damage from severe weather and other disasters. As such, protection of these assets and the ability to recover from a disaster is increasingly critical to the mission and success of the institution. Therefore, cores should develop and implement both disaster and business continuity plans and be an integral part of the institution's overall plans. Here we provide an overview of key elements required for core disaster and business continuity plans, guidance, and tools for developing these plans, and real-life lessons learned at a large research institution in the aftermath of Superstorm Sandy.


The expression instrumental interaction has been introduced by Claude Cadoz to identify a human-object interaction during which a human manipulates a physical object, an instrument, in order to perform a manual task. Classical examples of instrumental interaction are the professional manual tasks: playing the violin, cutting fabric by hand, moulding a paste, etc. Instrumental interaction differs from other types of interaction (called symbolic or iconic interactio...

Background: Constructing gene co-expression networks from cancer expression data is important for investigating the genetic mechanisms underlying cancer. However, correlation coefficients and linear regression models are not able to model sophisticated relationships among gene expression profiles. Here, we address the 3-way interaction in which two genes' expression levels cluster in different locations under the control of a third gene's expression levels. Results: We present xSyn, a software tool for identifying such 3-way interactions from cancer gene expression data, based on an optimization procedure involving UPGMA (Unweighted Pair Group Method with Arithmetic Mean) and synergy. Its effectiveness is demonstrated by application to 2 real gene expression data sets. Conclusions: xSyn is a useful tool for decoding the complex relationships among gene expression profiles. xSyn is available at http://www.bdxconsult.com/xSyn.html.
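The UPGMA step on which xSyn's optimization relies can be sketched as follows. This is a minimal, generic illustration of the UPGMA agglomeration rule (repeatedly merge the two closest clusters, taking the distance to the merged cluster as the arithmetic mean over all member pairs); it is not xSyn's actual code, and the function name and toy matrix are hypothetical.

```python
import numpy as np

def upgma_merge_order(dist):
    """Return the order in which UPGMA merges clusters.

    dist: symmetric (n, n) distance matrix over n items. New clusters
    get ids n, n+1, ... in order of creation.
    """
    n = dist.shape[0]
    clusters = {i: [i] for i in range(n)}          # id -> original members
    d = {(i, j): dist[i, j] for i in range(n) for j in range(i + 1, n)}
    merges, next_id = [], n
    while len(clusters) > 1:
        (a, b), _ = min(d.items(), key=lambda kv: kv[1])
        members = clusters.pop(a) + clusters.pop(b)
        # UPGMA: distance to the new cluster = mean over all member pairs
        for c in clusters:
            vals = [dist[i, j] for i in members for j in clusters[c]]
            d[(min(c, next_id), max(c, next_id))] = float(np.mean(vals))
        d = {k: v for k, v in d.items() if a not in k and b not in k}
        clusters[next_id] = members
        merges.append((a, b))
        next_id += 1
    return merges

# Toy matrix: items 0 and 1 are close, item 2 is far from both,
# so 0 and 1 merge first and the result (id 3) then joins 2.
merges = upgma_merge_order(np.array([[0.0, 1.0, 5.0],
                                     [1.0, 0.0, 5.0],
                                     [5.0, 5.0, 0.0]]))
```

Production code would normally delegate this to a library routine (e.g. average-linkage clustering in SciPy) rather than the explicit loop shown here.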

Semen limitation (lack of semen to fertilize all of a female's eggs) imposes high fitness costs on female partners. Females should therefore avoid mating with semen-limited males. This can be achieved by using public information extracted from watching individual males' previous copulating activities. This adaptive preference should be flexible, given that semen limitation is temporary. We first demonstrate that the number of offspring produced by male Drosophila melanogaster gradually decreases over successive copulations. We then show that females avoid mating with males they just watched copulating and that visual public cues are sufficient to elicit this response. Finally, after males were given time to replenish their sperm reserves, females no longer avoided the males they had previously seen copulating. These results suggest that female fruit flies may have evolved sophisticated behavioural processes of resistance to semen-limited males, and demonstrate unsuspected adaptive context-dependent mate choice in an invertebrate.

Explores scientific fraud, asserting that while few scientists actually falsify results, the field has become so competitive that many are misbehaving in other ways; an example would be unreasonable criticism by anonymous peer reviewers. (EV)

Instruments are essential for accounting, for surveillance and for protection of nuclear materials. The development and application of such instrumentation is reviewed, with special attention to international safeguards applications. Active and passive nondestructive assay techniques are some 25 years of age. The important advances have been in learning how to use them effectively for specific applications, accompanied by major advances in radiation detectors, electronics, and, more recently, in mini-computers. The progress in seals has been disappointingly slow. Surveillance cameras have been widely used for many applications other than safeguards. The revolution in TV technology will have important implications. More sophisticated containment/surveillance equipment is being developed but has yet to be exploited. On the basis of this history, some expectations for instrumentation in the near future are presented

The article tackles the problem of models of communication in science. The formal division of communication processes into oral and written does not resolve the problem of attitude. The author defines successful communication as a win-win game, based on respect and equality between the partners, regardless of their position in the world of science. The core characteristics of the process of scientific communication are indicated, such as openness, fairness, support, and creation. The task of creating the right atmosphere for science communication belongs to moderators, who should not allow privilege and differences in position to affect scientific communication processes.

Today, for the first time, scientific concerns are seriously being addressed that span future times: hundreds, even thousands, or more years in the future. One is witnessing what the author calls scientific millenarianism. Are such concerns for the distant future exercises in futility, or are they real issues that, to the everlasting gratitude of future generations, this generation has identified, warned about, and even suggested how to cope with in the distant future? Can the four potential catastrophes (bolide impact, CO2 warming, radioactive wastes and thermonuclear war) be avoided by technical fixes, institutional responses, religion, or by doing nothing? These are the questions addressed in this paper.

One of the main aims of the IAEA is to foster the exchange of scientific and technical information and one of the main ways of doing this is to convene international scientific meetings. They range from large international conferences bringing together several hundred scientists, smaller symposia attended by an average of 150 to 250 participants and seminars designed to instruct rather than inform, to smaller panels and study groups of 10 to 30 experts brought together to advise on a particular programme or to develop a set of regulations. The topics of these meetings cover every part of the Agency's activities and form a backbone of many of its programmes. (author)

This annual scientific and technological report provides an overview of research and development activities at the Peruvian Institute of Nuclear Energy (IPEN) during the period from 1 January to 31 December 2010. The report includes 41 papers divided into 8 subject areas: physics and chemistry, materials science, nuclear engineering, mining, industrial and environmental applications, medical and biological applications, radiation protection and nuclear safety, scientific instrumentation, and management aspects. It also includes annexes. (APC)

This annual scientific and technological report provides an overview of research and development activities at the Peruvian Institute of Nuclear Energy (IPEN) during the period from 1 January to 31 December 2011. The report includes 30 papers divided into 8 subject areas: physics and chemistry, materials science, nuclear engineering, mining, industrial and environmental applications, medical and biological applications, radiation protection and nuclear safety, scientific instrumentation, and management aspects. It also includes annexes. (APC)

Economic complexity reflects the amount of knowledge that is embedded in the productive structure of an economy. By combining tools from network science and econometrics, a robust and stable relationship between a country's productive structure and its economic growth has been established. Here we report that not only goods but also services are important for predicting the rate at which countries will grow. By adopting a terminology which classifies manufactured goods and delivered services as products, we investigate the influence of services on the country's productive structure. In particular, we provide evidence that complexity indices for services are in general higher than those for goods, which is reflected in a general tendency to rank countries with a developed service sector higher than countries whose economies are centred on the manufacturing of goods. By focusing on country dynamics based on experimental data, we investigate the impact of services on the economic complexity of countries measured in the product space (consisting of both goods and services). Importantly, we show that the diversification and sophistication of service exports can provide an additional route to economic growth in both developing and developed countries.
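Complexity indices of this kind are typically built from a binary country-product matrix via diversity, ubiquity, and the iterative "method of reflections". The sketch below illustrates one iteration on an invented toy matrix; it is a generic illustration of the standard construction, not the authors' actual pipeline.

```python
import numpy as np

# Toy binary country x product matrix M: M[c, p] = 1 if country c
# exports product p competitively. Values are invented for illustration.
M = np.array([[1, 1, 1],    # diversified country
              [1, 1, 0],
              [1, 0, 0]])   # exports only the most ubiquitous product

diversity = M.sum(axis=1)   # k_{c,0}: number of products per country
ubiquity = M.sum(axis=0)    # k_{p,0}: number of countries per product

# One step of the method of reflections: a country's score becomes the
# average ubiquity of its products; a product's score becomes the
# average diversity of its exporters. Lower average ubiquity suggests
# a more complex (less easily imitated) export basket.
k_c1 = (M @ ubiquity) / diversity
k_p1 = (M.T @ diversity) / ubiquity
```

Iterating these two updates (with normalization) converges toward the eigenvector-based complexity rankings used in the literature; here the diversified country 0 already gets the lowest average ubiquity.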

This study introduces the second release of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES 2.0), a freely available and easy-to-use text analysis tool. TAALES 2.0 is housed on a user's hard drive (allowing for secure data processing) and is available on most operating systems (Windows, Mac, and Linux). TAALES 2.0 adds 316 indices to the original tool. These indices are related to word frequency, word range, n-gram frequency, n-gram range, n-gram strength of association, contextual distinctiveness, word recognition norms, semantic network, and word neighbors. In this study, we validated TAALES 2.0 by investigating whether its indices could be used to model both holistic scores of lexical proficiency in free writes and word choice scores in narrative essays. The results indicated that the TAALES 2.0 indices could be used to explain 58% of the variance in lexical proficiency scores and 32% of the variance in word-choice scores. Newly added TAALES 2.0 indices, including those related to n-gram association strength, word neighborhood, and word recognition norms, featured heavily in these predictor models, suggesting that TAALES 2.0 represents a substantial upgrade.
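As a rough illustration of what a frequency-based lexical sophistication index measures, the sketch below scores a text by the mean log frequency of its words against a reference list. The frequency table and function name are invented for illustration; TAALES itself draws on large reference corpora and computes hundreds of such indices.

```python
import math
import re

# Toy reference list of word frequencies (counts per million words).
# A real index would use a large corpus-derived frequency database.
FREQ = {"the": 50000.0, "of": 30000.0, "cat": 80.0, "sat": 40.0,
        "perspicacious": 0.5}

def mean_log_frequency(text, freq=FREQ, floor=0.1):
    """Mean log10 frequency of the words in `text`.

    Rarer words pull the score down, so lower scores indicate
    greater lexical sophistication. Unknown words get `floor`.
    """
    words = re.findall(r"[a-z]+", text.lower())
    return sum(math.log10(freq.get(w, floor)) for w in words) / len(words)

simple = mean_log_frequency("the cat sat")            # common words
fancy = mean_log_frequency("the perspicacious cat")   # one rare word
```

The log transform is the usual choice here because word frequencies are heavily skewed; without it a single very common word would dominate the average.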

The recently developed 'two-step' behavioural task promises to differentiate model-based from model-free reinforcement learning, while generating neurophysiologically-friendly decision datasets with parametric variation of decision variables. These desirable features have prompted its widespread adoption. Here, we analyse the interactions between a range of different strategies and the structure of transitions and outcomes in order to examine constraints on what can be learned from behavioural performance. The task involves a trade-off between the need for stochasticity, to allow strategies to be discriminated, and a need for determinism, so that it is worth subjects' investment of effort to exploit the contingencies optimally. We show through simulation that under certain conditions model-free strategies can masquerade as being model-based. We first show that seemingly innocuous modifications to the task structure can induce correlations between action values at the start of the trial and the subsequent trial events in such a way that analysis based on comparing successive trials can lead to erroneous conclusions. We confirm the power of a suggested correction to the analysis that can alleviate this problem. We then consider model-free reinforcement learning strategies that exploit correlations between where rewards are obtained and which actions have high expected value. These generate behaviour that appears model-based under these, and also more sophisticated, analyses. Exploiting the full potential of the two-step task as a tool for behavioural neuroscience requires an understanding of these issues.
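The model-free baseline at issue can be illustrated with a minimal simulation of a simplified two-step task. The transition and reward probabilities below are illustrative only, not the published task parameters, and the learner is a plain TD(0) agent over first-stage action values; a model-based agent would instead weight outcomes by the known transition structure.

```python
import random

random.seed(0)

# Simplified two-step task: first-stage action 0/1 leads to second-stage
# state "A"/"B" via a common transition (p = 0.7) or a rare one (p = 0.3);
# each second-stage state pays reward 1 with its own probability.
P_COMMON = 0.7
REWARD_P = {"A": 0.8, "B": 0.2}

def step(action):
    common = random.random() < P_COMMON
    state = ("A", "B")[action] if common else ("B", "A")[action]
    reward = 1 if random.random() < REWARD_P[state] else 0
    return state, reward

# Model-free TD(0): update the chosen first-stage action's value
# directly from the obtained reward, ignoring transition structure.
q = [0.0, 0.0]
alpha, epsilon = 0.1, 0.1
for _ in range(2000):
    a = 0 if q[0] >= q[1] else 1
    if random.random() < epsilon:      # epsilon-greedy exploration
        a = random.randrange(2)
    _, r = step(a)
    q[a] += alpha * (r - q[a])
```

With these toy parameters the expected reward is 0.62 for action 0 and 0.38 for action 1, so the model-free learner comes to prefer action 0; the analyses discussed in the abstract hinge on how such a learner reacts trial-by-trial to rare versus common transitions.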



The development of a powerful numerical model to simulate the fracture behavior of concrete material has long been one of the dominant research areas in earthquake engineering. A reliable model should be able to adequately represent the discontinuous characteristics of cracks and simulate various failure behaviors under complicated loading conditions. In this paper, a numerical formulation, which incorporates a sophisticated rigid-plastic interface constitutive model coupling cohesion softening, contact, friction and shear dilatation into the XFEM, is proposed to describe various crack behaviors of concrete material. An effective numerical integration scheme for accurately assembling the contribution to the weak form on both sides of the discontinuity is introduced. The effectiveness of the proposed method has been assessed by simulating several well-known experimental tests. It is concluded that the numerical method can successfully capture the crack paths and accurately predict the fracture behavior of concrete structures. The influence of mode-II parameters on the mixed-mode fracture behavior is further investigated to better determine these parameters.
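
As a minimal illustration of one ingredient of such an interface law, the sketch below implements linear cohesion softening only; the contact, friction and shear-dilatation terms of the full rigid-plastic model are omitted, and the constants are illustrative rather than taken from the paper:

```python
def cohesive_traction(opening, c0=3.0e6, w_c=1.0e-4):
    """Linear cohesion-softening law: the transmitted traction (Pa)
    decays from the initial cohesion c0 to zero at the critical
    crack opening w_c (m). Illustrative constants only."""
    if opening <= 0.0:
        return c0                               # crack closed: full cohesion retained
    return max(0.0, c0 * (1.0 - opening / w_c))

t_half = cohesive_traction(5.0e-5)              # traction at half the critical opening
```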

Full Text Available Opportunity identification is the key element of the entrepreneurial process; therefore the issue of developing this skill in students is a crucial task in contemporary European education, which has recognized entrepreneurship as one of the key competences for lifelong learning. The earlier opportunity identification becomes a habitual way of thinking and behaving across a broad range of contexts, the more likely it is that an entrepreneurial disposition will steadily reside in students. In order to nurture opportunity identification in students, so as to make them able to organize sophisticated businesses in the future, certain demands ought to be put forward to the teacher as well, as the person who is to promote these qualities in their students. The paper reflects some findings of research conducted within the framework of a workplace learning project for the teachers of one of Riga's secondary schools (Latvia). The main goal of the project was to teach the teachers to identify hidden inner links between apparently unrelated things, phenomena and events within the 10th grade study curriculum, to connect them together and to create new opportunities. The creation and solution of cross-disciplinary tasks were the means for achieving this goal.

We develop a three-step computing approach to explore a hierarchical ranking network for a society of captive rhesus macaques. The computed network is sufficiently informative to address the question: Is the ranking network for a rhesus macaque society more like a kingdom or a corporation? Our computations are based on a three-step approach. These steps are devised to deal with the tremendous challenges stemming from the transitivity of dominance as a necessary constraint on the ranking relations among all individual macaques, and the very high sampling heterogeneity in the behavioral conflict data. The first step simultaneously infers the ranking potentials among all network members, which requires accommodation of heterogeneous measurement error inherent in behavioral data. Our second step estimates the social rank for all individuals by minimizing the network-wide errors in the ranking potentials. The third step provides a way to compute confidence bounds for selected empirical features in the social ranking. We apply this approach to two sets of conflict data pertaining to two captive societies of adult rhesus macaques. The resultant ranking network for each society is found to be a sophisticated mixture of both a kingdom and a corporation. Also, for validation purposes, we reanalyze conflict data from twenty longhorn sheep and demonstrate that our three-step approach is capable of correctly computing a ranking network by eliminating all ranking error.
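
The rank-estimation step can be caricatured with a far simpler procedure than the paper's three-step inference: order individuals by total wins and count dominance reversals as a ranking-error measure. A toy sketch under these simplifying assumptions:

```python
import numpy as np

def rank_by_wins(win):
    """Order individuals by total wins in the conflict matrix win[i, j]
    (number of times i beat j), and count dominance reversals: pairs
    where a lower-ranked individual beat a higher-ranked one more often.
    A toy stand-in for the paper's three-step inference."""
    totals = win.sum(axis=1)
    order = list(np.argsort(-totals))          # most wins first
    reversals = sum(
        1
        for i, a in enumerate(order)
        for b in order[i + 1:]
        if win[b, a] > win[a, b]
    )
    return order, reversals

win = np.array([[0, 5, 4],
                [1, 0, 6],
                [0, 2, 0]])
order, reversals = rank_by_wins(win)
```

A reversal count of zero means the ordering is consistent with the transitivity-of-dominance constraint on this sample; heterogeneous sampling error, which the paper's first step accommodates, is ignored here.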

The science objectives for proposed NASA missions for the next decades push the state of the art in sensitivity and spatial resolution over a wide range of wavelengths, from the x-ray to the submillimeter. While some of the proposed missions are larger and more sensitive versions of familiar concepts, such as the next generation space telescope, others use concepts common on the Earth but new to space, such as optical interferometry, in order to provide spatial resolutions impossible with other concepts. However, regardless of architecture, the performance of all of the proposed missions depends critically on the back-end instruments that process the collected energy to produce scientifically interesting outputs. The Advanced Optical Instruments Technology panel was chartered with defining technology development plans that would best improve optical instrument performance for future astrophysics missions. At this workshop the optical instrument was defined as the set of optical components that reimage the light from the telescope onto the detectors to provide information about the spatial, spectral, and polarization properties of the light. This definition was used to distinguish the optical instrument technology issues from those associated with the telescope, which were covered by a separate panel. The panel identified several areas for optical component technology development: diffraction gratings; tunable filters; interferometric beam combiners; optical materials; and fiber optics. The panel also determined that stray light suppression instruments, such as coronagraphs and nulling interferometers, were in need of general development to support future astrophysics needs.

5G is the upcoming evolution of current cellular networks that aims at satisfying the future demand for data services. Heterogeneous cloud radio access networks (H-CRANs) are envisioned as a new trend of 5G that exploits the advantages of heterogeneous and cloud radio access networks to enhance spectral and energy efficiency. Remote radio heads (RRHs) are small cells utilized to provide high data rates for users with high quality of service (QoS) requirements, while a high-power macro base station (BS) is deployed for coverage maintenance and service of low-QoS users. Inter-tier interference between macro BSs and RRHs and energy efficiency are critical challenges that accompany resource allocation in H-CRANs. Therefore, we propose an efficient resource allocation scheme using online learning, which mitigates interference and maximizes energy efficiency while maintaining QoS requirements for all users. The resource allocation includes resource blocks (RBs) and power. The proposed scheme is implemented using two approaches: centralized, where the resource allocation is processed at a controller integrated with the baseband processing unit, and decentralized, where macro BSs cooperate to achieve the optimal resource allocation strategy. To foster the performance of such a sophisticated scheme with model-free learning, we consider users' priority in RB allocation and a compact state representation learning methodology to improve the speed of convergence and account for the curse of dimensionality during the learning process. The proposed scheme, including both approaches, is implemented using a software-defined radio testbed. The obtained results and simulation results confirm that the proposed resource allocation solution in H-CRANs increases the energy efficiency significantly and maintains users' QoS.
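
The online-learning component can be illustrated with a tabular Q-learning toy in which a hypothetical reward stands in for energy efficiency; the state dynamics, reward function and parameters below are illustrative and not the scheme's actual model:

```python
import random

def q_learning_rb(n_states, n_actions, reward_fn, episodes=3000,
                  alpha=0.2, gamma=0.5, eps=0.2, seed=0):
    """Tabular Q-learning toy for resource-block selection.
    reward_fn(state, action) returns an energy-efficiency proxy;
    states cycle deterministically as a stand-in for network dynamics."""
    rng = random.Random(seed)
    Q = [[0.0] * n_actions for _ in range(n_states)]
    state = 0
    for _ in range(episodes):
        if rng.random() < eps:                              # epsilon-greedy exploration
            action = rng.randrange(n_actions)
        else:
            action = max(range(n_actions), key=lambda a: Q[state][a])
        r = reward_fn(state, action)
        nxt = (state + 1) % n_states                        # toy round-robin dynamics
        Q[state][action] += alpha * (r + gamma * max(Q[nxt]) - Q[state][action])
        state = nxt
    return Q

# Toy reward: RB matching the state index suffers least interference.
Q = q_learning_rb(3, 3, lambda s, a: 1.0 if a == s else 0.0)
best = [max(range(3), key=lambda a: Q[s][a]) for s in range(3)]
```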

The role of nursing home (NH) information technology (IT) in quality improvement has not been clearly established, and its impacts on communication between caregivers and patient outcomes in these settings deserve further attention. In this research, we describe a mixed-method approach to explore communication strategies used by healthcare providers for resident skin risk in NHs with high IT sophistication (ITS). The sample included NHs participating in the statewide survey of ITS. We incorporated rigorous observation of 8- and 12-h shifts, and focus groups to identify how NH IT and a range of synchronous and asynchronous tools are used. Social network analysis tools and qualitative analysis were used to analyze data and identify relationships between ITS dimensions and communication interactions between care providers. Two of the nine ITS dimensions (resident care-technological and administrative activities-technological) and total ITS were significantly negatively correlated with the number of unique interactions. As more processes in resident care and administrative activities are supported by technology, the lower the number of observed unique interactions. Additionally, four thematic areas emerged from staff focus groups that demonstrate how important IT is to resident care in these facilities, including providing resident-centered care, teamwork and collaboration, maintaining safety and quality, and using standardized information resources. Our findings in this study confirm prior research that as technology support (resident care and administrative activities) and overall ITS increase, observed interactions between staff members decrease. Conversations during staff interviews focused on how technology facilitated resident-centered care through enhanced information sharing, greater virtual collaboration between team members, and improved care delivery. These results provide evidence for improving the design and implementation of IT in long-term care systems to support

The largest supercomputers have millions of independent processors, and concurrency levels are rapidly increasing. For ideal efficiency, developers of the simulations that run on these machines must ensure that computational work is evenly balanced among processors. Assigning work evenly is challenging because many large modern parallel codes simulate behavior of physical systems that evolve over time, and their workloads change over time. Furthermore, the cost of imbalanced load increases with scale because most large-scale scientific simulations today use a Single Program Multiple Data (SPMD) parallel programming model, and an increasing number of processors will wait for the slowest one at the synchronization points. To address load imbalance, many large-scale parallel applications use dynamic load balance algorithms to redistribute work evenly. The research objective of this dissertation is to develop methods to decide when and how to load balance the application, and to balance it effectively and affordably. We measure and evaluate the computational load of the application, and develop strategies to decide when and how to correct the imbalance. Depending on the simulation, a fast, local load balance algorithm may be suitable, or a more sophisticated and expensive algorithm may be required. We developed a model for comparison of load balance algorithms for a specific state of the simulation that enables the selection of a balancing algorithm that will minimize overall runtime.
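
The decision of when to balance can be sketched as a simple cost comparison: rebalance when the projected waiting time caused by the current imbalance over the remaining timesteps exceeds the one-off cost of redistribution. A simplified sketch of such a model (the dissertation's actual model is richer):

```python
def should_rebalance(loads, steps_remaining, rebalance_cost):
    """Rebalance when the projected SPMD waiting cost of the current
    imbalance over the remaining timesteps exceeds the one-off cost
    of redistributing work. loads are per-processor work times."""
    avg = sum(loads) / len(loads)
    imbalance_per_step = max(loads) - avg   # time the slowest rank makes the others wait
    return imbalance_per_step * steps_remaining > rebalance_cost

decide = should_rebalance([10.0, 10.0, 16.0], steps_remaining=50, rebalance_cost=100.0)
```

The same comparison, applied per algorithm with its own cost estimate, lets a runtime choose between a cheap local balancer and a more expensive global one.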

Instrumentation is developed for the Civilian Radioactive Waste Management Program to meet several different (and sometimes conflicting) objectives. This paper addresses instrumentation development for data needs that are related either directly or indirectly to a repository site, but does not touch on instrumentation for work with waste forms or other materials. Consequently, this implies a relatively large scale for the measurements, and an in situ setting for instrument performance. In this context, instruments are needed for site characterization to define phenomena, develop models, and obtain parameter values, and for later design and performance confirmation testing in the constructed repository. The former set of applications is more immediate, and is driven by the needs of program design and performance assessment activities. A host of general technical and nontechnical issues have arisen to challenge instrumentation development. Instruments can be classed into geomechanical, geohydrologic, or other specialty categories, but these issues cut across artificial classifications. These issues are outlined. Despite this imposing list of issues, several case histories are cited to evaluate progress in the area

The measurement of two-phase flow phenomena in transient conditions representative of a Loss-of-Coolant Accident requires the use of sophisticated instruments and the further development of other instruments. Measurements made in large size pipes are often flow regime dependent. The flow regimes encountered depend upon the system geometry, transient effects, heat transfer, etc. The geometries in which these measurements must be made, the instruments which are currently used, new instruments being developed, the facilities used to calibrate these instruments, and the improvements which must be made to measurement capabilities are described.

Functionalized poly(p-xylylenes) or so-called reactive polymers can be synthesized via chemical vapor deposition (CVD) polymerization. The resulting ultra-thin coatings are pinhole-free and can be conformally deposited to a wide range of substrates and materials. More importantly, the equipped functional groups can serve as anchoring sites for tailoring the surface properties, making these reactive coatings a robust platform that can deal with sophisticated challenges faced in biointerfaces. In the work presented herein, surface coatings presenting various functional groups were prepared by the CVD process. Such surfaces include an aldehyde-functionalized coating to precisely immobilize saccharide molecules onto well-defined areas and an alkyne-functionalized coating to click azide-modified molecules via the Huisgen 1,3-dipolar cycloaddition reaction. Moreover, CVD copolymerization was conducted to prepare multifunctional coatings, and their specific functions were demonstrated by the immobilization of biotin and NHS-ester molecules. By using a photodefinable coating, polyethylene oxides were immobilized onto a wide range of substrates through photo-immobilization. Spatially controlled protein-resistant properties were characterized by selective adsorption of fibrinogen and bovine serum albumin as model systems. Alternatively, surface initiator coatings were used for polymer grafting of poly(ethylene glycol) methyl ether methacrylate, and the resultant protein- and cell-resistant properties were characterized by adsorption of kinesin motor proteins, fibrinogen, and murine fibroblasts (NIH3T3). Accessibility of reactive coatings within confined microgeometries was systematically studied, and the preparation of homogeneous polymer thin films on the inner surface of microchannels was demonstrated. Moreover, these advanced coatings were applied to develop a dry adhesion process for microfluidic devices. This process provides (i) excellent bonding strength, (ii) extended

This textbook deals with instrumental analysis and consists of nine chapters. It covers an introduction to analytical chemistry, the process of analysis and the types and forms of analysis; electrochemistry, including basic theory, potentiometry and conductometry; electromagnetic radiation and optical components, with an introduction and applications; ultraviolet and visible spectrophotometry; atomic absorption spectrophotometry, with an introduction, flame emission spectrometry and plasma emission spectrometry; and further topics such as infrared spectrophotometry, X-ray spectrophotometry and mass spectrometry, chromatography, and other instrumental analysis methods such as radiochemistry.

A description of instrumentation used in the Loss-of-Fluid Test (LOFT) large break Loss-of-Coolant Experiments is presented. Emphasis is placed on hydraulic and thermal measurements in the primary system piping and components, reactor vessel, and pressure suppression system. In addition, instrumentation which is being considered for measurement of phenomena during future small break testing is discussed.

...[supreg] portable document format at the following address: http://sites.wff.nasa.gov/code250/BPO_PEA.php... over 25 years. Balloons are used to collect scientific data and conduct research on the atmosphere and... has seen a dramatic increase in sophistication of experiments and demands for service. Due to the...

The instrument of scientific publishing, originally a necessary tool to enable development of a global science, has evolved relatively little in response to technological advances. Current scientific publishing practices incentivize a number of harmful approaches to research. Health Psychology

This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two…
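
The utility-maximizing prescriptive rule for approval voting under an ignorance prior reduces, in its standard form, to approving every candidate whose utility exceeds the mean; a minimal sketch with hypothetical utilities:

```python
def approval_ballot(utilities):
    """Prescriptive approval rule under an ignorance prior: approve
    every candidate whose utility exceeds the mean utility of all
    candidates (the expected-utility-maximizing ballot when all
    election outcomes are treated as equally likely)."""
    mean = sum(utilities.values()) / len(utilities)
    return {c for c, u in utilities.items() if u > mean}

ballot = approval_ballot({"A": 9, "B": 5, "C": 1})
```

The behavioral twist studied in such projects is whether real voters' ballots deviate from this above-mean cutoff.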

This article proposes an analysis of Plato's "Sophist" (230b4--e5) that underlines the bond between the logical and the emotional components of the Socratic "elenchus", with the aim of depicting the social valence of this philosophical practice. The use of emotions characterizing the 'elenctic' method described by Plato is…

Full Text Available During the history of humankind, since our first ancestors, tools have represented a means to reach objectives which might otherwise have seemed impossible. In the so-called New Economy, where tangible assets appear to be losing their role as the core element producing value versus knowledge, tools have remained at man's side in his daily work. In this article, the author's objective is to describe, in a simple manner, the importance of managing the organization's group of tools or instruments (Instrumental Capital). The characteristic conditions of this New Economy, the way Knowledge Management deals with these new conditions and the sub-processes that provide support to the management of Instrumental Capital are described.

Proper nuclear instrument maintenance is the essential precondition for any experimental work in nuclear sciences and technology. With the rapidly increasing sophistication of nuclear instrumentation, and considering the rather specific conditions that prevail in many IAEA Member States, this topic is gaining in importance and has strong economic implications. There is a general opinion that regional, and possibly interregional, cooperation in the field might be advantageous and economically beneficial to all participating parties. The experience with such cooperation is limited, but sufficient for some reliable observations to be made, some conclusions to be drawn, and some recommendations for possible future development to be presented.

This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

At this year's particle physics conference at Brighton, a parallel session was given over to instrumentation and detector development. While this work is vital to the health of research and its continued progress, its share of prime international conference time is limited. Instrumentation can be innovative three times — first when a new idea is outlined, secondly when it is shown to be feasible, and finally when it becomes productive in a real experiment, amassing useful data rather than operational experience. Hyams' examples showed that it can take a long time for a new idea to filter through these successive stages, if it ever makes it at all

Full Text Available Every neutron scattering experiment requires the choice of a suitable neutron diffractometer (or spectrometer, in the case of inelastic scattering) with its optimal configuration in order to accomplish the experimental tasks in the most successful way. Most generally, the compromise between the incident neutron flux and the instrumental resolution has to be considered, which depends on a number of optical devices positioned in the neutron beam path. In this chapter the basic instrumental principles of neutron diffraction will be explained. Examples of different types of experiments and their respective expected results will be shown. Furthermore, the production and use of polarized neutrons will be stressed.
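
The basic diffraction relation behind such instrument configuration is Bragg's law, nλ = 2d sin θ, which fixes the lattice spacing probed at a given wavelength and scattering angle; a minimal sketch (illustrative values):

```python
import math

def d_spacing(wavelength, two_theta_deg, order=1):
    """Bragg's law, n*lambda = 2*d*sin(theta): lattice spacing d
    probed at scattering angle 2-theta (degrees) for a given
    wavelength (same length unit as the returned d, e.g. angstroms)."""
    theta = math.radians(two_theta_deg / 2.0)
    return order * wavelength / (2.0 * math.sin(theta))

d = d_spacing(2.36, 90.0)   # thermal neutrons scattered through 90 degrees
```

Tightening the wavelength band or the angular acceptance sharpens the resolution in d but cuts the usable flux, which is exactly the trade-off described above.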

The present invention relates to a surgical instrument for minimally invasive surgery, comprising a handle, a shaft and an actuating part, characterised by a gastight cover surrounding the shaft, wherein the cover is provided with a coupler that has a feed-through opening with a lockable seal.

Long-baseline interferometry has been traditionally regarded as a very technical method with a limited number of applications. At the turn of the millennium, with the introduction of large community-oriented facilities such as the VLTI, we are seeing a transformation into a major astronomical technique, with a corresponding dramatic increase in the number of publications. This book marks this transition, by providing a wide and deep insight into a large number of diverse and compelling results recently obtained by long-baseline interferometers. It also provides an insight into concepts which form the basis of future instruments and further developments in this technique.
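
The gain in spatial resolution driving this transition follows from the fringe scale λ/B of a baseline B, which for a hundred-metre baseline outresolves any single-aperture telescope at the same wavelength; a minimal sketch with illustrative values:

```python
import math

def fringe_resolution_mas(wavelength_m, baseline_m):
    """Angular fringe scale lambda/B of an interferometer baseline,
    converted from radians to milliarcseconds."""
    rad = wavelength_m / baseline_m
    return math.degrees(rad) * 3600.0 * 1000.0

res = fringe_resolution_mas(2.2e-6, 100.0)   # K band on a 100 m baseline
```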

While eBanking security continues to increase in sophistication to protect against threats, the usability of eBanking decreases, resulting in poor security behaviours by users. The current research evaluates security risks and measures taken for eBanking solutions. A case study is presented describing how increased complexity decreases vulnerabilities online but increases vulnerabilities from internal threats and eBanking users.

This article sums up the Research and Development effort at present being carried out in the five following fields of applications: Health physics and Radioprospection, Control of nuclear reactors, Plant control (preparation and reprocessing of the fuel, testing of nuclear substances, etc.), Research laboratory instrumentation, Detectors. It also sets the place of French industrial activities by means of an estimate of the French market, production and flow of trading with other countries.

Although the division of the zodiac into 360° probably derives from Egypt or Assyria around 2000 BC, there is no surviving evidence of Mesopotamian cultures embodying this division into a mathematical instrument. Almost certainly, however, it was from Babylonia that the Greek geometers learned of the 360° circle, and by c. 80 BC they had incorporated it into that remarkably elaborate device gener...

It is essential to any research activity that accurate and efficient measurements be made for the experimental parameters under consideration for each individual experiment or test. Satisfactory measurements in turn depend upon having the necessary instruments and the capability of ensuring that they are performing within their intended specifications. This latter requirement can only be achieved by providing an adequate maintenance facility, staffed with personnel competent to understand the problems associated with instrument adjustment and repair. The Instrument Repair Shop at the Lawrence Berkeley Laboratory is designed to achieve this end. The organization, staffing and operation of this system is discussed. Maintenance policy should be based on studies of (1) preventive vs. catastrophic maintenance, (2) records indicating when equipment should be replaced rather than repaired and (3) priorities established to indicate the order in which equipment should be repaired. Upon establishing a workable maintenance policy, the staff should be instructed so that they may provide appropriate scheduled preventive maintenance, calibration and corrective procedures, and emergency repairs. The education, training and experience of the maintenance staff is discussed along with the organization for an efficient operation. The layout of the various repair shops is described in the light of laboratory space and financial constraints

This report of the Centre d'Etude de l'Energie Nucleaire - Studiecentrum voor Kernenergie gives a survey of the scientific and technical work done in 1978. The research areas are: 1. The sodium cooled fast reactor and namely the mixed oxide fuels, the carbide fuel, the materials development, the reprocessing, the fast reactor physics, the safety and instrumentation and the sodium technology. 2. The gas cooled reactors as gas cooled fast and high temperature reactors. 3. The light water reactors namely the BR3 reactor, the light water reactor fuels and the plutonium recycling. 4. The applied nuclear research, waste conditioning and disposal as the safeguards, the fusion research and the lithium technology. 5. The basic and exploratory research namely the materials science and the nuclear physics and finally 6. Non-nuclear research and development such as the air pollution, the pollution abatement and waste handling, the fuel cells and applied electrochemistry. (AF)

This report of the Centre d'Etude de l'Energie Nucleaire - Studiecentrum voor Kernenergie gives a survey of the scientific and technical work done in 1977. The research areas are: 1. The sodium cooled fast reactors and namely the mixed oxide fuels, the carbide fuel, the materials development, the reprocessing, the fast reactor physics, the safety and instrumentation and the sodium technology. 2. The gas cooled reactors as gas cooled fast and high temperature reactors. 3. The light water reactors namely the BR3 reactor, the light water reactor fuels and the plutonium recycling. 4. The applied nuclear research, waste conditioning and disposal as the safeguards, the fusion research and the lithium technology. 5. The basic and exploratory research namely the materials science and the nuclear physics and finally 6. Non-nuclear research and development such as the air pollution, the pollution abatement and waste handling, the fuel cells and applied electrochemistry. (AF)

This report of the Centre d'Etude de l'Energie Nucleaire - Studiecentrum voor Kernenergie gives a survey of the scientific and technical work done in 1976. The research areas are: 1. The sodium cooled fast reactors and namely the mixed oxide fuels, the carbide fuel, the materials development, the reprocessing, the fast reactor physics, the safety and instrumentation and the sodium technology. 2. The gas cooled reactors as gas cooled fast and high temperature reactors. 3. The light water reactors namely the BR3 reactor, the light water reactor fuels and the plutonium recycling. 4. The applied nuclear research, waste conditioning and disposal as the safeguards, the fusion research and the lithium technology. 5. The basic and exploratory research namely the materials science and the nuclear physics and finally 6. Non-nuclear research and development such as the air pollution, the pollution abatement and waste handling, the fuel cells and applied electrochemistry

This report of the Centre d'Etude de l'Energie Nucleaire - Studiecentrum voor Kernenergie - gives a survey of the scientific and technical work done in 1980. The research areas are: 1. The sodium cooled fast reactor and namely the mixed oxide fuels, the carbide fuel, the materials development, the reprocessing, the fast reactor physics, the safety and instrumentation and the sodium technology. 2. The gas cooled reactors as gas cooled fast and high temperature reactors. 3. The light water reactors, namely the BR3 reactor, the light water reactor fuels and the plutonium recycling. 4. The applied nuclear research, waste conditioning and disposal as the safeguards, the fusion research and the lithium technology. 5. The basic and exploratory research namely the materials science and the nuclear physics and finally 6. Non-nuclear research and development such as the air pollution, the pollution abatement and waste handling, the fuel cells and applied electrochemistry. (AF)

The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules, which then decay to a lower energy state by emitting UV light at a longer wavelength: SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort software. However, this software is somewhat cumbersome and inflexible. Brookhaven National Laboratory (BNL) has written an interface program in National Instruments LabView that both controls the Model 43i-TLE Analyzer and queries the unit for all measurement and housekeeping data. The LabView vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS, as described more fully in Section 6.0 below.
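
The proportionality between emitted light and SO2 concentration implies a simple linear calibration from signal to mixing ratio; the slope and offset below are hypothetical constants, not the 43i-TLE's:

```python
def so2_ppb(signal, slope=0.25, offset=2.0):
    """Convert a fluorescence signal (arbitrary counts) to an SO2
    mixing ratio (ppb), assuming the linear response described above.
    slope and offset are hypothetical calibration constants obtained,
    e.g., from zero air and a known span gas."""
    return (signal - offset) / slope

conc = so2_ppb(27.0)
```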

The capability of neutron instrumentation in coming years will depend upon many factors, the main ones being the neutron source the instrument is sited on, the quality of the instrument itself, the quality of the support provided and the protocol for instrument operation. All of these factors must be optimised and improved upon to ensure consistently high-quality scientific exploitation of an instrument. Examples of progress in each of these fields are given and a subjective view of possible future trends is hazarded. (author)

The Early Literacy Knowledge and Skills (ELKS) instrument was informed by the work of Ferreiro and Teberosky based on the notion that young children could be differentiated according to levels of sophistication in their understanding of the rules of written language. As an initial step to evaluate the instrument for teaching purposes, the present…

Monte Carlo simulations have become an essential tool for improving the performance of neutron-scattering instruments, since the level of sophistication in the design of instruments is defeating purely analytical methods. The program McStas, being developed at Risø National Laboratory, includes...

The magnetosphere is explored in situ by satellites, but measurements near the low-altitude magnetospheric boundary by rockets, balloons and ground-based instruments play a very significant role. The geomagnetic field provides a frame with anisotropic wave and particle propagation effects, enabling remote sensing of the distant magnetosphere by means of balloon-borne and ground-based instruments. Examples will be given of successful studies, with coordinated satellite and balloon observations, of substorm, pulsation and other phenomena propagating both along and across the geomagnetic field. Continued efforts with sophisticated balloon-borne instrumentation should contribute substantially to our understanding of magnetospheric physics. (Author)

With the advent of microprocessors and digital-processing technologies as catalysts, classical sensors capable of simple signal-conditioning operations have evolved rapidly to take on higher and more specialized functions, including validation, compensation, and classification. This new category of sensor expands the scope of incorporating intelligence into instrumentation systems, yet amid such rapid change no universal standard for design, definition, or requirements has emerged with which to unify intelligent instrumentation. Explaining the underlying design methodologies of intelligent instrumentation, Intelligent Instrumentation: Principles and Applications provides a comprehensive and authoritative resource on the scientific foundations from which to coordinate and advance the field. Employing textbook-like language, this book translates methodologies into more than 80 numerical examples, and provides applications in 14 case studies for a complete and working understanding of the material. Beginn...

Robert John Ackermann deals decisively with the problem of relativism that has plagued post-empiricist philosophy of science. Recognizing that theory and data are mediated by data domains (bordered data sets produced by scientific instruments), he argues that the use of instruments breaks the dependency of observation on theory and thus creates a reasoned basis for scientific objectivity.

Nourishment takes on cultural meanings (symbolic, expressing the division of labor and wealth) and is a historical and cultural creation through which one can study a society. One attributes to Nutrition a sense of rational action, derived from the constitution of this science in modernity, inserted in a historical process of scientific rationalization of eating and nourishing. We believe that through the practice of conceptualization in interdisciplinary research, which involves a shared space of knowledge, we can be less constrained by a unified theoretical model of learning and freer to think about life issues.

RFS or Regles Fondamentales de Surete (Basic Safety Rules) applicable to certain types of nuclear facilities lay down requirements with which compliance, for the type of facilities and within the scope of application covered by the RFS, is considered to be equivalent to compliance with technical French regulatory practice. The object of the RFS is to take advantage of standardization in the field of safety, while allowing for technical progress in that field. They are designed to enable the operating utility and contractors to know the rules pertaining to various subjects which are considered to be acceptable by the Service Central de Surete des Installations Nucleaires, or the SCSIN (Central Department for the Safety of Nuclear Facilities). These RFS should make safety analysis easier and lead to better understanding between experts and individuals concerned with the problems of nuclear safety. The SCSIN reserves the right to modify, when considered necessary, any RFS and specify, if need be, the terms under which a modification is deemed retroactive. The aim of this RFS is to define the type, location and operating conditions for the seismic instrumentation needed to determine promptly the seismic response of nuclear power plant features important to safety, to permit comparison of such response with that used as the design basis.

RFS or ''Regles Fondamentales de Surete'' (Basic Safety Rules) applicable to certain types of nuclear facilities lay down requirements with which compliance, for the type of facilities and within the scope of application covered by the RFS, is considered to be equivalent to compliance with technical French regulatory practice. The object of the RFS is to take advantage of standardization in the field of safety, while allowing for technical progress in that field. They are designed to enable the operating utility and contractors to know the rules pertaining to various subjects which are considered to be acceptable by the ''Service Central de Surete des Installations Nucleaires'' or the SCSIN (Central Department for the Safety of Nuclear Facilities). These RFS should make safety analysis easier and lead to better understanding between experts and individuals concerned with the problems of nuclear safety. The SCSIN reserves the right to modify, when considered necessary, any RFS and specify, if need be, the terms under which a modification is deemed retroactive. The purpose of this RFS is to specify the meteorological instrumentation required at the site of each nuclear power plant equipped with at least one pressurized water reactor.

We experimentally investigate coordination games in which cognition plays an important role, i.e. where outcomes are affected by the agents' level of understanding of the game and the beliefs they form about each other's understanding. We ask whether and when repeated exposure permits agents to learn

This issue is grouped into sections on materials, design, performance and analysis of balloons, reviews of major national and international balloon programmes, novel instrumentation and systems for scientific ballooning, and selected recent scientific observations.

An instrument is described for measuring radiation, particularly nuclear radiation, comprising: a radiation-sensitive structure pivoted toward one end and including a pair of elongated solid members contiguously joined together along their length dimensions and having a common planar interface therebetween. One of the pair of members is comprised of radiochromic material whose index of refraction changes due to anomalous dispersion as a result of being exposed to nuclear radiation. The pair of members further has mutually different indices of refraction, the member having the larger index of refraction further being transparent for the passage of light and of energy therethrough; means located toward the other end of the structure for varying the angle of longitudinal elevation of the pair of members; means for generating and projecting a beam of light into one end of the member having the larger index of refraction, the beam of light being projected toward the planar interface where it is reflected out of the other end of the same member as a first output beam; means projecting a portion of the beam of light into one end of the member having the larger index of refraction, where it traverses therethrough without reflection and out of the other end of the same member as a second output beam; and means adjacent the structure for receiving the first and second output beams, whereby a calibrated change in the angle of elevation of the structure between positions of equal intensity of the first and second output beams prior to and following exposure provides a measure of the radiation sensed due to a change of refraction of the radiochromic material.

With the publication and enforcement of the standard IEC 61508 for safety-related systems, recent system architectures have been presented and evaluated. Among a number of techniques and measures for the evaluation of the safety integrity level (SIL) of safety-related systems, measures such as reliability block diagrams and Markov models are used to analyze the probability of failure on demand (PFD) and mean time to failure (MTTF) in conformance with IEC 61508. The current paper deals with the quantitative analysis of the novel 1oo4 architecture (one out of four) presented in recent work; sophisticated calculations for the required parameters are therefore introduced. The provided 1oo4 architecture represents an advanced safety architecture based on on-chip redundancy, which is 3-failure safe. This means that at least one of the four channels has to work correctly in order to trigger the safety function.
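As a rough illustration of why a 1oo4 architecture achieves a very low probability of failure on demand, the following sketch numerically averages the idealized system unavailability over a proof-test interval. It ignores common-cause failures, diagnostic coverage and repair, so it is far simpler than the Markov analysis the paper refers to; all parameter values are illustrative.

```python
# Idealized 1oo4 PFD sketch: the safety function is lost only if all
# four channels fail dangerously. With independent channels of
# dangerous failure rate lam, single-channel unavailability at time t
# is p(t) = 1 - exp(-lam * t), and the system PFD is p(t)**4.
# Averaging over a proof-test interval T gives PFD_avg ~ (lam*T)**4 / 5.

import math

def pfd_avg_1oo4(lam, T, steps=10_000):
    """Average PFD of an idealized 1oo4 system over proof-test interval T."""
    dt = T / steps
    total = 0.0
    for i in range(steps):
        t = (i + 0.5) * dt                 # midpoint rule
        p = 1.0 - math.exp(-lam * t)       # single-channel unavailability
        total += p ** 4                    # all four channels must fail
    return total / steps

# Illustrative numbers: lam = 1e-6 dangerous failures per hour,
# annual proof test (T = 8760 h)
print(pfd_avg_1oo4(1e-6, 8760.0))
```

Even with these modest per-channel figures the idealized result lands around 1e-9, which is why, in practice, common-cause failures (the beta factor of IEC 61508) dominate the PFD of such highly redundant architectures.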

The book deals mainly with direct mass determination by means of conventional balances. It covers the history of the balance from its beginnings in Egypt earlier than 3000 BC to recent developments. All balance types are described, with emphasis on scientific balances. Methods of indirect mass determination, which are applied to very light objects like molecules and the basic particles of matter as well as to celestial bodies, are included. As additional guidance, today's manufacturers are listed and the profiles of important companies are reviewed. Several hundred photographs, reproductions and drawings show instruments and their uses. The book includes commercial weighing instruments for merchandise and raw materials in workshops as well as symbolic weighing: the ancient Egyptians' ceremony of 'Weighing of the Heart', the Greek fate balance, the Roman Justitia, Juno Moneta, Middle Ages scenes of the Last Judgement with Jesus or St. Michael, and depictions of modern balances. The photographs are selected from the...

This summarises the status of the inelastic spectrometers at the ISIS facility and gives some highlights from their scientific programme. The inelastic spectrometers HET, TFXA and IRIS are now being used routinely by UK and International research groups and have produced notable scientific results. Work has progressed steadily on eVS. The PRISMA spectrometer, the product of a collaboration between CNR, Italy and RAL, was installed this summer. Earlier this year an agreement was reached in principle between the Rutherford Appleton Laboratory and the University of Wurzburg (West Germany) to build a second spectrometer (ROTAX) for the study of coherent inelastic excitations. A more sophisticated technical concept than that of PRISMA, based on a nonuniformly rotating analyser, it will allow a greater flexibility in the choice of dynamic scans. Substantial progress has also been made on the design of MARI, the sister spectrometer to HET, which is being built as part of the UK-Japan collaboration on pulsed neutron scattering. (author)

The Salton Sea Scientific Drilling Project was spudded on 24 October 1985 and reached a total depth of 10,564 ft (3.2 km) on 17 March 1986. There followed a period of logging, a flow test, and downhole scientific measurements. The scientific goals were integrated smoothly with the engineering and economic objectives of the program, and the ideal of 'science driving the drill' in continental scientific drilling projects was achieved in large measure. The principal scientific goals of the project were to study the physical and chemical processes involved in an active, magmatically driven hydrothermal system. To facilitate these studies, high priority was attached to four areas of sample and data collection, namely: (1) core and cuttings, (2) formation fluids, (3) geophysical logging, and (4) downhole physical measurements, particularly temperatures and pressures.

This paper gives an overview of the CARMENES instrument and of the survey that will be carried out with it during the first years of operation. CARMENES (Calar Alto high-Resolution search for M dwarfs with Exoearths with Near-infrared and optical Echelle Spectrographs) is a next-generation radial-velocity instrument under construction for the 3.5m telescope at the Calar Alto Observatory by a consortium of eleven Spanish and German institutions. The scientific goal of the project is to conduct a 600-night exoplanet survey targeting ~300 M dwarfs with the completed instrument. The CARMENES instrument consists of two separate echelle spectrographs covering the wavelength range from 0.55 to 1.7 μm at a spectral resolution of R = 82,000, fed by fibers from the Cassegrain focus of the telescope. The spectrographs are housed in vacuum tanks providing the temperature-stabilized environments necessary to enable a 1 m/s radial velocity precision, employing simultaneous calibration with an emission-line lamp or with a Fabry-Perot etalon. For mid-M to late-M spectral types, the wavelength range around 1.0 μm (Y band) is the most important region for radial velocity work; the efficiency of CARMENES has therefore been optimized in this range. One spectrograph is equipped with a 4k x 4k pixel CCD for the range 0.55 - 1.05 μm, the other with two 2k x 2k pixel HgCdTe detectors for the range 0.95 - 1.7 μm. Each spectrograph will be coupled to the 3.5m telescope with two optical fibers, one for the target and one for calibration light. The front end contains a dichroic beam splitter and an atmospheric dispersion corrector to feed the light into the fibers leading to the spectrographs. Guiding is performed with a separate camera; on-axis as well as off-axis guiding modes are implemented. Fibers with octagonal cross-section are employed to ensure good stability of the output in the presence of residual guiding errors. The
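To see what a 1 m/s radial velocity precision demands of a spectrograph at the resolution quoted above, a back-of-envelope Doppler calculation (standard physics, not taken from the paper) can help:

```python
# The non-relativistic Doppler shift is dlambda = lambda * v / c.
# Here we compare the shift induced by a 1 m/s radial velocity at
# 1.0 um (the Y band emphasized in the text) with the size of one
# resolution element at R = 82,000.

C = 299_792_458.0          # speed of light, m/s

def doppler_shift_nm(lambda_nm, v_ms):
    """Wavelength shift in nm for a radial velocity v_ms."""
    return lambda_nm * v_ms / C

lam = 1000.0                                # 1.0 um expressed in nm
shift = doppler_shift_nm(lam, 1.0)          # shift for 1 m/s
element = lam / 82_000                      # one resolution element, nm
print(f"shift = {shift:.2e} nm")                                  # ~3.3e-06 nm
print(f"fraction of resolution element = {shift / element:.1e}")  # ~2.7e-04
```

A 1 m/s signal is thus only a few ten-thousandths of a resolution element, which is why the text stresses vacuum-tank temperature stabilization, octagonal fibers, and simultaneous wavelength calibration.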

Scientists often need to build dedicated computer-controlled experimental systems. For this purpose, it is becoming common to employ open-source microcontroller platforms, such as the Arduino. These boards and associated integrated software development environments provide affordable yet powerful solutions for the implementation of hardware control of transducers and acquisition of signals from detectors and sensors. It is, however, a challenge to write programs that allow interactive use of ...

In order to gain more insight into the specific behavior of materials, it is often necessary to perform measurements as a function of different external parameters. Despite its high sensitivity to internal fields, this simple observation also applies to the μSR technique. The most common parameter which can be tuned during an experiment is the sample temperature. By using a range of cryostats, temperatures between 0.02 and 900 K can be covered at the PSI μSR Facility. On the other hand, by using high-energy muons, pressures as high as 10,000 bar can nowadays be reached during μSR experiments. As will be demonstrated in the following Sections, the magnetic field is an additional external parameter playing a fundamental role when studying the ground state properties of materials in condensed matter physics and chemistry. However, the availability of high magnetic fields for μSR experiments is still rather limited. Hence, if on the one hand the high value of the gyromagnetic ratio of the muon provides the high magnetic sensitivity of the method, on the other hand it can lead to very high muon-spin precession frequencies when performing measurements in applied fields (the muon-spin precession frequency in a field of 1 Tesla is 135.5 MHz). Consequently, the use of ultra-fast detectors and electronics is mandatory when measuring in magnetic fields exceeding 1 Tesla. If such fields are very intense when compared to the Earth's magnetic field (< 10⁻⁴ Tesla), the energy associated with them is still modest in view of the thermal energy. Hence, the Zeeman energy splitting of a free electron in a magnetic field of 1 Tesla corresponds to a thermal energy as low as 0.67 Kelvin. It is worth mentioning that nowadays magnetic fields of the order of 10 to 15 Tesla are quite common in condensed matter laboratories and have opened up vast new exciting experimental possibilities. (author)
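The two figures quoted above (a 135.5 MHz muon-spin precession frequency at 1 Tesla, and an electron Zeeman energy of 0.67 Kelvin at 1 Tesla) follow directly from standard physical constants, as this quick check shows (constants from standard CODATA values, not from the text):

```python
# Muon precession: f = (gamma_mu / 2pi) * B.
# Electron Zeeman energy expressed as a temperature: T = mu_B * B / k_B.

GAMMA_MU_OVER_2PI = 135.538_810e6  # muon gyromagnetic ratio / 2pi, Hz/T
MU_B = 9.274_010e-24               # Bohr magneton, J/T
K_B = 1.380_649e-23                # Boltzmann constant, J/K

B = 1.0  # Tesla
freq_mhz = GAMMA_MU_OVER_2PI * B / 1e6
zeeman_kelvin = MU_B * B / K_B

print(f"muon precession at {B} T: {freq_mhz:.1f} MHz")   # ~135.5 MHz
print(f"electron Zeeman energy: {zeeman_kelvin:.2f} K")  # ~0.67 K
```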

In 1783, the Montgolfier brothers ushered in a new era of transportation and exploration when they used hot air to drive an un-tethered balloon to an altitude of 2 km. Made of sackcloth and held together with cords, this balloon challenged the way we thought about human travel, and it has since evolved into a robust platform for performing novel science and testing new technologies. Today, high-altitude balloons regularly reach altitudes of 40 km, and they can support payloads that weigh more than 3,000 kg. Long-duration balloons can currently support mission durations of 55 days, and developing balloon technologies (i.e. Super-Pressure Balloons) are expected to extend that duration to 100 days or longer, competing with satellite payloads. This relatively inexpensive platform supports a broad range of science payloads spanning multiple disciplines (astrophysics, heliophysics, planetary and earth science). Applications extending beyond traditional science include testing new technologies for eventual space-based application and stratospheric airships for planetary applications.

This article focuses on scientific integrity and the identification of predisposing factors to scientific misconduct in Brazil. Brazilian scientific production has increased in the last ten years, but the quality of the articles has decreased. Pressure on researchers and students for increasing scientific production may contribute to scientific misconduct. Cases of misconduct in science have been recently denounced in the country. Brazil has important institutions for controlling ethical and safety aspects of human research, but there is a lack of specific offices to investigate suspected cases of misconduct and policies to deal with scientific dishonesty.

An observatory system like the VLT/I requires careful scientific planning for operations and future instruments. Currently the ESO optical/near-infrared facilities include four 8m telescopes, four (movable) 1.8m telescopes used exclusively for interferometry, two 4m telescopes and two survey telescopes. This system offers a large range of scientific capabilities, and setting the corresponding priorities depends on good community interactions. Coordinating the existing and planned instrumentation is an important aspect of strong scientific return. The current scientific priorities for the VLT and VLTI are pushing for the development of the highest angular resolution imaging and astrometry, integral field spectroscopy and multi-object spectroscopy. The ESO 4m telescopes on La Silla will be dedicated to time-domain spectroscopy and exo-planet searches with highly specialized instruments. The next decade will also see a significant rise in the scientific importance of massive ground- and space-based surveys. We discuss how future developments in astronomical research could shape the VLT/I evolution.

This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two types of plurality heuristics to model approval voting behavior. When using a sincere plurality heuristic, voters simplify their decision process by voting for their single favorite candidate. When using a strategic plurality heuristic, voters strategically focus their attention on the 2 front-runners and vote for their preferred candidate among these 2. Using a hierarchy of Thurstonian random utility models, the authors implemented these different decision rules and tested them statistically on 7 real world approval voting elections. They cross-validated their key findings via a psychological Internet experiment. Although a substantial number of voters used the plurality heuristic in the real elections, they did so sincerely, not strategically. Moreover, even though Thurstonian models do not force such agreement, the results show, in contrast to common wisdom about social choice rules, that the sincere social orders by Condorcet, Borda, plurality, and approval voting are identical in all 7 elections and in the Internet experiment. PsycINFO Database Record (c) 2007 APA, all rights reserved.

The human pathogens Yersinia pseudotuberculosis and Yersinia enterocolitica cause enterocolitis, while Yersinia pestis is responsible for pneumonic, bubonic, and septicaemic plague. All three share an infection strategy that relies on a virulence factor arsenal to enable them to enter, adhere to, and colonise the host while evading host defences to avoid untimely clearance. Their arsenal includes a number of adhesins that allow the invading pathogens to establish a foothold in the host and to adhere to specific tissues later during infection. When the host innate immune system has been activated, all three pathogens produce a structure analogous to a hypodermic needle. In conjunction with the translocon, which forms a pore in the host membrane, the channel that is formed enables the transfer of six ‘effector’ proteins into the host cell cytoplasm. These proteins mimic host cell proteins but are more efficient than their native counterparts at modifying the host cell cytoskeleton, triggering the host cell suicide response. Such a sophisticated arsenal ensures that yersiniae maintain the upper hand despite the best efforts of the host to counteract the infecting pathogen.

We present a pedagogical review of the weak gravitational lensing measurement process and its connection to major scientific questions such as dark matter and dark energy. Then we describe common ways of parametrizing systematic errors and understanding how they affect weak lensing measurements. Finally, we discuss several instrumental systematics and how they fit into this context, and conclude with some future perspective on how progress can be made in understanding the impact of instrumental systematics on weak lensing measurements.

Christian N. L. Olivers, winner of the Award for Distinguished Scientific Early Career Contributions to Psychology, is cited for outstanding research on visual attention and working memory. Olivers uses classic experimental designs in an innovative and sophisticated way to determine underlying mechanisms. He has formulated important theoretical…

Mathematical and statistical processing of meteorological-climatological data, which includes assessment of the exactness and confidence levels of average and extreme values, the frequencies (probabilities) of occurrence of each meteorological phenomenon and element, etc., helps to describe the impacts climate may have on different social and economic activities (transportation, heat and power generation), as well as on human health. With new technology and the commercial world in mind, working with meteorological-climatological data presents many different challenges. The priority in all of this is the quality of the meteorological-climatological data set. First, we need compatible, modern, sophisticated measurement and informatics solutions for the data. The results of these measurements, obtained through applied processing and analysis, form the second branch, which is also very important. Does the whole country need this? Today we face many unpleasant events connected with meteorology and many unanswered questions, and this situation has lasted too long. We must give the answers and solve the real, basic issues. In this paper the data issue is presented. We have a great deal of data but make little real, high-quality use of it. Why? There are data for public use, for jurisdictional needs, for fast decisions (dangerous meteorological phenomena), for long-term planning decisions, and for research in different spheres of human life. So it is very important to specify what kind of data we are talking about. Are the data of public or of scientific-applied character? We thus have two groups. The first group works with data coming directly from the measurement site and instrument; they maintain a quality database and provide essential help to journalists, medical workers, civil engineers, electromechanical engineers, and agro-meteorological and forestry engineers. The second group works with all scientific

Promising students had a foretaste of the latest laboratory techniques at the ICFA 1993 India School on Instrumentation in High Energy Physics, held from February 15-26 and hosted by the Tata Institute of Fundamental Research (TIFR), Bombay. The scientific programme was put together by the ICFA Panel for Future Instrumentation, Innovation and Development, chaired by Tord Ekelof (Uppsala). The programme included lectures and topical seminars covering a wide range of detector subjects. In small groups, students got acquainted with modern detector technologies in the laboratory sessions, using experimental setups assembled in various institutes world-wide and shipped to Bombay for the School. The techniques covered included multiwire proportional chambers for detection of particles and photons, gaseous detectors for UV photons and X-ray imaging, the study of charge drift in silicon detectors, measurement of the muon lifetime using liquid scintillators, tracking using scintillating fibres, and electronics for sensitive detectors. The India School was attended by around 80 students from 20 countries; 34 came from Indian universities. It was the fifth in this series, previous Schools having been held at Trieste (1987, 1989 and 1991), organized by the ICFA Panel and hosted and sponsored by the International Centre for Theoretical Physics, and in 1990 at Rio de Janeiro, in collaboration with the Centro Brasileiro de Pesquisas Fisicas. The School was jointly directed by Suresh Tonwar (TIFR), Fabio Sauli (CERN) and Marleigh Sheaff (University of Wisconsin), and sponsored by TIFR and DAE (India), CERN (Switzerland), ICTP and INFN (Italy), British Council and RAL (UK), NSF and DOE (USA), KEK (Japan), IPP (Canada) and DESY (Germany).

The scientific method is the principal methodology by which biological knowledge is gained and disseminated. As fundamental as the scientific method may be, its historical development is poorly understood, its definition is variable, and its deployment is uneven. Scientific progress may occur without the strictures imposed by the formal…

Sophisticated numerical tools exist for modeling geomorphic processes and linking them to tectonic and climatic systems, but they are often seen as inaccessible for users with an exploratory level of interest. We have improved the accessibility of landscape evolution models by producing a simple graphics user interface (GUI) that takes advantage of the Channel-Hillslope Integrated Landscape Development (CHILD) model. Model access is flexible: the user can edit values for basic geomorphic, tectonic, and climate parameters, or obtain greater control by defining the spatiotemporal distributions of those parameters. Users can make educated predictions by choosing their own parametric values for the governing equations and interpreting the results immediately through model graphics. This method of modeling allows users to iteratively build their understanding through experimentation. Use of this GUI is intended for inquiry and discovery-based learning activities. We discuss a number of examples of how the GUI can be used at the upper high school, introductory university, and advanced university level. Effective teaching modules initially focus on an inquiry-based example guided by the instructor. As students become familiar with the GUI and the CHILD model, the class can shift to more student-centered exploration and experimentation. To make model interpretations more robust, digital elevation models can be imported and direct comparisons can be made between CHILD model results and natural topography. The GUI is available online through the University of Maine's Earth and Climate Sciences website, through the Community Surface Dynamics Modeling System (CSDMS) model repository, or by contacting the corresponding author.

IASI is an infrared atmospheric sounder. It will provide meteorologists and the scientific community with atmospheric spectra. The IASI system includes 3 instruments that will be mounted on the Metop satellite series, data processing software integrated in the EPS (EUMETSAT Polar System) ground segment, and a technical expertise centre implemented at CNES Toulouse. The instrument is composed of a Fourier transform spectrometer and an associated infrared imager. The optical configuration is based on a Michelson interferometer, and the interferograms are processed by an on-board digital processing subsystem, which performs the inverse Fourier transforms and the radiometric calibration. The infrared imager co-registers the IASI soundings with the AVHRR imager (AVHRR is another instrument on the Metop satellite). The presentation will focus on the architecture of the instrument, the description of the implemented technologies, and the measured performance of the first flight model. CNES is leading the IASI program in association with EUMETSAT. The instrument prime contractor is ALCATEL SPACE.

In this instrument review chapter the calibration plans of ESO IR instruments are presented and briefly reviewed, focusing in particular on the case of ISAAC, which was the first IR instrument at the VLT and whose calibration plan served as a prototype for the coming instruments.

The purpose of this manual is to provide apprentice health physics surveyors and other operating groups not directly concerned with radiation detection instruments with a working knowledge of the radiation detection and measuring instruments in use at the Laboratory. The characteristics and applications of the instruments are given. Portable instruments, stationary instruments, personnel monitoring instruments, sample counters, and miscellaneous instruments are described. Also, information sheets on calibration sources, procedures, and devices are included. Gamma sources, beta sources, alpha sources, neutron sources, special sources, a gamma calibration device for badge dosimeters, and a calibration device for ionization chambers are described.

The earliest astronomical instruments used in India were the gnomon and the water clock. In the early seventh century, Brahmagupta described ten types of instruments, which were adopted by all subsequent writers with minor modifications. Contact with Islamic astronomy in the second millennium AD led to a radical change. Sanskrit texts began to lay emphasis on the importance of observational instruments. Exclusive texts on instruments were composed. Islamic instruments like the astrolabe were adopted and some new types of instruments were developed. Production and use of these traditional instruments continued, along with the cultivation of traditional astronomy, up to the end of the nineteenth century.

This paper describes the development of the Probability Evaluation Game (PEG): an innovative teaching instrument that emphasises the sophistication of listening and highlights listening as a key skill for accounting practitioners. Played in a roundtable format, PEG involves participants individually evaluating a series of probability terms…

This report on troubleshooting of nuclear instruments is the product of several scientists and engineers, who are closely associated with nuclear instrumentation and with the IAEA activities in the field. The text covers the following topics: preamplifiers, amplifiers, scalers, timers, ratemeters, multichannel analyzers, dedicated instruments, tools, instruments, accessories, components, skills, interfaces, power supplies, preventive maintenance, troubleshooting in systems, and radiation detectors. The troubleshooting and repair of instruments is illustrated by some real examples.

To reduce the complexities of conventional programming, graphical software was used in the development of instrumentation drivers. The graphical software provides a standard set of tools (graphical subroutines) which are sufficient to program the most sophisticated CAMAC/GPIB drivers. These tools were used and instrumentation drivers were successfully developed for operating CAMAC/GPIB hardware from two different manufacturers: LeCroy and DSP. The use of these tools is presented for programming a LeCroy A/D Waveform Analyzer.

This study investigates how elite Turkish high school physics students claim to approach learning physics when they are simultaneously (i) engaged in a curriculum that led to significant gains in their epistemological sophistication and (ii) subject to a high-stakes college entrance exam. Students reported taking surface (rote) approaches to…

The genre of contemporary classical music has seen significant innovation and research related to new super, hyper, and hybrid instruments, which opens up a vast palette of expressive potential. An increasing number of composers, performers, instrument designers, engineers, and computer programmers have become interested in different ways of "supersizing" acoustic instruments in order to open up previously-unheard instrumental sounds. Super instruments vary a great deal but each has a transformative effect on the identity and performance practice of the performing musician. Furthermore, composers can empower performers by producing super instrument works that allow the concert instrument to become an ensemble controlled by a single player. The existing instrumental skills of the performer can be multiplied and the qualities of regular acoustic instruments extended or modified. Such a situation…

The long-standing belief that age is negatively associated with scientific productivity and creativity is shown to be based upon incorrect analysis of data. Studies reported in this article suggest that the relationship between age and scientific performance is influenced by the operation of the reward system. (Author)

The purpose of this paper is to describe visual literacy, an adapted version of Visual Thinking Strategy (VTS), and an art-integrated middle school mathematics lesson about scientific notation. The intent of this lesson was to provide students with a real-life use of scientific notation and exponents, and to motivate them to apply their…
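
As a hypothetical illustration of the mathematics such a lesson targets, a number can be rewritten in scientific notation as follows; the distance value is an approximate real-world figure chosen for illustration.

```python
# Scientific notation: write a number as m x 10^e with 1 <= |m| < 10.
import math

def to_scientific(x):
    """Return (mantissa, exponent) such that x == mantissa * 10**exponent."""
    if x == 0:
        return 0.0, 0
    e = math.floor(math.log10(abs(x)))   # exponent = order of magnitude
    return x / 10 ** e, e

# Approximate Earth-Sun distance in metres, a classic classroom example.
m, e = to_scientific(149_600_000_000)
print(f"{m} x 10^{e}")   # -> 1.496 x 10^11
```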

The doctoral dissertation discusses some of the moral standards of good scientific practice that are underexposed in the literature. In particular, attempts are made to correct the conceptual confusion surrounding the norm of 'disinterestedness' in science ('uhildethed'), and the norm of scientific...

Introduction of International Financial Reporting Standards in Ukraine requires scientific and methodological study of their specific use in national practice. The essence and types of financial instruments have been researched, and the regulatory support for their accounting in Ukraine has been established. The authors have analyzed the provisions of the International Financial Reporting Standards governing financial instruments accounting and worked out characteristics of the existing methodology ...

Anaerobic digestion is a multistep process, and is most applied to solids destruction and wastewater treatment for energy production. Despite wide application, and long-term industrial proof of application, some industries are still reluctant to apply this technology. One of the classical reasons… are still a limitation, but this is being partly addressed by the increased complexity of digestion processes. Methods for control benchmarking have also been improved, as there is now an industry standard model (the IWA ADM1), and this is being applied in an improved whole wastewater treatment plant benchmark. There has therefore been, overall, a quantum advance in application and sophistication of instrumentation and control in anaerobic digestion, and it is an effective option for improved process loading rate and conversion efficiency.

The LHC is equipped with a full suite of sophisticated beam instrumentation which has been essential for rapid commissioning, the safe increase in total stored beam power and the understanding of machine optics and accelerator physics phenomena. This paper will comment on all of these systems and on their contributions to the various stages of beam commissioning. It will include details on: the beam position system and its use for real-time global orbit feedback; the beam loss system and its role in machine protection; total and bunch by bunch intensity measurements; tune measurement and feedback; synchrotron light diagnostics for transverse beam size measurements, abort gap monitoring and longitudinal density measurements. Issues and problems encountered along the way will also be discussed together with the prospect for future upgrades.

During the 1970s and 1980s, before synthesizers based on direct sampling of musical sounds became popular, replicating musical instruments using frequency modulation (FM) or wavetable synthesis was one of the “holy grails” of music synthesis. Synthesizers such as the Yamaha DX7 allowed users great flexibility in mixing and matching sounds, but were notoriously difficult to coerce into producing sounds like those of a given instrument. Instrument design wizards practiced the mysteries of FM instrument design.
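
The two-operator FM technique on which synthesizers such as the DX7 are based can be sketched as follows; the carrier and modulator frequencies and the modulation index are arbitrary illustrative choices.

```python
# Two-operator FM synthesis: y(t) = sin(2*pi*fc*t + I*sin(2*pi*fm*t)).
# The modulation index I controls how many audible sidebands appear around
# the carrier, which is what gives FM its bell- and brass-like timbres and
# what made patch design so hard to do by intuition.
import math

def fm_tone(fc=440.0, fm=220.0, index=2.0, sr=44100, dur=0.5):
    """Return dur seconds of an FM tone as a list of samples in [-1, 1]."""
    n = int(sr * dur)
    return [math.sin(2 * math.pi * fc * t / sr
                     + index * math.sin(2 * math.pi * fm * t / sr))
            for t in range(n)]

samples = fm_tone()   # 0.5 s of a 440 Hz carrier modulated at 220 Hz
```

With fm a simple ratio of fc (here 1:2), the sidebands land on harmonics and the tone sounds pitched; inharmonic ratios give the metallic timbres the DX7 was known for.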

A liquid metal cooled nuclear reactor is described which has an equal number of fuel sub-assemblies and sensing instruments. Each instrument senses the temperature and rate of coolant flow of coolant derived from a group of three sub-assemblies, so that an abnormal value for one sub-assembly will be indicated on three instruments, thereby providing for redundancy of up to two of the three instruments. The abnormal value may be a precursor to unstable boiling of the coolant.
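
Because each sub-assembly's abnormal value appears on three instruments, an alarm can be confirmed by two-out-of-three voting, the classic arrangement for tolerating a single failed channel. A minimal sketch of that logic; the threshold and readings below are invented for illustration.

```python
# Two-out-of-three (2oo3) voting: trip only if at least two of the three
# instruments covering a sub-assembly exceed their threshold, so a single
# failed instrument can neither cause a spurious trip nor mask a real one.

def vote_2oo3(readings, threshold):
    """readings: the three instrument values; True if >= 2 exceed threshold."""
    assert len(readings) == 3
    return sum(r > threshold for r in readings) >= 2

# Hypothetical outlet temperature readings (deg C), trip threshold 560 deg C.
spurious = vote_2oo3([545.0, 548.0, 900.0], 560.0)   # one faulty high channel
genuine = vote_2oo3([565.0, 571.0, 552.0], 560.0)    # two genuine exceedances
```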

Scientific achievements and technological advances have undeniably become more sophisticated, and the latest capabilities help human life in both routine and scientific problem solving. Yet every tool, particularly equipment used for experimental and scientific tasks, faces the risk of ageing and damage, whether expected or unexpected. Its condition depends on how the scientific tool is used and cared for. To ensure that a scientific tool can be used over a long service life, appropriate care and maintenance must be attended to. This paper discusses matters related to scientific equipment at Nuclear Malaysia, specializing in equipment that is connected to and controlled by a computer, in terms of care, use and future planning to ensure that it can be used for an appropriate period of time. (author)

Scientific investigations involving the use of neutron beams have been centered at reactor sources for the last 35 years. Recently, there has been considerable interest in using the neutrons produced by accelerator driven (pulsed) sources. Such installations are in operation in England, Japan, and the United States. In this article a brief survey is given of how the neutron beams are produced and how they can be optimized for neutron scattering experiments. A detailed description is then given of the various types of instruments that have been built, or are planned, at pulsed sources. Numerous examples of the scientific results that are emerging are given. An attempt is made throughout the article to compare the scientific opportunities at pulsed sources with the proven performance of reactor installations, and some familiarity with the latter and the general field of neutron scattering is assumed. New areas are being opened up by pulsed sources, particularly with the intense epithermal neutron beams, which promise to be several orders of magnitude more intense than can be obtained from a thermal reactor.

We are interested in the quality of sound produced by musical instruments and in their playability. In wind instruments, a hydrodynamic source of sound is coupled to an acoustic resonator. Linear acoustics can predict the pitch of an instrument. This can significantly reduce the trial-and-error process.
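
In the idealized case, the pitch prediction from linear acoustics reduces to the textbook resonance formulas for a cylindrical air column. A minimal sketch that deliberately neglects end corrections and the hydrodynamic source:

```python
# Fundamental frequency of an idealized cylindrical air column:
#   open at both ends (flute-like):        f1 = c / (2 L)
#   closed at one end (clarinet-like):     f1 = c / (4 L)
# c is the speed of sound in air (about 343 m/s at 20 deg C).

def pipe_fundamental(length_m, closed_at_one_end=False, c=343.0):
    """Fundamental resonance (Hz) of a cylinder of the given length."""
    return c / ((4 if closed_at_one_end else 2) * length_m)

f_open = pipe_fundamental(0.66)                             # ~260 Hz, near middle C
f_closed = pipe_fundamental(0.66, closed_at_one_end=True)   # one octave lower
```

Real instruments deviate from these values through end corrections, bore shape, and tone holes, which is exactly where the trial-and-error the abstract mentions comes in.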

The Earth Venture Instrument (EVI) element of the Earth Venture Program calls for developing instruments for participation on a NASA-arranged spaceflight mission of opportunity to conduct innovative, integrated, hypothesis or scientific question-driven approaches to pressing Earth system science issues. This paper discusses the EVI element and the management approach being used to manage both an instrument development activity as well as the host accommodations activity. In particular the focus will be on the approach being used for the first EVI (EVI-1) selected instrument, Tropospheric Emissions: Monitoring of Pollution (TEMPO), which will be hosted on a commercial GEO satellite and some of the challenges encountered to date and corresponding mitigations that are associated with the management structure for the TEMPO Mission and the architecture of EVI.

The aim of this work is to study the relationship between the scientific-creative thinking construct and academic performance in a sample of adolescents. In addition, the reliability of the scientific-creative thinking instrument is tested. The sample was composed of 98 students (aged 12-16) attending a secondary school in the Murcia Region (Spain). The instruments used were: (a) the Scientific-Creative Thinking Test designed by Hu and Adey (2002), adapted to the Spanish culture by the High Abilities research team at Murcia University; the test is composed of 7 tasks based on the Scientific Creative Structure Model and assesses the dimensions of fluency, flexibility and originality; (b) the General and Factorial Intelligence Test (IGF/5r; Yuste, 2002), which assesses general intelligence and logical, verbal, numerical and spatial reasoning; (c) students' academic achievement by domain (scientific-technological, social-linguistic and artistic). The results showed positive and statistically significant correlations between the scientific-creative tasks and academic achievement across domains.

Throughout the history of science, the scientific image has played a significant role in communication. With recent developments in computing technology, there has been an increase in the kinds of opportunities now available for scientists to communicate in more sophisticated ways. Within behavior analysis, though, we are only just beginning to appreciate the importance of going beyond the printing press to elucidate basic principles of behavior. The aim of this manuscript is to stimulate appreciation of both the role of the scientific image and the opportunities provided by a quick response code (QR code) for enhancing the functionality of the printed page. I discuss the limitations of imagery in behavior analysis ("Introduction"), and I show examples of what can be done with animations and multimedia for teaching philosophical issues that arise when teaching about private events ("Private Events 1 and 2"). Animations are also useful for bypassing ethical issues when showing examples of challenging behavior ("Challenging Behavior"). Each of these topics can be accessed only by scanning the QR code provided. This contingency has been arranged to help the reader embrace this new technology. In so doing, I hope to show its potential for going beyond the limitations of the printing press.

According to the International Code of Oenological Practices, the use of L(+)-tartaric acid for wine acidification is allowed, while the use of synthetic dihydroxysuccinic (tartaric) acid is forbidden. Today it is impossible to differentiate natural dihydroxysuccinic acid from synthetic acid by standard techniques. Even the very sensitive method of isotope mass spectrometry faces certain difficulties, because the total carbon isotope characteristics of dihydroxysuccinic acid of different origins have the same values. However, the carbon isotope characteristics of the intramolecular structural groups of dihydroxysuccinic acid made from different raw materials differ significantly. This allows the nature of the dihydroxysuccinic acid used in making wines and juice drinks to be specified. In Russia, the scientific and research institute of the beer brewing and wine-making industry carried out work studying the intramolecular isotope heterogeneity of dihydroxysuccinic acid of different origins in order to identify wines and juice drinks. The isotope characteristics of organic oxy acids of different origins, including those obtained synthetically, were studied, and numeric ranges of the value δ13C, ‰, were specified. The obtained results allow identification tests of wines and juice drinks to detect products containing additives other than those allowed for use in the production process.
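
The isotope characteristics in question are conventionally reported in delta notation relative to the VPDB standard. A minimal sketch of the computation; the sample ratio is invented, and the VPDB ratio used is a commonly quoted approximate value.

```python
# Delta notation for stable carbon isotope ratios, in per mil (‰):
#   delta13C = (R_sample / R_standard - 1) * 1000, with R = 13C/12C.
# R_VPDB below is a commonly quoted approximate value for the VPDB standard.

R_VPDB = 0.011180

def delta13C(r_sample, r_standard=R_VPDB):
    """Per-mil deviation of a sample's 13C/12C ratio from the standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Hypothetical 13C/12C ratio measured for a tartaric acid sample:
d = delta13C(0.010889)   # roughly -26 per mil, a typical magnitude for plant-derived carbon
```

Position-specific measurements apply the same formula to each intramolecular carbon site, which is where the natural/synthetic discrimination described above comes from.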

The International Atomic Energy Agency is performing safeguards at some nuclear power reactors, 50 bulk processing facilities, and 170 research facilities. Its verification activities require the use of instruments to measure nuclear materials and of surveillance instruments to maintain continuity of knowledge of the locations of nuclear materials. Instruments that are in use and under development to measure weight, volume, concentration, and isotopic composition of nuclear materials, and the major surveillance instruments, are described in connection with their uses at representative nuclear facilities. The current status of safeguards instrumentation and the needs for future development are discussed

The Revista Scientific aims to publish quality papers that include the perspective of analysis in educational settings. Together with www.indtec.com.ve, this electronic publication aims to promote and disseminate, with seriousness and rigor, academic production in this field. Editorial of the new stage: Revista Scientific was created with the aim of constituting a reference space for scientific research in the field of research analysis carried out within the universities of Latin America, once the distribution list hosted on the INDTEC platform (http://www.indtec.com.ve) had been consolidated as a space for the dissemination and development of new ideas and initiatives. The first presentation of the magazine was held in August 2016 in Venezuela. Thanks to the support of the INDTEC platform, Revista Scientific has been able to develop through the cooperative work of the people who make up its Editorial Committee, Academic Committee and Scientific Committee for the electronic edition, and of the referees of each of its numbers. Part of this success is due to the motivation of its co-editors, excellent professionals from different parts of the world (Argentina, Belgium, Colombia, Cuba, Ecuador, Spain, Mexico and Venezuela) who form the various committees, participating in this project with enthusiasm and joy; its organizational structure is presented in this edition, and it continues in crescendo. Also, the strategy of editing a monographic number from the various events organized within the universities has helped to make Revista Scientific a valued spokesman of intellectual progress in the field of education. Revista Scientific is currently indexed in ISI (International Scientific Indexing, Dubai, UAE); ROAD, the Directory of Open Access Scholarly Resources (ISSN International Centre, France); REVENCYT-ULA, Venezuela; and Google Scholar; it is also published in Calaméo, ISSUU and Academia

The document is a collection of the scientific meeting abstracts in the fields of nuclear physics, medical sciences, chemistry, agriculture, environment, engineering, different aspects of energy and presents research done in 1999 in these fields

As NCI's central scientific strategy office, CRS collaborates with the institute's divisions, offices, and centers to identify research opportunities to advance NCI's vision for the future of cancer research.

The purpose of this text is to provide a reference source to scientists, engineers, and students who are new to scientific visualization or who are interested in expanding their knowledge in this subject...

The phrase pre-modern scientific may be used to describe certain attitudes and ..... But unfortunately, in the general atmosphere of poor education and collective fears .... present day science and technology that old time beliefs and traditional ...

No library or information service and especially in a developing .... Good public relations, consultancy services including bilateral and ... project proposal for the creation of a scientific and technological information ... For example, in 1995 the ...

The procedures used to select and implement scientific objectives for the Voyager 1 and 2 planetary encounters are described. Attention is given to the scientific tradeoffs and engineering considerations that must be addressed at various stages in the mission planning process, including the limitations of ground and spacecraft communications systems, ageing of instruments in flight, and instrument calibration over long distances. The contribution of planetary science workshops to the definition of scientific objectives for deep space missions is emphasized.

…us to understand how a truth is reproduced, circulating among diverse fields of human knowledge. It will also show why we accept and reproduce a particular discourse. Finally, we state Euclidean geometry as a truth that circulates in scientific discourse and performs a scientific self. We unfold the importance of having students follow the path of what schools perceive a real scientist to be: not to become a scientist, but to become a logical thinker, a problem solver, a productive citizen who uses reason.

The paper systematizes several theoretical viewpoints on scientific information processing skill and decomposes the processing skill into sub-skills. Several methods, such as analysis, synthesis, induction, deduction and document analysis, were used to build a theoretical framework. Interviews and surveys of professionals in training, together with a case study, were carried out to evaluate the results. All professionals in the sample improved their performance in scientific information processing.

The following topics are dealt with: Nuclear heavy ion physics, atomic physics with heavy ions, heavy ion research in other fields, accelerator experiments and developments, instruments and methods, the international FAIR project. (HSI)

In a report released last week the National Academy of Sciences' Panel on Scientific Communication and National Security concluded that the ‘limited and uncertain benefits’ of controls on the dissemination of scientific and technological research are ‘outweighed by the importance of scientific progress, which open communication accelerates, to the overall welfare of the nation.’ The 18-member panel, chaired by Dale R. Corson, president emeritus of Cornell University, was created last spring (Eos, April 20, 1982, p. 241) to examine the delicate balance between open dissemination of scientific and technical information and the U.S. government's desire to protect scientific and technological achievements from being translated into military advantages for our political adversaries.The panel dealt almost exclusively with the relationship between the United States and the Soviet Union but noted that there are ‘clear problems in scientific communication and national security involving Third World countries.’ Further study of this matter is necessary.

This exploratory study assessed the influence of an implicit, inquiry-oriented nature of science (NOS) instructional approach undertaken in an interdisciplinary college science course on undergraduate honor students' (UHS) understanding of the aspects of NOS for scientific work and scientific knowledge. In this study, the nature of scientific work concentrated upon the delineation of science from pseudoscience and the value scientists place on reproducibility. The nature of scientific knowledge concentrated upon how UHS view scientific theories and how they believe scientists utilize scientific theories in their research. The 39 UHS who participated in the study were non-science majors enrolled in a Honors College sponsored interdisciplinary science course where the instructors took an implicit NOS instructional approach. An open-ended assessment instrument, the UFO Scenario, was designed for the course and used to assess UHS' images of science at the beginning and end of the semester. The mixed-design study employed both qualitative and quantitative techniques to analyze the open-ended responses. The qualitative techniques of open and axial coding were utilized to find recurring themes within UHS' responses. McNemar's chi-square test for two dependent samples was used to identify whether any statistically significant changes occurred within responses from the beginning to the end of the semester. At the start of the study, the majority of UHS held mixed NOS views, but were able to accurately define what a scientific theory is and explicate how scientists utilize theories within scientific research. Postinstruction assessment indicated that UHS did not make significant gains in their understanding of the nature of scientific work or scientific knowledge and their overall images of science remained static. The results of the present study found implicit NOS instruction even with an extensive inquiry-oriented component was an ineffective approach for modifying UHS

This book contains a selection of papers and articles in instrumentation previously published in technical periodicals and journals of learned societies. Our selection has been made to illustrate aspects of current practice and applications of instrumentation. The book does not attempt to be encyclopaedic in its coverage of the subject, but to provide some examples of general transduction techniques, of the sensing of particular measurands, of components of instrumentation systems and of instrumentation practice in two very different environments, the food industry and the nuclear power industry. We have made the selection particularly to provide papers appropriate to the study of the Open University course T292 Instrumentation. The papers have been chosen so that the book covers a wide spectrum of instrumentation techniques. Because of this, the book should be of value not only to students of instrumentation, but also to practising engineers and scientists wishing to glean ideas from areas of instrumentation...

The objective of this project was to develop and coordinate nuclear instrumentation standards with resulting economies for the nuclear and radiation fields. There was particular emphasis on coordination and management of the Nuclear Instrument Module (NIM) System, U.S. activity involving the CAMAC international standard dataway system, the FASTBUS modular high-speed data acquisition and control system and processing and management of national nuclear instrumentation and detector standards, as well as a modest amount of assistance and consultation services to the Pollutant Characterization and Safety Research Division of the Office of Health and Environmental Research. The principal accomplishments were the development and maintenance of the NIM instrumentation system that is the predominant instrumentation system in the nuclear and radiation fields worldwide, the CAMAC digital interface system in coordination with the ESONE Committee of European Laboratories, the FASTBUS high-speed system and numerous national and international nuclear instrumentation standards

The Visible Integral-Field Replicable Unit Spectrograph (VIRUS) instrument will be installed at the Hobby-Eberly Telescope in the near future. The instrument will be housed in two enclosures that are mounted adjacent to the telescope, via the VIRUS Support Structure (VSS). We have designed the enclosures to support and protect the instrument, to enable servicing of the instrument, and to cool the instrument appropriately while not adversely affecting the dome environment. The system uses simple HVAC air handling techniques in conjunction with thermoelectric and standard glycol heat exchangers to provide efficient heat removal. The enclosures also provide power and data transfer to and from each VIRUS unit, liquid nitrogen cooling to the detectors, and environmental monitoring of the instrument and dome environments. In this paper, we describe the design and fabrication of the VIRUS enclosures and their subsystems.

The Radiation Protection Instrument, 1993 (Legislative Instrument 1559) prescribes the powers and functions of the Radiation Protection Board established under the Ghana Atomic Energy Commission by the Atomic Energy Commission (Amendment) Law, 1993 (P.N.D.C. Law 308). Also included in the Legislative Instrument are schedules on control and use of ionising radiation and radiation sources as well as procedures for notification, licensing and inspection of ionising radiation facilities. (EAA)

In the years of space exploration since the mid-sixties, a wide experience in optical space instrumentation has developed in Germany. This experience ranges from large telescopes in the 1 m and larger category with the accompanying focal plane detectors and spectrometers for all regimes of the electromagnetic spectrum (infrared, visible, ultraviolet, x-rays), to miniature cameras for cometary and planetary explorations. The technologies originally developed for space science are now also utilized in the fields of earth observation and even optical telecommunication. The presentation will cover all these areas, with examples for specific technological or scientific highlights. Special emphasis will be given to the current state-of-the-art instrumentation technologies in scientific institutions and industry, and to the future perspective in approved and planned projects.

Nuclear power plants (NPPs) and other nuclear installations are recognized as applications needing very sophisticated technologies. One of the technologies used in all these nuclear facilities is nuclear instrumentation. For NPPs and other nuclear installations to be operated safely, nuclear instrumentation requires standardization from design through operation. Internationally, standards for nuclear instrumentation are issued by the IEC (International Electrotechnical Commission), where they are formulated by Technical Committee (TC) 45. This paper briefly describes the standardization of nuclear instrumentation applied in Indonesia as Indonesian National Standards (SNI, Standar Nasional Indonesia), the standards developed by TC 45, SC 45A and SC 45B, and the possibility of adopting and applying those IEC standards in Indonesia.

National Aeronautics and Space Administration — Armstrong researchers have developed a networked instrumentation system that connects modern experimental payloads to existing analog and digital communications...

Westinghouse Hanford Company Project W-211 is responsible for providing the system capabilities to remove radioactive waste from ten double-shell tanks used to store radioactive wastes on the Hanford Site in Richland, Washington. The project is also responsible for measuring tank waste slurry properties prior to injection into pipeline systems, including the Replacement of Cross-Site Transfer System. This report summarizes studies of the appropriateness of the instrumentation specified for use in Project W-211. The instruments were evaluated in a test loop with simulated slurries that covered the range of properties specified in the functional design criteria. The results of the study indicate that the compact nature of the baseline Project W-211 loop does not result in reduced instrumental accuracy resulting from poor flow profile development. Of the baseline instrumentation, the Micromotion densimeter, the Moore Industries thermocouple, the Fischer and Porter magnetic flow meter, and the Red Valve pressure transducer meet the desired instrumental accuracy. An alternate magnetic flow meter (Yokogawa) gave nearly identical results to the baseline Fischer and Porter. The Micromotion flow meter did not meet the desired instrument accuracy but could potentially be calibrated so that it would meet the criteria. The Nametre on-line viscometer did not meet the desired instrumental accuracy and is not recommended as a quantitative instrument, although it does provide qualitative information. The recommended minimum set of instrumentation necessary to ensure the slurry meets the Project W-058 acceptance criteria is the Micromotion mass flow meter and delta pressure cells.

Deficiencies exist in both the performance and the quality of health physics instruments. Recognizing the implications of such deficiencies for the protection of workers and the public, in the early 1980s the DOE and the NRC encouraged the development of a performance standard and established a program to test a series of instruments against criteria in the standard. The purpose of the testing was to establish the practicality of the criteria in the standard, to determine the performance of a cross section of available instruments, and to establish a testing capability. Over 100 instruments were tested, resulting in a practical standard and an understanding of the deficiencies in available instruments. In parallel with the instrument testing, a value-impact study clearly established the benefits of implementing a formal testing program. An ad hoc committee also met several times to establish recommendations for the voluntary implementation of a testing program based on the studies and the performance standard. For several reasons, a formal program did not materialize. Ongoing tests and studies have supported the development of specific instruments and have helped specific clients understand the performance of their instruments. The purpose of this presentation is to trace the history of instrument testing to date and suggest the benefits of a centralized formal program

Due to the rising costs and competitive pressures that radiological clinics and practices now face, controlling instruments are gaining importance in the optimization of the structures and processes of the various diagnostic examinations and interventional procedures. It is shown how the use of selected controlling instruments can secure and improve the performance of radiological facilities. A definition of the concept of controlling is provided, and the controlling instruments applicable in radiological departments and practices are identified. As examples, two of these controlling instruments, material cost analysis and benchmarking, are illustrated.

Mastering the art of communicating scientific information is more critical than ever for a successful career in science and technology. Scientists today must be able to effectively convey sophisticated information to a broad audience that may include students, colleagues around the world, regulatory bodies, granting agencies, legislators, and the lay public. In this engaging and lively book, the author provides a step-by-step guide to the complete process of making a scientific presentation from preparation to delivery. It offers numerous examples highlighting what to follow and what to avoid. This revised edition covers the effective use of PowerPoint™ and other computer-based presentation programs. It also includes a handy checklist, new illustrations, and tips on handling an audience in a foreign country.

Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation’s trustworthiness and takes the next step in establishing scientific computer simulation review as its own field. - Highlights: • We define scientific computer simulation review. • We develop maturity assessment theory. • We formally define a maturity assessment framework. • We describe simulation review as the application of a maturity framework. • We provide an example of a simulation review using a maturity framework
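The idea of simulation review as the application of a maturity assessment framework can be sketched minimally: a framework assigns each criterion a required maturity level, an assessment records the level actually achieved, and the review passes when every criterion meets its requirement. The criterion names and levels below are illustrative, not taken from the article or the NRC review it evaluates.

```python
# Hypothetical maturity assessment framework: required level per criterion.
REQUIRED = {
    "code_verification": 2,
    "solution_verification": 2,
    "validation_evidence": 3,
    "uncertainty_quantification": 1,
}

def review(assessed, required=REQUIRED):
    """Return (passed, shortfalls) for an assessed simulation.

    `shortfalls` maps each failing criterion to (achieved, required).
    A criterion absent from the assessment counts as level 0.
    """
    shortfalls = {
        criterion: (assessed.get(criterion, 0), level)
        for criterion, level in required.items()
        if assessed.get(criterion, 0) < level
    }
    return (not shortfalls, shortfalls)

ok, gaps = review({
    "code_verification": 3,
    "solution_verification": 2,
    "validation_evidence": 2,   # one level short of the requirement
    "uncertainty_quantification": 1,
})
print(ok)    # False
print(gaps)  # {'validation_evidence': (2, 3)}
```

The separation mirrors the article's three contributions: the criteria and levels are the maturity assessment set, `REQUIRED` plays the role of the framework, and `review` is its application.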

The Undergraduate Student Instrumentation Project (USIP) is a NASA program to engage undergraduate students in rigorous scientific research, for the purposes of innovation and developing the next generation of professionals for an array of fields. The program is student led and executed from initial ideation to research to the design and deployment of scientific payloads. The University of Houston has been selected twice to participate in the USIP programs. The first program (USIP_UH I) ran from 2013 to 2016. USIP_UH II started in January of 2016, with funding starting at the end of May. USIP_UH I (USIP_UH II) at the University of Houston was (is) composed of eight (seven) research teams developing six (seven) distinct, balloon-based scientific instruments. These instruments will contribute to a broad range of geophysical sciences, from Very Low Frequency recording and Total Electron Content to exobiology and ozone profiling. USIP_UH I had 12 successful launches with 9 recoveries from Fairbanks, AK in March 2015, and 4 piggyback flights with BARREL 3 from Esrange, Kiruna, Sweden in August 2015. USIP_UH II had 8 successful launches with 5 recoveries from Fairbanks, AK in March 2017, 3 piggyback flights with BARREL 4 from Esrange, Kiruna, Sweden in August 2016, and 1 flight each from CSBF and UH. The great opportunity of this program is capitalizing on the proliferation of electronics miniaturization to create new generations of scientific instruments that are smaller and lighter than ever before. This allows experiments to be done more cheaply, which ultimately allows many more experiments to be done.

Background: The validation of educational instruments, in particular the employment of factor analysis, can be improved in many instances. Aims: To demonstrate the superiority of a sophisticated method of factor analysis, implying an integration of recommendations described in the factor analysis

In the words of the UK government chief scientific adviser, "Science is not finished until it's communicated" (Walport 2013). The tools to produce good visual communication have never been so easily accessible to scientists as they are at present. Correspondingly, it has never been easier to produce and disseminate poor graphics. In this presentation, we highlight some good practice and offer some practical advice in preparing scientific figures for presentation to peers or to the public. We identify common mistakes in visualisation, including some made by the authors, and offer some good reasons not to trust defaults in graphics software. In particular, we discuss the use of colour scales and share our experiences in running a social media campaign (http://tiny.cc/endrainbow) to replace the "rainbow" (also "jet", or "spectral") colour scale as the default in (climate) scientific visualisation.
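One concrete reason to distrust the "rainbow"/"jet" default is that its perceived lightness is not monotonic along the scale, so the colour map implies false structure in the data. The sketch below checks this with a standard luma formula (Rec. 709 coefficients); the RGB stops only approximate the jet colormap, they are not taken from any particular graphics library.

```python
def luminance(rgb):
    """Approximate perceived lightness of an RGB triple (Rec. 709 luma)."""
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def monotonic(values):
    return all(a <= b for a, b in zip(values, values[1:]))

# Rough stops along a jet-like rainbow scale: dark blue -> blue -> cyan
# -> yellow -> red -> dark red.
jet_like = [(0.0, 0.0, 0.5), (0.0, 0.0, 1.0), (0.0, 1.0, 1.0),
            (1.0, 1.0, 0.0), (1.0, 0.0, 0.0), (0.5, 0.0, 0.0)]
grayscale = [(x / 5, x / 5, x / 5) for x in range(6)]

print(monotonic([luminance(c) for c in jet_like]))   # False: lightness rises, then falls
print(monotonic([luminance(c) for c in grayscale]))  # True
```

Perceptually uniform scales such as "viridis" are designed so that this lightness check passes, which is one reason campaigns like the one above recommend them as defaults.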

Scientific publishing is the ultimate product of a scientist's work. The number of publications and their citations are measures of a scientist's success, while unpublished research is invisible to the scientific community and, as such, nonexistent. Researchers build on the work of their predecessors, and the extent to which one scientist's work serves as a source for other authors is the verification of its contribution to the growth of human knowledge. An author who has published an article in a scientific journal cannot publish the same article in another journal with only minor adjustments, or without quoting the parts of the first article that are reused. Copyright infringement occurs when the author of a new article, with or without mentioning the source, uses substantial portions of previously published articles, including tables and figures. In accordance with the principles of Good Scientific Practice (GSP) and Good Laboratory Practice (GLP), scientific institutions and universities should have a centre for the monitoring, security, promotion and development of research quality. Establishing rules of good scientific practice, and complying with them, is the obligation of every research institution, university and individual researcher, regardless of the area of science investigated. In this way, internal quality control ensures that a research institution such as a university assumes responsibility for creating an environment that promotes standards of excellence, intellectual honesty and legality. Although truth should be the aim of scientific research, it is not the guiding principle for all scientists. The best way to reach the truth and to avoid methodological and ethical mistakes is to consistently apply scientific methods and ethical standards in research. Although variously defined, plagiarism is basically intended to deceive the reader about one's own scientific contribution. There is no general regulation of control of

Since 1988, the Scientific Visualization Studio (SVS) at NASA Goddard Space Flight Center has produced scientific visualizations of NASA's scientific research and remote sensing data for public outreach. These visualizations take the form of images, animations, and end-to-end systems and have been used in many venues: from the network news to science programs such as NOVA, from museum exhibits at the Smithsonian to White House briefings. This presentation will give an overview of the major activities and accomplishments of the SVS, and some of the most interesting projects and systems developed at the SVS will be described. Particular emphasis will be given to the practices and procedures by which the SVS creates visualizations, from the hardware and software used to the structures and collaborations by which products are designed, developed, and delivered to customers. The web-based archival and delivery system for SVS visualizations at svs.gsfc.nasa.gov will also be described.

The way we record knowledge, and the web of technical, formal, and social practices that surrounds it, inevitably affects the knowledge that we record. The ways we hold knowledge about the past - in handwritten manuscripts, in printed books, in file folders, in databases - shape the kind of stories we tell about that past. In this talk, I look at how over the past two hundred years, information technology has affected the nature and production of scientific knowledge. Further, I explore ways in which the emergent new cyberinfrastructure is changing our relationship to scientific practice.

Usability, most often defined as the ease of use and acceptability of a system, affects the users' performance and their job satisfaction when working with a machine. Therefore, usability is a very important aspect which must be considered in the process of a system development. The paper presents numerical data related to the history of the scientific research of the usability of information systems, as it is viewed in the information provided by three important scientific databases, Science Direct, ACM Digital Library and IEEE Xplore Digital Library, for different queries related to this field.

Presents the citation for Theodore P. Beauchaine, who received the Award for Distinguished Scientific Early Career Contributions to Psychology (psychopathology) "for core contributions in developmental psychopathology, especially related to the biological underpinnings of various mental disorders among children, sophisticated and elegant quantitative approaches to these issues, and exemplary work on the prevention of such conditions." A brief profile and a selected bibliography accompany the citation. ((c) 2006 APA, all rights reserved).

A description of instrumentation used in the Loss-of-Fluid Test (LOFT) large break Loss-of-Coolant Experiments is presented. Emphasis is placed on hydraulic and thermal measurements in the primary system piping and components, reactor vessel, and pressure suppression system. In addition, instrumentation which is being considered for measurement of phenomena during future small break testing is discussed

Nuclear medicine instruments are rather sophisticated. They are difficult to maintain in effective working condition, especially in developing countries. The present document describes a survey conducted in Bangladesh, India, Malaysia, Pakistan, Philippines, Singapore, Sri Lanka and Thailand from October 1977 to March 1978, on the use and maintenance of nuclear medicine equipment. The survey evaluated the existing problems of instrument maintenance in the 8 countries visited. The major instruments in use were (1) scintillation probe counters, (2) well scintillation counters and (3) rectilinear cameras. The gamma camera was not widely available in the region at the time of the survey. Most of the surveyed instruments were kept in a detrimental environment, resulting in a high failure rate that caused the relatively high instrument unavailability of 11%. Insufficient bureaucratic handling of repair cases, difficulties with the supply of spare and replacement parts, and lack of training proved to be the main reasons for long periods of instrument inoperation. Remedial actions, based on the survey data, have been initiated.

As the strategic knowledge gaps mature for the exploration of Mars, Mars sample return (MSR), and Phobos/Deimos missions, one approach that becomes more probable involves smaller science instrumentation and integrated science suites. Recent technological advances provide the foundation for a significant evolution of instrumentation; however, the funding support is currently too small to fully utilize these advances. We propose that an increase in funding for instrumentation development occur in the near term so that these foundational technologies can be applied. These instruments would directly address the significant knowledge gaps for humans to Mars orbit, humans to the Martian surface, and humans to Phobos/Deimos. They would also address the topics covered by the Decadal Survey and the Mars scientific goals, objectives, investigations and priorities as stated by the MEPAG. We argue that an increase of science instrumentation funding would be of great benefit to the Mars program as well as to the potential for human exploration of the Mars system. If the total non-Earth-related planetary science instrumentation budget were increased by 100%, it would not add an appreciable amount to the overall NASA budget and would provide the real potential for future breakthroughs. If such an approach were implemented in the near term, NASA would benefit greatly in terms of scientific knowledge of the Mars and Phobos/Deimos system, exploration risk mitigation, technology development, and public interest.

The development of the Internet and sophisticated search engines such as Google, together with the spread of social networks, has introduced new marketing possibilities for addressing potential clients with offers of goods and services. Unlike most traditional marketing procedures, these instruments allow business information to be targeted directly at concrete individuals, taking into consideration their age, sex, education and hobbies, all based on the words they key into the search engines. This is targeted advertising in which consumer response can be accurately measured, i.e. so-called contextual advertising. The purpose of this paper is to analyse the legal aspects of some of the above-mentioned internet marketing instruments, as even in this sphere legal regulation clearly lags behind the dynamically developing possibilities of the Internet as a means of communication. When these marketing methods are viewed from the perspective of valid laws, several problem areas may be detected, concerning the right to privacy of natural persons, intellectual property, and the legal regulation of implied or unsolicited advertising. This paper concentrates on a summary of the rules of law that regulate the privacy protection of internet users with respect to Czech and Community law, an assessment of their efficiency, and de lege ferenda considerations.

This paper investigates selected short- and mid-term effects in trade in goods between the Visegrad countries (V4: the Czech Republic, Hungary, Poland and the Slovak Republic) and the Republic of Korea under the framework of the Free Trade Agreement between the European Union and the Republic of Korea. This Agreement is described in the “Trade for All” (2015: 9) strategy as the most ambitious trade deal ever implemented by the EU. The primary purpose of our analysis is to identify, compare, and evaluate the evolution of the technological sophistication of bilateral exports and imports. Another dimension of the paper concentrates on the developments within intra-industry trade. Moreover, these objectives are approached taking into account the context of the South Korean direct investment inflow to the V4. The evaluation of technological sophistication is based on UNCTAD’s methodology, while the intensity of intra-industry trade is measured by the GL-index and the identification of its subcategories (horizontal and vertical trade). The analysis covers the timespan 2001–2015. The novelty of the paper lies in the fact that the study of South Korean-V4 trade relations has not so far been carried out from this perspective. Thus this paper investigates interesting phenomena identified in the trade between the Republic of Korea (ROK) and the V4 economies. The main findings imply an impact of South Korean direct investments on trade. This is represented by the trade deficit of the V4 with the ROK and the structure of bilateral trade in terms of its technological sophistication. South Korean investments might also have had positive consequences for the evolution of IIT, particularly in the machinery sector. The political interpretation indicates that they may strengthen common threats associated with the middle-income trap, particularly the technological gap and the emphasis placed on lower costs of production.
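The GL-index mentioned above is the Grubel-Lloyd index, which measures the share of intra-industry trade in a sector: GL = 1 - |X - M| / (X + M), ranging from 0 (purely one-way, inter-industry trade) to 1 (fully two-way trade). The sketch below computes it per sector; the trade figures are invented for illustration, not V4-Korea data.

```python
def grubel_lloyd(exports, imports):
    """Grubel-Lloyd index of intra-industry trade for one sector."""
    total = exports + imports
    if total == 0:
        return 0.0  # no trade: treat as no intra-industry trade
    return 1.0 - abs(exports - imports) / total

# Hypothetical sectoral exports and imports (millions, same currency).
sectors = {
    "machinery":   (120.0, 100.0),
    "chemicals":   (30.0, 90.0),
    "electronics": (10.0, 200.0),
}

for name, (x, m) in sectors.items():
    print(f"{name}: GL = {grubel_lloyd(x, m):.3f}")
# machinery: GL = 0.909   (mostly two-way trade)
# chemicals: GL = 0.500
# electronics: GL = 0.095 (mostly one-way imports)
```

Distinguishing the horizontal and vertical subcategories additionally requires comparing unit values of exports and imports, which this sketch omits.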

The Department of Energy (DOE) designated the Advanced Test Reactor (ATR) as a National Scientific User Facility (NSUF) in April 2007 to support U.S. leadership in nuclear science and technology. By attracting new research users - universities, laboratories, and industry - the ATR will support basic and applied nuclear research and development, further advancing the nation's energy security needs. A key component of the ATR NSUF effort is to prove new in-pile instrumentation techniques that are capable of providing real-time measurements of key parameters during irradiation. To address this need, an assessment of instrumentation available and under-development at other test reactors has been completed. Based on this review, recommendations are made with respect to what instrumentation is needed at the ATR and a strategy has been developed for obtaining these sensors. Progress toward implementing this strategy is reported in this document. It is anticipated that this report will be updated on an annual basis.

Concepts of scientific instruments designed to perform infrared astronomical tasks such as imaging, photometry, and spectroscopy are discussed as part of the Space Infrared Telescope Facility (SIRTF) project under definition study at NASA/Ames Research Center. The instruments are: the multiband imaging photometer, the infrared array camera, and the infrared spectrograph. SIRTF, a cryogenically cooled infrared telescope in the 1-meter range operating at wavelengths as short as 2.5 microns and carrying multiple instruments with high sensitivity and low-background performance, provides the capability to carry out basic astronomical investigations such as the deep search for very distant protogalaxies, quasi-stellar objects, and missing mass; infrared emission from galaxies; star formation and the interstellar medium; and the composition and structure of the atmospheres of the outer planets in the solar system. 8 refs

This is a report on scientific research at DESY in 1972. The activities in the fields of electron-nucleon scattering, photoproduction and synchrotron radiation receive special mention. Work on the double storage ring, as well as on the extension of the synchrotron, is also reported. (WL/LN)

In order to reduce the knowledge divide, more Open Access Journals (OAJ) are needed in all languages and scholarly subject areas that exercise peer review or editorial quality control. To finance the necessary costs, it is discussed why and how to sell target-specific advertisements by associating ads with given scientific keywords. (author)

This annual scientific report gives a concise overview of research and development activities at the Belgian Nuclear Research Centre SCK-CEN in 2007. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge and fusion research

The findings of R+D activities of the HMI radiation chemistry department in the fields of pulsed radiolysis, reaction kinetics, insulators and plastics are presented, as well as the scientific publications and lectures of HMI staff and visitors, including theoretical contributions, theses and dissertations, and conference papers. (HK)

The annual scientific report gives an overview of the R and D activities at the Belgian Nuclear Research Centre SCK-CEN in 2001. The report discusses progress and main achievements in four principal areas: Radiation Protection, Radioactive Waste and Clean-up, Reactor Safety and the BR2 Reactor

The annual scientific report gives a summary overview of the research and development activities at the Belgian Nuclear Research Centre SCK-CEN in 2005. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge and fusion research.

The questions posed in yesterday's posts about hopes for 2008 were half of what we were asked by the Powers That Be. The other half: What scientific development do you fear you'll be blogging or reading about in 2008?

The annual scientific report gives a summary overview of the research and development activities at the Belgian Nuclear Research Centre SCK-CEN in 2004. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge and fusion research

Scientific Medical Journal, an official journal of Egyptian Medical Education, provides a forum for the dissemination of knowledge and the exchange of ideas, information and experience among workers, investigators and clinicians in all disciplines of medicine, with emphasis on treatment and prevention.

A method for assessing scientific performance based on relationships displayed numerically in published documents is proposed and illustrated using published documents in pediatric oncology for the period 1979-1982. Contributions of a major clinical investigations group, the Childrens Cancer Study Group, are analyzed. Twenty-nine references are…

The annual scientific report gives a summary overview of the research and development activities at the Belgian Nuclear Research Centre SCK-CEN in 2006. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge and fusion research.

The annual scientific report gives an overview of the R and D activities at the Belgian Nuclear Research Centre SCK-CEN in 2003. The report discusses progress and main achievements in the following areas: reactor safety, radioactive waste and clean-up, radiation protection, the BR2 reactor, nuclear research and society, managing nuclear knowledge, and fusion research.

... information in policymaking. The selection of scientists and technology professionals for positions in the... Administration on a wide range of issues, including improvement of public health, protection of the environment... technological findings and conclusions. If scientific and technological information is developed and used by the...

A report is given on the scientific research at DESY in 1973, which included the first storage of electrons in the double storage ring DORIS. The two large spectrometers PLUTO and DASP are also mentioned, and experiments relating to elementary particles and synchrotron radiation, as well as improvements to the equipment, are described. (WL/AK)

This paper presents and comments on Mario Bunge's scientific realism. After a brief introduction in Sect. 1, Sect. 2 outlines Bunge's conception of realism. Focusing on the case of quantum mechanics, Sect. 3 explores how his approach plays out for problematic theories. Section 4 comments on Bunge's project against the background of the current…

This annual scientific report of SCK-CEN presents comprehensive coverage of research activities in the fields of (a) waste and site restoration, (b) reactor safety and radiation protection, (c) operation of the BR2 Materials Testing Reactor, and (d) services provided by the centre (analysis for characterization of waste packages, nuclear measurements, low-level radioactivity measurements).

Reproducibility of experiments is considered one of the main principles of the scientific method. Recent developments in data- and computation-intensive science, i.e. e-Science, and the state of the art in Cloud computing provide the necessary components to preserve data sets and re-run code and

This annual scientific-technological report provides an overview of research and development activities at the Peruvian Institute of Nuclear Energy (IPEN) during the period from 1 January to 31 December 2003. This report includes 54 papers divided into 9 subject areas: physics and nuclear chemistry, nuclear engineering, materials science, radiochemistry, industrial applications, medical applications, environmental applications, protection and radiological safety, and management aspects

Scientific tourism is a relatively new direction in the world; however, it has already managed to gain great popularity. It arose in the 1980s, but its ideological basis comes from the earliest periods of human history. In Armenia it is a completely new phenomenon, still unfamiliar to many people. At the global level, scientific tourism has several definitions: for example, Mrs. Pichelerova, a member of the scientific tourist centre of Zlovlen, explains that "the essence of the scientific tourism is based on the provision of the educational, cultural and entertainment needs of a group of people who are interested in the same thing", which in our opinion is a very comprehensive and discreet definition. We also have our own views on this type of tourism. Our philosophy is that, while keeping to the general principles, we put the emphasis on strengthening science-individual ties. Our main emphasis is on scientific-experimental tourism, but this does not mean that we take no steps toward other forms of tourism. Studying the global experience and combining it with our resources, we are trying to arrive at a new interdisciplinary science, which will bring together a number of different professionals as well as individuals and, as a result, will yield new lore. It is in this way that an astronomer will become an archaeologist, an archaeologist will become an astrophysicist, etc. Speaking of interdisciplinary sciences, it is worth mentioning that in recent years their role at the global level has been considered more and more important. In these terms, tourism is an excellent platform for the creation of interdisciplinary sciences and, therefore, for the preparation of corresponding scholars. Scientific tourism is also very important for the revelation, appreciation and promotion of a country's historical-cultural heritage and scientific potential. Let us not forget either that tourism in all its

To increase students' confidence in giving scientific presentations, students were shown how to present scientific findings as a narrative story. Students who were preparing to give a scientific talk attended a workshop in which they were encouraged to experience the similarities between telling a personal anecdote and presenting scientific data.…

Instrumentation is not a clearly defined subject, having a 'fuzzy' boundary with a number of other disciplines. Often categorized as either 'techniques' or 'applications', this book addresses the various applications that may be needed with reference to the practical techniques that are available for the instrumentation or measurement of a specific physical quantity or quality. This makes it of direct interest to anyone working in the process, control and instrumentation fields where these measurements are essential.
* Comprehensive and authoritative collection of technical information
* Writte

This book deals with the latest radiation instruments and comprises eight chapters. It covers X-ray instruments for medical treatment; the X-ray tube and its permissible load, with its history, structure and characteristics; high-voltage apparatus with high-voltage rectifier circuits; X-ray control apparatus for medical treatment; X-ray imaging equipment, including X-ray television apparatus and CCD; installation and types of X-ray apparatus; digital X-ray apparatus with CR and DR; and performance management of X-ray equipment for medical treatment, with its history, necessity and management in the radiation field.

Jones' Instrument Technology, Volume 5: Automatic Instruments and Measuring Systems deals with general trends in automatic instruments and measuring systems. Specific examples are provided to illustrate the principles of such devices. A brief review of a considerable number of standards is undertaken, with emphasis on the IEC625 Interface System. Other relevant standards are reviewed, including the interface and backplane bus standards. This volume is comprised of seven chapters and begins with a short introduction to the principles of automatic measurements, classification of measuring system

This essay proposes that our understanding of medical instruments might benefit from adding a more forthright concern with their immediate presence to the current historical focus on simply decoding their meanings and context. This approach is applied to the intriguingly tricky question of what actually is meant by a "medical instrument." It is suggested that a pragmatic part of the answer might lie simply in reconsidering the holdings of medical museums, where the significance of the physical actuality of instruments comes readily to hand.

Full Text Available The Fortran programming language was designed by John Backus and his colleagues at IBM to reduce the cost of programming scientific applications. IBM delivered the first compiler for its model 704 in 1957. IBM's competitors soon offered incompatible versions. ANSI (ASA at the time) developed a standard, largely based on IBM's Fortran IV, in 1966. Revisions of the standard were produced in 1977, 1990, 1995 and 2003. Development of a revision, scheduled for 2008, is under way. Unlike most other programming languages, Fortran is periodically revised to keep pace with developments in language and processor design, while revisions largely preserve compatibility with previous versions. Throughout, the focus on scientific programming, and especially on efficient generated programs, has been maintained.

The 1997 Scientific Report of the Belgian Nuclear Research Centre SCK-CEN describes progress achieved in nuclear safety, radioactive waste management, radiation protection and safeguards. In the field of nuclear research, the main projects concern the behaviour of high-burnup and MOX fuel, the embrittlement of reactor pressure vessels, the irradiation-assisted stress corrosion cracking of reactor internals, and irradiation effects on materials of fusion reactors. In the field of radioactive waste management, progress in the following domains is reported: the disposal of high-level radioactive waste and spent fuel in a clay formation, the decommissioning of nuclear installations, the study of alternative waste-processing techniques. For radiation protection and safeguards, the main activities reported on are in the field of site and environmental restoration, emergency planning and response and scientific support to national and international programmes

The aim of this report is to outline the main developments of the 'Departement des Reacteurs Nucleaires' (DRN) during the year 1999. DRN is one of the CEA Institutions. This report is divided into three main parts: the DRN scientific programs, the scientific and technical publications (with abstracts in English) and economic data on staff, budget and communication. Main results of the Department for the year 1999 are presented, giving information on the simulation of low Mach number compressible flow, experimental irradiation of multi-materials, progress in the dry route conversion process of UF6 to UO2, neutronics, the CASCADE installation, corium, BWR-type reactor core technology, reactor safety, the transmutation of americium and fuel cell flow studies, crack propagation, hybrid systems and the CEA sites improvement. (A.L.B.)

Scientific publications have become a mainstay of communication among readers, academicians, researchers and scientists worldwide. Although their existence dates back to the 17th century in the West, Nepal has been struggling to take even a few steps towards improving its local science for the last 50 years. Since the start of the first medical journal in 1963, the challenges remain as they were decades ago regarding the roles of authors, peer reviewers, editors and even publishers in Nepal. Although there has been some development in terms of the number of articles being published and the appearance of the journals, there is still a long way to go. This article analyzes the past and present scenario, and future perspectives for scientific publications in Nepal.

Sherlock Holmes was intended by his creator, Arthur Conan Doyle, to be a 'scientific detective'. Conan Doyle criticized his predecessor Edgar Allan Poe for giving his creation - Inspector Dupin - only the 'illusion' of scientific method. Conan Doyle believed that he had succeeded where Poe had failed; thus, he has Watson remark that Holmes has 'brought detection as near an exact science as it will ever be brought into the world.' By examining Holmes' methods, it becomes clear that Conan Doyle modelled them on certain images of science that were popular in mid- to late-19th century Britain. Contrary to a common view, it is also evident that rather than being responsible for the invention of forensic science, the creation of Holmes was influenced by the early development of it.

This monograph investigates the collaborative creation of scientific knowledge in research groups. To do so, I combine philosophical analysis with a first-hand comparative case study of two research groups in experimental science. Qualitative data are gained through observation and interviews, and I combine empirical insights with existing approaches to knowledge creation in philosophy of science and social epistemology. On the basis of my empirically-grounded analysis I make several conceptual contributions. I study scientific collaboration as the interaction of scientists within research … to their publication. Specifically, I suggest epistemic difference and the porosity of social structure as two conceptual leitmotifs in the study of group collaboration. With epistemic difference, I emphasize the value of socio-cognitive heterogeneity in group collaboration. With porosity, I underline the fact …

The aim of this report is to outline the main developments of the 'Departement des Reacteurs Nucleaires' (DRN) during the year 1998. DRN is one of the CEA Institutions. This report is divided into three main parts: the DRN scientific programs, the scientific and technical publications (with abstracts in English) and economic data on staff, budget and communication. Main results of the Department for the year 1998 are presented, giving information on reactor technology and safety, neutronics, transmutation and hybrid systems, dismantling and site improvement, nuclear accidents, nuclear material transport, thermonuclear fusion safety, fuel cladding materials and radioactive waste control. (A.L.B.)

Inquisitive minds in our society are never satisfied with curated images released by a typical public affairs office. They always want to look deeper and play directly with original data. However, most scientific data products are notoriously hard to use. They are immensely large, highly distributed and diverse in format. In this presentation, we will demonstrate Resource EXplorer (REX), a novel webtop application that allows anyone to conveniently explore and visualize rich scientific data repositories, using only a standard web browser. This tool leverages the power of Webification Science (w10n-sci), a powerful enabling technology that simplifies the use of scientific data on the web platform. W10n-sci is now being deployed at an increasing number of NASA data centers, some of which are the largest digital treasure troves in our nation. With REX, these wonderful scientific resources are open for teachers and students to learn and play.

In this article I propose the classification of the evolutionary stages that a scientific discipline evolves through and the type of scientists that are the most productive at each stage. I believe that each scientific discipline evolves sequentially through four stages. Scientists at stage one introduce new objects and phenomena as subject matter for a new scientific discipline. To do this they have to introduce a new language adequately describing the subject matter. At stage two, scientists develop a toolbox of methods and techniques for the new discipline. Owing to this advancement in methodology, the spectrum of objects and phenomena that fall into the realm of the new science are further understood at this stage. Most of the specific knowledge is generated at the third stage, at which the highest number of original research publications is generated. The majority of third-stage investigation is based on the initial application of new research methods to objects and/or phenomena. The purpose of the fourth stage is to maintain and pass on scientific knowledge generated during the first three stages. Groundbreaking new discoveries are not made at this stage. However, new ways to present scientific information are generated, and crucial revisions are often made of the role of the discipline within the constantly evolving scientific environment. The very nature of each stage determines the optimal psychological type and modus operandi of the scientist operating within it. Thus, it is not only the talent and devotion of scientists that determines whether they are capable of contributing substantially but, rather, whether they have the 'right type' of talent for the chosen scientific discipline at that time. Understanding the four different evolutionary stages of a scientific discipline might be instrumental for many scientists in optimizing their career path, in addition to being useful in assembling scientific teams, precluding conflicts and maximizing

A properly conditioned AC power supply is necessary for reliable functioning of instruments. Electric mains power is produced primarily for industry, workshops, lighting and household uses, and its quality is adjusted to these uses. In areas and countries with fast-growing demand for electric power, these requirements are far from being met. Electronic instruments and computers, especially in these countries, need protection against disturbances of the mains supply. A clean and dry environment is needed for reliable functioning and long life of instruments. High humidity, especially at higher temperatures, changes the characteristics of electronic components. Moreover, under these conditions fungal growth causes leakage currents and corrosion causes poor contacts. The presence of dust enhances these effects. These factors give rise to malfunction of instruments, particularly of high-voltage equipment

National Aeronautics and Space Administration — This work will extend and proof-out the design concept for a high pixel count (128 pixels in 2 bands) submillimeter-wave heterodyne receiver array instrument for the...

This is a general presentation of fiber optics instrumentation development work being conducted at NASA Dryden for the past 10 years and recent achievements in the field of fiber optics strain sensors.

This paper reports on Nuclear Instrument Technician (NIT) training, which has developed at an accelerated rate over the past three decades. During the 1960s, commercial nuclear power plants were in their infancy. For that reason, it is little wonder that NIT training had little structure and little credibility; in many early plants it was little more than on-the-job training (OJT). The seventies brought changes in instrumentation and controls as well as emphasis on requirements for more in-depth training and documentation. As in the seventies, the eighties saw not only changes in technologies but also tighter requirements, standardized training and the development of accredited Nuclear Instrument Training; thus the conclusion: Nuclear Instrument Training isn't what it used to be

This page outlines the major differences between Renewable Energy Certificates (REC) and Project Offsets and what types of claims each instrument allows the organization to make in regards to environmental emissions claims.

Full Text Available The professional blog is a weblog that on the whole meets the requirements of scientific publication. In my opinion it bears a resemblance to a digital notice board, where competent specialists of a given branch of science can post their ideas, questions and possible solutions, and can raise problems. Its most important function can be the collectivization of knowledge. In this article I am going to examine the characteristics of the scientific blog as a genre. Conventional learning counts as a rather solitary activity. If students have access to the materials of each other and of the teacher, their sense of solitude diminishes, and this model is also closer to the constructivist approach that reflects the way most people think and learn. Learning does not mean passively collecting tiny pieces of knowledge; it much more resembles 'spinning a conceptual net' made up of the experiences and observations of the individual. With the spread of the Internet, many universities and colleges worldwide have tried online educational methods, but the most efficient one has not been found yet. The publication of the curriculum (the material of the lectures) and the handling of electronic mail are not sufficient; much more is needed for collaborative learning. Our scholastic scientific blog can be a suitable field for the start of a knowledge-building process based on cooperation. The Rocard report states that for the future of Europe it is crucial to develop the education of the natural sciences, and for this it is necessary to act at local, regional, national and EU levels. Beyond the traditional actors (child, parent, teacher), others (scientists, professionals, universities, local institutions, the actors of the economic sphere, etc.) should be involved in the educational processes. The scholastic scientific blog answers these purposes as a collaborative knowledge-sharing forum.

The R+D projects in this field and the infrastructural tasks mentioned are handled in seven working groups and two project groups: computer systems; numerical and applied mathematics; software development; process computation systems hardware; nuclear electronics, measuring and automatic control techniques; research of component parts and irradiation tests; central data processing; processing of process data in medicine; and cooperation in the BERNET project at the 'Wissenschaftliches Rechenzentrum Berlin (WRB)' (scientific computing centre in Berlin). (orig./WB)

of the system, and post-interviews to understand the participants' views of doing science under both conditions. We hypothesized that study participants would be less effective, report more difficulty, and be less favorably inclined to adopt the system when collaborating remotely. Contrary to expectations … of collaborating remotely. While the data analysis produced null results, considered as a whole, the analysis leads us to conclude there is positive potential for the development and adoption of scientific collaboratory systems.

National scientific program of the Vinca Institute Nuclear Reactors And Radioactive Waste comprises research and development in the following fields: application of energy of nuclear fission, application of neutron beams, analyses of nuclear safety and radiation protection. In the first phase preparatory activities, conceptual design and design of certain processes and facilities should be accomplished. In the second phase realization of the projects is expected. (author)

Instrumentation (1, 2 & 3) by Rhodri Jones (CERN) Wednesday 12, Thursday 13 and Friday 14 November from 11:00 to 12:00 at CERN (40-S2-A01 - Salle Anderson) Description: The LHC is equipped with a full suite of sophisticated beam instrumentation which has been essential for rapid commissioning, the safe increase in total stored beam power and the understanding of machine optics and accelerator physics phenomena. These lectures will introduce these systems and comment on their contributions to the various stages of beam operation. They will include details on: the beam position system and its use for real-time global orbit feedback; the beam loss system and its role in machine protection; total and bunch by bunch intensity measurements; tune measurement and feedback; diagnostics for transverse beam size measurements, abort gap monitoring and longitudinal density measurements. Issues and problems encountered along the way will also be discussed together with the prospect for future upgrades. ...

Full Text Available Everyone working in an ophthalmic operating theatre must be competent in the care, handling, storage, and maintenance of instruments. This will help to improve surgical outcomes, maintain an economic and affordable service for patients, and provide a safe environment for the wellbeing of patients and staff. Including instrument care in theatre courses and in-service training is one way of ensuring staff competence.

The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and phenomena understanding deficiencies. This report provides several methodologies to estimate an instrument's uncertainty when used in experimental work. Methods are shown to predict both the pretest and post-test uncertainty
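A common way to combine independent instrument error components, as in the uncertainty methodologies described above, is the root-sum-square rule. A minimal sketch follows; the error-budget values are hypothetical and not taken from the report:

```python
import math

def combined_uncertainty(components):
    """Root-sum-square combination of independent error components.

    Each component is expressed in the units of the measured quantity;
    correlated errors would need a fuller treatment than this sketch.
    """
    return math.sqrt(sum(c * c for c in components))

# Hypothetical error budget: calibration error, readout resolution, drift
u = combined_uncertainty([0.5, 0.2, 0.1])
print(round(u, 3))  # 0.548
```

Because the components add in quadrature, the largest term dominates: halving the smallest component here barely changes the result.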

Simple experiments involving musical instruments of the woodwind family can be used to demonstrate the basic physics of vibrating air columns in resonance tubes using nothing more than straightforward measurements and data collection hardware and software. More involved experimentation with the same equipment can provide insight into the effects of holes in the tubing and other factors that make simple tubes useful as musical instruments.
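The ideal resonance relations behind such air-column experiments (f_n = nv/2L for a tube open at both ends; odd harmonics f_n = (2n-1)v/4L for a tube closed at one end) can be sketched numerically. The tube length is illustrative, and end corrections and tone-hole effects are deliberately ignored:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def resonance_frequencies(length_m, n_modes=3, closed_one_end=False):
    """Resonance frequencies (Hz) of an idealized air column.

    Open-open tube:   f_n = n * v / (2L),        n = 1, 2, 3, ...
    Closed-open tube: f_n = (2n - 1) * v / (4L)  (odd harmonics only)
    """
    v = SPEED_OF_SOUND
    if closed_one_end:
        return [(2 * n - 1) * v / (4 * length_m) for n in range(1, n_modes + 1)]
    return [n * v / (2 * length_m) for n in range(1, n_modes + 1)]

# A 0.5 m open tube and the same tube closed at one end:
print(resonance_frequencies(0.5))                        # [343.0, 686.0, 1029.0]
print(resonance_frequencies(0.5, closed_one_end=True))   # [171.5, 514.5, 857.5]
```

Comparing these idealized values against measured spectra is exactly where the effects of holes and end corrections mentioned above become visible.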

Full Text Available During the last three decades of the nineteenth century, organizations developed rapidly, and their managers began to realize that managerial problems were becoming too frequent; this awareness led to a new phase in the development of scientific management. Examining the titles published in that period, it can be concluded that the management issues of interest related to payroll and payroll systems, problems exacerbated by the industrial revolution, and work efficiency. Noting that large organizations were losing the power of direct supervision, managers looked for incentives to replace this power. One of the first practitioners of this new management system was Henry R. Towne, president of the well-known enterprise "Yale and Towne Manufacturing Company", who applied these management methods in his company's workshops. The publishers of the magazines "Industrial Management" and "The Engineering Magazine" stated that H. R. Towne is, undisputedly, the pioneer of scientific management. He initiated the systematic application of effective management methods, and his famous article "The Engineer as Economist", presented to the "American Society of Mechanical Engineers" in 1886, probably inspired Frederick W. Taylor to devote his entire life and work to scientific management.

For decades, computer scientists have tried to teach computers to think like human experts. Until recently, most of those efforts have failed to come close to generating the creative insights and solutions that seem to come naturally to the best researchers, doctors, and engineers. But now, Tony Hey, a VP of Microsoft Research, says we're witnessing the dawn of a new generation of powerful computer tools that can "mash up" vast quantities of data from many sources, analyze them, and help produce revolutionary scientific discoveries. Hey and his colleagues call this new method of scientific exploration "machine learning." At Microsoft, a team has already used it to innovate a method of predicting with impressive accuracy whether a patient with congestive heart failure who is released from the hospital will be readmitted within 30 days. It was developed by directing a computer program to pore through hundreds of thousands of data points on 300,000 patients and "learn" the profiles of patients most likely to be rehospitalized. The economic impact of this prediction tool could be huge: If a hospital understands the likelihood that a patient will "bounce back," it can design programs to keep him stable and save thousands of dollars in health care costs. Similar efforts to uncover important correlations that could lead to scientific breakthroughs are under way in oceanography, conservation, and AIDS research. And in business, deep data exploration has the potential to unearth critical insights about customers, supply chains, advertising effectiveness, and more.
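As a minimal illustration of the kind of learning-from-data described above (not Microsoft's actual method or data), a toy readmission classifier can be trained with plain logistic regression; the features and labels below are synthetic:

```python
import math

def train_logistic(X, y, lr=0.1, epochs=500):
    """Per-sample gradient-descent logistic regression (stdlib only)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted readmission risk
            err = p - yi                      # gradient of log-loss wrt z
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Toy synthetic "patient" features: [prior admissions, abnormal lab flag]
X = [[0, 0], [1, 0], [3, 1], [4, 1], [0, 1], [5, 0]]
y = [0, 0, 1, 1, 0, 1]   # 1 = readmitted within 30 days
w, b = train_logistic(X, y)
print(predict(w, b, [4, 1]) > 0.5)  # True: high prior-admission count
```

A real system of the scale described would of course use far richer features, regularization, and careful validation; the sketch only shows the core idea of fitting a risk score to historical outcomes.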

Many radiological surveillance instruments are in use at DOE facilities throughout the country. These instruments are an essential part of all health physics programs, and poor instrument performance can increase program costs or compromise program effectiveness. Generic data from simple tests on newly purchased instruments shows that many instruments will not meet requirements due to manufacturing defects. In other cases, lack of consideration of instrument use has resulted in poor acceptance of instruments and poor reliability. The performance of instruments is highly variable for electronic and mechanical performance, radiation response, susceptibility to interferences and response to environmental factors. Poor instrument performance in these areas can lead to errors or poor accuracy in measurements

UAVSAR is an imaging radar instrument suite that serves as NASA's airborne facility instrument to acquire scientific data for Principal Investigators as well as a radar test-bed for new radar observation techniques and radar technology demonstration. Since commencing operational science observations in January 2009, the compact, reconfigurable, pod-based radar has been acquiring L-band fully polarimetric SAR (POLSAR) data with repeat-pass interferometric (RPI) observations underneath NASA Dryden's Gulfstream-III jet to provide measurements for science investigations in solid earth and cryospheric studies, vegetation mapping and land use classification, archaeological research, soil moisture mapping, geology and cold land processes. In the past year, we have made significant upgrades to add new instrument capabilities and new platform options to accommodate the increasing demand for UAVSAR to support scientific campaigns to measure subsurface soil moisture, acquire data in the polar regions, and for algorithm development, verification, and cross-calibration with other airborne/spaceborne instruments.

Modern societies depend on a growing production of scientific knowledge, which is based on the functional differentiation of science into still more specialised scientific disciplines and subdisciplines. This is the basis for the paradox of scientific expertise: the growth of science leads to a fragmentation of scientific expertise. To resolve this paradox, the present paper investigates three hypotheses: 1) All scientific knowledge is perspectival. 2) The perspectival structure of science leads to specific forms of knowledge asymmetries. 3) Such perspectival knowledge asymmetries must be handled … in cross-disciplinary research and in the collective use of different kinds of scientific expertise, and thereby make society better able to solve complex, real-world problems.

Full Text Available In this paper, the author attempts to elucidate the validity of Plato's criticism of Parmenides' simplified monistic ontology, as well as his concept of non-being. In contrast to Parmenides, Plato introduces a more complex ontology of the megista gene and redefines Parmenides' concept of non-being as something absolutely different from being. According to Plato, not all things are in the same sense, i.e. they have different ontological statuses. Additionally, he redefines Parmenides' concept of absolute non-being as 'difference' or 'otherness'.

The control systems at Sellafield fuel handling plant are described. The requirements called for built-in diagnostic features as well as the ability to handle a large sequencing application. Speed was also important; responses better than 50ms were required. The control systems are used to automate operations within each of the three main process caves - two Magnox fuel decanners and an advanced gas-cooled reactor fuel dismantler. The fuel route within the fuel handling plant is illustrated and described. ASPIC (Automated Sequence Package for Industrial Control) which was developed as a controller for the plant processes is described. (U.K.)

The Consider, Read, Elucidate hypotheses, Analyze and interpret data, Think of the next Experiment (CREATE) strategy for teaching and learning uses intensive analysis of primary literature to improve students’ critical-thinking and content integration abilities, as well as their self-rated science attitudes, understanding, and confidence. CREATE also supports maturation of undergraduates’ epistemological beliefs about science. This approach, originally tested with upper-level students, has been adapted in Introduction to Scientific Thinking, a new course for freshmen. Results from this course's initial semesters indicate that freshmen in a one-semester introductory course that uses a narrowly focused set of readings to promote development of analytical skills made significant gains in critical-thinking and experimental design abilities. Students also reported significant gains in their ability to think scientifically and understand primary literature. Their perceptions and understanding of science improved, and multiple aspects of their epistemological beliefs about science gained sophistication. The course has no laboratory component, is relatively inexpensive to run, and could be adapted to any area of scientific study. PMID:23463229

The widespread release of activity and the resultant spread of contamination after the Chernobyl accident resulted in requests to NRPB to provide instruments for, and expertise in, the measurement of radiation. The most common request was for advice on the usefulness of existing instruments, but Board staff were also involved in their adaptation or in the development of new instruments specifically to meet the circumstances of the accident. The accident occurred on 26 April. On 1 May, NRPB was involved at Heathrow Airport in the monitoring of the British students who had returned from Kiev and Minsk. The main purpose was to reassure the students by checking that their persons and belongings did not have significant surface contamination. Additional measurements were also made of iodine activity in the thyroid using hand-held detectors or a mobile body monitor. This operation was arranged with the Foreign and Commonwealth Office, which had also received numerous requests for instruments from embassies and consulates in countries close to the scene of the accident. There was concern for the well-being of staff and other United Kingdom nationals who resided in or intended to visit the most affected countries. The Board supplied suitable instruments, and the FCO distributed them to embassies. The frequency of environmental monitoring was increased from 29 April in anticipation of contamination, and appropriate Board instrumentation was deployed. After the Chernobyl cloud arrived in the UK on 2 May, there were numerous requests from local government, public authorities, private companies and members of the public for information and advice on monitoring equipment and procedures. Some of these requirements could be met with existing equipment, but members of the public were usually advised not to proceed. At a later stage, the contamination of foodstuffs and livestock required the development of an instrument capable of detecting low levels of 137Cs and 134Cs in food

Discusses the philosophical strengths and weaknesses of Laudan's normative naturalism, which understands the principles of scientific method to be akin to scientific hypotheses, and therefore open to test like any principle of science. Contains 19 references. (Author/WRM)

The purpose of this work is to describe feasible measuring and monitoring alternatives which can be used, if needed, in medium to full scale nuclear waste repository deposition hole mock-up tests. The focus of the work was to determine what variables can actually be measured, how to achieve the measurements, and what demands come from the modelling, scientific and technical points of view. The project includes a review of previous waste repository mock-up tests carried out in several European countries, such as Belgium, the Czech Republic, Spain and Sweden. Information was also gathered by interviewing domestic and foreign scientists specialized in measurement instrumentation and related in-situ and laboratory work. On the basis of this review, recommendations were developed for the actions needed, from the instrumentation point of view, for future tests. It is possible to measure and monitor the processes going on in a deposition hole under in-situ conditions. The data received during a test in real repository conditions make it possible to follow the processes and to verify the hypotheses made on the behaviour of the various components of the repository: buffer, canister, rock and backfill. Because full-scale testing is expensive, the objectives and hypotheses must be carefully set, and the test itself with its instrumentation must serve very specific objectives. The main purpose of mock-up tests is to verify that the conditions surrounding the canister meet the design requirements. A whole mock-up test and demonstration process requires a lot of time and effort. The instrumentation work must also start at an early stage to ensure that the instrumentation itself does not become a bottleneck or suffer from low-quality solutions. The planning of the instrumentation work could be done in collaboration with foreign scientists who have participated in previous instrumentation projects. (orig.)

Marie Curie is best known for her discovery of radium one hundred years ago this month, but she also worked closely with industry in developing methods to make and monitor radioactive material, as Soraya Boudia explains. One hundred years ago this month, on 28 December 1898, Pierre Curie, Marie Sklodowska-Curie and Gustave Bemont published a paper in Comptes-rendus - the journal of the French Academy of Sciences. In the paper they announced that they had discovered a new element with astonishing properties: radium. But for one of the authors, Marie Curie, the paper was more than just the result of outstanding work: it showed that a woman could succeed in what was then very much a male-dominated scientific world. Having arrived in Paris from Poland in 1891, Marie Curie became the first woman in France to obtain a PhD in physics, the first woman to win a Nobel prize and the first woman to teach at the Sorbonne. She also helped to found a new scientific discipline: the study of radioactivity. She became an icon and a role-model for other women to follow, someone who succeeded - despite many difficulties - in imposing herself on the world of science. Although Curie's life story is a familiar and well documented one, there is one side to her that is less well known: her interaction with industry. As well as training many nuclear physicists and radiochemists in her laboratory, she also became a scientific pioneer in industrial collaboration. In this article the author describes this side of Marie Curie. (UK)

The methods for measuring radiation are briefly reviewed. The instrumentation for neutron flux measurement is classified into out-of-core and in-core instrumentation. The out-of-core instrumentation monitors the operational range from the subcritical reactor to full power. This large range is covered by several measurement channels which derive their signals from counter tubes and ionization chambers. The in-core instrumentation provides more detailed information on the power distribution in the core. The self-powered neutron detectors and the aeroball system in PWR reactors are discussed. Temperature and pressure measurement devices are briefly discussed. The different methods for leak detection are described. In concluding the plant instrumentation part, some new monitoring systems and analysis methods are presented: early failure detection by noise analysis, acoustic monitoring and vibration monitoring. The presentation of the control starts from a qualitative assessment of the reactor dynamics. The chosen control strategy leads to the definition of the part-load diagram, which provides the set-points for the different control systems. The tasks and the functions of these control systems are described. In addition to the control, a number of limiting systems are employed to keep the reactor in a safe operating region. Finally, an outlook is given on future developments in control, concerning mainly the increased application of process computers. (orig./RW)

The second edition of Instrumental Analysis is a survey of the major instrument-based methods of chemical analysis. It appears to be aimed at undergraduates but would be equally useful in a graduate course. The volume explores all of the classical quantitative methods and contains sections on techniques that usually are not included in a semester course in instrumentation (such as electron spectroscopy and the kinetic methods). Adequate coverage of all of the methods contained in this book would require several semesters of focused study. The 25 chapters were written by different authors, yet the style throughout the book is more uniform than in the earlier edition. With the exception of a two-chapter course in analog and digital circuits, the book purports to de-emphasize instrumentation, focusing more on the theory behind the methods and the application of the methods to analytical problems. However, a detailed analysis of the instruments used in each method is by no means absent. The book has the flavor of a user's guide to analysis

Purpose – The purpose of this paper is to investigate to what extent male and female PhDs choose academic vs non‐academic employment. Further, it analyses gender earnings differences in the academic and non‐academic labour markets. Design/methodology/approach – Rich Swedish cross‐sectional regist...... scientific human capital. Originality/value – The study is the first to investigate career‐choice and earnings of Swedish PhDs. Further, the study is the first to investigate both the academic and the non‐academic labour markets....

This scientific report of the Fuel Cycle Direction of the CEA presents the Direction's activities and research programs in the fuel cycle domain during the year 1999. The first chapter is devoted to the front end of the fuel cycle, with the SILVA process as its main topic. The second chapter is largely based on the separation chemistry of the back end of the cycle. The third and fourth chapters present more applied and sometimes more technical developments, within the nuclear industry and beyond. (A.L.B.)

The main activities of SCK/CEN during 1974 are reported in individual summaries. Fields of research are the following: sodium cooled fast reactors, gas cooled reactors, light water reactors, applied nuclear research (including waste disposal, safeguards and fusion research), basic and exploratory research (including materials science, nuclear physics and radiobiology). The BR2 Materials testing reactor and associated facilities are described. The technical and administrative support activities are also presented. A list of publications issued by the SCK/CEN Scientific staff is given

Technological and scientific innovations have increased exponentially over the past years in the dentistry profession. In this article, these developments are evaluated both in terms of clinical practice and their place in the educational program. The effects of the biologic and digital revolutions on dental education and daily clinical practice are also reviewed. Biomimetics, personalized dental medicine, regenerative dentistry, nanotechnology, high-end simulations providing virtual reality, genomic information, and stem cell studies will gain more importance in the coming years, moving dentistry to a different dimension.

The main activities of SCK/CEN during 1975 are reported in individual summaries. Fields of research are the following: sodium cooled fast reactors, gas cooled reactors, light water reactors, applied nuclear research (including waste disposal, safeguards and fusion research), basic and exploratory research (including materials science, nuclear physics and radiobiology). The BR2 Materials testing reactor and associated facilities are described. The technical and administrative support activities are also presented. A list of publications issued by the SCK/CEN Scientific staff is given

Scientific computing is about developing mathematical models, numerical methods and computer implementations to study and solve real problems in science, engineering, business and even social sciences. Mathematical modelling requires deep understanding of classical numerical methods. This essential guide provides the reader with sufficient foundations in these areas to venture into more advanced texts. The first section of the book presents numEclipse, an open source tool for numerical computing based on the notion of MATLAB®. numEclipse is implemented as a plug-in for Eclipse, a leading integ

The scientific activities and achievements of the Nuclear Research Center Democritus for the year 1979 are presented in the form of a list of 78 projects giving the title, objectives, commencement year and person responsible for each project, the activities carried out, and the pertaining lists of publications. The 15 chapters of this work cover the activities of the main Divisions of the Democritus NRC: Electronics, Biology, Physics, Chemistry, Health Physics, Reactor, Radioisotopes, Environmental Radioactivity, Soil Science, Computer Center, Uranium Exploration, Medical Service, Technological Applications and Training. (T.A.)

Energy communication is a paradigmatic case of scientific communication. It is particularly important today, when the world is confronted with a number of immediate, urgent problems. Science communication has become a real duty and a big challenge for scientists. It serves to create and foster a climate of reciprocal knowledge and trust between science and society, and to establish a good level of interest and enthusiasm for research. For effective communication it is important to establish an open dialogue with the audience, and close collaboration between scientists and science communicators. An international collaboration in energy communication is appropriate to better support international and interdisciplinary research and projects.

Scientific visualization is the visual presentation of numerical data. The National Center for Supercomputing Applications (NCSA) has developed methods for visualizing computer-based simulations of digital imaging data. The applicability of these various tools for unique and potentially medically beneficial display of MR images is investigated. Raw data are obtained from MR images of the brain, neck, spine, and brachial plexus obtained on a 1.5-T imager with multiple pulse sequences. A supercomputer and other mainframe resources run a variety of graphic and imaging programs using these data. An interdisciplinary team of imaging scientists, computer graphic programmers, and physicians works together to obtain useful information

knowledge about imaging, climate analysis, ecology, demographics, industrial economics, and biology. The need for ontology negotiation also arises at the boundaries between scientific programs. For example, a Principal Investigator may want to use information from a previous mission to complement downloads from the instruments currently deployed.

"Massive quantities of data will soon begin flowing from the largest scientific instrument ever built into an international network of computer centers, including one operated jointly by the University of Chicago and Indiana University." (1.5 pages)

Protective Instrumentation (PI) for Nuclear Power Plants (NPP) is a general term for highly reliable instrumentation which provides information for keeping the system within safe limits, for initiation of countermeasures in the case of an incident, or for mitigation of the consequences of an accident. In German NPPs one finds a hierarchical structure of protective instrumentation, wherein the Reactor Protection System (RPS) has the highest priority. To meet the reliability requirements, different design principles are used: redundancy, diversity, fail-safe design and decoupling. The presentation gives an overview of these design principles and characterizes their reliability aspects. As an example of the technical realization, the RPS of a German NPP is discussed in some detail. Furthermore, some information about other types of PI is given, and reliability aspects of the interaction of operating personnel with these systems are mentioned. (orig.)

The Aethalometer is an instrument that provides a real-time readout of the concentration of “Black” or “Elemental” carbon aerosol particles (BC or EC) in an air stream (see Figure 1 and Figure 2). It is a self-contained instrument that measures the rate of change of optical transmission through a spot on a filter where aerosol is being continuously collected, and uses this information to calculate the concentration of optically absorbing material in the sampled air stream. The instrument measures the transmitted light intensities through the “sensing” portion of the filter, on which the aerosol spot is being collected, and through a “reference” portion of the filter as a check on the stability of the optical source. A mass flowmeter monitors the sample air flow rate. The data from these three measurements are used to determine the mean BC content of the air stream.
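The calculation described above can be sketched in a few lines: optical attenuation is derived from the sensing and reference beam intensities, and the mean BC concentration follows from the attenuation change, the filter spot area, and the sampled air volume. This is a minimal sketch; the spot area and attenuation cross-section values below are illustrative assumptions, not constants of any particular instrument.

```python
import math

# Assumed, illustrative constants (not instrument specifications):
SPOT_AREA_CM2 = 0.5   # aerosol collection spot area A
SIGMA_ATN = 16.6      # mass attenuation cross-section (m^2/g)

def attenuation(i_sense: float, i_ref: float) -> float:
    """ATN = -100 * ln(I_sense / I_ref); dividing by the reference
    beam corrects for drift of the optical source."""
    return -100.0 * math.log(i_sense / i_ref)

def bc_concentration(atn_start: float, atn_end: float,
                     dt_s: float, flow_lpm: float) -> float:
    """Mean BC concentration (g/m^3) over one measurement interval,
    from the rate of change of attenuation through the filter spot."""
    d_atn = (atn_end - atn_start) / 100.0       # undo the x100 scaling
    area_m2 = SPOT_AREA_CM2 * 1e-4              # cm^2 -> m^2
    volume_m3 = flow_lpm * dt_s / 60.0 * 1e-3   # sampled air volume
    return (d_atn * area_m2) / (SIGMA_ATN * volume_m3)
```

For example, a 10% drop in transmitted intensity over a 5-minute interval at 5 L/min would, under these assumed constants, correspond to a BC concentration on the order of 10 µg/m³.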

When the Federal Government decided in 1990 on a 25% reduction of CO2 emissions by 2005, it became necessary to develop an instrument for the analysis and assessment of the ecological, economic and energetic impact of different reduction strategies. The development task was awarded by the BMFT to the Research Centre Juelich in cooperation with well-known institutions of energy system research. The complete instrument is scheduled to be finished by the end of 1994. For decentralized use of the instrument by a wide specialist public, the developed models and databases, which are equipped with a user-friendly interface, are suited to larger PCs (486, 16 MB RAM/500-1000 MB ROM). (orig.) [de

The rules laid down by the Romanian Capital Market Law and the regulations put in force for its implementation apply to issuers of financial instruments admitted to trading on the regulated market established in Romania. But the issuers remain companies incorporated under the Company Law of 1990. Such dual regulation needs increased attention in order to observe the legal status of the issuers/companies and of the financial instruments/shares. The Romanian legislator has chosen to implement in the Capital Market Law special rules regarding the administration of issuers of financial instruments, not only rules on admission to and maintenance on a regulated market. Thus, from the perspective of Romanian law, issuers are special companies that must comply with special rules regarding the board of administration and the general shareholders' meeting.

Discussed is the lack of a scientific foundation and scientific evidence favoring astrology. Included are several research studies conducted to examine astrological tenets which yield generally negative results. (Author/DS)

In the past decade, a number of scientific collaboratories have emerged, yet adoption of scientific collaboratories remains limited. Meeting expectations is one factor that influences adoption of innovations, including scientific collaboratories. This paper investigates expectations scientists have...... with respect to scientific collaboratories. Interviews were conducted with 17 scientists who work in a variety of settings and have a range of experience conducting and managing scientific research. Results indicate that scientists expect a collaboratory to: support their strategic plans; facilitate management...... of the scientific process; have a positive or neutral impact on scientific outcomes; provide advantages and disadvantages for scientific task execution; and provide personal conveniences when collaborating across distances. These results both confirm existing knowledge and raise new issues for the design...

Nuclear spectroscopy instruments are important tools for nuclear research and applications. Several types of nuclear spectrometers are being sent to numerous laboratories in developing countries through technical co-operation projects. These are mostly sophisticated systems based on different radiation detectors, analogue and digital circuitry. In most cases, they use microprocessor or computer techniques involving software and hardware. Maintenance service and repair of these systems is a major problem in many developing countries because suppliers do not set up service stations. The Agency's Laboratories at Seibersdorf started conducting group fellowship training on nuclear spectroscopy instrumentation maintenance in 1987. This article describes the training programme

A description of the control and data transfer management system for the scientific instrumentation involved in the GAMMA-400 space project is given. The technical capabilities of all specialized equipment providing the functioning of the scientific instrumentation and satellite support systems are unified in a single structure. Control of the scientific instruments is maintained using one-time pulse radio commands, as well as program commands in the form of 16-bit code words, which are transmitted via the onboard control system and the scientific data acquisition system. Up to 100 GByte of data per day can be transferred to the ground segment of the project. The correctness of the proposed and implemented structure, the engineering solutions and the selection of electronic components has been verified by experimental testing of the prototype of the GAMMA-400 scientific complex in laboratory conditions. (paper)
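As an illustration only, a 16-bit program command word of the kind mentioned above might be packed and unpacked as follows. The field layout (4-bit instrument address, 4-bit opcode, 8-bit parameter) is a hypothetical choice for the sketch; the actual GAMMA-400 command format is not specified in the abstract.

```python
# Hypothetical 16-bit command word: | addr (4) | opcode (4) | param (8) |
def pack_command(addr: int, opcode: int, param: int) -> int:
    """Pack the three assumed fields into one 16-bit code word."""
    assert 0 <= addr < 16 and 0 <= opcode < 16 and 0 <= param < 256
    return (addr << 12) | (opcode << 8) | param

def unpack_command(word: int) -> tuple:
    """Recover (addr, opcode, param) from a 16-bit code word."""
    return (word >> 12) & 0xF, (word >> 8) & 0xF, word & 0xFF
```

The point of the sketch is simply that a fixed-width code word lets one onboard bus carry addressed commands to many instruments; the real partitioning of the 16 bits would be defined by the project's interface documents.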

Discussions of standards in the scientific community have been compared to religious wars for many years. The only things scientists agree on in these battles are either "standards are not useful" or "everyone can benefit from using my standard". Instead of achieving the goal of facilitating interoperable communities, in many cases the standards have served to build yet another barrier between communities. Some important progress towards diminishing these obstacles has been made in the data layer with the merger of the NetCDF and HDF scientific data formats. The universal adoption of XML as the standard for representing metadata and the recent adoption of ISO metadata standards by many groups around the world suggest that a similar convergence is underway in the metadata layer. At the same time, scientists and tools will likely need support for native tongues for some time. I will describe an approach that combines re-usable metadata "components" with RESTful web services that provide those components in many dialects. This approach uses advanced XML concepts of referencing and linking to construct complete records that include reusable components, and builds on the ISO standards as the "unabridged dictionary" that encompasses the content of many other dialects.

This guidebook introduces the reader—the scientific tourist and others—to the visible memorabilia of science and scientists in Budapest—statues, busts, plaques, buildings, and other artefacts. According to the Hungarian–American Nobel laureate Albert Szent-Györgyi, this metropolis at the crossroads of Europe has a special atmosphere of respect for science. It has been the venue of numerous scientific achievements and the cradle, literally, of many individuals who, in Hungary and even more beyond its borders, became world-renowned contributors to science and culture. Six of the eight chapters of the book cover the Hungarian Nobel laureates, the Hungarian Academy of Sciences, the university, the medical school, agricultural sciences, and technology and engineering. One chapter is about selected gimnáziums from which seven Nobel laureates (Szent-Györgyi, de Hevesy, Wigner, Gabor, Harsanyi, Olah, and Kertész) and the five “Martians of Science” (von Kármán, Szilard, Wigner, von Neumann, and Teller...

The International Committee supported the proposal of the Chairman of the XVIII International Linac Conference to issue a new compendium of linear accelerators; the last one was published in 1976. The Local Organizing Committee of Linac96 decided to set up a sub-committee for this purpose. Contrary to the catalogues of high-energy accelerators, which compile accelerators with energies above 1 GeV, we have not defined a specific limit in energy. Microtrons and cyclotrons are not in this compendium, and data from the thousands of medical and industrial linacs have not been collected; only scientific linacs are listed in the present compendium. Every linac involved in a physics context was considered: it could be used, for example, as an injector for high-energy accelerators, or in nuclear physics, materials physics, free-electron lasers or synchrotron light machines. Linear accelerators are developed on three continents only: America, Asia, and Europe, and this geographical distribution is kept as a basis. The compendium contains the parameters and status of scientific linacs. Most of these linacs are operational, but many facilities under construction or in design study are also included. A special mention is made at the end of studies of future linear colliders.

In this essay, the author attempts to enlighten the reader as to the meaning of the term ''verified scientific findings'' in section 13, sub-section 1, sentence 2 of the new Chemicals Control Law. The examples discussed are the generally accepted rules of technology (sections 7a and 18b of the WHG (law on water economy), section 3, sub-section 1 of the machine and engine protection laws), the status of technology (section 3, sub-section 6 of the BImSchG (Federal law on the prevention of air-borne pollution)), and the status of science (section 5, sub-section 2 of the AMG (drug legislation)). The ''status of science and technology'' as defined in sections 4 ff of the Atomic Energy Law (AtomG) and in sections 3, 4, 12 (2) of the First Radiation Protection Ordinance (1. StrlSchVO) is also discussed. The author defines this, in his opinion ''dynamic'', term as the generally recognized result of scientific research together with the respective possibilities of its practical utilization in technology. (orig.) [de

Drilling for scientific purposes is a process of conducting geophysical exploration deep underground and drilling to collect crust samples directly. This is because earth science has reached a good understanding of the top of the crust and has shifted its main interest to the lower layers of the crust in land regions. The on-land drilling plan in Japan has just started; the planned drilling spots are areas around the Minami River and the Hidaka Mts., the Mesozoic and Cenozoic granites in the outer zone, the extension of the Japan Sea, Ogasawara Is., Minami-Tori Is., and active volcanoes. The paper also outlines the present situation of on-land drilling in the world, focusing on the SG-3 super-deep well on the Kola Peninsula, USSR, the SG-1 well in the Azerbaijan SSR, wells in Sweden and Cyprus, the Bavarian deep-well plan in West Germany, and the Salton Sea Scientific Drilling Program in the U.S. At its end, the paper explains the present situation and future themes of Japanese drilling technique and points out the necessity of developing equipment and techniques. (14 figs, 5 tabs, 26 refs)

Virtual sensor test instrumentation is based on the concept of smart sensor technology for testing, with the intelligence needed to perform self-diagnosis of health and to participate in a hierarchy of health determination at the sensor, process, and system levels. A virtual sensor test instrument consists of five elements: (1) a common sensor interface, (2) a microprocessor, (3) a wireless interface, (4) signal conditioning and ADC/DAC (analog-to-digital/digital-to-analog conversion), and (5) onboard EEPROM (electrically erasable programmable read-only memory) for metadata storage and executable software, to create powerful, scalable, reconfigurable, and reliable embedded and distributed test instruments. In order to maximize efficient data conversion through the smart sensor node, plug-and-play functionality is required to interface with traditional sensors to enhance their identity and capabilities for data processing and communications. Virtual sensor test instrumentation can be accessed wirelessly via a Network Capable Application Processor (NCAP) or a Smart Transducer Interface Module (STIM) that may be managed under real-time rule engines for mission-critical applications. The transducer senses the physical quantity being measured and converts it into an electrical signal. The signal is fed to an A/D converter and is ready for use by the processor to execute a functional transformation based on the sensor characteristics stored in a Transducer Electronic Data Sheet (TEDS). Virtual sensor test instrumentation is built upon an open-system architecture with standardized protocol modules/stacks to interface with industry standards and commonly used software. One major benefit of deploying virtual sensor test instrumentation is the ability, through a plug-and-play common interface, to convert raw sensor data in either analog or digital form to an IEEE 1451 standard-based smart sensor, which has instructions to program sensors for a wide variety of
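The TEDS-based transformation described above can be sketched as follows: a raw ADC reading is converted to engineering units using calibration constants stored with the transducer. The field names and the linear calibration are illustrative assumptions for the sketch, not the actual IEEE 1451 TEDS layout.

```python
from dataclasses import dataclass

@dataclass
class Teds:
    """Minimal stand-in for a Transducer Electronic Data Sheet.
    Fields are assumptions chosen for illustration."""
    sensor_id: str
    units: str
    scale: float   # engineering units per ADC count (assumed linear)
    offset: float  # engineering-unit offset

def to_engineering_units(adc_counts: int, teds: Teds) -> float:
    """Functional transformation executed by the processor, using the
    sensor characteristics stored in the TEDS."""
    return adc_counts * teds.scale + teds.offset
```

In a real IEEE 1451 system the TEDS is a binary structure read from the transducer's EEPROM, and the calibration model may be more than a straight line; the sketch only shows why storing the characteristics with the sensor makes the node plug-and-play.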

74 students, including 45 from developing countries, ten lecturers and nine laboratory instructors participated in the novel instrumentation school held in June at the International Centre for Theoretical Physics (ICTP), Trieste, Italy, sponsored by ICTP and arranged through the Instrumentation Panel of the International Committee for Future Accelerators (ICF). During the two weeks of the course, students had the chance to construct and test a proportional chamber, measure the lifetime of cosmic ray muons, operate and analyse the performance of an 8-wire imaging drift chamber, or study noise and signal processing using a silicon photodiode.

As the sensory system for an accelerator, the beam instrumentation provides a tremendous amount of diagnostic information. Access to this information can vary from periodic spot checks by operators to high bandwidth data acquisition during studies. In this paper, example applications will illustrate the requirements on interfaces between the control system and the instrumentation hardware. A survey of the major accelerator facilities will identify the most popular interface standards. The impact of developments such as isochronous protocols and embedded digital signal processing will also be discussed

A spectroelectrochemical instrument has been developed for measuring the total organic carbon (TOC) content of an aqueous solution. Measurements of TOC are frequently performed in environmental, clinical, and industrial settings. Until now, techniques for performing such measurements have included, variously, the use of hazardous reagents, ultraviolet light, or ovens to promote reactions in which the carbon content is oxidized. The instrument now being developed is intended to be a safer, more economical means of oxidizing organic carbon and determining the TOC levels of aqueous solutions, and to provide a low-power, low-mass unit for use in planetary missions.

The Advanced Nuclear Technology Group of the Los Alamos National Laboratory is now using intelligent data-acquisition and analysis instrumentation for determining the multiplication of nuclear material. Earlier instrumentation, such as the large NIM-crate systems, depended on house power and required additional computation to determine multiplication or to estimate error. The portable, battery-powered multiplication measurement unit, with advanced computational power, acquires data, calculates multiplication, and completes error analysis automatically. Thus, the multiplication is determined easily and an available error estimate enables the user to judge the significance of results

NIM is a standard modular instrumentation system that is in wide use throughout the world. As the NIM system developed and accommodations were made to a dynamic instrumentation field and a rapidly advancing technology, additions, revisions and clarifications were made. These were incorporated into the standard in the form of addenda and errata. This standard is a revision of the NIM document, AEC Report TID-20893 (Rev. 4) dated July 1974. It includes all the addenda and errata items that were previously issued as well as numerous additional items to make the standard current with modern technology and manufacturing practice

The rapid development and availability of low-cost technologies have created a wide interest in virtual reality. In the field of computer music, the term “virtual musical instruments” has been used for a long time to describe software simulations, extensions of existing musical instruments......, and ways to control them with new interfaces for musical expression. Virtual reality musical instruments (VRMIs) that include a simulated visual component delivered via a head-mounted display or other forms of immersive visualization have not yet received much attention. In this article, we present a field...

A fire at a nuclear power plant (NPP) has the potential to damage structures, systems, and components important to safety, if not promptly detected and suppressed. At Browns Ferry Nuclear Power Plant on March 22, 1975, a fire in the reactor building damaged electrical power and control systems. Damage to instrumentation cables impeded the function of both normal and standby reactor coolant systems, and degraded the operators' plant monitoring capability. This event resulted in additional NRC involvement with utilities to ensure that NPPs are properly protected from fire as intended by the NRC principal design criteria (i.e., General Design Criterion 3, Fire Protection). Current guidance and methods for both deterministic and performance-based approaches typically make conservative (bounding) assumptions regarding the fire-induced failure modes of instrumentation cables and the effects of those failure modes on component and system response. Numerous fire testing programs have been conducted in the past to evaluate the failure modes and effects of electrical cables exposed to severe thermal conditions. However, that testing has primarily focused on control circuits, with only a limited number of tests performed on instrumentation circuits. In 2001, the Nuclear Energy Institute (NEI) and the Electric Power Research Institute (EPRI) conducted a series of cable fire tests designed to address specific aspects of the cable failure and circuit fault issues of concern [1]. The NRC was invited to observe and participate in that program. The NRC sponsored Sandia National Laboratories to support this participation, who, among other things, added a 4-20 mA instrumentation circuit and instrumentation cabling to six of the tests. Although limited, one insight drawn from those instrumentation circuit tests was that the failure characteristics appeared to depend on the cable insulation material. The results showed that for thermoset insulated cables, the instrument reading tended to drift

This group of figurines, each 0.15 m tall, was unearthed from a Tang Dynasty tomb in Changsha in 1977. Music was highly developed in the Tang Dynasty, and colorful musical instruments and dances were popular both among the people and in the palace. These vivid-looking figurines wear pleated skirts with small sleeves and open chests, a style influenced by the non-Han nationalities living in the north and west of China. Some of the musical instruments were brought from the Western Regions. The figurines are playing the xiao (a vertical bamboo flute), the konghou (an

The following sections are included: * INTRODUCTION * THE IDEOLOGY OF SCIENTIFIC INTERNATIONALISM * THE UNIVERSALITY OF SCIENTIFIC KNOWLEDGE * SCIENCE AS A MACHT-ERSATZ * SCIENCE AS A POLITICAL INSTRUMENT - KULTURPOLITIK * THE ANTI-POLITICAL AND "MANDARIN" IDEOLOGIES * THE SUBORDINATION OF THE INTERESTS OF SCIENCE TO THE INTERESTS OF THE NATION * UNOFFICIAL INTERNATIONAL SCIENTIFIC RELATIONS-PRECONDITIONS FOR INTRANSIGENCE

an appropriate use of the resources of Scientific Realism (in particular, IBE) to achieve platonism? (§2) We argue that just because a variety of different inferential strategies can be employed by Scientific Realists does not mean that ontological conclusions concerning which things we should be Scientific...

The Scientific Committee of the European Food Safety Authority (EFSA) reviewed the use of risk assessment terminology within its Scientific Panels. An external report, commissioned by EFSA, analysed 219 opinions published by the Scientific Committee and Panels to recommend possible ways of improving the expression and communication of risk and/or uncertainties in the selected opinions. The Scientific Committee concluded that risk assessment terminology is not fully harmonised within EFSA. In part this is caused by sectoral legislation defining specific terminology and international standards... The Scientific Committee concludes that particular care must be taken that the principles of CAC, OIE or IPPC are followed strictly. EFSA Scientific Panels should identify which specific approach is most useful in dealing with their individual mandates. The Scientific Committee considered detailed aspects...

Power stations are characterized by a wide variety of mechanical and electrical plant operating with structures, liquids and gases working at high pressures and temperatures and with large mass flows. The voltages and currents are also the highest that occur in most industries. In order to achieve maximum economy, the plant is operated with relatively small margins from conditions that can cause rapid plant damage, safety implications, and very high financial penalties. In common with other process industries, power stations depend heavily on control and instrumentation. These systems have become particularly significant, in the cost-conscious privatized environment, for providing the means to implement the automation implicit in maintaining safety standards, improving generation efficiency and reducing operating manpower costs. This book is for professional instrumentation engineers who need to know about the use of instrumentation in power stations, and for power station engineers requiring information about the principles and choice of the instrumentation available. There are 8 chapters; chapter 4 on instrumentation for nuclear steam supply systems is indexed separately. (Author)

Recognizing that the University Reactor Instrumentation Program was developed in response to widespread needs in the academic community for modernization and improvement of research and training reactors at institutions such as the University of Florida, the items proposed to be supported by this grant over its two-year period were selected as those most likely to reduce forced outages, to meet regulatory concerns expressed in recent years by Nuclear Regulatory Commission inspectors, or to correct other facility problems and limitations. Department of Energy Grant Number DE-FG07-90ER129969 was provided to the University of Florida Training Reactor (UFTR) facility through the US Department of Energy's University Reactor Instrumentation Program. The original proposal, submitted in February 1990, requested support for UFTR facility instrumentation and equipment upgrades for seven items in the amount of $107,530, with $13,800 of this amount to be cost-shared by the University of Florida and $93,730 requested as support from the Department of Energy. A breakdown of the items requested and the total cost for the proposed UFTR facility instrumentation and equipment improvements is presented.

In the October 1994 round of proposals at the ILL, the external biology review sub-committee was asked to allocate neutron beam time to a wide range of experiments, on almost half the total number of scheduled neutron instruments: on 3 diffractometers, on 3 small angle scattering instruments, and on some 6 inelastic scattering spectrometers. In the 3.5 years since the temporary reactor shutdown, the ILL's management structure has been optimized, budgets and staff have been trimmed, the ILL reactor has been re-built, many of the instruments have been upgraded, many powerful (mainly Unix) workstations have been introduced, and the neighboring European Synchrotron Radiation Facility has established itself as the leading synchrotron radiation source and has started its official user program. The ILL reactor remains the world's most intense dedicated neutron source. In this challenging context, it is of interest to review briefly the suite of ILL instruments used to study the structure and energetics of small and large biological systems. A brief summary will be made of each class of experiments actually proposed in the latest ILL proposal round.

The Ozone Monitoring Instrument (OMI) flies on the National Aeronautics and Space Administration's Earth Observing System Aura satellite, launched in July 2004. OMI is an ultraviolet/visible (UV/VIS) nadir solar backscatter spectrometer, which provides nearly global coverage in one day with a spatial

The technical properties of well instruments for radioactive logging used in the radiometric logging complexes PKS-1000-1 (''Sond-1'') and PRKS-2 (''Vitok-2'') are described. The main features of the electric circuit of the measuring channels are given.

SCK-CEN's advanced instrumentation and teleoperation project aims at evaluating the potential of a telerobotic approach in a nuclear environment and, in particular, the use of remote-perception systems. Main achievements in 1997 in the areas of R and D on radiation tolerance for remote sensing, optical fibres and optical-fibre sensors, and computer-aided teleoperation are reported

Deficiencies and desirable improvements can be identified in every technical area in which health physics instruments are employed. The needed improvements cover the full spectrum, including long-term reliability, human factors, accuracy, ruggedness, ease of calibration, improved radiation response, and improved mixed-field response. Some specific areas of deficiency are noted, along with the improvements needed. 17 references

The rapid development and availability of low-cost technologies has created wide interest in virtual reality (VR), but how to design and evaluate multisensory interactions in VR remains a challenge. In this paper, we focus on virtual reality musical instruments and present an overview of our...

Thanks to the development of new technology, musical instruments are no longer tied to their existing acoustic or technical limitations, as almost all parameters can be augmented or modified in real time. An increasing number of composers, performers, and computer programmers have thus become intere...

In the performance of a thermoluminescence dosimetry (TLD) system the equipment plays an important role. Crucial parameters of instrumentation in TLD are discussed in some detail. A review is given of equipment available on the market today - with some emphasis on automation - which is partly based on information from industry and others involved in research and development. (author)

An electrical ionization chamber is described having a self-supporting wall of cellular material which is of uniform areal density and formed of material, such as foamed polystyrene, having an average effective atomic number between about 4 and about 9, and easily replaceable when on the instrument. (auth)

The Integrating Nephelometer (Figure 1) is an instrument that measures aerosol light scattering. It measures aerosol optical scattering properties by detecting (with a wide angular integration – from 7 to 170°) the light scattered by the aerosol and subtracting the light scattered by the carrier gas, the instrument walls and the background noise in the detector (zeroing). Zeroing is typically performed for 5 minutes every day at midnight UTC. The scattered light is split into red (700 nm), green (550 nm), and blue (450 nm) wavelengths and captured by three photomultiplier tubes. The instrument can measure total scatter as well as backscatter only (from 90 to 170°) (Heintzenberg and Charlson 1996; Anderson et al. 1996; Anderson and Ogren 1998; TSI 3563 2015). At ARM (Atmospheric Radiation Measurement), two identical Nephelometers are usually run in series with a sample relative humidity (RH) conditioner between them. This is possible because Nephelometer sampling is non-destructive and the sample can be passed on to another instrument. The sample RH conditioner scans through multiple RH values in cycles, treating the sample. This setup makes it possible to study how the light scattering properties of aerosol particles are affected by humidification (Anderson et al. 1996). For historical reasons, the two Nephelometers in this setup are labeled “wet” and “dry”, with the “dry” Nephelometer usually being the one before the conditioner and sampling ambient air (the names are switched for the MAOS measurement site due to the high RH of the ambient air).
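The zeroing step described above amounts to a per-wavelength baseline subtraction. A minimal sketch (function name and all numeric values are illustrative assumptions, not the ARM or TSI processing software):

```python
import numpy as np

# The three wavelength channels of the nephelometer (nm)
WAVELENGTHS = np.array([450, 550, 700])  # blue, green, red

def aerosol_scattering(total_signal, zero_signal):
    """Subtract the zero baseline (carrier-gas Rayleigh scattering,
    wall scattering, detector background) from the total measured
    signal to isolate the aerosol contribution, channel by channel."""
    return total_signal - zero_signal

# Illustrative per-channel signals (blue, green, red):
total = np.array([55.2, 40.1, 25.3])  # total scatter during sampling
zero = np.array([12.0, 10.5, 8.0])    # baseline from the daily zeroing
print(aerosol_scattering(total, zero))  # aerosol-only scattering per channel
```

The same subtraction applies to both the total-scatter and backscatter-only signals, which is why a clean daily zero is essential to the measurement.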

This is a chapter for a book called the Standard Handbook for Electrical Engineering. Though it is not obvious from the title, the book deals mainly with power engineering. The first chapter (not mine) is about the fundamental quantities used in measurement. This chapter is about the process and the instrumentation.

This document is a user manual for CRISP, one of the two neutron reflectometers at ISIS. CRISP is highly automated, allowing precise, reproducible measurements. The manual provides detailed instructions for the setting-up and running of the instrument and advice on data analysis. (UK)

Most researchers leverage bottom-up suppression to unlock the underlying mechanisms of unconscious processing. However, a top-down approach – for example via hypnotic suggestion – paves the road to experimental innovation and complementary data that afford new scientific insights concerning attention and the unconscious. Drawing from a reliable taxonomy that differentiates subliminal and preconscious processing, we outline how an experimental trajectory that champions top-down suppression techniques, such as those practiced in hypnosis, is uniquely poised to further contextualize and refine our scientific understanding of unconscious processing. Examining subliminal and preconscious methods, we demonstrate how instrumental hypnosis provides a reliable adjunct that supplements contemporary approaches. Specifically, we provide an integrative synthesis of the advantages and shortcomings that accompany a top-down approach to probe the unconscious mind. Our account provides a larger framework for complementing the results from core studies involving prevailing subliminal and preconscious techniques. PMID:25120504

The sample collection technology and instrument concept for the Sample of Comet Coma Earth Return Mission (SOCCER) are described. The scientific goals of this Flyby Sample Return are to return coma dust and volatile samples from a known comet source, which will permit accurate elemental and isotopic measurements for thousands of individual solid particles and volatiles, detailed analysis of the dust structure, morphology, and mineralogy of the intact samples, and identification of the biogenic elements or compounds in the solid and volatile samples. With these intact samples, morphologic, petrographic, and phase structural features can be determined. Information on dust particle size, shape, and density can be ascertained by analyzing penetration holes and tracks in the capture medium. Time and spatial data of dust capture will provide understanding of the flux dynamics of the coma and the jets. Additional information will include the identification of cosmic ray tracks in the cometary grains, which can provide a particle's process history and perhaps even the age of the comet. The measurements will be made with the same equipment used for studying micrometeorites for decades past; hence, the results can be directly compared without extrapolation or modification. The data will provide a powerful and direct technique for comparing the cometary samples with all known types of meteorites and interplanetary dust. This sample collection system will provide the first sample return from a specifically identified primitive body and will allow, for the first time, a direct method of matching meteoritic materials captured on Earth with known parent bodies.