Sample records for hadron accelerator enclosure

The application of hadron accelerators (protons and light ions) in cancer therapy is discussed. After a brief introduction on the rationale for the use of heavy charged particles in radiation therapy, a discussion is given on accelerator technology and beam delivery systems. Next, existing and planned facilities are briefly reviewed. The Italian Hadrontherapy Project is then described in some detail, with reference to both the National Centre for Oncological Hadrontherapy and the design of different types of compact proton accelerators aimed at introducing proton therapy in a large number of hospitals. (author)

The application of hadron accelerators (protons and light ions) in cancer therapy is discussed. After a brief introduction on the rationale for the use of heavy charged particles in radiation therapy, a discussion is given on accelerator technology and beam delivery systems. Next, existing and planned facilities are briefly reviewed. The Italian Hadrontherapy Project (the largest project of this type in Europe) is then described, with reference to both the National Centre for Oncological Hadrontherapy and the design of two types of compact proton accelerators aimed at introducing proton therapy in a large number of hospitals. Finally, the radiation protection requirements are discussed. (author)

Hadron therapy has entered a new age [1]. The number of facilities grows steadily, and 'consumer' interest is high. Some groups are working on new accelerator technology, while others optimize existing designs by reducing capital and operating costs, and improving performance. This paper surveys the current requirements and directions in accelerator technology for hadron therapy.

Polarization hadron experiments at high energies continue to generate surprises. Many questions remain unanswered or unanswerable within the framework of QCD. These include such basic questions as why the polarization analyzing power in pp elastic scattering remains high at high energies, why hyperons are produced with high polarizations, etc. It is, therefore, interesting to investigate the possibilities of accelerating and storing polarized beams in high energy colliders. On the technical side, the recent understanding and confirmation of the action of partial and multiple Siberian snakes has made it possible to contemplate accelerating and storing polarized hadron beams at multi-TeV energies. In this paper, we examine the equipment, the operation and the procedures required to obtain colliding beams of polarized protons at TeV energies.

The formation of electron clouds in accelerators operating with positrons and positively charged ions is a well-known problem. Depending on the parameters of the beam, the electron cloud manifests itself differently. In this thesis the electron cloud phenomenon is studied for the CERN Super Proton Synchrotron (SPS) and Large Hadron Collider (LHC) conditions, and for the heavy-ion synchrotron SIS-100 as a part of the FAIR complex in Darmstadt, Germany. Under the FAIR conditions, extensive use will be made of slow extraction. After acceleration the beam will be debunched and continuously extracted to the experimental area. During this process, residual gas electrons can accumulate in the electric field of the beam. If this accumulation is not prevented, then at some point the beam can become unstable. Under the SPS and LHC conditions the beam is always bunched. The accumulation of the electron cloud happens due to secondary electron emission. At the time when this thesis was being written, the electron cloud was known to limit the maximum intensity of the two machines. During operation with 25 ns bunch spacing, the electron cloud was causing significant beam quality deterioration. At moderate intensities below the instability threshold, the electron cloud was responsible for the bunch energy loss. In the framework of this thesis it was found that the instability thresholds of coasting beams with similar space charge tune shifts, emittances and energies are identical. First-of-their-kind simulations of the effect of Coulomb collisions on the electron cloud density in coasting beams were performed. It was found that for any hadron coasting beam one can choose vacuum conditions that will limit the accumulation of the electron cloud below the instability threshold. We call such conditions the ''good'' vacuum regime. In application to SIS-100, the design pressure of 10^-12 mbar corresponds to the good vacuum regime. The transition to the bad vacuum ...
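The "good vacuum" argument above can be illustrated with a zero-dimensional toy balance (this is not the thesis' simulation model, and every number below is an assumption): electrons are produced by residual-gas ionization at a rate proportional to the pressure and removed with some effective time constant, so the equilibrium electron population scales linearly with pressure and can be pushed below any given instability threshold by pumping hard enough.

```python
# Zero-dimensional toy balance for electron accumulation in a coasting beam
# (illustrative only; all coefficients below are assumed, not measured).
# Production rate is proportional to pressure P; losses have time constant
# tau, so the equilibrium population is n_eq = k_ion * P * tau.
k_ion = 1e18        # electrons per (mbar * s), assumed production coefficient
tau = 1e-3          # s, assumed effective electron loss time
n_threshold = 1e9   # electrons, assumed instability threshold

def n_equilibrium(pressure_mbar):
    """Equilibrium electron population for a given residual-gas pressure."""
    return k_ion * pressure_mbar * tau

print(n_equilibrium(1e-12) < n_threshold)  # True: "good" vacuum regime
print(n_equilibrium(1e-5) < n_threshold)   # False: "bad" vacuum regime
```

The linear scaling is the point: once the threshold and the coefficients are known, the boundary between the two regimes is a single pressure value.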

Possible hadron and photon experiments at a 20 TeV stationary-target proton accelerator have been considered in order to assess the typical limitations and possibilities of experiments in this new energy domain.

"The Large Hadron Collider, the largest atom smasher in the world, broke the record for proton acceleration Monday, sending beams of the particles at 1.18 trillion electron volts, scientists said" (1 paragraph)

The CERN Proton Synchrotron has been fitted with a new trajectory measurement system (TMS). Analogue signals from forty beam position monitors are digitized at 125 MS/s, and then further treated entirely in the digital domain to derive the positions of all individual particle bunches on the fly. Large FPGAs handle all digital processing. The system fits in fourteen plug-in modules distributed over three half-width cPCI crates. Data are stored in circular buffers large enough to keep a few seconds' worth of position data. Multiple clients can then request selected portions of the data, possibly representing many thousands of consecutive turns, for display on operator consoles. The system uses digital phase-locked loops to derive its beam-locked timing reference. Programmable state machines, driven by accelerator timing pulses and information from the accelerator control system, direct the order of operations. The cPCI crates are connected to a standard Linux computer by means of a private Gigabit Ethernet ...
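The circular-buffer pattern described above — a fixed capacity, old turns silently overwritten, clients reading back a turn range — can be sketched in a few lines. This is purely illustrative (the real TMS buffers live in FPGA/cPCI hardware); the class and method names here are assumptions, not the system's API.

```python
from collections import deque

class TurnBuffer:
    """Circular buffer keeping the last `capacity` turns of bunch positions.

    Illustrative sketch only; names and sizes are assumptions.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self._buf = deque(maxlen=capacity)  # oldest turns drop off automatically
        self._first_turn = 0                # turn number of the oldest entry

    def push(self, positions):
        """Store one turn's worth of bunch positions."""
        if len(self._buf) == self.capacity:
            self._first_turn += 1           # oldest turn is being evicted
        self._buf.append(list(positions))

    def read(self, start_turn, n_turns):
        """Return the stored turns overlapping [start_turn, start_turn + n_turns)."""
        lo = max(start_turn, self._first_turn)
        hi = min(start_turn + n_turns, self._first_turn + len(self._buf))
        return [self._buf[t - self._first_turn] for t in range(lo, hi)]

buf = TurnBuffer(capacity=4)
for turn in range(6):                       # six turns into a four-turn buffer
    buf.push([turn, 2 * turn])              # fake positions for two monitors
print(buf.read(0, 10))                      # only turns 2..5 survive
```

A client asking for more history than the buffer holds simply gets the surviving portion, which matches the "request selected portions" behaviour described above.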

Over the past years the understanding and use of coherent interactions of charged particles with ordered crystal lattices has achieved excellent results. Improving the collimation of hadron beams in circular accelerators, such as the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN), is one of the possible applications. The aim of the UA9 experiment is to demonstrate the feasibility of a two-stage collimation system in the CERN SPS: the first stage is a bent crystal oriented for optimal channeling of the incoming halo particles; the second stage is a massive absorber. Two crystals were installed in the LHC last year, and a test of crystal-assisted collimation at the highest energy will be possible as early as 2015. Finally, the UA9 Collaboration is investigating the extraction of particles from a circular accelerator based on bent crystals.

The CERN Accelerator School (CAS) and ESS-Bilbao jointly organised a specialised course on High-Power Hadron Machines, held at the Hotel Barceló Nervión in Bilbao, Spain, from 24 May to 2 June 2011. After recapitulation lectures on the essentials of accelerator physics and review lectures on the different types of accelerators, the programme focussed on the challenges of designing and operating high-power facilities. The particular problems for RF systems, beam instrumentation, vacuum, cryogenics, collimators and beam dumps were examined. Activation of equipment, radioprotection and remote handling issues were also addressed. The school was very successful, with 69 participants of 22 nationalities. Feedback from the participants was extremely positive, praising the expertise and enthusiasm of the lecturers, as well as the high standard and excellent quality of their lectures. In addition to the academic programme, the participants w...

The International Workshop on Hadron Facility Technology was held February 22-27, 1988, at the Study Center at Los Alamos National Laboratory. The program included papers on facility plans, beam dynamics, and accelerator hardware. The parallel sessions were particularly lively with discussions of all facets of kaon factory design. The workshop provided an opportunity for communication among the staff involved in hadron facility planning from all the study groups presently active. The recommendations of the workshop include: the need to use h=1 RF in the compressor ring; the need to minimize foil hits in painting schemes for all rings; the need to consider single Coulomb scattering in injection beam loss calculations; the need to study the effect of field inhomogeneity in the magnets on slow extraction for the 2.2 Tesla main ring of AHF; and agreement in principle with the design proposed for a joint Los Alamos/TRIUMF prototype main ring RF cavity.

The acceleration of ions by intense ultra-short laser pulses is of interest in view of possible applications in proton radiography, the production of medical isotopes, and hadron therapy. The 3D relativistic PIC code LegoLPI has been developed at RFNC-VNIITF for modelling the interaction of intense laser pulses with plasma. LegoLPI simulations were carried out to find the optimal conditions for the generation of proton beams with the parameters necessary for hadron therapy. The simulations show that a two-layer foil of aluminium and polyethylene, with thicknesses of 100 nm and 50 nm respectively, may be optimal. The maximum efficiency of laser energy conversion into 200 MeV protons is achieved by irradiating these foils with a 30 fs laser pulse with an intensity of about 2×10^22 W/cm^2. The conclusion is made that lasers with a peak power of about 0.5-1 PW and an average power of 0.5-1 kW are needed for the generation of proton beams with the parameters necessary for proton therapy.
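The quoted peak-power, pulse-length, and average-power figures can be cross-checked with elementary arithmetic: a 30 fs pulse at 1 PW carries about 30 J, so sustaining 1 kW of average power implies a repetition rate of a few tens of Hz. A sketch of that check (upper ends of the quoted ranges assumed):

```python
# Consistency check of the quoted laser parameters (illustrative arithmetic).
peak_power = 1e15      # W  (1 PW, upper end of the quoted range)
duration = 30e-15      # s  (30 fs pulse)
avg_power = 1e3        # W  (1 kW, upper end of the quoted range)

pulse_energy = peak_power * duration      # ~30 J per pulse
rep_rate = avg_power / pulse_energy       # ~33 Hz to sustain 1 kW average
print(pulse_energy, rep_rate)
```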

Major trends of the physics program related to the study of hadron structure and hadron spectroscopy at the new high current, high duty cycle electron machines are discussed. It is concluded that planned experiments at these machines may have an important impact on our understanding of the strong interaction by studying the internal structure and spectroscopy of the nucleon and lower mass hyperon states.

The third mini-workshop on high intensity, high brightness hadron accelerators was held at Brookhaven National Laboratory on May 7-9, 1997 and had about 30 participants. The workshop focussed on rf and longitudinal dynamics issues relevant to intense and/or bright hadron synchrotrons. A plenary session was followed by four sessions on particular topics. This document contains copies of the viewgraphs used as well as summaries written by the session chairs.

In the second part the technologies of dose delivery are described, emphasising the main challenges of modern radiotherapy, in particular the treatment of moving organs. In this framework the properties of the beams produced by conventional accelerators (cyclotrons and synchrotrons) are compared with those of two novel approaches based on fast-cycling machines, namely FFAGs and cyclinacs.

The laser driven acceleration of ions is considered a promising candidate for an ion source for hadron therapy of oncological diseases. Though proton and carbon ion sources are conventionally used for therapy, other light ions can also be utilized. Whereas carbon ions require 400 MeV per nucleon to reach the same penetration depth as 250 MeV protons, helium ions require only 250 MeV per nucleon, which is the lowest energy per nucleon among the light ions (heavier than protons). This fact, along with the larger biological damage to cancer cells caused by helium ions compared with protons, makes this species an interesting candidate for a laser driven ion source. Two mechanisms (magnetic vortex acceleration and hole-boring radiation pressure acceleration) of PW-class laser driven ion acceleration from liquid and gaseous helium targets are studied with the goal of producing 250 MeV per nucleon helium ion beams that meet the hadron therapy requirements. We show that He^{3} ions, having almost the same penetration depth as He^{4} with the same energy per nucleon, require less laser power to be accelerated to the energy required for hadron therapy.

In anticipation of the construction of an accelerator-based laboratory, one of whose applications is radiotherapy of cancer patients, at the Research and Development Center for Advanced Technology of the National Nuclear Energy Agency, Yogyakarta, during the forthcoming Seventh Five-Year Development Plan (Repelita VII), it is considered important to perform a study of its commercial prospects. It is found, through calculations based on the available data and realistic assumptions, that patients from neighboring countries are needed to make the operation of the radiotherapy facility effective and efficient. (author)

A component which suffers radiation damage usually also becomes radioactive, since the source of activation and radiation damage is the interaction of the material with particles from an accelerator or with reaction products. However, the underlying mechanisms of the two phenomena are different. These mechanisms are described here. Activation and radiation damage can have far-reaching consequences. Components such as targets, collimators, and beam dumps are the first candidates for failure as a result of radiation damage. This means that they have to be replaced or repaired. This takes time, during which personnel accumulate dose. If the dose to personnel would otherwise exceed permitted limits, remote handling becomes necessary. The remaining material has to be disposed of as radioactive waste, for which an elaborate procedure acceptable to the authorities is required. One of the requirements of the authorities is a complete nuclide inventory. The methods used for calculation of such inventories are presented,...

Nonscaling fixed field alternating gradient (FFAG) rings for cancer hadron therapy offer reduced physical aperture and large dynamic aperture as compared to scaling FFAGs. The variation of tune with energy implies the crossing of resonances during acceleration. Our design avoids intrinsic resonances, although imperfection resonances must be crossed. We consider a system of three nonscaling FFAG rings for cancer therapy with 250 MeV protons and 400 MeV/u carbon ions. Hadrons are accelerated in a common radio frequency quadrupole and linear accelerator, and injected into the FFAG rings at v/c=0.1294. H^{+}/C^{6+} ions are accelerated in the two smaller/larger rings to 31 and 250 MeV/68.8 and 400 MeV/u kinetic energy, respectively. The lattices consist of doublet cells with a straight section for rf cavities. The gantry with triplet cells accepts the whole required momentum range at fixed field. This unique design uses either high-temperature superconductors or superconducting magnets, reducing gantry magnet size and weight. Elements with a variable field at the beginning and at the end set the extracted beam at the correct position for a range of energies.
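As a quick consistency check on the quoted numbers, the standard relativistic relation between kinetic energy and velocity can be evaluated directly (a sketch; masses in MeV/c^2). The quoted injection velocity v/c = 0.1294 corresponds to roughly 7.9 MeV per nucleon:

```python
import math

M_U = 931.494   # atomic mass unit in MeV/c^2
M_P = 938.272   # proton mass in MeV/c^2

def beta(T, m):
    """v/c of a particle with kinetic energy T and rest mass m (same units)."""
    gamma = 1.0 + T / m
    return math.sqrt(1.0 - 1.0 / gamma**2)

def kinetic_energy(b, m):
    """Kinetic energy giving velocity v/c = b for rest mass m."""
    return m * (1.0 / math.sqrt(1.0 - b * b) - 1.0)

print(kinetic_energy(0.1294, M_U))  # ~7.9 MeV per nucleon at the quoted v/c
print(beta(250.0, M_P))             # ~0.61 for a 250 MeV proton
```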

To calculate observables for heavy ion collisions, many events with fluctuating initial conditions are simulated and statistically analyzed. A larger set of simulated collision events yields better statistical results, so decreasing the computation time per event is important. A modern GPU possesses thousands of cores and can efficiently perform identical tasks in parallel. We take advantage of this for performing the Cooper-Frye integrals for the hadron spectra obtained from the numerical output of dissipative hydrodynamic simulations. For a given event, this computation consists of two parts: (1) generating thermal spectra of all hadron resonances as Cooper-Frye integrals over the freeze-out surface, and (2) computing the spectra of stable hadrons by letting unstable resonances decay. We here show results using input from (2+1)-dimensional boost-invariant hydrodynamic simulations where both of these steps were accelerated by parallelizing them on a GPU. The GPU implementation yields a speed-up of about two and one orders of magnitude, respectively, for the first and second of these steps. For semi-central Pb+Pb collisions at the LHC, the time needed for the first step is reduced from 31 minutes on a single CPU to 16 seconds on the GPU, and for the second step from 4 minutes to 20 seconds. This research was funded by the National Science Foundation through the JETSCAPE collaboration.
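The structure that makes step (1) GPU-friendly is that every (surface-cell, momentum-bin) term of the Cooper-Frye sum is independent. A heavily simplified sketch of that structure (NumPy broadcasting as a stand-in for the GPU kernel; Boltzmann approximation, one species, schematic random surface data — none of this is the paper's actual code or physics input):

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells, n_p = 1000, 50          # freeze-out surface cells, momentum bins
T = 0.145                        # freeze-out temperature in GeV (assumed)
m = 0.138                        # pion mass in GeV (one species for brevity)
pT = np.linspace(0.1, 3.0, n_p)
mT = np.sqrt(m**2 + pT**2)       # transverse mass per momentum bin

# Schematic surface data: time components of the normals and cell Lorentz factors.
dsigma_t = rng.uniform(0.5, 1.0, n_cells)
u_t = 1.0 + rng.uniform(0.0, 0.3, n_cells)

# Boltzmann-approximation Cooper-Frye-like terms: each (cell, bin) entry is
# independent, so one broadcasted outer product replaces the double loop --
# exactly the shape of work that maps onto one GPU thread per entry.
terms = dsigma_t[:, None] * mT[None, :] * np.exp(-mT[None, :] * u_t[:, None] / T)
spectrum = terms.sum(axis=0)     # reduce over surface cells per momentum bin
print(spectrum.shape)            # one spectrum value per momentum bin
```

The real computation integrates the full quantum distribution with the complete p·dσ contraction over all resonances, but the parallelization pattern is the same.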

Hadron therapy was first proposed in 1946 and is by now widespread throughout the world, as witnessed by the design and construction of the CNAO, HIT, PROSCAN and MedAustron treatment centres, among others. The clinical interest in hadron therapy lies in the fact that it delivers precision treatment of tumours, exploiting the characteristic shape (the Bragg peak) of the energy deposition in tissue for charged hadrons. In particular, carbon ion therapy is found to be biologically more effective than protons on certain types of tumours. Following an approach tested at NIRS in Japan [1], carbon ion therapy treatments based on 12C could be combined with or fully replaced by 11C PET radioactive ions post-accelerated to the same energy. This approach provides a beam for treatment and, at the same time, collects information on the 3D distribution of the implanted ions by PET imaging. The production of 11C ion beams can be performed using two methods. The first is based on production with compact PET cyclotrons using 10-20 MeV protons via the 14N(p,α)11C reaction, following an approach developed at the Lawrence Berkeley National Laboratory [2]. A second route exploits the spallation reactions 19F(p,X)11C and 23Na(p,X)11C on a molten fluoride salt target using the ISOL (isotope separation on-line) technique [3]. This approach can be seriously envisaged at CERN-ISOLDE following recent progress on 11C+ production [4] and the proven post-acceleration of pure 10C3/6+ beams in the REX-ISOLDE linac [5]. Some of the required components are already operational in radioactive ion beam facilities or commercial medical PET cyclotrons. The driver could be a 70 MeV, 1.2 mA commercial proton cyclotron, which would lead to 8.1 × 10^7 11C6+ ions per spill. This intensity is appropriate for using 11C ions alone for both imaging and treatment. Here we report on the ongoing feasibility studies of such an approach, using the Monte Carlo particle transport code FLUKA [6,7] to simulate ...

The Large Hadron Collider (LHC) is the largest accelerator in the world. It is designed to collide two proton beams with an unprecedented particle energy of 7 TeV. The energy stored in each beam is 362 MJ, sufficient to melt 500 kg of copper. An accidental release of even a small fraction of the beam energy can result in severe damage to the equipment. Machine protection systems are essential to safely operate the accelerator and handle all possible accidents. This thesis deals with the study of different failure scenarios and their possible consequences. It addresses failure scenarios ranging from low intensity losses on high-Z materials and superconductors to high intensity losses on carbon and copper collimators. Low beam losses are sufficient to quench the superconducting magnets and the stabilized superconducting cables (bus-bars) that connect the main magnets. If this occurs and the energy from the bus-bar is not extracted fast enough, it can lead to a situation similar to the accident in 2008 at the LHC during pow...
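The "melt 500 kg of copper" figure can be checked with a back-of-envelope estimate using handbook values for copper and a room-temperature start (both assumptions, and the result is only order-of-magnitude):

```python
# Energy to heat copper from ~300 K to its melting point (1358 K) and melt it.
c_p = 385.0          # specific heat of copper, J/(kg K), approximate
dT = 1358 - 300      # temperature rise to the melting point, K
L_f = 2.05e5         # latent heat of fusion of copper, J/kg, approximate

E_per_kg = c_p * dT + L_f          # ~6.1e5 J per kg
beam_energy = 362e6                # stored energy per beam, J (from the text)
melted_kg = beam_energy / E_per_kg
print(melted_kg)                   # ~590 kg, the same order as the quoted 500 kg
```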

The objective is to investigate whether existing technology might be extrapolated to provide the conceptual framework for a major hadron-hadron collider facility for high energy physics experimentation for the remainder of this century. One contribution to this large effort is to formalize the methods and mathematical tools necessary. In this report, the main purpose is to introduce the student to basic design procedures. From these follow the fundamental characteristics of the facility: its performance capability, its size, and the nature and operating requirements on the accelerator components, and with this knowledge, we can determine the technology and resources needed to build the new facility

The International Workshop on Hadron Facility Technology was held February 20--25, 1989, at the Study Center at Los Alamos National Laboratory. This volume (the first of two) includes papers on architecture, beam diagnostics, compressors, and linacs. Participants included groups from AHF, Brookhaven National Laboratory, the European Hadron Facility, Fermilab, and the Moscow Meson Factory. The workshop was well attended by members of the Los Alamos staff. The interchange of information and the opportunity for criticism by peers were important to all who attended.

The International Workshop on Hadron Facility Technology was held February 20--25, 1989, at the Study Center at Los Alamos National Laboratory. This volume (the second of two) includes papers on computer controls, polarized beam, rf, magnets and power supplies, experimental areas, and instabilities. Participants included groups from AHF, Brookhaven National Laboratory, the European Hadron Facility, Fermilab, and the Moscow Meson Factory. The workshop was well attended by members of the Los Alamos staff. The interchange of information and the opportunity for criticism by peers were important to all who attended.

An isolation enclosure and a group of isolation enclosures are described which are useful when a relatively large containment area is required. The enclosure is in the form of a ring having a section removed so that a technician may enter the center area of the ring. In a preferred embodiment, an access zone is located in the transparent wall of the enclosure and extends around the inner perimeter of the ring so that a technician can insert his hands into the enclosure to reach any point within. The inventive enclosures provide more containment area per unit area of floor space than conventional material isolation enclosures. 3 figures

In this thesis the beam and spin dynamics of ring accelerators are described. After a general theoretical treatment, methods for beam optimization and polarization conservation are discussed. Then experiments on spin manipulation at the COSY facility are considered. Finally, the beam simulation and accelerator layout for the HESR with regard to the FAIR experiment are described. (HSI)

This diploma thesis describes the calibration of the hadron calorimeter Tilecal with muon and electron beams. In the first chapter, some calorimetry concepts and basic variables are mentioned or defined. In the second chapter, a detailed description of Tilecal is given; special attention was paid to providing up-to-date information (written in April 2003). In this chapter, the Tilecal calibration systems and data-taking during test beams at the CERN laboratory in summer 2002 are described. In the third chapter, results of data analyses of muon theta=90 deg and eta-projective runs taken during the June, July and August 2002 test-beam periods are given. Results of analyses of the calibration with electron beams measured in August 2002 are shown as well. It is also shown that the results of the analyses mentioned above are important for the calorimeter calibration for the ATLAS detector and also for checking the status of calibrated calorimeter modules.

Cancer accelerator therapy continues to become ever more prevalent, with new facilities being constructed at a rapid rate. Some of these facilities are synchrotrons, but many are cyclotrons and, of these, a number are FFAG cyclotrons. The therapy method of "spot scanning" requires many pulses per second (typically 200 Hz), which can be accomplished with a cyclotron (in contrast with a synchrotron). We briefly review commercial scaling FFAG machines and then discuss recent work on non-scaling FFAGs, which may offer the possibility of reduced physical aperture and a large dynamic aperture. However, a variation of tune with energy implies the crossing of resonances during the acceleration process. A design can be developed so as to avoid intrinsic resonances, although imperfection resonances must still be crossed. Parameters of two machines are presented: a 250 MeV proton therapy accelerator and a 400 MeV carbon therapy machine.

The particle-core model for a continuous cylindrical beam is used to describe the motion of single particles oscillating in a uniform linear focusing channel. Using a random variation of the focusing forces, the model is deployed as proof of principle for the occurrence of large single particle radii without the presence of initial mismatch of the beam core. Multiparticle simulations of a periodic 3D transport channel are then used to qualify and quantify the effects in a realistic accelerator lattice.
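A toy illustration of the underlying mechanism (this is not the paper's particle-core model, and all numbers are illustrative): in a thin-lens channel whose focusing strength varies randomly from cell to cell, a particle launched with no mismatch is no longer confined to a fixed invariant ellipse, so its phase-space radius can exceed its initial value.

```python
import math, random

random.seed(1)
L = 1.0                       # drift length per cell (arbitrary units)
k0, dk = 0.3, 0.1             # nominal thin-lens strength and random spread
                              # (per-cell map stays linearly stable: 0 < k < 4/L)

x, p = 1.0, 0.0               # launched with no mismatch in angle
rmax = 0.0
for _ in range(5000):
    k = k0 + random.uniform(-dk, dk)
    x = x + L * p             # drift
    p = p - k * x             # randomly perturbed thin-lens kick
    rmax = max(rmax, math.hypot(x, p))
print(rmax)                   # exceeds the initial phase-space radius of 1.0
```

How far the radius actually grows depends on the jitter statistics; the point of the sketch is only that the perfectly periodic channel's invariant is broken.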

Beam losses are responsible for material activation in most of the components of particle accelerators. The activation is caused by several nuclear processes and varies with the irradiation history and the characteristics of the material (namely chemical composition and size). Once at the end of their operational lifetime, these materials require radiological characterization. The radionuclide inventory depends on the particle spectrum, the irradiation history and the chemical composition of the material. As long as these factors are known and the material cross-sections are available, the induced radioactivity can be calculated analytically. However, these factors vary widely among different items of waste and sometimes they are only partially known. The European Laboratory for Particle Physics (CERN, Geneva) has been operating accelerators for high-energy physics for 50 years. Different methods for the evaluation of the radionuclide inventory are currently under investigation at CERN, including the so-called "fingerprint method". This paper provides a mathematical formulation of the fingerprint method highlighting its advantages and limits of validity. The study includes the application to a real case and the validation of the predictions.
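For a single nuclide produced at constant flux, the analytic calculation mentioned above reduces to the textbook buildup-and-decay formula A = φσN(1 − e^(−λ t_irr)) e^(−λ t_cool). A minimal sketch (illustrative values only; real inventories sum many nuclides, reaction channels, and irradiation periods — this is not CERN's fingerprint method itself):

```python
import math

def induced_activity(phi, sigma_cm2, n_atoms, half_life_s, t_irr_s, t_cool_s):
    """Induced activity (Bq) of a single nuclide: constant flux phi
    (particles/cm^2/s), production cross-section sigma (cm^2), n_atoms target
    atoms; buildup toward saturation during irradiation, then free decay
    during cooling. Single-nuclide textbook formula, for illustration only."""
    lam = math.log(2) / half_life_s
    production = phi * sigma_cm2 * n_atoms          # production rate (1/s)
    buildup = 1.0 - math.exp(-lam * t_irr_s)        # saturation factor
    return production * buildup * math.exp(-lam * t_cool_s)

# Example with assumed numbers (none of these come from the text):
year = 3.156e7                                      # seconds per year
A = induced_activity(phi=1e6, sigma_cm2=1e-27, n_atoms=1e23,
                     half_life_s=5.27 * year,       # e.g. a Co-60-like nuclide
                     t_irr_s=10 * year, t_cool_s=2 * year)
print(A)  # activity in Bq after 10 y of irradiation and 2 y of cooling
```

The difficulty the record describes is precisely that φ, σ, and the material composition are often only partially known for real waste items, which is what motivates approximate methods like the fingerprint approach.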

With a few exceptions, all on-axis injection and extraction schemes implemented in circular particle accelerators, synchrotrons, and storage rings, make use of magnetic and electrostatic septa with systems of slow-pulsing dipoles acting on tens of thousands of turns and fast-pulsing dipoles on just a few. The dipoles create a closed orbit deformation around the septa, usually referred to as an orbit bump. A new approach is presented which obviates the need for the septum deflectors. Fast-pulsing elements are still required, but their strength can be minimized by choosing appropriate local accelerator optics. This technique should increase the beam clearance and reduce the usually high radiation levels found around the septa and also reduce the machine impedance introduced by the fast-pulsing dipoles. The basis of the technique is the creation of stable islands around stable fixed points in horizontal phase space. The trajectories of these islands may then be adjusted to match the position and angle of the incoming or outgoing beam.
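The stable islands invoked here can be illustrated with the simplest nonlinear one-turn map: a linear rotation plus a thin sextupole kick (the Hénon map). This is a generic toy model, not the optics of the paper; near a low-order resonance, small-amplitude motion stays bounded while large-amplitude particles escape:

```python
import math

def henon_map(x, p, tune):
    """One-turn map: rotation by 2*pi*tune plus a thin sextupole kick.
    Stable islands form around fixed points near low-order resonances."""
    mu = 2 * math.pi * tune
    c, s = math.cos(mu), math.sin(mu)
    p_kicked = p + x * x          # thin sextupole kick
    return x * c + p_kicked * s, -x * s + p_kicked * c

def track_turns(x0, p0, tune, n_turns=1000):
    """Track a particle; return None if it is lost (amplitude blows up)."""
    x, p = x0, p0
    for _ in range(n_turns):
        x, p = henon_map(x, p, tune)
        if x * x + p * p > 1e6:   # particle lost
            return None
    return x, p

# Near the 1/3 resonance (tune 0.31): small amplitudes survive, large ones
# are driven out by the resonance.
```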

In this thesis a beam-based method has been developed to measure the strength and the polarity of corrector magnets (skew quadrupoles and sextupoles) in circular accelerators. The algorithm is based on the harmonic analysis (via FFT) of beam position monitor (BPM) data taken turn by turn from an accelerator in operation. It has been shown that, from the differences of the spectral line amplitudes between two consecutive BPMs, both the strength and the polarity of non-linear elements placed in between can be measured. The method has been successfully tested using existing BPM data from the SPS at CERN. A second beam-based method has been studied for a fast measurement and correction of betatron coupling driven by skew quadrupole field errors and tilted focusing quadrupoles. In this thesis it has been shown how the correction for minimizing the coupling stop band C can be performed in a single machine cycle from the harmonic analysis of multi-BPM data. The method has been successfully applied to RHIC. A third theoretical achievement is a new description of the betatron motion close to the difference resonance in the presence of linear coupling. New formulae describing the exchange of the RMS emittances have been derived here making use of Lie algebra, providing a better description of the emittance behavior. A new way to decouple the equations of motion and explicit expressions for the individual single-particle invariants have been found. For the first time emittance exchange studies have been carried out in the SIS-18 at GSI. Applications of this manipulation are: emittance equilibration, under consideration for future operations of the SIS-18 as booster for the SIS-100; and emittance transfer during multi-turn injection to improve the efficiency and to protect the injection septum in high-intensity operations, by shifting part of the horizontal emittance into the vertical plane. Multi-particle simulations with a 2D PIC space-charge solver have been run to infer heuristic scaling
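The first step of such a harmonic analysis, locating the betatron tune line in turn-by-turn BPM data, can be sketched as follows (synthetic signal; a real analysis would interpolate the peak and compare line amplitudes between consecutive BPMs):

```python
import numpy as np

def fractional_tune(bpm_signal):
    """Fractional betatron tune from turn-by-turn BPM data: position of the
    largest line in the FFT magnitude spectrum (DC component removed)."""
    spectrum = np.abs(np.fft.rfft(bpm_signal - np.mean(bpm_signal)))
    return int(np.argmax(spectrum)) / len(bpm_signal)

# Synthetic turn-by-turn signal with an assumed tune of 0.31:
turns = np.arange(1024)
signal = 2.0 * np.cos(2 * np.pi * 0.31 * turns + 0.4)
```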

Final Report Abstract for DE-FG02-99ER4110, May 15, 2011 - October 15, 2014. There is a synergy between the fields of Beam Dynamics (BD) in modern particle accelerators and Applied Mathematics (AMa). We have formulated significant problems in BD and have developed and applied tools within the contexts of dynamical systems, topological methods, numerical analysis and scientific computing, probability and stochastic processes, and mathematical statistics. We summarize the three main areas of our AMa work since 2011. First, we continued our study of Vlasov-Maxwell systems. Previously, we developed a state-of-the-art algorithm and code (VM3@A) to calculate coherent synchrotron radiation in single-pass systems. In this cycle we carefully analyzed the major expense, namely the integral-over-history (IOH), and developed two approaches to speed up integration. The first strategy uses a representation of the Bessel function J0 in terms of exponentials. The second relies on "local sequences" developed recently for radiation boundary conditions, which are used to reduce computational domains. Although motivated by practicality, both strategies involve interesting and rather deep analysis and approximation theory. As an alternative to VM3@A, we are integrating Maxwell's equations by a time-stepping method, bypassing the IOH, using a Discontinuous Galerkin (DG) method. DG is a generalization of Finite Element and Finite Volume methods. It is spectrally convergent, unlike the commonly used Finite Difference methods, and can handle complicated vacuum chamber geometries. We have applied this in several contexts and have obtained very nice results, including an explanation of an experiment at the Canadian Light Source, where the geometry is quite complex. Second, we continued our study of spin dynamics in storage rings. There is much current and proposed activity where spin-polarized beams are being used in testing the Standard Model and its modifications. Our work has focused

The European Hadron Facility (EHF) is a project for particle and nuclear physics in the 1990s which would consist of a fast-cycling, high-intensity proton synchrotron of about 30 GeV primary energy, providing a varied spectrum of intense, high-quality secondary beams (polarized protons, pions, muons, kaons, antiprotons, neutrinos). The physics case of this project has been studied over the last two years by a European group of particle and nuclear physicists (EHF Study Group), whilst the conceptual design for the accelerator complex was worked out (and is still being worked on) by an international group of machine experts (EHF Design Study Group). Both aspects have been discussed in recent years in a series of working parties, topical seminars, and workshops held in Freiburg, Trieste, Heidelberg, Karlsruhe, Les Rasses and Villigen. This long series of meetings culminated in the International Conference on a European Hadron Facility held in Mainz from 10-14 March

The NICA collider project at the Joint Institute for Nuclear Research in Dubna will have the capability of colliding protons, polarized deuterons, and nuclei at an effective nucleon-nucleon center-of-mass energy in the range √s_NN = 4 to 11 GeV. I briefly survey a number of novel hadron physics processes which can be investigated at the NICA collider. The topics include the formation of exotic heavy-quark resonances near the charm and bottom thresholds; intrinsic strangeness, charm, and bottom phenomena; hidden-color degrees of freedom in nuclei; color transparency; single-spin asymmetries; the RHIC baryon anomaly; and non-universal antishadowing.

The proceedings contain invited lectures and papers presented at the symposium. Attention was devoted to hadron interactions at high energy in QCD, to the structure and decay of hadrons, to the production of hadrons and supersymmetric particles in e⁺e⁻ and ep collisions, to perturbation theory in quantum field theory, and to new supersymmetric extensions of relativistic algebra. (Z.J.)

Installation of the final component of the Large Hadron Collider particle accelerator is under way along the Franco-Swiss border near Geneva, Switzerland. When completed this summer, the LHC will be the world's largest and most complex scientific instrument.

The Hydrogen Safety Panel brings a broad cross-section of expertise from the industrial, government, and academic sectors to help advise the U.S. Department of Energy’s (DOE) Fuel Cell Technologies Office through its work in hydrogen safety, codes, and standards. The Panel’s initiatives in reviewing safety plans, conducting safety evaluations, identifying safety-related technical data gaps, and supporting safety knowledge tools and databases cover the gamut from research and development to demonstration and deployment. The Panel’s recent work has focused on the safe deployment of hydrogen and fuel cell systems in support of DOE efforts to accelerate fuel cell commercialization in early market applications: vehicle refueling, material handling equipment, backup power for warehouses and telecommunication sites, and portable power devices. This paper resulted from observations and considerations stemming from the Panel’s work on early market applications. This paper focuses on hydrogen system components that are installed in outdoor enclosures. These enclosures might alternatively be called “cabinets,” but for simplicity, they are all referred to as “enclosures” in this paper. These enclosures can provide a space where a flammable mixture of hydrogen and air might accumulate, creating the potential for a fire or explosion should an ignition occur. If the enclosure is large enough for a person to enter, and ventilation is inadequate, the hydrogen concentration could be high enough to asphyxiate a person who entered the space. Manufacturers, users, and government authorities rely on requirements described in codes to guide safe design and installation of such systems. Except for small enclosures used for hydrogen gas cylinders (gas cabinets), fuel cell power systems, and the enclosures that most people would describe as buildings, there are no hydrogen safety requirements for these enclosures, leaving gaps that must be addressed. This paper proposes that

These Proceedings contain the contributions to the Workshop HADRONS-94, held in Uzhgorod on September 7-11, 1994. They cover the following topics: elastic and diffractive scattering of hadrons and nuclei; small-x and spin physics; meson and baryon spectroscopy; dual and string models; and collective properties of strongly interacting matter.

Spectra of hadrons show varied and complex structures due to the strong coupling constant of quantum chromodynamics (QCD), their fundamental theory. For their understanding, two parameters play important roles: (1) the quark masses and (2) their excitation energies. At low energies, for example, rather simple structures similar to positronium appear for heavy quarks such as charm and bottom. It has, however, been strongly suggested by recent experiments that molecular resonant states show up when the threshold for decay into mesons is exceeded. On the other hand, chiral symmetry and its breaking play important roles in the dynamics of light quarks. Strange quarks lie in between and show special behavior. In the present lecture, the fundamental concepts of hadron spectroscopy based on QCD are expounded to illustrate the present understanding and open problems of hadron spectroscopy. The sections are: 1. Introduction; 2. Fundamental concepts (hadrons, quarks and QCD); 3. Quark models and exotic hadrons; 4. Lattice QCD and QCD sum rules. For sections 1 to 3, only an outline of the concepts is given because of the limited space. Exotic hadrons, many-quark pictures of light hadrons and the number of quarks in hadrons are described briefly. (S. Funahashi)

The VLT enclosures' main functions are to protect the telescopes during operational as well as non-operational phases from any adverse weather conditions and to provide optimal conditions for observation. An adequate design of the ventilation and wind protection system is important for the performance of the enclosures with respect to the minimization of the corresponding seeing effects. The VLT enclosures are equipped with ventilation doors on the azimuth platform level, with louvers on the rotating part and with a windscreen at the observing slit. Extensive qualification tests of the louver and windscreen mechanical assemblies were performed during the enclosures' development phase. This paper gives an overview of the general layout of the enclosures and the major subsystems, summarizes the main functional specifications and gives the main results and conclusions of the functional performance tests. Presently the erection of the first enclosure is nearing completion and pre-commissioning of all systems will commence. The status of the site erection of the enclosures is presented, together with the planning for the next phases of the erection.

The COmmon Muon and Proton Apparatus for Structure and Spectroscopy (COMPASS) is a fixed-target experiment at the CERN SPS accelerator. In the past two years hadron spectroscopy was brought into focus. A huge amount of data was taken, using hadronic beams at a momentum of 190 GeV/c impinging on hydrogen, lead, nickel and tungsten targets. The primary goal of the hadron programme is the study of resonance production by diffractive scattering, central production and photon exchange. To bring clarity to the intriguing question of the existence of exotic states, such as glueballs and hybrids, the analysis of several channels has been started. We present here a selective overview of the current status.

In the framework of the 2006 experimental benchmark organized at GSI (Darmstadt, Germany) by the EC CONRAD network, a neutron dosimetry intercomparison was performed in a workplace field around a carbon target hit by 400 MeV/u ¹²C ions. The radiation protection group of the INFN-LNF participated in the intercomparison with a Bonner sphere spectrometer equipped with an active ⁶LiI(Eu) scintillator and a set of passive detectors, namely MCP-6s (80 mg cm⁻²)/MCP-7 TLD pairs from TLD Poland. Both active and passive spectrometers, independently tested and calibrated, were used to determine the field and dosimetric quantities at the measurement point. The FRUIT unfolding code, developed by the INFN-LNF radiation protection group, was used to unfold the raw BSS data. This paper compares the results of the active and passive spectrometers, obtaining a satisfactory agreement in terms of both spectrum shape and the values of the integral quantities, such as the neutron fluence and the ambient dose equivalent. These results qualify the BSS based on TLD pairs as a reliable passive method to be used around high-energy particle accelerators, even in low-dose-rate areas. This is particularly useful in those workplaces where active instruments could be disturbed by the presence of pulsed fields, large photon fluence or electromagnetic noise.

The Japanese Hadron Project (JHP) is aimed at producing various kinds of unstable secondary beams based on high-intensity protons from a new accelerator complex. The 1 GeV protons, first produced from a 1 GeV linac, are transferred to a compressor/stretcher ring, where a sharply-pulsed beam or a stretched continuous beam will be produced. The pulsed beam will be used for a pulsed muon source (M arena) and a spallation neutron source (N arena). A part of the proton beam will be used to produce unstable nuclei, which will be accelerated to several MeV/nucleon (E arena). The purpose and impact of JHP will be described in view of future applications of hadronic beams to nuclear energy and material science. (author)

What really happened during the Big Bang? Why did matter form? Why do particles have mass? To answer these questions, scientists and engineers have worked together to build the largest and most powerful particle accelerator in the world: the Large Hadron Collider. Includes glossary, websites, and bibliography for further reading. Perfect for STEM connections. Aligns to the Common Core State Standards for Language Arts. Teachers' Notes available online.

The main concepts used at present in hadron physics are explained. Hadrons are defined as those particles which take part in strong interactions; they consist of two classes of particles: baryons (fermions) and mesons (bosons). The study of the dynamics of hadrons is approached by resorting to symmetry groups. In this approach, the isospin group SU(2), the hypercharge Y, strangeness S and SU(3) are explained. All the known hadrons are listed with their characteristic quantum numbers. The Gell-Mann-Nishijima relation is derived. The Eightfold Way, i.e. the irreducible representations of the SU(3) group, is explained. The concept of quarks is introduced to explain the observed hadronic states and the symmetry breakdown in SU(3). The observation of psi particles, i.e. the resonances around 3.1 and 3.7 GeV, and the modifications introduced in the SU(3) scheme are dealt with. The paracharge Z and SU(4) are explained. The psi(3.7) is assigned to a 16-plet in the family of vector mesons. The proposed models based on the 'charm' and 'colour' schemes are discussed. (A.K.)

Free convective condensation in a vertical enclosure was studied numerically and the results were compared with experiments. In both the numerical and experimental investigations, mist formation was observed to occur near the cooling wall, with significant droplet concentrations in the bulk. Large recirculation cells near the end of the condensing section were generated as the heavy noncondensing gas collecting near the cooling wall was accelerated downward. Near the top of the enclosure the recirculation cells became weaker and smaller than those below, ultimately disappearing near the top of the condenser. In the experiment the mist density was seen to be highest near the wall and at the bottom of the condensing section, whereas the numerical model predicted a much more uniform distribution. The model used to describe the formation of mist was based on a Modified Critical Saturation Model (MCSM), which allows mist to be generated once the vapor pressure exceeds a critical value. Equilibrium, nonequilibrium, and MCSM calculations were performed, showing the experimental results to lie somewhere in between the equilibrium and nonequilibrium predictions of the numerical model. A single adjustable constant (indicating the degree to which equilibrium is achieved) is used in the model in order to match the experimental results.

We use a potential flow solver to investigate the aerodynamic aspects of flapping flights in enclosed spaces. The enclosure effects are simulated by the method of images. Our study complements previous aerodynamic analyses which considered only the near-ground flight. The present results show that flying in the proximity of an enclosure affects the aerodynamic performance of flapping wings in terms of lift and thrust generation and power consumption. It leads to higher flight efficiency and more than 5% increase of the generation of lift and thrust.
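The method of images used here replaces the enclosure wall by mirrored singularities, so the no-penetration condition on the wall is satisfied automatically. A minimal 2D sketch for a single point vortex above a plane wall (the paper's solver is more general; names and values are illustrative):

```python
import math

def vortex_velocity(x, y, xv, yv, gamma):
    """Velocity (u, v) at (x, y) induced by a 2D point vortex of
    circulation gamma located at (xv, yv)."""
    dx, dy = x - xv, y - yv
    r2 = dx * dx + dy * dy
    return -gamma * dy / (2 * math.pi * r2), gamma * dx / (2 * math.pi * r2)

def velocity_with_wall(x, y, xv, yv, gamma):
    """Vortex above the wall y = 0 plus its image at (xv, -yv) with opposite
    circulation; the image enforces zero normal velocity on the wall."""
    u1, v1 = vortex_velocity(x, y, xv, yv, gamma)
    u2, v2 = vortex_velocity(x, y, xv, -yv, -gamma)
    return u1 + u2, v1 + v2

# At any wall point (y = 0) the normal component v vanishes identically,
# while the tangential slip velocity u does not.
```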

On the basis of bass cabinets, this paper deals with the problem of reducing loudspeaker enclosure weight. An introductory market analysis emphasizes that lighter cabinets are sought, but maintenance of sound quality is vital. The problem is challenged through experiments and simulations in COMSOL Multiphysics, which indicate that weight reduction and sound quality maintenance are possible by reducing wall thickness and using adequate bracing and lining.

Plans for future hadron colliders are presented, and accelerator physics and engineering aspects common to these machines are discussed. The Tevatron is presented first, starting with a summary of the achievements in Run IB, which finished in 1995, followed by performance predictions for Run II, which will start in 1999, and the TeV33 project, aiming for a peak luminosity L ≈ 1 (nb s)⁻¹. The next machine is the Large Hadron Collider LHC at CERN, planned to come into operation in 2005. The last set of machines are Very Large Hadron Colliders which might be constructed after the LHC. Three variants are presented: two machines with a beam energy of 50 TeV and dipole fields of 1.8 and 12.6 T in the arcs, and a machine with 100 TeV and 12 T. The discussion of accelerator physics aspects includes the beam-beam effect, bunch spacing and parasitic collisions, and the crossing angle. The discussion of the engineering aspects covers synchrotron radiation and stored energy in the beams, the power in the debris of the p...
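The peak-luminosity goals quoted above follow from the standard collider formula L = f_rev · n_b · N² / (4π σ_x σ_y) for head-on Gaussian beams. A sketch, ignoring the crossing-angle reduction factor mentioned in the text; the numbers are illustrative, nominal-LHC-like values:

```python
import math

def luminosity(f_rev, n_bunches, n_per_bunch, sigma_x_cm, sigma_y_cm):
    """Head-on luminosity for round Gaussian beams (crossing-angle factor
    ignored): L = f_rev * n_b * N^2 / (4*pi*sigma_x*sigma_y), in cm^-2 s^-1."""
    return f_rev * n_bunches * n_per_bunch**2 / (4 * math.pi * sigma_x_cm * sigma_y_cm)

# Illustrative LHC-like parameters: 11245 Hz revolution frequency, 2808
# bunches of 1.15e11 protons, 16.7 um transverse beam size at the IP.
L = luminosity(11245, 2808, 1.15e11, 16.7e-4, 16.7e-4)
```

The result comes out at the 10³⁴ cm⁻² s⁻¹ scale, i.e. of order 10 (nb s)⁻¹, which is why TeV33's 1 (nb s)⁻¹ goal corresponds to about 10³³ cm⁻² s⁻¹.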

What is the universe made of? How did it start? This Manual tells the story of how physicists are seeking answers to these questions using the world's largest particle smasher, the Large Hadron Collider, at the CERN laboratory on the Franco-Swiss border. Beginning with the first tentative steps taken to build the machine, the digestible text, supported by color photographs of the hardware involved, along with annotated schematic diagrams of the physics experiments, covers the particle accelerator's greatest discoveries, from both the perspective of the writer and the scientists who work there. The Large Hadron Collider Manual is a full, comprehensive guide to the most famous, record-breaking physics experiment in the world, which continues to capture the public imagination as it provides new insight into the fundamental laws of nature.

The talk summarizes the principles of particle acceleration and addresses problems related to storage rings like LEP and LHC. Special emphasis will be given to orbit stability, long term stability of the particle motion, collective effects and synchrotron radiation.

The ability to image through metallic enclosures is an important goal of any scanning technology for security applications. Previous work demonstrated the penetrating power of electromagnetic imaging through thin metallic enclosures, thus validating the technique for security applications such as cargo screening. In this work we study the limits of electromagnetic imaging through metallic enclosures, considering the performance of the imaging for different thicknesses of the enclosure. Our results show that our system can image a copper disk even when it is enclosed within a 20 mm thick aluminum box. The potential for imaging through enclosures of other materials, such as lead, copper, and iron, is discussed.

This project investigates the energy performance and cost effectiveness of several state-of-the-art retrofit strategies that could be used in triple-deckers and colonial houses, common house types in New England. Several emerging building enclosure technologies were integrated, including high R-value aerogel and vacuum insulations, in forms that would be energy efficient, flexible for different retrofit scenarios, durable, and potentially cost-competitive for deep energy retrofits.

Battery enclosure arrangements for a vehicular battery system. The arrangements, capable of impact resistance, include a plurality of battery cells and a plurality of kinetic-energy-absorbing elements. The arrangements further include a frame configured to encase the plurality of kinetic-energy-absorbing elements and the battery cells. In some arrangements the frame and/or the kinetic-energy-absorbing elements can be made of topologically interlocked materials.

Heavy quark systems and glueball candidates, the particles which are relevant to testing QCD, are discussed. The review begins with the heaviest spectroscopically observed quarks, the b anti-b bound states, including the chi state masses, spins, and hadronic widths and the non-relativistic potential models. Also, P states of c anti-c are mentioned. Other heavy states are also discussed in which heavy quarks combine with lighter ones. The gluonium candidates iota(1460), theta(1700), and g/sub T/(2200) are then covered. The very lightest mesons, pi-neutral and eta, are discussed. 133 refs., 24 figs., 16 tabs

This article discusses the main building blocks of a superconducting (SC) linac, the choice of SC resonators, their frequencies, accelerating gradients and apertures, focusing structures, practical aspects of cryomodule design, and concepts to minimize the heat load into the cryogenic system. It starts with an overview of design concepts for all types of hadron linacs differentiated by duty cycle (pulsed or continuous wave) or by the type of ion species (protons, H-, and ions) being accelerated. Design concepts are detailed for SC linacs in application to both light ion (proton, deuteron) and heavy ion linacs. The physics design of SC linacs, including transverse and longitudinal lattice designs, matching between different accelerating–focusing lattices, and transition from NC to SC sections, is detailed. Design of high-intensity SC linacs for light ions, methods for the reduction of beam losses, preventing beam halo formation, and the effect of HOMs and errors on beam quality are discussed. Examples are ta...

It cannot be assumed that storage enclosures considered safe for traditionally printed images and documents are suitable for modern, digitally printed materials. In this project, a large variety of digital print types were tested using a modified version of the ISO 18916 Imaging materials-Processed imaging materials-Photographic activity test for enclosure materials standard to assess the risk to digital prints by paper enclosures known to be inert or reactive with traditional photographic prints. The types of enclosures tested included buffered and non-buffered cotton papers, and groundwood paper. In addition, qualitative filter paper that had been wetted and dried with either an acidic or basic solution was also tested to determine the effects of enclosure pH on digitally printed materials. It was determined that, in general, digital prints tended to be less reactive with various enclosure types than traditional prints. Digital prints were most sensitive to paper that contained groundwood. The enclosure reactivity test results were then integrated with previous published work on the tendencies of various enclosure types to abrade, ferrotype, or block to digital prints in order to create a comprehensive set of recommendations for digital print storage enclosures.

The purpose of this study is to analyze the potential for radiant cooling using the atmospheric sky window and to evaluate the desired characteristics of a radiant cooling material (RCM) applied to the ceiling window of a three-dimensional enclosure. The thermal characteristics of the system are governed by the geometry, ambient temperature, sky radiative temperature, amount of solar energy and its direction, heat transfer modes, wall radiative properties, and radiative properties of the RCMs. A semi-gray band analysis is utilized for the solar and infrared bands. The radiosity/irradiation method is used in each band to evaluate the radiant exchanges in the enclosure. The radiative properties for the RCM are varied in a parametric study to identify the desired properties of RCMs. For performance simulation of real RCMs, the radiative properties are calculated from spectral data. The desired solar property is a high reflectance for both opaque and semi-transparent RCMs. For a semi-transparent RCM, a low value of the solar transmittance is preferred. The desired infrared property is a high emittance for an opaque RCM. For a semi-transparent RCM, a high infrared transmittance is desired, and the emittance should be greater than zero
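The radiosity/irradiation method referred to above solves, in each band, the linear system J_i = ε_i σT_i⁴ + (1 − ε_i) Σ_j F_ij J_j for the surface radiosities. A gray, single-band sketch for a two-surface enclosure (surface values and the parallel-plate geometry are illustrative, not the three-dimensional enclosure of the study):

```python
import numpy as np

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiosities(eps, T, F):
    """Solve the gray diffuse radiosity system
    J_i = eps_i*sigma*T_i^4 + (1 - eps_i) * sum_j F_ij * J_j
    for the radiosity vector J (single band, opaque surfaces)."""
    eps = np.asarray(eps, float)
    T = np.asarray(T, float)
    F = np.asarray(F, float)
    A = np.eye(len(eps)) - (1.0 - eps)[:, None] * F
    return np.linalg.solve(A, eps * SIGMA * T**4)

# Two infinite parallel plates (view factors F12 = F21 = 1):
J = radiosities([0.8, 0.6], [400.0, 300.0], [[0.0, 1.0], [1.0, 0.0]])
q_net = J[0] - J[1]  # net flux from plate 1 to plate 2, W m^-2
```

For this geometry the solve reproduces the closed-form result q = σ(T₁⁴ − T₂⁴) / (1/ε₁ + 1/ε₂ − 1).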

... enclosure must be provided minimum vertical space and floor space as follows: (i) Prior to February 15, 1994...; (ii) On and after February 15, 1994: (A) Each primary enclosure housing cats must be at least 24 in... requirement for the queen, such housing must be approved by the attending veterinarian in the case of a...

... IMPORTATION, EXPORTATION, AND TRANSPORTATION OF WILDLIFE Standards for the Humane and Healthful Transport of... that droppings do not fall into food or water troughs or onto other perched birds. There shall be... enclosure. (c) An enclosure used to transport one or more birds that rest by perching shall be large enough...

The Hadron Calorimeter (HCAL) is designed for the LHCb experiment. The main purpose of the detector is to provide data for the L0 hadron trigger. The HCAL consists of two symmetric movable parts, about 500 tons in total, which meet in the operating position without non-instrumented zones. The active area is X = 8.4 m wide and Y = 6.8 m high, and is located at Z = 13.33 m from the interaction point. Both halves are assembled from stacked modules. An internal structure consisting of thin iron plates interspaced with scintillating tiles has been chosen. Attention is paid to optimizing the detector according to the requirements of the experiment, reducing the spending needed for its construction. Different construction technologies are discussed. The calorimeter properties have been extensively studied with a variety of prototypes in accelerator beam tests. The calibration with a radioactive source and module-0 construction experience are discussed.

A measurement of hadron production cross-sections for the simulation of accelerator neutrino beams and a search for muon neutrino to electron neutrino oscillations in the Δm² ~ 1 eV² region. This dissertation presents measurements from two different high energy physics experiments with a very strong connection: the Hadron Production (HARP) experiment located at CERN in Geneva, Switzerland, and the Mini Booster Neutrino Experiment (MiniBooNE) located at Fermilab in Batavia, Illinois.

A dedicated D1 beam separation dipole is currently being developed at KEK for the High Luminosity upgrade of the Large Hadron Collider (HL-LHC). Four 150 mm aperture, 5.6 T, 6.7 m long Nb-Ti magnets will replace the resistive D1 dipoles. The development includes fabrication and testing of 2.2 m model magnets. The dipole has a single-layer coil and thin spacers between coil and iron, so saturation has a non-negligible impact on field quality at nominal field. The magnetic design of the coil cross section in the straight section is based on 2D optimization, and a separate optimization concerns the coil ends. However, magnetic measurements of the short model showed a large difference (tens of units) between the sextupole harmonic in the straight part and the 2D calculation. This difference is correctly modelled only by a 3D analysis: 3D calculations show that the magnetic field quality in the straight part is influenced by the coil ends, even for the 6.7 m long magnets. The effect is even more remarkable in the sho...

This report presents a preconceptual design (design criteria and assumptions) for electrostatic enclosures to be used during buried transuranic waste recovery operations. These electrostatic enclosures (along with the application of dust control products) will provide an in-depth contamination control strategy. As part of this preconceptual design, options for electrostatic curtain design are given including both hardwall and fabric enclosures. Ventilation systems, doors, air locks, electrostatic curtains, and supporting systems also are discussed. In addition to the conceptual design, engineering scale tests are proposed to be run at the Test Reactor Area. The planned engineering scale tests will give final material specifications for full-scale retrieval demonstrations

Quantum Chromodynamics provides a good description of many aspects of high energy hadron-hadron collisions, and this will be described, along with some aspects that are not yet understood in QCD. Topics include high-E_T jet production, direct photon, W, Z and heavy flavor production, rapidity gaps and hard diffraction.

One of the most important problems met at laboratories producing and handling radioisotopes is that of designing, building and operating enclosures suitable for the safe handling of active substances. With this purpose in mind, an enclosure has been designed and built for handling moderately high activities behind 150 mm of lead shielding. In this report a description is given of those aspects that may be of interest to people working in this field. (Author)

This thesis considers the political and economic forces shaping the Internet as a medium of increasing importance in everyday life. The digital communications infrastructure is currently facing commercial enclosure on three layers: physical (regulation and ownership of the wires), content (copyright policies, media technology, ownership), and space (ownership and orientation of online tools, spaces, and services). This thesis explores and analyzes the power dynamics driving enclosure on each ...

medium properties of hadrons. I discuss the relevant symmetries of QCD and how they might affect the observed hadron properties. I then discuss at length the observable consequences of in-medium changes of hadronic properties in reactions with ...

The origin of sub-TeV gamma rays detected by Fermi-LAT from the Fermi bubbles at the Galactic center is unknown. In a hadronic model, acceleration of protons and/or nuclei and their subsequent interactions with gas in the bubble volume can produce the observed gamma rays. Such interactions naturally produce high-energy neutrinos, and detection of those can discriminate between a hadronic and a leptonic origin of the gamma rays. Additional constraints on the Fermi bubbles gamma-ray flux in the PeV range from recent HAWC observations restrict hadronic model parameters, which in turn disfavor the Fermi bubbles as the origin of a large fraction of the neutrino events detected by IceCube along the bubble directions. We revisit our hadronic model and discuss future constraints on parameters from observations in very high-energy gamma rays by CTA and in neutrinos.

This brief provides an in-depth overview of the physics of hadron therapy, ranging from the history to the latest contributions to the subject. It covers the mechanisms of protons and carbon ions at the molecular level (DNA breaks and proteins 53BP1 and RPA), the physics and mathematics of accelerators (Cyclotron and Synchrotron), microdosimetry measurements (with new results so far achieved), and Monte Carlo simulations in hadron therapy using FLUKA (CERN) and MCHIT (FIAS) software. The text also includes information about proton therapy centers and carbon ion centers (PTCOG), as well as a comparison and discussion of both techniques in treatment planning and radiation monitoring. This brief is suitable for newcomers to medical physics as well as seasoned specialists in radiation oncology.

The vast majority of radiation treatments for cancerous tumors are given using electron linacs that provide both electrons and photons at several energies. Design and construction of these linacs are based on mature technology that is rapidly becoming more and more standardized and sophisticated. The use of hadrons such as neutrons, protons, alphas, or carbon, oxygen and neon ions is relatively new. Accelerators for hadron therapy are far from standardized, but the use of hadron therapy as an alternative to conventional radiation has led to significant improvements and refinements in conventional treatment techniques. This paper presents the rationale for radiation therapy, describes the accelerators used in conventional and hadron therapy, and outlines the issues that must still be resolved in the emerging field of hadron therapy

The objective of this study was to design an enclosure suitable for studying the ecotoxicological effects of vehicle emissions on groups of wild birds without compromising welfare. Two adjacent enclosures, sheltered from sunlight, wind and rain, were bird-proofed and wrapped with thick polyethylene sheeting. Emissions were directed into the treatment enclosure from the exhaust of a light-duty gasoline truck, using flexible, heat-proof pipe, with joins sealed to prevent leakage. During active exposure, the engine was idled for 5 h/day, 6 days/week for 4 weeks. Fans maintained positive pressure (control) and negative pressure (treatment), preventing cross-contamination of the enclosures and protecting investigators. Four sets of passive, badge-type samplers were distributed across each enclosure, measuring nitrogen dioxide, sulfur dioxide and volatile organic compounds (NO2, SO2 and VOCs, respectively), and were complemented by active monitors measuring VOCs and particulate matter (2.5 µm diameter, PM2.5). We found that the concentrations of NO2, SO2 and PM2.5 did not differ between treatment and control enclosures. Volatile organic compounds (e.g. benzene, toluene, ethylbenzene and xylenes) were approximately six times higher in the treatment enclosure than in the control (13.23 and 2.13 µg m⁻³, respectively). In conclusion, this represents a successful, practical design for studying the effects of sub-chronic to chronic exposure to realistic mixtures of vehicle exhaust contaminants in groups of birds. Recommended modifications for future research include a chassis dynamometer (vehicle treadmill) to better replicate driving conditions, including acceleration and deceleration.

The exploitation of hadronic final states played a key role in the successes of all recent HEP collider experiments, and the ability to use the hadronic final state will continue to be one of the decisive issues during the LHC era. Monte Carlo techniques to make efficient use of hadronic final states have been developed for many years, culminating technologically in object-oriented tool-kits for hadronic shower simulation that are now becoming available. In the present paper we give a brief overview of the physics modelling underlying hadronic shower simulation, and report on advanced techniques used and developed for the simulation of hadronic showers in HEP experiments. We will discuss the three basic types of modelling - data-driven, parametrisation-driven, and theory-driven modelling - and demonstrate ways to combine them in a flexible manner for concrete applications. We will confront the different types of modelling with the stringent requirements on hadronic shower simulation posed by the LHC, and inve...

Results on high energy and high multiplicity hadron reactions, charm searches and related topics, ultrahigh energy events and exotic phenomena (cosmic rays), and nuclear effects in high energy collisions are discussed. 67 references

Like land before the industrial revolution, in the present global economy much knowledge is being enclosed in private hands. In this paper we argue that these enclosures have become a major factor in specialization among firms and among countries: both are forced to specialize in the fields that are not restricted by the enclosures of the others. We use data on 26 OECD countries over the 1978-2006 period. We estimate the effect of countries' patent endowments on their investment specialization across sectors and show that knowledge enclosures involve self-reinforcing innovation patterns. Moreover, we perform a structural change analysis and find that the TRIPs agreement has significantly strengthened the relationship between countries' patent specialization and investment specialization. We conclude by suggesting that stronger international patent protection may restrict global investment opportunities, and this may be one of the factors contributing to the present crisis.

Quark recombination is a successful model to describe the hadronization of a deconfined quark-gluon plasma. Jet-like dihadron correlations measured at RHIC provide a challenge for this picture. We discuss how correlations between hadrons can arise from correlations between partons before hadronization. An enhancement of correlations through the recombination process, similar to the enhancement of elliptic flow, is found. Hot spots from completely or partially quenched jets are a likely source of such parton correlations.

In this paper experiments in a scale model are used as a first attempt to investigate how different flow elements such as supply air jets, thermal plumes and free convection flows interact with each other in a large enclosure, if the path of each individual flow element changes and if this influe...

The following topics were dealt with: Hadronic reactions and resonances, structure of mesons, baryons, glueballs, and hybrids, physics with strange and charmed quarks, future projects and facilities. (HSI)

"Steve Myers of CERN and Jie Wei of Beijing's Tsinghua University are the first recipients of a new prize for particle physics. The pair were honoured for their contributions to numerous particle-accelerator projects - including CERN's Large Hadron Collider - by the Asian Committee for Future Accelerators (ACFA)..." (1 paragraph)

The hadron spectroscopy sessions of the Working Group on Hadron and Nuclear Spectroscopy are summarized. The present status of the field is discussed, along with the main priorities and open questions for the future. The required characteristics of optimum future facilities are outlined

The paper deals with the space-time structure of the sub-atomic world and attempts to construct the fields of the constituents of the hadrons. It is then attempted to construct the fields of the hadrons from these micro-fields. (author). 24 refs

At a Workshop on the Future of Hadron Facilities, held on 15-16 August at Los Alamos National Laboratory, several speakers pointed out that the US physics community carrying out fixed target experiments with hadron beams had not been as successful with funding as it deserved. To rectify this, they said, the community should be better organized and present a more united front.

In this paper we consider some issues about the statistical model of hadronization in a holographic approach. We introduce a Rindler-like horizon in the bulk and understand string breaking as a tunneling event under this horizon. We calculate the hadron spectrum and we get a thermal, a...

A new version of the fire tube model is developed to describe hadron-hadron collisions at ultrarelativistic energies. Several improvements are introduced in order to include the longitudinal expansion of intermediate fireballs, which remedies the overestimates of the transverse momenta in the previous version. It is found that, within a wide range of incident energies, the model describes well the experimental data for the single particle rapidity distribution, two-body correlations in the pseudo-rapidity, transverse momentum spectra of pions and kaons, the leading particle spectra and the K/π ratio. (author)

The document summarizes the recent papers, presentations and other public information on Radio-Frequency (RF) Linear Accelerators (linacs) and Fixed-Field Alternating-Gradient (FFAG) accelerators for hadron therapy. The main focus is on technical aspects of these accelerators. This report intends to provide a general overview of the state-of-the-art in those accelerators which could be used in short and middle-term for treating cancer.

The Hadron Calorimeter of the CMS detector for the CERN LHC accelerator is designed to measure hadron jets as well as single hadrons. It has six segments. The central barrel, made of brass and scintillators, covers the |η| range of about 0 to 1.3. Two end caps, also made of brass and scintillators, extend the |η| range to 3.0. Two forward calorimeters made of iron and quartz fibers cover the range 3.0 to 5.0. Since the barrel portion of the calorimeter is only 6.5 interaction lengths deep, the outer barrel samples showers with scintillators outside the magnet coil and cryostat. Progress has been made on all subsystems and prototypes have been built. We now have a better understanding of magnetic field effects on calorimeters.

COMPASS is an experiment located at the CERN SPS accelerator. For the results presented in this paper a 160 GeV positive muon beam impinged on a ⁶LiD target. The COMPASS spectrometer was designed to reconstruct scattered muons and charged hadrons in a wide kinematic range. COMPASS preliminary results on hadron, pion and kaon multiplicities are presented. The hadron and pion data show good agreement with (N)LO QCD expectations, and some of these preliminary data have already been successfully incorporated in global NLO QCD fits to world data. However, the results for kaon multiplicities differ from the expectations of the DSS fit. There is also a tension between the COMPASS and HERMES results, HERMES being the only other experiment that has measured kaon multiplicities in semi-inclusive deep inelastic scattering.

MINOS, the Main Injector Neutrino Oscillation Search, will study neutrino flavor transformations using a Near detector at the Fermi National Accelerator Laboratory and a Far detector located in the Soudan Underground Laboratory in northern Minnesota. The MINOS collaboration also constructed the CalDet (calibration detector), a smaller version of the Near and Far detectors, to determine the topological and signal response to hadrons, electrons and muons. The detector was exposed to test-beams in the CERN Proton Synchrotron East Hall during 2001-2003, where it collected events at momentum settings between 200 MeV/c and 10 GeV/c. In this dissertation we present results of the CalDet experiment, focusing on the topological and signal response to hadrons. We briefly describe the MINOS experiment and its iron-scintillator tracking-sampling calorimeters as a motivation for the CalDet experiment. We discuss the operation of the CalDet in the beamlines as well as the trigger and particle identification systems used to isolate the hadron sample. The method used to calibrate the MINOS detector is described and validated with test-beam data. The test-beams were simulated to model the muon flux, energy loss upstream of the detector and the kaon background. We describe the procedure used to discriminate between pions and muons on the basis of the event topology. The hadron samples were used to benchmark the existing GEANT3 based hadronic shower codes and determine the detector response and resolution for pions and protons. We conclude with comments on the response to single hadrons and to neutrino induced hadronic showers.

Quark fragmentation functions (FFs) D_q^h(z, Q²) describe the final-state hadronisation of quarks q into hadrons h. The FFs can be extracted from hadron multiplicities produced in semi-inclusive deep inelastic scattering. The COMPASS collaboration has recently measured charged hadron multiplicities for identified pions and kaons using a 160 GeV/c muon beam impinging on an isoscalar LiD target. The data cover a large kinematic range and provide an important input for global QCD analyses of world data at NLO, aiming at the determination of FFs. The latest results from COMPASS on pion multiplicities and pion fragmentation functions will be discussed.

"In summer 2008, scientists will switch on one of the largest machines in the world to search for the smallest of particles. CERN's Large Hadron Collider particle accelerator has the potential to change our understanding of the Universe."

Flames are stretched by being pulled along their frontal surface by the flow field in which they reside. Their trajectories tend to approach particle paths, eventually acquiring the role of contact boundaries - interfaces between the burnt and unburnt medium that may broaden solely as a consequence of diffusion. Fundamental properties of flow fields governing such flames are determined here on the basis of the zero Mach number model, providing a rational method of approach to the computational analysis of combustion fields in enclosures where, besides the aerodynamic properties of the flow, the thermodynamic process of compression must be taken into account. To illustrate its application, the method is used to reveal the mechanism of formation of a tulip-shaped flame in a rectangular enclosure under nonturbulent flow conditions.

This document is part of a series of 5 technical manuals produced by the Challenge Program Project CP34 “Improved fisheries productivity and management in tropical reservoirs”. This manual is based on the experience gained by the partners of the project “Improved Fisheries Productivity and Management in Tropical Reservoirs” (CP34) funded by the Challenge Program on Water and Food. As part of this project, the partners designed, developed and tested in the field three enclosures in Lake Nasser...

For 50 years the field of hadronic parity violation has been unresolved. Since the 1980s the standard theoretical framework for hadronic parity violation has been the DDH model. However, discrepancies between the DDH model and experiment have called the use of this model into question. At low energies a new model-independent analysis of hadronic parity violation can be carried out via pionless effective field theory. With the use of pionless effective field theory and new precision experiments, focusing on systems with A ≤ 4 in order to eliminate nuclear physics uncertainties, the field of hadronic parity violation at low energies will finally be understood. This talk gives an overview of the theory and possible future experiments in this old yet still exciting field.

In the context of the Hagedorn temperature half-centenary I describe our understanding of the hot phases of hadronic matter both below and above the Hagedorn temperature. The first part of the review addresses many frequently posed questions about properties of hadronic matter in different phases, phase transition and the exploration of quark-gluon plasma (QGP). The historical context of the discovery of QGP is shown and the role of strangeness and strange antibaryon signature of QGP illustrated. In the second part I discuss the corresponding theoretical ideas and show how experimental results can be used to describe the properties of QGP at hadronization. Finally in two appendices I present previously unpublished reports describing the early prediction of the different forms of hadron matter and of the formation of QGP in relativistic heavy ion collisions, including the initial prediction of strangeness and in particular strange antibaryon signature of QGP.

Recent experimental results from studies of hadron interactions at Fermilab are surveyed. Elastic, total and charge-exchange cross section measurements, diffractive phenomena, and inclusive production, using nuclear as well as hydrogen targets, are discussed in these lectures

Recent experimental data on the properties of jets in hadronic reactions are reviewed and compared with theoretical expectations. Jets are clearly established as the dominant process for high-E_T events in hadronic reactions. The cross section and the other properties of these events are in qualitative and even semiquantitative agreement with expectations based on perturbative QCD. However, we cannot yet make precise tests of QCD, primarily because there are substantial uncertainties in the theoretical calculations. 45 references. (WHK)

The evidence for a three-valued 'color' degree of freedom in hadron physics is reviewed. The structure of color models is discussed. Consequences of color models for elementary particle physics are discussed, including saturation properties of hadronic states, π⁰ → 2γ and related decays, leptoproduction, and lepton pair annihilation. Signatures are given which distinguish theories with isolated colored particles from those in which color is permanently bound. (Auth.)

Local gauge invariance of SU(3)_c and color confinement would require that the only hadrons in the world be glueballs. However, when we add the quarks and obtain QCD, it is experimentally clear that quark-built states mask the expected glueballs. Thus the discovery of glueballs is essential for the viability of QCD. Papers presented at the 1983 International Europhysics Conference on High Energy Physics on the hadronic production of glueballs and searches for glueballs are reviewed.

Particle physics experiments at modern high luminosity particle accelerators achieve orders of magnitude higher count rates than what was possible ten or twenty years ago. This extremely large statistics allows far-reaching conclusions to be drawn even from minute signals, provided that these signals are well understood by theory. This is, however, ever more difficult to achieve. Presently, technical and scientific progress in general, and experimental progress in particle physics in particular, typically shows an exponential growth rate. For example, data acquisition and analysis are, among many other factors, driven by the development of ever more efficient computers and thus by Moore's law. Theory has to keep up with this development by also achieving an exponential increase in precision, which is only possible using powerful computers. This is true for both types of calculations: analytic ones as, e.g., in quantum field perturbation theory, and purely numerical ones as in Lattice QCD. As stated above, such calculations are absolutely indispensable to make best use of the extremely costly large particle physics experiments. Thus, it is economically reasonable to invest a certain percentage of the cost of accelerators and experiments in related theory efforts. The basic ideas behind Lattice QCD simulations are the following: because quarks and gluons can never be observed individually but are always ''confined'' into colorless hadrons, like the proton, all quark-gluon states can be expressed in two different systems of basis states, namely a quark-gluon basis and the basis of hadron states. The proton, e.g., is an eigenstate of the latter; a specific quark-gluon configuration is part of the former. In the quark-gluon basis a physical hadron, like a proton, is given by an extremely complicated multi-particle wave function containing all effects of quantum fluctuations. This state is so complicated that it is basically impossible to model it.

Holography inspired stringy hadrons (HISH) is a set of models that describe hadrons: mesons, baryons and glueballs as strings in flat four-dimensional space-time. The models are based on a "map" from the stringy hadrons of holographic confining backgrounds. In this note we review the "derivation" of the models. We start with a brief reminder of the passage from the AdS5 × S5 string theory to certain flavored confining holographic models. We then describe the string configurations in holographic backgrounds that correspond to a Wilson line, a meson, a baryon and a glueball. The key ingredients of the four-dimensional picture of hadrons are the "string endpoint mass" and the "baryonic string vertex". We determine the classical trajectories of the HISH. We review the current understanding of the quantization of the hadronic strings. We end with a summary of the comparison of the outcome of the HISH models with the PDG data on mesons and baryons. We extract the values of the tension, masses and intercepts from best fits, write down certain predictions for higher excited hadrons and present attempts to identify glueballs.

The well established concept of Taylor Models is introduced, which offer highly accurate C0 enclosures of functional dependencies, combining high-order polynomial approximation of functions and rigorous estimates of the truncation error, performed using verified arithmetic. The focus of this work is on the application of Taylor Models in algorithms for strongly non-linear dynamical systems. A method is proposed to extend the existing implementation of Taylor Models in COSY INFINITY from double precision coefficients to arbitrary precision coefficients. Great care is taken to maintain the highest efficiency possible by adaptively adjusting the precision of higher order coefficients in the polynomial expansion. High precision operations are based on clever combinations of elementary floating point operations yielding exact values for round-off errors. An experimental high precision interval data type is developed and implemented. Algorithms for the verified computation of intrinsic functions based on the High Precision Interval datatype are developed and described in detail. The application of these operations in the implementation of High Precision Taylor Models is discussed. An application of Taylor Model methods to the verification of fixed points is presented by verifying the existence of a period 15 fixed point in a near standard Henon map. Verification is performed using different verified methods such as double precision Taylor Models, High Precision intervals and High Precision Taylor Models. Results and performance of each method are compared. An automated rigorous fixed point finder is implemented, allowing the fully automated search for all fixed points of a function within a given domain. It returns a list of verified enclosures of each fixed point, optionally verifying uniqueness within these enclosures. An application of the fixed point finder to the rigorous analysis of beam transfer maps in accelerator physics is presented. Previous work done by
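As a minimal illustration of the verified-arithmetic layer underlying Taylor Models (the degree-zero case: a plain interval with no polynomial part), the sketch below implements naive interval evaluation in Python. This is not COSY INFINITY code; exact rational arithmetic via `fractions.Fraction` stands in for the outward-rounded floating point a real implementation would use.

```python
from fractions import Fraction  # exact rationals avoid round-off entirely

class Interval:
    """Closed interval [lo, hi]; a minimal stand-in for verified arithmetic."""
    def __init__(self, lo, hi=None):
        self.lo = Fraction(lo)
        self.hi = Fraction(hi if hi is not None else lo)
    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __mul__(self, o):
        # the product range is bounded by the four endpoint products
        ps = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(ps), max(ps))
    def contains(self, x):
        return self.lo <= Fraction(x) <= self.hi

# Enclose f(x) = x*x + x over [0, 1] by naive interval evaluation.
x = Interval(0, 1)
f = x * x + x  # guaranteed to contain the true range of f on [0, 1]
```

Naive interval evaluation like this overestimates ranges on long computations (the dependency problem); Taylor Models combat exactly that by carrying a high-order polynomial part and confining the interval to the small truncation remainder.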

Although the muon content of extensive air showers is an observable sensitive to the primary composition and to hadron interaction properties, the discrepancy between measured and predicted values has not been resolved. Using new forward-hadron data from an accelerator experiment, a hadron interaction model (QGSJET II-04) is modified. Air shower simulation with the modified interaction model produces more muons. The additional muons concentrate around the shower core, while the number density of secondary particles far from the core does not change much.

The FCC-hh (Future Hadron-Hadron Circular Collider) is one of the three options considered for the next generation accelerator in high-energy physics as recommended by the European Strategy Group. The layout of FCC-hh has been optimized to a more compact design following recommendations from civil engineering aspects. The updates on the first order and second order optics of the ring will be shown for collisions at the required centre-of-mass energy of 100 TeV. Special emphasis is put on the dispersion suppressors and general beam cleaning sections as well as first considerations of injection and extraction sections.

Recent results of hadron spectroscopy and hadron-hadron interaction within a quark model framework are reviewed. Higher order Fock space components are considered based on new experimental data on low-energy hadron phenomenology. The purpose of this study is to obtain a coherent description of the low-energy hadron phenomenology to constrain QCD phenomenological models and try to learn about low-energy realizations of the theory.

In recent years artificial neural networks (ANNs) have emerged as a mature and viable framework with many applications in various areas. Artificial neural network theory is sometimes used to refer to a branch of computational science that uses neural networks as models to simulate or analyze complex phenomena and/or to study the principles of operation of neural networks analytically. In this work a model of hadron-hadron collisions using the ANN technique is presented; the ANN-based model calculates the cross sections of hadron-hadron collisions. The results amply demonstrate the feasibility of this new technique in extracting the collision features and prove its effectiveness.
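As a hypothetical illustration of the approach (the abstract does not specify the network architecture or training data), the sketch below fits a smooth "cross section versus energy" trend with a tiny one-hidden-layer network trained by hand-coded backpropagation on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
s = np.linspace(1.0, 10.0, 64)[:, None]    # toy "energy" feature
sigma = 30.0 + 5.0 * np.log(s)             # toy cross-section curve (mb)

# standardize input and target so plain gradient descent behaves well
x = (s - s.mean()) / s.std()
y = (sigma - sigma.mean()) / sigma.std()

W1 = rng.normal(0.0, 1.0, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 1.0, (16, 1)); b2 = np.zeros(1)
lr = 0.05

for _ in range(2000):
    h = np.tanh(x @ W1 + b1)               # hidden layer
    pred = h @ W2 + b2                     # linear output
    err = pred - y
    loss = float(np.mean(err ** 2))
    g = 2.0 * err / len(x)                 # dLoss/dpred
    gh = (g @ W2.T) * (1.0 - h ** 2)       # backprop through tanh
    W2 -= lr * (h.T @ g);  b2 -= lr * g.sum(0)
    W1 -= lr * (x.T @ gh); b1 -= lr * gh.sum(0)

print(f"final training MSE (standardized units): {loss:.4f}")
```

A real application would train on measured cross sections and validate on held-out energies; standardizing inputs and targets is a common choice that keeps full-batch gradient descent stable at this learning rate.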

The CERN Dragon Boat team – the Hadron Dragons – achieved a fantastic result at the "Paddle for Cancer" Dragon Boat Festival at Lac de Joux on 6 September. CERN Hadron Dragons heading for the start line. Under blue skies and on a clear lake, the Hadron Dragons won 2nd place in a hard-fought final, following top times in the previous heats. In a close and dramatic race – neck-and-neck until the final 50 metres – the local Lac-de-Joux team managed to inch ahead at the last moment. The Hadron Dragons were delighted to take part in this festival. No one would turn down a day out in such a friendly and fun atmosphere, but the Dragons were also giving their support to cancer awareness and fund-raising in association with ESCA (English-Speaking Cancer Association of Geneva). Riding on their great success in recent competitions, the Hadron Dragons plan to enter the last Dragon Boat festival of 2009 in Annecy on 17-18 October. This will coincide with t...

Local quark gluon theories are converted into bilocal field theories via functional techniques. The new field quanta consist of all quark-antiquark bound states in the ladder approximation. They are called bare hadrons. Hadronic Feynman graphs are developed which strongly resemble dual diagrams. QED is a special case with the bare hadrons being positronium atoms. Photons couple to hadrons via intermediate vector mesons in a current-field identity. The new theory accommodates naturally bilocal currents measured in deep-inelastic ep scattering. Also, these couple via intermediate mesons. In the limit of heavy gluon masses, the hadron fields become local and describe π, ρ, A1 and σ mesons in a chirally invariant Lagrangian (the sigma model). Many interesting new relations are found between meson and quark properties such as m_ρ^2 ≈ 6M^2, where M is the true nonstrange quark mass after spontaneous breakdown of chiral symmetry. There is a simple formula linking these quark masses with the small bare masses of the Lagrangian. The quark masses also determine the vacuum expectations of scalar densities. These show an SU(3) breaking in the vacuum of ≈ -16%. 14 figures
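The quoted relation m_ρ^2 ≈ 6M^2 can be checked numerically; taking the physical ρ(770) mass as input (an illustration, not part of the abstract) gives a constituent-type quark mass:

```python
import math

# What nonstrange quark mass M does m_rho^2 ≈ 6 M^2 imply for the physical rho?
m_rho = 775.26                 # MeV, rho(770) mass from the PDG
M = m_rho / math.sqrt(6.0)
print(f"M ≈ {M:.0f} MeV")      # ≈ 316 MeV
```

The result is comparable to the usual ~330 MeV constituent quark mass, which is the consistency the relation suggests.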

A many-body approach to hadron structure is presented, in which we consider two parton species: spin-0 (b-partons), and spin-1/2 (f-partons). We extend a boson and a fermion pairing scheme for the b-, and f-partons respectively, into a Yang-Mills gauge theory. The main feature of this theory is that the gauge field is not identified with the usual gluon field variable in QCD. We study the confinement problem of the hadron constituents, and obtain, for low temperatures, partons that are confined by energy gaps. As the critical temperatures for the corresponding phase transitions are approached, the energy gap gradually disappears, and confinement is lost. The theory goes beyond the non-relativistic harmonic oscillator quark model, in the sense of giving physical reasons why a non-relativistic approximation is adequate in describing the internal dynamics of hadron structure. (author)

The LHCb experiment is designed to study the properties and decays of heavy flavored hadrons produced in pp collisions at the LHC. The data collected in the LHC Run I enables precision spectroscopy studies of beauty and charm hadrons. The latest results on spectroscopy of conventional and exotic hadrons are reviewed. In particular the discovery of the first charmonium pentaquark states in the $J/\psi p$ system, the possible existence of four-quark states decaying to $J/\psi \phi$ and the confirmation of the resonant nature of the $Z_c(4430)^-$ mesonic state are discussed. In the sector of charmed baryons, the observation of five new $\Omega_c$ states, the observation of the $\Xi^+_{cc}$ and the study of charmed baryons decaying to $D^0 p$ are presented.

I describe how hadron-hadron scattering amplitudes are related to the eigenstates of QCD in a finite cubic volume. The discrete spectrum of such eigenstates can be determined from correlation functions computed using lattice QCD, and the corresponding scattering amplitudes extracted. I review results from the Hadron Spectrum Collaboration who have used these finite volume methods to study ππ elastic scattering, including the ρ resonance, as well as coupled-channel πK, ηK scattering. The very recent extension to the case where an external current acts is also presented, considering the reaction πγ* → ππ, from which the unstable ρ → πγ transition form factor is extracted. Ongoing calculations are advertised and the outlook for finite volume approaches is presented.

The aim of this thesis is to give a general survey of beauty physics at LEP. The masses of beautiful hadrons are studied: the B^0_d and B^+ meson masses are measured at the Υ(4S), and a principal interest of LEP is to observe and measure the other beautiful hadrons; the analyses which determined, for the first time, the B^0_s meson and Λ^0_b baryon masses are described. Lifetimes of the different beautiful hadrons are given. The oscillations of the neutral B mesons are described. All these measurements allow the determination of the parameters of the Cabibbo-Kobayashi-Maskawa matrix. (N.C.). 78 refs., 74 figs., 11 tabs

High energy laboratories are performing experiments in heavy ion collisions to explore the structure of matter at high temperature and density. This elementary book explains the basic ideas involved in the theoretical analysis of these experimental data. It first develops two topics needed for this purpose, namely hadron interactions and thermal field theory. Chiral perturbation theory is developed to describe hadron interactions and thermal field theory is formulated in the real-time method. In particular, spectral form of thermal propagators is derived for fields of arbitrary spin and used to calculate loop integrals. These developments are then applied to find quark condensate and hadron parameters in medium, including dilepton production. Finally, the non-equilibrium method of statistical field theory to calculate transport coefficients is reviewed. With technical details explained in the text and appendices, this book should be accessible to researchers as well as graduate students interested in thermal ...

Work on high energy hadron-hadron collisions in the geometrical model, performed under the DOE Contract No. EY-76-S-09-0946, is summarized. Specific items studied include the behavior of elastic hadron scatterings at super high energies and the existence of many dips, the computation of meson radii in the geometrical model, and the hadronic matter current effects in inelastic two-body collisions

Hadron molecules are particles made out of hadrons that are held together by self-interactions. In this report we discuss seven such molecules and their self-interactions. The molecular structure of the f0(980), a0(980), f1(1400), ΔN(2150) and π1(1400) is given. We predict that two more states, the $K\bar{K}K$(1500) and the a1(1400), should be found.

Production of bottomonium in hadronic collisions is studied in the framework of the soft colour approach. We report some results for production of Υ at the Tevatron and predictions for the future Large Hadron Collider (LHC). (author)

The biological and physical rationale for hadron therapy is well understood by the research community, but hadron therapy is not well established in mainstream medicine. This talk will describe the biological advantage of neutron therapy and the dose distribution advantage of proton therapy, followed by a discussion of the challenges to be met before hadron therapy can play a significant role in treating cancer. A proposal for a new research-oriented hadron clinic will be presented.

The parton recombination model has turned out to be a valuable tool to describe hadronization in high-energy heavy-ion collisions. I review the model and revisit recent progress in our understanding of hadron correlations. I also discuss higher Fock states in the hadrons, possible violations of the elliptic flow scaling and recombination effects in more dilute systems.

Experiments are being prepared at the Fermilab Tevatron and the CERN Large Hadron Collider that promise to deliver extraordinary insights into the nature of spontaneous symmetry breaking and the role of supersymmetry in the universe. This article reviews the goals, challenges, and designs of these experiments. The first hadron collider, the ISR at CERN, had to overcome two initial obstacles. The first was low luminosity, which steadily improved over time. The second was the broad angular spread of interesting events. In this regard Maurice Jacob noted (1): The answer is ... sophisticated detectors covering at least the whole central region (45° ≤ θ ≤ 135°) and full azimuth. This statement, while obvious today, reflects the major revelation of the ISR period that hadrons have partonic substructure. The result was an unexpectedly strong hadronic yield at large transverse momentum (p_T). Partly because of this, the ISR missed the discovery of the J/ψ and later missed the Υ. The ISR era was therefore somewhat less auspicious than it might have been. It did however make important contributions in areas such as jet production and charm excitation, and it paved the way for the SPS collider, also at CERN

An introduction to the techniques of analysis of hadron collider events is presented in the context of the quark-parton model. Production and decay of W and Z intermediate vector bosons are used as examples. The structure of the Electroweak theory is outlined. Three simple FORTRAN programs are introduced, to illustrate Monte Carlo calculation techniques. 25 refs.
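The three FORTRAN programs themselves are not reproduced in the abstract; as a stand-in in the same spirit, here is a minimal Monte Carlo integration (in Python, with an illustrative "Z-like" resonance; the mass and width values are assumptions, not taken from the record) estimating the fraction of a Breit-Wigner line shape lying within ±5 widths of the peak, compared with the closed-form answer:

```python
import math
import random

random.seed(42)

M, GAMMA = 91.19, 2.49          # illustrative "Z-like" mass and width, GeV

def bw(x):
    """Non-relativistic Breit-Wigner, normalized to unit total area."""
    return (GAMMA / (2 * math.pi)) / ((x - M) ** 2 + (GAMMA / 2) ** 2)

# sample-mean Monte Carlo estimate of the integral over [M - 5Γ, M + 5Γ]
lo, hi = M - 5 * GAMMA, M + 5 * GAMMA
n = 200_000
acc = sum(bw(random.uniform(lo, hi)) for _ in range(n))
estimate = (hi - lo) * acc / n

exact = (2 / math.pi) * math.atan(10.0)   # analytic integral over +-5 widths
print(f"MC: {estimate:.4f}  exact: {exact:.4f}")
```

With 200 000 samples the statistical error is well below a percent, which is the basic point such teaching programs make: accuracy improves only as 1/√n, independent of dimension.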

"In the spring 2008, the Large Hadron Collider (LHC) machine at CERN (the European Particle Physics laboratory) will be switched on for the first time. The huge machine is housed in a circular tunnel, 27 km long, excavated deep under the French-Swiss border near Geneva." (1,5 page)

The Particle Flow Algorithms attempt to measure each particle in a hadronic jet individually, using the detector providing the best energy/momentum resolution. Therefore, the spatial segmentation of the calorimeter plays a crucial role. In this context, the CALICE Collaboration developed the Digital Hadron Calorimeter. The Digital Hadron Calorimeter uses Resistive Plate Chambers as active media and has a 1-bit resolution (digital) readout of 1 × 1 cm2 pads. The calorimeter was tested with steel and tungsten absorber structures, as well as with no absorber structure, at the Fermilab and CERN test beam facilities over several years. In addition to conventional calorimetric measurements, the Digital Hadron Calorimeter offers detailed measurements of event shapes, rigorous tests of simulation models and various tools for improved performance due to its very high spatial granularity. Here we report on the results from the analysis of pion and positron events. Results of comparisons with the Monte Carlo simulations are also discussed. The analysis demonstrates the unique utilization of detailed event topologies.

Plans for dealing with the torrent of data from the Large Hadron Collider's detectors have made the CERN particle-physics lab, yet again, a pioneer in computing as well as physics. The author describes the challenges of processing and storing data in the age of petabyte science. (4 pages)

From 64492 selected \tau-pair events, produced at the Z^0 resonance, the measurement of the tau decays into hadrons from a global analysis using 1991, 1992 and 1993 ALEPH data is presented. Special emphasis is given to the reconstruction of photons and \pi^0's, and the removal of fake photons. A detailed study of the systematics entering the \pi^0 reconstruction is also given. A complete and consistent set of tau hadronic branching ratios is presented for 18 exclusive modes. Most measurements are more precise than the present world average. The new level of precision reached allows a stringent test of \tau-\mu universality in hadronic decays, g_\tau/g_\mu = 1.0013 \pm 0.0095, and the first measurement of the vector and axial-vector contributions to the non-strange hadronic \tau decay width: R_{\tau,V} = 1.788 \pm 0.025 and R_{\tau,A} = 1.694 \pm 0.027. The ratio (R_{\tau,V} - R_{\tau,A}) / (R_{\tau,V} + R_{\tau,A}), equal to (2.7 \pm 1.3) \%, is a measure of the importance of Q...
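The quoted asymmetry can be reproduced from the two measurements above; the uncertainty computed below assumes uncorrelated errors, which the real analysis does not, so only the central value should be expected to match the quoted ±1.3%:

```python
import math

# Vector and axial-vector contributions quoted in the abstract
RV, sRV = 1.788, 0.025
RA, sRA = 1.694, 0.027

asym = (RV - RA) / (RV + RA)          # central value of the asymmetry

# naive Gaussian propagation, treating RV and RA as uncorrelated
s = RV + RA
sigma = math.sqrt((2 * RA * sRV) ** 2 + (2 * RV * sRA) ** 2) / s ** 2
print(f"(RV - RA)/(RV + RA) = {asym * 100:.1f}% +- {sigma * 100:.1f}% (uncorrelated)")
```

The central value comes out at 2.7%, matching the abstract; the uncorrelated error is somewhat smaller than the published one, consistent with correlations between the two width contributions.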

inside the enclosures to be able to protect the electronic systems. In this work, moisture transfer into a typical electronic enclosure is numerically studied using CFD. In order to reduce the CPU-time and make way for subsequent factorial design analysis, a simplifying modification is applied in which...

A scenario of a generic printed circuit board (PCB) representing an electronic module inside a metallic enclosure is studied numerically. Following the surface equivalence theorem, the PCB is replaced with surface currents running on a Huygens box (HB) inside the enclosure and near-field errors w...

.... Obtain another small cylinder that has been charged with pure methanol if the system will be used for... and periodic determination of enclosure background emissions (hydrocarbons and methanol); initial determination of enclosure internal volume; and periodic hydrocarbon and methanol retention check and...

Hadronic showers are simulated by the MARS 10 code and compared with various experimental results obtained at high-energy accelerators. Good agreement between the experiment and the simulations is observed. MARS 10 is a fast and reliable instrument for numerical studies of the average characteristics of hadronic showers. 16 refs

external memory point enclosure data structure that can be used to answer a query in O(log_B N + K/B) I/Os, where B is the disk block size. To obtain this bound, Ω(N/B^{1-ε}) disk blocks are needed for some constant ε > 0. With linear space, the best obtainable query bound is O(log_2 N + K/B) if a linear output... term O(K/B) is desired. To show this we prove a general lower bound on the tradeoff between the size of the data structure and its query cost. We also develop a family of structures with matching space and query bounds.
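The base-B logarithms in these bounds come from the I/O model, where reading one block of B items costs one I/O. A toy static B-ary search tree (an illustration of the model only, not the paper's point-enclosure structure) makes the probe count concrete:

```python
import bisect

B = 100  # keys per disk block; visiting one node costs one "I/O"

def build(keys):
    """Build a B-ary tree over sorted keys; each node holds <= B separators."""
    if len(keys) <= B:
        return (keys, [])                      # leaf block
    step = (len(keys) + B) // (B + 1) or 1
    seps, kids, start = [], [], 0
    for i in range(step - 1, len(keys), step):
        kids.append(build(keys[start:i]))      # child for keys below separator
        seps.append(keys[i])
        start = i + 1
        if len(seps) == B:
            break
    kids.append(build(keys[start:]))           # rightmost child
    return (seps, kids)

def search(node, q, ios=0):
    """Return (found, number of block I/Os); one I/O per node visited."""
    seps, kids = node
    ios += 1
    i = bisect.bisect_left(seps, q)
    if i < len(seps) and seps[i] == q:
        return True, ios
    if not kids:
        return False, ios
    return search(kids[i], q, ios)

root = build(list(range(1_000_000)))
found, ios = search(root, 123_456)
print(found, ios)   # a handful of block I/Os, vs ~20 probes for binary search
```

With a million keys and B = 100 the tree is only a few levels deep, which is the log_B N effect: fan-out B shrinks the height by a factor of log_2 B compared with a binary search.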

Today, hadron accelerators with high intensity and high brightness beams increasingly rely on transverse feedback systems for the control of instabilities and the preservation of the transverse emittance. With particular emphasis on, but not limited to, the CERN hadron accelerator chain, the progress made in recent years and the performance achieved are reviewed. Hadron colliders such as the LHC represent a particular challenge as they require low-noise electronics in these feedback systems for acceptable emittance growth. Achievements of the LHC transverse feedback system, used for damping injection oscillations and to provide stability throughout the cycle, are summarized. This includes its use for abort gap and injection cleaning as well as transverse blow-up for diagnostics purposes. Beyond systems already in operation, advances in technology and modern digital signal processing with increasingly higher digitization rates have made systems conceivable to cure intra-bunch motion. With its capabilities to both ...

COMPASS is a multi-purpose fixed-target experiment at the CERN Super Proton Synchrotron aimed at studying the structure and spectrum of hadrons. One primary goal is the search for new hadronic states, in particular spin-exotic mesons and glueballs. We present recent results of partial-wave analyses of (3$\pi$)$^{-}$ and $\pi^{-}\eta'$ final states based on a large data set of diffractive dissociation of a 190 GeV/c $\pi^{-}$ beam on a proton target in the squared four-momentum-transfer range 0.1 $< t'

This track is an example of real data collected from the DELPHI detector on the Large Electron-Positron (LEP) collider at CERN, which ran between 1989 and 2000. A 3-jet event is seen, resulting from the decay of a Z0 into a quark and an antiquark, one of which emits a gluon. Both quarks and gluons appear in the detector as showers of hadrons, commonly called jets.

Electromagnetic polarizabilities of hadrons are reviewed, after a discussion of classical analogues. Differences between relativistic and non-relativistic approaches can lead to conflicts with conventional nuclear physics sum rules and calculational techniques. The nucleon polarizabilities are discussed in the context of the non-relativistic valence quark model, which provides a good qualitative description. The recently measured pion polarizabilities are discussed in the context of chiral symmetry and quark-loop models. 58 refs., 5 figs

In the chiral bag model an expression is obtained for the quark wave functions with account of the color and pion interactions of quarks. The quadrupole moments of nonstrange hadrons are calculated. The quadrupole moment of the nucleon isobar is found to be Q(Δ) = -6.3×10^-28 e_Δ cm^2. Predictions of the chiral bag model are in strong disagreement with the non-relativistic quark model

"The Large Hadron Collider (LHC) is expected to provide its first collisions in November 2007, CERN has announced. A two-month run at 0.9 TeV is planned for 2007 to test the accelerating and detecting equipment, and a full power run at 14 TeV is expected in the spring of 2008."

"CERN, the European Oganization for Nuclear Research, took delivery of the last superconducting main magnet for the Large Hadron Collider (LHC) on Monday, completint the full set of 1624 main magnets required to build the world's largest and most powerful particle accelerator."

"The Scientific Information Port (PIC), a technological centre located on the campus of the UAB, recently started work on the first stage of the European project Large Hadron Collider (LHC), the largest particle accelerator in the world, which has the aim of reproducing conditions similar to those produced during the Big Bang in order to study the origins of matter." (1/2 page)

The interpretation of the experimental data in superhigh energy cosmic rays requires the calculations using various models of elementary hadron interaction. One should prefer the models justified by accelerator data and giving definite predictions for superhigh energies. The model of quark-gluon pomeron strings (the QGPS models) satisfies this requirement.

Japan's Ministry of Education, Science and Culture (Monbusho), announced on May 10 that it would help to finance the construction of CERN*'s next particle accelerator, the Large Hadron Collider (LHC). This announcement follows the visit of a CERN delegation, led by Director-General Prof. Christopher Llewellyn Smith to Japan in March 1995.

QCD, the theory of the strong interaction, has a non-trivial vacuum structure. One way to characterize this structure is by means of non-vanishing matrix elements of quark or gluon operators, the condensates. In hot and/or dense enough strongly interacting media, QCD is subject to phase transitions or rapid crossovers. Consequently, the condensates typically change with density and temperature. An exciting aspect of hadron physics is the question how these changes affect the properties of hadrons which are put in a strongly interacting environment. Here vector mesons deserve special attention as they couple directly to (virtual) photons. The latter can decay into dileptons which leave the strongly interacting system untouched. Via that process information about possible in-medium modifications of the vector mesons can be carried to the detectors. With the ω-meson as an example I will review our current understanding of the connections between in-medium changes of condensates and the in-medium changes of the properties of hadrons. (author)

This White Paper presents the recommendations and scientific conclusions from the Town Meeting on QCD and Hadronic Physics that took place in the period 13-15 September 2014 at Temple University as part of the NSAC 2014 Long Range Planning process. The meeting was held in coordination with the Town Meeting on Phases of QCD and included a full day of joint plenary sessions of the two meetings. The goals of the meeting were to report and highlight progress in hadron physics in the seven years since the 2007 Long Range Plan (LRP07), and present a vision for the future by identifying the key questions and plausible paths to solutions which should define the next decade. The introductory summary details the recommendations and their supporting rationales, as determined at the Town Meeting on QCD and Hadron Physics, and the endorsements that were voted upon. The larger document is organized as follows. Section 2 highlights major progress since the 2007 LRP. It is followed, in Section 3, by a brief overview of the physics program planned for the immediate future. Finally, Section 4 provides an overview of the physics motivations and goals associated with the next QCD frontier: the Electron-Ion-Collider.

Hadron resonances can play a significant role in hot hadronic matter. Of particular interest for this workshop are the contributions of hyperon resonances. The question about how to quantify the effects of resonances is here addressed. In the framework of the hadron resonance gas, the chemically equilibrated case, relevant in the context of lattice QCD calculations, and the chemically frozen case relevant in heavy ion collisions are discussed.

In a simplified model we study the hadronization of quark matter in an expanding fireball, in particular the approach to a final hadronic composition in equilibrium. Ideal hadron gas equilibrium constrained by conservation laws, the fugacity parametrization, as well as linear and nonlinear coalescence approaches are recognized as different approximations to this in-medium quark fusion process. It is shown that colour confinement requires a dependence of the hadronization cross section on quark density in the presence of flow (dynamical confinement). (22 refs).

"Radiation-resistant optical fibre is being used by CERN, the European Laboratory for Particle Physics, in the world's largest particle accelerator, the Large Hadron Collider (LHC) near Geneva." (1 page)

A VLHC informal study group started to come together at Fermilab in the fall of 1995, and at the 1996 Snowmass Study the parameters of this machine took form. The VLHC as now conceived would be a 100 TeV hadron collider. It would use the Fermilab Main Injector (now nearing completion) to inject protons at 150 GeV into a new 3 TeV Booster and then into a superconducting pp collider ring producing 100 TeV c.m. interactions. A luminosity of ~10^34 cm^-2 s^-1 is planned. Our plans were presented to the Subpanel on the Planning for the Future of US High-Energy Physics (the successor to the Drell committee) and in February 1998 their report stated: "The Subpanel recommends an expanded program of R&D on cost reduction strategies, enabling technologies, and accelerator physics issues for a VLHC. These efforts should be coordinated across laboratory and university groups with the aim of identifying design concepts for an economically and technically viable facility." The coordination has been started with the inclusion of physicists from Brookhaven National Laboratory (BNL), Lawrence Berkeley National Laboratory (LBNL), and Cornell University. Clearly, this collaboration must be expanded internationally as well as nationally. The phrase "economically and technically viable facility" presents the real challenge.

... contained within the enclosure and cannot put any part of its body outside the enclosure in a way that could... access into the enclosure are secured with animal-proof devices that prevent accidental opening of the... metal having air spaces; and (10) The primary enclosure has a solid, leak-proof bottom, or a removable...

Electronic enclosure design and the internal arrangement of PCBs and components influence the microclimate inside the enclosure. This work features a general electronic unit with parallel PCBs. One of the PCBs is considered to have heat-generating components on it. The humidity and temperature profiles...... geometry of the device and related enclosure design parameters on the humidity and temperature profiles inside the electronic device enclosure....

This paper describes the research work in high energy physics by the group at the University of California, Riverside. Work has been divided between hadron collider physics, e+e- collider physics, and theoretical work. The hadron effort has been heavily involved in the startup activities of the D-Zero detector, commissioning and ongoing redesign. The lepton collider work has included work on TPC/2γ at PEP and the OPAL detector at LEP, as well as efforts on hadron machines

Some properties of the ground-state energy solutions for the hydrogen atom in a spherical enclosure are discussed. The application of many-point Padé approximants to this kind of system inside a box is also considered. (Author) [pt

The market growth of consumer electronics makes it essential for industries and policy-makers to work together to develop sustainable products. The objective of this study is to better understand how to promote environmentally sustainable consumer electronics by examining the use of various materials in laptop enclosures (excluding mounting hardware, internal components, and insulation) using screening-level life cycle assessment. The baseline material is a fossil plastic blend of polycarbonate-acrylonitrile butadiene styrene. Alternative materials include polylactic acid, bamboo, aluminum, and various combinations of these materials known to be currently used or being considered for use in laptops. The flame retardants considered in this study are bisphenol A bis(diphenyl phosphate), triphenyl phosphate, 9,10-dihydro-9-oxa-10-phosphaphenanthrene-10-oxide, and borax-boric acid-phosphorous acid. The Tool for the Reduction and Assessment of Chemical and other environmental Impacts v2.1 was used for the assessment of impacts related to climate change, human and ecological health, and resource use. The assessment demonstrates that plastics, relative to the other materials, are currently some of the better performing materials in terms of having the lowest potential environmental impact for a greater number of impact categories based on product life cycle models developed in this study. For fossil plastics, the material performance increases with increasing post-con

Electronic components and devices are exposed to a wide variety of climatic conditions, therefore the protection of electronic devices from humidity is becoming a critical factor in the system design. The ingress of moisture into typical electronic enclosures has been studied with defined paramet....... The moisture buildup inside the enclosure has been simulated using an equivalent RC circuit consisting of variables like controlled resistors and capacitors to describe the diffusivity, permeability, and storage in polymers....
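A minimal sketch of such an equivalent-circuit model (all parameter values hypothetical, not taken from the study): ambient humidity drives the enclosure interior through a diffusion "resistance" R while the interior air and polymers act as a moisture "capacitance" C, giving a first-order response with time constant RC.

```python
import math

R = 2.0e4      # diffusion "resistance" (illustrative value and units)
C = 5.0e-4     # moisture "capacitance" (illustrative value and units)
tau = R * C    # time constant of the lumped model, 10 s here

h_amb, h0 = 80.0, 30.0   # ambient and initial interior relative humidity, %
dt, t_end = 0.01, 50.0   # Euler step and simulated duration, s

h, t = h0, 0.0
while t < t_end:
    h += dt * (h_amb - h) / tau   # explicit Euler step of dh/dt = (h_amb - h)/RC
    t += dt

# closed-form solution of the same first-order ODE, for comparison
exact = h_amb + (h0 - h_amb) * math.exp(-t_end / tau)
print(f"Euler: {h:.2f}%  analytic: {exact:.2f}%")
```

After five time constants the interior humidity has almost equilibrated with ambient; the RC abstraction replaces the full diffusion problem with this single exponential, which is what makes it cheap enough to embed in larger circuit-style simulations.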

This test plan describes experimental details of an engineering-scale electrostatic enclosure demonstration to be performed at the Idaho National Engineering Laboratory in fiscal year (FY)-93. This demonstration will investigate, in the engineering scale, the feasibility of using electrostatic enclosures and devices to control the spread of contaminants during transuranic waste handling operations. Test objectives, detailed experimental procedures, and data quality objectives necessary to perform the FY-93 experiments are included in this plan

Four sessions on Hadron Dynamics were organized at this Workshop. The first topic, QCD Exclusive Reactions and Color Transparency, featured talks by Ralston, Heppelman and Strikman; the second, QCD and Inclusive Reactions had talks by Garvey, Speth and Kisslinger. The third dynamics session, Medium Modification of Elementary Interactions had contributions from Kopeliovich, Alves and Gyulassy; the fourth session Pre-QCD Dynamics and Scattering, had talks by Harris, Myhrer and Brown. An additional joint Spectroscopy/Dynamics session featured talks by Zumbro, Johnson and McClelland. These contributions are reviewed briefly in this summary. Two additional joint sessions between Dynamics and η physics are reviewed by the organizers of the Eta sessions. In such a brief review there is no way the authors can adequately summarize the details of the physics presented here. As a result, they concentrate only on brief impressionistic sketches of the physics topics discussed and their interrelations. They include no bibliography in this summary, but simply refer to the talks given in more detail in the Workshop proceedings. They focus on topics which were common to several presentations in these sessions. First, nuclear and particle descriptions of phenomena are now clearly converging, in both a qualitative and quantitative sense; they show several examples of this convergence. Second, an important issue in hadron dynamics is the extent to which elementary interactions are modified in nuclei at high energies and/or densities, and they illustrate some of these medium effects. Finally, they focus on those dynamical issues where hadron facilities can make an important, or even a unique, contribution to the knowledge of particle and nuclear physics

An overview is presented of the current phenomenological models of hadron structure, whose theoretical basis is Quantum Chromodynamics (QCD). A short introduction to QCD makes it possible to focus on the properties relevant to those models. Next, bag-like models (in particular, the MIT bag and its chiral extensions) and potential models, among them the non-relativistic model of Karl and Isgur and a semi-relativistic model, free of the Klein paradox, with an equal scalar-vector mixture of the confinement potential, are briefly reviewed. Emphasis is given to the baryons, treated basically as three-quark systems. (L.C.)

The results on resonance particle production (ρ⁰, ω, K*, ϕ, Σ*, and Λ*) measured by the STAR collaboration at RHIC in various colliding systems and at various energies are presented. The measured mass, width, 〈pT〉, and yield of these resonances are reviewed. No significant mass shifts or width broadening beyond the experimental uncertainties are observed. New measurements of ϕ and ω in leptonic decay channels are presented. The yields from leptonic decay channels are compared with the measurements from hadronic decay channels, and the two results are consistent with each other.

This series of lectures reviews the connections between QCD and string theories. It covers the phenomenological models leading to string pictures in non-perturbative QCD, and the string effects, related to soft-gluon coherence, which arise in perturbative QCD. One tries to build a string theory which reduces to QCD in the zero-slope limit. A specific model, based on superstring theories, is shown to agree with QCD four-point amplitudes in the Born approximation and with one-loop corrections. It is shown how this approach can provide a theoretical framework to account for the phenomenological property of parton-hadron duality. (author)

This paper investigates the sound absorption effect of microperforated panels (MPPs) in small-scale enclosures, an effort stemming from recent interest in using MPPs for noise control in compact mechanical systems. Two typical MPP backing cavity configurations (an empty backing cavity and a honeycomb backing structure) are studied. Although both configurations give basically the same sound absorption curves in standard impedance tube measurements, their in situ sound absorption properties, when placed inside a small enclosure, are drastically different. This phenomenon is explained using a simple system model based on modal analysis. It is shown that accurate prediction of the in situ sound absorption of the MPPs inside compact acoustic enclosures requires meticulous consideration of the configuration of the backing cavity and its coupling with the enclosure in front. The MPP structure should be treated as part of the entire system, rather than as an absorbing boundary characterized by a surface impedance calculated or measured in a simple acoustic environment. Considering the spatial matching between the acoustic fields across the MPP, the possibility of attenuating particular enclosure resonances by partially covering the enclosure wall with a properly designed MPP structure is also demonstrated.

A brief review of Monte Carlo event generators for simulating hadron-hadron collisions is presented. Particular emphasis is placed on comparisons of the approaches used to describe physics elements and identifying their relative merits and weaknesses. This review summarizes a more detailed report.

A p̄p collider is a synchrotron machine in which the particle and antiparticle beams are accelerated inside the same vacuum pipe (figure 1) using the same set of bending magnets and accelerating cavities (not shown). Thanks to their equal mass and opposite charge, the two beams go round in identical orbits on top of ...

Cooling of hadron beams (including heavy-ions) is a powerful technique by which accelerator facilities around the world achieve the necessary beam brightness for their physics research. In this paper, we will give an overview of the latest developments in hadron beam cooling, for which high energy electron cooling at Fermilab's Recycler ring and bunched beam stochastic cooling at Brookhaven National Laboratory's RHIC facility represent two recent major accomplishments. Novel ideas in the field will also be introduced.

This document provides a brief overview of the recently published report on the design of the Large Hadron Electron Collider (LHeC), which comprises its physics programme, accelerator physics, technology and main detector concepts. The LHeC exploits and develops challenging, though principally existing, accelerator and detector technologies. This summary is complemented by brief illustrations of some of the highlights of the physics programme, which relies on a vastly extended kinematic range, luminosity and unprecedented precision in deep inelastic scattering. Illustrations are provided regarding high-precision QCD, new physics (Higgs, SUSY) and electron-ion physics. The LHeC is designed to run synchronously with the LHC in the 2020s and to achieve an integrated luminosity of O(100) fb⁻¹. It will become the cleanest high-resolution microscope yet built and will substantially extend as well as complement the investigation of the physics of the TeV energy scale, which has been opened up by the LHC.

The Italian National Centre for Oncological Hadron-therapy is currently under construction in Pavia. It is designed for the treatment of deep-seated tumours (up to a depth of 27 cm of water equivalent) with proton and C-ion beams, as well as for both clinical and radio-biological research. The particles will be accelerated by a 7 MeV/u LINAC injector and a 400 MeV/u synchrotron. In the first phase of the project, three treatment rooms will be in operation, equipped with four fixed beams, three horizontal and one vertical. The accelerators are currently undergoing commissioning. The main radiation protection problems encountered (shielding, activation, etc.) are illustrated and discussed in relation to the constraints set by the Italian national authorities. (authors)
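The shielding assessments mentioned here are, to first order, line-of-sight attenuation estimates. The following sketch illustrates the idea only; the source term, attenuation length, and geometry are hypothetical placeholders, not values from the Pavia design (real assessments rely on Monte Carlo transport and Moyer-type models).

```python
import math

def dose_rate_behind_shield(h0, r, d, lam):
    """Point-source, line-of-sight shielding estimate.
    h0  -- source term: dose rate at 1 m with no shield
    r   -- distance from source to scoring point (m)
    d   -- shield thickness traversed along the line of sight (m)
    lam -- attenuation length of the shield material (m)
    The dose rate falls as 1/r^2 and exponentially inside the shield.
    """
    return h0 * math.exp(-d / lam) / r**2

# Hypothetical numbers: 2 m of concrete (lam ~ 0.5 m assumed), scored 4 m away.
rate = dose_rate_behind_shield(h0=1.0e3, r=4.0, d=2.0, lam=0.5)
```

Each additional attenuation length gains a factor e, which is why shield wall thickness dominates the civil-engineering cost of such centres.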

The COmmon Muon and Proton Apparatus for Structure and Spectroscopy (COMPASS) is a multi-purpose fixed-target experiment at the CERN Super Proton Synchrotron (SPS) aimed at studying the structure and spectrum of hadrons. In the naive Constituent Quark Model (CQM), mesons are bound states of quarks and antiquarks. QCD, however, predicts the existence of hadrons beyond the CQM with exotic properties, interpreted as excited glue (hybrids) or even pure gluonic bound states (glueballs). One main goal of COMPASS is to search for these states. Particularly interesting are so-called spin-exotic mesons, which have J^{PC} quantum numbers forbidden for ordinary qq̄ states. Its large acceptance, high resolution, and high-rate capability make the COMPASS experiment an excellent device for studying the spectrum of light-quark mesons in diffractive and central production reactions up to masses of about 2.5 GeV. COMPASS is able to measure final states with charged as well as neutral particles, so that resonances can be studied ...

QCD-motivated models for hadrons predict an assortment of "exotic" hadrons with structures more complex than the quark-antiquark mesons and three-quark baryons of the original quark-parton model. These include pentaquark baryons, the six-quark H-dibaryon, and tetraquark and glueball mesons. Despite extensive experimental searches, no unambiguous candidate for any of these exotic configurations has yet been identified. On the other hand, a number of meson states, one that seems to be a proton-antiproton bound state, and others that contain either charmed-anticharmed quark pairs or bottom-antibottom quark pairs, have recently been discovered that neither fit into the quark-antiquark meson picture nor match the expected properties of the QCD-inspired exotics. Here I briefly review results from a recent search for the H-dibaryon, and discuss some properties of the newly discovered states (the so-called XYZ mesons), comparing them with expectations for conventional quark-antiquark mesons and the predicted QCD-exotic states. (author)

This lecture deals with the relationship between accelerator technology in high-energy-physics laboratories and the development of superconductors. It concentrates on synchrotron magnets, showing how their special requirements have brought about significant advances in the technology, particularly the development of filamentary superconducting composites. Such developments have made large superconducting accelerators an actuality: the Tevatron in routine operation, the Hadron Electron Ring Accelerator (HERA) under construction, and the Superconducting Super Collider (SSC) and Large Hadron Collider (LHC) at the conceptual design stage. Other applications of superconductivity have also been facilitated - for example medical imaging and small accelerators for industrial and medical use. (orig.)

Parton recombination has been found to be an extremely useful model to understand hadron production at the Relativistic Heavy Ion Collider. It is particularly important to explore its connections with hard processes. This article reviews some of the aspects of the quark recombination model and places particular emphasis on hadron correlations.

A pressure vessel enclosure is described comprising a primary pressure vessel, a first pressure vessel containment assembly adapted to enclose said primary pressure vessel and be spaced apart therefrom, a first upper pressure vessel jacket adapted to enclose the upper half of said first pressure vessel containment assembly and be spaced apart therefrom, said upper pressure vessel jacket having an upper rim and a lower rim, each of said rims connected in a slidable relationship to the outer surface of said first pressure vessel containment assembly, means for connecting in a sealable relationship said upper rim of said first upper pressure vessel jacket to the outer surface of said first pressure vessel containment assembly, means for connecting in a sealable relationship said lower rim of said first upper pressure vessel jacket to the outer surface of said first pressure vessel containment assembly, a first lower pressure vessel jacket adapted to enclose the lower half of said first pressure vessel containment assembly and be spaced apart therefrom, said lower pressure vessel jacket having an upper rim connected in a slidable relationship to the outer surface of said first pressure vessel containment assembly, and means for connecting in a sealable relationship said upper rim of said first lower pressure vessel jacket to the outer surface of said first pressure vessel containment assembly, a second upper pressure vessel jacket adapted to enclose said first upper pressure vessel jacket and be spaced apart therefrom, said second upper pressure vessel jacket having an upper rim and a lower rim, each of said rims adapted to slidably engage the outer surface of said first upper pressure vessel jacket, means for sealing said rims, a second lower pressure vessel jacket adapted to enclose said first lower pressure vessel jacket and be spaced apart therefrom.

The objective of this Safety Report is to provide information to Member States to help ensure that a nuclear installation that will be or has been placed in a safe enclosure mode is maintained in a safe state until the final dismantling is performed and the facility or site released from regulatory control. This period of time may be referred to as the deferred dismantling, safe enclosure or long term storage period. During this safe enclosure period, control of the radioactive material and any other hazardous material must be maintained and the safety of the general public and the environment ensured. This Safety Report applies to the safe enclosure of large nuclear facilities such as nuclear power plants, research reactors, large research facilities, large manufacturing facilities and some fuel cycle facilities. Safe enclosure is not normally applicable to smaller industrial and medical installations owing to the small amount of radioactive material present and the nature of that material. This Safety Report would not normally be applicable to facilities that contain long lived radionuclides, as there is little benefit in placing them into safe enclosure. For these facilities, immediate dismantling is normally the preferred option. This publication describes the activities and concerns that are considered from the time when the initial decision is taken to defer dismantling activities, to the point when final dismantling commences or resumes. It is an expansion of the guidance provided in three IAEA Safety Guides. This Safety Report discusses methods that can be used to meet safety requirements, describes good practices and gives practical examples. The IAEA has published a Technical Report that provides technical details relating to the safe enclosure strategy

Following impressive results from early phase trials in Japan and Germany, there is a current expansion in European hadron therapy. This article summarises present European Union-funded projects for research and co-ordination of hadron therapy across Europe. Our primary focus will be on the research questions associated with carbon ion treatment of cancer, but these considerations are also applicable to treatments using proton beams and other light ions. The challenges inherent in this new form of radiotherapy require maximum interdisciplinary co-ordination. On the basis of its successful track record in particle and accelerator physics, the internationally funded CERN laboratories (otherwise known as the European Organisation for Nuclear Research) have been instrumental in promoting collaborations for research purposes in this area of radiation oncology. There will soon be increased opportunities for referral of patients across Europe for hadron therapy. Oncologists should be aware of these developments, whi...

JHF aims at promoting a variety of research fields using the various secondary beams produced by high-intensity proton beams. The JHF accelerator will be a complex of a 200 MeV LINAC, a 3 GeV booster proton synchrotron, and a 50 GeV proton synchrotron. Four main experimental facilities, the K-Arena, M-Arena, N-Arena, and E-Arena, are planned. The outline of the project is presented. (author)

This thesis focuses on a prototype of a highly granular hadronic calorimeter at the planned International Linear Collider optimized for the Particle Flow Approach. The 5.3 nuclear interaction lengths deep sandwich calorimeter was built by the CALICE collaboration and consists of 38 active plastic scintillator layers. Steel is used as absorber material and the active layers are subdivided into small tiles. In total 7608 tiles are read out individually via embedded Silicon Photomultipliers (SiPM). The prototype is one of the first large-scale applications of these novel and very promising miniature photodetectors. The work described in this thesis comprises the commissioning of the detector and the data acquisition with test beam particles over several months at CERN and Fermilab. The calibration of the calorimeter and the analysis of the recorded data are presented. A method to correct for the temperature-dependent response of the SiPM has been developed and implemented. Its successful application shows that it...
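A temperature correction of the kind described is often implemented as a linear gain correction back to a reference temperature. The sketch below assumes such a linear model; the relative slope per degree is a hypothetical placeholder, not the CALICE calibration value, which comes from measured gain-vs-temperature data.

```python
def correct_sipm_response(signal, temperature, t_ref=25.0, slope=-0.017):
    """Correct a SiPM signal measured at `temperature` (deg C) back to the
    reference temperature `t_ref`, assuming the relative response changes
    linearly with temperature at rate `slope` per degree (hypothetical
    value; in practice taken from gain calibration runs).
    """
    gain_factor = 1.0 + slope * (temperature - t_ref)
    return signal / gain_factor
```

For example, a signal recorded above the reference temperature is corrected upwards, since the SiPM gain drops as the device warms.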

The Large Hadron Collider (LHC) is the most complex instrument ever built for particle physics research. It will, for the first time, provide access to the TeV-energy scale. Numerous technological innovations are necessary to achieve this goal. For example, two counterrotating proton beams are guided and focused by superconducting magnets whose novel two-in-one structure saves cost and allowed the machine to be installed in an existing tunnel. The very high (>8-T) field in the dipoles can be achieved only by cooling them below the transition temperature of liquid helium to the superfluid state. More than 80 tons of superfluid helium are needed to cool the whole machine. So far, the LHC has behaved reliably and predictably. Single-bunch currents 30% above the design value have been achieved, and the luminosity has increased by five orders of magnitude. In this review, I briefly describe the design principles of the major systems and discuss some initial results.

Proton polarimetry at RHIC uses the interference of the electromagnetic (EM) and hadronic scattering amplitudes. The EM spin-flip amplitude for protons is responsible for the proton's anomalous magnetic moment, and is large. This generates a significant analyzing power for small-angle elastic scattering. RHIC polarimetry has reached a 5% uncertainty on the beam polarization, and seems capable of reducing this uncertainty further. Polarized neutron beams are also interesting for RHIC and for a polarized electron-polarized proton/ion collider in the future. In this case, deuterons, for example, have a very small anomalous magnetic moment, making the approach used for protons impractical, although it might be possible to use quasielastic scattering from the protons in the deuteron to monitor the polarization. ³He beams can provide polarized neutrons, and do have a large anomalous magnetic moment, making an approach similar to proton polarimetry possible.
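The analyzing power mentioned above is what converts a measured left-right counting asymmetry into a beam polarization, P = ε/A_N with ε = (N_L − N_R)/(N_L + N_R). A minimal sketch, with hypothetical counts and an assumed analyzing power:

```python
def beam_polarization(n_left, n_right, analyzing_power):
    """Beam polarization from a left-right counting asymmetry:
    P = epsilon / A_N, where epsilon = (N_L - N_R) / (N_L + N_R).
    """
    epsilon = (n_left - n_right) / (n_left + n_right)
    return epsilon / analyzing_power

# Hypothetical scaler counts and analyzing power (illustrative only):
p = beam_polarization(1_020_000, 980_000, analyzing_power=0.04)  # P ≈ 0.5
```

Since A_N for small-angle elastic scattering is only a few percent, very large event samples are needed to keep the statistical error on P small, which is why polarimeters run at high rate.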

The construction of the Large Hadron Collider (LHC) has been a massive endeavour spanning almost 30 years from conception to commissioning. Building the machine with the highest possible energy (7 TeV) in the existing large electron–positron (LEP) collider tunnel of 27 km circumference and with a tunnel diameter of only 3.8 m has required considerable innovation. The first was the development of a two-in-one magnet, where the two rings are integrated into a single magnetic structure. This compact two-in-one structure was essential for the LHC owing to the limited space available in the existing LEP collider tunnel and the cost. The second was a bold move to the use of superfluid helium cooling on a massive scale, which was imposed by the need to achieve a high (8.3 T) magnetic field using an affordable Nb-Ti superconductor.

Over these two years, our group has worked in hadronic physics at Saturne and CEBAF using the polarimeter POLDER. Tensor polarization observables have been measured in the reaction H(p⃗, d⃗)π⁺ between 580 and 1300 MeV proton energy. The group has also led an experiment performed in 1997 at CEBAF. By measuring the t₂₀ polarization of the recoil deuteron produced in elastic electron-deuteron scattering at large Q², the separation of the charge and quadrupole form factors of the deuteron will be performed for Q = 4.1-6.8 fm⁻¹. Finally, we were involved in the construction and testing of the neutron polarimeter HARP and in the definition of the physics programme of the ELFE project. (authors)

This article describes the status of major new accelerator projects and prospects as of mid-1988. It looks at hadron colliders and electron-positron colliders. The author looks both at the technology of the machines, and how it will have to be developed for future devices, and at the effort required to extract the important physics information from the resulting reaction cascades which are expected to come out of these devices.

Triangular enclosures are typical configurations of attic spaces found in residential as well as industrial pitched-roof buildings. Natural convection in triangular rooftops has received considerable attention over the years, mainly on right-angled and isosceles enclosures. In this paper, a finite volume CFD package is employed to study the laminar air flow and temperature distribution in asymmetric rooftop-shaped triangular enclosures when heated isothermally from the base wall, for aspect ratios (AR) 0.2 ≤ AR ≤ 1.0, and Rayleigh number (Ra) values 8 × 10⁵ ≤ Ra ≤ 5 × 10⁷. The effects of Rayleigh number and pitch angle on the flow structure and temperature distributions within the enclosure are analysed. Results indicate that, at low pitch angle, the heat transfer between the cold inclined and the hot base walls is very high, resulting in a multi-cellular flow structure. As the pitch angle increases, however, the number of cells reduces, and the total heat transfer rate progressively reduces, even if the Rayleigh number, being based on the enclosure height, rapidly increases. Physical reasons for the above effect are inspected.
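As the abstract notes, the Rayleigh number here is based on the enclosure height, which is why it grows rapidly (as the cube of the height) as the pitch angle increases. A minimal sketch of the definition Ra = gβΔT H³/(να), with indicative air properties (assumed values, not the paper's):

```python
def rayleigh_number(delta_t, height, t_film=300.0,
                    nu=1.6e-5, alpha=2.2e-5, g=9.81):
    """Rayleigh number Ra = g * beta * dT * H^3 / (nu * alpha), with
    beta = 1 / T_film for an ideal gas. Property values for air near
    room temperature are indicative only:
    nu    -- kinematic viscosity (m^2/s)
    alpha -- thermal diffusivity (m^2/s)
    """
    beta = 1.0 / t_film
    return g * beta * delta_t * height**3 / (nu * alpha)
```

Doubling the enclosure height multiplies Ra by eight, so a modest geometric change moves the flow well across the regimes studied here.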

Ventilation is frequently used as a means of preventing the build-up of flammable or toxic gases in enclosed spaces. The effectiveness of the ventilation often has to be considered as part of a safety case or risk assessment. In this paper methods for assessing ventilation effectiveness for hazardous area classification are examined. The analysis uses data produced from Computational Fluid Dynamics (CFD) simulations of low-pressure jet releases of flammable gas in a ventilated enclosure. The CFD model is validated against experimental measurements of gas releases in a ventilation-controlled test chamber. Good agreement is found between the model predictions and the experimental data. Analysis of the CFD results shows that the flammable gas cloud volume resulting from a leak is largely dependent on the mass release rate of flammable gas and the ventilation rate of the enclosure. The effectiveness of the ventilation in preventing the build-up of flammable gas can therefore be assessed by considering the average gas concentration at the enclosure outlet(s). It is found that the ventilation rate of the enclosure provides a more useful measure of ventilation effectiveness than the enclosure air change rate.
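The outlet-concentration criterion described above can be sketched with a fully-mixed dilution estimate. This is an illustration only: the leak rate, gas density, and the fraction-of-LFL acceptance margin below are illustrative assumptions, not values from the paper.

```python
def mean_outlet_fraction(mass_release_rate, gas_density, vent_flow_rate):
    """Mean volume fraction of flammable gas at the enclosure outlet,
    assuming the release is fully mixed into the ventilation flow:
    Q_gas = m_dot / rho_gas, fraction = Q_gas / (Q_gas + Q_vent).
    mass_release_rate -- leak rate (kg/s)
    gas_density       -- density of the released gas (kg/m^3)
    vent_flow_rate    -- ventilation volumetric flow rate (m^3/s)
    """
    q_gas = mass_release_rate / gas_density
    return q_gas / (q_gas + vent_flow_rate)

# Hypothetical methane leak: 1 g/s, rho ~ 0.67 kg/m^3, 0.5 m^3/s ventilation.
frac = mean_outlet_fraction(1e-3, 0.67, 0.5)
lfl = 0.044  # methane lower flammable limit, about 4.4 % by volume
adequately_ventilated = frac < 0.25 * lfl  # illustrative safety margin
```

The real assessment in the paper uses CFD to capture local accumulation that a fully-mixed estimate like this cannot see, which is precisely why the outlet average is proposed only as a screening measure.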

This report takes a general view of experiments at hadron colliders with energies of several tens of TeV. Intensive studies have been carried out of the physics and the detectors for such hadron machines. The experimental prospects for hadron colliders based on these studies, together with the views of the author, are presented. To obtain fundamental knowledge about experiments at hadron colliders, the general properties of hadron scattering should be investigated. First, the total cross sections and charged-particle multiplicity are estimated, and hard scattering processes are reviewed. The cross sections for some interesting hard scattering processes are summarized. The most serious problem for experiments at hadron colliders is to pick out useful signals from the enormous QCD background processes, and the possibility of finding heavy Higgs bosons is discussed in detail as an example. On the basis of these studies, the requirements which general-purpose detectors should satisfy are considered. The important machine parameters from the experimental viewpoint are also discussed. High-energy hadron colliders have the potential to reveal new physics in the TeV region, but preparation for unexpected physics is necessary. (Kako, I.)

This brochure illustrates the incredible journey of a proton as he winds his way through the CERN accelerator chain and ends up inside the Large Hadron Collider (LHC). The LHC is CERN's flagship particle accelerator which can collide protons together at close to the speed of light, creating circumstances like those just seconds after the Big Bang.

We discuss the physics opportunities and detector challenges at future hadron colliders. As guidelines for energies and luminosities we use the proposed luminosity and/or energy upgrade of the LHC (SLHC), and the Fermilab design of a Very Large Hadron Collider (VLHC). We illustrate the physics capabilities of future hadron colliders for a variety of new physics scenarios (supersymmetry, strong electroweak symmetry breaking, new gauge bosons, compositeness and extra dimensions). We also investigate the prospects of doing precision Higgs physics studies at such a machine, and list selected Standard Model physics rates.

This paper describes the research work in high energy physics by the group at the University of California, Riverside. Work has been divided between hadron collider physics, e⁺e⁻ collider physics, and theoretical work. The hadron effort has been heavily involved in the startup activities of the D-Zero detector, its commissioning and ongoing redesign. The lepton collider work has included work on TPC/2γ at PEP and the OPAL detector at LEP, as well as efforts on hadron machines.

The spin-flip amplitudes of meson-nucleon and nucleon-nucleon scattering are calculated in the framework of a dynamical model taking into account the interactions at large distances. Consideration of the strong form factors at the corresponding vertices and of preasymptotic contributions allowed us to describe correctly the differential cross sections and spin effects of hadron-hadron scattering at high energies. On this basis, predictions at high and superhigh energies are made. (orig.)

An event generator to simulate ultrarelativistic hadron-hadron collisions is proposed. It is based on the following main assumptions: the process can be divided into two independent steps, string formation and string fragmentation; strings are formed as a consequence of color exchange between a quark of the projectile and a quark of the target; and the fragmentation of strings is the same as in e⁺e⁻ annihilation or in lepton-nucleon scattering. 11 refs., 4 figs.
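The two-step factorization (string formation, then independent fragmentation) can be caricatured in a few lines. The toy below merely splits a string's energy into hadrons at random; it stands in for a real fragmentation function, which the generator described above takes from e⁺e⁻ and lepton-nucleon data. All numbers are illustrative.

```python
import random

def fragment_string(string_energy, m_hadron=0.14):
    """Toy fragmentation step: peel hadrons of mass m_hadron (GeV, pion-like)
    off the string, each taking a random fraction of the remaining energy,
    until too little energy is left. Purely illustrative; not the measured
    fragmentation function a real generator would use.
    """
    hadrons = []
    e = string_energy
    while e > 2 * m_hadron:
        z = random.uniform(0.1, 0.9)            # fraction taken by this hadron
        hadrons.append(max(z * e, m_hadron))    # never below the hadron mass
        e -= hadrons[-1]
    return hadrons
```

The point of the factorization is visible even in the toy: the collision step only needs to supply string energies, and the fragmentation step knows nothing about how the string was formed.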

One of the dangerous parameters is high humidity of the air. Moisture can inevitably reach the electronics, either by diffusion through the wall of an enclosure or through the small holes designed for electrical or other connections. A driving force for humid air movement is the temperature difference between the operating electronics and the surrounding environment. This temperature difference gives rise to natural convection, which we also refer to as breathing. Robust and intelligent enclosure designs must account for this breathing, as it can significantly change the humidity distribution in the enclosure. The approach is verified by measuring the temperature and humidity profiles in a test setup (container) while also considering the moisture flux outside the container. The test setup is a vertical cylinder, which allows the modelling to be simplified to a 2D case. The experimental measurements are compared to simulations...
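The breathing effect described here is often approximated by a first-order lumped model, in which the interior humidity relaxes exponentially toward the ambient value with a single time constant that lumps wall diffusion and flow through small openings. A sketch under that assumption (the time constant is a hypothetical fitted parameter):

```python
import math

def interior_humidity(t, c_outside, c_initial, tau):
    """First-order lumped model of moisture exchange ('breathing'):
    dC/dt = (C_out - C) / tau, so the interior absolute humidity C
    approaches the outside value exponentially.
    t         -- time (s)
    c_outside -- ambient absolute humidity (g/m^3)
    c_initial -- interior humidity at t = 0 (g/m^3)
    tau       -- lumped exchange time constant (s)
    """
    return c_outside + (c_initial - c_outside) * math.exp(-t / tau)
```

A tightly sealed enclosure simply has a large tau: it still equilibrates with the environment, only more slowly, which is the practical motivation for studying the breathing rate.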

In many practical systems, acoustic radiation control on noise sources contained within a finite volume by an acoustic enclosure is of great importance, but difficult to accomplish at low frequencies due to the enhanced acoustic-structure interaction. In this work, we propose to use acoustic metamaterials as the enclosure to efficiently reduce sound radiation at their negative-mass frequencies. Based on a circularly-shaped metamaterial model, sound radiation properties of either central or eccentric sources are analyzed by numerical simulations for structured metamaterials. The parametric analyses demonstrate that the barrier thickness, the cavity size, the source type, and the eccentricity of the source have a profound effect on the sound reduction. It is found that increasing the thickness of the metamaterial barrier is an efficient approach to achieve large sound reduction over the negative-mass frequencies. These results are helpful in designing highly efficient acoustic enclosures for the blockage of sound at low frequencies.

Expressions for the shielding effectiveness of a conductive spherical enclosure excited by a Hertzian dipole have been derived using the dyadic Green's function technique. This technique has the advantage that the fields inside or outside the enclosure due to arbitrary current distribution may be found by employing the same set of dyadic Green's functions. The shielding effectiveness for plane wave incidence has been determined by considering the limiting case of the current source external to the spherical shell. Computed values of shielding effectiveness deduced in this manner have been compared with those obtained by the numerical evaluation of the expressions derived by earlier authors. The theory presented here may be useful to EMC (electromagnetic compatibility) engineers who must consider electromagnetic coupling from current sources in the vicinity of shielding enclosures.
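The shielding effectiveness used as a figure of merit in such studies compares the field at a point with and without the enclosure present. A minimal sketch of the dB definition (the field values below are placeholders, not computed from the dyadic Green's functions):

```python
import math

def shielding_effectiveness_db(e_unshielded, e_shielded):
    """Electric shielding effectiveness in decibels:
    SE = 20 * log10(|E_without_shield| / |E_with_shield|),
    both fields evaluated at the same observation point.
    """
    return 20.0 * math.log10(abs(e_unshielded) / abs(e_shielded))

# Placeholder example: the shield reduces the field by a factor 1000.
se = shielding_effectiveness_db(1000.0, 1.0)  # 60 dB
```

Because the same dyadic Green's functions give the fields for sources inside or outside the shell, this single figure can be evaluated for either coupling direction.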

Electronic systems are sometimes exposed to harsh environmental conditions of temperature and humidity. Moisture transfer into electronic enclosures and condensation can cause several problems, such as corrosion and altered thermal stresses. It is therefore essential to study the local climate inside the enclosures to be able to protect the electronic systems. In this work, moisture transfer into a typical electronic enclosure is numerically studied using CFD. In order to reduce the CPU time and pave the way for a subsequent factorial design analysis, a simplifying modification is applied in which the real 3D geometry is approximated by a 2D axisymmetric one. The results of the 2D and 3D models were compared in order to calibrate the 2D representation. Furthermore, simulation results were compared with experimental data and good agreement was found.

A hadron radiation installation adapted to subject a target to irradiation by a hadron radiation beam, said installation comprising: - a target support configured to support, preferably immobilize, a target; - a hadron radiation apparatus adapted to emit a hadron radiation beam along a beam axis to

In order to predict the immunity of a generic missile (GENEC), not only the electronic system but also the enclosure has to be taken into consideration. While a completely closed metallic missile enclosure shows a high electric shielding effectiveness, it is substantially decreased by apertures that cannot be avoided, for various reasons. The shielding effectiveness of the generic missile can be investigated by means of a hollow cylinder equipped with different apertures. Numerical simulations and measurements of this hollow cylinder will be carried out and analyzed.

The Large Hadron Collider (LHC) Phase II upgrade aims to increase the accelerator luminosity by a factor of 5-10. Due to the expected higher radiation levels and the aging of the current electronics, a new readout system for the ATLAS experiment's hadronic calorimeter (TileCal) is needed. A prototype of the upgraded TileCal electronics has been tested using the beam from the Super Proton Synchrotron (SPS) accelerator at CERN. Data were collected with beams of muons, electrons and hadrons at various incident energies and impact angles. The muon data allow a study of the dependence of the response on the incident point and angle in the cell. The electron data are used to determine the linearity of the electron energy measurement. The hadron data will allow tuning the modelling of the calorimeter response to pions and kaons, improving the reconstruction of jet energies. The results of the ongoing data analysis are discussed in the presentation.

With the possibility of 'exact' calculations within the framework of a fundamental theory, QCD, the role of models in strong interaction physics is changing radically. The relevance of detailed numerical model studies is diminishing with the development of those exact, numerical approaches to QCD. On the other hand, the insight gained from such purely numerical studies is necessarily limited and must be complemented by the more qualitative but also more intuitive insight gained from model studies. In particular, the subject of hadron-hadron interactions requires model studies to relate the wide variety of strong interaction phenomena to the fundamental properties of strong interaction physics. The author reports on such model studies of the hadron-hadron interaction.

The opportunities to study B physics in a hadron collider are discussed. Emphasis is placed on the technological developments necessary for these experiments. The R and D program of the Bottom Collider Detector group is reviewed. (author)

It is shown that there are resonances with high mass and long lifetimes, at the very least longer than the 10⁻²³ second transit time across a hadron. The theoretical and then the experimental approaches to this problem are described.

The CMS hadron calorimeter is a sampling calorimeter with brass absorber and plastic scintillator tiles, with wavelength-shifting fibres carrying the light to the readout device. The barrel hadron calorimeter is complemented by an outer calorimeter that ensures high-energy shower containment in CMS, thus working as a tail catcher. Fabrication, testing and calibration of the outer hadron calorimeter are carried out keeping in mind its importance for the linearity and resolution of jet energy measurements. It will provide a net improvement in missing $E_T$ measurements at LHC energies. The outer hadron calorimeter has a very good signal-to-background ratio even for a minimum ionising particle and can hence be used in coincidence with the Resistive Plate Chambers of the CMS detector for the muon trigger.

For the ZEUS detector, which is presently under construction, a prototype of the forward calorimeter was built and its performance was investigated. All measurements were performed with the test beams of the PS and SPS accelerators at CERN in Geneva. Hadrons, electrons and muons were available in an energy range from 0.5 to 100 GeV. The calorimeter prototype consists of 16 towers with 20 cm × 20 cm cross-section. Each tower is longitudinally segmented into an electromagnetic section of 25 cm length and two hadronic sections, each of 63 cm length. All towers are built from plates of depleted uranium of 3.3 mm thickness and scintillator plates of 2.6 mm thickness. In previous measurements these values were determined to be the best choice for obtaining the best energy resolution. The light readout was performed by wavelength shifters (PMMA doped with Y-7) and photomultiplier tubes. Some important results of these measurements are: The design energy resolution of 35%/√E+2% for hadrons and 18%/√E+2% for electrons has been achieved. The e/h ratio was determined to be 1.00±0.01. The linearity of the calorimeter was found to be better than 2% in the investigated energy range. The nonuniformity of the wavelength shifter was improved to better than 2% by reflectors with absorbing patterns printed on them. The deviations from uniformity at the boundaries between calorimeter modules were reduced to 7% by the insertion of lead sheets in the gap. A big advantage was the possibility of using the natural radioactivity of the uranium for calibration purposes. (orig.)
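Resolutions like 35%/√E+2% combine a stochastic term, which shrinks with energy, and a constant term, which eventually dominates. A minimal sketch of how the fractional resolution behaves (assuming the terms add in quadrature, a common convention that the abstract itself does not specify):

```python
import math

def frac_resolution(energy_gev, stochastic=0.35, constant=0.02):
    """Fractional energy resolution sigma/E of a sampling calorimeter.

    Quadrature combination sigma/E = sqrt((a/sqrt(E))^2 + b^2), with the
    hadronic prototype terms (35%/sqrt(E), 2%) as defaults.
    """
    return math.hypot(stochastic / math.sqrt(energy_gev), constant)

# The stochastic term dominates at low energy, the constant term at high.
for e in (1.0, 10.0, 100.0):
    print(f"E = {e:6.1f} GeV  sigma/E = {frac_resolution(e):.3f}")
```

At 1 GeV the resolution is essentially the 35% stochastic term; by 100 GeV it has fallen to a few percent, where the constant term matters.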

In the upcoming months the KLOE-2 data-taking campaign will start at the upgraded DAFNE phi-factory of the INFN Laboratori Nazionali di Frascati. The main goal is to collect an integrated luminosity of about 20 fb^(-1) in 3-4 years in order to refine and extend the KLOE program on both kaon physics and hadron spectroscopy. Here the expected improvements on the results of hadron spectroscopy are presented and briefly discussed.

Successful cancer patient survival and local tumor control from hadron radiotherapy warrant a discussion of potential secondary late effects from the radiation. The study of late-appearing clinical effects from particle beams of protons, carbon, or heavier ions is a relatively new field with few data. However, new clinical information is available from pioneering hadron radiotherapy programs in the USA, Japan, Germany and Switzerland. This paper will review available data on late tissue effects from particle radiation exposures and discuss their importance to the future of hadron therapy. Potential late radiation effects are associated with irradiated normal tissue volumes at risk, which in many cases can be reduced with hadron therapy. However, normal tissues present within hadron treatment volumes can demonstrate enhanced responses compared to conventional modes of therapy. Late endpoints of concern include the induction of secondary cancers, cataract, fibrosis, neurodegeneration, vascular damage, and immunological, endocrine and hereditary effects. Low-dose tissue effects at tumor margins need further study, and more studies of the molecular mechanisms underlying the late effects of hadron therapy are needed.

Since the beginning of 2007, HCAL has made significant progress in the installation and commissioning of both hardware and software. A large fraction of the physical Hadron Calorimeter modules have been installed in UX5. In fact, the only missing pieces are HE- and part of HO. The HB+/- were installed in the cryostat in March. The HB scintillator layer-17 was installed above ground before the HB were lowered. The HB- scintillator layer-0 was installed immediately after completion of the EB- installation.

HF/HCAL Commissioning

The commissioning and checkout of the HCAL readout electronics is also proceeding at a rapid pace in Bldg. 904 and USC55. All sixteen crates of HCAL VME readout electronics have been commissioned and certified for service. Fifteen are currently operating on the S2 level of USC55. The last crate is being used for firmware development in the Electronics Integration Facility in 904. All installed crates are interfaced to their VME computers and receive synchronous control from the fully-equipp...

The components that contribute to the signal of a hadron calorimeter and the factors that affect its performance are discussed, concentrating on two aspects: energy resolution and signal linearity. Both are decisively dependent on the relative response to the electromagnetic and the non-electromagnetic shower components, the e/h signal ratio, which should be equal to 1.0 for optimal performance. The factors that determine the value of this ratio are examined. The calorimeter performance is crucially determined by its response to the abundantly present soft neutrons in the shower. The presence of a considerable fraction of hydrogen atoms in the active medium is essential for achieving the best possible results. Firstly, this allows one to tune e/h to the desired value by choosing the appropriate sampling fraction. And secondly, the efficient neutron detection via recoil protons in the readout medium itself considerably reduces the effect of fluctuations in binding energy losses at the nuclear level, which dominate the intrinsic energy resolution. Signal equalization, or compensation (e/h = 1.0), does not seem to be a property unique to ²³⁸U, but can also be achieved with lead and probably even iron absorbers. 21 refs.; 19 figs
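The link between e/h and signal linearity can be made concrete: the pion-to-electron signal ratio is π/e = f_em + (1 − f_em)/(e/h), where f_em is the electromagnetic shower fraction, which grows with energy. A sketch, using the rough parameterization f_em ≈ 0.11 ln(E/GeV) (an illustrative approximation, not taken from the abstract):

```python
import math

def pion_response_ratio(e_over_h, energy_gev):
    """Pion-to-electron signal ratio pi/e for a given e/h value.

    pi/e = f_em + (1 - f_em)/(e/h), with f_em ~ 0.11*ln(E/GeV) as a crude
    stand-in for the energy-dependent electromagnetic shower fraction.
    """
    f_em = min(0.11 * math.log(energy_gev), 1.0)
    return f_em + (1.0 - f_em) / e_over_h

# With e/h = 1.0 (compensation) the ratio is 1 at every energy, i.e. a
# linear signal; with e/h = 1.4 the ratio drifts upward with energy.
for e in (10.0, 100.0):
    print(e, pion_response_ratio(1.0, e), pion_response_ratio(1.4, e))
```

The drift of π/e with energy for e/h ≠ 1 is exactly the hadronic non-linearity the abstract refers to, and the event-by-event spread of f_em feeds the resolution.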

The use of accelerators in cancer therapy has a long and successful history. Present-day developments are directed towards high-precision irradiation of tumours with hadron beams. Firstly, hadron beams localise the irradiation in a low-dose, entry corridor with a high-dose volume at the precise depth of the Bragg peak that is set by the beam energy. Secondly, the heavier the ion the lower the defocusing effect of the multiple scattering in the patient's body, but other considerations such as nuclear fragmentation and the physics of the energy transfer lead to an optimum in the mass range around carbon. The ability to deliver the irradiation to small volumes with millimetre precision opens the way to treating complex-shaped tumours that are in close proximity to vital organs. The creation of the well-focused ion beams at variable, but precise, energies over spill times of a few seconds is best done with a synchrotron using slow extraction. However, slow extraction is notoriously difficult to stabilise and it i...

MINOS, the Main Injector Neutrino Oscillation Search, will study neutrino flavor transformations using a Near detector at the Fermi National Accelerator Laboratory and a Far detector located in the Soudan Underground Laboratory in northern Minnesota. The MINOS collaboration also constructed the CalDet (calibration detector), a smaller version of the Near and Far detectors, to determine the topological and signal response to hadrons, electrons and muons. The detector was exposed to test beams in the CERN Proton Synchrotron East Hall during 2001–2003, where it collected events at momentum settings between 200 MeV/c and 10 GeV/c. In this dissertation we present results of the CalDet experiment, focusing on the topological and signal response to hadrons. We briefly describe the MINOS experiment and its iron-scintillator tracking-sampling calorimeters as a motivation for the CalDet experiment. We discuss the operation of the CalDet in the beamlines as well as the trigger and particle identification s...

Shielding effectiveness (SE) is an important measure of how well an enclosure reduces the electromagnetic (EM) field incident upon it. Commonly, when the shielding effectiveness of an enclosure is stated it is for the case when the enclosure is empty. Including contents such as printed circuit boards (PCBs) in the enclosure will affect the shielding effectiveness as the PCB absorbs EM energy. One technique of determining how much energy a PCB absorbs is to measure its absorption cross section...

The subject of this presentation was intended to cover the history of hadron colliders. However, this broad topic is probably better left to historians. I will cover a much smaller portion of this subject and specialize my subject to the history of the Tevatron. As we will see, the Tevatron project is tightly entwined with the progress in collider technology. It occupies a unique place among accelerators in that it was the first to make use of superconducting magnets, and indeed its basic design now forms a template for all machines using this technology. It was spawned in an incredibly productive era when new ideas were being generated almost monthly, and it has matured into our highest-energy collider, complete with two large detectors that provide the major facility in the US for probing high-pT physics for the coming decade.

The Large Hadron Collider (LHC), the particle accelerator operating at CERN, is probably the most complex and ambitious scientific project ever accomplished by humanity. The sheer size of the enterprise, in terms of financial and human resources, naturally raises the question whether society should support such costly basic-research programs. I address this question here by first reviewing the process that led to the emergence of Big Science and the role of large projects in the development of science and technology. I then compare the methodologies of Small and Big Science, emphasizing their mutual linkage. Finally, after examining the cost of Big Science projects, I highlight several general aspects of their beneficial implications for society.

A feedback microprocessor has been built for the TEVATRON. It has been constructed to be applicable to hadron colliders in general. Its inputs are realtime accelerator measurements, data describing the state of the TEVATRON, and ramp tables. The microprocessor software includes a finite state machine. Each state corresponds to a specific TEVATRON operation and has a state-specific TEVATRON model. Transitions between states are initiated by the global TEVATRON clock. Each state includes a cyclic routine which is called periodically and where all calculations are performed. The output corrections are inserted onto a fast TEVATRON-wide link from which the power supplies will read the realtime corrections. We also store all of the input data and output corrections in a set of buffers which can easily be retrieved for diagnostic analysis. In this paper we will describe this device and its use to control the TEVATRON tunes as well as other possible applications
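The architecture described above, clock-driven state transitions plus a per-state cyclic routine and diagnostic buffers, is a classic finite-state-machine pattern. A minimal sketch (state names and the correction logic are purely illustrative, not the actual TEVATRON ramp tables or models):

```python
# Sketch of the state-machine pattern the abstract describes: global clock
# events trigger transitions; each state has its own cyclic routine that
# turns a measurement into a correction; everything is buffered for
# diagnostics. All numeric models here are invented for illustration.

class State:
    def __init__(self, name, cyclic):
        self.name = name
        self.cyclic = cyclic       # called periodically while in this state
        self.transitions = {}      # clock event -> next state

class FeedbackFSM:
    def __init__(self, initial):
        self.state = initial
        self.log = []              # buffered (state, input, output) records

    def on_clock_event(self, event):
        # Transitions are initiated by the global clock; unknown events
        # leave the machine in its current state.
        self.state = self.state.transitions.get(event, self.state)

    def cycle(self, measurement):
        correction = self.state.cyclic(measurement)
        self.log.append((self.state.name, measurement, correction))
        return correction

injection = State("injection", lambda m: -0.5 * m)   # strong damping model
ramp      = State("ramp",      lambda m: -0.1 * m)   # gentler model
injection.transitions["START_RAMP"] = ramp

fsm = FeedbackFSM(injection)
print(fsm.cycle(0.2))            # correction computed in the injection state
fsm.on_clock_event("START_RAMP")
print(fsm.state.name)            # now in the ramp state
```

The real device would read measurements from the accelerator, write corrections onto the fast TEVATRON-wide link, and keep the `log` buffers for post-hoc analysis.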

Corrosion reliability of electronic products is a key factor for the electronics industry, and today there is a large demand for reliable performance over a wide range of temperature and humidity during day- and night-time periods. Corrosion failures are still a challenge due to the combined effects of temperature, humidity and corrosion-accelerating species in the atmosphere. Moreover, the surface region of printed circuit board assemblies is often contaminated by various aggressive chemical species. This study describes the overall effect of exposure to severe climate conditions and internal heat cycles on the humidity and temperature profile inside typical electronic enclosures. Defined parameters include external temperature and humidity conditions, temperature and time of the internal heat cycle, thermal mass, and port/opening size. The effect of the internal humidity on electronic reliability has been…

The field strength distribution in real reverberant enclosures, such as plane fuselages or factory interiors, is often too complex for a deterministic approach but can also fall into a category where theoretical assumptions normal for perfect cavities no longer hold. It can vary a lot depending on

The cichlid fish, Sargochromis codringtoni has been suggested as an alternative to the use of molluscicides for controlling schistosome intermediate host snails. To determine the effect of the fish on Bulinus globosus and Biomphalaria pfeifferi populations in irrigation canals, an enclosure experiment was carried out.

The invention relates to a sealed enclosure of the glove-box type. According to the invention, the box frame comprises: angle bars having a right-angled cross-section, sealing joints, tightening bars and fastening means.

Natural convective fluid flow and heat transfer in rectangular enclosures bounded by three adiabatic walls and one thermally active and differentially heated vertical side were predicted by using the finite difference method. The effects of different temperature functions, aspect ratio and Rayleigh numbers on the natural ...

We present a Monte Carlo calculation of the micro-canonical ensemble of the ideal hadron-resonance gas including all known states up to a mass of about 1.8 GeV and full quantum statistics. The micro-canonical average multiplicities of the various hadron species are found to converge to the canonical ones for moderately low values of the total energy, around 8 GeV, thus bearing out previous analyses of hadronic multiplicities in the canonical ensemble. The main numerical computing method is an importance sampling Monte Carlo algorithm using the product of Poisson distributions to generate multi-hadronic channels. It is shown that the use of this multi-Poisson distribution allows for an efficient and fast computation of averages, which can be further improved in the limit of very large clusters. We have also studied the fitness of a previously proposed computing method, based on the Metropolis Monte Carlo algorithm, for event generation in the statistical hadronization model. We find that the use of the multi-Poisson distribution as proposal matrix dramatically improves the computation performance. However, due to the correlation of subsequent samples, this method proves to be generally less robust and effective than the importance sampling method. (orig.)
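The core computational idea, sampling channel multiplicities from a product of Poisson distributions and reweighting toward the constrained target, can be sketched in a few lines. Everything below is a toy: the Poisson means and the sharp energy cut stand in for the paper's actual microcanonical phase-space weights, which are far more involved:

```python
import math
import random

random.seed(1)

# Illustrative species: (name, mass in GeV, Poisson mean of the proposal).
# The masses are realistic; the means and energy constraint are invented.
SPECIES = [("pion", 0.14, 3.0), ("kaon", 0.49, 1.0), ("proton", 0.94, 0.5)]
E_TOT, E_WINDOW = 2.0, 0.5   # toy stand-in for energy conservation

def poisson(mean):
    # Knuth's method; adequate for the small means used here.
    limit, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def sample_event():
    # Proposal: independent Poisson multiplicities, one per species.
    return [poisson(nu) for (_, _, nu) in SPECIES]

def weight(ns):
    # Importance weight = target / proposal. Choosing the target as the
    # proposal times a sharp energy cut reduces the weight to an indicator;
    # the real microcanonical weight is the channel's phase-space density.
    energy = sum(n * m for n, (_, m, _) in zip(ns, SPECIES))
    return 1.0 if abs(energy - E_TOT) < E_WINDOW else 0.0

events = [sample_event() for _ in range(20000)]
ws = [weight(ns) for ns in events]
avg_pions = sum(w * ns[0] for w, ns in zip(ws, events)) / sum(ws)
print(f"<N_pion> under the energy constraint: {avg_pions:.2f}")
```

Weighted averages over such samples converge to the constrained-ensemble multiplicities, which is the sense in which the multi-Poisson proposal makes the importance-sampling computation fast.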

Identifying hadronic molecular states and/or hadrons with multiquark components either with or without exotic quantum numbers is a long-standing challenge in hadronic physics. We suggest that studying the production of these hadrons in relativistic heavy ion collisions offers a promising resolution to this problem as yields of exotic hadrons are expected to be strongly affected by their structures. Using the coalescence model for hadron production, we find that, compared to the case of a nonexotic hadron with normal quark numbers, the yield of an exotic hadron is typically an order of magnitude smaller when it is a compact multiquark state and a factor of 2 or more larger when it is a loosely bound hadronic molecule. We further find that some of the newly proposed heavy exotic states could be produced and realistically measured in these experiments.

The three colliders operated to date have taught us a great deal about the behaviour of both bunched and debunched beams in storage rings. The main luminosity limitations are now well enough understood that most of them can be strongly attenuated or eliminated by appropriate design precautions. Experience with the beam-beam interaction in both the SPS and the Tevatron allows us to predict the performance of the new generation of colliders with some degree of confidence. One of the main challenges the accelerator physicist faces is the problem of dynamic aperture limitations due to the lower field quality expected, imposed by economic and other constraints.

Enclosure equipment, such as glove boxes, tong boxes, etc., is an important kind of equipment for the nuclear industry and nuclear scientific research. The status of the establishment and implementation of Chinese standards on enclosure equipment is briefly described. Some problems and deficiencies existing in these standards are pointed out. The ISO standard projects on containment enclosures, as well as their present progress, are introduced. Measures for updating Chinese standards on enclosure equipment in accordance with the principle of adopting international standards are recommended. Some issues which should be taken into account in adopting the ISO standards on containment enclosures are also discussed.

COMPASS is a fixed-target experiment at the CERN SPS for the investigation of the structure and the dynamics of hadrons. The experimental setup features a large acceptance and high momentum resolution spectrometer including particle identification and calorimetry and is therefore ideal to access a broad range of different final states. Following the promising observation of a spin-exotic resonance during an earlier pilot run, COMPASS focused on light-quark hadron spectroscopy during the years 2008 and 2009. A data set, world leading in terms of statistics and resolution, has been collected with a 190GeV/c hadron beam impinging on either liquid hydrogen or nuclear targets. Spin-exotic meson and glueball candidates formed in both diffractive dissociation and central production are presently studied. Since the beam composition includes protons, the excited baryon spectrum is also accessible. Furthermore, Primakoff reactions have the potential to determine radiative widths of the resonances and to probe chiral pe...

Precision tests of the Standard Theory require theoretical predictions taking into account higher-order quantum corrections. Among these vacuum polarisation plays a predominant role. Vacuum polarisation originates from creation and annihilation of virtual particle–antiparticle states. Leptonic vacuum polarisation can be computed from quantum electrodynamics. Hadronic vacuum polarisation cannot because of the non-perturbative nature of QCD at low energy. The problem is remedied by establishing dispersion relations involving experimental data on the cross section for e+ e− annihilation into hadrons. This chapter sets the theoretical and experimental scene and reviews the progress achieved in the last decades thanks to more precise and complete data sets. Among the various applications of hadronic vacuum polarisation calculations, two are emphasised: the contribution to the anomalous magnetic moment of the muon, and the running of the fine structure constant α to the Z mass scale. They are fundamental ingre...

We present new results for the τ hadronic spectral function analysis using data accumulated by the ALEPH detector at LEP during the years 1991-94. The vector and the axial-vector spectral functions are determined from their respective unfolded, i.e., physical invariant mass spectra. The τ vector and axial-vector hadronic widths and certain spectral moments are exploited to measure α_s and nonperturbative contributions at the τ mass scale. The best, and experimentally and theoretically most robust, determination of α_s(M_τ) is obtained from the inclusive (V + A) fit, which yields α_s(M_τ) = 0.349 ± 0.018, giving α_s(M_Z) = 0.1212 ± 0.0022 after evolution to the mass of the Z boson. The approach of the Operator Product Expansion (OPE) is tested experimentally by means of an evolution of the τ hadronic width to masses smaller than the τ mass.

We argue that the standard decompositions of the hadron mass overlook pressure effects, and hence should be interpreted with great care. Based on the semiclassical picture, we propose a new decomposition that properly accounts for these pressure effects. Because of Lorentz covariance, we stress that the hadron mass decomposition automatically comes along with a stability constraint, which we discuss for the first time. We also show that if a hadron is seen as made of quarks and gluons, one cannot decompose its mass into more than two contributions without running into trouble with the consistency of the physical interpretation. In particular, the so-called quark mass and trace anomaly contributions appear to be purely conventional. Based on the current phenomenological values, we find that on average quarks exert a repulsive force inside nucleons, balanced exactly by the attractive gluon force.

The paper concerns some of the ideas underlying quarks and their interactions, the way that quarks build up hadrons, and the extent to which QCD theory can be applied to phenomena involving nuclei. The article is part of the Proceedings of the International School of Nuclear Physics, Erice, 1987. A description is given of quarks and multiplets. Colour is discussed with respect to: evidence for colour, a non-abelian SU(3) theory, the Pauli principle at work in hadrons, and spin-flavour correlations and magnetic moments. Colour, gluons and the inter-quark potential are also examined. (UK)

The non-perturbative nature of quantum chromodynamics (QCD) has historically left a gap in our understanding of the connection between the fundamental theory of the strong interactions and the rich structure of experimentally observed phenomena. For the simplest properties of stable hadrons, this is now circumvented with the use of lattice QCD (LQCD). In this talk I discuss a path towards a rigorous determination of few-hadron observables from LQCD. I illustrate the power of the methodology by presenting recently determined scattering amplitudes in the light-meson sector and their resonance content.

For 50 years particle accelerators employing accelerating cavities and deflecting magnets have been developed at a prodigious rate. New accelerator concepts and hardware ensembles have yielded great improvements in performance and GeV/$. The great idea for collective acceleration resulting from intense auxiliary charged-particle beams or laser light may or may not be just around the corner. In its absence, superconductivity (SC) applied both to rf cavities and to magnets opened up the potential for very large accelerators without excessive energy consumption and with other economies, even with the cw operation desirable for colliding beams. HEP has aggressively pioneered this new technology: the Fermilab single-ring 1 TeV accelerator - 2 TeV collider is near the testing stage. Brookhaven National Laboratory's high-luminosity pp two-ring 800 GeV CBA collider is well into construction. Other types of superconducting projects are in the planning stage, with much background R and D accomplished. The next generation of hadron colliders under discussion involves perhaps a 20 TeV ring (or rings) with 40 TeV CM energy. This is a very large machine: even if the highest practical field, B ≈ 10 T, is used, the radius is 10x that of the Fermilab accelerator. An extreme effort to get maximum GeV/$ may be crucial even for serious consideration of funding.

We show that a good part of the hadronic resonances could very well not be resonances at all. We extend the Ramsauer-effect principle of atomic physics to other areas of physics, and especially to hadronic physics.

We discuss quark recombination applied to the hadronization of a quark gluon plasma. It has been shown that the quark recombination model can explain essential features of hadron production measured in high energy heavy ion collisions.

Problems in nuclear physics generally involve several nucleons due to the composite structure of the atomic nucleus. To study such systems one has to solve the Schroedinger equation and therefore has to know a nucleon-nucleon potential. Experimental data and theoretical considerations indicate that nucleons consist of constituent particles, called quarks. Today, Quantum Chromodynamics (QCD) is believed to be the fundamental theory of strong interactions. Consequently, one should try to understand the nucleon-nucleon interaction from first principles of QCD. At nucleonic distances the strong coupling constant is large. Thus, a perturbative treatment of QCD low-energy phenomena is not adequate. However, the formulation of QCD on a four-dimensional Euclidean lattice (lattice QCD) makes it possible to address the nonperturbative aspects of the theory. This approach has already produced valuable results. For example, the confinement of quarks in a nucleon has been demonstrated, and hadron masses have been calculated. In this thesis various methods to extract the hadron-hadron interactions from first principles of lattice QCD are presented. One possibility is to consider systems of two static hadrons. A comparison of results in the pure gluonic vacuum and with sea quarks is given for both the confinement and the deconfinement phase of QCD. Numerical simulations yield attractive potentials in the overlap region of the hadrons for all considered systems. In the deconfinement phase the resulting potentials are shallower, reflecting the dissolution of the hadrons. A big step towards the simulation of realistic two-hadron systems on the lattice is the consideration of mesons consisting of dynamic valence quarks. This is done for the two most important fermionic discretization schemes in the pure gluonic vacuum. A calculation in coordinate space utilizing Kogut-Susskind fermions for the valence quarks yields meson-meson potentials with a long-ranged interaction, an intermediate

The idea is to demonstrate the existence of a point structure of hadrons by DTU (dual topological unitarization), an approach which incorporates confinement since its fundamental objects are hadrons. The aims are: to establish rules of complementarity or correspondence between DTU and QCD at the level of scale-invariant distributions; and to use these rules to predict either parton densities from hadron processes, or hadron distributions on the basis of the parton densities taken from deep inelastic processes.

This paper outlines the end to end effort to produce lightweight electronics enclosures for NASA GSFC electronics applications with the end goal of presenting an array of lightweight box options for a flight opportunity. Topics including the development of requirements, design of three different boxes, utilization of advanced materials and processes, and analysis and test will be discussed. Three different boxes were developed independently and in parallel. A lightweight machined Aluminum box, a cast Aluminum box and a composite box were designed, fabricated, and tested both mechanically and thermally. There were many challenges encountered in meeting the requirements with a non-metallic enclosure and the development of the composite box employed several innovative techniques.

This article proposes a method for designing electromagnetic compatibility shielding enclosures using a peer-to-peer distributed optimization system built on a modified particle swarm optimization algorithm. This optimization system is used to efficiently obtain solutions to a shielding-enclosure design problem that are optimal with respect to both electromagnetic shielding efficiency and thermal performance. During the optimization procedure it becomes evident that optimization algorithms and computational models must be properly matched in order to achieve efficient operation. The proposed system is designed to be tolerant of faults and resource heterogeneity, and as such would find use in environments where large-scale computing resources are not available, such as smaller engineering companies, where it would allow computer-aided design by optimization using existing resources with little to no financial outlay.
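The particle swarm optimization at the heart of such a system is compact to sketch. Below, a toy one-dimensional objective stands in for the article's coupled shielding/thermal cost (the penalty shape, swarm size, and coefficients are all illustrative assumptions, not the article's actual model):

```python
import random

random.seed(0)

def objective(x):
    # Toy stand-in for a combined shielding/thermal cost: a small vent
    # aperture hurts cooling, a large one leaks EM energy; here the
    # trade-off bottoms out at x = 1.0. Purely illustrative.
    return (x - 1.0) ** 2

N, ITERS = 20, 60
W, C1, C2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients

pos = [random.uniform(-5.0, 5.0) for _ in range(N)]
vel = [0.0] * N
pbest = pos[:]                      # per-particle best positions
gbest = min(pos, key=objective)     # swarm-wide best position

for _ in range(ITERS):
    for i in range(N):
        r1, r2 = random.random(), random.random()
        # Velocity update: inertia + pull toward personal and global bests.
        vel[i] = (W * vel[i]
                  + C1 * r1 * (pbest[i] - pos[i])
                  + C2 * r2 * (gbest - pos[i]))
        pos[i] += vel[i]
        if objective(pos[i]) < objective(pbest[i]):
            pbest[i] = pos[i]
            if objective(pos[i]) < objective(gbest):
                gbest = pos[i]

print(f"best aperture parameter ~ {gbest:.3f}")  # converges near 1.0
```

In the distributed setting the article describes, the expensive `objective` evaluations (electromagnetic and thermal simulations) would be farmed out to peer nodes, which is why algorithm and computational model must be matched.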

The present work numerically investigates three-dimensional non-linear aerodynamic structures of airflow in a slot-ventilated compartment with three ports. A numerical code based on the Reynolds-averaged Navier-Stokes equations and a Reynolds stress turbulence model was validated and successfully applied to simulate the airflow in the studied room. Numerical results are presented to illustrate the effects of the inlet airflow velocity, enclosure width, supply-port elevation and Reynolds number on the multiple flow patterns and the associated ventilation flow rate and blowing-axial momentum decay for equal-magnitude opposite jet flows. It is shown that the room airflow rate and the shift of the jet flow interface can be promoted or inhibited, depending strongly on the jet velocity, enclosure width and elevation of the supply ports.

An enclosure or burrow restrains an awake animal during an imaging procedure. A tubular body, made from a radiolucent material that does not attenuate x-rays or gamma rays, accepts an awake animal. A proximal end of the body includes an attachment surface that corresponds to an attachment surface of an optically transparent and optically uniform window. An anti-reflective coating may be applied to an inner surface, an outer surface, or both surfaces of the window. Since the window is a separate element of the enclosure and it is not integrally formed as part of the body, it can be made with optically uniform thickness properties for improved motion tracking of markers on the animal with a camera during the imaging procedure. The motion tracking information is then used to compensate for animal movement in the image.

The significance of interior humidity in attaining sustainable, durable, healthy and comfortable buildings is increasingly recognised. Given their significant interaction, interior humidity appraisals need a qualitative and/or quantitative assessment of interior moisture buffering. While suggested protocols for the simple and fast measurement of the moisture buffer potential of interior elements allow qualitative assessment, none of these are currently dependable for a wide range of moisture production regimes. In response to these flaws, this paper introduces the production-adaptive characterisation of the moisture buffer potential of interior elements and corroborates their superposition toward a room-enclosure moisture buffer potential. It is verified that this enables qualitative comparison of enclosures in relation to interior moisture buffering. It is moreover demonstrated that it forms...

In this paper, the effects of volumetric heat sources on natural convection heat transfer and flow structures in a wavy-walled enclosure are studied numerically. The governing differential equations are solved by an accurate finite-volume method. The vertical walls of the enclosure are heated differentially, whereas the two wavy walls (top and bottom) are kept adiabatic. The governing parameters for this problem are the internal and external Rayleigh numbers and the amplitude of the wavy walls. It is found that both the waviness of the walls and the ratio of the internal Rayleigh number (Ra_I) to the external Rayleigh number (Ra_E) affect the heat transfer and fluid flow significantly. The heat transfer is predicted to be a decreasing function of the waviness of the top and bottom walls both for Ra_I/Ra_E > 1 and for Ra_I/Ra_E < 1. (authors)
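The internal and external Rayleigh numbers referred to above take their standard forms for a differentially heated cavity with a volumetric source; the sketch below evaluates them for assumed air properties and an assumed cavity size (none of the numbers come from the paper):

```python
def rayleigh_external(g, beta, dT, L, nu, alpha):
    """External Rayleigh number Ra_E = g*beta*dT*L^3 / (nu*alpha)."""
    return g * beta * dT * L**3 / (nu * alpha)

def rayleigh_internal(g, beta, q_vol, L, nu, alpha, k):
    """Internal Rayleigh number Ra_I = g*beta*q*L^5 / (nu*alpha*k)
    for a uniform volumetric heat source q (W/m^3)."""
    return g * beta * q_vol * L**5 / (nu * alpha * k)

# Illustrative values for air near 300 K (assumed, not from the paper)
g, beta = 9.81, 3.4e-3                  # m/s^2, 1/K
nu, alpha, k = 1.6e-5, 2.2e-5, 0.026    # m^2/s, m^2/s, W/(m*K)
L, dT, q = 0.1, 10.0, 50.0              # m, K, W/m^3

Ra_E = rayleigh_external(g, beta, dT, L, nu, alpha)
Ra_I = rayleigh_internal(g, beta, q, L, nu, alpha, k)
ratio = Ra_I / Ra_E   # the governing ratio discussed in the abstract
```

Note that the ratio reduces to q*L^2/(dT*k), so it compares internal generation against the imposed wall temperature difference.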

Trackers based on scintillating-fiber technology are being considered by the Solenoidal Detector Collaboration at the SSC and the DØ collaboration at Fermilab. An important issue is the effect of the radiation existing in the detector cores on fiber properties. Most studies of radiation damage in scintillators have irradiated small bulk samples rather than fibers, and have used X-rays, 60Co gammas, or electron beams, often at accelerated rates. The authors have irradiated some 600 fibers in the Fermilab Tevatron CØ area, thereby obtaining a hadronic irradiation at realistic rates. Four-meter-long samples of ten Bicron polystyrene-based fiber types, maintained in air, dry nitrogen, argon, and vacuum atmospheres within stainless-steel tubes, were irradiated for seven weeks at various distances from the accelerator beam pipes. Maximum doses, measured by thermoluminescence detectors, were about 80 krad. Fiber properties, particularly light yield and attenuation length, have been measured over a one-year period. A description of the work together with the results is presented. At the doses achieved, corresponding to a few years of actual fiber-tracking detector operation, little degradation is observed. In addition, recovery after several days' exposure to air has been noted. Properties of unirradiated samples kept in darkness show no changes after one year.

The treatment of cancer with accelerator beams has a long history with betatrons, linacs, cyclotrons and now synchrotrons being exploited for this purpose. Treatment techniques can be broadly divided into the use of spread-out beams and scanned 'pencil' beams. The Bragg-peak behaviour of hadrons makes them ideal candidates for the latter. The combination of precisely focused 'pencil' beams with controllable penetration (Bragg peak) and high, radio-biological efficiency (light ions) opens the way to treating the more awkward tumours that are radio-resistant, complex in shape and lodged against critical organs. To accelerate light ions (probably carbon) with pulse-to-pulse energy variation, a synchrotron is the natural choice. The beam scanning system is controlled via an on-line measurement of the particle flux entering the patient and, for this reason, the beam spill must be extended in time (seconds) by a slow-extraction scheme. The quality of the dose intensity profile ultimately depends on the uniformity of the beam ...

The assumption of diffuse reflection (Lambert's Law) leads to integral equations for the wall intensity in a reverberant sound field in the steady state and during decay. The latter equation, in the special case of a spherical enclosure with uniformly absorbent walls and uniform wall intensity, allows exponential decay with a decay time which agrees closely with the Norris–Eyring prediction. The sound intensity and sound-energy density in the medium, during decay, are also calculated.
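The Norris–Eyring decay time mentioned here, together with the Sabine formula it refines, can be evaluated directly; the spherical-enclosure dimensions below are illustrative, not taken from the paper:

```python
import math

def t60_sabine(V, S, alpha):
    """Sabine reverberation time (s): T = 0.161 * V / (S * alpha)."""
    return 0.161 * V / (S * alpha)

def t60_norris_eyring(V, S, alpha):
    """Norris-Eyring reverberation time (s):
    T = 0.161 * V / (-S * ln(1 - alpha))."""
    return 0.161 * V / (-S * math.log(1.0 - alpha))

# Sphere of radius 5 m with uniformly absorbent walls (illustrative)
r = 5.0
V = 4.0 / 3.0 * math.pi * r**3   # volume, m^3
S = 4.0 * math.pi * r**2         # wall area, m^2
t_sab = t60_sabine(V, S, 0.3)
t_ne = t60_norris_eyring(V, S, 0.3)
```

Since -ln(1 - α) > α, Norris–Eyring always predicts a faster decay than Sabine, and the two agree in the limit of weak absorption.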

The aim of this conference is to review the status, progress and future plans of the field of hadron spectroscopy, and relate these to understanding hadron dynamics. This series of biennial conferences began in 1985 at College Park, Maryland, USA, with the 15th conference held in Nara, Japan in November 2013. Hadron 2015 will be organized by Jefferson Lab.

The hadronic calorimeter is assembled on the end-cap of the CMS detector in the assembly hall. Hadronic calorimeters measure the energy of particles that interact via the strong force, called hadrons. The detectors are made in a sandwich-like structure where these scintillator tiles are placed between metal sheets.

... about QCD by treating N_c as a variable, so the study of hadron properties as a function of quark mass is leading us to a deeper appreciation of hadron structure. As examples we cite progress in using the chiral properties of QCD to connect hadron masses, magnetic moments, charge radii and structure functions calculated ...

Introduction. Top quark studies are an important aspect of the physics program at the Tevatron and the ... Single top quark production at hadron colliders was first established in 2007 by the Tevatron experiments. Basically, there are three production processes: the s-channel, the t-channel ... So confronting the measurement.

New results on top quark production are presented from four hadron collider experiments: CDF and D0 at the Tevatron, and ATLAS and CMS at the LHC. Cross-sections for single top and top pair production are discussed, as well as results on the top–antitop production asymmetry and searches for new physics including ...

We consider the application of regularization by dimensional reduction to NLO corrections of hadronic processes. The general collinear singularity structure is discussed, the origin of the regularization-scheme dependence is identified and transition rules to other regularization schemes are derived.

Recent years have seen a resurgence of interest in searches for exotic states, motivated by tantalising observations by Belle and CDF. Using the data collected in pp collisions at 7 and 8 TeV by the LHCb experiment, we present the unambiguous new observation of exotic charmonium hadrons produced in B decays.

The paper gives a brief overview of CERN's Large Hadron Collider (LHC) project. After an outline of the physics motivation, we describe the LHC machine, interaction rates, experimental challenges, and some important physics channels to be studied. Finally we discuss the four experiments planned at the LHC: ATLAS, CMS, ALICE and LHC-B.

PARTNER, the Particle Training Network for European Radiotherapy, has recently been awarded 5.6 million euros by the European Commission. The project, which is coordinated by CERN, has been set up to train researchers of the future in hadron therapy and in doing so aid the battle against cancer.

We study the properties of charmed hadrons in dense matter within a coupled-channel approach which accounts for Pauli blocking effects and meson self-energies in a self-consistent manner. We analyze the behaviour in this dense environment of dynamically generated baryonic resonances as well as the ...

The Discovery Channel Telescope (DCT) is a project of Lowell Observatory, undertaken with support from Discovery Communications, Inc., to design and construct a 4-meter class telescope and support facility on a site approximately 40 miles southeast of Flagstaff, Arizona. The Discovery Channel Telescope Enclosure was completed in November, 2009. The DCT Enclosure is an octagonal steel structure with insulated composite panel skin. The structure rotates on sixteen compliant bogie assemblies attached to the stationary facility. The shutter is composed of two independently actuated, bi-parting structures that provide a viewing aperture. To improve seeing, the skin is covered with adhesive aluminum foil tape and the enclosed observing area is passively ventilated via rollup doors. The observing area can also be actively ventilated using a downdraft fan, and there are provisions for upgrades to active air conditioning. The enclosure also includes operational equipment such as a bridge crane, personnel lift, and access platforms. This paper discusses some of the design trades as well as the construction challenges and lessons learned by the DCT Project, its designer M3 Engineering and Technology Corporation (M3), and its general contractor, Building and Engineering Contractors, Southwest (BEC Southwest).

The third symposium on Science of Hadrons under Extreme Conditions, organized by the Research Group for Hadron Science, Advanced Science Research Center, was held at the Tokai Research Establishment of JAERI on January 29 to 31, 2001. The symposium was devoted to discussions and presentations of research results in a wide variety of areas of hadron physics, such as nuclear matter, high-energy nuclear reactions, quantum chromodynamics, neutron stars, supernovae, nucleosynthesis as well as finite nuclei, to understand various aspects of hadrons under extreme conditions. Twenty-two papers on these topics presented at the symposium, including a special talk on the present status of the JAERI-KEK joint project on a high-intensity proton accelerator, aroused lively discussions among approximately 40 participants. Twenty of the presented papers are indexed individually. (J.P.N.)

During the last century, experiments with accelerators have been extensively used to improve our understanding of matter. They are now the most common tool used to search for new phenomena in high energy physics. In the process of probing smaller distances and searching for new particles, the center-of-mass energy has been steadily increased. The need for higher center-of-mass energy made hadron colliders the natural tool for discovery physics. Hadron colliders have a major drawback with respect to electron-positron colliders. As shown in fig. 1, the total cross section is several orders of magnitude larger than the cross section of interesting processes such as top or Higgs production. This means that, in order to observe interesting processes, collisions must occur at very high rates, and it becomes necessary to reject on-line most of the “non-interesting” events. In this thesis I have described the wide range of SVT applications within CDF.

Three-body mechanisms in hadron collisions, and the role of the A = 3 system, are reviewed, and the excitation functions of the proton-deuteron system in interactions at energies up to 2.9 GeV are discussed. Meson production at large angles reveals structures due to the mesonic degrees of freedom in the interaction of the proton with the deuteron, exciting N* isobars in intermediate states. Propagation in the nuclei does not seem to change the properties of these isobars. The meson double-scattering mechanism provides a way to understand coherent meson production in pd capture. It is difficult to say whether this coherent process corresponds to eigenstates of the A = 3 system. The sharing of the momentum transfer between the three nucleons renders impossible the observation of high-momentum components in coherent proton captures. The possible contribution of the electromagnetic probe to hadron physics with a multi-GeV electron accelerator is mentioned.

We conducted large-scale, replicated experiments to test the effects of two parallel power lines on area use, behaviour, and activity of semi-domestic reindeer in enclosures. Yearling female reindeer were released into four 50 x 400 m enclosures; two treatment enclosures with power lines and two control enclosures. Reindeer from two herds, one from Kautokeino (domestic tame) and one from Vaga (domestic wild), were tested separately and compared. Individual location within the enclosures was not affected by the power lines. Effects on restless behaviour were ambiguous, with slightly more restless behaviour in the treatment enclosures for the domestic tame reindeer, while the domestic wild reindeer maintained a stable level in the treatment enclosures, increasing with time in the control enclosures. Activity changes were slightly more common among animals within treatment enclosures for both herds, with no indication of habituation during the experiment. The domestic wild reindeer showed more than three times as much restless behaviour as the domestic tame reindeer. Our study indicates that for reindeer in enclosures, the disturbance from a power line construction is negligible. This suggests that power lines are a minor disturbing factor compared to human handling when using fenced-in areas like grazing gardens in reindeer husbandry.

At present, about five thousand accelerators are devoted to biomedical applications. They are mainly used in radiotherapy, research and medical radioisotope production. In this framework, oncological hadron-therapy deserves particular attention since it represents a field in rapid evolution thanks to the joint efforts of laboratories with long experience in particle physics. This is the case at CERN, where the design of an optimised synchrotron for medical applications has been pursued. These lectures present these activities with particular attention to the new developments which are scientifically interesting and/or economically promising.

We present recent results of hadron spectroscopy and hadron–hadron interaction from the perspective of constituent quark models. We pay special attention to the role played by higher order Fock space components in the hadron spectra and the connection of this extension with the hadron–hadron interaction. The main goal of our description is to obtain a coherent understanding of the low-energy hadron phenomenology without enforcing any particular model, to constrain its characteristics and learn about the low-energy realization of the theory.

Recent results on two-particle correlations in rapidity space, forward-backward multiplicity correlations, charge correlations, flavour and baryon number correlations as well as Bose-Einstein correlations of identical particles are reviewed. Particular emphasis is given to the data from e + e - annihilation which serve in many respects as reference point in the interpretation of correlation phenomena observed in hadronic reactions. (orig.)

This final report describes both the engineering development of a DC bioeffects test enclosure for small laboratory animals, and the biological protocol for the use of such enclosures in the testing of animals to determine possible biological effects of the environment associated with HVDC transmission lines. The test enclosure which has been designed is a modular unit, which will house up to eight rat-sized animals in individual compartments. Multiple test enclosures can be used to test larger numbers of animals. A prototype test enclosure has been fabricated and tested to characterize its electrical performance characteristics. The test enclosure provides a simulation of the dominant environment associated with HVDC transmission lines; namely, a static electric field and an ion current density. A biological experimental design has been developed for assessing the effects of the dominant components of the HVDC transmission line environment.

With the emergence of extended producer responsibility regulations for electronic devices, it is becoming increasingly important for electronics manufacturers to apply design for recycling (DFR) methods in the design of plastic enclosures. This paper presents an analytical framework for quantifying the environmental and economic benefits of DFR for plastic computer enclosures during the design process, using straightforward metrics that can be aligned with corporate environmental and financial performance goals. The analytical framework is demonstrated via a case study of a generic desktop computer enclosure design, which is recycled using a typical US "take-back" system for plastics from waste electronics. The case study illustrates how the analytical framework can be used by the enclosure designer to quantify the environmental and economic benefits of two important DFR strategies: choosing high-value resins and minimizing enclosure disassembly time. Uncertainty analysis is performed to quantify the uncertainty surrounding economic conditions in the future when the enclosure is ultimately recycled.
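A toy version of the kind of straightforward economic metric the framework describes, recovered resin value minus disassembly labor cost, could be written as follows; all masses, prices, and times are hypothetical, not figures from the case study:

```python
def net_recycling_benefit(resin_mass_kg, resin_value_per_kg,
                          disassembly_time_min, labor_rate_per_hr):
    """Net economic benefit (currency units) of recycling one enclosure:
    recovered resin value minus the labor cost of disassembly."""
    revenue = resin_mass_kg * resin_value_per_kg
    labor_cost = (disassembly_time_min / 60.0) * labor_rate_per_hr
    return revenue - labor_cost

# Hypothetical desktop enclosure: 2.5 kg of resin worth $0.80/kg,
# 4 minutes of disassembly at a $30/hr labor rate -> break-even.
benefit = net_recycling_benefit(2.5, 0.8, 4.0, 30.0)  # 2.0 - 2.0 = 0.0
```

The two DFR levers named in the abstract map directly onto the two terms: a higher-value resin raises the first, and a shorter disassembly time lowers the second.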

The theoretical physicist Victor “Viki” Weisskopf, Director-General of CERN from 1961 to 1965, once said: “There are three kinds of physicists, namely the machine builders, the experimental physicists, and the theoretical physicists. […] The machine builders are the most important ones, because if they were not there, we would not get into this small-scale region of space. If we compare this with the discovery of America, the machine builders correspond to captains and ship builders who really developed the techniques at that time. The experimentalists were those fellows on the ships who sailed to the other side of the world and then landed on the new islands and wrote down what they saw. The theoretical physicists are those who stayed behind in Madrid and told Columbus that he was going to land in India.” Rather than focusing on the theoretical physicists, as most popular science books on particle physics do, this beautifully written and also entertaining book is different in that, firstly, the main foc...

J/ψ (ψ(3686)) decay is an ideal place to study light hadron spectroscopy. BESIII has collected the largest J/ψ and ψ(3686) samples in the world, including 1.31 billion J/ψ events and 0.5 billion ψ(3686) events. In this paper, the latest experimental results at BESIII on the pp̄ mass-threshold enhancement and the X(1835) are presented, which help us to understand the nature of the states around 1.8 GeV. Results of a model-independent partial wave analysis of J/ψ → γπ0π0 and a partial wave analysis of J/ψ → γφφ are also presented, which may contribute to the search for possible scalar, pseudoscalar or tensor glueballs. More experimental results on light hadron spectroscopy at BESIII are expected.

Semiclassical light-front bound-state equations for hadrons are presented and compared with experiment. The essential dynamical feature is the holographic approach; that is, the hadronic equations in four-dimensional Minkowski space are derived as holograms of classical equations in a 5-dimensional anti-de Sitter space. The form of the equations is constrained by the imposed superconformal algebra, which fixes the form of the light-front potential. If conformal symmetry is strongly broken by heavy quark masses, the combination of supersymmetry and the classical action in the 5-dimensional space still fixes the form of the potential. By heavy quark symmetry, the strength of the potential is related to the heavy quark mass. The contribution is based on several recent papers in collaboration with Stan Brodsky and Guy de Téramond.

We show that application of quantum unitary groups, in place of ordinary flavor SU(n_f), to such static aspects of hadron phenomenology as hadron masses and mass formulas is indeed fruitful. So-called q-deformed mass formulas are given for the octet baryons 1/2+ and decuplet baryons 3/2+, as well as for the case of vector mesons 1− involving heavy flavors. For the deformation parameter q, rigid fixation of values is used. New mass sum rules of remarkable accuracy are presented. As shown in the decuplet case, the approach accounts for effects highly nonlinear in SU(3)-breaking. A topological implication (possible connection with knots) for singlet vector mesons and the relation q ↔ Θ_c (Cabibbo angle) in the case of baryons are considered.

The European PARTNER project developed a prototypical system for sharing hadron therapy data. This system allows doctors and patients to record and report treatment-related events during and after hadron therapy. It presents doctors and statisticians with an integrated view of adverse events across institutions, using open-source components for data federation, semantics, and analysis. There is a particular emphasis upon semantic consistency, achieved through intelligent, annotated form designs. The system as presented is ready for use in a clinical setting, and amenable to further customization. The essential contribution of the work reported here lies in the novel data integration and reporting methods, as well as the approach to software sustainability achieved through the use of community-supported open-source components.

Basic quantum mechanical properties of systems constituted by two and three gentileons are deduced in this paper. By using Pauli's theorem and symmetry properties of the intermediate states, it is shown that, in some cases, gentileons must have half-odd-integral spin. As an immediate and natural result of our theoretical analysis, we show how fundamental observed properties of composite hadrons can be predicted from first principles assuming quarks to be spin-1/2 gentileons. (author)

The hadronic shift in pionic hydrogen has been redetermined to be ε_1s = 7.086 ± 0.007(stat) ± 0.006(sys) eV by X-ray spectroscopy of ground-state transitions applying various energy calibration schemes. The experiment was performed at the high-intensity low-energy pion beam of the Paul Scherrer Institut by using the cyclotron trap and an ultimate-resolution Bragg spectrometer with bent crystals. (orig.)

Two simple techniques for the measurement of the shielding efficiency for the electric field of a symmetric enclosure are presented. They allow measurements in the symmetry plane of the enclosure without disturbances by the cables. Measurements in near- and far-field regions are possible. Comparisons with simulation results are shown. Both techniques are particularly suitable for the evaluation of numerical techniques. Limits of the concept of shielding efficiency of enclosures at high frequencies are also briefly discussed.
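Shielding efficiency for the electric field is conventionally expressed in decibels as the ratio of the incident to the transmitted field; a one-line helper illustrates the definition (the field values are made up for the example):

```python
import math

def shielding_efficiency_db(e_incident, e_shielded):
    """Electric-field shielding efficiency in dB:
    SE = 20 * log10(|E_incident| / |E_shielded|)."""
    return 20.0 * math.log10(abs(e_incident) / abs(e_shielded))

# A field reduced by a factor of 100 corresponds to 40 dB of shielding.
se = shielding_efficiency_db(1.0, 0.01)   # -> 40.0 dB
```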

This report summarizes topics in hadron spectroscopy and production which could be addressed at CEBAF with an energy upgrade to E_γ = 8 GeV and beyond. The topics discussed include conventional meson and baryon spectroscopy, spectroscopy of exotica (especially molecules and hybrids), CP and CPT tests using φ mesons, and new detector and accelerator options.

This contribution presents general features of the hadron physics program developed at the Thomas Jefferson Laboratory. This program uses the electromagnetic and weak probes provided by the electron beams of the CEBAF accelerator and addresses mostly the non-perturbative regime of QCD. (author)

A next step of energy increase of hadron colliders beyond the LHC requires high-field superconducting magnets capable of providing a dipolar field in the range of 16 T in a 50 mm aperture with accelerator quality. These characteristics could meet the requirements for an upgrade of the LHC to twice ...

Financial pressures from member states have upset the calculations for the major accelerator project of the European Laboratory for Particle Physics (CERN), the Large Hadron Collider (LHC). Despite a preference for domestic high-energy programs, CERN members accord high priority to LHC physics. Converting to a global facility could help spread the high annual cost of subscription, but given the political realities, a revision of the LHC project appears more feasible. CERN's management needs to deploy its skills to overcome the financial obstacles to the facility.

We propose possible signatures of `exogamous' combinations between partons in the different W+ and W- hadron showers in e+e- -> W+W- events with purely hadronic final states. Within the space-time model for hadronic shower development that we have proposed previously, we find a possible difference of about 10 % between the mean hadronic multiplicity in such purely hadronic final states and twice the hadronic multiplicity in events in which one W decays hadronically and the other leptonically,...

Calorimeters for particle physics experiments with integration time of a few ns will substantially improve the capability of the experiment to resolve event pileup and to reject backgrounds. In this paper time development of hadronic showers induced by 30 and 60 GeV positive pions and 120 GeV protons is studied using Monte Carlo simulation and beam tests with a prototype of a sampling steel-scintillator hadronic calorimeter. In the beam tests, scintillator signals induced by hadronic showers in steel are sampled with a period of 0.2 ns and precisely time-aligned in order to study the average signal waveform at various locations w.r.t. the beam particle impact. Simulations of the same setup are performed using the MARS15 code. Both simulation and test beam results suggest that energy deposition in steel calorimeters develop over a time shorter than 3 ns providing opportunity for ultra-fast calorimetry. Simulation results for an "ideal" calorimeter consisting exclusively of bulk tungsten or copper are presented to establish the lower limit of the signal integration window.

In this paper a hybrid formulation is presented which combines the edge-based vector finite element method (FEM) and the Method of Moments (MoM) in the frequency domain to predict the electromagnetic field distribution inside an enclosure with an aperture. While MoM is used for solving the surface integrals associated with the aperture field components via equivalent surface currents, FEM is used for solving the electromagnetic fields inside the enclosure. Numerical results for the shielding effectiveness and electrical energy of an enclosure with an aperture are calculated by the hybrid method, presented, and validated against the existing literature. The method is then applied to different enclosures with different aperture sizes.

In an experiment at the Fermi National Accelerator Laboratory we have compared the production of large transverse momentum hadrons from targets of W, Ti, and Be bombarded by 300 GeV protons. The hadron yields were measured at 90 degrees in the proton-nucleon c.m. system with a magnetic spectrometer equipped with two Cerenkov counters and a hadron calorimeter. The production cross-sections have a dependence on the atomic number A that grows with p_⊥, eventually leveling off proportional to A^1.1.
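The exponent in a cross-section scaling σ ∝ A^α can be extracted from yields on two targets; the sketch below inverts that relation for hypothetical W and Be yields chosen to reproduce the quoted α = 1.1 (the yields themselves are not from the experiment):

```python
import math

def alpha_exponent(sigma_a, A_a, sigma_b, A_b):
    """Exponent alpha in sigma ~ A**alpha, from yields on two targets:
    alpha = ln(sigma_a / sigma_b) / ln(A_a / A_b)."""
    return math.log(sigma_a / sigma_b) / math.log(A_a / A_b)

# Hypothetical per-nucleus yield ratio between W (A=184) and Be (A=9),
# constructed so that the extracted exponent comes out as 1.1:
yield_ratio = (184.0 / 9.0) ** 1.1
alpha = alpha_exponent(yield_ratio, 184.0, 1.0, 9.0)   # -> 1.1
```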

This study investigates the natural convection induced by a temperature difference between a cold outer polygonal enclosure and a hot inner circular cylinder. The governing equations are solved numerically using the built-in finite element method of COMSOL. The governing parameters considered are the number of polygonal sides, aspect ratio, radiation parameter, and Rayleigh number. We found that the number of contra-rotative cells depends on the polygonal shape. The convection heat transfer becomes constant at L/D > 0.77, and the Nusselt number profile is no longer sensitive to the polygonal shape.

A prototype hadron calorimeter module consisting of 66 scintillator/lead layers, with a 15x15 cm^2 cross section and 5 nuclear interaction lengths, has been designed and produced for the zero-degree calorimeter of the BM@N experiment. The prototype has been tested with a high-energy muon beam of the U-70 accelerator at IHEP. The results of the beam test for different types of photomultipliers and light guides are presented. The results of the Monte-Carlo simulation of the calorimeter ...

An accelerator complex for hadron therapy based on a chain of cyclotrons is under development at JINR (Dubna, Russia), and the corresponding conceptual design is under preparation. The complex mainly consists of two superconducting cyclotrons. The first accelerator is a compact cyclotron used as an injector to the main accelerator, which is a six-fold separated-sector machine. The facility is intended for the generation of proton and carbon beams. The H2+ and 12C6+ ions from the corresponding ECR ion sources are accelerated in the injector cyclotron up to an output energy of 70 MeV/u. The H2+ ions are then extracted from the injector by a stripping foil, and the resulting proton beam with an energy of 70 MeV is used for medical purposes. The carbon beam can either be used directly for therapy or injected into the main cyclotron to reach the final energy of 400 MeV/u. The basic requirements of the project are the following: compliance with medical requirements, compact size, feasible design, and high reliability of all systems of the complex. The advantages of the dual-cyclotron design can help reach these goals. Initial calculations show that this design is technically feasible with acceptable beam dynamics. The accelerator complex, with a relatively compact size, can be a good solution for medical applications. The basic parameters of the facility and a detailed investigation of the magnetic system and beam dynamics are described.
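The bending-magnet requirements implied by these beam energies follow from the relativistic magnetic rigidity Bρ = p/q; the sketch below evaluates it for 12C6+ at the injection and final energies, approximating the ion mass as A atomic mass units (a simplification for illustration, not a figure from the paper):

```python
import math

AMU_MEV = 931.494   # atomic mass unit in MeV/c^2

def rigidity_tm(kinetic_mev_per_u, mass_number, charge):
    """Magnetic rigidity B*rho (T*m) of an ion, approximating its rest
    mass as A * amu (nuclear binding energy neglected)."""
    m = mass_number * AMU_MEV                      # rest mass, MeV/c^2
    e_total = m + mass_number * kinetic_mev_per_u  # total energy, MeV
    p = math.sqrt(e_total**2 - m**2)               # momentum, MeV/c
    return p / (299.792458 * charge)               # B*rho, T*m

brho_c_injection = rigidity_tm(70.0, 12, 6)   # 12C6+ at 70 MeV/u
brho_c_final = rigidity_tm(400.0, 12, 6)      # 12C6+ at 400 MeV/u
```

The roughly 2.6-fold rigidity increase between 70 MeV/u and 400 MeV/u is what the separated-sector main cyclotron must accommodate.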

Radiation protection aspects relevant to medical accelerators are discussed. An overview is first given of general safety requirements. Next, shielding and labyrinth design are discussed in some detail for the various types of accelerators, devoting more attention to hadron machines as they are far less conventional than electron linear accelerators. Some specific aspects related to patient protection are also addressed. Finally, induced radioactivity in accelerator components and shielding walls is briefly discussed. Three classes of machines are considered: (1) medical electron linacs for 'conventional' radiation therapy, (2) low-energy cyclotrons for production of radionuclides, mainly for medical diagnostics, and (3) medium-energy cyclotrons and synchrotrons for advanced radiation therapy with protons or light-ion beams (hadron therapy). (51 refs).

Recently, the utilization of accelerators has increased rapidly, and the growth in accelerating energy and beam intensity is also remarkable. Studies on accelerator shielding have become important because the amount of radiation emitted from accelerators has increased, the regulation of environmental radiation dose has been tightened, and the cost of constructing shielding has risen. As plans for constructing large accelerators are being made in succession, a survey of the present state and open problems of accelerator shielding studies was carried out. From the viewpoint of shielding studies, accelerators are classified into electron accelerators and proton accelerators. To begin shielding studies, the preparation of cross-section data is indispensable. The cross sections for bremsstrahlung generation, photonuclear reactions generating neutrons, neutron generation by hadrons, nuclear reactions of neutrons, and gamma-ray generation by hadrons are described. The generation of neutrons and gamma rays in thick targets is explained. Shielding problems are complex and diversified, but this paper takes up the studies from which basic data are obtainable, such as beam dumping and side-wall shielding. As for residual radioactivity, the main nuclides and the differences in residual radioactivity between materials have been studied. (J.P.N.)
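For first estimates, the side-wall shielding problem described above is often reduced to a simple exponential attenuation model expressed in tenth-value layers (TVLs). The sketch below is a generic illustration with assumed numbers, not data from the survey:

```python
import math

def shield_thickness_cm(dose_in: float, dose_limit: float, tvl_cm: float) -> float:
    """Shield thickness needed to attenuate dose_in down to dose_limit,
    using the tenth-value-layer model: D = D0 * 10**(-t / TVL)."""
    n_tvl = math.log10(dose_in / dose_limit)  # number of TVLs required
    return n_tvl * tvl_cm

# Illustrative numbers only: 1 Sv/h unshielded at the wall, 1 uSv/h target,
# TVL of ~40 cm of ordinary concrete (an assumed value for this sketch).
print(round(shield_thickness_cm(1.0, 1e-6, 40.0), 1))  # 240.0 cm
```

Six orders of magnitude of attenuation at 40 cm per decade gives 2.4 m of concrete, which is why shielding cost dominates large-accelerator civil engineering.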

Four candidates for charmless hadronic B decay are observed in a data sample of four million hadronic Z decays recorded by the ALEPH detector at LEP. The probability that these events come from background sources is estimated to be less than $10^{-6}$. The average branching ratio of weakly decaying B hadrons (a mixture of $B_d$, $B_s$ and $\Lambda_b$ weighted by their production cross sections and lifetimes, here denoted B) into two long-lived charged hadrons (pions, kaons or protons) is measured to be $\Br(\btohh) = \resultBR$. The relative branching fraction $\rratio$, where $r_s$ is the ratio of $B_s$ to $B_d$ decays in the sample, is measured to be $\resultR$. In addition, branching ratio upper limits are obtained for a variety of exclusive charmless hadronic two-body decays of B hadrons.

Following impressive results from early phase trials in Japan and Germany, there is a current expansion in European hadron therapy. This article summarises present European Union-funded projects for research and co-ordination of hadron therapy across Europe. Our primary focus will be on the research questions associated with carbon ion treatment of cancer, but these considerations are also applicable to treatments using proton beams and other light ions. The challenges inherent in this new form of radiotherapy require maximum interdisciplinary co-ordination. On the basis of its successful track record in particle and accelerator physics, the internationally funded CERN laboratories (otherwise known as the European Organisation for Nuclear Research) have been instrumental in promoting collaborations for research purposes in this area of radiation oncology. There will soon be increased opportunities for referral of patients across Europe for hadron therapy. Oncologists should be aware of these developments, which confer enhanced prospects for better cancer cure rates as well as improved quality of life in many cancer patients. PMID:20846982

A system for optical tracking of frozen hydrogen microsphere targets (pellets) has been designed. It is intended for the upcoming hadron physics experiment PANDA at FAIR, Darmstadt, Germany. With such a tracking system one can reconstruct the positions of the individual pellets at the time of a hadronic interaction in the offline event analysis. This gives the position of the primary interaction vertex with an accuracy of a few hundred µm, which is very useful, e.g., for reconstruction of charged-particle tracks and secondary vertices and for background suppression. A study has been performed at the WASA detector setup (Forschungszentrum Jülich, Germany) to check the possibility of classifying hadronic events as originating in pellets or in background. The study was based on the instantaneous rate from a Long Range TDC, which was used to determine whether a pellet was present in the accelerator beam region. It was clearly shown that it is possible to distinguish the two event classes. Experience was also gained with the operation of two synchronized systems working on different time scales, as will also be the case with the optical pellet tracking.

We present an overview of the physics results obtained by experiments at the Large Hadron Collider (LHC) in 2009–2010, for an integrated luminosity of L ≈ 40 pb$^{−1}$, collected mostly at a centre-of-mass energy of √s = 7 TeV. After an introduction to the physics environment at the LHC and the current performance of the accelerator and detectors, we will discuss quantum chromodynamics and B-physics analyses, W and Z production, the first results in the top sector, and searches for new physics, with particular emphasis on supersymmetry and Higgs studies. While most of the presented results are in remarkable agreement with Standard Model predictions, the excellent performance of the LHC machine and experiments, the prompt analysis of all data within just a few months after the end of data taking, and the high quality of the results obtained constitute an encouraging step towards unique measurements and exciting discoveries in the 2011–2012 period and beyond.

The Large Hadron Collider (LHC), a 26.7 km circumference superconducting accelerator equipped with high-field magnets operating in superfluid helium below 1.9 K, has now fully entered construction at CERN, the European Laboratory for Particle Physics. The heart of the LHC cryogenic system is the quasi-isothermal magnet cooling scheme, in which flowing two-phase saturated superfluid helium removes the heat load from the 36000 ton cold mass, immersed in some 400 m/sup 3/ static pressurised superfluid helium. The LHC also makes use of supercritical helium for nonisothermal cooling of the beam screens which intercept most of the dynamic heat loads at higher temperature. Although not used in normal operation, liquid nitrogen will provide the source of refrigeration for precooling the machine. Refrigeration for the LHC is produced in eight large refrigerators, each with an equivalent capacity of about 18 kW at 4.5 K, completed by 1.8 K refrigeration units making use of several stages of hydrodynamic cold compressor...
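For a rough sense of scale (our own estimate, not a figure from the LHC design), the electrical power drawn by one such 18 kW @ 4.5 K refrigerator can be gauged from the Carnot factor, assuming some fraction of ideal efficiency; the 30% value below is an assumption for illustration:

```python
def ideal_input_power_kw(q_cold_kw: float, t_cold_k: float,
                         t_warm_k: float = 300.0, efficiency: float = 0.3) -> float:
    """Electrical input power for a refrigerator extracting q_cold_kw at
    t_cold_k, assuming a fraction `efficiency` of Carnot performance."""
    carnot_factor = (t_warm_k - t_cold_k) / t_cold_k  # W_ideal per W of cooling
    return q_cold_kw * carnot_factor / efficiency

# One 18 kW @ 4.5 K plant, at an assumed 30% of Carnot efficiency.
print(round(ideal_input_power_kw(18.0, 4.5)))  # ~3940 kW
```

A Carnot factor near 66 at 4.5 K explains why megawatt-scale electrical plants are needed for each refrigerator, and why the heat loads at 1.9 K are fought so aggressively.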

Meson-meson reactions A(q_1 q̄_1) + B(q_2 q̄_2) → q_1 + q̄_1 + q_2 + q̄_2 in high-temperature hadronic matter are found to produce an appreciable amount of quarks and antiquarks moving freely in hadronic matter and to establish a new mechanism for deconfinement of quarks and antiquarks in hadronic matter.

Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models, from thermal neutron interactions to ultra-relativistic hadrons. An overview of validations of Geant4 hadronic physics is presented, based on thin-target measurements. In most cases, good agreement is found between the Monte Carlo predictions and experimental data; however, several problems have been detected which require improvements in the models.

This compilation is a survey of the existing data on hadronic atoms (pionic atoms, kaonic atoms, antiprotonic atoms, sigmonic atoms). It collects measurements of the energies, intensities and line widths of X-rays from hadronic atoms. Averaged values for each hadronic atom are given and the data are summarized. The listing contains data on 58 pionic atoms, 54 kaonic atoms, 23 antiprotonic atoms and 20 sigmonic atoms. (orig./HB)

We review the formalism of quark recombination applied to the hadronization of a quark-gluon plasma. Evidence in favour of the quark recombination model is outlined. Recent work on parton correlations, leading to detectable correlations between hadrons, is discussed. Hot spots from completely quenched jets are a likely source of such correlations, which appear to be jet-like. We discuss how this picture compares with measurements of associated hadron yields at RHIC.

Since the earliest times, human beings have attempted to understand and manage their environment in order to survive. This is the case for wildlife, weather cycles and gathering periods. It is also reflected in the areas surrounding the sites where people lived, whose landscape they changed for different purposes. This branch of archaeology has arisen very recently, in the last few years, and is usually called Landscape Archaeology. Traditional landscape archaeology has dealt with earth- and location-related studies; the relationship of ancient peoples with the sky has been disregarded. Archaeoastronomical studies have mitigated this. Archaeoastronomy has revealed an important number of archaeological sites, many of which show a clear intention of astronomically designed buildings or structures. This implies planned, detailed design and, obviously, a deep understanding of astronomical knowledge. As examples of such sites, a number of megalithic ditched enclosures in Portugal will be shown, studied within the project "Ditched enclosures plants and Neolithic cosmologies: A landscape, archaeoastronomical and geophysical perspective". The ideological and astronomical aspects of the architecture of these types of sites will be explained. In this paper we present a new methodology applied in the archaeoastronomical calculations for southern Portugal sites. It includes GIS techniques and the development of an archaeoastronomical layer that can be used to display the computations over cartographic information from the archaeological sites. A Spatial Data Infrastructure is also created in order to publish the results.

In this paper a review of recent developments in turbulence models for natural convection in enclosures is presented. The emphasis is placed on the effect of the treatment of the Reynolds stress and the turbulent heat flux on the stability and accuracy of the solution for natural convection in enclosures. The turbulence models considered in the present study are the two-layer k-ε model, the shear stress transport (SST) model, the elliptic-relaxation (V2-f) model and the elliptic-blending second-moment closure (EBM). The three treatments of the turbulent heat flux are the generalized gradient diffusion hypothesis (GGDH), the algebraic flux model (AFM) and the differential flux model (DFM). The mathematical formulation of the above turbulence models and their solution method are presented. The turbulence models are evaluated for turbulent natural convection in a 1:5 rectangular cavity (Ra = 4.3×10¹⁰), in a square cavity with conducting top and bottom walls (Ra = 1.58×10⁹), and for Rayleigh-Bénard convection (Ra = 2×10⁶ ∼ 10⁹). The relative performance of the turbulence models is examined and their successes and shortcomings are addressed.

The 'Young Talented' education program developed by the Brazilian state funding agency FAPERJ [1] makes it possible for students from public high schools to perform activities in scientific laboratories. In the Atomic and Molecular Physics Laboratory at the Federal University of Rio de Janeiro (UFRJ), the students are confronted with modern research tools such as the 1.7 MV ion accelerator. Being a user-friendly machine, the accelerator is easily manageable by the students, who can perform simple hands-on activities, stimulating interest in physics and bringing the students close to modern laboratory techniques.

High energy hadron colliders have been the tools for discovery at the highest mass scales of the energy frontier from the SppS, to the Tevatron and now the LHC. This report reviews future hadron collider projects from the high luminosity LHC upgrade to a 100 TeV hadron collider in a large tunnel, the underlying technology challenges and R&D directions and presents a series of recommendations for the future development of hadron collider research and technology.

The present state of the production and observation of hadrons containing heavy quarks or antiquarks as valence constituents, in reactions initiated by real and (space-like) virtual photons or by hadron beams, is discussed. Heavy-flavor production in e⁺e⁻ annihilation, which is well covered in a number of recent review papers, is not discussed; similarly, neutrino production is omitted owing to the different (flavor-changing) mechanisms involved in those reactions. Heavy flavors from space-like photons, heavy flavors from real photons, and heavy flavors from hadron-hadron collisions are discussed.

The formation of hadrons from energetic quarks, the dynamical enforcement of QCD confinement, is not well understood at a fundamental level. In Deep Inelastic Scattering, modifications of the distributions of identified hadrons emerging from nuclei of different sizes reveal a rich variety of spatial and temporal characteristics of the hadronization process, including its dependence on spin, flavor, energy, and hadron mass and structure. The EIC will feature a wide range of kinematics, allowing a complete investigation of medium-induced gluon bremsstrahlung by the propagating quarks, leading to partonic energy loss. This fundamental process, which is also at the heart of jet quenching in heavy ion collisions, can be studied for light and heavy quarks at the EIC through observables quantifying hadron ``attenuation'' for a variety of hadron species. Transverse momentum broadening of hadrons, which is sensitive to the nuclear gluonic field, will also be accessible, and can be used to test our understanding from pQCD of how this quantity evolves with pathlength, as well as its connection to partonic energy loss. The evolution of the forming hadrons in the medium will shed new light on the dynamical origins of the forces between hadrons, and thus ultimately on the nuclear force. Supported by the Comision Nacional de Investigacion Cientifica y Tecnologica (CONICYT) of Chile.

Many of the new structures observed since 2003 in experiments in the heavy-quarkonium mass region, such as the X(3872) and Zc(3900), are rather close to certain thresholds, and thus are good candidates for hadronic molecules, which are loosely bound systems of hadrons. We will discuss the consequences of heavy-quark symmetry for hadronic molecules with heavy quarks. We will also emphasize that the hadronic molecular component of a given structure can be directly probed in long-distance processes, while short-distance processes are not sensitive to it.

The XIII International Workshop on Hadron Physics (XIII Hadron Physics) is intended for graduate students, postdocs and researchers in hadronic physics, high-energy physics, astrophysics and effective field theories who wish to improve their theoretical background, learn about recent experimental results and develop collaboration projects. The Hadron Physics series, running since 1988, has the format of an advanced school and aims to introduce, in a series of pedagogical lectures, new lines of research in strong-interaction physics, mainly concerned with QCD. It also aims to stimulate collaborations at the international level.

The scalar strong-interaction hadron theory (SSI) is a first-principles, nonlocal theory at the quantum-mechanical level that provides an alternative to low-energy QCD and to the Higgs-related part of the Standard Model. The quark-quark interaction is scalar rather than color-vectorial. A set of equations of motion for mesons and another set for baryons have been constructed. This book provides an account of the present state of a theory still at an early stage of development. It will help researchers interested in entering this field and serve as a basis for possible future development of the theory.

We study the properties of charmed hadrons in dense matter within a coupled-channel approach which accounts for Pauli-blocking effects and meson self-energies in a self-consistent manner. We analyze the behaviour in this dense environment of dynamically generated baryonic resonances as well as the open-charm meson spectral functions. We discuss the implications of the in-medium properties of open-charm mesons for the D_s0(2317) and the predicted X(3700) scalar resonances. (authors)

This booklet contains a basic guide to the principles of the production of ionizing radiation and to methods of radiation protection and dosimetry, and a discussion of the need for shielded enclosures. Shielding materials and the design of the enclosures are described

Conclusions: An overall noise reduction of 25 dB with the use of mineral wool as an extra liner on the inside of the enclosure suggests that the effectiveness of the enclosure can be increased by using such absorber materials.

This report describes several electric, electronic and electromechanical assemblies used in radioactive handling enclosures. The authors give an overview of existing or planned devices: a device to lift covers, a brightness comparator, a high-voltage device for electrophoresis, a level sensor/regulator, and a regulation device to control under-pressure in an enclosure.

Insertion-loss prediction of large acoustical enclosures using the Statistical Energy Analysis (SEA) method is presented. The SEA model consists of three elements: the sound field inside the enclosure, the vibration energy of the enclosure panel, and the sound field outside the enclosure. It is assumed that the space surrounding the enclosure is sufficiently large that there is no energy flow from the outside to the wall panel or to the air cavity inside the enclosure. Comparison of the predicted insertion loss with measured data for typical large acoustical enclosures shows good agreement. It is found that if the critical frequency of the wall panel falls above the frequency region of interest, the insertion loss is dominated by the sound transmission loss of the wall panel and the averaged sound absorption coefficient inside the enclosure. However, if the critical frequency of the wall panel falls into the frequency region of interest, the acoustic power radiated by the wall panel must be added to the acoustic power transmitted through the panel.
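A minimal numerical sketch of the below-critical-frequency regime described above (insertion loss set by panel transmission loss plus interior absorption) might look as follows; the field-incidence mass-law constant and the example panel are assumptions for illustration, not values from the paper:

```python
import math

def mass_law_tl_db(f_hz: float, m_kg_m2: float) -> float:
    """Approximate field-incidence mass-law transmission loss of a panel."""
    return 20 * math.log10(f_hz * m_kg_m2) - 47.0

def insertion_loss_db(tl_db: float, alpha_bar: float) -> float:
    """Simplified close-fitting enclosure estimate, valid below the panel's
    critical frequency: IL ~ TL + 10*log10(average absorption coefficient)."""
    return tl_db + 10 * math.log10(alpha_bar)

tl = mass_law_tl_db(1000.0, 10.0)            # assumed 10 kg/m^2 panel at 1 kHz
print(round(tl, 1))                          # 33.0 dB
print(round(insertion_loss_db(tl, 0.5), 1))  # 30.0 dB with alpha_bar = 0.5
```

The 10·log10(ᾱ) term makes the paper's point concrete: halving the interior absorption costs 3 dB of insertion loss regardless of how heavy the panel is.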

"Inside CERN's Large Hadron Collider" by Mario Campanelli. Presentation on Wednesday, 16 March at 4 p.m. in the Library (bldg 52-1-052) The book aims to explain the historical development of particle physics, with special emphasis on CERN and collider physics. It describes in detail the LHC accelerator and its detectors, describing the science involved as well as the sociology of big collaborations, culminating with the discovery of the Higgs boson. Inside CERN's Large Hadron Collider Mario Campanelli World Scientific Publishing, 2015 ISBN 9789814656641​

The Japanese Hadron Project (JHP) is a large-scale future plan of interdisciplinary and international scope, aimed at basic physics research by creating and using various secondary beams of unstable particles such as mesons, muons, neutrons and accelerated exotic nuclei. It comprises a high-intensity 1 GeV proton linac, a compressor/stretcher ring and an ISOL/accelerator to deliver beams to the MESON, NEUTRON and EXOTIC NUCLEI arenas. In addition, as the present ongoing project, we are pushing the KAON arena based on the KEK 12 GeV proton synchrotron. The present paper describes the scientific motivation and technological basis for this future project as well as the ongoing pre-JHP research activities. (author)

This paper identifies some of the more important issues related to the design of a control system for the Advanced Hadron Facility (AHF). It begins with a brief description of the site layout and how the various accelerators operate in tandem to deliver beam to several experimental areas. Then it focuses on the control system by estimating from existing installations the number of data and control channels to be expected for the AHF. The total comes to 50,000. This channel count is converted to manpower and cost estimates for the control system by extrapolating from other accelerator facilities. Finally, special attention is given to two subsystems -- magnets and diagnostic equipment -- and the impact they will have on the control system. 11 refs., 5 figs., 6 tabs

Accelerator technology has advanced tremendously since the introduction of accelerators in the 1930s, and particle accelerators have become indispensable instruments in high energy physics (HEP) research to probe Nature at smaller and smaller distances. At present, accelerator facilities can be classified into Energy Frontier colliders that enable direct discoveries and studies of high mass scale particles and Intensity Frontier accelerators for exploration of extremely rare processes, usually at relatively low energies. The near term strategies of the global energy frontier particle physics community are centered on fully exploiting the physics potential of the Large Hadron Collider (LHC) at CERN through its high-luminosity upgrade (HL-LHC), while the intensity frontier HEP research is focused on studies of neutrinos at the MW-scale beam power accelerator facilities, such as Fermilab Main Injector with the planned PIP-II SRF linac project. A number of next generation accelerator facilities have been proposed and are currently under consideration for the medium- and long-term future programs of accelerator-based HEP research. In this paper, we briefly review the post-LHC energy frontier options, both for lepton and hadron colliders in various regions of the world, as well as possible future intensity frontier accelerator facilities.

Condensation- and moisture-related problems cause failures in many cases and are consequently a serious reliability concern in the electronics industry. Thus, it is important to control the moisture content and the relative humidity inside electronic enclosures. In this work, a computational fluid dynamics (CFD) model is developed to simulate moisture transfer into a typical electronic enclosure. In a first attempt, an isothermal case is developed and compared against the well-known RC circuit analogy for the behavior of an idealized electronic enclosure. It is shown that the RC method predicts a faster trend for the moisture transfer into the enclosure compared to the CFD. The effect of several important parameters, namely the position of the opening, the initial relative humidity inside the enclosure, the length and radius of the opening, and the temperature, is studied using the developed...
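The RC circuit analogy mentioned above treats the enclosure as a first-order system: the opening acts as a diffusion resistance R and the air volume as a moisture capacitance C. A minimal sketch, with an assumed time constant purely for illustration:

```python
import math

def rh_inside(t_s: float, rh_out: float, rh0: float, tau_s: float) -> float:
    """First-order (RC) step response of the relative humidity inside an
    enclosure with one small opening; tau = R*C, with R the diffusion
    resistance of the opening and C the moisture capacity of the air volume."""
    return rh_out + (rh0 - rh_out) * math.exp(-t_s / tau_s)

tau = 3600.0  # assumed 1-hour time constant, for illustration only
# After one time constant the step from 30 %RH toward 80 %RH is ~63% complete.
print(round(rh_inside(tau, rh_out=80.0, rh0=30.0, tau_s=tau), 1))  # 61.6
```

Because the real transport is distributed rather than lumped, a single exponential like this tends to respond faster than the CFD solution, which is consistent with the comparison reported above.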

The multiplicity distribution is expressed in terms of supercluster production in hadronic processes at high energy. This process creates unstable clusters at intermediate stages and hadrons in the final stage. It includes Poisson-transform distributions (with the partially coherent distribution as a special case) and is very flexible for phenomenological analyses. The associated Koba-Nielsen-Olesen limit and the behavior of cumulant moments are analyzed in detail for finite and/or infinite cluster size and particle number per cluster. In general, a supercluster distribution does not need to be equivalent to a negative binomial distribution to fit experimental data well. Furthermore, the requirement of such equivalence leads to many solutions in which the average size of the cluster is not logarithmic: e.g., it may show a power behavior instead. Superclustering is defined as a two- or multi-stage process underlying observed global multiplicity distributions. At the first stage of the production process, individual clusters are produced according to a given statistical law. For example, the clustering distribution may be described by partially coherent (or even sub-Poissonian) distribution models. At the second stage, the clusters are considered as the sources of particle production. The corresponding distribution may then be as general as the clustering distribution just mentioned. 8 refs

I discuss several novel and unexpected aspects of quantum chromodynamics. These include: (a) the nonperturbative origin of intrinsic strange, charm and bottom quarks in the nucleon at large x; the breakdown of pQCD factorization theorems due to the lensing effects of initial- and final-state interactions; (b) important corrections to pQCD scaling for inclusive reactions due to processes in which hadrons are created at high transverse momentum directly in the hard processes and their relation to the baryon anomaly in high-centrality heavy-ion collisions; and (c) the nonuniversality of quark distributions in nuclei. I also discuss some novel theoretical perspectives in QCD: (a) light-front holography - a relativistic color-confining first approximation to QCD based on the AdS/CFT correspondence principle; (b) the principle of maximum conformality - a method which determines the renormalization scale at finite order in perturbation theory yielding scheme independent results; (c) the replacement of quark and gluon vacuum condensates by 'in-hadron condensates' and how this helps to resolve the conflict between QCD vacuum and the cosmological constant.

A six-month design study for a future high-energy hadron collider was initiated by the Fermilab director in October 2000. The request was to study a staged approach where a large-circumference tunnel is built that initially would house a low-field (~2 T) collider with a center-of-mass energy greater than 30 TeV and a peak (initial) luminosity of 10³⁴ cm⁻² s⁻¹. The tunnel was to be scoped, however, to support a future upgrade to a center-of-mass energy greater than 150 TeV with a peak luminosity of 2×10³⁴ cm⁻² s⁻¹ using high-field (~10 T) superconducting magnet technology. In a collaboration with Brookhaven National Laboratory and Lawrence Berkeley National Laboratory, a report of the Design Study was produced by Fermilab in June 2001 [1]. The Design Study focused on a Stage 1, 20×20 TeV collider using a 2-in-1 transmission-line magnet and leads to a Stage 2, 87.5×87.5 TeV collider using 10 T Nb₃Sn magnet technology. The article that follows is a compilation of accelerator physics designs and computational results which contributed to the Design Study. Many of the parameters found in this report evolved during the study, and thus slight differences between this text and the Design Study report can be found. The present text, however, presents the major accelerator physics issues of the Very Large Hadron Collider as examined by the Design Study collaboration and provides a basis for discussion and further studies of VLHC accelerator parameters and design philosophies.
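A quick consistency check on the staged parameters (our own back-of-envelope estimate, not from the Design Study) uses the standard rigidity relation p[TeV/c] ≈ 0.2998·B[T]·ρ[km] for protons:

```python
def bending_radius_km(beam_tev: float, b_field_t: float) -> float:
    """Magnetic bending radius for a proton beam:
    p[TeV/c] ~ 0.2998 * B[T] * rho[km]."""
    return beam_tev / (0.2998 * b_field_t)

# Stage 1: 20 TeV per beam at ~2 T; Stage 2: 87.5 TeV per beam at ~10 T.
print(round(bending_radius_km(20.0, 2.0), 1))   # ~33.4 km
print(round(bending_radius_km(87.5, 10.0), 1))  # ~29.2 km
```

The two stages give similar bending radii, which is the point of the staged design: the same tunnel can house both rings, with the residual difference absorbed by the dipole packing fraction of each lattice.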

Particle physics makes its greatest advances with experiments at the highest energy. The only sure way to advance to a higher-energy regime is through hadron colliders--the Tevatron, the LHC, and then, beyond that, a Very Large Hadron Collider. At Snowmass-1996 [1], investigators explored the best way to build a VLHC, which they defined as a 100 TeV collider. The goals in this study are different. The current study seeks to identify the best and cheapest way to arrive at frontier-energy physics, while simultaneously starting down a path that will eventually lead to the highest-energy collisions technologically possible in any accelerator using presently conceivable technology. This study takes the first steps toward understanding the accelerator physics issues, the technological possibilities and the approximate cost of a particular model of the VLHC. It describes a staged approach that offers exciting physics at each stage for the least cost, and finally reaches an energy one-hundred times the highest energy currently achievable.

Remarkable scientific and technological progress in recent years has led to the construction of accelerator-based facilities dedicated to hadron therapy. This kind of technology requires precise and continuous control of the position, intensity and shape of the proton or ion beams used to irradiate tumours. Patient safety, accelerator operation and dose delivery should be optimized by real-time monitoring of beam intensity and profile during the treatment, using non-destructive, high-spatial-resolution detectors. In the framework of the AMIDERHA (Enhanced Radiotherapy with HAdron) project, funded by the Ministero dell'Istruzione, dell'Università e della Ricerca (Italian Ministry of Education and Research), the authors are studying and developing an innovative beam monitor based on Micro-Pattern Gaseous Detectors (MPGDs), characterized by high spatial resolution and rate capability. A Monte Carlo simulation of the beam monitor prototype was carried out to optimize the geometrical setup and to predict the behavior of the detector. A first prototype has been constructed and successfully tested using 55Fe and 90Sr sources and also an X-ray tube. Preliminary results on both simulations and tests will be presented.

Based on the RFQ and H-type DTL structures, this dissertation is dedicated to studying the beam dynamics in the presence of significantly strong space-charge effects while accelerating intense hadron beams in the low- and medium-β region. Besides the 5 mA/30 mA, 17 MeV proton injector (RFQ+DTL) and the 125 mA, 40 MeV deuteron DTL of the EUROTRANS and IFMIF facilities, a 200 mA, 700 keV proton RFQ has also been intensively studied for a small-scale but ultra-intense neutron source, FRANZ, planned at Frankfurt University. The most remarkable properties of the FRANZ RFQ and the IFMIF DTL are the design beam intensities, 200 mA and 125 mA. A new design approach, which can provide balanced and accelerated beam bunching at low energy, has been developed for intense beams. To design the IFMIF DTL and the injector DTL part of the EUROTRANS driver linac, which have been foreseen as the first real applications of the novel superconducting CH-DTL structure, intensive attempts have been made to fulfill the design goals under the new conditions. For the IFMIF DTL, the preliminary IAP design has been considerably improved with respect to the linac layout as well as the beam dynamics. By reserving sufficient drift spaces for the cryosystem, diagnostic devices, tuners and steerers, introducing SC solenoid lenses, and adjusting the accelerating gradients and accordingly the other configurations of the cavities, a more realistic, reliable and efficient linac system has been designed. On the other hand, the specifications and positions of the transverse focusing elements, as well as the phase and energy differences between the bunch-center particle and the synchronous particle at the beginning of the φ_s = 0 sections, have been totally redesigned. For the EUROTRANS injector DTL, in addition to the above-mentioned procedures, extra optimization concepts to coordinate the beam dynamics between the two intensities have been applied. In the beam transport simulations for both DTL designs

The study of heavy quark production at hadron colliders will provide important tests and measurements within and possibly beyond the Standard Model. The results of a recent calculation of heavy quark hadronic production correlation properties at the full next-to-leading order (NLO) in perturbative QCD are presented. These properties are important for several applications.

Review lectures are given on lepton-hadron interactions at high energy-momentum transfers. Included are the inelastic scattering of leptons by nucleons and the process e⁺e⁻ → hadrons. Experimental evidence for the quark-parton picture is reviewed, and the more outstanding phenomena are discussed: charmed mesons and baryons, Bjorken scaling violation, and new quantum numbers and couplings.

Scintillating tiles are carefully mounted in the hadronic calorimeter for the LHCb detector. These calorimeters measure the energy of particles that interact via the strong force, called hadrons. The detectors are made in a sandwich-like structure where these scintillator tiles are placed between metal sheets.

Lectures are given on some of the theoretical ideas involved in electroproduction, neutrino production and electron-positron annihilation into hadrons. In so doing, a study is simultaneously made of both the short-distance behavior of products of currents and hadron structure.

Jet finding algorithms, as they are used in e⁺e⁻ and hadron collisions, are reviewed and compared. It is suggested that a successive-combination style algorithm, similar to that used in e⁺e⁻ physics, might be useful also in hadron collisions, where cone-style algorithms have been used previously

A study of the production of strange octet and decuplet baryons in hadronic decays of the Z recorded by the DELPHI detector at LEP is presented. This includes the first measurement of the Σ± average multiplicity. The total and differential cross sections, the event topology and the baryon-antibaryon correlations are compared with current hadronization models.

Light-front dynamics (LFD) plays an important role in the analyses of relativistic few-body systems. As evidenced from the recent studies of generalized parton distributions (GPDs) in hadron physics, a natural framework for a detailed study of hadron structures is LFD due to its direct application in

The correlations between the particles produced in interactions of hadrons with emulsion nuclei were investigated. The data are in qualitative agreement with the models which describe the interactions with nuclei as subsequent collisions of the fast part of excited hadronic matter inside the nucleus. (author)

Semi-inclusive deep inelastic scattering (SIDIS) has been used extensively in recent years as an important testing ground for QCD. Studies so far have concentrated on better determination of parton distribution functions, distinguishing between the quark and antiquark contributions, and understanding the fragmentation of quarks into hadrons. Hadron pair (di-hadron) SIDIS provides information on the nucleon structure and hadronization dynamics that complement single hadron SIDIS. Di-hadrons allow the study of low- and high-twist distribution functions and Dihadron Fragmentation Functions (DiFF). Together with the twist-2 PDFs (f1, g1, h1), the Higher Twist (HT) e and hL functions are very interesting because they offer insights into the physics of the largely unexplored quark-gluon correlations, which provide access into the dynamics inside hadrons. The CLAS spectrometer, installed in Hall-B at Jefferson Lab, has collected data using the CEBAF 6 GeV longitudinally polarized electron beam on longitudinally polarized solid NH3 targets. Preliminary results on di-hadron beam-, target- and double-spin asymmetries will be presented.

The number of electronic devices used in outdoor environments is constantly growing. Humidity causes about 19% of all electronics failures, and moisture-related problems are aggravated by the ongoing miniaturization and lower power consumption of electronic components. Moisture loads are still not well understood by design engineers, so this field has become one of the bottlenecks in electronics system design. The objective of this paper is to model moisture ingress into an electronics enclosure under isothermal conditions. The moisture diffusion model is based on a 1D quasi-steady-state (QSS) approximation of Fick's second law. This QSS approach is also described with an electrical analogy, which provides a fast tool for modelling the moisture response. The same QSS method is applied to ambient water vapour variations. The obtained results are compared to an analytical solution and very good agreement is found.
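The electrical analogy reduces the enclosure to a first-order RC network; a minimal sketch of that idea (the time constant and the step input below are hypothetical illustrative values, not results from the paper):

```python
import math

def qss_moisture_response(c_out, c0, tau, dt, n_steps):
    """Interior vapour concentration under the 1D quasi-steady-state (QSS)
    approximation of Fick's second law. In the electrical analogy the wall
    acts as a diffusion 'resistance' R and the interior air volume as a
    vapour 'capacitance' C, so the enclosure is a first-order RC low-pass:
        dC_in/dt = (C_out - C_in) / tau,   tau = R * C
    """
    c_in, history = c0, [c0]
    for _ in range(n_steps):
        c_in += dt * (c_out - c_in) / tau  # explicit Euler step
        history.append(c_in)
    return history

# Step response: a dry enclosure (0 %RH inside) suddenly exposed to an
# 80 %RH ambient; tau = 24 h is an assumed time constant.
tau, dt = 24.0, 0.01                       # hours
hist = qss_moisture_response(80.0, 0.0, tau, dt, int(5 * tau / dt))
exact = 80.0 * (1.0 - math.exp(-5.0))      # analytical solution at t = 5*tau
print(abs(hist[-1] - exact) < 0.1)         # Euler tracks the exact response
```

For a time-varying ambient (e.g. diurnal humidity cycles) the same loop applies with `c_out` updated each step, which is how a QSS model can follow ambient water-vapour variations.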

This invention concerns thermal insulation assemblies, and in particular a metallic assembly of stays and enclosures for a reflecting-type panel used for insulation in nuclear reactors. Great flexibility is achieved by a corrugated strip placed edgewise around the entire periphery of the first reflecting insulating sheet. A second reflecting insulating sheet is then superposed on this corrugated strip, which acts as a thickness spacer along the periphery of both sheets and also hermetically closes the intermediate space. The corrugations of the edge strip allow both sheets to be curved lengthwise or crosswise without causing their spacing to vary. These corrugations simply open like the pleats of an accordion or a fan to fit the curve of the greatest radius

The application of perturbative quantum chromodynamics to the dynamics of hadrons at short distance is reviewed, with particular emphasis on the role of the hadronic bound state. A number of new applications are discussed, including the modification to QCD scaling violations in structure functions due to hadronic binding; a discussion of coherence and binding corrections to the gluon and sea-quark distributions; QCD radiative corrections to dimensional counting rules for exclusive processes and hadronic form factors at large momentum transfer; generalized counting rules for inclusive processes; the special role of photon-induced reactions in QCD, especially applications to jet production in photon-photon collisions, and photon production at large transverse momentum. Also presented is a short review of the central problems in large-p_T hadronic reactions and the distinguishing characteristics of gluon and quark jets.

This paper discusses the need for higher energy accelerators to probe the mysteries of the subatomic universe. Intermediate vector bosons are discussed as well as symmetry breaking and the standard model

Data from the recent measurements of very forward baryon and photon production with the H1 and ZEUS detectors at electron-proton collider HERA are presented and compared to the theoretical calculations and Monte Carlo models. Results are presented of the production of leading protons, neutrons and photons in deep inelastic scattering (ep → e' pX, ep → e'nX, ep → e' γ X) as well as the leading neutron production in the photoproduction of di-jets (ep → ejjXn). The forward baryon and photon results from the H1 and ZEUS Experiments are compared also with the models of the hadronic interactions of high energy Cosmic Rays. The sensitivity of the HERA data to the differences between the models is demonstrated. (authors)

The simplest, naive model for a unified description of leptons and hadrons consists in postulating, besides the usual quarks p, n, λ, a fourth quark with a very heavy mass and very high binding to pairs like p̄n and p̄λ. In an SU(4) scheme the fourth quark has a quantum number, charm, which may be taken as proportional to the lepton number. Muons would be distinguished from electrons by the occurrence of a λ-quark instead of an n-quark in their structure. The forces among these quarks would have to be such as to give leptons an almost point-like structure at the experimentally known energies, as well as an absence of strong interactions at these energies. However, one would expect leptons to display strong interactions at extremely high energies

Λ, Λ̄, and K⁰_S production is studied in a Monte Carlo dual parton model for hadron-hadron, hadron-nucleus, and nucleus-nucleus collisions with an SU(3) symmetric sea for chain formation (chain ends) but strangeness suppression in the chain fragmentation process. Additionally, (qq)-(q̄q̄) production from the sea was introduced into the chain formation process with the same probability as for the q→qq branching within the chain decay process. With these assumptions, multiplicity ratios and Feynman-x distributions for strange particles in h-h collisions and multiplicity ratios in heavy-ion collisions are reasonably well reproduced

A method for calculating high energy hadron cascades induced by multi-GeV electron and photon beams is described. Using the EGS4 computer program, high energy photons in the EM shower are allowed to interact hadronically according to the vector meson dominance (VMD) model, facilitated by a Monte Carlo version of the dual multistring fragmentation model which is used in the hadron cascade code FLUKA. The results of this calculation compare very favorably with experimental data on hadron production in photon-proton collisions and on hadron production by electron beams on targets (i.e., yields in secondary particle beam lines). Electron-beam-induced hadron star density contours are also presented and compared with those produced by proton beams. This FLUKA-EGS4 coupling technique could find use in the design of secondary beams, in the determination of high energy hadron source terms for shielding purposes, and in the estimation of induced radioactivity in targets, collimators and beam dumps

A reduced-scale model and CFD predictions were used to investigate experimentally and numerically the airflow patterns within a ceiling-slot ventilated enclosure loaded with slotted boxes. The experiments were carried out with a laser Doppler anemometer. This paper concerns the air velocity characteristics within the jet and outside the boxes. The results highlight the confinement effect due to the enclosure and the influence of load porosity on the jet penetration, its development and hence the heterogeneity of ventilation within the enclosure. The numerical predictions obtained with the computational fluid dynamics Fluent package using the RSM turbulence model show rather good agreement with experimental data

The right of enclosure obtained ex gratia offers the possibility of analyzing the aims of those who undertook that system in the eighteenth century. Their main goal was to divide, with legal permission, a public property in order to work it in private interest. The capitalistic character of the intervention shows itself in the investment required to carry out the enclosure and in the quest for higher returns from the land. Geographically, enclosures are mainly located in Western Andalusia, and their aim was to grow olive trees, vines and grassland.

One of the most important developments in physics is the increasing understanding of subatomic phenomena. Subatomic physics today belongs to the canonical parts of a physics curriculum, and many universities therefore offer an introductory course on it. The first edition arose from a script for such courses. Subatomic physics has changed distinctly since the first edition; because I still consider the concept of the book sound, I decided on a new edition. Many textbooks and courses in nuclear and particle physics try to steer students in a certain direction. This is surely appropriate at an advanced stage of study, but at the bachelor level it can lead to an insufficiently broad development, and the book tries to counteract this. How physical phenomena are to be described depends on the energy scale; for each scale the book gives a concise introduction to the description required there. In this way uniformity is achieved and wrong priorities among the fields are avoided. The list of necessary changes is long, and I cite only some topics here. The chapter on high-energy accelerators was antiquated; many of the accelerators planned at that time were never realized. The new accelerators that were realized open new regions in hadron and heavy-ion physics, and new observations and concepts had to be included. How quarks bind into hadrons is better understood today and requires an extensive discussion. The application range of perturbative quantum chromodynamics could also be extended in different directions by new methods. The essential occasion for the new edition is the experimental detection of the Higgs particle, which must now be treated extensively. A careful revision led to a very large number of corrections and smaller improvements.

A review of the lifetimes of B hadrons measured by the CDF collaboration at Fermilab is presented. The data correspond to 110 pb⁻¹ of p p̄ collisions at √s = 1.8 TeV. The inclusive B hadron lifetime is measured using a high-statistics sample of B → J/ψ X decays. Species-specific lifetimes of the B⁺, B⁰, B_s⁰, and Λ_b⁰ are determined using both fully reconstructed decays and partially reconstructed decays consisting of a lepton associated with a charm hadron

We present general constraints on dark matter stability in hadronic decay channels derived from measurements of cosmic-ray antiprotons. We analyze various hadronic decay modes in a model-independent manner by examining the lowest-order decays allowed by gauge and Lorentz invariance for scalar and fermionic dark matter particles and present the corresponding lower bounds on the partial decay lifetimes in those channels. We also investigate the complementarity between hadronic and gamma-ray constraints derived from searches for monochromatic lines in the sky, which can be produced at the quantum level if the dark matter decays into quark-antiquark pairs at leading order.

A brief exposition of contemporary non-perturbative methods based on the Schwinger-Dyson (SDE) and Bethe-Salpeter equations (BSE) of Quantum Chromodynamics (QCD) and their application to hadron physics is given. These equations provide a non-perturbative continuum formulation of QCD and are a powerful and promising tool for the study of hadron physics. Results on some properties of hadrons based on this approach, with particular attention to the pion distribution amplitude, elastic, and transition electromagnetic form factors, and their comparison to experimental data are presented. (paper)

The duality between the partonic and hadronic descriptions of electron--nucleon scattering is a remarkable feature of nuclear interactions. When averaged over appropriate energy intervals the cross section at low energy which is dominated by nucleon resonances resembles the smooth behavior expected from perturbative QCD. Recent Jefferson Lab results indicate that quark-hadron duality is present in a variety of observables, not just the proton F2 structure function. An overview of recent results, especially local quark-hadron duality on the neutron, are presented here.

A report on the 7th Workshop on Hadron Interactions is presented. Several works, both theoretical and experimental, on progress in hadron-hadron, hadron-nucleus and nucleus-nucleus interactions at very high energies are discussed. Cosmic-ray interactions are also included.

Non-extensive thermodynamics is a novel approach in high-energy physics. In high-energy heavy-ion, and especially in proton-proton collisions, we are far from a canonical thermal state described by Boltzmann-Gibbs statistics. In these reactions low and intermediate transverse-momentum spectra are extremely well reproduced by the Tsallis-Pareto distribution, but the physical origin of the Tsallis parameters is still an unsettled question. Here, we analyze whether the Tsallis-Pareto energy distribution also overlaps with hadron spectra at high p_T. We fitted data measured in proton-proton (proton-antiproton) collisions over a wide center-of-mass energy range, from 200 GeV at RHIC up to 7 TeV at the LHC. Furthermore, our test is extended to an investigation of a possible √s-dependence of the power in the Tsallis-Pareto distribution, motivated by QCD evolution equations. We found that Tsallis-Pareto distributions fit high-p_T data well over this wide center-of-mass energy range. Deviations from the fits appear at p_T > 20-30 GeV/c, especially in the CDF data. Introducing a p_T-scaling ansatz, the fits at low and intermediate transverse momenta still remain good, and the deviations tend to disappear at the highest p_T.
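A commonly used form of the Tsallis-Pareto spectrum is dN/dp_T ∝ (1 + p_T/(nT))^(-n); a short sketch (the parameter values are illustrative, not the paper's fit results) of how it interpolates between a Boltzmann-Gibbs exponential at large n and a power-law tail at finite n:

```python
import math

def tsallis_pareto(pt, T, n, A=1.0):
    """Tsallis-Pareto transverse-momentum spectrum:
        dN/dpT = A * (1 + pT/(n*T))**(-n)
    For n -> infinity this reduces to the Boltzmann-Gibbs exponential
    A*exp(-pT/T); at finite n the high-pT tail is a power law ~ pT**(-n)."""
    return A * (1.0 + pt / (n * T)) ** (-n)

T = 0.16   # GeV, assumed soft-scale temperature
pt = 1.0   # GeV/c
# Large-n limit recovers the exponential (Boltzmann-Gibbs) spectrum...
print(abs(tsallis_pareto(pt, T, n=1e6) / math.exp(-pt / T) - 1.0) < 1e-3)
# ...while a finite power (n ~ 7, typical of pp fits) gives a much harder,
# power-law tail at high pT than the exponential would allow:
print(tsallis_pareto(10.0, T, n=7.0) / math.exp(-10.0 / T) > 1e10)
```

The √s-dependence of the power discussed above would enter here as an energy-dependent n(√s) in the fit.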

This experiment measures the production of direct real photons with large transverse momentum in pion-nucleon collisions at the SPS (H8 beam) using the NA3 spectrometer with an upgraded e-γ calorimeter. The experiment proceeds in steps of increasing complexity: a) measurement of the direct γ cross-section in π±C → γ + X and search for the annihilation process q q̄ → γ g by measuring the charge asymmetry at 200 GeV/c; b) determination of the gluon structure function of the pion and the nucleon; c) use of the π⁻-π⁺ difference on carbon, if found experimentally, to extract the gluon fragmentation from the γ-hadron correlations. For comparison, the quark fragmentation functions can, in principle, be extracted from processes where the Compton scattering qg → qγ dominates and compared with data from D.I.S. as a test of the method. The existing standard NA3 spectrometer is we...

A brief discussion is given on the feasibility of using lasers to accelerate particle beams. A rough theory of operation is developed, and numerical results are obtained for an example equivalent to the Fermilab Accelerator

Quantum Chromodynamics (QCD), the theory of strong interactions, in principle describes the interaction of quark and gluon fields. However, due to the self-coupling of the gluons, quarks and gluons are confined into hadrons and cannot exist as free particles. The quantitative understanding of this confinement phenomenon, which is responsible for about 98% of the mass of the visible universe, is one of the major open questions in particle physics. The measurement of the excitation spectrum of hadrons and of their properties gives valuable input to theory and phenomenology. In the Constituent Quark Model (CQM) two types of hadrons exist: mesons, made out of a quark and an antiquark, and baryons, which consist of three quarks. But more advanced QCD-inspired models and Lattice QCD calculations predict the existence of hadrons with exotic properties interpreted as excited glue (hybrids) or even pure gluonic bound states (glueballs). The COMPASS experiment at the CERN Super Proton Synchrotron has acquired large da...

The aim of the COMPASS hadron programme is to study the light-quark hadron spectrum, and in particular, to search for evidence of hybrids and glueballs. COMPASS is a fixed-target experiment at the CERN SPS and features a two-stage spectrometer with high momentum resolution, large acceptance, particle identification and calorimetry. A short pilot run in 2004 resulted in the observation of a spin-exotic state with $J^{PC} = 1^{-+}$ consistent with the debated $\pi_1(1600)$. In addition, Coulomb production data at low momentum transfer provide a test of Chiral Perturbation Theory. During 2008 and 2009, a world-leading data set was collected with hadron beam, which is currently being analysed. The large statistics allows for a thorough decomposition of the data into partial waves. The COMPASS hadron data span a broad range of channels and shed light on several different aspects of QCD.

We review some recent results on phenomenological approaches to strong interactions inspired in gauge/string duality. In particular, we discuss how such models lead to very good estimates for hadronic masses.

Hadron multiplicities in semi-inclusive deep-inelastic scattering were measured on neon, krypton and xenon targets relative to deuterium at an electron(positron)-beam energy of 27.6 GeV at HERMES. These ratios were determined as a function of the virtual-photon energy ν, its virtuality Q², the fractional hadron energy z and the transverse hadron momentum p_t with respect to the virtual-photon direction. Dependences were analysed separately for positively and negatively charged pions and kaons as well as protons and antiprotons in a two-dimensional representation. Compared to the one-dimensional dependences, some new features were observed. In particular, when z > 0.4 positive kaons do not show the strong monotonic rise of the multiplicity ratio with ν as exhibited by pions and K⁻. Protons were found to behave very differently from the other hadrons. (orig.)

The areas dσ/dΩ of fitted quasifree scattering peaks from bound nucleons for continuum hadron-nucleus spectra measuring d²σ/dΩdω are converted to sum rules akin to the Coulomb sums familiar from continuum electron scattering spectra from nuclear charge. Hadronic spectra with or without charge exchange of the beam are considered. These sums are compared to the simple expectations of a nonrelativistic Fermi gas, including a Pauli blocking factor. For scattering without charge exchange, the hadronic sums are below this expectation, as also observed with Coulomb sums. For charge-exchange spectra, the sums are near or above the simple expectation, with larger uncertainties. The strong role of hadron-nucleon in-medium total cross sections is noted from use of the Glauber model.

This paper reports the design and construction status, technical challenges, and future perspectives of the proton-linac based Compact Pulsed Hadron Source (CPHS) at the Tsinghua University, Beijing, China

Recent results from HERMES in the hadron leptoproduction of nuclei are presented. The possible interpretations in terms of medium modifications of the parton fragmentation function and the implications on the parton energy loss are discussed

The J-PARC Hadron Facility is designed as a multipurpose experimental facility for a wide range of particle and nuclear physics programs, aiming to provide the world's highest-intensity secondary beams. Currently three secondary beam lines, K1.8, K1.8BR and KL, together with the test beam line K1.1BR, are in operation. Various experimental programs have been proposed at each beam line and some of them have already been performed. As the first experiment at the J-PARC Hadron Facility, the Θ+ pentaquark was searched for via a pion-induced hadronic reaction in the autumn of 2010. Experimental programs to search for new hadronic states such as K⁻pp have also started physics runs. The current status and near-future programs are introduced.

A class of statistical models of a hadron gas allowing an analytical solution is considered. A mechanism of a possible phase transition in such a system is found and the conditions for its occurrence are determined

Results from measurements of multibody photoproduction at medium incident photon energy (2.8 to 4.8 GeV) are presented and discussed. Particular emphasis is placed on topics which are not well understood and which therefore motivate experiments with the upgraded electron accelerator and storage ring ELSA at the University of Bonn, FR Germany. (author)

As part of our ongoing studies into the potential application of 3D printing techniques to gaseous radiation detectors, we have studied the ability of 3D printed enclosures to resist environmental oxygen ingress. A set of cuboid and hexagonal-prism shaped enclosures with wall thicknesses of 4 mm, 6 mm, 8 mm and 10 mm were designed and printed in nylon using an EOSINT P 730 Selective Laser Sintering 3D printer system. These test enclosures provide a comparison of environmental gas ingress for different 3D printing techniques. The rate of change of oxygen concentration was found to be linear, decreasing as the wall thickness increases. It was also found that the hexagonal-prism geometry produced a lower rate of change of oxygen concentration compared with the cuboid-shaped enclosures. Possible reasons as to why these results were obtained are discussed. The implications of this study for deployable systems are also discussed (authors)
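The observed decrease of ingress rate with wall thickness is what a simple quasi-steady slab-permeation estimate (Fick's first law across the wall) predicts; a sketch with hypothetical parameter values (none are measured values from this study):

```python
def ingress_rate(permeability, area, thickness, delta_c, volume):
    """Quasi-steady permeation through an enclosure wall (Fick's first
    law across a slab): the rate of change of interior concentration
    scales with the inside/outside concentration difference and
    inversely with wall thickness. All numbers below are hypothetical,
    for illustration only."""
    return permeability * area * delta_c / (thickness * volume)

# Thicker printed walls slow oxygen ingress: rates for 4, 6, 8, 10 mm
# walls (SI-like units, illustrative magnitudes).
rates = [ingress_rate(1e-12, 0.06, t * 1e-3, 0.21, 1e-3)
         for t in (4, 6, 8, 10)]
print(rates == sorted(rates, reverse=True))  # monotonically decreasing
```

Note this simple model cannot by itself explain the hexagonal-vs-cuboid difference at fixed thickness; geometry enters only through the wall area here.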

Rarefied gas flow in a three-dimensional enclosure induced by nonuniform temperature distribution is numerically investigated. The enclosure has a square channel-like geometry with alternatively heated closed ends and lateral walls with a linear temperature distribution. A recently proposed implicit discrete velocity method with a memory reduction technique is used to numerically simulate the problem based on the nonlinear Shakhov kinetic equation. The Knudsen number dependencies of the vortices pattern, slip velocity at the planar walls and edges, and heat transfer are investigated. The influences of the temperature ratio imposed at the ends of the enclosure and the geometric aspect ratio are also evaluated. The overall flow pattern shows similarities with those observed in two-dimensional configurations in literature. However, features due to the three-dimensionality are observed with vortices that are not identified in previous studies on similar two-dimensional enclosures at high Knudsen and small aspect ratios.

The interpretation of quark-antiquark (q q̄) pair production and the sequential string breaking as tunneling through the event horizon of colour confinement leads to a thermal hadronic spectrum with a universal Unruh temperature, T ≃ 165 MeV, related to the quark acceleration, a, by T = a/2π. The resulting temperature depends on the quark mass and hence on the content of the produced hadrons, causing a deviation from full equilibrium and hence a suppression of strange particle production in elementary collisions. In nucleus-nucleus collisions, where the quark density is much bigger, one has to introduce an average temperature (acceleration) which dilutes the quark-mass effect, and the strangeness suppression almost disappears.

In a ceremony on 30 July, three of the 200 suppliers for the Large Hadron Collider (LHC) were presented with Golden Hadron awards. It is the third year that the awards have been presented to suppliers, not only for their technical and financial achievements but also for their compliance with contractual deadlines. This year the three companies are all involved in the supplies for the LHC's main magnet system.

The authors summarize and discuss present and future experiments on decays of light mesons and muons that were presented in the Hadronic Weak Interaction working group session of the "Workshop on Future Directions in Particle and Nuclear Physics at Multi-GeV Hadron Facilities." Precise measurements and rare-decay searches, which sense mass scales in the 1-1000 TeV region, are discussed in the context of the standard model and beyond.

A summary of six LHCb results on the topic of charmless hadronic b-hadron decays is presented. These are comprised of: a search for the decay and updated branching fraction measurements of decays (h=K,π) [1]; the first observation of the decays and strong evidence for the decay [2]; the first observation of the decay [3]; a search for the decay [4]; the first observation of the decay [5] and evidence for CP-violation in decays [6].

The STAR Experiment at RHIC is capable of a wide variety of measurements of the production of various hadrons in nuclear collisions. Measurements of the relative production of particle yields can shed light on the validity of models for hadron production in these collisions, which may help characterize the source. Preliminary results on these measurements at RHIC energies of √s_NN = 130 GeV and 200 GeV are presented and comparisons to some models are discussed. (author)

The QCD counterpart of Hawking radiation from black holes leads to thermal hadron production in high energy collisions, from e+e− annihilation to heavy ion interactions. This hadronic radiation is emitted at a universal temperature T ≅ (σ/2π)^(1/2), where the string tension σ measures the colour field at the event horizon of confinement. Moreover, the emitted radiation is thermal 'at birth'; since the event horizon prevents all information transfer, no memory has to be destroyed kinetically. (author)
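The relation T ≅ (σ/2π)^(1/2) can be checked numerically. The string tension value σ ≈ (0.42 GeV)² used below is a commonly quoted figure and is our assumption, not a number given in the abstract:

```python
import math

# Hadronic Hawking/Unruh temperature T = sqrt(sigma / (2*pi)),
# with the string tension sigma in GeV^2 (natural units).
sigma_gev2 = 0.42 ** 2                        # assumed sigma ~ (0.42 GeV)^2
T_gev = math.sqrt(sigma_gev2 / (2 * math.pi))
T_mev = 1000 * T_gev                          # lands near the ~165 MeV scale
```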

We show that, in wide classes of supergravity models, the ratio of the leptonic and hadronic decay widths of winos is very different from that of heavy leptons which decay via the standard weak interaction. For example, it becomes 20% in one case and 70% in another case, while it will be 36% in the case of heavy leptons. The production of wino pairs in the next e+e− experiments will be tested by the ratio of leptonic and hadronic final states. (Author)

Quark hadron dualism is discussed, based on observing the changes in the quark model characteristics after the inclusion of hadron degrees of freedom. A standard version of the potential model is presented. The potential which is responsible for the formation of mesons may be divided into two pieces: a short-range part at distances of about 0.3 - 0.5 fm and a long-range part at distances of more than 1 fm. (R.P.). 5 refs., 2 figs.

Strong interaction coupling parameters of particles with beauty quantum number are obtained using dispersion sum rules in various forms, e.g. current algebra sum rules, superconvergence sum rules and finite energy sum rules etc. These sum rules lead to a set of algebraic relations among masses and coupling constants. We compare the hadronic couplings of beautiful particles as obtained from various techniques and discuss their implications on the hadronic production of these states. (author)

Natural convection heat transfer and fluid flow have been examined numerically using the control-volume finite-element method in an isosceles prismatic cavity, subjected to a uniform heat flux from below while the inclined sides are maintained isothermal and the vertical walls are assumed to be perfect thermal insulators, without symmetry assumptions for the flow structure. The aim of the study is to examine the occurrence of a pitchfork bifurcation. The governing parameters for the heat transfer and flow fields are the Rayleigh number and the aspect ratio of the enclosure. It has been found that the heated wall is not isothermal and the flow structure is sensitive to the aspect ratio. It is also found that heat transfer increases with increasing Rayleigh number and decreases with increasing aspect ratio. The effects of aspect ratio become significant especially for higher values of the Rayleigh number. Eventually the obtained results show that a pitchfork bifurcation occurs at a critical Rayleigh number, above which the symmetric solution becomes unstable and asymmetric solutions are obtained instead.
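The Rayleigh number governing the convection above is Ra = gβΔT L³/(να). A minimal sketch with illustrative air properties near room temperature (all numerical values below are assumptions, not taken from the study):

```python
# Rayleigh number Ra = g * beta * dT * L^3 / (nu * alpha):
# buoyancy forcing relative to viscous and thermal diffusion.
def rayleigh(g, beta, dT, L, nu, alpha):
    return g * beta * dT * L**3 / (nu * alpha)

# Assumed illustrative values for air: expansion coefficient beta ~ 1/300 K^-1,
# kinematic viscosity nu and thermal diffusivity alpha near room temperature,
# a 10 K temperature difference over a 10 cm cavity.
Ra = rayleigh(g=9.81, beta=1/300, dT=10.0, L=0.1, nu=1.5e-5, alpha=2.1e-5)
# Ra is on the order of 1e7 for these values.
```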

This report describes the modeling of typical wall assemblies that have performed well historically in various climate zones. The WUFI (Wärme und Feuchte instationär) software (Version 5.3) model was used. A library of input data and results are provided. The provided information can be generalized for application to a broad population of houses, within the limits of existing experience. The WUFI software model was calibrated or tuned using wall assemblies with historically successful performance. The primary performance criteria or failure criteria establishing historic performance was moisture content of the exterior sheathing. The primary tuning parameters (simulation inputs) were airflow and specifying appropriate material properties. Rational hygric loads were established based on experience - specifically rain wetting and interior moisture (RH levels). The tuning parameters were limited or bounded by published data or experience. The WUFI templates provided with this report supply useful information resources to new or less-experienced users. The files present various custom settings that will help avoid results that will require overly conservative enclosure assemblies. Overall, better material data, consistent initial assumptions, and consistent inputs among practitioners will improve the quality of WUFI modeling, and improve the level of sophistication in the field.

It is known that the coupling between a modally reactive boundary structure of an enclosure and the enclosed sound field induces absorption in the sound field. However, the effect of this absorption on the sound-field response can vary significantly, even when material properties of the structure and dimensions of the coupled system are not changed. Although there have been numerous investigations of coupling between a structure and an enclosed sound field, little work has been done in the area of sound absorption induced by the coupling. Therefore, characteristics of the absorption are not well understood and the extent of its influence on the behavior of the sound-field response is not clearly known. In this paper, the coupling of a boundary structure and an enclosed sound field in frequency bands above the low-frequency range is considered. Three aspects of the coupling-induced sound absorption are studied, namely the effects of exciting either the structure or the sound field directly, of damping in the uncoupled sound field, and of damping in the uncoupled structure. The results provide an understanding of some features of the coupling-induced absorption and its significance to the sound-field response.

This is part one of a two-part interdisciplinary paper that examines the various forces (discourses and institutional processes) that shape prisoner-student identities. Discourses of officers from a correctional website serve as a limited, single case study of discourses that ascribe dehumanized, stigmatized identities to "the prisoner." Two critical concepts, performative spaces and identity enclosures, are proposed as potential critical, emancipatory terms to explore the prisoner-student identity work that occurs in schools and elsewhere in prison. This paper is guided by the effort to assist teachers to act as transformative intellectuals in prisons and closed-custody settings by becoming more aware of the multilayered contexts (the politics of location) that undergird their work. Seeing the "bigger picture" has implications for how and what educators teach in prison settings and, perhaps, why education works to facilitate reentry. This paper is grounded in normalization theory. Normalization theorists believe prisons can facilitate reentry when they mirror important dimensions of outside life. The performance of multiple, contextualized identities, considered here and examined in more detail in a forthcoming article, serves as an example of how educators mirror "normal" life by facilitating the performance of different roles for prisoners on the inside.

Attapulgite clay and ground blast furnace slag cement can form a low solids slurry which, after setting and curing, exhibits very low permeability and substantial strength. Compared to better known cement bentonite slurries, the conductivity is 3 orders of magnitude lower and the strength is four times higher at a similar solids content. Coefficients of permeability have been measured in the 10^-10 cm/sec range. As a containment barrier, no chemical compound has had detrimental effects on the integrity of the material. Compatibility with leachates at a pH under 2 has been demonstrated. Compared to leachable Ordinary Portland Cement and to bentonite gel shrinkage in the presence of certain organic compounds, the attapulgite clay and the selected slag cement behave as remarkably inert. A number of successful applications as vertical barriers, trenched and by the vibrated beam method, have been installed at remedial sites. Applications by jet grouting have been implemented under utilities to provide continuity. The potential for placement of such materials to form horizontal barriers by jet grouting or frac-grouting/mud jacking techniques, offers the possibility of creating complete enclosures in soils. The purely mineral nature of these slurries ensures long term chemical stability necessary for permanent containment

This report documents calculations conducted to determine if 42 low-power transmitters located within a metallic enclosure can initiate electro-explosive devices (EED) located within the same enclosure. This analysis was performed for a generic EED no-fire power level of 250 mW. The calculations show that if the transmitters are incoherent, the power available is 32 mW - approximately one-eighth of the assumed level even with several worst-case assumptions in place.
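The distinction between incoherent and coherent summation drives the report's safety margin. A toy illustration of why the coherence assumption matters (the equal per-transmitter contribution is our simplifying assumption; the report's actual analysis involves detailed worst-case path modeling):

```python
# Incoherent sources: received powers add linearly. In the coherent
# worst case, field amplitudes add, so total power scales as N^2
# times the per-source power, i.e. N times the incoherent total.
N = 42
P_incoherent_mw = 32.0               # incoherent total quoted in the report
p_each_mw = P_incoherent_mw / N      # equal per-transmitter share (assumption)
P_coherent_mw = (N * p_each_mw**0.5) ** 2   # = N * P_incoherent_mw
```

For these numbers the coherent bound would be 42 times larger than the incoherent total, which is why establishing incoherence is central when comparing against the 250 mW no-fire level.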

In many applications of chiral perturbation theory, one has to purify physical matrix elements from electromagnetic effects. On the other hand, the splitting of the Hamiltonian into a strong and an electromagnetic part cannot be performed in a unique manner, because photon loops generate ultraviolet divergences. In the present article, we propose a convention for disentangling the two effects: one matches the parameters of two theories - with and without electromagnetic interactions - at a given scale μ_1, referred to as the matching scale. This method enables one to analyse the separation of strong and electromagnetic contributions in a transparent manner. We first study in a Yukawa-type model the dependence of strong and electromagnetic contributions on the matching scale. In a second step, we investigate this splitting in the linear sigma model at one-loop order, and consider in some detail the construction of the corresponding low-energy effective Lagrangian, which exactly implements the splitting of electromagnetic and strong interactions carried out in the underlying theory. We expect these model studies to be useful in the interpretation of the standard low-energy effective theory of hadrons, leptons and photons. (orig.)

For the third year running, CERN has awarded a prize to its best LHC suppliers. Three companies were presented with the Golden Hadron 2004. On Friday 30 July, three of the two hundred suppliers for the LHC were presented with Golden Hadron awards by Lyn Evans. This is the third year that the awards have been presented. This year it was the turn of Alstom-MSA (France), Ernesto Malvestiti S.p.A. (Italy) and Simic S.p.A. (Italy) to receive awards, not only for their technical and financial achievements but also for their compliance with contractual deadlines. From left to right: Sandro Ferraris (SIMIC), Guiseppi Ginola (SIMIC), Gérard Grunblatt (ALSTOM), Phillippe Mocaer (ALSTOM), Gianfranco Malvestiti (ERNESTO MALVESTITI), Ernesto Malvestiti (ERNESTO MALVESTITI) Alstom-MSA was awarded the prize for manufacturing superconducting cable for the LHC's main magnets, the dipoles designed to steer the particles round the accelerator and the quadrupoles designed to focus the particle beams. Seven thousand kilometres ...

We present the design of a future high-energy high-luminosity electron-hadron collider at RHIC called eRHIC. We plan on adding 20 (potentially 30) GeV energy recovery linacs to accelerate and to collide polarized and unpolarized electrons with hadrons in RHIC. The center-of-mass energy of eRHIC will range from 30 to 200 GeV. A luminosity exceeding 10^34 cm^-2 s^-1 can be achieved in eRHIC using the low-beta interaction region with a 10 mrad crab crossing. We report on the progress of important eRHIC R&D such as the high-current polarized electron source, the coherent electron cooling, the ERL test facility and the compact magnets for recirculation passes. A natural staging scenario of step-by-step increases of the electron beam energy by building-up of eRHIC's SRF linacs is presented.
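For head-on ultrarelativistic beams the centre-of-mass energy is approximately √s ≈ 2√(E₁E₂). A quick check against the quoted 30-200 GeV range (the 250 GeV proton energy below is an assumed RHIC value, not stated in the abstract):

```python
import math

# Ultrarelativistic head-on collider approximation:
# sqrt(s) ~ 2 * sqrt(E1 * E2), with both energies in GeV.
def sqrt_s_collider(e1_gev, e2_gev):
    return 2 * math.sqrt(e1_gev * e2_gev)

# 20 GeV electrons on 250 GeV protons (proton energy assumed)
cm_energy = sqrt_s_collider(20, 250)   # ~141 GeV, inside the 30-200 GeV range
```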

Inclusive jet production in p-p and p̄-p collisions shows many of the same kinematic systematics as observed in single-particle inclusive production at much lower energies. In an earlier study (1974) a phenomenology, called radial scaling, was developed for the single-particle inclusive cross sections that attempted to capture the essential underlying physics of pointlike parton scattering and the fragmentation of partons into hadrons suppressed by the kinematic boundary. The phenomenology was successful in emphasizing the underlying systematics of the inclusive particle productions. Here we demonstrate that inclusive jet production at the Large Hadron Collider (LHC) in high-energy p-p collisions and at the Tevatron in p̄-p inelastic scattering shows similar behavior. ATLAS inclusive jet production plotted as a function of this scaling variable is studied for √s of 2.76, 7 and 13 TeV and is compared to p̄-p inclusive jet production at 1.96 TeV measured by CDF and D0 at the Tevatron and p-Pb inclusive jet production measured by ATLAS at the LHC at √s_NN = 5.02 TeV. Inclusive single-particle production at Fermi National Accelerator Laboratory fixed target and Intersecting Storage Rings energies is compared to inclusive J/ψ production at the LHC measured in ATLAS, CMS and LHCb. Striking common features of the data are discussed.

The ATLAS detector is a multi-purpose detector measuring the energy and direction of particles produced in proton-proton collisions at a center of mass energy of 14 TeV provided by the Large Hadron Collider at the European center of particle physics, CERN. The main aim of this thesis is to assess the precision of the present understanding of the interactions of hadrons with matter (as implemented in Monte Carlo (MC) simulations) to describe the response of the ATLAS calorimeter and to predict the correction necessary to measure the full energy of pions. The simulations are compared to testbeam data. The present description of the response of the ATLAS central calorimeter is able to predict the energy corrections, as verified by using testbeam data. For the Combined Testbeam 2004 (CTB) a full slice of the central region of the ATLAS detector including all sub-detectors has been installed in the H8 beam line of the CERN SPS accelerator. Pions and electrons with energies ranging from 1 to 350 GeV have been m...

An overview of the current status of the Main Injector Neutrino Oscillation Search (MINOS) is presented. MINOS is a long-baseline experiment with two detectors situated in North America. The near detector is based at the emission point of the NuMI beam at Fermilab, Chicago, the far detector is 735 km downstream in a disused iron mine in Soudan, Minnesota. A third detector, the calibration detector, is used to cross-calibrate these detectors by sampling different particle beams at CERN. A detailed description of the design and construction of the light-injection calibration system is included. Also presented are experimental investigations into proton-carbon collisions at 158 GeV/c carried out with the NA49 experiment at CERN. The NA49 experiment is a Time Projection Chamber (TPC) based experiment situated at CERN's North Area. It is a well established experiment with well known characteristics. The data gained from this investigation are to be used to parameterize various hadronic production processes in accelerator and atmospheric neutrino production. These hadronic production parameters will be used to improve the neutrino generation models used in calculating the neutrino oscillation parameters in MINOS.

Recent developments in cooling methods for portable electronic devices have heightened the need for using the large latent heat capacity of phase change materials (PCM). The aim of the present study is to evaluate the thermal characteristics of a PCM-based heat sink with high conductive materials. The solution is acquired as a progression of optimization stages which starts with the elemental area and proceeds toward the first assembly. Every optimization stage is the result of maximizing the safe operation time without allowing the electronics to reach the critical temperature. Primarily, the degrees of freedom and constraints were defined and then, by changing the geometrical parameters, the target function, the maximization of operation time, was optimized. Results show that the melting process in rectangular enclosures with vertical fins attached to the heated bottom surface can be affected by the contact surface between the fin and PCM and the convection of the melted PCM. For a rectangular enclosure with a constant area, it is better to use a wider enclosure than a square and thin one. Also, results indicate that the ratio of the vertical fin thickness to the horizontal one does not have a considerable effect on performance. By increasing the number of enclosures, the contact surface is raised, but the performance is not necessarily improved. - Highlights: • Thermal characteristics of a finned PCM-based heat sink are studied. • Constructal theory was used to optimize the PCM enclosures. • By increasing the number of enclosures, the performance is not necessarily improved

Three topics from the field of high energy cosmic rays that are relevant to properties of hadronic interactions at energies not accessible to existing accelerators are discussed. In each case, the implications for future experiments at ISABELLE and other accelerators planned to probe the energy range of E_Lab ≈ 10^4 GeV and beyond are evaluated. A systematic analysis of inclusive distributions of photons produced in collisions of hadrons with light nuclei is given. The overall conclusion is that, although the data is consistent with scaling for small x in the fragmentation region, the plateau appears to rise significantly beyond ISR energies with a correspondingly rapid increase in multiplicity. The situation in the more controversial field of high p_T in cosmic rays is summarized. If the suggestions of some experiments are correct, then the high p_T component of hadronic interactions must become much more important relative to the normal component for E_Lab > 10^4 GeV than would be expected by extrapolating accelerator data on high p_T using fits of the form p_T^-8. Some analyses of atmospheric cascades produced by interactions of cosmic rays of E ≥ 10^6 GeV are briefly reviewed. The interpretation of these experiments is ambiguous because the primary composition of cosmic rays is unknown at these energies. It is, however, possible to draw conclusions corresponding to various assumptions about the primary composition
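The lab energies quoted above map onto centre-of-mass energies through the standard fixed-target relation s = m₁² + m₂² + 2m₂E_lab. A minimal check for proton-proton kinematics:

```python
import math

M_P = 0.938  # proton mass in GeV

# Fixed-target invariant mass:
# sqrt(s) = sqrt(m_beam^2 + m_target^2 + 2 * m_target * E_lab)
def sqrt_s_fixed_target(e_lab_gev, m_beam=M_P, m_target=M_P):
    return math.sqrt(m_beam**2 + m_target**2 + 2 * m_target * e_lab_gev)

cm_energy = sqrt_s_fixed_target(1e4)   # E_lab ~ 10^4 GeV gives sqrt(s) ~ 137 GeV
```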

Active galactic nuclei (AGN) are known to exhibit multi-wavelength variability across the whole electromagnetic spectrum. In the context of blazars, the variability timescale can be as short as a few minutes. Correlated variability has been seen in different bands of the electromagnetic spectrum: from radio wavelengths to high energy gamma-rays. This correlated variability in different wavelength bands can put constraints on the particle content, acceleration mechanisms and radiative properties of the relativistic jets that produce blazar emission. Two models are typically invoked to explain the origin of the broadband emission across the electromagnetic spectrum: Leptonic and Hadronic Modeling. Both models have had success in reproducing the broadband spectral energy distributions (SEDs) of blazar emission with different input parameters, making the origin of the emission difficult to determine. However, flaring events cause the spectral components that produce the SED to evolve on different timescales, producing different light curve behavior for both models. My Ph.D. research involves developing one-zone time dependent leptonic and lepto-hadronic codes to reproduce the broadband SEDs of blazars and then model flaring scenarios in order to find distinct differences between the two models. My lepto-hadronic code also considers the time dependent evolution of the radiation emitted by secondary particles (pions and muons) generated from photo-hadronic interactions between the photons and protons in the emission region. I present fits to the broadband SEDs of the flat spectrum radio quasars (FSRQs) 3C 273 and 3C 279 using my one-zone leptonic and lepto-hadronic model, respectively. I showed that by considering perturbations of any one of the selected input parameters for both models: magnetic field, particle injection luminosity, particle spectral index, and stochastic acceleration time scale, distinct differences arise in the light curves for the optical, X-ray and

Beam instabilities cover a wide range of effects in particle accelerators and they have been the subject of intense research for several decades. As machine performance was pushed, new mechanisms were revealed, and nowadays the challenge consists in studying the interplay between all these intricate phenomena, as it is very often not possible to treat the different effects separately. The aim of this paper is to review the main mechanisms, discussing in particular the recent developments of beam instability theories and simulations.

The Large Hadron Collider (LHC), due to be commissioned in 2005, will provide particle physics with the first laboratory tool to access the energy frontier above 1 TeV. In order to achieve this, protons must be accelerated and stored at 7 TeV, colliding with an unprecedented luminosity of 10^34 cm^-2 s^-1. The 8.3 Tesla guide field is obtained using conventional NbTi technology cooled to below the lambda point of helium. Considerable modification of the infrastructure around the existing LEP tunnel is needed to house the LHC machine and detectors. The project is advancing according to schedule with most of the major hardware systems, including cryogenics and magnets, under construction. A brief status report is given and future prospects are discussed.
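The quoted 7 TeV beam energy and 8.3 T dipole field are linked by the standard magnetic rigidity relation, Bρ [T·m] ≈ p [GeV/c] / 0.2998. A quick consistency sketch:

```python
# Bending radius from magnetic rigidity: rho = p / (0.299792458 * B),
# with momentum p in GeV/c, field B in tesla, and rho in metres.
p_gev_per_c = 7000.0    # proton momentum, ~7 TeV/c
B_tesla = 8.3           # dipole guide field quoted above
rho_m = p_gev_per_c / (0.299792458 * B_tesla)   # ~2.8 km bending radius
```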

It has been known for about sixty years that proton and heavy ion therapy is a very powerful radiation procedure for treating tumors. It has an innate ability to irradiate tumors with greater doses and spatial selectivity compared with electron and photon therapy and, hence, is a tissue-sparing procedure. For more than twenty years, powerful lasers have generated high energy beams of protons and heavy ions, and it has therefore frequently been speculated that lasers could be used as an alternative to radiofrequency (RF) accelerators to produce the particle beams necessary for cancer therapy. The present paper reviews the progress made towards laser-driven hadron cancer therapy and what has still to be accomplished to realize its inherent enormous potential.

The effects of particle radiation of the type encountered in space flight on bacteriophages are investigated. Survival and mutagenesis were followed in dry film cultures or liquid suspensions of T4Br(+) bacteriophage exposed to high-energy (HZE) particles during orbital flight, to alpha particles and accelerator-generated hadrons in the laboratory, and to high-energy cosmic rays at mountain altitudes. The HZE particles and high-energy hadrons are found to have a greater relative biological efficiency than standard gamma radiation, while exhibiting a highly inhomogeneous spatial structure in the observed biological and genetic effects. In addition, the genetic lesions observed are specific to the type of radiation exposure, consisting primarily of deletions and multiple lesions of low revertability, with mode of action depending on the linear energy transfer. 18 references

After a decade of intensive R&D in the key technologies of high-field superconducting accelerator magnets and superfluid helium cryogenics, the Large Hadron Collider (LHC) has now fully entered its construction phase, with the adjudication of major procurement contracts to industry. As concerns cryogenic engineering, this R&D program has resulted in significant developments in several fields, among which thermo-hydraulics of two-phase saturated superfluid helium, efficient cycles and machinery for large-capacity refrigeration at 1.8 K, insulation techniques for series-produced cryostats and multi-kilometre long distribution lines, large-current leads using high-temperature superconductors, industrial precision thermometry below 4 K, and novel control techniques applied to strongly non-linear processes. We review the most salient advances in these domains.

In the framework of the ARCHADE project (Advanced Resource Center for Hadron-therapy in Europe), a research project in carbon-ion beam therapy and clinical proton therapy, this work investigates the beam monitoring and dosimetry aspects of ion beam therapy. The main goal here is to understand the operating mode of air ionization chambers, the detectors used for such applications. This study starts at a very fundamental level, as the relevant physical and chemical parameters of air were measured under various electric field conditions with dedicated setups and used to produce simulation tools aiming at reproducing the operating response in high intensity PBS (Pencil Beam Scanning) beams from IBA's (Ion Beam Applications) next generation of proton beam accelerators. In addition, ionization chamber-based dosimetry equipment, DOSION III, was developed for radiobiology studies conducted at GANIL under the supervision of the CIMAP laboratory. (author)

A study of the near-side ridge phenomenon in hadron-hadron collisions based on a cluster picture of multiparticle production is presented. The near-side ridge effect is shown to have a natural explanation in this context provided that clusters are produced in a correlated manner in the collision transverse plane.

This work concerns soft hadronic interactions, which in the Standard Model carry most of the observable cross-section but are not amenable to quantitative predictions due to the very nature of QCD (the theory of strong interactions). In the low momentum transfer region the running coupling constant causes perturbation theory to break down. In this situation a better experimental understanding of the physics phenomena is needed. One aspect of soft hadronic interactions is discussed in this work: the transfer of the baryon number from the initial to the final state of the interaction. The past experimental knowledge of this process is presented, reasons for its unsatisfactory status are discussed, and the conditions necessary for improvement are outlined: namely, an experimental apparatus with superior performance over the full range of available interactions: hadron-hadron, hadron-nucleus and nucleus-nucleus collisions. A consistent model-independent picture of the baryon number transfer process emerging from the data on the full range of interactions is shown. It offers a serious challenge to theory to provide a quantitative and detailed explanation of the measurements. (author)

The specification of reliable cross sections for heavy quarks, including their production spectra in longitudinal and transverse momenta, and comparisons with data test the quantum chromodynamic (QCD) mechanisms by which all heavy objects are expected to be produced. Strategies in the search for new flavors such as top are predicated on best estimates of cross sections and of momentum distributions in phase space not only of the new flavor but, perhaps more importantly, of lighter flavors that contribute deceptive backgrounds. Those considering hadronic experiments to establish flavor-antiflavor mixing and possible CP violation require a detailed understanding of expected production spectra. A significant result reported during the past year was the completion of a calculation, through order α_s^3 in QCD perturbation theory, of the inclusive single heavy quark production cross-section differential in the transverse momentum k_T and rapidity y of the heavy quark. Here α_s is the running coupling strength in QCD. This result follows the earlier publication of the cross section through order α_s^3 integrated over all k_T and y. Explicit comparisons with data have also been made. In this paper the author summarizes comparisons of the O(α_s^3) cross sections with data on hadroproduction of charm and bottom, and the author includes several predictions and suggestions for new measurements. The O(α_s^3) contributions are larger in many cases of interest than the O(α_s^2) terms. Not yet available are O(α_s^3) predictions for momentum correlations between the heavy Q and heavy Q̄ for values of transverse momentum k_T greater than the quark mass m_Q.

Including calculation models and measurements for a variety of electronic components and their relevant radiation environments, this thesis describes the complex radiation field present in the surroundings of a high-energy hadron accelerator and assesses the risks related to it in terms of Single Event Effects (SEE). It is shown that this poses not only a serious threat to the operation of modern accelerators but also highlights the impact on other high-energy radiation environments such as those for ground and avionics applications. Different LHC-like radiation environments are described in terms of their hadron composition and energy spectra. They are compared with other environments relevant for electronic component operation, such as ground level, avionics or the proton belt. The main characteristic of the high-energy accelerator radiation field is its mixed nature, both in terms of hadron types and energy interval. The threat to electronics ranges from neutrons of thermal energies to GeV hadron...

A three-dimensional model for flow inside a fume hood enclosure was developed, and numerical computations were carried out to explore the flow pattern and possible paths of contaminant transport under different operating conditions of the hood. Equations for the conservation of mass and momentum were solved for different flow rates and opening conditions in the hood. The face velocity was maintained constant at its rated value of 0.4 m/s. The flow was assumed to enter through the front window opening (positive x-direction) and leave the cupboard through an opening on the top of the hood (positive z-direction). The flow was assumed to be fully turbulent, and the k-ε model was used for the prediction of turbulence. The flow patterns for different sash openings were studied. The flow patterns around an object located at the bottom of the hood were studied for different locations of the object. In addition, the effect of a person standing in front of the hood on the flow pattern was investigated. It was found that air entering the hood proceeds directly to the back wall, impinges on it, turns upward toward the top wall and exits through the outlet. The flow finds its way around any object, forming a recirculating region at its trailing surface. With an increase in the sash opening, the velocity becomes higher and the fluid traces the path to the outlet more quickly. The volume occupied by recirculating flow decreases with increasing sash opening. The computed flow patterns will be very useful for designing experiments with an optimum sash opening that provides adequate disposal of contaminants with minimum use of conditioned air from inside the room
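The constant face velocity described above ties the required exhaust flow directly to the sash opening area (Q = V_face × A). A minimal sketch of this relation; the sash dimensions and opening fractions used here are illustrative assumptions, not values from the study:

```python
def exhaust_flow_rate(face_velocity_m_s, sash_width_m, sash_height_m):
    """Volumetric exhaust flow (m^3/s) required to maintain a given face
    velocity across a rectangular sash opening: Q = V_face * W * H."""
    return face_velocity_m_s * sash_width_m * sash_height_m

FACE_VELOCITY = 0.4      # m/s, the rated value from the study
SASH_WIDTH = 1.2         # m, assumed for illustration
FULL_SASH_HEIGHT = 0.8   # m, assumed for illustration

for fraction in (0.25, 0.50, 1.00):  # fractional sash openings (assumed)
    q = exhaust_flow_rate(FACE_VELOCITY, SASH_WIDTH, FULL_SASH_HEIGHT * fraction)
    print(f"sash open {fraction:.0%}: Q = {q:.3f} m^3/s")
```

This makes explicit why a larger sash opening at fixed face velocity draws more conditioned air from the room, the trade-off the computed flow patterns are meant to optimize.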

The nineteenth annual SLAC Summer Institute on Particle Physics took place from August 5 to 16, 1991, attracting 236 participants from 10 different countries. The theme was lepton-hadron scattering, with subjects ranging from the pioneering SLAC-MIT experiments through the new era of e-p collisions to be ushered in by HERA. Richard Taylor led off the Institute with a historical review of lepton-proton scattering experiments, from Rutherford to the 1960s, while Sid Drell laid out the theoretical framework in terms of parton distributions and sum rules. Frank Sciulli picked up where Richard Taylor left off, at the discovery of scaling violation, and brought us up to the present. Joel Feltesse and Roberto Peccei described the physics opportunities at HERA, most notably the investigation of the low-x behavior of structure functions. Traudl Hansl-Kozanecka reviewed the current experimental status of QCD, at e+e− and hadron colliders as well as in deep-inelastic lepton-hadron scattering. Bob Hollebeek lectured on techniques for electromagnetic and hadronic calorimetry. Finally, Bob Siemann gave a series of lectures on the many uses of superconductivity in particle accelerators, from bending magnets at FNAL, HERA and the SSC to RF cavities at CEBAF and LEP. Following the school, the topical conference provided us with a spectrum of current experimental and theoretical developments. Lepton-hadron scattering experiments at CERN and Fermilab were well represented. The existence of the 17 keV neutrino was debated in two separate talks. We heard the latest results from the CDF and UA2 hadron collider experiments; from the four LEP experiments; and from ARGUS and CLEO. Also presented were overviews of the rare K decay program at BNL, the CP violation experiments at CERN and Fermilab, B physics, neutrino masses and mixings, and precision electroweak theory

The workshop on "Hadron-Hadron and Cosmic-Ray Interactions at multi-TeV Energies" held at the ECT* centre (Trento) in Nov.-Dec. 2010 gathered together both theorists and experimentalists to discuss issues of the physics of high-energy hadronic interactions of common interest for the particle, nuclear and cosmic-ray communities. QCD results from collider experiments -- mostly from the LHC but also from the Tevatron, RHIC and HERA -- were discussed and compared to various hadronic Monte Carlo generators, aiming at an improvement of our theoretical understanding of soft, semi-hard and hard parton dynamics. The latest cosmic-ray results from various ground-based observatories were also presented with an emphasis on the phenomenological modeling of the first hadronic interactions of the extended air-showers generated in the Earth atmosphere. These mini-proceedings consist of an introduction and short summaries of the talks presented at the meeting.

A concrete shielding for an electron accelerator of 1 MeV is suggested to replace its structural steel shielding. The thickness of such a shield is calculated. The calculational model used is based on standard attenuation and transmission curves given in the literature. The calculated concrete shielding is generally adequate to attenuate the accelerator-produced radiation to a level of 1 μGy/h or less at any point outside of the vault enclosure. 5 figs
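The thickness sizing described above can be expressed with the usual tenth-value-layer (TVL) bookkeeping. In the sketch below, the unshielded dose rate and the TVL for ordinary concrete at ~1 MeV are illustrative assumptions; only the 1 μGy/h design level comes from the record:

```python
import math

def required_thickness_cm(dose_unshielded, dose_limit, tvl_cm):
    """Shield thickness so that dose_unshielded * 10**(-t/TVL) falls to
    dose_limit, i.e. t = TVL * log10(unshielded / limit); never negative."""
    return max(0.0, tvl_cm * math.log10(dose_unshielded / dose_limit))

# Assumed for illustration: ~1e4 uGy/h unshielded at the wall and a TVL of
# ~15 cm of ordinary concrete. The 1 uGy/h design limit is from the record.
t = required_thickness_cm(dose_unshielded=1.0e4, dose_limit=1.0, tvl_cm=15.0)
print(f"required concrete thickness ~ {t:.0f} cm")
```

Real designs use published transmission curves rather than a single exponential, but the TVL form shows how the required thickness scales with the attenuation factor.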

Highlights of the 1993 Particle Accelerator Conference, held in Washington in May, were picked out in the previous issue (page 18). Talks on the big hadron colliders reflected the sea-change in the accelerator world, where the scale, complexity and cost of the front-line projects have slowed the pace of developments (not unlike the scene in particle physics itself). Speaking before the anti-SSC vote in the House of Representatives in June, Dick Briggs reviewed the situation at the Superconducting Super Collider (SSC) in Ellis County, Texas. The linac building is near completion and the Low Energy Booster will be ready to receive components early next year. Tunnelling for the Main Ring is advancing rapidly with four boring machines in action. Five miles of tunnel have been completed since January and the pace has now stepped up to nearly a mile each week. The superconducting magnet news is good. Following the successful initial string test of a half cell of the magnet lattice, a two-ring full cell with all associated services is being assembled. The mechanical robustness of the magnet design was confirmed when a dipole was taken to 9.7 T when cooled to 1.8 K. In the Magnet Test Lab itself, ten test stands are installed and equipped

Approved for public release; distribution is unlimited. In 1979, W. B. Colson and S. K. Ride proposed a new kind of electron accelerator using a uniform magnetic field in combination with a circularly-polarized laser field. A key concept is to couple the oscillating electric field to the electron’s motion so that acceleration is sustained. This dissertation investigates the performance of the proposed laser accelerator using modern high-powered lasers and magnetic fields that are significan...

A future electron-positron collider like the planned International Linear Collider (ILC) needs excellent detectors to exploit the full physics potential. Different detector concepts have been evaluated for the ILC, and two concepts based on the particle-flow approach were validated. To make particle flow work, a new type of imaging calorimeter is necessary, in combination with a high-performance tracking system, to track single particles through the full detector system. These calorimeters require an unprecedented level of both longitudinal and lateral granularity. Several calorimeter technologies promise to reach the required readout segmentation and are currently being studied. This thesis addresses one of these: the analogue hadron calorimeter technology. It combines work on the technological aspects of a highly granular calorimeter with the study of hadron shower physics. The analogue hadron calorimeter technology joins a classical scintillator-steel sandwich design with a modern photo-sensor technology, the silicon photomultiplier (SiPM). The SiPM is a millimetre-sized, magnetic-field-insensitive, and low-cost photo-sensor that opens new possibilities in calorimeter design. This thesis outlines the working principle and characteristics of these devices. The requirements for an application-specific integrated circuit (ASIC) to read out the SiPM are discussed; the performance of a prototype chip for SiPM readout, the SPIROC, is quantified. The SiPM-specific reconstruction of a multi-thousand-channel prototype calorimeter, the CALICE AHCAL, is also explained, and the systematic uncertainty of the calibration method is derived. The AHCAL not only offers a test of the calorimeter technology, it also allows hadron showers to be recorded in unprecedented detail. Test-beam measurements have been performed with the AHCAL and provide a unique sample for the development of novel analysis techniques and the validation of hadron shower simulations. A method to

The effects of natural turbulent convection interacting with surface radiation in a rectangular enclosure have previously been studied numerically and theoretically. The analyses were carried out over a wide range of enclosure aspect ratios ranging from 0.0625 to 16, different enclosure sizes, cold-wall temperatures ranging from 283 to 373 K, and temperature ratios ranging from 1.02 to 2.61. The work was carried out using four fluids (argon, air, helium and hydrogen, whose properties vary with temperature). The resulting correlations can be used to calculate the Nusselt number for pure natural convection and also the ratio of convective to radiative heat transfer for both square and rectangular enclosures. This work extends those results by providing an empirical solution for the case of combined radiation and natural convection in square and rectangular enclosures, and also provides a correlation equation to calculate the total Nusselt number for these cases. This method allows the simple calculation of either the total heat transfer rate, given the fluid, the geometry and the temperatures of the hot and cold walls, or, via a straightforward iterative technique, the temperature of one wall given the other wall temperature and the total heat transfer. -- Highlights: ► Previous work has non-dimensionalised flow in enclosures with and without radiation. ► This extends the work by enabling a simple iterative technique to work out temperatures for a given total heat transfer rate. ► The provided solution has a maximum deviation of 7.7%. ► The method works for a variety of enclosure sizes, aspect ratios, temperatures and gases
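The "straightforward iterative technique" mentioned above can be sketched as a bisection on the unknown wall temperature: guess the hot-wall temperature, evaluate the total heat transfer from the correlation, and narrow the bracket until the target is matched. The correlation used below (Nu ∝ Ra^(1/3) with a crude Ra proportional to ΔT) is a placeholder to make the sketch runnable, not the paper's fitted equation:

```python
def total_heat_rate(t_hot, t_cold, height, k_fluid, nu_corr):
    """Heat flux from Q = Nu * k * (Th - Tc) / H, for a given correlation."""
    return nu_corr(t_hot, t_cold) * k_fluid * (t_hot - t_cold) / height

def solve_hot_wall(q_target, t_cold, height, k_fluid, nu_corr, t_max=2000.0):
    """Bisection for the hot-wall temperature that yields q_target.
    Relies on Q increasing monotonically with t_hot."""
    lo, hi = t_cold, t_max
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if total_heat_rate(mid, t_cold, height, k_fluid, nu_corr) < q_target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Placeholder correlation (assumption, NOT the paper's): Nu = 0.1 * Ra^(1/3),
# with Ra taken crudely proportional to the temperature difference.
nu_corr = lambda th, tc: 0.1 * (1.0e5 * (th - tc)) ** (1.0 / 3.0)
t_hot = solve_hot_wall(q_target=500.0, t_cold=300.0, height=0.1,
                       k_fluid=0.026, nu_corr=nu_corr)
print(f"hot wall ~ {t_hot:.1f} K")
```

With the paper's actual total-Nusselt correlation substituted for `nu_corr`, the same bisection recovers one wall temperature from the other plus the total heat transfer.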

Highlights: • A novel multi-heat-source plug-chip spray cooling enclosure was designed. • Enhanced surfaces with different geometries were analyzed in the integrated enclosure. • Overall thermal control with adjustable parameters in the enclosure was studied. • Temperature disequilibrium of the multi-heat source in the enclosure was tested. • A comprehensive assessment system to evaluate practicality was proposed. - Abstract: A practical, integrated spray cooling system is urgently needed for the cooling of high-performance electronic chips due to the growing thermal-management requirements of workstations. The integration of multiple heat sources and the management of the integral system are particularly lacking. In order to fill the vacancies in the study of plug-chip spray cooling, an integrated cooling enclosure was designed in this paper. Multiple heat sources were placed in a sealed space and the heat was removed by spray. Printed circuit board plug-ins and radio-frequency resistors were used as analog motherboards and chips, respectively. Enhanced surfaces with four different geometries and a plain surface were studied under conditions of different inclination angles. The results were compared and the maximum critical heat flux (CHF) was obtained. Moreover, with the aim of overall management of the multi-heat source in the integrated enclosure, the effects of flow rate, temperature disequilibrium, and pulse heating in the process of transient cooling were also analyzed. In addition, a comprehensive assessment system, used to evaluate the practicality of spray cooling experimental devices, was proposed and the performance of the enclosure was evaluated.

Space per animal, or animal density, and enclosure type are important elements of functionally appropriate captive environments (FACEs) for chimpanzees. The National Institutes of Health (NIH) recommends that captive chimpanzees be maintained in areas of >250 ft²/animal. Several studies have investigated chimpanzee behavior in relation to space per animal, but only two studies have examined these variables while attempting to hold environmental complexity constant. Both have found few, if any, significant differences in behavior associated with increased space per animal. The NIH does not provide recommendations pertaining to enclosure type. Although Primadomes™ and corrals are considered acceptable FACE housing, no studies have investigated chimpanzee behavior in relation to these two common types of enclosures. We examined the NIH space per animal recommendation, and the effects of enclosure type, while maintaining similar levels of environmental complexity. We used focal animal observations to record the behavior of 22 chimpanzees in three social groups following within-facility housing transfers. Chimpanzees that were moved from an area with space below the NIH recommendation to the same type of enclosure with space above the recommendation (dome to double dome) exhibited significantly more locomotion and behavioral diversity post-transfer. Chimpanzees that were moved from an area with space below the recommendation to a different type of enclosure with space above the recommendation (dome to corral) exhibited significant increases in foraging and behavioral diversity, and a decrease in rough scratching. Lastly, chimpanzees that were moved from an area above the recommendation to a different enclosure type with space equal to the recommendation (corral to double dome) exhibited an increase in behavioral diversity. These results add to the body of literature that addresses the concept of specific minimum space requirements per chimpanzee, and highlight the

The principle of electrostatic accelerators is presented. We consider Cockcroft-Walton, Van de Graaff and tandem Van de Graaff accelerators. We review high-voltage generators such as cascade generators, Van de Graaff belt generators, Pelletron generators, Laddertron generators and Dynamitron generators. The specific features of accelerating tubes, ion optics and methods of voltage stabilization are described. We discuss the characteristic beam properties and the variety of possible beams. We sketch possible applications and the progress in the development of electrostatic accelerators.

Because the use of accelerated heavy ions would provide many opportunities for new and important studies in nuclear physics and nuclear chemistry, as well as other disciplines, both the Chemistry and Physics Divisions are supporting the development of a heavy-ion accelerator. The design of greatest current interest includes a tandem accelerator with a terminal voltage of approximately 25 MV injecting into a linear accelerator with rf superconducting resonators. This combined accelerator facility would be capable of accelerating ions of masses ranging over the entire periodic table to an energy corresponding to approximately 10 MeV/nucleon. This approach, as compared to other concepts, has the advantages of lower construction costs, lower operating power, 100 percent duty factor, and high beam quality (good energy resolution, good timing resolution, small beam size, and small beam divergence). The included sections describe the concept of the proposed heavy-ion accelerator, and the development program aiming at: (1) investigation of the individual questions concerning the superconducting accelerating resonators; (2) construction and testing of prototype accelerator systems; and (3) search for economical solutions to engineering problems. (U.S.)

The COMPASS fixed-target experiment at the CERN SPS is dedicated to the study of hadron structure and dynamics. One goal of the physics programme using hadron beams is the search for new states, in particular the search for J^PC-exotic states and glueballs. After a short pilot run in 2004 (190 GeV/c π− beam, lead target), we started our hadron spectroscopy programme in 2008 by collecting unprecedented statistics using 190 GeV/c negative hadron beams on a liquid hydrogen target. A similar amount of data with 190 GeV/c positive hadron beams was taken in 2009, as well as some data (negative beam) on nuclear targets. As a first result, the observation of a significant J^PC spin-exotic signal in the 2004 data -- consistent with the disputed π1(1600) -- was recently published. Our spectrometer features good coverage by electromagnetic calorimetry, crucial for the detection of final states involving π0, η or η′, and the 2008/09 data provide an excellent opportunity for the simu...

Jet measurements have long been tools used to understand QCD phenomena. There is still much to be learned from the production of hadrons inside jets. In particular, hadron yields within jets from proton-proton collisions have been proposed as a way to unearth more information on gluon fragmentation functions. In 2011, the STAR experiment at RHIC collected 23 pb⁻¹ of data from proton-proton collisions at √s = 500 GeV. The jets of most interest for gluon fragmentation functions are those with transverse momentum around 6-15 GeV/c. Large-acceptance charged-particle tracking and electromagnetic calorimetry make STAR an excellent jet detector. Time-of-flight and specific energy loss in the tracking system allow particle identification of the various types of hadrons within the jets, e.g., distinguishing pions from kaons and protons. An integral part of analyzing the data collected is understanding how the finite resolutions of the various detector subsystems influence the measured jet and hadron kinematics. For this reason, Monte Carlo simulations can be used to track the shifting of the hadron and jet kinematics between the generator level and the detector reconstruction level. The status of this analysis will be presented. We would like to acknowledge the Ronald E. McNair program for supporting this research.

Some unusual features observed in hadronic collisions at high energies can be understood by assuming that gluons in hadrons are located within small spots occupying only about 10% of a hadron's area. Such a conjecture about the presence of two scales in hadrons helps to explain the following: why diffractive gluon radiation is so suppressed; why the triple-Pomeron coupling shows no t dependence; why total hadronic cross sections rise so slowly with energy; why diffraction cones shrink so slowly, and why α′_P ≪ α′_R; why the transition from hard to soft regimes in the structure functions occurs at rather large Q²; why the observed Cronin effect at collider energies is so weak; why hard reactions sensitive to primordial parton motion (direct photons, Drell-Yan dileptons, heavy flavors, back-to-back dihadrons, the seagull effect, etc.) demand such large transverse momenta of the projectile partons, which is not explained by next-to-leading-order calculations; why the onset of nuclear shadowing for gluons is so delayed compared to quarks; and why shadowing is so weak

Since Quantum Chromodynamics allows for gluon self-coupling, quarks and gluons cannot be observed as free particles, but only their bound states, the hadrons. This so-called confinement phenomenon is responsible for 98% of the mass in the visible universe. The measurement of hadron excitation spectra therefore gives valuable input for theory and phenomenology to quantitatively understand this phenomenon. One simple model to describe hadrons is the Constituent Quark Model (CQM), which distinguishes two types of hadrons: mesons, consisting of a quark and an antiquark, and baryons, which are made of three quarks. More advanced models inspired by QCD, as well as calculations within Lattice QCD, predict the existence of other types of hadrons, which may be described solely by gluonic excitations (glueballs) or by mixed quark and gluon excitations (hybrids). In order to search for such states, the COMPASS experiment at the Super Proton Synchrotron at CERN has collected large data sets, which allow to ...

About 90 registered participants delivered more than 40 scientific papers. A great part of these presentations were of general interest, covering running projects such as the CIME accelerator at GANIL, IPHI (high-intensity proton injector), the ESRF (European Synchrotron Radiation Facility), the LHC (Large Hadron Collider), the ELYSE accelerator at Orsay, AIRIX, and the VIVITRON tandem accelerator. Other presentations highlighted the latest technological developments of accelerator components: superconducting cavities, power klystrons, high-current injectors...

The operating electron accelerators and their importance in developments in nuclear and particle physics are outlined. The principles of probing the nucleus by electron scattering techniques and the main experimental results are summarized. In order to understand hadron interactions and the dynamics of quark confinement in nuclei, high-energy electrons must provide quantitative data on the following topics: the structure of the nucleon, the role of non-nucleonic components in nuclei, the nature of short-range nucleon correlations, the origin of the short-range part of nuclear forces, and the effects of the nuclear medium on quark distributions. To progress in the knowledge of nuclear structure it is necessary to build a coherent strategy of accelerator developments in Europe

To fully exploit the physics potential of a future lepton collider requires detectors with unprecedented jet-energy and dijet-mass resolution. To meet these challenges, detectors optimized for the application of Particle Flow Algorithms (PFAs) are being designed and developed. The application of PFAs, in turn, requires calorimeters with very fine segmentation of the readout, so-called imaging calorimeters. This talk reviews progress in imaging hadron calorimetry as it is being developed for implementation in a detector at a future lepton collider. Recent results from the large prototypes built by the CALICE Collaboration, such as the Scintillator Analog Hadron Calorimeter (AHCAL) and the Digital Hadron Calorimeters (DHCAL and SDHCAL), are presented. In addition, various R&D efforts beyond the present prototypes are discussed.

An introductory treatment of hadronization through functional integral calculus and bifocal Bose fields is given. Emphasis is placed on the utility of this approach for providing a connection between QCD and effective hadronic field theories. The hadronic interactions obtained by this method are nonlocal due to the QCD substructure, yet, in the presence of an electromagnetic field, maintain the electromagnetic gauge invariance manifest at the quark level. A local chiral model which is structurally consistent with chiral perturbation theory is obtained through a derivative expansion of the nonlocalities with determined, finite coefficients. Tree-level calculations of the pion form factor and π-π scattering, which illustrate the dual constituent-quark/chiral-model nature of this approach, are presented

The MIT bag model is modified in order to describe the rotational motion of hadrons. The modified bag has a kind of 'diatomic molecule' structure: the rotational excitation of the MIT bag is described by two polarized colored sub-bags connected to each other by a gluon flux. For mesons, one sub-bag contains a quark and the other an antiquark. For baryons, the latter sub-bag contains the remaining two quarks instead of the antiquark. The Regge trajectories of hadrons are explained qualitatively by our new model with the usual MIT bag parameters. In particular, the Regge slopes are reproduced fairly well. It is also pointed out that the gluon flux plays an important role in the rotational motion of hadrons. (author)

The prototype module of LIBO, a linear accelerator project designed for cancer therapy, has passed its first proton-beam acceleration test. In parallel a new version - LIBO-30 - is being developed, which promises to open up even more interesting avenues.

Results of studies of anomalous electron-muon and electron-hadron events produced in electron-positron annihilation are presented. The data for this work were obtained with a lead-glass counter system, which was added to one octant of the Stanford Linear Accelerator Center-Lawrence Berkeley Laboratory magnetic detector at the electron-positron storage ring SPEAR. The lead-glass counter system provides good electron identification for part of the magnetic detector. The events under study have two detected charged particles and any number of detected photons. One detected charged particle is identified as an electron in the lead-glass counter system. The other detected charged particle is identified as a muon or hadron in the magnetic detector. Anomalous events are events which are not subject to conventional explanations; examples of conventional explanations are misidentification of particles or the decay of ordinary or strange hadrons. These data confirm previous observations of anomalous lepton production at SPEAR and DESY. The data corrected for charm background are consistent with heavy lepton production and decay. The branching ratio for the heavy lepton to decay into an electron and two neutrinos was measured to be 0.21 ± 0.05. The branching ratio for the heavy lepton to decay into one charged hadron, one neutrino and any number of photons was measured to be 0.28 ± 0.13. They are consistent with the theoretical values within the errors

Hadron physics has drawn great interest from the Chinese nuclear and high-energy physics communities and has been one of the main research areas at major accelerator facilities in China. At the same time, Chinese collaborations are playing increasingly important roles at international hadron physics facilities (Jefferson Lab, RHIC, COMPASS@CERN, J-PARC, …), in particular at the recently upgraded 12 GeV Jefferson Lab in the US, which will provide a broad range of opportunities for frontier research in hadronic physics. Furthermore, the U.S. 2015 long-range plan for nuclear science recommended an Electron-Ion Collider (EIC) as the highest priority for new facility construction after the completion of FRIB, as the next frontier for QCD physics. In China, an EIC@HIAF facility has been proposed by the Institute of Modern Physics of the Chinese Academy of Sciences to provide a powerful precision microscope for hadron physics studies. In light of these new developments, the 8th workshop will be held at th...

Following the first experimental discoveries at the Large Hadron Collider (LHC) and the recent update of the European strategy in particle physics, CERN has undertaken an international study of possible future circular colliders beyond the LHC. The study, conducted with the collaborative participation of interested institutes world-wide, considers several options for very high energy hadron-hadron, electron-positron and hadron-electron colliders to be installed in a quasi-circular underground tunnel in the Geneva basin, with a circumference of 80 km to 100 km. All these machines would make intensive use of advanced superconducting devices, i.e. high-field bending and focusing magnets and/or accelerating RF cavities, thus requiring large helium cryogenic systems operating at 4.5 K or below. Based on preliminary sets of parameters and layouts for the particle colliders under study, we discuss the main challenges of their cryogenic systems and present first estimates of the cryogenic refrigeration capacities req...

Multi-hadron operators are crucial for reliably extracting the masses of excited states lying above multi-hadron thresholds in lattice QCD Monte Carlo calculations. The construction of multi-hadron operators with significant coupling to the lowest-lying states of interest involves combining single-hadron operators of various momenta. The design and implementation of large sets of spatially-extended single-hadron operators of definite momentum and their combinations into two-hadron operators are described. The single-hadron operators are all assemblages of gauge-covariantly-displaced, smeared quark fields. Group-theoretical projections onto the irreducible representations of the symmetry group of a cubic spatial lattice are used in all isospin channels. Tests of these operators on 24³ × 128 and 32³ × 256 anisotropic lattices using a stochastic method of treating the low-lying modes of quark propagation which exploits Laplacian Heaviside quark-field smearing are presented. The method provides reliable estimat...

accelerator programs. Microsoft runs accelerators in seven different countries. Accelerators have grown out of the infancy stage and are now an accepted approach to developing new ventures based on cutting-edge technology like the internet of things, mobile technology, big data and virtual reality. It is also ... and developing the best business ideas and supporting the due diligence process. Even universities are noticing that the learning experience of the action-learning approach is an effective way to develop capabilities and change cultures. Accelerators related to what has historically been associated ... have the same purpose as businesses: to create customers...

The Finite-Difference Time-Domain (FDTD) method is applied to study the characteristics of electromagnetic interference for a power plane-battery management system (PP-BMS) enclosure, modeling the coupling of an incident electromagnetic pulse (EMP) with a conducting wire through the BMS enclosure and an aperture in it. Simulations and analyses are carried out for different wire radii, EMP incidence angles under different polarization directions, and different annular apertures. The simulation results show that the interference coupled into the PP-BMS enclosure is affected to different degrees by the above factors. At low frequency, the larger the radius of the wire penetrating into the PP-BMS enclosure, the more electromagnetic-field interference is coupled into the BMS enclosure. Also, the electromagnetic energy coupled by the penetrating wire under oblique incidence is larger than under normal incidence for a vertically polarized electric field, and smaller for a horizontally polarized electric field. Furthermore, for the same aperture area, the electromagnetic energy coupled through a circular annular aperture is smaller than that through rectangular and square ones.
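FDTD advances the electric and magnetic fields alternately on a staggered (Yee) grid. A minimal one-dimensional vacuum sketch in normalized units (impedance and Courant number set to 1) illustrates the update scheme; this is only a toy example, not the PP-BMS enclosure model from the study:

```python
import math

def fdtd_1d(n_cells=200, n_steps=250, src=100):
    """Bare-bones 1-D FDTD in normalized units: H and E are updated
    alternately on a staggered grid, with a soft Gaussian source at
    cell `src` and perfectly conducting (E = 0) ends."""
    ez = [0.0] * n_cells
    hy = [0.0] * n_cells
    for t in range(n_steps):
        for i in range(n_cells - 1):   # H update (half a step ahead of E)
            hy[i] += ez[i + 1] - ez[i]
        for i in range(1, n_cells):    # E update from the curl of H
            ez[i] += hy[i] - hy[i - 1]
        ez[src] += math.exp(-((t - 30.0) / 10.0) ** 2)  # soft source
    return ez

ez = fdtd_1d()
```

The full 3-D enclosure problem replaces these two loops with the six coupled Yee updates, plus thin-wire and aperture sub-cell models, but the leapfrog structure is the same.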

In response to declining fish stocks and increased societal concern, the marine ‘commons’ of New Jersey is no longer freely available to commercial and recreational fisheries. We discuss the concept of ‘creeping’ enclosure in relation to New Jersey’s marine fisheries and suggest that reduced access can be a cumulative process, a function of multiple events and processes, and need not be the result of a single regulatory moment. We begin with a short review of the ‘expected’ effects of enclosure, including loss of flexibility, erosion of community, proletarianization of fishermen, and corporatization of the fishery. We then present some findings of our research and discuss how the signs of enclosure are visible in fisheries that do not feature explicitly privatized property or access rights. We rely on an oral history approach and the rich detail that emerges from attention to the lived experiences of fish harvesters to provide a framework for understanding the range of cumulative effects that have resulted from this process of creeping enclosure. We conclude with a discussion of how the gradual process of enclosure has affected the flows of information between the bio-physical environment and fish harvesters, managers and scientists by reducing both participation in fisheries and the accumulation of knowledge itself.

The declining rabbit population in the Iberian Peninsula has led hunters and authorities to rear rabbits in captivity for subsequent release. One alternative to intensive rabbitry systems is the use of extensive breeding enclosures, since they produce animals of greater quality for hunting and conservation purposes. However, some of the factors that affect rabbit production in breeding enclosures are still unknown. The present study used partial least squares regression (PLSR) to analyse the effects of plot size, scrub cover, slope, initial rabbit abundance, the resources needed to dig warrens, predation and proximity to other enclosures on rabbit abundance. The results of our study show a positive effect of the number of other fenced plots within a radius of 3 km, a positive relationship with the availability of optimal resources for building warrens and a positive influence of intermediate values of scrub cover. According to our results, to maximise rabbit production in the enclosures it would be advisable to concentrate the restocking effort by ensuring that the restocking plots are close to each other, thus avoiding isolated enclosures. Furthermore, the selection of plots with an appropriate scrub cover and a high availability of elements that favour the construction of warrens, such as large stones, sloping land or tall shrubs, may optimise results.

This study numerically investigated the internal flow as a function of the fluid Prandtl number and the enclosure height. Two-dimensional simulations were performed with FLUENT 6.3 for enclosure heights between 0.01 m and 0.074 m, corresponding to aspect ratios (H/L) from 0.07 to 0.5. The Prandtl numbers were 0.2, 0.7 and 7, and the Rayleigh number based on the enclosure height ranged from 8.49×10³ to 1.20×10⁸. For the condition of top cooling only, the numerical results agree well with Rayleigh-Benard natural convection. When both the top and side walls are cooled, the internal flow of the enclosure is more complex. The thicknesses of the thermal and velocity boundary layers vary with Prandtl number: for Pr>1 the behavior of the cells is unstable and irregular owing to the entrained plume, whereas for Pr<1 the internal flow is stable and regular. The number of cells also increases as the height decreases, and as a result the heat exchange increases.

In 1998, the United States entered into an agreement with CERN to help build the Large Hadron Collider (LHC), with contributions to the accelerator and to the large HEP detectors. To accomplish this, the US LHC Accelerator Project was formed, encompassing expertise from Brookhaven National Laboratory, Fermi National Accelerator Laboratory and the Lawrence Berkeley National Laboratory. Contributions from the US LHC Accelerator project included superconducting high gradient quadrupoles and beam separation dipoles for the four interaction regions and the RF section; feedboxes for cryogenic, power and instrumentation distribution; neutral and hadron beam absorbers in the high luminosity regions; design of the inner triplet cryogenic system; beam tracking studies utilizing the design IR magnet field quality and magnet alignment; particle heat deposition studies in the IRs; and short sample characterization of superconducting cables used in the arc dipoles and quadrupoles. This report is a summary of the...

A broad class of accelerators rests on the induction principle, whereby the accelerating electrical fields are generated by time-varying magnetic fluxes. Because they are particularly suitable for transporting bright, high-intensity beams of electrons, protons or heavy ions in any geometry (linear or circular), the research and development of induction accelerators is a thriving subfield of accelerator physics. This text is the first comprehensive account of both the fundamentals and the state of the art of the modern conceptual design and implementation of such devices. Accordingly, the first part of the book is devoted to the essential features of, and key technologies used for, induction accelerators, at a level suitable for postgraduate students and newcomers to the field. Subsequent chapters deal with more specialized and advanced topics.
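
The induction principle mentioned above reduces to Faraday's law: the accelerating voltage per core equals the rate of change of the magnetic flux, V = dΦ/dt = A·dB/dt. The core geometry and ramp values below are illustrative assumptions, not parameters of any specific machine.

```python
# Accelerating voltage of an induction cell from Faraday's law, V = A * dB/dt.
core_area = 0.05      # magnetic core cross-section, m^2 (assumed)
dB = 1.5              # flux-density swing during the pulse, T (assumed)
dt = 1e-6             # pulse duration, s (assumed)

volts_per_core = core_area * dB / dt       # induced voltage of one cell
n_cores = 100                              # cells stacked along the beamline (assumed)
total_gain = n_cores * volts_per_core      # energy gain for a singly charged particle, eV

print(f"single-core voltage: {volts_per_core / 1e3:.0f} kV")
print(f"gain over {n_cores} cores: {total_gain / 1e6:.1f} MV")
```

Stacking many such cells is exactly what makes the scheme attractive for the high-current beams the abstract describes: the flux swing, not an RF cavity, sets the voltage.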

CERN has been involved in the dissemination of scientific results since its early days and has continuously updated its distribution channels. Currently, Inspire hosts catalogues of articles, authors, institutions, conferences, jobs, experiments, journals and more. Successfully navigating this amount of data requires comprehensive linking between the content. Inspire has lacked a system for linking experiments and articles based on the accelerator at which they were conducted; the purpose of this project has been to create such a system. Records for 156 accelerators were created, and all 2913 experiments on Inspire were given corresponding MARC tags. Records of 18404 accelerator-physics-related bibliographic entries were also tagged with corresponding accelerator tags. Finally, as part of the endeavour to broaden CERN's presence on Wikipedia, existing Wikipedia articles on accelerators were updated with short descriptions and links to Inspire; in total, 86 Wikipedia articles were updated. This repo...

Ten years after the discovery of the X(3872) we can assert that a number of exotic four-quark hadrons with hidden charm and beauty have been discovered, the most recent, the Z(3900) found by BES in 2013, being among the most striking. However, ten years have not been enough to dispel the controversy about their inner structure, with two-body hadron molecules and compact multiquark states standing as the antipodal models. In this seminar I will review the status of the field, presenting both the experimental facts and the theoretical pictures that attempt to interpret them.

Central exclusive production in hadron-hadron collisions at high energies, for example p + p → p + X + p, where the + represents a large rapidity gap, is a valuable process for spectroscopy of mesonic states X. At collider energies the gaps can be large enough to be dominated by pomeron exchange, and then the quantum numbers of the state X are restricted. Isoscalar JPC = 0++ and 2++ mesons are selected, and our understanding of these spectra is incomplete. In particular, soft pomeron exchanges favor gluon-dominated states such as glueballs, which are expected in QCD but not yet well established. I will review some published data.

In this paper, I will emphasize two points on Theoretical Aspects of Lepton-Hadron Scattering: (1) the crucial importance of testing the "exact" sum rules as tests of the local current algebra. Discrepancies, if found, between experiment and theory cannot be "interpreted away" in terms of more complex parton wave functions for the hadronic ground state. The three sum rules of interest are those of Adler, of Bjorken, and of Gross and Llewellyn-Smith. (2) An understanding of the corrections to scaling in QCD and what they teach us.

In this thesis I motivate and present a search for long-lived massive R-hadrons using the data collected by the ATLAS detector in 2011. Both ionisation- and time-of-flight-based methods are described. Since no signal was found, a lower limit on the mass of such particles is set. The analysis was also published by the ATLAS collaboration in Phys. Lett. B under the title 'Searches for heavy long-lived sleptons and R-Hadrons with the ATLAS detector in pp collisions at sqrt(s) = 7 TeV'.

In models for energetic heavy-ion reactions it is assumed that during the reaction a hot and dense fireball of nuclear matter is formed from all or part of the nucleons of the target and projectile nuclei. The process is similar to chemical processes leading to dynamical equilibrium. The relaxation times necessary to establish "chemical" equilibrium among different hadrons in hot, dense hadronic matter are deduced in a statistical model. Consequences for heavy-ion collisions are discussed, and the possibility of Bose-Einstein pion condensation around the break-up time of the nuclear fireball is pointed out. (D.P.)

The techniques used to study top quarks at hadron colliders are presented. The analyses that discovered the top quark are described, with emphasis on the techniques used to tag b-quark jets in candidate events. The most recent measurements of top quark properties by the CDF and D0 Collaborations are reviewed, including the top quark cross section, mass, branching fractions, and production properties. Future top quark studies at hadron colliders are discussed, and predictions for event yields and uncertainties in the measurements of top quark properties are presented.

The lateral and longitudinal profiles of hadronic showers detected by the ATLAS iron-scintillator tile hadron calorimeter, with its longitudinal tile configuration, have been investigated. The results are based on 100 GeV pion beam data. Because the beam scan provided many different beam impact locations within the cells, a detailed picture of the transverse shower behavior was obtained. The underlying radial energy densities for four depths, and for the overall calorimeter, have been reconstructed, and a three-dimensional parametrization of the hadronic shower has been suggested.

The performances of a set of hadron calorimeter towers for measuring the hadron impact point are described. It is shown that an accuracy of 1-2 cm can be achieved with a proper treatment of the data. (orig.)
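
A standard ingredient behind cm-level impact-point accuracy of the kind quoted above is the energy-weighted centroid across towers, often refined by logarithmic weighting to reduce the centroid bias. The sketch below illustrates both estimators; the tower pitch, toy shower energies, and the cutoff w0 are all invented for illustration and are not taken from the paper.

```python
import math

tower_pitch = 10.0                       # cm between tower centers (assumed)
energies = [0.5, 6.0, 60.0, 25.0, 2.0]   # GeV in five adjacent towers (toy shower)
positions = [i * tower_pitch for i in range(len(energies))]

# Linear center of gravity of the deposited energy.
x_cog = sum(e * x for e, x in zip(energies, positions)) / sum(energies)

# Logarithmic weighting: w_i = max(0, w0 + ln(E_i / E_tot)); w0 is a tunable cutoff.
w0 = 4.0
e_tot = sum(energies)
weights = [max(0.0, w0 + math.log(e / e_tot)) for e in energies]
x_log = sum(w * x for w, x in zip(weights, positions)) / sum(weights)

print(f"linear centroid: {x_cog:.2f} cm, log-weighted centroid: {x_log:.2f} cm")
```

The "proper treatment of the data" mentioned in the abstract would additionally correct these raw estimators for the systematic bias toward tower centers.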

Construction of hadron interaction amplitudes is discussed in terms of the recently proposed new string dynamics. Inclusion of the nucleon, and of the flavor-characterizing hadron quantum numbers, into the dynamics of composite superconformal strings is discussed.

In the fragmentation of a transversely polarized quark, several left-right asymmetries are possible for the hadrons in the jet. When only one unpolarized hadron is selected, it exhibits an azimuthal modulation known as the Collins effect. When a pair of oppositely charged hadrons is observed, three asymmetries can be considered: a di-hadron asymmetry and two single-hadron asymmetries. In lepton deep inelastic scattering on transversely polarized nucleons, all these asymmetries are coupled with the transversity distribution. From the high-statistics COMPASS data on oppositely charged hadron-pair production we have investigated for the first time the dependence of these three asymmetries on the difference of the azimuthal angles of the two hadrons. The similarity of the transversity-induced single- and di-hadron asymmetries is discussed. A phenomenological analysis of the data makes it possible to establish quantitative relationships among them, providing a strong indication that the underlying fragmentation mechanisms are all driven ...
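
The kind of azimuthal modulation discussed above can be illustrated with a toy extraction: for a yield N(φ) ∝ 1 + a·sin φ, the unbinned estimator â = 2⟨sin φ⟩ recovers the amplitude a. This is a sketch of the generic technique, not of the COMPASS analysis; the injected asymmetry and sample size are assumptions.

```python
import math
import random

random.seed(7)
a_true = 0.10                        # injected Collins-like asymmetry (assumed)

def draw_phi():
    """Sample phi in [0, 2*pi) from N(phi) ~ 1 + a_true*sin(phi) by accept-reject."""
    while True:
        phi = random.uniform(0.0, 2.0 * math.pi)
        if random.uniform(0.0, 1.0 + a_true) <= 1.0 + a_true * math.sin(phi):
            return phi

sample = [draw_phi() for _ in range(200_000)]

# For this density, <sin(phi)> = a/2, so the moment estimator is 2*<sin(phi)>.
a_hat = 2.0 * sum(math.sin(p) for p in sample) / len(sample)
print(f"extracted asymmetry: {a_hat:.3f} (injected {a_true})")
```

A real analysis would of course bin in kinematic variables, fit the modulation, and correct for acceptance, but the moment estimator captures the essential statistical step.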

Preliminary results are presented of an analysis of hadron production in deep inelastic scattering at 219 GeV. Results are given on the distribution of hadron momenta, the angular correlation of hadrons with respect to the muon scattering plane, recoil proton correlations, the K±/π and K⁰/π ratios, an upper limit on D production, the multiplicity of hadrons as a function of p_T and Q², and φ⁰ production. 15 references

Recently, new definitions of shielding effectiveness (SE) for high-frequency and transient electromagnetic fields were introduced by Klinkenbusch (2005). Numerical results were shown for closed as well as for non-closed cylindrical shields. In the present work, a measurement procedure is introduced using ultra-wideband (UWB) electromagnetic field pulses. The procedure provides a quick way to determine the transient shielding effectiveness of an enclosure without performing time-consuming frequency-domain measurements. For demonstration, a cylindrical enclosure made of conductive textile is examined. The field pulses are generated inside an open TEM waveguide. From the measurement of the transient electric and magnetic fields with and without the shield in place, the electric and magnetic shielding effectiveness of the shielding material, as well as the transient shielding effectiveness of the enclosure, are derived.
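
A transient shielding effectiveness of the kind measured above can be computed from time-domain records as an energy ratio, SE_t = 10·log10(W_unshielded / W_shielded) with W = ∫|e(t)|² dt. The synthetic pulses and the 3% leakage factor below are assumptions chosen only to make the arithmetic concrete.

```python
import math

def pulse_energy(samples, dt):
    """Discrete approximation of the pulse energy integral, sum(e^2)*dt."""
    return sum(s * s for s in samples) * dt

dt = 1e-11                                    # sample spacing, s (assumed)
t = [i * dt for i in range(400)]

# UWB-like Gaussian pulse measured without the shield, and an attenuated
# copy representing the field inside the enclosure (3% leakage, assumed).
unshielded = [math.exp(-((ti - 2e-9) / 4e-10) ** 2) for ti in t]
shielded = [0.03 * s for s in unshielded]

se_t = 10.0 * math.log10(pulse_energy(unshielded, dt) / pulse_energy(shielded, dt))
print(f"transient shielding effectiveness: {se_t:.1f} dB")
```

Because the shielded record here is an exact scaled copy, the result reduces to 20·log10(1/0.03) ≈ 30.5 dB; with real measurements the shielded pulse is also distorted in shape, which is precisely why the energy-ratio definition is used.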

The thermal and moisture management of electronic enclosures have been fields of high interest in recent years. It is now generally accepted that the protection of electronic devices depends on avoiding critical levels of relative humidity (typically 60–90%) during operation. Leveraging the development of rigorously calibrated CFD models as well as simple predictive numerical tools, the current paper tackles the optimization of critical features of a typical two-chamber electronic enclosure. The progressive optimization strategy begins the design-parameter selection by initially using the simpler tools to focus the parameter-value space, before shifting to 3D CFD models for final evaluations and verification. The approach results in a system capable of predicting optimum design features for the thermal and moisture management of electronic enclosures in a time-efficient and practically implementable manner.

A curtain rose on the current world accelerator stage at the end of June, when almost 750 delegates gathered in London for the fourth biennial European Particle Accelerator Conference (EPAC). As well as reports from all major laboratories on their latest accelerator achievements and future plans, a special session featured invited contributions on high-intensity issues, while a seminar covered the increasing transfer of technology between accelerator laboratories and industry. The first invited talk of the conference, by CERN Director General Chris Llewellyn Smith, concerned the future of high-energy physics in Europe. Naturally this focused on the Large Hadron Collider project at CERN, which will open up important new physics frontiers for the 21st century.

This lecture presents a review of cyclic accelerators and their energy limitations. A description is given of the phase-stability principle and the evolution of the synchrotron, an accelerator without intrinsic energy limitation. The concept of colliding beams then emerged, effectively doubling the usable beam energy, as in the 2 trillion electron volt (TeV) Tevatron proton collider at Fermilab and the Large Hadron Collider (LHC), now planned as a 14-TeV machine in the 27-kilometer tunnel of the Large Electron Positron (LEP) collider at CERN. A presentation is then given of the Superconducting Super Collider (SSC), a giant accelerator complex with an energy of 40 TeV in a tunnel 87 kilometers in circumference under the country surrounding Waxahachie, Texas, U.S.A. These super-high-energy accelerators are intended to smash protons against protons at energies sufficient to reveal the nature of matter and to consolidate the prevailing general theory of elementary particles.
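
The advantage of colliding beams described above is in fact far more than a doubling, as a short center-of-mass energy calculation shows. Only the proton mass and the LHC design beam energy enter; the comparison uses the invariant s = (p1 + p2)² for head-on equal-energy beams versus a beam on a target at rest.

```python
import math

m_p = 0.000938272            # proton mass, TeV/c^2
E_beam = 7.0                 # LHC design beam energy, TeV

# Head-on collider with equal beam energies: sqrt(s) = 2*E_beam.
s_collider = (2.0 * E_beam) ** 2

# Fixed target: s = 2*E_beam*m_p + 2*m_p^2, so sqrt(s) grows only as sqrt(E).
s_fixed = 2.0 * E_beam * m_p + 2.0 * m_p ** 2

print(f"collider sqrt(s):     {math.sqrt(s_collider):.1f} TeV")
print(f"fixed-target sqrt(s): {math.sqrt(s_fixed):.3f} TeV")
```

Two 7-TeV beams head-on give the full 14 TeV, while a 7-TeV proton on a stationary target yields only about 0.115 TeV of usable energy, which is why every superhigh-energy machine discussed in the lecture is a collider.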