The activities and services of NANONET cover the rapidly growing fields of “Nanotechnologies & Nanobiotechnologies”. These fields are widely regarded as the most significant development lever of modern and emerging technologies and economies.

Work in the fields of “Nanosciences & Nanotechnologies” is inherently interdisciplinary. This interdisciplinarity takes shape in the Network: the research fields of the collaborating laboratories and scientists span Physics, Materials Science, Chemistry, Computing Science & Engineering, Biology, Nanobiotechnology and Medicine, and the Network is closely related to the Interdepartmental Postgraduate Program “Nanosciences & Nanotechnologies – N&N” of AUTh.

The goal of NANONET is the creation of a core that will coordinate the activities and services of AUTh, Greek and affiliated laboratories that are active in the fields of Nanosciences, Nanotechnologies & Nanobiotechnologies. Many labs from Greece, the Balkan area, Europe and the USA already participate in the network, enhancing their research, collaboration and funding prospects. The ultimate purpose of NANONET is to cooperate with still more laboratories, in Greece and throughout the world, and to strengthen its relations and bonds with industry and society.

NANONET is a thematic network with a cross-disciplinary role. It includes five vertical clusters:

- Nanobiotechnology & Nanomedicine
- Thin Films & Organic Electronics
- Nanoelectronics & Nanophotonics
- Nanomaterials, Nanoengineering & Nanomechanics
- Nanotechnology in Energy & Environment

and two horizontal clusters:

- Nanometrology & Tools
- Computational Modelling at the Nanoscale
The purpose of the different Clusters in NANONET is to reinforce existing collaborations and develop new ones between research and industry, in both the private and public sectors, and to promote each Cluster's innovative knowledge for funding purposes.

Who are the members?

Industrial enterprises, research centres, institutions and universities are invited to join NANONET: all those involved in research, development and application in one or more of the network's thematic areas.

Becoming a member gives you the chance to join a most promising international network of “Nanotechnology & Nanobiotechnology” experts.
You will then have many opportunities to influence, and become part of, the evolving future of these fields by interacting with other members of the network.
Members also have access to a privileged area of the NANONET web site.

A project to develop a promising new astronomy imaging technique that can also denude a fully clothed human or see through thick fog has generated its first picture.

A so-called T-ray image of a human hand, taken through a 1/2-inch (15 millimeter) pad of paper, is the first product of the new terahertz camera. The technology is poised to revolutionize imaging in astronomy, medicine and airport security, proponents say.

The European Space Agency’s project to develop the camera was first reported by SPACE.com last June. While largely unheralded, T-ray imaging does not appear to be pie-in-the-sky. In fact, a camera built by a company called QinetiQ, working at similar millimetric wavelengths, had already demonstrated last year the ability to peer through clothes and reveal a concealed weapon, along with much of a person’s body.

Images

A picture of a human hand, taken through a 1/2-inch (15 mm) pad of paper, is the first product of the new T-ray camera.

Derek Jenkins of the StarTiger team removes the first micro-machined silicon wafer carrying the terahertz array of sensors used in the new camera.

A fully clothed man imaged by a QinetiQ millimetric wave camera. Note the concealed gun. T-ray cameras are said to be similar but more powerful. Image used with permission.

The technique employs a little-studied but ubiquitous radiation. Detecting T-rays allows a camera to effectively see through smoke, walls and even clothing or bandages.

Low frequency versions of terahertz waves are known as millimeter waves, and they behave much like radio waves. At higher frequencies, the terahertz waves straddle the border between radio and optical emissions. The technology is sometimes referred to as quasi-optics.
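As a rough illustration of where these bands sit (not a figure from the article), the free-space wavelength for a given frequency can be computed directly:

```python
# Frequency-to-wavelength conversion across the terahertz band.
C = 299_792_458.0  # speed of light, m/s

def wavelength_mm(freq_thz: float) -> float:
    """Free-space wavelength in millimetres for a frequency in terahertz."""
    return C / (freq_thz * 1e12) * 1e3

# 0.1 THz sits in the millimetre-wave regime (~3 mm wavelength), while
# 10 THz approaches the far infrared (~0.03 mm).
print(wavelength_mm(0.1))   # ~3 mm
print(wavelength_mm(1.0))   # ~0.3 mm
print(wavelength_mm(10.0))  # ~0.03 mm
```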

Similar but less sensitive technology is already used to examine sea-surface temperatures from satellites. A future T-ray observatory might study the tails of comets, experts say, and the frequency could also shed new light on the early universe and how the first galaxies formed.

“Observations from space may be on the verge of a revolution with the possibility of looking into the terahertz frequency range,” said Peter de Maagt, project manager for StarTiger, which stands for Space Technology Advancements by Resourceful, Targeted and Innovative Groups of Experts and Researchers.

Few formal studies of T-ray technology exist, but an article on the Web site of the journal Nature last year said these cameras could be “the next big wave” in imaging for everything from cells to stars. Scientists at the Rensselaer Polytechnic Institute in New York claim T-ray technology will speed computer memory and sharpen flat-panel displays.

To develop the technology quickly, StarTiger was created by the ESA. The project brought a group of researchers together for a few months, provided ample money and facilities, and encouraged development of new technology in a short period of time. The researchers started in June, created their first T-ray image last fall, and released one this week.

“When we started last June we set an ambitious goal: to build in four months the first compact submillimeter-wave imager with near real time image capturing using state-of-the-art micro-machining technology,” said de Maagt. “We reached this goal when the first terahertz images were taken in September.”

Terahertz waves are unique because they can pass easily through some solid materials, yet they can also be focused as light to create images of objects behind the obscuring material.

Terahertz imaging may soon become a standard medical diagnostic technique, researchers with StarTiger say. T-rays could provide an image that has X-ray-like properties without the use of potentially harmful radiation. It might be particularly useful to augment dental X-rays and for possible early detection of skin cancers.

Pilots might one day use terahertz imagers to generate a picture of what’s ahead in heavy fog, StarTiger officials say. An imager with higher resolution than the current one would be needed for such a view.

The newly developed device is small enough to fit in a briefcase. A future version might one day be deployed to space to examine the early universe. If money were provided, a space-based T-ray camera could be deployed in two years, a StarTiger scientist said.

South Africa’s PBMR (Pebble Bed Modular Reactor) programme is perhaps the largest next generation reactor development effort in the world today. The project received a boost recently with the South African government’s commitment of significant funding. The proposed technology has also been evolving, with a switch from vertical to horizontal turbine orientation, changes in the core and a power uprate to 400 MWt.

**********

In October 2004, the South African government accepted a proposal to develop and market the country’s Pebble Bed Modular Reactor technology. The government also allocated significant funding for the project. At the same time, the Minister of Public Enterprises, Alec Erwin, stated an objective of eventually producing 4000 MWe to 5000 MWe of power from pebble bed reactors in South Africa to support diversification of supply and to avoid greenhouse emissions. This equates to between 25 and 30 PBMR reactors with 165 MWe output. Previously, in June 2004, the cabinet had approved a programme to train nuclear scientists.
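A back-of-the-envelope check of the fleet size quoted above, assuming the 165 MWe module output given in the article:

```python
# How many 165 MWe PBMR modules does 4000-5000 MWe of capacity imply?
MODULE_MWE = 165.0

def modules_for(target_mwe: float) -> float:
    return target_mwe / MODULE_MWE

print(modules_for(4000))  # ~24.2
print(modules_for(5000))  # ~30.3
# ...roughly the "between 25 and 30" reactors quoted above.
```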

PBMR (Pty) Ltd is supported by its current investors, Eskom, the Industrial Development Corporation of South Africa, and British Nuclear Fuels (BNFL), who share the vision of small, standardised, inherently safe, modular reactors as one of the best carbon-free alternatives for new power generation capacity around the world.

The PBMR project entails the building of a demonstration reactor project at Koeberg near Cape Town and a pilot fuel plant at Pelindaba near Pretoria. The current schedule is to start construction in 2007 and for the demonstration plant to be completed by 2010. The first commercial PBMR modules are planned for 2013 operation. To date over three million hours of engineering work have been expended in understanding the underlying German technology, developing modern applications in a direct Brayton configuration and completing the preliminary design. In this effort, the fundamental safety and economic objectives first set out for the project have been confirmed.

In November 2004 PBMR (Pty) Ltd was released to place all major contracts for the next phase. This included a contract with Mitsubishi Heavy Industries (MHI) of Japan for the next phase design and development of the PBMR helium driven turbo-generator system, as well as the core barrel assembly. MHI will do the complete design of the system, including material and seal tests. The design and validation tests provide for the special requirements of the helium working gas in the high temperature and pressure regimes. MHI will also develop the design of the core barrel assembly, which is a steel structure supporting the reactor internals, fuel spheres and the graphite reflector.

[FIGURE 1 OMITTED]

An important milestone was also reached on 22 November 2004 with the ground-breaking for the construction of a helium test facility (HTF) at Pelindaba. The HTF is a high temperature, high pressure, full height facility that will test prototype helium cycle components for the PBMR. It will simulate fuel-handling, reactivity control and shut-down systems operations in order to evaluate operability, durability, reliability and maintenance issues for individual components.

Eskom, who will purchase and operate the PBMR demonstration module, is awaiting the Minister of Environmental Affairs and Tourism’s final approval of the environmental impact assessment (EIA).

The basic concept

The PBMR power conversion is based on a single loop direct Brayton thermodynamic cycle with a helium-cooled and graphite-moderated nuclear core assembly as a heat source. The coolant gas transfers heat from the core directly to the power conversion system consisting of gas turbomachinery, electrical generator, compressors, gas coolers and recuperator.
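For intuition on the Brayton cycle mentioned above, the ideal (cold-gas-standard) cycle efficiency depends only on the pressure ratio and the working gas. The pressure ratios below are assumed example values, not PBMR design figures:

```python
# Illustrative ideal Brayton cycle efficiency for a helium working fluid.
GAMMA_HE = 5.0 / 3.0  # ratio of specific heats for monatomic helium

def ideal_brayton_efficiency(pressure_ratio: float) -> float:
    """Ideal-cycle efficiency: eta = 1 - r^(-(gamma-1)/gamma)."""
    return 1.0 - pressure_ratio ** (-(GAMMA_HE - 1.0) / GAMMA_HE)

for r in (2.0, 2.7, 3.5):
    print(f"pressure ratio {r}: ideal efficiency {ideal_brayton_efficiency(r):.3f}")
```

Real-cycle efficiency is lower, of course, because of turbomachinery losses and imperfect recuperation.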

The PBMR fuel consists of coated enriched uranium fuel kernels embedded in graphite spheres (see Figure 1). Each fuel kernel is coated with four successive layers of material. The innermost layer is porous carbon, which allows fission products to collect without creating excessive internal pressure. The next layer is pyrolytic carbon, followed by silicon carbide (a strong refractory material) and a final pyrolytic carbon layer. These outer three layers create a compound barrier against fission product release of which the silicon carbide coating plays the dominant role. This not only binds the uranium fuel and subsequent fission products, but also provides each kernel with its own miniature pressure vessel that is fundamental to exceptional safety.

A predetermined mass of these already “contained” fuel particles (each now approximately 1 mm in diameter) is then embedded inside a 50 mm graphite sphere, which is then covered with a 5 mm fuel-free graphite layer. The graphite making up the sphere acts as a moderator, and the outer layer protects the fuel particles from mechanical damage, such as abrasion. This fuel element design has proved its ability to tolerate the PBMR operating envelope for extended periods in the AVR and THTR German reactors, having been tested for over 20 years. In terms of the goals set for inherent safety, this ‘packaging’ creates an efficient barrier against the release of activity (radioactive nuclides produced by fission) during all normal operations and postulated abnormal conditions. Designing the fuel as spherical elements, coupled with the innovative on-line refuelling system ensures a neutronically homogeneous core and lower operating and upset temperatures.

The initial, 268 MWt, design

Initially, the PBMR design team developed a 268 MWt plant, which is described in this section and provides a reference base for subsequent discussion of the latest improvements in the design.

The reactor unit

The reactor unit is defined as the reactor pressure vessel and core internals, consisting of the core barrel, the graphite structures and the reactivity control units.

In the 268 MWt reference design, the reactor pressure vessel was about 6.2 m in diameter, approximately 20.5 m high, and manufactured from reactor grade forged steel with a wall thickness varying between 120 mm and 220 mm.

It had an internal steel core barrel with an internal diameter of 5.8 m and a wall thickness of 50 mm. This internal core barrel in turn supported the graphite reflector and carbon thermal shield.

The combined radial thickness of the graphite and carbon was 1 meter. The graphite reflector had 35 vertical borings to house the reactivity control units. The volume inside the graphite reflector–the core cavity–had a diameter of 3.5 m and an effective height of 8.5 m.

The core consisted of fuel spheres and a dynamic central reflector column of graphite spheres. The nominal diameter of this dynamic central reflector was 1.75 m.

[FIGURE 2 OMITTED]

The main power system

The design of the main power system consists of the reactor unit coupled to the power conversion system. Power is mainly controlled by varying the coolant mass flow rate through the power conversion system while the temperature differences across the core and the power conversion components are kept constant.

In the initial, 268 MWt, design, hot coolant leaving the core drove two vertical turbo-compressors and the vertical power turbo-generator; the so-called three-shaft system. From the power turbine, the coolant flows through the primary side of a recuperator and then through coolers and the compressors, thereafter flowing through the secondary side of the recuperator where it is preheated before re-entering the core.

The reactor core

The equilibrium core of the 268 MWt design consisted of approximately 333 000 fuel elements and 110 000 graphite spheres in the central reflector zone. Each fuel element contained 9 g of uranium having an enrichment of approximately 8.1%. The design had a target burn-up of approximately 80 000 MWd/t U.
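The quoted core figures can be cross-checked with simple arithmetic. This is a rough consistency sketch only, ignoring the details of on-line refuelling:

```python
# Rough consistency check on the 268 MWt core figures quoted above.
n_fuel_spheres = 333_000
uranium_per_sphere_g = 9.0
core_power_mwt = 268.0
target_burnup_mwd_per_t = 80_000.0

uranium_t = n_fuel_spheres * uranium_per_sphere_g / 1e6   # tonnes of uranium
energy_mwd = uranium_t * target_burnup_mwd_per_t          # total thermal energy
full_power_days = energy_mwd / core_power_mwt             # equivalent residence

print(f"Core uranium load: {uranium_t:.3f} t")            # ~3.0 t
print(f"Equivalent full-power days: {full_power_days:.0f}")  # ~895
```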

The core had 18 control rods. Nine of the control rods were used for reactivity control during load changes, and the other nine were used to shut down the reactor. The reactor design catered for a load following capability within the range 100-40-100%. The nine control rods provide reactivity compensation for xenon poisoning effects during load following operation.

An independent and diverse shutdown system was used to shut the reactor down to cold conditions. This system used small absorber spheres which were dropped into 17 borings in the reflector. Removal of these spheres was by means of a helium gas conveyance system.

The fuelling system

An on-line fuel and graphite sphere loading and unloading scheme was employed. Similar to the scheme employed in the German reactors, unloading of fuel was at the bottom of the reactor vessel. The 268 MWt design made use of a single discharge chute. A core unloading device singularised the fuel and graphite spheres, after which each sphere was assayed as fuel or graphite and for degree of burn-up. Reusable…

Most of the technology described on these pages is related to horizontal axis wind turbines (HAWTs, as some people like to call them).

The reason is simple: All grid-connected commercial wind turbines today are built with a propeller-type rotor on a horizontal axis (i.e. a horizontal main shaft).

The purpose of the rotor, of course, is to convert the linear motion of the wind into rotational energy that can be used to drive a generator. The same basic principle is used in a modern water turbine, where the flow of water is parallel to the rotational axis of the turbine blades.
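The conversion described above is governed by the standard wind power relation: available power scales with swept area and the cube of wind speed, and a rotor extracts a fraction Cp of it (at most 16/27, the Betz limit). The rotor size, wind speed and Cp below are hypothetical examples, not figures from the text:

```python
# Minimal sketch of the power a rotor can extract from the wind.
import math

RHO_AIR = 1.225  # kg/m^3, air density at sea level

def rotor_power_w(radius_m: float, wind_speed_ms: float, cp: float) -> float:
    """P = 0.5 * rho * A * v^3 * Cp, with A the swept disc area."""
    area = math.pi * radius_m ** 2
    return 0.5 * RHO_AIR * area * wind_speed_ms ** 3 * cp

# A hypothetical 40 m rotor in an 8 m/s wind at Cp = 0.4:
print(rotor_power_w(40.0, 8.0, 0.4) / 1e3, "kW")
# Doubling the wind speed multiplies the power by eight (cubic dependence).
```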

As you will probably recall, classical water wheels let the water arrive at a right angle (perpendicular) to the rotational axis (shaft) of the water wheel.

Vertical axis wind turbines (VAWTs as some people call them) are a bit like water wheels in that sense. (Some vertical axis turbine types could actually work with a horizontal axis as well, but they would hardly be able to beat the efficiency of a propeller-type turbine).

The only vertical axis turbine which has ever been manufactured commercially at any volume is the Darrieus machine, named after the French engineer Georges Darrieus who patented the design in 1931. (It was manufactured by the U.S. company FloWind which went bankrupt in 1997). The Darrieus machine is characterised by its C-shaped rotor blades which make it look a bit like an eggbeater. It is normally built with two or three blades.

The basic theoretical advantages of a vertical axis machine are:

1) You may place the generator, gearbox etc. on the ground, and you may not need a tower for the machine.

2) You do not need a yaw mechanism to turn the rotor against the wind.

The basic disadvantages are:

1) Wind speeds are very low close to ground level, so although you may save a tower, your wind speeds will be very low on the lower part of your rotor.

2) The overall efficiency of the vertical axis machines is not impressive.

3) The machine is not self-starting (e.g. a Darrieus machine will need a “push” before it starts. This is only a minor inconvenience for a grid-connected turbine, however, since you may use the generator as a motor drawing current from the grid to start the machine).

4) The machine may need guy wires to hold it up, but guy wires are impractical in heavily farmed areas.

5) Replacing the main bearing for the rotor necessitates removing the rotor on both a horizontal and a vertical axis machine. In the case of the latter, it means tearing the whole machine down. (That is why EOLE 4 in the picture is standing idle).

More than 100 port operators, developers, investors and wind manufacturers from across the UK met yesterday with the Government to cut through a potential bottleneck in offshore wind farm development.

With the potential market for UK ports worth £1bn up to the year 2020, there is an extraordinary opportunity for ports to be involved in the supply of services to manufacturers and developers of offshore wind farms.

The UK needs ports with the capacity to handle large vessels and with available space for wind turbine manufacturers and their supply chain.

At the moment, there are too few sites to meet future demand for offshore wind technology, although port operators have started to recognise the potential revenue opportunities from offshore wind.

Energy Minister Mike O’Brien said in his speech: “We want the UK to make the world’s biggest investment in offshore wind. We are an island nation with a fantastic wind resource.

“Britain’s ports could become the hub of activity and economic opportunity as we massively increase the amount of renewable energy we get from our seas – and could be key to constructing and transporting wind technology.

“Thousands of jobs could be created as a result of the construction of wind farms – including in our ports which will face enormous demand. Today’s seminar will for the first time discuss what Britain’s ports have to gain from the development of offshore wind, and what needs to happen to make sure that government, energy developers and Britain’s ports work together so that all benefit.”

The meeting will inform the offshore wind industry about the opportunities and discuss how to co-ordinate the parties involved in developing ports.

While the success of vertical wind turbines or “egg beater” turbines for electricity generation is extremely limited, there are circumstances under which they can outperform horizontal axis or “propeller” turbines.
First, vertical axis wind turbines are easier to maintain than their horizontal counterparts because their moving parts are located near the ground. This is due to the vertical wind turbine’s configuration, which is something like an ordinary windmill lying on its back with its “face” to the sky. The airfoil or rotor blades are connected by arms to a shaft that rests on a bearing and drives a generator below, usually through a gearbox. Second, as the rotor blades are vertical, a yaw device is not needed, making the machine more cost effective than a horizontal axis turbine. (A yaw device turns the blades of a horizontal axis wind turbine to face into the wind.)

Third, vertical axis wind turbines are not tall (remember they are like a horizontal turbine lying down) and this configuration can be useful where laws do not permit very tall structures. Fourth, small vertical turbines are quite easy to transport and install.

Fifth, they do not need a free-standing tower, so they are less expensive and stronger in the high winds that are closer to the ground.

Sixth, they have a lower tip speed ratio (TSR), so they are less likely to break in high winds.

Seventh, they don’t need to be pointed into the wind. They can turn regardless of wind direction.

Eighth and most importantly, and this is the main area in which vertical wind turbines can out-perform the horizontal type: if vertical turbines are placed on the ground on high prominences (mesas, hilltops, ridgelines, etc.) or in passes, they can produce more power than horizontal axis turbines.

The benefits of vertical axis wind turbines must, however, be weighed against their significant disadvantages. Most types of vertical turbines produce energy at only 50% of the efficiency of horizontal turbines due to their drag action. They are limited in terms of height. They also need to be constructed on relatively flat land, unlike the horizontal types.

Most vertical turbines have low starting torque and may require energy to start turning. In addition, vertical wind turbines requiring guy wires to hold them in place put stress on the bottom bearing, as all the weight of the rotor is on that bearing. Guy wires attached to the top bearing increase downward thrust in wind gusts. Solving this problem requires a superstructure to hold a top bearing in place in guy-wired models, to eliminate the downward thrusts when there is gusting wind.
Although the main parts of an installed vertical wind turbine are located at ground level, they sit under the weight of the structure above, which can make replacing parts nearly impossible without dismantling the whole machine, depending on the design.

http://www.ndsu.nodak.edu/ndsu/klemen – This site has information about small manufactured wind turbines as well as other information about wind power.

http://www.dsgnspec.com – Rob has some great stuff for us experimenters… lots of electronic gizmos we all gotta have. This is where the star/delta switch I used in my downwind turbine project came from, and it works flawlessly! Also, if you're looking for a small hand-held tach, the non-contact type, you gotta check out the “Tach JR”. Perfect little tach at the right price! Don't miss out!

http://www.learnonline.com – Great place to learn about renewable energy for young and old. A new and interesting way of learning.

Here are a couple of great links to start with… There will be more added as time goes on.

www.otherpower.com They sell magnets and lots of other goodies we like to play with. Also their site offers a world of information. They also have another site where they sell their magnets and other trinkets to play with at www.wondermagnet.com

www.ScoraigWind.co.uk I learned a lot from Hugh Piggott, and would strongly recommend his books to anyone getting involved with wind power.

http://www.green-trust.org has a wealth of information on just about any kind of renewable energy you can think of. Expect to spend some time on this site…

www.utterpower.com – Slow speed engines, generators, and more… Lots of cool stuff we like to play with…

www.dragonflypower.com – lots of good information on wind power as well as a nicely built wind turbine using an auto alternator.

http://www.ecs-solar.com – The Solar Industry’s Water Heater Bible
“Hot Water Systems: Lessons Learned 1977 to Today” – Solar Hot Water and Pool Heating Design / High Performance Low Maintenance Systems / Reality Checks Using Current Technology… a definitive how-to book for installing and maintaining high-performance and low-maintenance solar hot water systems, written by one of the leaders in solar contracting today.

San Francisco Bay and Northern California specializing in residential solar power installations.

http://www.clean-energy-ideas.com/wind_turbines.html – Discover how a home wind turbine could save money on your electricity bill. We also offer many environmental articles relating to solar, wind and geothermal power, and this includes a section on deciding if a wind turbine can be efficient in your area.

www.energyplanet.info – Lots of information in many categories of renewable and alternative energies. Plan to spend some time here… tons of info.

Intro

Perhaps the most obvious type of device, the horizontal axis marine current turbine (HAMCT) shares almost all its attributes with the more familiar wind turbine. The principle of operation is identical, and energy conversion is performed in a similarly elegant manner. Only the massively increased energy density of water and the possibility of cavitation, coupled with the destructive wear and tear on a submerged structure, result in any change in design, with the consequence that HAMCTs are stockier and stubbier than wind turbines.
Like its wind power counterpart, the horizontal turbine must be yawed into the flow, although in the case of bidirectional tidal stream flow it is sufficient to collectively pitch the blades according to flow direction and operate the turbine in both up- and downflow conditions.

Principle of operation: the resultant hydrodynamic forces from inflow (largely) perpendicular to the rotor have components acting in the plane of the rotor, normal to the axis of the blade. The resulting torque is transferred via a shaft and gearbox to an alternator.

Modelling: blade element momentum theory is applied to a generic model to calculate shaft power and rotor thrust. This approach is computationally quick, gives accurate results, and may be automated for various flow speeds, rotational speeds and rotor geometries.
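The blade element momentum step can be sketched as follows. This is a minimal, generic single-element iteration, not the project's actual program: there is no tip-loss correction, the lift and drag coefficients are thin-aerofoil placeholders standing in for the aerofoil lookup table discussed later, and every numeric input is a hypothetical example.

```python
# Minimal blade element momentum (BEM) iteration for one annular element.
import math

def bem_element(r, dr, B, chord, twist, U, omega, rho=1025.0,
                tol=1e-3, max_iter=200):
    """Return (thrust, torque) contributions of one annular element.

    r, dr  : element radius and width (m)      B     : blade count
    chord  : local chord (m)                   twist : local twist (rad)
    U      : freestream flow speed (m/s)       omega : rotor speed (rad/s)
    """
    a, a_tan = 0.0, 0.0                       # axial / tangential induction
    sigma = B * chord / (2.0 * math.pi * r)   # local solidity
    for _ in range(max_iter):
        phi = math.atan2(U * (1.0 - a), omega * r * (1.0 + a_tan))
        alpha = phi - twist                   # local angle of attack
        cl = 2.0 * math.pi * alpha            # thin-aerofoil lift placeholder
        cd = 0.01 + 0.02 * alpha ** 2         # crude drag placeholder
        cn = cl * math.cos(phi) + cd * math.sin(phi)
        ct = cl * math.sin(phi) - cd * math.cos(phi)
        a_new = 1.0 / (4.0 * math.sin(phi) ** 2 / (sigma * cn) + 1.0)
        at_new = 1.0 / (4.0 * math.sin(phi) * math.cos(phi) / (sigma * ct) - 1.0)
        done = abs(a_new - a) < tol and abs(at_new - a_tan) < tol
        a, a_tan = a_new, at_new
        if done:
            break
    w2 = (U * (1.0 - a)) ** 2 + (omega * r * (1.0 + a_tan)) ** 2
    dT = 0.5 * rho * w2 * B * chord * cn * dr       # thrust contribution, N
    dQ = 0.5 * rho * w2 * B * chord * ct * r * dr   # torque contribution, N*m
    return dT, dQ

# One mid-span element of a hypothetical 3-bladed rotor in a 2.5 m/s flow:
dT, dQ = bem_element(r=5.0, dr=0.1, B=3, chord=0.8,
                     twist=math.radians(6.0), U=2.5, omega=1.0)
```

Summing dT and dQ over all elements along the blade span gives rotor thrust and shaft torque, and shaft power follows as torque times rotational speed.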

The Horizontal Axis MCT Model

Due to the previously mentioned non-availability of information, the model has had to be generated in the most generic sense. This does not preclude similarity with existing commercial devices, as mentioned, although for reasons to be covered, the physical characteristics of our turbines most closely resemble those used by Bahaj et al.

The primary concern is structure. Common sense dictates that an MCT will closely resemble a wind turbine, while a consideration of the massively increased loading indicates that such turbines will be more compact so that the structure can be less massive. This applies mainly to the blades, and the result is that a massively thick aerofoil section is required at the root, with the thickness along the blade remaining relatively higher than that of a wind turbine.

Secondary is cavitation. If, as described in the hydrodynamics section, the local cp drops below a critical value, cavitation will occur in the region of low pressure on the hydrodynamic top of the blade. As discussed, the results are initially a drop in performance and, in the long term, a reduction in the integrity of the blade surface. As the tendency to cavitate depends primarily on the local dynamic pressure, and hence on the square of the local (relative) velocity, it is likely that cavitation will first occur at the blade tip. Since an aerofoil’s cp is largely dependent on angle of attack, it is apparent that to avoid cavitation, twist at the blade tip and blade tip velocity must be considered.
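The criterion above can be expressed as a quick check: cavitation is likely when the magnitude of the most negative local pressure coefficient exceeds the local cavitation number. All input values below (depth, relative velocity, cp) are illustrative assumptions, not figures from the model:

```python
# Minimal cavitation check at the blade tip.
P_ATM = 101_325.0   # Pa, atmospheric pressure at the surface
P_VAP = 2_300.0     # Pa, water vapour pressure near 20 C
RHO = 1025.0        # kg/m^3, seawater density
G = 9.81            # m/s^2

def cavitation_number(depth_m: float, w_rel_ms: float) -> float:
    """sigma = (p_atm + rho*g*h - p_vap) / (0.5 * rho * W^2)."""
    return (P_ATM + RHO * G * depth_m - P_VAP) / (0.5 * RHO * w_rel_ms ** 2)

def will_cavitate(cp_min: float, depth_m: float, w_rel_ms: float) -> bool:
    """Cavitation inception when -cp_min exceeds the cavitation number."""
    return -cp_min > cavitation_number(depth_m, w_rel_ms)

# A tip at 10 m depth with a 12 m/s relative velocity and cp_min = -2.0:
print(cavitation_number(10.0, 12.0))    # sigma ~ 2.7
print(will_cavitate(-2.0, 10.0, 12.0))  # False: some margin remains
```

Note that the dependence on the square of the relative velocity is what pushes the inception point out to the blade tip, as the text observes.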

A tertiary consideration is aerofoil section, and this depends on the first two considerations. As long as the aerofoil is thick enough to accommodate the required structure, whilst being chosen to reduce the likelihood of cavitation, the section itself is not of primary importance (within reason).

Within these bounds, it is desirable that the blade develops as much torque as possible, and it is found that the relatively low tangential velocity at the root requires a large amount of twist for an optimal angle of attack to the resultant velocity, and a relatively large chord to accommodate the thick aerofoil and to increase tangential thrust. Towards the tip it is desired that lift should drop off, for two reasons: the first is structural considerations, and the second is the dependence of an aerofoil's lift generation on local dynamic pressure. The result is a blade tangential thrust distribution that is almost elliptical. An elliptical distribution would be optimal, but would lead to the following blade design, and is obviously non-realisable:

As a result of this consideration, our model is as follows (for actual values see HAMCT program available for download here):

The precise geometry has been compared to that of Bahaj in order to ensure that it is sensible.

The blade element momentum program is then given the geometry of the device as input, together with a suitable lookup table of aerofoil characteristics over the appropriate Reynolds number range. A module is added to the program that calculates the local cp at the blade tip and indicates likely cavitation. Finally, the model is run for a series of freestream velocities and RPMs, and within the cavitation bounds the optimum RPM and pitch setting for the blades is computed at each velocity.

This calculation has been performed in similar conditions to the experiments of Bahaj et al, and the results for this validation case follow. The tolerance for convergence of the induction factors is set at 1×10⁻³ as a compromise between accuracy and computational time, and gives an average convergence time of 0.0016 seconds per blade element (on an uninspiring desktop PC). The number of blade elements is calculated as if each blade element were spaced 1 mm from its neighbours, the elements themselves being infinitesimally thin.

The model is now reconfigured to run calculations between 0.5 and 7 m/s, considered the maximum range of realistically expectable flow speeds. The radius is increased in 0.5 m increments from 2.5 m to 20 m, which is considered the absolute maximum for West of Scotland conditions. The rotational speeds are incremented in revolutions per minute from a minimum of 1 RPM up to the integer immediately above that at which cavitation is likely to begin. This does result in some farcically low rotational speeds at the larger end of the radius range, but does yield a suitable number of results for various flow conditions. The blade collective pitch setting is also varied from -5 to 20 degrees. The output screen from the program run for a radius of 10 m is displayed below.
Thus, assuming the turbine is to be run at CPmax in all flow conditions, the following results have been obtained. They are fairly self-explanatory, but will be discussed in more detail within the methodology sections.

Results from the model may be downloaded in PDF format from the results download package here.

Plot of the zenith atmospheric transmission on the summit of Mauna Kea over the 1–3 THz range of the electromagnetic spectrum, at a precipitable water vapor level of 0.001 mm (simulated).

The Earth’s atmosphere is a strong absorber of terahertz radiation, so the range of terahertz radiation is quite short, limiting its usefulness for communications. In addition, producing and detecting coherent terahertz radiation was technically challenging until the 1990s.

There have also been solid-state sources of millimeter and submillimeter waves for many years. AB Millimeter in Paris, for instance, produces a system that covers the entire range from 8 GHz to 1000 GHz with solid state sources and detectors. Nowadays, most time-domain work is done via ultrafast lasers.

In mid-2007, scientists at the U.S. Department of Energy’s Argonne National Laboratory, along with collaborators in Turkey and Japan, announced the creation of a compact device that could lead to portable, battery-operated sources of T-rays, or terahertz radiation. The group was led by Ulrich Welp of Argonne’s Materials Science Division.[1] This new T-ray source uses high-temperature superconducting crystals grown at the University of Tsukuba, Japan. These crystals comprise stacks of Josephson junctions, which exhibit a unique electrical property: when an external voltage is applied, an alternating current flows back and forth across the junctions at a frequency proportional to the strength of the voltage; this phenomenon is known as the Josephson effect. These alternating currents then produce electromagnetic fields whose frequency is tuned by the applied voltage. Even a small voltage – around two millivolts per junction – can induce frequencies in the terahertz range, according to Welp.
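The voltage-to-frequency relation quoted here is the AC Josephson effect, f = 2eV/h, and a quick check confirms that roughly two millivolts per junction does land in the terahertz range:

```python
# AC Josephson effect: a junction biased at voltage V oscillates at
# f = 2*e*V/h, i.e. about 483.6 GHz per millivolt of bias.
E_CHARGE = 1.602176634e-19  # elementary charge, C
H_PLANCK = 6.62607015e-34   # Planck constant, J*s

def josephson_freq(volts):
    return 2 * E_CHARGE * volts / H_PLANCK

f = josephson_freq(2e-3)   # ~2 mV per junction, as quoted in the article
print(f / 1e12, "THz")     # ~0.97 THz
```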

In 2008, engineers at Harvard University announced they had built a room-temperature semiconductor source of coherent terahertz radiation. Until then, sources had required cryogenic cooling, greatly limiting their use in everyday applications.[2]

Terahertz radiation is non-ionizing, and thus, unlike X-rays, is not expected to damage tissues and DNA. Some frequencies of terahertz radiation can penetrate several millimeters of tissue with low water content (e.g. fatty tissue) and reflect back. Terahertz radiation can also detect differences in water content and density of a tissue. Such methods could allow effective imaging-based detection of epithelial cancer that is safer, less invasive, and less painful than existing techniques.

Some frequencies of terahertz radiation can be used for 3D imaging of teeth and may be more accurate and safer than conventional X-ray imaging in dentistry.

Security:

Terahertz radiation can penetrate fabrics and plastics, so it can be used in surveillance, such as security screening, to uncover concealed weapons on a person remotely. This is of particular interest because many materials of interest have unique spectral “fingerprints” in the terahertz range, offering the possibility of combining spectral identification with imaging. Passive detection of terahertz signatures avoids the bodily-privacy concerns of other screening methods by being targeted to a very specific range of materials and objects.[3]
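The fingerprint-matching idea can be sketched as a nearest-match search over a spectral library. Everything below is illustrative: the library entries are made-up absorption values on a fixed frequency grid, not real terahertz spectra, and cosine similarity is just one plausible choice of matching score.

```python
import math

# Toy spectral library: material -> absorption values on a fixed
# frequency grid (invented numbers for illustration only).
LIBRARY = {
    "RDX":     [0.1, 0.8, 0.3, 0.9, 0.2],
    "lactose": [0.7, 0.2, 0.6, 0.1, 0.4],
    "PETN":    [0.2, 0.3, 0.9, 0.4, 0.8],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def identify(spectrum):
    """Return the library material whose fingerprint best matches the
    measured spectrum (highest cosine similarity)."""
    return max(LIBRARY, key=lambda name: cosine(spectrum, LIBRARY[name]))

# A noisy measurement resembling the first library entry:
print(identify([0.15, 0.75, 0.35, 0.85, 0.25]))  # -> RDX
```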

Recently developed methods of THz time-domain spectroscopy (THz TDS) and THz tomography have been shown to be able to perform measurements on, and obtain images of, samples which are opaque in the visible and near-infrared regions of the spectrum. The utility of THz-TDS is limited when the sample is very thin, or has a low absorbance, since it is very difficult to distinguish changes in the THz pulse caused by the sample from those caused by long term fluctuations in the driving laser source or experiment. However, THz-TDS produces radiation that is both coherent and broadband, so such images can contain far more information than a conventional image formed with a single-frequency source.
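The core THz-TDS analysis step can be illustrated with synthetic data: Fourier-transform the reference and sample pulses and take their ratio, giving a complex transmission spectrum (amplitude and phase at each frequency). The pulse shapes and numbers below are assumptions for demonstration; a naive DFT is used to keep the sketch self-contained.

```python
import cmath, math

def dft(x):
    """Naive discrete Fourier transform (fine for a short demo trace)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * f * t / n) for t in range(n))
            for f in range(n)]

n, dt = 256, 0.05               # samples and time step in ps (assumed)
t0, width, f0 = 6.0, 1.0, 1.0   # pulse centre (ps), width (ps), ~1 THz carrier

def pulse(t, amp, delay):
    return amp * math.exp(-((t - t0 - delay) / width) ** 2) \
               * math.cos(2 * math.pi * f0 * (t - t0 - delay))

ref    = [pulse(i * dt, 1.0, 0.0) for i in range(n)]
sample = [pulse(i * dt, 0.5, 0.4) for i in range(n)]  # attenuated + delayed

R, S = dft(ref), dft(sample)
k = max(range(n // 2), key=lambda i: abs(R[i]))  # strongest frequency bin
T = S[k] / R[k]                                  # complex transmission
print(abs(T))  # amplitude transmission ~0.5; the phase encodes the delay
```

The phase of T at each frequency carries the sample-induced delay, which is why a single broadband pulse yields far more information than a single-frequency image.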

A primary use of submillimeter waves in physics is the study of condensed matter in high magnetic fields, since at high fields (over about 15 teslas), the Larmor frequencies are in the submillimeter band. This work is performed at many high-magnetic field laboratories around the world.
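A quick sanity check on that claim, worked for electron spin precession with the free-electron g-factor (g ≈ 2.0023):

```python
# Electron Larmor (spin precession) frequency: f = g * mu_B * B / h.
MU_B = 9.2740100783e-24  # Bohr magneton, J/T
H    = 6.62607015e-34    # Planck constant, J*s
C    = 2.99792458e8      # speed of light, m/s

B = 15.0                   # field strength quoted in the text, tesla
f = 2.0023 * MU_B * B / H  # ~420 GHz
print(f / 1e9, "GHz;", C / f * 1e3, "mm wavelength")  # ~0.7 mm: submillimeter
```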

The terahertz band, covering the wavelength range between 0.1 and 1 mm, is identical to the submillimeter wavelength band. However, the term “terahertz” is typically used more often, particularly in marketing, in relation to generation and detection with pulsed lasers, as in terahertz time-domain spectroscopy, while the term “submillimeter” is used for generation and detection with microwave technology, such as harmonic multiplication.
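The correspondence between the two naming conventions is just c = f·λ: the quoted 0.1–1 mm wavelength band maps to roughly 0.3–3 THz.

```python
C = 2.99792458e8  # speed of light, m/s

def freq_thz(wavelength_mm):
    """Frequency in THz corresponding to a free-space wavelength in mm."""
    return C / (wavelength_mm * 1e-3) / 1e12

print(freq_thz(1.0))   # ~0.3 THz at 1 mm
print(freq_thz(0.1))   # ~3 THz at 0.1 mm
```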

14 March 2003

T-ray Triumph

T-ray vision. T-rays can penetrate many materials with non-damaging radiation and give highly detailed images, such as this slice of a tooth. A team has now produced high power, high quality T-rays for research.

X rays may be as familiar as your local dentist’s office or airport security checkpoint, but it’s unlikely that you’ve ever encountered a powerful T-ray. This long-wavelength infrared light could be used for a host of possible applications, but until recently, it’s been impossible to generate strong enough beams. Last year researchers used a linear accelerator to generate the most powerful T-ray beam ever made. Now, in the 7 March print issue of PRL, a team describes a way to produce a more stable and useful beam. They modified an electron storage ring–a facility for producing powerful x rays–to produce high-power T-rays. The results prove that a planned, dedicated T-ray facility in the US is realistic.

T-rays–beams of terahertz radiation–are useful because they can penetrate many materials, including living cells, without damaging them. At the same time, they provide a spectrum that’s highly sensitive to the material’s composition. The possible applications for T-rays range from basic research, such as studying the properties of superconductors, to medical imaging, and even security. One British company has already developed a way to use T-rays to image skin cancer, and some researchers envision using them to screen for biological and chemical weapons. But while LEDs, lasers, and vacuum tubes can create powerful beams at higher and lower frequencies, there’s no easy source for high-power terahertz radiation.

Last year, a team at the Thomas Jefferson National Accelerator Facility in Virginia found a way to fill this “terahertz gap,” using a linear accelerator to produce a 20-watt T-ray beam [1]. Now researchers at the BESSY synchrotron radiation facility in Berlin, Germany, have used a similar technique and obtained a higher quality beam.

At a synchrotron radiation source, electrons emit radiation as they are guided by magnets around a storage ring. Synchrotron facilities are optimized to emit powerful x rays, which have a variety of uses in research, but they also emit weak T-rays. The T-rays are weak because, explains researcher Gode Wüstefeld of BESSY, the electrons racing around a storage ring travel in discrete bunches that are about five millimeters long. Because the wavelength of terahertz radiation is less than one millimeter, the billion or so electrons in each bunch emit their radiation out of phase–the waves cancel each other out, and the total T-ray power is low.

In their new study, the BESSY team demonstrated that by adjusting the strength of some of the magnets in the storage ring, they could reduce the bunch length to about one millimeter. The electrons in each bunch then act as one giant “macro-particle” and emit coherent T-rays–a laser-like beam of far infrared light. “This sounds very simple,” says Wüstefeld. “But when we do this, we must do a lot of corrections, because the machine will not digest these changes.”
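The bunch-length argument can be made quantitative with the standard coherent-emission scaling: N electrons in a bunch radiate a total power proportional to 1 + (N−1)·F(λ), where F is the bunch form factor. The Gaussian form factor and the bunch/wavelength numbers below are illustrative assumptions (the exact factor depends on the bunch profile), but they reproduce the article's point that shortening the bunch switches the emission from incoherent to coherent.

```python
import math

def enhancement(n_electrons, sigma_mm, wavelength_mm):
    """Ratio of total radiated power to the incoherent level (N times one
    electron), for N electrons in a Gaussian bunch of rms length sigma:
    P/P_incoherent = 1 + (N-1)*F, with form factor
    F = exp(-(2*pi*sigma/lambda)**2)."""
    F = math.exp(-((2 * math.pi * sigma_mm / wavelength_mm) ** 2))
    return 1 + (n_electrons - 1) * F

N = 1e9  # electrons per bunch, order of magnitude from the article
print(enhancement(N, sigma_mm=5.0, wavelength_mm=2.0))  # ~1: incoherent
print(enhancement(N, sigma_mm=1.0, wavelength_mm=2.0))  # >>1: coherent
```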

In the Jefferson Lab’s linear accelerator, a different, newly-produced bunch of electrons generates each terahertz radiation pulse. In an electron storage ring, where the same electron bunches circulate over and over, the beam is more stable and has a higher signal-to-noise ratio.

Researchers at the Lawrence Berkeley National Laboratory (LBNL) in California are particularly pleased to see these results. They recently proposed building a dedicated T-ray electron storage ring at their lab. “We’d been proposing this new source before the BESSY results, but we were proposing it based on a phenomenon that had never been observed,” says LBNL’s John Byrd. “We knew it should be there, but nobody had observed it. So it was very reassuring to us that they’ve observed it at BESSY.”

With a flash and a bang, a pellet of explosive detonates in a cavernous laboratory on the outskirts of the Pennsylvania State University campus. The explosive, triacetone triperoxide, is the one that terrorists reportedly used in their attack on the London subway in July 2005. Minutes after the lab explosion, engineers–some with bulky ear-protection gear still in place–stare at a laptop screen as they scan frame after frame of high-speed images. Beyond the flame and flying debris, the scientists focus on the ephemeral supersonic shock waves that emanate from the blast. The waves appear in the pictures as rings, ripples, or streaks.

In studies by the Penn State investigators and others, a rare marriage of technologies is yielding unprecedented visualizations of the shock waves created by a variety of phenomena. The researchers have combined modern high-speed digital video with techniques known as shadowgraphy and schlieren imaging, which date back centuries.

In the 19th century, scientists used basic forms of the techniques for detecting flaws in lenses or for visualizing shock waves made by bullets. Today, researchers can study not only the meters-wide blast of a lump of TATP but also kilometers-long pressure patterns from the supersonic flight of an aircraft.

Similar visualizations are illuminating the complex behavior and destructive impacts of shock waves in past and potential aviation disasters, says Penn State’s Gary S. Settles, who heads the lab in University Park. The investigators have also been capturing extraordinary footage of gunshots, and their analyses may alter the way in which weapons experts interpret some types of forensic evidence.

“A good fluid dynamicist knows you have to see the flow to know what’s going on,” says physicist Leonard M. Weinstein of the NASA Langley Research Center in Hampton, Va., who pioneered some of the visualization techniques.

Both techniques rely on the way light rays refract at the boundaries between air masses of different densities. The same phenomenon causes the twinkle of stars and the distorted appearance of objects on the far side of a patch of hot pavement.

The more rudimentary of the two methods, shadowgraphy, requires only a brilliant light source, an air disturbance, a glossy surface on the opposite side of the disturbance, and a camera to take the picture.

Consider a shadowgraph of the air above a heater. Rays of light bend as they pass through rising air currents. So, some parts of the glossy surface receive more light than they would in the absence of the air disturbance, and others receive less. The camera captures the image as a set of density ripples.
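The ray bending behind such an image can be estimated with the Gladstone-Dale relation for air (n − 1 = K·ρ): the deflection angle is roughly the transverse index gradient times the path length. The plume geometry and densities below are assumptions chosen only to show the order of magnitude involved.

```python
# Deflection of a light ray crossing a refractive-index gradient:
# epsilon ~ (dn/dy) * L, with n - 1 = K * rho (Gladstone-Dale, air).
K = 2.26e-4          # Gladstone-Dale constant for air, m^3/kg (approximate)

rho_ambient = 1.20   # kg/m^3, room air (assumed)
rho_hot     = 1.00   # kg/m^3, heated air in the plume (assumed)
dy          = 0.01   # m, transverse extent of the density change (assumed)
L           = 0.10   # m, path length through the plume (assumed)

dn_dy   = K * (rho_ambient - rho_hot) / dy   # index gradient, 1/m
epsilon = dn_dy * L                          # deflection angle, radians
print(epsilon * 1e3, "milliradians")         # fraction-of-a-milliradian bending
```

Deflections this small are why shadowgraphy needs a bright source and a long optical lever arm to turn the bending into visible light-and-dark ripples.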

More sophisticated and sensitive than shadowgraphy, schlieren imaging has typically required a bright light, a pair of parabolic mirrors–one in front of and the other behind the air disturbance–and a sharp-edged obstacle. One mirror collects light rays from the lamp and reorients them into a beam of parallel rays aimed at the disturbance. The other mirror collects the projected image and focuses it onto a spot. There, the sharp obstacle blocks many of the divergent rays, deepening the dark areas in the density ripples. Finally, the remaining light goes to a screen or into a camera that records the image.

Because of the focusing and the heightened contrast caused by the obstacle, schlieren imaging can visualize subtler fluctuations than shadowgraphy can, Settles says. On the other hand, schlieren systems typically can make images only of subjects that fit within the light beam created by the first mirror.

For decades, the prohibitive cost of large, high-quality mirrors of the sort made for telescopes relegated most schlieren imaging to small-scale phenomena such as bullets breaking the sound barrier and miniature models of aircraft in wind tunnels.

Then, in the early 1990s, Weinstein figured out ways to make schlieren pictures without mirrors. Other researchers had considered some of these methods but didn’t implement them, Weinstein says.

He illuminated subjects with light bounced off a backdrop of the light-reflecting material, called retroreflective sheeting, that’s used in highway signs. Weinstein added vertical black stripes. He used a large-format camera, rather than a mirror, for focusing images. A transparent photographic negative, adorned, like the sheeting, with vertical black stripes, sits in front of the film inside the camera. Because the black stripes of the negative exactly fill the spaces between the stripes on the camera’s image of the backdrop, the interlocking patterns cut off nearly all light rays except those bent by refraction.

That work led to the unprecedented, full-scale imaging of a range of previously invisible phenomena, including the shock waves created by blasts, flows of gases from industrial equipment, and wavy plumes of air around people, ovens, and air conditioners.

Besides opening indoor schlieren systems to large subjects, Weinstein invented a technique for taking schlieren shots outdoors and observing a truly huge phenomenon–the sky-filling shock waves from supersonic aircraft in flight.

For that last feat, a camera-equipped, sun-tracking telescope observes through a slit a sliver of the sky that includes the edge of the sun. As an aircraft flies through that sliver, the camera composes an image of the plane and the sky above and below it from successive views through the slit.

Because refraction causes more or less of the sunlight in the sliver to pass through the slit, the schlieren image reveals the otherwise invisible shock waves streaming away from the aircraft. “I’ve taken one [schlieren image] where the shock wave reached 12,000 feet below the plane,” Weinstein notes.

HAVING A BLAST Working with Weinstein a decade ago, Settles and his colleagues built the world’s largest indoor schlieren-imaging system. It’s in an old warehouse on the outskirts of the Penn State campus. The lab’s retroreflective backdrop, which hangs on one wall, measures about 5 meters by 5 m.

The jumbo size of that setup has enabled the Penn State team to undertake schlieren observations of simulations of bomb blasts in passenger aircraft. In tests funded by the Federal Aviation Administration, the researchers constructed a full-scale mockup of a portion of an aircraft passenger cabin and observed the effects of explosions under a passenger seat. They also recorded shock waves emerging from suitcases after a blast in a 60 percent mockup of an aircraft luggage compartment.

The studies, which the team described in 2003, provided the first direct, visual evidence of the shock-wave reverberations that could cause fuselage damage far from the site of a blast within a plane. The images have led to “useful information for the future so that aircraft can be designed to better withstand an explosion,” Settles says.

Recently, the Penn State team has taken remarkable pictures of firearms discharging that show much more than typical schlieren images do. Whereas the old method typically showed shock waves from the bullet and in the immediate vicinity of a muzzle, the newer, wide-view schlieren pictures also reveal shock waves and evidence of hot combustion gases extending for several meters.

“We all knew that that stuff was there, but [Settles] was one of the first to visualize it at that size. Previously, you could only look at a section of it,” says Andrew Davidhazy of the Rochester (N.Y.) Institute of Technology. He teaches shadowgraphy and schlieren techniques.

By depicting how shock waves and combustion-gas clouds interact with nearby objects, including the shooter, images could someday enable criminologists to more accurately link aspects of shootings, such as the weapon used and the distance at which it was fired, to powder burns or other gunshot effects on victims and suspects, Settles says. Davidhazy agrees: “The more information you can get about a [fired] round, the better your forensic analysis is going to be.”

FAST FORWARD Shadowgraphy went big-time earlier than schlieren imaging did, but until recently, large-scale shadowgraphy didn’t catch on. In the late 1950s, high-speed-photography pioneer Harold Edgerton demonstrated large-scale shadowgraphy by upgrading to a large, retroreflective screen. Using a roughly 1-m-by-2-m screen, rather than one the size of a dinner plate, he photographed the explosion of a dynamite cap.

More recently, Settles and his colleagues ushered shadowgraphy into the 21st century by projecting images onto an even bigger screen. They also record the images with a high-resolution digital video camera capable of taking thousands of 1-microsecond-long exposures each second.

A particular advantage of digital video in place of bulky film cameras is that shadowgraphy setups have suddenly become portable. The screen, now the bulkiest piece of the equipment, rolls up for transport.

Putting that road-worthiness to the test for the first time, the Penn State team took its gear to a U.S. Army lab in Aberdeen, Md., last summer to video explosions in aircraft-luggage containers, as part of a study for the Transportation Security Administration. Using protective shielding for their equipment and themselves, the researchers made images showing how the containers responded to explosions. They haven’t yet reported their results.

Settles described that fieldwork and other recent shadowgraphy developments in September 2005 at a visualization conference in Queensland, Australia.

Next, Settles’ team intends to use shadowgraphy to image shock waves unleashed in a full-scale, exploding airplane. The researchers have to wait until federally funded aviation engineers, responsible for fortifying aircraft against threats such as terrorist bombs and fuel-tank explosions, detonate charges in a retired Boeing 747 or another big jet.

The Department of Homeland Security periodically conducts such tests to assess recent antibomb modifications on commercial airliners, but the date for the next experimental blast hasn’t been set, Settles says.

For a test of a whole plane, Settles’ team would set up its screen so that shock waves could be observed emerging from the side of the plane. If the test is on a cross-section removed from the fuselage, as it often is, the researchers would place their light and screen on one side of the open portion of the fuselage and put their camera on the other to observe shock wave reverberations, Settles says.

No previous test has visualized shock waves during an explosion in an actual airplane, he notes. But his team is now ready with the first-ever, simple, portable means to do so.

The 10 August 2006 arrest in Britain of 24 terrorists bent on smuggling bomb components aboard airplanes and combining them en route is just the latest salvo in the Darwinian battle between developers of terrorist weaponry and those seeking to defeat them. The array of diabolical methods available to terrorists is truly terrifying, ranging from nuclear weapons and “dirty bombs” to biological and chemical weapons and explosives.

Detection and assessment of terrorist threats is generally possible today with enough time, money, and laboratory equipment, but the ideal technology would be fast, accurate, cheap, easy to use, and portable or able to remotely detect threats, with an emphasis on prevention. No technology now exhibits all these virtues, but under the pressure of terrorists’ inventiveness, researchers are working steadily to develop and apply improved systems.

Now researchers at Argonne National Laboratory are getting promising results from experiments using “T rays,” the terahertz (THz) part of the electromagnetic spectrum. In March 2006 Argonne announced that a research team there had shown for the first time that T rays can be used to identify explosives and poison gas precursors. The Argonne team also successfully used millimeter-wave radar to remotely detect airborne chemicals and the effects of radiation in the air. These results are currently being written up for publication. T rays and millimeter waves are at the low-energy end of the electromagnetic spectrum, between microwaves and infrared frequencies. According to Nachappa “Sami” Gopalsami, a senior electrical engineer at Argonne and a lead researcher on the THz sensor project, the general characteristics of T rays and millimeter waves are the same. “But,” he says, “new physics and phenomena are beginning to be explored as we move up in frequencies.” Although many detection techniques currently in use are based on electromagnetic radiation and mass spectrometry, T rays and millimeter waves have not previously been used in this context, mainly due to an inability to generate broadband pulses in these frequencies.
In the Argonne experiments, however, THz spectrometry sensors provided unambiguous identification of explosive chemicals, including TNT and plastic explosives. Gopalsami says this method is “highly specific” and will eliminate interference from confounding elements. The Argonne team has been collaborating with researchers at Dartmouth College, Sandia National Laboratory, Sarnoff Corporation, and AOZT Finn-Trade of St. Petersburg, Russia. Funding has come from the U.S. Air Force, the Department of Energy, and the Department of Defense.

But although national security imperatives are the driving force behind current research, many of the resulting technologies could also prove useful in environmental health applications. Being able to remotely detect and identify chemicals will be helpful in monitoring gas pipeline leaks, chemical plants, vehicle emissions, and the like. Gopalsami says the T ray technology can detect some of the most important environmental hazards, including ozone, volatile organics, and cyanide compounds. Medical applications, particularly imaging techniques for body tissues and teeth, are also in the offing, especially because the THz zone is on the opposite end of the electromagnetic spectrum from X rays, and thus of lower energy and far less damaging to living tissue.

Detection Difficulties

Technical problems plague many existing detection methods. For example, X rays can penetrate almost anything but can harm the object being studied, and in living organisms they may damage DNA and cause cancer. Laser and other optical instruments are less harmful, but their performance can be affected by wind, humidity, fog, and smoke. Just tracking terrorists’ movements is a nightmare. In a paper presented at the March 2002 Conference on Technology for Preventing Terrorism, David Dye of Lawrence Livermore National Laboratory noted that the United States has 7,606 miles of land border and some 12,452 miles of coastline. Further, Dye reported, 633.7 million people entered the United States at the nation’s 361 ports of entry. Even in the months just after September 11, the Coast Guard boarded only about 35% of the 5,112 vessels entering U.S. ports. Wrote Dye, “The government simply cannot perform 633.7 million hand searches every year, no matter how great the threat.”

“Our biggest concern is explosives,” says Nico Melendez, a spokesman for the Transportation Security Administration (TSA). Melendez says the TSA started airport screening for explosives using what’s called an “air shower” system in the summer of 2004. In this system, passengers step into a booth-like portal that releases puffs of air aimed at their clothing and skin. An air sample is then collected and analyzed by an ion mobility spectrometer, which compares the air’s components against a database containing spectrographic profiles of target chemicals such as TNT, C-4, and Semtex. According to a 24 May 2006 press release from the Port of Portland (Oregon), 28 airports in the United States are now using air shower portals. THz waves are also useful for passenger screening because they can penetrate beneath clothing to detect hidden weapons.

Peter Adrian, a senior analyst with business consultancy Frost & Sullivan, says, “One of the historical problems with gas sensors [including ion mobility spectrometers] is that they can be affected by extraneous environmental factors.” Conventional mobility spectrometers searching for explosives and trace levels of chemical warfare agents can’t always pick the target signal out of the “noise” of the many other chemicals in the environment, such as perfumes, and may be susceptible to false positives, causing delays and passenger frustration. Faster and more accurate identification of questionable materials is crucial to effective protection from terrorism. With too many false positives, people will become desensitized to the danger. At the same time, a false negative means the system has failed, with potentially devastating consequences. The TSA is currently funding Argonne research into replacing the ion mobility spectrometer with THz spectrometry, says Gopalsami, who adds that with proper funding the device could be taken into the field in two years.

Putting T Rays to the Task

Argonne’s THz spectrometry technology measures the rotation of a molecule in the vapor or gas phase. Every molecule’s rotational pattern is unique, and exciting a molecule with T ray frequencies reveals the “fingerprint” for that molecule. A spectral identification algorithm uses the information to determine the specific compound being examined by matching it with a spectral library. One disadvantage of THz spectrometry, says Gopalsami, is that to be detected a molecule must be polar, or asymmetrical; methane, for example, cannot be detected this way because it is nonpolar, or symmetrical. Quick and accurate identification of a molecule is easiest when the molecules are rotating unimpeded in gas or vapor form under pressures well below normal atmospheric pressure, so that collisions between molecules are decreased. This is easy to establish in a laboratory, but difficult in field conditions. However, the Argonne researchers were able to overcome this handicap with millimeter-wave frequencies, which are less sensitive to atmospheric conditions; their longer wavelengths (relative to cloud particles) cause less reflection and scattering of the millimeter waves. “Additionally,” says Gopalsami, “there are gaps or windows in the millimeter-wave spectrum in which common molecules in air are mostly transparent to the millimeter waves.” Using millimeter-wave frequencies, the team identified airborne poison gas chemicals from 60 meters away and chemicals related to nuclear weapons from 600 meters.

A major issue for counterterrorist sensor development is whether a sensor must have a physical sample or whether it can detect and analyze a substance at a distance. The former are called “point sensors,” and the latter are “remote” or “standoff” detectors. Chemical, biological, and explosive materials generally require a point sensor. However, in an experiment with AOZT Finn-Trade, the Argonne team was able to tell when a nuclear power plant 9 kilometers away was in operation or idle by measuring radiation-induced changes in the air around the plant. Those changes were observable using microwave radar, but the team is also experimenting with millimeter-wave radar to achieve higher sensitivity of detection.

Bioweapons also pose serious risks, and the development of sensors capable of rapid remote detection has been slow. The litany of known and possible biological agents is frightening, among them the pathogens that cause smallpox, anthrax, plague, and Ebola hemorrhagic fever. Further, in an article in the 2006 special issue of EMBO Reports, authors Jonathan Tucker and Craig Hooper described how advances in protein engineering could make so-called fusion toxins another front-runner for terrorists. These custom-made “designer” poisons unite two or more naturally occurring toxins, such as ricin and botulinum, to create a toxin significantly more toxic than either parent. Not only that, but unless counterterrorist researchers can stay abreast of possible combinations, a fusion toxin could be invisible to a sensor looking for a match in a preexisting

library. For bioweaponNoun1.bioweapon – any weapon usable in biological warfare; “they feared use of the smallpox virus as a bioweapon”
bioarm, biological weapon

anthrax bacillus, Bacillus anthracis – a species of bacillus that causes anthrax in humans and in animals (cattle

detection, Argonne researchers are working on a sensor based on dielectric properties of molecules. Dielectric materials are nonconducting and exhibit a complex property called a dielectric constant that can be measured by resonatorresonator /res·o·na·tor/ (rez´o-na?ter)1. an instrument used to intensify sounds.

2. an electric circuit in which oscillations of a certain frequency are set up by oscillations of the same frequency in another

techniques. Furthermore, they resonate at particular frequencies. DNA appears to resonate strongly in the THz region; therefore, the dielectric approach may eventually enable early detection of biological molecules without the use of more complex and much slower biochips that rely on analytical tools such as polymerase chain reactionpolymerase chain reaction(pŏl`ĭmərās’) (PCR), laboratory process in which a particular DNA segment from a mixture of DNA chains is rapidly replicated, producing a large, readily analyzed sample of a piece of DNA; the process is . As new technologies are developed, they will not necessarily eliminate older methods. “It’s hard to make a categorical statement that one approach is better than the others,” says Dye. Because the range of terrorist weapons is so broad, he adds, “You’ll end up with niche applications.” In the swirl of national security challenges, however, using a new part of the electromagnetic spectrum offers rich promise for thwarting the terrorist arsenal–and likely will produce benefits for environmental health as well. Suggested Reading Brower JL. 2005. The Terrorist Threat and Its Implications for Sensor Technologies. Presented at: Advances in Sensing with Security Applications; 17-30 July 2005; Il Ciocco, Italy. Available: http://www.nato-asi.org/sensors2005/papers/brower.pdf. Dye DH. 2002. Sensors for screening and surveillance. Presented at: Conference on Technology for Preventing Terrorism; 12-13 March 2002; Stanford, CA. Report UCRL-JC-147479. Gopalsami N, Raptis AC. Resonance-enhanced dielectric sensing of chemical and biological species. Presented at: Biological Threat Reduction Conference 2003; 19-21 March 2003; Albuquerque, NM. Siegel PH. 2004. Terahertz technology in biology and medicine. IEEE (Institute of Electrical and Electronics Engineers, New York, http://www.ieee.org) A membership organization that includes engineers, scientists and students in electronics and allied fields. 
Trans Microw Theory Tech 52(10):2438-2447. Tucker JB, Hopper C. 2006. Protein engineering: security implications. EMBO Rep 7(SI):S14-S17.
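
The spectral-identification step described earlier (matching a measured rotational fingerprint against a library of known spectra) can be sketched in a few lines. The frequency grid, line positions, and compound names below are invented for illustration; they are not Argonne's actual library or algorithm:

```python
import numpy as np

# Common frequency grid for all spectra (illustrative values, in THz).
FREQS = np.linspace(0.1, 3.0, 300)

def gaussian_lines(centers, widths, amps):
    """Build a synthetic absorption spectrum from a few rotational lines."""
    spec = np.zeros_like(FREQS)
    for c, w, a in zip(centers, widths, amps):
        spec += a * np.exp(-0.5 * ((FREQS - c) / w) ** 2)
    return spec

# Hypothetical spectral library: compound name -> reference spectrum.
LIBRARY = {
    "water":   gaussian_lines([0.56, 1.10, 1.67], [0.02] * 3, [1.0, 0.8, 0.6]),
    "ammonia": gaussian_lines([0.57, 1.21, 2.00], [0.02] * 3, [0.9, 0.7, 0.5]),
    "agent_x": gaussian_lines([0.80, 1.45, 2.30], [0.02] * 3, [1.0, 0.9, 0.4]),
}

def identify(measured, library=LIBRARY, threshold=0.9):
    """Return the best-matching compound by cosine similarity, or None."""
    best_name, best_score = None, 0.0
    m = measured / np.linalg.norm(measured)
    for name, ref in library.items():
        score = float(np.dot(m, ref / np.linalg.norm(ref)))
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

# A noisy measurement of "agent_x" should still match its fingerprint.
rng = np.random.default_rng(0)
measured = LIBRARY["agent_x"] + rng.normal(0, 0.02, FREQS.size)
print(identify(measured))
```

Real systems match line positions and intensities against measured databases rather than using a simple cosine score, but the library-lookup structure is the same.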

Walking the floor of WINDPOWER 2008, the annual conference and trade show for the wind energy industry, one couldn’t help but be transfixed by all of the different types of turbines – at least I couldn’t. The wind turbine has become the icon of clean, renewable energy. The classic three-bladed horizontal-axis wind turbine, with its gracefully swooping blades, has become a symbol not only of renewable energy, but also of environmental consciousness and ecological possibility.

Despite the ubiquity of the three-bladed turbine, the oft-overlooked vertical-axis turbines are making quite a splash in the world of wind energy, especially in small and micro applications. So what’s all the fuss about? Vertical-axis turbines apparently do not suffer from some of the problems that plague small wind applications in urban settings, including aesthetic concerns, space requirements and sound levels.

Other advantages of vertical-axis turbines:

Can produce up to 50% more electricity per year than conventional turbines with the same swept area;

Generate electricity at much lower wind speeds, as low as 4 mph (1.8 m/s);

Will continue to generate power in high wind speeds, up to 130 mph (60 m/s), depending on the model;

Direct-drive units with no gearbox means a more efficient transfer of energy and no leaking oil;

Will not harm wildlife, in terms of bird and bat strikes.
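
The cut-in and cut-out figures in the list above can be put in context with the standard wind-power equation, P = ½ρAv³Cp. The efficiency coefficient, swept area, and cut-in/cut-out thresholds below are illustrative assumptions, not specs of any turbine at the show:

```python
AIR_DENSITY = 1.225  # kg/m^3, standard sea-level air

def turbine_power(wind_speed, swept_area, cp=0.35, cut_in=1.8, cut_out=60.0):
    """Electrical power (W) from P = 0.5 * rho * A * v^3 * Cp.

    cut_in/cut_out are in m/s; outside that range the turbine produces
    nothing. All parameter values here are illustrative assumptions.
    """
    if wind_speed < cut_in or wind_speed > cut_out:
        return 0.0
    return 0.5 * AIR_DENSITY * swept_area * wind_speed ** 3 * cp

# Power grows with the cube of wind speed: doubling the wind gives
# eight times the power for the same swept area.
p5 = turbine_power(5.0, swept_area=10.0)
p10 = turbine_power(10.0, swept_area=10.0)
print(p5, p10, p10 / p5)  # ratio is 8.0
```

The cubic dependence is why a low cut-in speed matters so much for urban sites, where winds are usually weak.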

Below, I’ll cover some more basic differences and show you a few photos and short videos of some of these turbines I saw down in Houston at WINDPOWER 2008.

The designers from Taiwanese start-up A.N.I.T.A. Energy (pictured above) showed me why their models have a low start-up wind speed: the light metal bands you can see surrounding the turbine itself. Apparently this design allows users with a less substantial wind resource (particularly those in urban applications) to squeeze some electricity from the local winds. The larger model pictured above (and in the second video below) is scalable: it can be stacked as many as three high and integrated with the rooftops of large buildings.

Unlike three-bladed designs, vertical-axis turbines do not need to “right themselves” into the wind; they are always in a fixed position in terms of their orientation. A few of the models I saw, most notably the designs from the Korea-based KR Windpower (video above), had a manifold-type device that would swing around and funnel more wind into the turbine from the direction where the wind was strongest.


Other designs I saw scattered throughout the show floor integrated small solar and wind together on the same unit (top photo and photo below). The unit below integrates both solar and wind onto a single 400W streetlight platform.

The turbine itself is a “GUS” from a company called Tangarie, which features a reflective coating that reduces glare and can even be slathered with an advertisement or the state flag of Texas, as is the one below. The complete solar/wind/streetlight/pole package (made by another company altogether) costs about $7,000, not including installation.


Concrete that maintains itself by healing cracks improves the sustainability of infrastructure through its longer service life and lower maintenance inputs. Now researchers have developed a flexible, self-healing cement that won’t suffer catastrophic failure when strained in an earthquake.

We are so used to seeing maintenance teams working on our concrete buildings and structures that this expensive, carbon- and energy-intensive operation is taken for granted. But what if these structures could maintain themselves, just as our bodies do when they heal cuts and scrapes? Dr Victor C. Li of the University of Michigan felt that this could be achieved through the development of new concretes. Self-healing cement is not a new idea; there is evidence of healing even in some ancient Roman buildings. However, it very seldom occurs with modern concrete and concrete designs.

Very small cracks, preferably less than a tenth of a millimetre (four thousandths of an inch) wide, are able to heal themselves. This happens when cement particles that were previously unexposed to air and water become exposed on the surfaces of the crack. These particles react with water and CO2 in the air to form sufficient calcium carbonate to fill the crack and bond both sides. These healed cracks appear as white scars on the concrete surface. The healed concrete has properties which are equal to or better than those of the concrete before it was stressed and cracked.

Dr Li and his team developed a bendable, engineered cementitious composite (ECC) that is much more flexible than traditional concrete. Traditional concrete can suffer catastrophic failure when strained in an earthquake or by routine overuse, but flexible ECC bends without breaking. Its properties are a result of specially coated reinforcing fibers that hold it together.

The beauty of ECC structures would be that most maintenance would occur continually, without intervention, so long as the concrete is moistened periodically. Also, the concrete doesn’t require any reinforcing steel, so the corrosion problem that has plagued the USA’s reinforced concrete infrastructure is essentially eliminated.

ECC can also be used to make buildings, bridges and roads more resistant to earthquakes, explosions and other extreme stresses, thereby protecting lives. It will increase the service life of infrastructure and vastly reduce maintenance. This will have a significant effect on infrastructure’s lifecycle carbon and energy footprints, improving the sustainability of the world’s infrastructure.

nytimes.com —Chemists have been trying for years to develop self-healing polymer coatings for use on cars, furniture and other objects. Some recent efforts use microspheres containing bonding chemicals. These tiny capsules are embedded in the coating. When a crack or scratch occurs, the spheres break and the chemicals flow into the void, patching it.

Biswajit Ghosh and Marek W. Urban of the University of Southern Mississippi have come up with another approach, which they describe in a paper in Science. With their method, what breaks is not a sphere, but a ring-shaped chemical, oxetane, that is incorporated in the polyurethane polymer. Another compound in the polymer, chitosan, forms cross-links at the places where the oxetane breaks, healing the scratch.

What makes the method potentially very useful is what causes the cross-links to form: exposure to ultraviolet light. That means that a damaged coating could heal itself in a matter of minutes or hours by being exposed to sunlight, which contains plenty of UV rays. If your car picks up a scratch at the parking garage, for instance, it might disappear by the time you arrive home.

Another potential advantage is that the coating does not require space-age materials. Chitosan is a derivative of chitin, which gives structure to the shells of shrimp, lobster and other crustaceans. So there’s plenty of it around.

Glendale, California

A portion of this large chemical facility was damaged during the Northridge Earthquake. Horizontal cracks at and near mid-height of the walls indicated that the original vertical reinforcement in the tilt-up panels was insufficient.

A major concern of the owner was that they did not want the repair to affect the operation of the plant at other undamaged buildings. This meant that the large number of pipes passing through the damaged building had to remain in place and operational during the repair.
The wall panels were strengthened with glass fabric placed in vertical strips. Both inside and outside surfaces of the walls were strengthened. The flexibility of the fabrics allowed them to be passed through small openings and behind the pipes, eliminating the need to dismantle any pipes.
The finished walls were coated with a UV-protective coating. Over 20,000 ft² of fabric was used on this project, which was completed in October 1994.

Seismic isolation is a construction method for protecting buildings, in which the building and ground are separated by an isolation system to limit the transmission of vibrations through the building.
It reduces the earthquake force and changes it to a slow vibration, so not only the building, but also everything inside is protected.
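
The change to a “slow vibration” can be illustrated with a single-degree-of-freedom model: isolation bearings lower the lateral stiffness, which lengthens the natural period T = 2π√(m/k) and moves the building away from the frequencies where typical ground shaking carries most of its energy. The mass and stiffness values below are made-up illustrative numbers, not real building data:

```python
import math

def natural_period(mass_kg, stiffness_n_per_m):
    """Natural period T = 2*pi*sqrt(m/k) of a single-degree-of-freedom model."""
    return 2 * math.pi * math.sqrt(mass_kg / stiffness_n_per_m)

mass = 1.0e6  # kg, illustrative building mass

# The same building modeled with a stiff fixed base vs. soft isolation
# bearings: the isolated case has a much longer (slower) period.
fixed_base = natural_period(mass, stiffness_n_per_m=4.0e8)
isolated = natural_period(mass, stiffness_n_per_m=1.0e7)
print(f"fixed base: {fixed_base:.2f} s, isolated: {isolated:.2f} s")
```

With these toy numbers the period stretches from roughly a third of a second to about two seconds, which is the period-lengthening effect the paragraph above describes.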

Earthquake resistant construction

Although earthquake-resistant construction can protect the structure of a building, there is still a possibility of secondary accidents due to falling furniture and damage to facades.

Seismic isolation provides numerous benefits.

Safety improvement
Damage not only to the building itself but also to interior facilities is restricted; gas and water leaks are prevented, as are secondary accidents due to falling furniture, and human lives are protected.

Maintenance of function
The function of buildings can be ensured even after a major earthquake and life can continue as normal.

Protection of property
The difficulties of repair, reinforcement, demolition and rebuilding were widely experienced after the Great Hanshin Earthquake. Seismic isolation reduces these concerns.

Increased flexibility
Earthquake input to the building can be considerably reduced while design flexibility is increased.
The precast construction method simplifies the whole structure, including junctions, and reduces the weight of members.

Improvement of relief
Fear of earthquakes can be alleviated and the psychological burden reduced.
Evacuation routes are secured after an earthquake.

Economic effect of seismic isolation
Considering the improved safety during an earthquake and the reduced repair costs afterward, seismic isolation can reduce life-cycle cost.

But this was no ordinary earthquake. In a groundbreaking series of tests, engineering researchers from UC San Diego’s Jacobs School of Engineering jarred a full-size 275-ton building erected on a shake table, duplicating ground motions recorded during the January 17, 1994 Northridge earthquake in Los Angeles, California.

To record the impact on the building, the structure was fitted with some 600 sensors and filmed as the shake table simulated the earthquake, yielding a flood of data including stress, strain, and acceleration — so much information that engineers were having a hard time making sense of it all.

That’s where visualization experts from the San Diego Supercomputer Center (SDSC) at UC San Diego came in.

“By recreating the shake table experiment in movies in a virtual environment based on the observed data, this lets engineers explore all the way from viewing the ‘big picture’ of the entire building from a 360-degree viewpoint to zooming in close to see what happened to a specific support,” said SDSC visualization scientist Amit Chourasia. “Integrating these disparate data elements into a visual model can lead to critical new insights.”

Added José Restrepo, a professor of structural engineering at UCSD, “These visualizations give us an intuitive way to see how the building behaves in our shake table experiments — this tool will be very valuable in helping us understand the tests in ways we can’t from traditional approaches, and also in sharing this research with other engineers and the public.”

The costliest quake in U.S. history, the magnitude 6.7 Northridge event prompted calls for more scientific evaluation of structural elements, leading the engineers to conduct the building tests on one of the world’s largest shake tables at UCSD’s outdoor Englekirk Structural Engineering Center.

A paper by Chourasia describing the project was published in a special graphics issue of ACM Crossroads, the student journal of the Association for Computing Machinery.

In addition to helping engineers understand the earthquake’s impact on the building, the visualizations can also give researchers a tool to do “what if” virtual experiments.

“We found that the recorded motion aligns very well with the movie we created,” said Steve Cutchin, director of Visualization Services at SDSC. “This is important because knowing the model is accurate means it can be used to take simulated earthquake data and predict the sensor values — you can ask, ‘What if a larger 7 point earthquake hits?’ and simulate how the building will shake in response.”

To make the visualizations more useful and provide a rich visual context, the researchers wanted to incorporate recognizable elements from the surroundings, which meant integrating features from the actual video footage recorded during the test. “Our goal was to have fidelity not only in rendering quality but also in visual realism,” said Chourasia. In addition, the integrated video would let the researchers validate this virtual reconstruction visually.

Once Chourasia and his colleagues had developed the building model and animated the deformation caused by the shaking, they worked to align the virtual camera and lighting with the real-world video camera so that the scene would match between the recorded footage and the virtual version.

“However, when we tried to composite the actual video footage, we found that the instruments had sampled the data at 50 Hz but the video was recorded at 29.97 Hz,” Chourasia explained. “And there wasn’t any timing synchronization between the building sensors and camera.” This posed a serious hurdle for compositing.

“After viewing the video footage, we noticed that the recording also contained audio data, because the moving building and shake table make noise, and this proved to be the key.” By “listening to the building” and analyzing the audio and sensor signals, the researchers were able to synchronize the video and instrument data for the visualization.
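
The article does not give the actual synchronization algorithm; a common approach to this kind of problem is to resample both signals onto a shared rate and find the lag that maximizes their cross-correlation. A minimal NumPy sketch with synthetic stand-ins for the sensor and audio data:

```python
import numpy as np

def find_offset(sig_a, rate_a, sig_b, rate_b, common_rate=100.0):
    """Estimate the time offset (s) of events in sig_a relative to sig_b
    by resampling both onto a common rate and maximizing cross-correlation."""
    def resample(sig, rate):
        t_old = np.arange(len(sig)) / rate
        t_new = np.arange(0.0, t_old[-1], 1.0 / common_rate)
        return np.interp(t_new, t_old, sig)

    a = resample(np.asarray(sig_a, float), rate_a)
    b = resample(np.asarray(sig_b, float), rate_b)
    a = a - a.mean()
    b = b - b.mean()
    corr = np.correlate(a, b, mode="full")
    lag = int(np.argmax(corr)) - (len(b) - 1)
    return lag / common_rate

# Synthetic example: a shared "event" (a Gaussian pulse) appears at
# t = 3.0 s in a 50 Hz sensor stream and at t = 4.2 s in a 29.97 Hz
# video-frame-rate track, so the streams are misaligned by 1.2 s.
pulse = lambda t, t0: np.exp(-0.5 * ((t - t0) / 0.2) ** 2)
t_sensor = np.arange(0, 10, 1 / 50.0)
t_video = np.arange(0, 10, 1 / 29.97)
offset = find_offset(pulse(t_sensor, 3.0), 50.0,
                     pulse(t_video, 4.2), 29.97)
print(round(offset, 2))  # ≈ -1.20: the sensor event leads the video event
```

The real data would use the recorded audio track (which shares timing with the video frames) against a sensor channel, but the alignment idea is the same.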

In the future, the visualization researchers plan to develop lighting models for more realistic rendering and to find automated ways to match the real and virtual cameras. They are also distilling lessons learned from this study into requirements for a visualization workbench for analysis of the dissimilar types of data that come from structural and seismic experiments.

In addition to providing visualization services for the shake table experiments, SDSC is also home to the NEESit Services Center (NEESit) which is developing and maintaining a state-of-the-art grid to meet the cyberinfrastructure needs of the earthquake engineering community, including the UCSD facility and 14 other collaborating experimental sites across the nation.


University of California – San Diego (2007, April 12). Virtual Research On Earthquake Resistant Structures. ScienceDaily. Retrieved April 25, 2009, from http://www.sciencedaily.com­/releases/2007/04/070411135030.htm


Bridging The Gap In Nanoantennas

ScienceDaily (Apr. 25, 2009) — In a recent publication in Nature Photonics, a joint team of researchers at CIC nanoGUNE, Donostia International Physics Center DIPC, Centro de Física de Materiales of CSIC/UPV-EHU in San Sebastian (Spain), Harvard University (USA) and the Max Planck Institute of Biochemistry in Munich (Germany) reports an innovative method for controlling light on the nanoscale by adopting tuning concepts from radio-frequency technology.

The method opens the door for targeted design of antenna-based applications including highly sensitive biosensors and extremely fast photodetectors, which could play an important role in future biomedical diagnostics and information processing.

An antenna is a device designed to transmit or receive electromagnetic waves. Radio frequency antennas find wide use in systems such as radio and television broadcasting, point-to-point radio communication, wireless LAN, radar, and space exploration. In turn, an optical antenna is a device which acts as an effective receiver and transmitter of visible or infrared light. It has the ability to concentrate (focus) light to tiny spots of nanometer-scale dimensions, several orders of magnitude smaller than what conventional lenses can achieve. Tiny objects such as molecules or semiconductors that are placed into these so-called “hot spots” of the antenna can efficiently interact with light. Therefore optical antennas can boost single-molecule spectroscopy or improve signal-to-noise in detector applications.

In their experiments the researchers studied a special type of infrared antennas, featuring a very narrow gap at the center. These so called gap-antennas generate a very intense “hot spot” inside the gap, allowing for highly efficient nano-focusing of light. To study how the presence of matter inside the gap (the “load”) affects the antenna behavior, the researchers fabricated small metal bridges inside the gap (Figure b). They mapped the near-field oscillations of the different antennas with a modified version of the scattering-type near-field microscope that the Max Planck and nanoGUNE researchers had pioneered over the last decade.

For this work, they chose dielectric tips and operated in transmission mode, allowing them to image local antenna fields in detail as small as 50 nm without disturbing the antenna. “By monitoring the near-field oscillations of the different antennas with our novel near-field microscope, we were able to directly visualize how matter inside the gap affects the antenna response. The effect could find interesting applications for tuning of optical antennas,” says Rainer Hillenbrand, leader of the Nanooptics group at the newly established research institute CIC nanoGUNE Consolider.

The nanooptics group from DIPC and CSIC-UPV/EHU led by Javier Aizpurua in San Sebastián fully confirmed and helped to understand the experimental results by means of full electrodynamic calculations. The calculated maps of the antenna fields are in good agreement with the experimentally observed images. The simulations add deep insights into the dependence of the antenna modes on the bridging, thus confirming the validity and robustness of the “loading” concept to manipulate and control nanoscale local fields in optics.

Furthermore, the researchers applied well-developed radio-frequency antenna design concepts to visible and infrared frequencies, and explained the behavior of the loaded antennas within the framework of optical circuit theory. A simple circuit model showed remarkable agreement with the results of the numerical calculations of the optical resonances. “By extending circuit theory to visible and infrared frequencies, the design of novel photonic devices and detectors will become more efficient. This bridges the gap between these two disciplines,” says Javier Aizpurua.
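
The circuit-theory idea can be illustrated with the simplest lumped-element model: a resonator whose frequency f = 1/(2π√(LC)) shifts as the load changes. This is only a toy sketch of the concept, not the paper's actual model, and the inductance and capacitance values below are invented for illustration:

```python
import math

def resonant_frequency(L_h, C_f):
    """Resonance f = 1/(2*pi*sqrt(L*C)) of a lumped LC circuit."""
    return 1.0 / (2 * math.pi * math.sqrt(L_h * C_f))

# Toy numbers for a nanoscale "circuit": loading the antenna gap changes
# the effective capacitance, which shifts the resonance. This shift is
# the tuning knob the gap-loading concept exploits.
L = 4e-14  # henry, illustrative
for C in (1e-18, 2e-18, 4e-18):  # farad, illustrative gap loads
    f_thz = resonant_frequency(L, C) / 1e12
    print(f"C = {C:.0e} F -> f = {f_thz:.0f} THz")
# Larger load capacitance -> lower (redder) resonance frequency.
```

In the experiment the “load” is the metal bridge inside the gap rather than a literal capacitor, but the same circuit intuition predicts how the antenna modes shift.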

With this work, the researchers provide the first experimental evidence that the local antenna fields can be controlled by gap loading. This opens the door to designing near-field patterns at the nanoscale by load manipulation, without the need to change the antenna length, which could be highly valuable for the development of compact and integrated nanophotonic devices.

The bottom row depicts the topography, whereas the upper row shows the scanned near-field images. Figure a shows a metal nanorod that can be considered the simplest dipole antenna. The near-field image clearly shows the dipolar oscillation mode, with positive fields in red and negative fields in blue. By introducing a narrow gap at the center of the nanorod, thus altering the “antenna load” (Figure c), two dipolar-like modes are obtained. When the gap is connected with a small metal bridge (Figure b), the dipole oscillation mode of Figure a is restored, as the near-field image clearly reveals. (Credit: Martin Schnell/Copyright CIC nanoGUNE)

