Physics

10/26/2015

The outside of the Wendelstein 7-x stellarator with its conglomeration of equipment, ports, and supporting structure (Click Image To Enlarge)

In a large complex located at Greifswald in the north-east corner of Germany sits a new and unusual nuclear fusion reactor awaiting a few final tests before being powered up for the very first time. Dubbed the Wendelstein 7-X fusion stellarator, it has been more than 15 years in the making and is claimed to be so magnetically efficient that it will be able to continuously contain super-hot plasma in its enormous magnetic field for more than 30 minutes at a time. If successful, this new reactor may help realize the long-held goal of continuous operation, essential for the success of nuclear fusion power generation.

Created by the Max Planck Institute for Plasma Physics (IPP) and designed with the aid of a supercomputer, the Wendelstein 7-x is the first large-scale optimized stellarator of its type ever to be commissioned. With a name like something out of Hitchhiker's Guide to the Galaxy and a containment vessel that literally provides a new twist on the doughnut shape we see in standard tokamak fusion reactors, the quirky stellarator design aims to provide an inherently more stable environment for plasma and a more promising route for nuclear fusion research in general.

Initially an American design conceived by Lyman Spitzer working at Princeton University in 1951, the stellarator was deemed too complex for the constraints of materials available in the middle of the 20th Century, and the more easily constructed toroid of the tokamak won out as the standard model for fusion research.

Though some stellarators have been constructed over the course of time – notably the predecessor to this latest iteration known as the Wendelstein 7-AS (Advanced Stellarator) – the calculations required to ensure ultimate plasma containment and control have only become possible with the advent of supercomputers.

As such, algorithms specifically created to fuse theory and practice have now been applied to the design of the Wendelstein 7-x, and its designers firmly believe that this latest version will have the stability required to be the precursor machine to full-blown, continuous nuclear fusion power generation.

For the eventual success of nuclear fusion power (essentially where two isotopes of hydrogen, deuterium and tritium, are subjected to such energy that the electrostatic repulsion between their nuclei is overcome and they fuse to form helium, releasing copious amounts of energy, most of it carried by a fast neutron), stability is essential. This is because the enormous pressures and temperatures (around 100 million degrees Celsius (180 million °F)) used to create the plasma, and then accelerate the resulting ion and electron soup around the containment vessel, mean that any instability in the magnetic containment field or the pressure vessel itself will result in degradation and ultimately the failure of the process.
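
As a quick sanity check on the numbers involved (standard textbook masses, nothing specific to Wendelstein 7-X), the energy released by a single deuterium-tritium fusion can be estimated from the mass defect; a minimal Python sketch:

# Back-of-the-envelope check of the D + T -> He-4 + n energy yield, using
# standard atomic masses (unified atomic mass units; 1 u = 931.494 MeV/c^2).
masses_u = {"D": 2.014102, "T": 3.016049, "He4": 4.002602, "n": 1.008665}
mass_defect_u = (masses_u["D"] + masses_u["T"]) - (masses_u["He4"] + masses_u["n"])
q_value_mev = mass_defect_u * 931.494
print(f"Energy released per D-T fusion: {q_value_mev:.1f} MeV")  # about 17.6 MeV
# Roughly 14.1 MeV of that is carried away by the neutron, 3.5 MeV by the helium nucleus.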

What is the concept underlying the Wendelstein 7-X fusion device? This video, produced from various CADs, illustrates how the device is configured and what objectives are being pursued by the fusion research conducted at the Greifswald branch of Max Planck Institute for Plasma Physics with Wendelstein 7-X.

To achieve a more stable environment, the stellarator eschews the method of inducing current through the plasma to drive electrons and ions around the inside of the vessel as found in tokamak designs, instead relying entirely on external magnetic fields to move the particles along. In this way, stellarator designs are basically immune to the sudden and unexpected disruptions of plasma and the enormous – and often destructive – magnetic field collapses that sometimes occur in tokamaks.

A stellarator, by contrast, holds the plasma in a containment field that twists through a set of magnetic coils, continuously keeping the plasma away from the walls of the device. The twist matters because, in a normal tokamak, with its doughnut-shaped containment vessel and electromagnet windings that loop through the center of the toroid and around the outside, the magnetic field is stronger toward the central column than it is on the outer side. This means that plasma contained in a tokamak tends to drift toward the outer walls, where confinement eventually breaks down.
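
As a rough illustration of that asymmetry (a textbook formula with invented coil numbers, not the specifications of any real machine), the field of an ideal toroidal coil falls off with major radius, so the inboard side of the ring always sees a stronger field than the outboard side:

from math import pi

MU0 = 4e-7 * pi  # vacuum permeability, T*m/A

def toroidal_field(n_turns, current_a, radius_m):
    """Ideal toroidal-coil field, B = mu0 * N * I / (2 * pi * R)."""
    return MU0 * n_turns * current_a / (2 * pi * radius_m)

# Hypothetical coil set with 10,000 turns carrying 10 kA (illustrative only):
for radius_m in (4.0, 6.0):
    print(f"R = {radius_m} m: B = {toroidal_field(10_000, 1.0e4, radius_m):.2f} T")
# The inboard point (R = 4 m) sees a field about 50% stronger than the outboard point (R = 6 m).

It is this gradient, together with the curvature of the field lines, that pushes the plasma outward in a tokamak and must be compensated for; the stellarator does the compensating with the shape of its external coils alone.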

A graphic depicting the plasma flow (red) in the stellarator and its magnetic coils (blue) (Click Image To Enlarge)

The stellarator, on the other hand, avoids this situation by twisting the entire containment vessel into a shape that constantly forces the plasma stream into the center of the reactor vessel as it continuously encounters magnetic fields in opposing positions along its entire length.

The advantages of the stellarator over the tokamak come at a cost, however, as the many twists and turns that give the stellarator an advantage in magnetic containment also mean that many particles can simply be lost as they veer off course following the path of the containment vessel itself. To help avoid this, a great many more magnetic coils are required for the stellarator, and these must be set up at very close intervals around the structure and super-cooled with liquid helium for maximum efficiency.

Construction of the Wendelstein 7-X stellarator took over 1 million man-hours (Click Image To Enlarge)

In the case of the Wendelstein 7-X, the 50 non-planar superconducting electromagnets, each 3.5 meters (11.5 ft) tall, together weigh around 425 tonnes (468 tons), and their placement makes construction difficult and their assembly fraught with problems. Not to mention the fact that piping around vast quantities of liquid helium to ensure that the electromagnets superconduct at temperatures close to absolute zero makes the Wendelstein 7-X a plumber's nightmare, and a tricky addition to an already difficult balancing act.

As such, the physical design of the stellarator itself requires access ports for fuel ingress and egress, along with a myriad of other entry points for instruments, sensors, and all the other paraphernalia necessary to monitor the enormous pressures, voltages, and temperatures that it will be subjected to in operation.

Dr. Matthias Otte, who is responsible for the measurement process, reports:

“Once the flux surface diagnostics were placed in operation, we were immediately able to see the first magnetic surfaces. Our images clearly show how magnetic field lines create closed surfaces in many toroidal circulations”.

The flux surface diagnostics enables the structure of the field to be precisely measured. For this purpose, a thin electron beam is injected and moves along a field line in circular tracks through the evacuated plasma vessel. It leaves behind a tracer, which is created by collision of the electrons with residual gas in the vessel. If, in addition, a fluorescent rod is moved through the vessel cross section, light spots are created when the electron beam hits the rod. In the camera recording, the entire cross section of the magnetic field gradually becomes visible.

Despite all of these problems, tests on the completed stellarator to confirm the sub-millimeter accuracy of the plasma path are progressing and show promise. In one recent test, an electron beam was injected into the stellarator and progressed along a predetermined field line in circular tracks through the evacuated plasma vessel. As it moved through the machine, the beam left a tracer in its wake, created by collisions between the beam electrons and the residual gas in the vessel.

Photograph combining the tracer of an electron beam over its multiple circulations around the inside of the containment vessel (Click Image To Enlarge)

Meanwhile, as the electron beam constantly circulated through the system, a fluorescent rod was pushed transversely through the vessel in cross section, and when the electron beam struck the rod, visible spots of light were created and the results recorded with a camera. In this way, the whole cross section of the magnetic field was gradually made visible.
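
For readers curious how such a flux-surface image can be reproduced numerically, here is a minimal sketch (this is not IPP's code; the field model, rotational transform and radii are invented purely for illustration) that follows several model field lines and records where they pierce a fixed cross-sectional plane, building up the same kind of nested, closed surfaces the camera sees:

import numpy as np
import matplotlib.pyplot as plt

# Illustrative only: follow field lines of a model field with a fixed rotational
# transform iota, recording where each line crosses the phi = 0 plane. The dots
# trace out nested closed curves, mimicking the measured flux surfaces.
R0 = 5.5          # major radius of the model torus in metres (invented value)
iota = 0.97       # rotational transform: poloidal turns per toroidal turn (invented)
n_transits = 300  # number of toroidal circuits to follow for each field line

fig, ax = plt.subplots()
for r in (0.1, 0.2, 0.3, 0.4, 0.5):              # starting minor radii of the "beams"
    theta = 0.0
    r_points, z_points = [], []
    for _ in range(n_transits):
        theta += 2 * np.pi * iota                 # poloidal angle gained per toroidal turn
        r_points.append(R0 + r * np.cos(theta))
        z_points.append(r * np.sin(theta))
    ax.plot(r_points, z_points, ".", markersize=2)

ax.set_xlabel("R (m)")
ax.set_ylabel("Z (m)")
ax.set_aspect("equal")
ax.set_title("Poincare section of model field lines (illustration only)")
plt.show()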

"Once the flux surface diagnostics were placed in operation, we were immediately able to see the first magnetic surfaces. Our images clearly show how magnetic field lines create closed surfaces in many toroidal circulations."

Coil tests are conducted in the control room, the measured data from all test series are brought together and evaluated (Click Image To Enlarge)

Whilst in itself just another stepping stone toward the ultimate goal of practical fusion energy, the IPP stellarator marks an important juncture in the field. With tokamak-based reactors still requiring more energy in than they actually produce, the scientific community and general public alike have grown wary of the long-held promises surrounding nuclear fusion. And, though many bodies, such as the University of Washington, Lockheed Martin, and MIT, claim to be "close" to producing a working, sustainable, self-powering machine, nuclear fusion still remains a pipe dream.

This is where IPP's proving of the technology over the coming months, leading to a full-blown commissioning of the machine, may well provide the bridge between theory and practice. If it does not deliver on the promise of boundless energy, it should at least provide a proof of concept and renew flagging interest in a field that may, one day, solve all of our energy needs.

With approval to continue from nuclear regulators in Germany expected by the end of this month, the Wendelstein 7-X stellarator is slated for its first fully operational tests in November this year. At a cost of more than €1 billion (US$1.1 billion) and over 1 million man-hours of work committed so far, the hopes of Europe's future being a nuclear fusion-powered one may well rest on the ability of this machine to perform as expected. Watch this space.

COMMENTARY: The objective of fusion research like that being conducted by the Max Planck Institute for Plasma Physics (IPP) is to develop a power source that is friendly to the climate and the environment. Like the sun, it harvests energy from the fusion of atomic nuclei. To light the fusion fire in a future power station, the fuel – a hydrogen plasma – must be confined in magnetic fields and heated to a temperature of over 100 million degrees. The Wendelstein 7-X, which will be the largest stellarator-type fusion device in the world, will not produce energy but will enable the suitability of this type of device as a power station to be investigated. With plasma discharges lasting up to 30 minutes, it should demonstrate the stellarator's most significant property – its ability to operate continuously.

A ring of 50 superconducting magnetic coils, each approximately 3.5 metres in height, is the key component of the device. Cooled with liquid helium to a superconducting temperature close to absolute zero, the coils consume very little energy once switched on. Their special shapes are the result of refined optimisation calculations. Their task is to create a magnetic cage for the plasma with particularly good thermal-insulation properties.

In May 2014 the assembly of Wendelstein 7-X was completed on time and for over a year the preparations for operation have been under way. One by one, the operation of each technical system is being tested. From the end of April to the beginning of July 2015, attention was turned to the magnetic coils. As soon as the functional capability of these central system components was confirmed (see IPP Info 6/15), the testing of the magnetic surfaces was carried out. Configuration of the computer-supported data collection for the experimental operation is still to be carried out and in the periphery of the device the equipment for monitoring and heating the plasma requires completion. The objective: the Wendelstein 7-X should produce the first plasma this year.

Let's wish the physicists at IPP much success in taking the first step in the development of sustainable, self-powering, clean and efficient fusion energy. This sort of science was long thought to be impossible due to the ultra-high temperatures required to create fusion energy. The radical Wendelstein 7-X stellarator, with its wacky, twisty, donut-shaped containment vessel, appears to be viable in containing the super-hot plasma, according to early tests. We hope that fusion energy theory becomes reality during the first real test in November, and that there are no dangerous accidents. Would hate to see $1.1 billion blown to smithereens.

07/14/2015

Just minutes before the long-awaited flyby took place at 7:49AM ET, NASA "teased" the final full-frame color image of Pluto set to be released before the event by publishing it on Instagram. It was taken at about 4PM ET on July 13th, according to NASA, from 476,000 miles away. The high-resolution image was released after the flyby, and can be seen above.

Final image of the dwarf planet Pluto taken by the New Horizons spacecraft (Click Image To Enlarge)

In the above image, we can see the "heart" of Pluto in much greater detail than before, craters that were impossible to make out in previous images, and a great view of the dwarf planet's dark equatorial belt.

The New Horizons team celebrates the new image of Pluto (Click Image To Enlarge)

There are more images of the face of Pluto to come. The first true high-resolution mosaic image will be released tomorrow afternoon, and a few more will be released throughout the week. A much larger set will be released starting in September.

NASA jubilantly announces the successful flyby of Pluto by the New Horizons spacecraft with the following tweet:

Click Image To Enlarge

While the chance is around one in 10,000 that New Horizons will come into contact with debris during the flyby, spirits are high at mission control in Maryland. Ralph Semmel, director of the Johns Hopkins Applied Physics Laboratory, said.

"Tonight we're going to get the signal — and we will get the signal,"

NASA uploaded the following documentary video which details the journey of the New Horizons spacecraft from its early beginning to its flyby of Pluto.

The following infographic explains the mission of the New Horizons spacecraft beginning with its launch in 2006:

Click Image To Enlarge

COMMENTARY: It's incredible that after 3 billion miles and over 9 years, the New Horizons spacecraft was able to fly by the dwarf planet Pluto at a distance of about 2,700 miles from its surface. That is one incredible feat. I can hardly wait for those closeup images of the surface of Pluto. According to NASA, the pictures are being sent back to Earth using technology that existed nine years ago, so the process will be very slow and take nearly a year and a half to complete.
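
To see why that timescale is plausible, a back-of-the-envelope calculation helps; the data volume and downlink rate below are assumptions chosen for illustration, not official mission figures:

# Rough downlink estimate for the New Horizons data set. The numbers below are
# illustrative assumptions for this sketch, not official NASA figures.
data_gigabytes = 6.0        # assumed size of the stored flyby data set
rate_bits_per_sec = 1200.0  # assumed average downlink rate from Pluto's distance

total_bits = data_gigabytes * 8e9
seconds = total_bits / rate_bits_per_sec
days = seconds / 86400
print(f"~{days:.0f} days, i.e. roughly {days / 30.4:.0f} months")
# With these assumptions the transfer works out to about 15 months, on the
# order of the "nearly a year and a half" quoted above.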

I still find it hard to believe that Pluto was only discovered in 1930, when it appeared as a very faint and small speck in the vastness of outer space with thousands of stars in its background. BTW, some of the ashes of Clyde Tombaugh, the discoverer of Pluto, are carried on board New Horizons.

12/10/2013

The Nobel Prize in Physics 2013 was awarded jointly to François Englert (left) and Peter W. Higgs (right) "for the theoretical discovery of a mechanism that contributes to our understanding of the origin of mass of subatomic particles, and which recently was confirmed through the discovery of the predicted fundamental particle, by the ATLAS and CMS experiments at CERN's Large Hadron Collider"

Click Images To Enlarge

François Englert and Peter W. Higgs are jointly awarded the Nobel Prize in Physics 2013 for the theory of how particles acquire mass. In 1964, they proposed the theory independently of each other (Englert together with his now deceased colleague Robert Brout). In 2012, their ideas were confirmed by the discovery of a so-called Higgs particle at the CERN laboratory outside Geneva in Switzerland.

The awarded theory is a central part of the Standard Model of particle physics that describes how the world is constructed. According to the Standard Model, everything, from flowers and people to stars and planets, consists of just a few building blocks: matter particles. These particles are governed by forces mediated by force particles that make sure everything works as it should.

The entire Standard Model also rests on the existence of a special kind of particle: the Higgs particle. This particle originates from an invisible field that fills up all space. Even when the universe seems empty this field is there. Without it, we would not exist, because it is from contact with the field that particles acquire mass. The theory proposed by Englert and Higgs describes this process.

On 4 July 2012, at the CERN laboratory for particle physics, the theory was confirmed by the discovery of a Higgs particle. CERN’s particle collider, LHC (Large Hadron Collider), is probably the largest and the most complex machine ever constructed by humans. Two research groups of some 3,000 scientists each, ATLAS and CMS, managed to extract the Higgs particle from billions of particle collisions in the LHC.

Even though it is a great achievement to have found the Higgs particle — the missing piece in the Standard Model puzzle — the Standard Model is not the final piece in the cosmic puzzle. One of the reasons for this is that the Standard Model treats certain particles, neutrinos, as being virtually massless, whereas recent studies show that they actually do have mass. Another reason is that the model only describes visible matter, which only accounts for one fifth of all matter in the cosmos. To find the mysterious dark matter is one of the objectives as scientists continue the chase of unknown particles at CERN.

François Baron Englert was born in 1932 and is a Belgian theoretical physicist and 2013 Nobel Prize laureate (shared with Peter Higgs). He is Professor emeritus at the Université libre de Bruxelles (ULB), where he is a member of the Service de Physique Théorique. He is also a Sackler Professor by Special Appointment in the School of Physics and Astronomy at Tel Aviv University and a member of the Institute for Quantum Studies at Chapman University in California. He was awarded the 2010 J.J. Sakurai Prize for Theoretical Particle Physics (with Gerry Guralnik, C.R. Hagen, Tom Kibble, Peter Higgs and Robert Brout), the Wolf Prize in Physics in 2004 (with Brout and Higgs) and the High Energy and Particle Prize of the European Physical Society (with Brout and Higgs) in 1997 for the mechanism which unifies short and long range interactions by generating massive gauge vector bosons. He has made contributions in statistical physics, quantum field theory, cosmology, string theory and supergravity. He is the recipient of the 2013 Prince of Asturias Award in technical and scientific research, together with Peter Higgs and CERN.

Peter W. Higgs CH, FRS, FRSE was born in 1929 and is a British theoretical physicist, Nobel laureate and emeritus professor at the University of Edinburgh. He is best known for his 1960s proposal of broken symmetry in electroweak theory, explaining the origin of mass of elementary particles in general and of the W and Z bosons in particular. This so-called Higgs mechanism, which was proposed by several physicists besides Higgs at about the same time, predicts the existence of a new particle, the Higgs boson (which was often described as "the most sought-after particle in modern physics"). CERN announced on 4 July 2012 that they had experimentally established the existence of a Higgs-like boson, but further work is needed to analyse its properties and see if it has the properties expected from the Standard Model Higgs boson. On 14 March 2013, the newly discovered particle was tentatively confirmed to have positive parity and zero spin, two fundamental criteria of a Higgs boson, making it the first known fundamental scalar particle to be discovered in nature (although composite scalars such as the kaon had been observed over half a century prior). The Higgs mechanism is generally accepted as an important ingredient in the Standard Model of particle physics, without which certain particles would have no mass.

Nobel Prize in Chemistry for 2013

The Nobel Prize in Chemistry 2013 was awarded jointly to Martin Karplus (left), Michael Levitt (middle) and Arieh Warshel (right) "for the development of multiscale models for complex chemical systems".

Click Images To Enlarge

Chemists used to create models of molecules using plastic balls and sticks. Today, the modelling is carried out in computers. In the 1970s, Martin Karplus, Michael Levitt and Arieh Warshel laid the foundation for the powerful programs that are used to understand and predict chemical processes. Computer models mirroring real life have become crucial for most advances made in chemistry today.

Chemical reactions occur at lightning speed. In a fraction of a millisecond, electrons jump from one atomic nucleus to the other. Classical chemistry has a hard time keeping up; it is virtually impossible to experimentally map every little step in a chemical process. Aided by the methods now awarded with the Nobel Prize in Chemistry, scientists let computers unveil chemical processes, such as a catalyst’s purification of exhaust fumes or the photosynthesis in green leaves.

The work of Karplus, Levitt and Warshel is ground-breaking in that they managed to make Newton’s classical physics work side-by-side with the fundamentally different quantum physics. Previously, chemists had to choose one or the other. The strength of classical physics was that calculations were simple and could be used to model really large molecules. Its weakness was that it offered no way to simulate chemical reactions. For that purpose, chemists instead had to use quantum physics. But such calculations required enormous computing power and could therefore only be carried out for small molecules.

This year’s Nobel Laureates in chemistry took the best from both worlds and devised methods that use both classical and quantum physics. For instance, in simulations of how a drug couples to its target protein in the body, the computer performs quantum theoretical calculations on those atoms in the target protein that interact with the drug. The rest of the large protein is simulated using less demanding classical physics.

Today the computer is just as important a tool for chemists as the test tube. Simulations are so realistic that they predict the outcome of traditional experiments.

Martin Karplus was born in 1930 and is an Austrian-born American theoretical chemist. He is the Theodore William Richards Professor of Chemistry, emeritus at Harvard University. He is also Director of the Biophysical Chemistry Laboratory, a joint laboratory between the French National Center for Scientific Research and the University of Strasbourg, France. Karplus received the 2013 Nobel Prize in Chemistry, together with Michael Levitt and Arieh Warshel, for "the development of multiscale models for complex chemical systems".

Michael Levitt, FRS was born in 1947 and is an American-British-Israeli biophysicist and a professor of structural biology at Stanford University, a position he has held since 1987. His research is in computational biology and he is a member of the National Academy of Sciences. Levitt received the 2013 Nobel Prize in Chemistry, together with Martin Karplus and Arieh Warshel, for "the development of multiscale models for complex chemical systems".

Arieh Warshel (Hebrew: אריה ורשל) was born in 1940 and is an Israeli-American Distinguished Professor of Chemistry and Biochemistry at the University of Southern California. He received the 2013 Nobel Prize in Chemistry, together with Michael Levitt and Martin Karplus, for "the development of multiscale models for complex chemical systems".

Nobel Prize in Medicine for 2013

The Nobel Prize in Physiology or Medicine 2013 was awarded jointly to James E. Rothman (left), Randy W. Schekman (middle) and Thomas C. Südhof (right) "for their discoveries of machinery regulating vesicle traffic, a major transport system in our cells".

Click Images To Enlarge

The 2013 Nobel Prize was awarded jointly to three scientists who have solved the mystery of how the cell organizes its transport system. Each cell is a factory that produces and exports molecules. For instance, insulin is manufactured and released into the blood and signaling molecules called neurotransmitters are sent from one nerve cell to another. These molecules are transported around the cell in small packages called vesicles. The three Nobel Laureates have discovered the molecular principles that govern how this cargo is delivered to the right place at the right time in the cell.

Randy Schekman discovered a set of genes that were required for vesicle traffic. James Rothman unravelled protein machinery that allows vesicles to fuse with their targets to permit transfer of cargo. Thomas Südhof revealed how signals instruct vesicles to release their cargo with precision.

Through their discoveries, Rothman, Schekman and Südhof have revealed the exquisitely precise control system for the transport and delivery of cellular cargo. Disturbances in this system have deleterious effects and contribute to conditions such as neurological diseases, diabetes, and immunological disorders.

How cargo is transported in the cell

In a large and busy port, systems are required to ensure that the correct cargo is shipped to the correct destination at the right time. The cell, with its different compartments called organelles, faces a similar problem: cells produce molecules such as hormones, neurotransmitters, cytokines and enzymes that have to be delivered to other places inside the cell, or exported out of the cell, at exactly the right moment. Timing and location are everything. Miniature bubble-like vesicles, surrounded by membranes, shuttle the cargo between organelles or fuse with the outer membrane of the cell and release their cargo to the outside. This is of major importance, as it triggers nerve activation in the case of transmitter substances, or controls metabolism in the case of hormones. How do these vesicles know where and when to deliver their cargo?

Traffic congestion reveals genetic controllers

Randy Schekman was fascinated by how the cell organizes its transport system and in the 1970s decided to study its genetic basis by using yeast as a model system. In a genetic screen, he identified yeast cells with defective transport machinery, giving rise to a situation resembling a poorly planned public transport system. Vesicles piled up in certain parts of the cell. He found that the cause of this congestion was genetic and went on to identify the mutated genes. Schekman identified three classes of genes that control different facets of the cell's transport system, thereby providing new insights into the tightly regulated machinery that mediates vesicle transport in the cell.

Docking with precision

James Rothman was also intrigued by the nature of the cell's transport system. When studying vesicle transport in mammalian cells in the 1980s and 1990s, Rothman discovered that a protein complex enables vesicles to dock and fuse with their target membranes. In the fusion process, proteins on the vesicles and target membranes bind to each other like the two sides of a zipper. The fact that there are many such proteins and that they bind only in specific combinations ensures that cargo is delivered to a precise location. The same principle operates inside the cell and when a vesicle binds to the cell's outer membrane to release its contents.

It turned out that some of the genes Schekman had discovered in yeast coded for proteins corresponding to those Rothman identified in mammals, revealing an ancient evolutionary origin of the transport system. Collectively, they mapped critical components of the cell's transport machinery.

Timing is everything

Thomas Südhof was interested in how nerve cells communicate with one another in the brain. The signalling molecules, neurotransmitters, are released from vesicles that fuse with the outer membrane of nerve cells by using the machinery discovered by Rothman and Schekman. But these vesicles are only allowed to release their contents when the nerve cell signals to its neighbours. How is this release controlled in such a precise manner? Calcium ions were known to be involved in this process and in the 1990s, Südhof searched for calcium sensitive proteins in nerve cells. He identified molecular machinery that responds to an influx of calcium ions and directs neighbour proteins rapidly to bind vesicles to the outer membrane of the nerve cell. The zipper opens up and signal substances are released. Südhof's discovery explained how temporal precision is achieved and how vesicles' contents can be released on command.

Vesicle transport gives insight into disease processes

The three Nobel Laureates have discovered a fundamental process in cell physiology. These discoveries have had a major impact on our understanding of how cargo is delivered with timing and precision within and outside the cell. Vesicle transport and fusion operate, with the same general principles, in organisms as different as yeast and man. The system is critical for a variety of physiological processes in which vesicle fusion must be controlled, ranging from signalling in the brain to release of hormones and immune cytokines. Defective vesicle transport occurs in a variety of diseases including a number of neurological and immunological disorders, as well as in diabetes. Without this wonderfully precise organization, the cell would lapse into chaos.

James E. Rothman was born 1950 in Haverhill, Massachusetts, USA. He received his PhD from Harvard Medical School in 1976, was a postdoctoral fellow at Massachusetts Institute of Technology, and moved in 1978 to Stanford University in California, where he started his research on the vesicles of the cell. Rothman has also worked at Princeton University, Memorial Sloan-Kettering Cancer Institute and Columbia University. In 2008, he joined the faculty of Yale University in New Haven, Connecticut, USA, where he is currently Professor and Chairman in the Department of Cell Biology.

Randy W. Schekman was born 1948 in St Paul, Minnesota, USA, studied at the University of California in Los Angeles and at Stanford University, where he obtained his PhD in 1974 under the supervision of Arthur Kornberg (Nobel Prize 1959) and in the same department that Rothman joined a few years later. In 1976, Schekman joined the faculty of the University of California at Berkeley, where he is currently Professor in the Department of Molecular and Cell biology. Schekman is also an investigator of Howard Hughes Medical Institute.

Thomas C. Südhof was born in 1955 in Göttingen, Germany. He studied at the Georg-August-Universität in Göttingen, where he received an MD in 1982 and a Doctorate in neurochemistry the same year. In 1983, he moved to the University of Texas Southwestern Medical Center in Dallas, Texas, USA, as a postdoctoral fellow with Michael Brown and Joseph Goldstein (who shared the 1985 Nobel Prize in Physiology or Medicine). Südhof became an investigator of Howard Hughes Medical Institute in 1991 and was appointed Professor of Molecular and Cellular Physiology at Stanford University in 2008.

Nobel Prize in Literature for 2013

The Nobel Prize in Literature 2013 was awarded to Alice Munro, "master of the contemporary short story".

Click Image To Enlarge

Alice Ann Munro (née Laidlaw) was born in 1931 and is a Canadian author writing in English. Munro's work has been described as having revolutionized the architecture of short stories, especially in its tendency to move forward and backward in time. Munro's fiction is most often set in her native Huron County in southwestern Ontario. Her stories explore human complexities in an uncomplicated prose style. Munro's writing has established her as "one of our greatest contemporary writers of fiction," or, as Cynthia Ozick put it, "our Chekhov." Alice Munro was awarded the 2013 Nobel Prize in Literature for her work as "master of the modern short story", and the 2009 Man Booker International Prize for her lifetime body of work. She is also a three-time winner of Canada's Governor General's Award for fiction.

Nobel Prize in Economics for 2013

There is no way to predict the price of stocks and bonds over the next few days or weeks. But it is quite possible to foresee the broad course of these prices over longer periods, such as the next three to five years. These findings, which might seem both surprising and contradictory, were made and analyzed by this year’s Laureates, Eugene Fama, Lars Peter Hansen and Robert Shiller.

Beginning in the 1960s, Eugene Fama and several collaborators demonstrated that stock prices are extremely difficult to predict in the short run, and that new information is very quickly incorporated into prices. These findings not only had a profound impact on subsequent research but also changed market practice. The emergence of so-called index funds in stock markets all over the world is a prominent example.

If prices are nearly impossible to predict over days or weeks, then shouldn’t they be even harder to predict over several years? The answer is no, as Robert Shiller discovered in the early 1980s. He found that stock prices fluctuate much more than corporate dividends, and that the ratio of prices to dividends tends to fall when it is high, and to increase when it is low. This pattern holds not only for stocks, but also for bonds and other assets.
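
A toy illustration of this pattern (synthetic data, not Shiller's actual price and dividend series) is easy to set up: generate a valuation ratio that mean-reverts slowly and returns that lean gently against it, then check that today's valuation says almost nothing about the next month's return but correlates clearly with the return over the next five years:

import numpy as np

# Toy model (synthetic data, not Shiller's series): the log price-dividend ratio
# follows a slow AR(1), and monthly returns lean slightly against the current
# valuation level plus a lot of noise.
rng = np.random.default_rng(0)
n_months = 12 * 100
pd_ratio = np.zeros(n_months)   # demeaned log price-dividend ratio
returns = np.zeros(n_months)
for t in range(1, n_months):
    pd_ratio[t] = 0.99 * pd_ratio[t - 1] + rng.normal(0, 0.03)
    returns[t] = -0.01 * pd_ratio[t - 1] + rng.normal(0, 0.04)

def corr_with_future_return(horizon_months):
    """Correlation between today's valuation and the cumulative future return."""
    future = np.array([returns[t + 1:t + 1 + horizon_months].sum()
                       for t in range(n_months - horizon_months - 1)])
    return np.corrcoef(pd_ratio[:len(future)], future)[0, 1]

print("1-month horizon:", round(corr_with_future_return(1), 2))   # close to zero
print("5-year horizon: ", round(corr_with_future_return(60), 2))  # clearly negative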

One approach interprets these findings in terms of the response by rational investors to uncertainty in prices. High future returns are then viewed as compensation for holding risky assets during unusually risky times. Lars Peter Hansen developed a statistical method that is particularly well suited to testing rational theories of asset pricing. Using this method, Hansen and other researchers have found that modifications of these theories go a long way toward explaining asset prices.

Another approach focuses on departures from rational investor behavior. So-called behavioral finance takes into account institutional restrictions, such as borrowing limits, which prevent smart investors from trading against any mispricing in the market.

The Laureates have laid the foundation for the current understanding of asset prices. It relies in part on fluctuations in risk and risk attitudes, and in part on behavioral biases and market frictions.

Eugene Francis "Gene" Fama (/ˈfɑːmə/) was born in 1939 and is an American economist and Nobel laureate in Economics, known for his work on portfolio theory and asset pricing, both theoretical and empirical.

He is currently Robert R. McCormick Distinguished Service Professor of Finance at the University of Chicago Booth School of Business. In 2013 it was announced that he would be awarded the Nobel Prize in Economic Sciences jointly with Robert Shiller and Lars Peter Hansen.

Lars Peter Hansen was born in 1952 and is the David Rockefeller Distinguished Service Professor of Economics at the University of Chicago. Best known for his work on the Generalized Method of Moments, he is also a distinguished macroeconomist, focusing on the linkages between the financial and real sectors of the economy. In 2013, it was announced that he would be awarded the Nobel Memorial Prize in Economics, jointly with Robert J. Shiller and Eugene Fama.

Robert James "Bob" Shiller was born in 1946 and is an American economist, academic, and best-selling author. He currently serves as a Sterling Professor of Economics at Yale University and is a fellow at the Yale School of Management's International Center for Finance. Shiller has been a research associate of the National Bureau of Economic Research (NBER) since 1980, was Vice President of the American Economic Association in 2005, and President of the Eastern Economic Association for 2006-2007. He is also the co‑founder and chief economist of the investment management firm MacroMarkets LLC. Shiller is ranked among the 100 most influential economists of the world. On 14 October 2013, it was announced that Shiller, together with Eugene Fama and Lars Peter Hansen, would receive the 2013 Nobel Prize in Economics, “for their empirical analysis of asset prices”.

Nobel Prize For Peace 2013

The Nobel Peace Prize 2013 was awarded to the Organisation for the Prohibition of Chemical Weapons "for its extensive efforts to eliminate chemical weapons".

The Norwegian Nobel Committee has decided that the Nobel Peace Prize for 2013 is to be awarded to the Organisation for the Prohibition of Chemical Weapons (OPCW) for its extensive efforts to eliminate chemical weapons.

During World War One, chemical weapons were used to a considerable degree. The Geneva Convention of 1925 prohibited the use, but not the production or storage, of chemical weapons. During World War Two, chemical means were employed in Hitler’s mass exterminations. Chemical weapons have subsequently been put to use on numerous occasions by both states and terrorists. In 1992-93, a convention was drawn up that also prohibited the production and storage of such weapons. It came into force in 1997. Since then the OPCW has, through inspections, destruction and by other means, sought the implementation of the convention. 189 states have acceded to the convention to date.

The conventions and the work of the OPCW have defined the use of chemical weapons as a taboo under international law. Recent events in Syria, where chemical weapons have again been put to use, have underlined the need to enhance the efforts to do away with such weapons. Some states are still not members of the OPCW. Certain states have not observed the deadline, which was April 2012, for destroying their chemical weapons. This applies especially to the USA and Russia.

Disarmament figures prominently in Alfred Nobel’s will. The Norwegian Nobel Committee has through numerous prizes underlined the need to do away with nuclear weapons. By means of the present award to the OPCW, the Committee is seeking to contribute to the elimination of chemical weapons.

COMMENTARY: Congratulations to all recipients. The 2013 Nobel laureates include six Americans. Here's a YouTube video of the Nobel Prize Ceremony:

11/28/2012

The U.S. may have had secret plans to detonate an atomic bomb on the moon at the height of the Cold War (Click Image To Enlarge)

A story that surfaced over a decade ago is making the rounds again this week, as some media outlets are reporting that the U.S. considered detonating an atomic bomb on the moon in an effort to intimidate the Soviet Union at the height of the Cold War.

On Sunday, the Daily Mail revived the story, citing a 12-year-old interview with physicist Leonard Reiffel, formerly of the U.S. military-backed Armour Research Foundation and later a deputy director of NASA. Celebrated astronomer Carl Sagan also was said to have been involved with the secret project, which reportedly was known as "A Study of Lunar Research Flights" or "Project A119." Sagan died in 1996.

In the interview, Reiffel reportedly said the plan had been to launch a rocket that would deliver a small nuclear device to the moon's surface, where it would detonate.

Reiffel, now 85, is believed to be the only official to have publicly confirmed his association with the project. However, a 190-page document called "A Study of Lunar Research Flights, Volume I" dated June 19, 1959 is available online through the Information for the Defense Community database. The document, available in PDF format, is credited to Reiffel and bears the heading of the Air Force Special Weapons Center and the Air Research and Development Command based at Kirtland Air Force Base in New Mexico.

Click Image To Enlarge

The abstract reads:

Nuclear detonations in the vicinity of the moon are considered in this report along with scientific information which might be obtained from such explosions. The military aspect is aided by investigation of space environment, detection of nuclear device testing, and capability of weapons in space. A study was conducted of various theories of the moon's structure and origin, and a description of the probable nature of the lunar surface is given. The areas discussed in some detail are optical lunar studies, seismic observations, lunar surface and magnetic fields, plasma and magnetic field effects, and organic matter on the moon.

Military officials abandoned the idea, Reiffel said, in part because of the danger it posed to people on Earth if the mission failed. Scientists also were concerned about contaminating the moon with radioactive material.

In a new interview with The Huffington Post, Richard Rhodes, a Pulitzer-prize-winning author and an affiliate of the Center for International Security and Cooperation at Stanford University, said he was unfamiliar with Project A119. If there had been a plan to send a nuclear missile to the moon in the 1950s, he said, it would have been hard-pressed to advance past the study stage. The first Soviet craft crash-landed on the moon in 1959, followed three years later by the American craft Ranger 4, reports National Geographic.

"I doubt we had any rockets that would have had the power to leave earth's orbit and hit the moon," Rhodes said. "It takes a lot of power to take things out of earth's gravitational pull, much more than to just put something in orbit."

If there had been a secret plan, the show of U.S. technological prowess would have been meant as a counter to Sputnik, Rhodes added.

Though nuking the moon sounds far-fetched, Rhodes said some of the projects that grew out of Cold War tensions were far from funny.

"One of the craziest things we ever did was develop and deploy nuclear tipped anti-aircraft missiles, plane to plane. That's always seemed like insanity," he said. Once miniaturized nuclear weapons were created, "as all the services wanted their share--so they had to think of some use for these things, and their uses were marginally insane," at least by today's risk-reward standards, he said.

When asked about the project, the U.S. Air Force declined to comment, the Associated Press reported in 2000.

COMMENTARY: What a crazy, idiotic idea to bomb the Moon, just to demonstrate our machismo and flex our nuclear muscles to intimidate the U.S.S.R. Growing up during the height of the Cold War, I can clearly remember the air raid warning alarms and the nuclear bomb drills that the schools used to conduct. The teachers would get the students in their classes to crawl under their desks and pretend there was a nuclear bomb attack. These drills would've done absolutely zero to protect us in the event of a real nuclear attack.

The whole idea of arming ourselves with enough nuclear weapons to destroy the World 100 times over on the theory that our enemies would never use their nukes is pure lunacy. If you ask me, all of this is done just to keep the Military Industrial Complex in business making weapons of mass destruction and the latest in tech weaponry.

Courtesy of an article dated November 28, 2012 appearing in The Huffington Post and an article dated November 28, 2012 appearing in CNN.com

01/22/2012

Bob Lazar stated that the “Sport Model” Flying Disc amplified the “Strong Nuclear Force” of Element 115 (Ununpentium, or UUP) to generate the gravity field for “Space-Time Compression.” Bob also stated that the U.S. Government had 500 pounds of Element 115 in their possession. The raw Element 115 was given to the U.S. Government at S4 by the Reticulan EBEs in the form of discs. The scientists at S4 sent the Element 115 discs through Groom Lake to Los Alamos National Laboratory in New Mexico, to be milled for use in the Anti-Matter Reactor. The Los Alamos personnel were told it was a new form of armor. They simply followed orders, milled it in accordance with the following steps, and sent it back to Groom Lake. It was during this process that some of the Element 115 turned up missing. As you’ll see below, the machining process to form the Element 115 wedge produces a tremendous amount of waste.

UFO Anti-Matter Reactor

In the following video, physicist Bob Lazar explains the mysterious Element 115 and the sophisticated anti-matter reactor used for powering the anti-gravity propulsion system of the flying saucers located at a top-secret U.S. research facility known as S-4:

Bob Lazar stated that the Element 115 used as the fuel and gravity source in the “Sport Model” Flying Disc was stable. On February 2, 2004, scientists at the Lawrence Livermore National Laboratory, in collaboration with researchers from the Joint Institute for Nuclear Research in Russia (JINR), announced that they had discovered two new super-heavy elements, Element 113 and Element 115. The isotope of Element 115, produced by bombarding an Americium-243 (95Am243) nucleus with a Calcium-48 (20Ca48) nucleus, rapidly decayed to Element 113, which then continued to decay until a meta-stable isotope was obtained.

Cutaway of the Sport Model UFO

The following hypothetical reaction displays the maximum theoretical atomic mass of an Element 115 Isotope that could be produced from combining an Americium-243 nucleus with a Calcium-48 nucleus. The following reaction assumes no neutrons were liberated during the process of the reaction:

95Am243 + 20Ca48 → 115UUP291 → 113UUT287 + 2He4 → ...

The following reactions are the actual reactions that took place in the laboratory by bombarding Americium-243 with Calcium-48, which resulted in the two Isotopes of Element 115, indicated below, being identified.

95Am243 + 20Ca48 → 115UUP288 + 3 0n1

115UUP288 → 113UUT284 + 2He4 → ...

95Am243 + 20Ca48 → 115UUP287 + 4 0n1

115UUP287 → 113UUT283 + 2He4 → ...

The maximum theoretical atomic mass isotope of Element 115 that could be produced in the reaction above, 115UUP291, would only have 176 neutrons in its nucleus. This isotope of Element 115 is shy 8 neutrons of the magic number of 184 neutrons. The two actual isotopes of Element 115 produced by this reaction, 115UUP288 and 115UUP287, contain 173 neutrons (shy 11 neutrons of the magic number of 184) and 172 neutrons (shy 12 neutrons of the magic number of 184), respectively.
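
The neutron counts quoted above are simply the mass number minus the atomic number; a couple of lines of Python make the bookkeeping explicit (the only assumption is the predicted shell closure at 184 neutrons, as stated in the text):

# Neutron bookkeeping for element 115 (Z = 115): neutrons = mass number - Z.
MAGIC_NEUTRONS = 184
for mass_number in (291, 288, 287):
    neutrons = mass_number - 115
    print(f"A = {mass_number}: {neutrons} neutrons, "
          f"{MAGIC_NEUTRONS - neutrons} short of the magic number {MAGIC_NEUTRONS}")
# A = 291: 176 neutrons, 8 short
# A = 288: 173 neutrons, 11 short
# A = 287: 172 neutrons, 12 short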

Click Image To Enlarge

This latest scientific breakthrough, however, lends credibility to Bob Lazar’s claims rather than discrediting them. Bob Lazar’s Element 115 discs used to make the wedge for the “Sport Model” Flying Disc Anti-Matter Reactor would have to have been the isotope of Element 115 containing the magic number of 184 neutrons, and therefore having an atomic mass of 299.

The nuclear configuration of this isotope of Element 115 would be identical to the nuclear configuration of the only known stable isotope of Element 83, Bismuth, 83Bi209, containing the magic number of 126 neutrons, except that the Element 115 isotope would have one more energy level completely filled with protons and neutrons. 82 protons and 114 protons are magic numbers for protons because 82 protons completely fill 6 proton energy levels and 114 protons completely fill 7 proton energy levels. The 83rd proton for Bismuth is a lone proton in the 7th proton energy level, and the 115th proton for Element 115 is the lone proton in the 8th proton energy level. 126 neutrons completely fill 7 neutron energy levels and 184 neutrons completely fill 8 neutron energy levels. Refer to the Nucleon Energy Level Table for Bismuth and Element 115, below, for the nuclear configurations of Bismuth and Element 115.

This stable isotope of Bismuth, Element 83, has very unique gravitational characteristics. Refer to the Henry William Wallace Patent: U.S. Patent 3,626,605, “Method and Apparatus for Generating a Secondary Gravitational Force Field.”

NOTE: Producing the theoretically stable super-heavy elements is very difficult because the reactant nuclei of these nuclear reactions do not have enough neutrons to result in a product nucleus with enough neutrons to obtain theoretical stability.

Click Image To Enlarge

Click Image To Enlarge

COMMENTARY: On February 4, 2004, it was announced that two superheavy elements, elements 113 and 115, had been synthesized through a collaborative effort between scientists from the Physical and Life Sciences Directorate at the Lawrence Livermore National Laboratory and researchers from the Joint Institute for Nuclear Research at the Flerov Laboratory for Nuclear Reactions in Dubna, Russia. Two isotopes of element 115 survived 30-80 milliseconds before decaying into isotopes of element 113 that survived approximately ten times longer prior to decaying themselves. Following a series of alpha-decays, the element 115 atoms decayed into long-lived isotopes (multiple hours) of element 105 (Db). The great-great-great granddaughter Db isotopes were also chemically identified in subsequent experiments.

Scientists are making tantalizing progress in the hunt for the elusive Higgs boson, a theoretical particle that could explain how the universe is built, though their data aren't robust enough yet to claim a conclusive discovery.

On Tuesday, physicists at the Large Hadron Collider, or LHC, near Geneva, Switzerland, said that data from two independent experiments had narrowed the range of the would-be particle's likely mass.

Large Hadron Collider at CERN (Click Image To Enlarge)

The Higgs boson is the only particle that the standard model of physics says should be there but hasn't been observed in an experiment. The model describes how matter is built and particles interact.

Scientists claimed progress in the search for the Higgs boson - colloquially known as the 'God particle' - which is considered the basic building block of the universe, Gautam Naik reports on digits.

Proof that the particle exists would help explain a big puzzle: why some objects in the universe—such as the quark, a constituent of protons—have mass, while other objects—such as photons, the constituent of light—possess only energy.

By extension, its discovery would help explain the presence of stars, planets and humans, and thus rank as one of the biggest coups for modern-day physics.

"The Higgs is the missing piece" in the current theory of matter, said Stefan Soldner-Rembold of the University of Manchester, England, who has been on a decade-long quest for the particle, though he wasn't involved in the recent LHC experiments. The latest data may be far from definitive, but "we're going in the direction that it is there," he added.

In 1964, three groups of physicists independently proposed the existence of the Higgs boson. It was named after one of the scientists, Peter Higgs, now an emeritus professor at the University of Edinburgh, Scotland. Thousands of scientists have since tried to chase down the fabled subatomic particle.

Because nobody knows what the mass of a Higgs boson might be, the particle must be hunted indirectly, typically in giant machines that propel particles to near-light speed, then smash them together and generate an array of other subatomic particles.

The hope is that one such particle would be the Higgs itself, though it would almost instantly decay into different combinations of other particles. Finding it would then involve looking for statistically significant "excesses" of those particles.

The latest experiments at the LHC, which is overseen by the European particle-physics laboratory CERN, found modest excesses of this sort in the data, a promising sign. One of the experiments, known as Atlas, suggests that the Higgs could have a tiny mass, in the range of 116 to 130 gigaelectronvolts, or GeV. The other experiment pegged the particle's mass at 115 to 127 GeV.

On an individual basis, none of these excesses is any more statistically significant than tossing a die and ending up with two sixes in a row. However, physicists are encouraged, because multiple independent measurements indicate that the Higgs may be lurking in the region of 124 to 126 GeV.
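
To put that dice comparison into the language physicists use, here is a small sketch (ordinary Gaussian statistics, not CERN's actual analysis) that converts the probability of two sixes in a row into an equivalent number of standard deviations and compares it with the 5-sigma threshold conventionally required to claim a discovery:

from statistics import NormalDist

# Converting the two-sixes-in-a-row probability into the "sigma" language
# physicists use (standard statistics, not CERN's actual analysis).
p_two_sixes = (1 / 6) ** 2                 # about 0.028

def p_to_sigma(p):
    """One-sided conversion from a p-value to a number of standard deviations."""
    return NormalDist().inv_cdf(1 - p)

print(f"p = {p_two_sixes:.3f}  ->  about {p_to_sigma(p_two_sixes):.1f} sigma")
# Roughly a 2 sigma effect. The discovery convention in particle physics is
# 5 sigma, i.e. a p-value of about 3e-7:
print(f"5 sigma corresponds to p = {1 - NormalDist().cdf(5):.1e}")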

Referring to an excess suggesting a mass about 125 times that of a proton, CERN researcher Fabiola Gianotti of the Atlas experiment said.

"Over the last few weeks, we have started to see an intriguing excess of events around 125 GeV. This excess may be due to a fluctuation, but it could also be something more interesting. We cannot conclude anything at this stage. We need more study and more data."

How might the Higgs boson confer mass to particles? Physicists have suggested that as the universe cooled after the Big Bang, about 13.7 billion years ago, a force known as the Higgs field formed, along with the particle.

The video "The ATLAS Experiment - Mapping the Secrets of the Universe," was produced by ATLAS in two parts and describes the creation of the Universe beginning with the Big Bang, the Standard Model of Elementary Particles and forces making up matter, and the idea behind the creation of the ATLAS Experiment:

Part I:

Part II:

Under this scenario, the Higgs field permeates the universe, and any particles that interact with it are given a mass through the Higgs boson. The more they interact, the heavier they become. Particles that don't interact with the Higgs field are left with no mass at all.

CERN scientists say they plan to refine their analysis and won't be able to offer a definitive conclusion until sometime next year.

It's been an eventful year for the esoteric field of particle physics, especially at CERN. In September, for example, an experiment there reported ghostlike particles known as neutrinos apparently traveling a tiny bit faster than light, a seeming breach of the cosmic speed limit set down by Albert Einstein.

Yet, like many physicists, Dr. Soldner-Rembold of the University of Manchester isn't necessarily eager for the Higgs to be found.

He said.

"It would perhaps be even more exciting if it isn't where it's supposed to be. Then we'd have to come up with something else."

COMMENTARY: BREAKING NEWS: CERN physicists held a seminar on December 13, 2011 providing the latest update on the findings of research being done by the ATLAS Experiment to find the Higgs boson particle. The seminar was in two videos:

Part I:

Part II:

The ATLAS Experiment

The ATLAS Experiment is a particle physics experiment that is exploring the fundamental nature of matter and the basic forces that shape our universe. ATLAS has begun the search for new discoveries in the head-on collisions of protons of extraordinarily high energy. ATLAS is one of the largest collaborative efforts ever attempted in the physical sciences. There are 3000 physicists (including 1000 students) participating from 174 universities and laboratories in 38 countries. Visit http://atlas.ch

The ATLAS Detector

ATLAS is one of two general-purpose particle detectors at the Large Hadron Collider (LHC) in CERN.

The job of ATLAS is to record and visualise the explosions of particles that result from the collisions at LHC. The information obtained on a particle includes its speed, mass, and electric charge, and this information helps physicists to work out the identity of the particle.

ATLAS will investigate a wide range of physics, including the search for the Higgs boson, extra-dimensions, and particles that could make up dark matter. ATLAS will record sets of measurements on the particles created in collisions - their paths, energies, and their identities.

The following view shows ATLAS under construction beginning in 2003 through animations and time lapse images and video clips:

The following two-part video series (a must view) of The ATLAS Experiment explains the inner workings of the ATLAS Detector, how CERN physicists using the LHC collide protons together to create the sub-atomic particles of matter (See The Standard Model of Elementary Particles below) and how those sub-atomic particles are detected, tracked and measured:

Part I:

Part II:

This is accomplished in ATLAS through six different detecting subsystems (Inner Detector, Electromagnetic Calorimeters, Hadronic Calorimeters, Muon Detectors, and Particle Identification Detectors) that identify particles and measure their momentum and energy.

Another vital element of ATLAS is the huge magnet system that bends the paths of charged particles for momentum measurement.

The interactions in the ATLAS detectors will create an enormous dataflow. To digest these data, ATLAS needs a very advanced trigger and data acquisition system, and a large computing system.

ATLAS is about 45 meters long, more than 25 meters high, and weighs about 7,000 tons. It is about half as big as the Notre Dame Cathedral in Paris and weighs the same as the Eiffel Tower or a hundred 747 jets (empty).

Various images of the ATLAS Detector (Click Images To Enlarge)

To view a comprehensive list of images of the ATLAS Detector through its various phases of construction click HERE.

To view the videos produced for the ATLAS Experiment YouTube channel click HERE.

If you would like a more thorough and technical description of the ATLAS Detector, Columbia University has published the following technical paper. You can download it by clicking HERE.

ATLAS Schedule

Late 2009 -- Startup of LHC and first event collisions at a total energy of 0.9 TeV and later at 2.36 TeV (above the previous world record).

March 2010 -- Event collisions at a total energy of 7 TeV. This led to about eight months of data taking before a few weeks of heavy ion collisions and the usual winter shutdown. Many papers with early results have come as a result of the 2010 run.

March 2011 -- Event collisions at a total energy of 7 TeV. Two years of much more intensive data taking. There will also be a few weeks of heavy ion collisions and a winter shutdown (Dec. 2011 - Feb. 2012).

2013 -- A long shutdown to prepare for an increase of the total energy towards 14 TeV.

Next 15-20 years -- Continued data taking with publication of results on an ongoing basis.

Make-Up of ATLAS Scientists

ATLAS is a virtual United Nations of 38 countries. In this troubled world, it is inspiring to see people from many lands working together in harmony. International collaboration has been essential to this success. These physicists come from more than 174 universities and laboratories and include 1000 students. ATLAS is one of the largest collaborative efforts ever attempted in the physical sciences.

Large Hadron Collider (LHC)

The protons are accelerated in opposite directions in the Large Hadron Collider, an underground accelerator ring 27 kilometres in circumference at the CERN Laboratory in Geneva, Switzerland. Crashing together in the center of ATLAS, the particles will produce tiny fireballs of primordial energy. LHC recreates the conditions at the birth of the Universe -- 30 million times a second. Relics of the early Universe not seen since the Universe cooled after the Big Bang 14 billion years ago will spring fleetingly to life again. The LHC is in effect a Big Bang Machine. (Portions of this text are paraphrased from an article written by Dennis Overbye in the New York Times on May 15, 2007, with permission.)
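That "30 million times a second" figure can be sanity-checked from the ring geometry alone: protons circle the 26.7-kilometer ring about 11,000 times per second, and with roughly 2,800 proton bunches per beam (the nominal design filling) the bunches cross at each collision point tens of millions of times per second. The quick Python sketch below runs the numbers; the bunch count is the design value, assumed here for illustration.

# Sanity check on the LHC collision cadence (design values assumed for illustration).
C_LIGHT = 299_792_458        # speed of light, m/s
RING_CIRCUMFERENCE = 26_659  # m, LHC circumference
N_BUNCHES = 2808             # nominal number of proton bunches per beam (design value)

revolution_freq = C_LIGHT / RING_CIRCUMFERENCE   # revolutions per second for a near-light-speed proton
crossing_rate = revolution_freq * N_BUNCHES      # bunch crossings per second at one collision point

print(f"Revolution frequency: ~{revolution_freq:,.0f} turns per second")
print(f"Bunch crossings per collision point: ~{crossing_rate/1e6:.0f} million per second")
# -> about 11,245 turns/s and ~32 million crossings/s, consistent with "30 million times a second".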

The Large Hadron Collider (LHC) is the world's largest and highest energy particle accelerator (17 miles in circumference) located at the CERN facility in Switzerland (Click Image To Enlarge)

Force-carrying particles: the photon (γ), the eight gluons (g), and the very heavy weak bosons (responsible for radioactive decay: W± and the Z⁰).

All told, these particles and the ways they interact with one another fundamentally and successfully explain every phenomenon ever observed, with the sole exception of gravitation.

The Higgs boson (H) is the last particle predicted by the Standard Model that has yet to be found. The Higgs is predicted to be the reason everything in the universe has mass. It is also supposed to break electroweak symmetry, accounting for the fact that the electromagnetic boson (the photon) is massless while the weak bosons have mass.

The following Standard Model of Fundamental Particles and Interactions (expanded version) chart was prepared in 2000 by the Contemporary Physics Education Project (CPEP) at Lawrence Berkeley National Laboratory and provides more detailed descriptions of each sub-atomic family and the individual particles within each family. This chart does not make reference to the Higgs boson particle, which according to theoretical physicists is predicted to exist but is yet to be discovered.

I hope you have enjoyed this blog post as much as I have enjoyed putting it together for you.

10/03/2011

Corvallis, Oregon -- Some investors and entrepreneurs are braver than others. It's one thing to create the best iPhone app that mimics flatulence -- but to fund and join a large energy startup takes a certain level of testicular fortitude. Building a new automobile or solar factory or fuel cell is expensive and difficult and stands only a small chance of survival.

And if that startup happens to be developing a new take on light water reactors (that's nuclear reactors, son), well, that's a different animal altogether.

Although there are a few nuclear technology startups (Kurion, TerraPower, Hyperion, General Fusion, Tri-Alpha), the company with the clearest near-term chances of success seems to be Oregon's NuScale. This is not to diminish the work being done at the other firms. It's simply that NuScale's market-entrance strategy seems to better take into account the intricacies and glacial time-scale of Nuclear Regulatory Commission (NRC) approval.

Investor Maurice Gunderson of CMEA has labelled the small modular reactors (SMRs) designed by NuScale as one of the "game-changing" technologies in energy (along with utility-scale energy storage and fusion). CMEA is an investor in NuScale, along with Vulcan Capital and MKG, the Michael Kenwood Group.

We have reported on NuScale and SMRs numerous times, and we've covered the strong case that SMRs, small modular reactors, have made in their own favor.

Under the SMR concept, reactors can be built in factories and shipped to the site instead of being expensively and riskily built on-site. Rather than engineer and build reactors capable of producing over one gigawatt of electric power, SMRs can produce 10 megawatts to 350 megawatts of electricity (or heat). SMRs operate in similar fashion to conventional reactors or fossil fuel plants; nuclear fuel builds heat, which creates steam, which in turn is used to spin a turbine.

It is anticipated that SMRs will cost about the same to construct per kilowatt as large nuclear plants and will produce electricity at the same cost as a conventional nuclear plant (in the 6 to 8 cent/kWh range). SMRs are not new. The U.S. Army has built and operated small nuclear power plants in the past and the military uses small reactors to power naval vessels. But the incremental construction scheme of SMRs can change the financial and safety picture.
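As a hedged back-of-the-envelope check on how a construction cost quoted in dollars per kilowatt turns into a cents-per-kilowatt-hour figure, the Python sketch below annualizes the roughly $4,000-per-kilowatt number quoted later in this piece using an assumed fixed charge rate, capacity factor, and fuel-plus-O&M adder. All three of those inputs are illustrative assumptions of mine, not NuScale figures.

# Rough conversion from construction cost ($/kW) to generation cost (cents/kWh).
# Illustrative assumptions only -- not NuScale's own financial model.
OVERNIGHT_COST_PER_KW = 4000.0   # $/kW, figure quoted later in the article
FIXED_CHARGE_RATE = 0.10         # assumed annual carrying cost of capital (10%)
CAPACITY_FACTOR = 0.90           # assumed fraction of the year at full output
FUEL_AND_OM_CENTS = 2.0          # assumed fuel plus operations & maintenance, cents/kWh

annual_kwh_per_kw = 8760 * CAPACITY_FACTOR                                # kWh per installed kW per year
capital_cents = OVERNIGHT_COST_PER_KW * FIXED_CHARGE_RATE / annual_kwh_per_kw * 100
total_cents = capital_cents + FUEL_AND_OM_CENTS

print(f"Capital component: ~{capital_cents:.1f} cents/kWh")
print(f"Estimated total:   ~{total_cents:.1f} cents/kWh")   # lands near the 6 to 8 cent range cited above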

The sheer enormity of the undertaking and the level of commitment of this project were driven home at a NuScale-sponsored event I attended on the Oregon State University campus earlier this month. Remember that this is a startup project, not a multinational; most startups don't have to consider purchasing 8,490 tons of rebar or nuclear source term security issues.

More than 85 people from all layers of the nuclear ecosystem gathered to check in on NuScale's progress to date. One of the factors contributing to NuScale's progress and credibility is their access to a small-scale (electrically powered) nuclear integral test facility at OSU in which the technology can be run through its paces. One can essentially put a hole in a pipe in a nuclear system and safely simulate failure behavior.

Dr. Jose Reyes, the CTO at NuScale, has experience at the NRC, and that knowledge is absolutely crucial in bringing this regulation-intense project to reality. Dr. Paul Lorenzini, the CEO at NuScale Power, is both a lawyer and a nuclear engineer, relevant skills for this prodigious effort.

One of the distinctions of the NuScale design is that it employs passive cooling, making the design safer and less complex with no pumps and no back-up pumps. The technology used by NuScale is proven, and in many cases, borrows from existing LWR designs. This is crucial as it allows the NRC to stay well within their comfort zone.

That's allowed NuScale to make real progress on the regulatory and political side, where, in the words of Reyes, the CTO, "We've seen huge changes in the acceptance of small reactors," and, in the words of the CEO, "in how much of a recognition of the role small reactors can play there is." On a related note, the Obama administration continued to support nuclear technology with a $2 billion conditional loan guarantee from the DOE last week to help finance AREVA's Eagle Rock Enrichment Facility near Idaho Falls, Idaho.

Reyes described the unit as a "stainless steel thermos, under water, underground." The firm has addressed safety issues throughout the design process: "Seismic isolators give remarkable seismic robustness," according to the CTO, and it is "walk-away safe" because of the water cooling design.

It is arguable that regulatory and political advances are as important as technical innovation in a project of this nature.

"You don't have to be a rocket scientist" to understand the value of SMRs, according to Lorenzini. He claims that the economics are validated along with the "incremental build-out option." Lorenzini stated, "The DNA in nuclear is economies of scale, but we asked 'how can we build a small plant to capture the economies of small?'" NuScale uses factory manufacturing, passive design and the ability to deliver the unit via barge, rail or truck.

NuScale has leveraged an existing supply chain with proven industry leaders like EPC firm Kiewit. The speaker from Kiewit said that they see "SMR construction looking more like conventional power plants."

"What has held it back is that nobody believed you could reach the price point," said Lorenzini.

On the subject of price, Jay Surina, the CFO of NuScale Power, estimated the cost at under $4000 per kilowatt at the 540 megawatt level -- a number that rivals or beats the price of existing "cathedral-style" nuclear plants.

A 540-megawatt power plant constructed from 12 of NuScale's 45-megawatt reactors could produce power for 6 to 9 cents a kilowatt-hour on average over the plant's lifetime, said Bruce Landrey, NuScale's VP of business development.
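Putting the quoted figures together: twelve 45-megawatt modules give the 540-megawatt plant, and at under $4,000 per kilowatt that implies an overall price tag a bit above $2 billion. The short Python sketch below simply multiplies the article's numbers through; the 90 percent capacity factor used for the annual-output estimate is my assumption, not NuScale's.

# Plant-level arithmetic from the figures quoted in the article.
MODULES = 12
MW_PER_MODULE = 45
COST_PER_KW = 4000.0              # dollars, "under $4000 per kilowatt"
CAPACITY_FACTOR = 0.90            # assumed for the annual-output estimate
POWER_COST_RANGE = (0.06, 0.09)   # $/kWh, "6 to 9 cents a kilowatt-hour"

plant_mw = MODULES * MW_PER_MODULE                       # 540 MW
capital_cost = plant_mw * 1000 * COST_PER_KW             # dollars
annual_kwh = plant_mw * 1000 * 8760 * CAPACITY_FACTOR    # kWh per year

print(f"Plant capacity: {plant_mw} MW")
print(f"Implied capital cost: ~${capital_cost / 1e9:.2f} billion")
low = annual_kwh * POWER_COST_RANGE[0] / 1e6
high = annual_kwh * POWER_COST_RANGE[1] / 1e6
print(f"Annual output: ~{annual_kwh / 1e9:.1f} billion kWh, worth ${low:.0f}-{high:.0f} million per year at the quoted range")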

NuScale hopes to submit design certification documents to the Nuclear Regulatory Commission in Q1 2012, and it will take about three years for the agency to complete its review. According to NuScale, approximately 95 percent of the regulatory basis for the NRC design review of a multi-module NuScale plant already exists. Using standard and proven computer codes, controls, components, control rod drives and enrichment levels, and fuel assembly design makes the approval process easier for the NRC and faster for all concerned. According to Lorenzini, there is a huge market for reactors in the 300-megawatt to 500-megawatt range.

About 20 percent of U.S. electricity comes from nuclear sources. Other nations like China, India and France will rely on nuclear for baseload power to an even greater degree going forward. We can't just wish it away.

Nuclear remains a financial and safety challenge, and nuclear's detractors make good arguments -- everyone from Amory Lovins and his Rocky Mountain Institute to NIRS, the Nuclear Information and Resource Service, can point out the cost overruns and safety concerns. More valid objections here. I could go on.

Perhaps NuScale and SMRs can help the industry address some of nuclear's historic financial and marketing impediments.

09/30/2011

In a paper published online by the journal Nature Physics today, the ALPHA experiment at the European Organization for Nuclear Research, or CERN¹, reports that it has succeeded in trapping antimatter atoms for over 16 minutes: long enough to begin to study their properties in detail. ALPHA is part of a broad programme at CERN's antiproton decelerator (AD)² investigating the mysteries of one of nature's most elusive substances.

Click Image to Enlarge

Today, we live in a universe apparently made entirely of matter, yet at the big bang matter and antimatter would have existed in equal quantities. Nature seems to have a slight preference for matter, which allows our universe and everything in it to exist. One way of investigating nature's preference for matter is to compare hydrogen atoms with their antimatter counterparts, and that's what makes today's result important.

"We can keep the antihydrogen atoms trapped for 1000 seconds," explained ALPHA spokesperson Jeffrey Hangst of Aarhus University. "This is long enough to begin to study them -- even with the small number that we can catch so far."

In the paper published today, some 300 trapped antiatoms are reported to have been studied. The trapping of antiatoms will allow antihydrogen to be mapped precisely using laser or microwave spectroscopy so that it can be compared to the hydrogen atom, which is among the best-known systems in physics. Any difference should become apparent under careful scrutiny. Trapping antiatoms could also provide a complementary approach to measuring the influence of gravity on antimatter, which will soon be investigated with antihydrogen by the AEgIS experiment.

What is anti-matter?

Dr. Jeffrey Hangst from CERN describes how they trapped anti-matter.

Another important consequence of trapping antihydrogen for long periods is that the antiatoms have time to relax into their ground state, which will allow ALPHA to conduct the precision measurements necessary to investigate a symmetry known as CPT. Symmetries in physics describe how processes look under certain transformations. C, for example, involves swapping the electric charges of the particles involved in the process. P is like looking in the mirror, while T involves reversing the arrow of time.

Individually, each of these symmetries is broken -- processes do not always look the same. CPT, however, says that a particle moving forward through time in our universe should be indistinguishable from an antiparticle moving backwards through time in a mirror universe, and it is thought to be perfectly respected by nature. CPT symmetry requires that hydrogen and antihydrogen have identical spectra.

Says Hangst,

"Any hint of CPT symmetry breaking would require a serious rethink of our understanding of nature. But half of the universe has gone missing, so some kind of rethink is apparently on the agenda."

The next step for ALPHA is to start performing measurements on trapped antihydrogen, and this is due to get underway later this year. The first step is to illuminate the trapped anti-atoms with microwaves, to determine if they absorb exactly the same frequencies (or energies) as their matter cousins.
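For a sense of what "exactly the same frequencies (or energies) as their matter cousins" means in practice, the Python sketch below computes two benchmark hydrogen transitions from textbook formulas: the Bohr 1S-2S interval (the classic target of laser spectroscopy) and the well-known ground-state hyperfine splitting near 1.42 gigahertz, which sits in the microwave range. The article does not specify which transitions ALPHA will probe first, so treat this as background context rather than a description of their measurement plan.

# Benchmark hydrogen transitions that antihydrogen must match if CPT symmetry holds.
# Textbook values only -- not a description of ALPHA's specific measurement programme.
PLANCK_EV_S = 4.135667e-15      # Planck constant in eV*s
RYDBERG_EV = 13.6057            # hydrogen ground-state binding energy in eV

def bohr_level_ev(n):
    """Bohr-model energy of hydrogen level n, in eV (negative = bound)."""
    return -RYDBERG_EV / n ** 2

delta_e_1s_2s = bohr_level_ev(2) - bohr_level_ev(1)        # ~10.2 eV
freq_1s_2s_thz = delta_e_1s_2s / PLANCK_EV_S / 1e12        # ~2466 THz (ultraviolet, laser spectroscopy)

HYPERFINE_GHZ = 1.420405751                                 # ground-state hyperfine splitting (the 21-cm line)

print(f"1S-2S interval: {delta_e_1s_2s:.2f} eV  ->  ~{freq_1s_2s_thz:.0f} THz")
print(f"Ground-state hyperfine splitting: ~{HYPERFINE_GHZ:.3f} GHz (microwave range)")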

Explained Hangst,

"If you hit the trapped antihydrogen atoms with just the right microwave frequency, they will escape from the trap, and we can detect the annihilation -- even for just a single atom. This would provide the first ever look inside the structure of antihydrogen -- element number 1 on the anti-periodic table."

Notes:

1. CERN, the European Organization for Nuclear Research, is the world's leading laboratory for particle physics. It has its headquarters in Geneva. At present, its Member States are Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Italy, the Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland and the United Kingdom. One candidate for accession: Romania. India, Israel, Japan, the Russian Federation, the United States of America, Turkey, the European Commission and UNESCO have Observer status.

A lot of my fans don't know what CERN really is. These YouTube videos should help explain this $10 billion project.

A 3-minute tour of CERN

Theoretical physics professor Michio Kaku talks about CERN's test run after the LHC was shut down for a year and a half to fix mechanical problems.

An excellent documentary of CERN explaining what it hopes to accomplish (sorry about the poor quality). This is when they thought CERN was going to cost only $6 billion. HA, HA.

2. ALPHA is one of several AD experiments investigating antimatter at CERN. ATRAP has pioneered trapping techniques, and is also investigating antihydrogen. ASACUSA has made measurements of unprecedented precision of the antiproton's mass, so far not revealing any divergence from that of the proton. ASACUSA is also developing complementary techniques for studying antihydrogen. AEgIS studies how antiprotons fall under gravity, and ACE investigates the potential use of antiprotons for cancer therapy.

COMMENTARY: For physicists, antimatter is probably the most valuable substance ever; the slightest bit of it could provide extremely valuable information to help clear up some of the most pressing questions in modern physics. The trouble is that these little gifts are pretty hard to wrap. The ALPHA project at CERN achieved this remarkable feat and took a huge leap towards answering one of the big questions about the Universe: what is the actual difference between matter and antimatter?

The team had 38 successful attempts to capture single antihydrogen atoms in a magnetic field for about 170 milliseconds. Says Jeffrey Hangst, spokesman for the ALPHA collaboration at CERN,

“We’re ecstatic. This is five years of hard work."

And they should be. Since the LHC restarted, CERN has had quite a few good moments, but this is the best one so far. Antimatter (or the lack of it) still poses one of the biggest mysteries ever; according to current theories, at the Big Bang matter and antimatter were produced in equal amounts, but somehow all the antimatter disappeared, so researchers are now forced to turn to ever more advanced and delicate methods in order to find it and study it.

As you can guess from its name, antimatter is just like matter, only in reverse. So antiprotons are just like normal protons but negatively charged, while anti-electrons (positrons) carry a positive charge. The main objective of this stage of the ALPHA project was to compare the relative energy of hydrogen and antihydrogen in order to confirm that antimatter and matter have the same electromagnetic properties, which is a key feature of the Standard Model.

This is not the first time antimatter has been produced; that first happened in 2002, with the ATHENA project. However, it lasted just a few milliseconds, which made it impossible to analyze. What happens is that when you combine matter with antimatter, they vanish with a big boom, releasing high-energy photons (gamma rays). In the ATHENA project, antihydrogen combined with hydrogen from the walls of the container and they annihilated each other.

To prevent this from happening, the ALPHA team used a totally different technique, which was way more difficult: capturing the antimatter in a magnetic trap. To capture the 38 atoms, they had to repeat the experiment no less than 335 times.

Of course, producing these atoms was very costly, but the effort was definitely worth it. However, physicists are looking into other methods that could prove to be more effective in times to come.

I have been poking fun at CERN ever since they built the multi-billion dollar monstrosity and it experienced one problem after another, after another. They cranked it up in 2008, but it didn't work. Finally, in 2009, they got the damn thing to work.

At the end of 2010, they accomplished something significant--capturing anti-matter in a vacuum chamber--but what exactly have they started? Have we let the Genie out of the bottle?

We are venturing into the unknown area where there are a lot of 'ifs'. What will happen when we start working with heavier elements? Let's hope nobody gets hurt and we don't open a time warp or create a black hole in the universe.

Courtesy of an article dated June 5, 2011 appearing in Science Daily and an article dated November 18, 2011 appearing in ZME Science

09/29/2011

Does it seem like there are more law and MBA graduates than ever now? And how big is the decline in students pursuing education and engineering? As trends in American culture, economy, and education change, so do students' choices in degree fields. The interactive infographic above explores which degrees and subjects have gained in popularity and which have declined over the course of ten years.

Click The Above Image To Launch The Interactive Infographic

Bachelor's Degrees

Click Image To Enlarge

Master's Degrees

Click Image To Enlarge

Doctoral Degrees

Click Image To Enlarge

All Degrees

Click Image To Enlarge

Professional Degrees

Click Image To Enlarge

Engineering Degrees

Click Image To Enlarge

COMMENTARY:

Observations From The Above Charts

I think we should all be concerned about the trends in the education (down 5%) and engineering (up 17%) fields. We are now seeing a growing number of retiring teachers. The mass layoffs of teachers are really discouraging students from entering the field of education, and this is going to hurt our entire educational system. We will have shortages in the near future unless we can promote education as a career and increase teacher compensation. A higher percentage of engineering students are studying for their doctoral degrees as technology becomes more complex. Students entering the fields of medicine and optometry increased by only 3% and 4% respectively, which points to a shortage of medical doctors and optometrists in the years to come. Get used to seeing more Indian or Chinese doctors and optometrists treating patients in clinics and hospitals in the not too distant future. The fastest growing education fields are health (up 65%), digital and performing arts (up 52%) and business (up 48%). On a positive note, more students are entering college, but higher tuition and the student loan crisis are going to prove a challenge for many students at the lower end of the socio-economic scale.

State of Science and Engineering in the U.S.

The state of science and engineering in the U.S. is strong, yet the nation's lead is shrinking, according to the latest report from the National Science Board. Based on a wide range of data - from R&D spending to higher-education trends in science and engineering fields - the group's Science and Engineering Indicators 2010 report suggests that U.S. dominance of world science and engineering has deteriorated significantly in recent years, due in large part to rapidly increasing capabilities in China and other Asian economies.

Everyone — from tech entrepreneurs and business analysts to news columnists and out-of-work engineers — is sounding the alarm: The United States is losing its innovative edge.

The latest edition indicates that while the state of U.S. science and engineering is strong, "U.S. dominance has eroded significantly" in recent years, due in large part to rapidly increasing capabilities among Asian nations, particularly China, Kei Koizumi, assistant director for federal R&D in President Obama's Office of Science and Technology Policy, said in an announcement of the findings.

"The data begin to tell a worrisome story," Koizumi said.

The NSB's key findings, highlighted below, shed light on America's science and engineering position in the global economy.

R&D — Between 1996 and 2007, North America's share of world R&D activity dropped from 40 percent to 35 percent. Meanwhile, the European Union's (EU) share decreased from 31 percent to 28 percent. The Asia-Pacific region's share rose from 24 percent to 31 percent during the same period, "even with Japan's comparatively low growth." The share of the rest of the world increased from 5 percent to 6 percent. The annual growth of R&D expenditures in the U.S., at just over 5 percent, is low compared to America's Asian counterparts; in India, South Korea, Taiwan, Thailand, Singapore, Malaysia and China, R&D budgets have grown at up to four times the U.S. rate. American multinationals are shifting their overseas R&D from Europe to emerging Asian markets, whose share grew from 5 percent in 1995 to 14 percent in 2006.

NS&E Higher Education — Many Western countries are concerned about lagging student interest in studying natural sciences or engineering (NS&E), fields that convey technical skills and learning considered essential for knowledge-intensive economies. In the developing world, the number of first university NS&E degrees (broadly comparable to a U.S. baccalaureate) is rising, led by large increases in China, from about 239,000 in 1998 to 807,000 in 2006. NS&E degrees earned by Japanese and South Korean students combined in 2006 (about 235,000) approximated the number earned by U.S. students during that year, even though the U.S. population was considerably larger (300 million versus 175 million). The natural sciences include physical, biological, earth, atmospheric, ocean, agricultural and computer sciences as well as mathematics.

NS&E Doctorates Earned — China's domestically earned NS&E doctorates have shot up more than tenfold since the early 1990s, to about 21,000 in 2006, approaching the number awarded in the U.S. Most of the post-2002 increase in U.S. NS&E doctorates reflects degrees awarded to temporary and permanent visa holders, who in 2007 earned about 11,600 of 22,500 NS&E doctorates in the U.S. Foreign nationals have earned more than half of U.S. NS&E doctorates since 2006, 31 percent of whom are from East Asia, mostly from China. (Image credit: NSB, SEI 2010)

Engineering Doctorates and Visas — The engineering numbers are more concentrated. The share of U.S. engineering doctorates awarded to temporary and permanent visa holders rose from 51 percent in 1999 to 68 percent in 2007. Nearly three-quarters of these foreign Ph.D. recipients were from East Asia or India. While many of these individuals, especially those on temporary visas, will leave the U.S. after earning their doctorates, past trends suggest a large proportion will stay; 60 percent of temporary visa holders who had earned a U.S. science and engineering Ph.D. in 1997 were gainfully employed in the U.S. in 2007, the highest 10-year stay rate ever observed.

Research Output — The number of research articles published in a set of international, peer-reviewed journals has grown from about 460,000 in 1988 to an estimated 760,000 in 2008. However, between 1995 and 2008, the U.S. and E.U.'s combined share of world scholarly articles dropped from 69 percent to 59 percent, while Asia's expanded from 14 percent to 23 percent. Over the past two decades, the number of engineering research articles in the U.S. has grown by less than 2 percent annually; likewise in Japan. Growth in the EU: about 4.4 percent. Meanwhile, China's output of engineering articles grew by close to 16 percent annually.

Patent Protection Filings — U.S. patents awarded to foreign inventors offer a broad indication of the distribution of inventive activity around the world. While inventors in the U.S., the EU and Japan produce almost all of these patents, and U.S. patenting by Chinese and Indian inventors remains modest, the number of patents earned by Asian inventors is on the rise, driven by activity in Taiwan and South Korea. Between 1995 and 2008, the share of patents granted by the U.S. Patent and Trademark Office to U.S.-based inventors shrank from 55 percent to 49 percent. In 1997, 34 percent of high-value patents had U.S. inventors, yet this figure slipped to 30 percent by 2006.

In a 2007 special report, New Scientist explained that contemporary China "is a nation led by technocrats. The current generation of leaders is made up mostly of graduates from some of China's leading universities, typically trained in science and engineering."

"For those in the West," New Scientist said, "where lawyers dominate the political establishment, China provides an intriguing contrast."

09/23/2011

(GENEVA) — A startling find at one of the world's foremost laboratories that a subatomic particle seemed to move faster than the speed of light has scientists around the world rethinking Albert Einstein and one of the foundations of physics.

Now they are planning to put the finding — and by extension Einstein — to further high-speed tests to see if a revolutionary shift in explaining the workings of the universe is needed — or if the European scientists made a mistake.

What is CERN you ask?

Researchers at CERN, the European Organization for Nuclear Research, who announced the discovery Thursday are still somewhat surprised themselves and planned to detail their findings on Friday.

If these results are confirmed, they won't change at all the way we live or the way the universe behaves. After all, these particles have presumably been speed demons for billions of years. But the finding will fundamentally change our understanding of how the world works, physicists said.

Only two labs elsewhere in the world can try to replicate the results. One is Fermilab outside Chicago and the other is a Japanese lab put on hold by the tsunami and earthquake. Fermilab officials met Thursday about verifying the European study and said their particle beam is already up and running. The only trouble is that the measuring systems aren't nearly as precise as the Europeans' and won't be upgraded for a while, said Fermilab scientist Rob Plunkett.

Plunkett, a spokesman for the Fermilab team's experiment, said:

"This thing is so important many of the normal scientific rivalries fall by the wayside. Everybody is going to be looking at every piece of information."

Plunkett said he is keeping an open mind on whether Einstein's theories need an update, but he added:

"It's dangerous to lay odds against Einstein. Einstein has been tested repeatedly over and over again."

Going faster than light is something that is just not supposed to happen according to Einstein's 1905 special theory of relativity — the one made famous by the equation E = mc². Light's 186,282 miles per second (299,792 kilometers per second) has long been considered the cosmic speed limit. And breaking it is a big deal, not something you shrug off like a traffic ticket.

Famed Columbia University physicist Brian Greene said:

"We'd be thrilled if it's right because we love something that shakes the foundation of what we believe. That's what we live for."

The claim is being greeted with skepticism inside and outside the European lab.

James Gillies, a spokesman for CERN, said:

"The feeling that most people have is this can't be right, this can't be real."

CERN provided the particle accelerator to send neutrinos on their breakneck 454-mile trip underground from Geneva to Italy. France's National Institute for Nuclear and Particle Physics Research collaborated with Italy's Gran Sasso National Laboratory for the experiment, which has no connection to the Large Hadron Collider located at CERN.

Gillies told The Associated Press that the readings have so astounded researchers that

"They are inviting the broader physics community to look at what they've done and really scrutinize it in great detail."

John Ellis, a theoretical physicist at CERN who was not involved in the experiment, said that confirmation from the physics community will be necessary. He said:

"Einstein's special relativitiy theory pretty much underlies everything in modern physics. It has worked perfectly up until now."

And part of that theory is that nothing is faster than the speed of light.

CERN reported that a neutrino beam fired from a particle accelerator near Geneva to a lab 454 miles (730 kilometers) away in Italy arrived 60 nanoseconds sooner than light would have. Scientists calculated the margin of error at just 10 nanoseconds, making the difference statistically significant.
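Those headline numbers are easy to check with a little arithmetic: over a 730-kilometer baseline, light needs about 2.4 milliseconds, so arriving 60 nanoseconds early corresponds to a fractional speed excess of a few parts in 100,000, and a 60-nanosecond effect with a 10-nanosecond uncertainty is roughly a six-sigma result. A quick Python sketch of that arithmetic, using only the figures quoted in the article:

# Back-of-the-envelope check on the reported neutrino timing anomaly.
C_LIGHT = 299_792_458          # speed of light, m/s
BASELINE_M = 730_000           # Geneva -> Gran Sasso baseline, ~730 km
EARLY_ARRIVAL_S = 60e-9        # reported early arrival, 60 ns
UNCERTAINTY_S = 10e-9          # reported margin of error, 10 ns

light_time = BASELINE_M / C_LIGHT                   # ~2.435 milliseconds
fractional_excess = EARLY_ARRIVAL_S / light_time    # (v - c) / c, to first order
significance = EARLY_ARRIVAL_S / UNCERTAINTY_S      # crude sigma count

print(f"Light travel time over the baseline: {light_time * 1e3:.3f} ms")
print(f"Implied fractional speed excess: {fractional_excess:.2e}")   # ~2.5e-05
print(f"Size of effect vs. quoted error:  ~{significance:.0f} sigma")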

Given the enormous implications of the find, they spent months checking and rechecking their results to make sure there were no flaws in the experiment.

A team at Fermilab had similar faster-than-light results in 2007. But that experiment had such a large margin of error that it undercut its scientific significance.

If anything is going to throw a cosmic twist into Einstein's theories, it's not surprising that it's the strange particles known as neutrinos. These are odd slivers of an atom that have confounded physicists for about 80 years.

The neutrino has almost no mass, it comes in three different "flavors," may have its own antiparticle and even has been seen shifting from one flavor to another while shooting out from the sun, said physicist Phillip Schewe, communications director at the Joint Quantum Institute in Maryland.

Fermilab team spokeswoman Jenny Thomas, a physics professor at University College London, said there must be a "more mundane explanation" for the European findings. She said Fermilab's experience showed how hard it is to measure accurately the distance, time and angles required for such a claim.

Nevertheless, the Fermilab team, which shoots neutrinos from Chicago to Minnesota, will go back to work immediately to try to verify or knock down the new findings, Thomas said.

Drew Baden, chairman of the physics department at the University of Maryland, said it is far more likely that there are measurement errors or some kind of fluke. Tracking neutrinos is very difficult. Baden said:

"This is ridiculous what they're putting out."

Baden called it the equivalent of claiming that a flying carpet is invented only to find out later that there was an error in the experiment somewhere.

"Until this is verified by another group, it's flying carpets. It's cool, but..."

So if the neutrinos are pulling this fast one on Einstein, how can it happen?

Stephen Parke, head theoretician at Fermilab, said there could be a cosmic shortcut through another dimension — physics theory is full of unseen dimensions — that allows the neutrinos to beat the speed of light.

Indiana University theoretical physicist Alan Kostelecky theorizes that there are situations when the background of the universe is different, not perfectly symmetrical as Einstein assumed. Those changes in background may change both the speed of light and the speed of neutrinos.

But that doesn't mean Einstein's theory is ready for the trash heap, he said.

It is just that there are times when an additional explanation is needed, he said.

If the European findings are correct, "this would change the idea of how the universe is put together," Columbia's Greene said. But he added:

"I would bet just about everything I hold dear that this won't hold up to scrutiny."

COMMENTARY: I'll put my money on Albert Einstein. The physicists of today can't hold a candle to Albert. When CERN's "faster-than-light subatomic particle" findings are held up to scrutiny, Albert will be proven right. Nothing can go faster than the speed of light. Maybe the speed of light is the problem. They say it's 186,000 miles per second, but what if that figure is just off a click? Then this could throw things off. Just saying.

Courtesy of an article dated September 23, 2011 appearing in Time Science