Tuesday, August 31, 2010

When a physicist talks about beauty in science, it's usually in an abstract way. Astronomers have Hubble, biologists have flowers and rainforests, and geologists have the Grand Canyon. But what does the public think of physicists? We're all bombs and particle accelerators in many people's eyes. Here's a clear example of why that's not true. Kai-hung Fung, a diagnostic radiologist at a hospital in Hong Kong, used a little art and a 3D Computed Tomography (CT) scan to create this image. It's called "Cosmic Lungs."

The manipulated image is a top-down view of human lungs, with the cosmic clouds represented by the blanket over the person's body on the right and the CT table cushion on the left; Fung digitally removed the rest. He also won the photography prize in the 2007 International Science and Engineering Visualization Challenge for his piece "What Lies Behind Our Nose?"

Monday, August 30, 2010

For decades Australian physicists have lusted after a gravitational wave detector, but despite their lobbying, the Southern Hemisphere still has no such instrument. According to Science Magazine's News of the Week, Americans at our own Laser Interferometer Gravitational-wave Observatory (LIGO) have concocted a scheme that could change that.

The U.S. currently has two detectors, the Livingston Observatory in Louisiana and the Hanford Observatory in Washington State. Because gravitational waves, quivers in the curvature of space-time, are expected to travel at the speed of light, when one instrument makes a detection the other should follow within about 10 milliseconds. That short time lapse will let physicists determine the direction the ripple came from to within a few degrees of the sky (for reference, amateur astronomers typically use the rule that your fist held out at arm's length covers about ten degrees of sky).

Gravitational waves are expected to originate in the most dramatic of cosmic events, like the collision of two neutron stars, and while they've been indirectly shown to exist, no one has ever observed one directly. So a fundamental part of detecting such a wave will be having a firm grasp on where it came from. American physicists already coordinate with the Virgo interferometer in Italy, which they say should help them pin down the direction to within a couple of degrees, but adding a fourth detector in the Southern Hemisphere has long held appeal because it could potentially cut that down to one degree (or, borrowing again from amateur astronomers, about the size of a pinky finger held out at arm's length).
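That 10-millisecond figure is just the light-travel time across the separation between the two sites. A quick back-of-the-envelope check confirms it (the ~3,000 km Hanford-to-Livingston distance used here is an approximate figure, not one given in the article):

```python
# Maximum arrival-time difference between the two LIGO detectors:
# a gravitational wave moving at the speed of light along the line
# connecting the sites takes this long to cross the baseline.
c = 299_792_458.0     # speed of light, m/s
baseline_m = 3.0e6    # ~3,000 km between Hanford and Livingston (approximate)

max_delay_ms = baseline_m / c * 1000
print(f"Maximum delay: {max_delay_ms:.1f} ms")  # about 10 ms
```

A wave arriving from directly overhead hits both detectors nearly simultaneously; the measured delay between the two arrival times is what constrains the source's direction on the sky.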

Each of these instruments is a gigantic L-shape, composed of two 4-kilometer-long vacuum chambers for legs. Inside each vacuum chamber, a laser bounces light back and forth off a mirror in hopes that a gravitational wave will come along and disturb it by the tiniest amount. A change as small as one-thousandth the width of a proton will set off the detectors. So, to make sure a detection is real, a third detector constantly runs inside the Hanford facility to verify the find.

LIGO's detectors, including the cross-checking instrument at Hanford, are in the midst of a $200-million upgrade that will make them 10 times more sensitive. What the Americans realized is that there's no reason the back-up facility has to be in the same building, or even the same hemisphere. So by simply telling the National Science Foundation to deliver the parts to Australia instead of Washington, they could make everyone happy and increase the world's ability to detect gravitational waves.

The catch, of course, is that the NSF isn't going to buy the Australians the facilities to go along with the instruments, so if the Aussies are interested they'll have to cover the $170-million price tag to house them.

With Australia running a massive deficit, it's unclear whether the newly elected conservative government will approve the funds, especially after promising to cut spending.

Friday, August 27, 2010

WASHINGTON (ISNS) -- Five years ago, Hurricane Katrina slammed into the Gulf Coast, devastating New Orleans and other regions along the Mississippi River Delta. Hurricane forecasting has steadily progressed over the intervening years, which should help cities and states better prepare for devastating cyclones. Now researchers have added another piece to the forecasting puzzle by determining how the texture of landscapes can affect a storm’s motion.

New research shows that rough areas of land, including city buildings and naturally jagged land cover like trees and forests, can actually attract passing hurricanes. The research also found that storms traveling over river deltas hold together longer than those over dry ground. As a result, the city of New Orleans might feel a greater impact from hurricanes coming off the Gulf of Mexico than existing computer models predict.
A team from the City University of Hong Kong modeled the effects that different terrain has on the paths of tropical storms to determine how cities that lie in the path of a hurricane change a storm's motion.

"Cities impose greater friction on the swirling flow because of the tall buildings," said Johnny Chan, a professor of meteorology at the university. "Our results show that tropical cyclones tend to be 'attracted' towards areas of higher friction. So it is possible that cities could cause tropical cyclones to veer towards them."

Rough cityscapes and forests trap air. This compresses the air and forces it up into the atmosphere, adding energy to the storm and pulling the center of the hurricane toward the rough region. As a result, a city can cause a hurricane to swerve from its predicted path by as much as 20 miles.

The change is comparatively small for hurricanes that can reach widths of hundreds of miles, but according to Chan, "The main implication from this study is that in any computer prediction of the track of a hurricane, the representation of the land surface is important."

The researchers also found that river deltas contribute to the longevity of hurricanes. There is more heat-carrying moisture available to evaporate from the wet deltas than dry ground, prolonging the life of the storm.

Chan and Au-Yeung Yee Man developed a computer model to track the movements of a simulated hurricane across varying terrain. Meteorologists have previously incorporated changes in moisture between the sea and land into models, as well as some differences in land formations; however, the refinements that Chan and Au-Yeung have developed help reduce the uncertainty in existing models, which improves the planning vital to reducing the losses that accompany Katrina-sized storms and the repercussions that follow.

"Its direct applicability to real predictions may be a little bit limited, but I do applaud the idea of looking at the idea of moisture availability and surface roughness," said Bob Tuleya, a retired researcher at the Geophysical Fluid Dynamics Laboratory at the National Oceanic and Atmospheric Administration who looked at the research, which is slated to be published in the Journal of Geophysical Research.

Chan said the team will continue to refine their models in order to minimize the error before their research can be fully implemented in hurricane predictions. This includes factoring in the effects of the Earth's rotation and other land features such as mountains and jagged coastlines. In addition, the researchers are planning to check their models by looking at the historical records of hurricane paths for any sign of the direction changes that cities would have caused.

Thursday, August 26, 2010

I think this story got lost this week amidst the widespread coverage of a newly discovered solar system with seven planets -- but astronomers at Arizona State (Go Lumberjacks!) published new research in the journal Nature Geoscience on Sunday about our own solar system's origins.

The ASU researchers acquired a small piece of a meteorite from a private dealer, who had gotten the space rock from a local in Morocco after it was found in the Sahara desert. When the team analyzed the meteorite, they found it was as much as 2 million years older than the previously accepted age of the solar system - 4,568.2 million years old - making it the oldest object ever discovered on Earth. While that is only a slight difference compared to a 4.5-billion-year-old system, the team also found the meteorite had some peculiar properties. It contains a type of iron that can only be formed when a star goes supernova. Previous theories of the solar system's origins held that it was created in isolation from other stars, but recent research has pointed to an alternate theory that our solar nebula might have condensed with help from a star exploding nearby.

This new discovery should help push the scientific consensus towards such an alternative and dramatic beginning.

"This relatively small age adjustment means that there was as much as twice the amount of iron-60, a certain short-lived isotope of iron, in the early Solar System than previously determined. This higher initial abundance of this isotope in the Solar System can only be explained by supernova injection," said (Audrey) Bouvier, a faculty research associate in the School of Earth and Space Exploration (SESE) in ASU’s College of Liberal Arts and Sciences. "This supernova event, and possibly others, could have triggered the formation of the Solar System. By studying meteorites and their isotopic characteristics, we bring new clues about the stellar environment of our Sun at birth."

Wednesday, August 25, 2010

When ultra-efficient LED light bulbs emerged on the scene, they were hailed as a brighter and greener way to light the world, but research announced Monday by Sandia National Laboratories in New Mexico shows that might not necessarily be true. These physicists aren't lingering agents of Thomas Edison's; they readily acknowledge LEDs are a superior technology. Instead, the researchers show in their paper that the potential problem lies in history and human nature.

"Presented with the availability of cheaper light, humans may use more of it, as has happened over recent centuries with remarkable consistency following other lighting innovations," Sandia physicist Jeff Tsao said in a press release.
While the potential for cheaper light could increase the quality of life for billions around the globe, it also could mean an increase in energy usage. Tsao says that since the 16th century, with each revolution in lighting technology, humans have used more light rather than using the same amount of light at lower cost.

"Over the past three centuries… the world has spent about 0.72 percent of the world’s per capita gross domestic product on artificial lighting," said Tsao. "This is so for England in 1700, in the underdeveloped world not on the grid and in the developed world using the most advanced lighting technologies. There may be little reason to expect a different future response from our species."

It may seem like the developed world has plenty of light already, but the researchers suggest an older population with diminished eyesight may crank up the lights, and others might use the extra light to brighten the dark days of winter. Tsao also says that improved lighting technology doesn't necessarily imply a light-saturated future and might even help decrease light pollution; modernized lighting gives humans more control, which could mean darker skies.

"More fuel-efficient cars don't necessarily mean we drive less, we may drive more," said paper co-author Jerry Simmons in the same release. "It's a tension between supply and demand. So, improvements in light-efficient technologies may not be enough to affect energy shortages and climate change."

Tuesday, August 24, 2010

A new telescope at the Big Bear Solar Observatory has captured the most detailed visible-light image ever of a sunspot. The 1.6-meter, generically named NST -- for New Solar Telescope -- has remarkably clear seeing thanks to its location on a pristine mountain lake in Southern California, and it also benefited from adaptive optics to generate the image. While the image has remarkable resolution of about a 50-mile section of the sun's surface, the technology is a test-bed for an even more ambitious project called the Advanced Technology Solar Telescope, which is expected to be completed in the coming decade.

"The new telescope now feeds a high-order adaptive optics system, which in turn feeds the next generation of technologies for measuring magnetic fields and dynamic events using visible and infrared light..."

"The new optical system will allow the researchers to increase the distortion-free field of view to allow for better ways to study these larger and puzzling areas of the Sun. MCAO on the NST will be a pathfinder for the optical system of NSO's 4-meter aperture ATST coming later in the decade.

"Scientists believe magnetic structures, like sunspots, hold an important key to understanding space weather. Space weather, which originates in the Sun, can have dire consequences on Earth's climate and environment. A bad storm can disrupt power grids and communication, destroy satellites and even expose airline pilots, crew and passengers to radiation."

As carbon dioxide continues to build up in Earth's atmosphere, it will also accumulate in her oceans. This rise in CO2 has already made the upper ocean more acidic and the same is expected to happen even in the lower depths in the coming century.

Physicists from the Woods Hole Oceanographic Institution now say that these changes will make some far-flung reaches of the ocean more noisy. In a paper published last week in the Journal of the Acoustical Society of America, the team modeled ambient shipping noise for the deep ocean, incorporating forecasts for ocean pH levels and shipping noise in the coming century.

Any first-year physics student knows the thickness of a fluid matters for how well sound waves propagate, but of crucial importance in sea water is also the concentration of boric acid and other chemicals. Boron ions help filter out low-frequency waves, but as the ocean gets increasingly acidic, the amount of boron ions will decrease and these low-frequency sound waves will penetrate into deeper waters.

Previous research showed this increase in acidity may allow low-frequency sound to travel 70 percent farther, but the Woods Hole researchers found that the effects will also be felt at frequencies more commonly produced by ships, with an increase of a couple of decibels in the range of 100 Hz to 1,000 Hz. A few decibels isn't much, but deep in the ocean, far from such noise sources, that can be a large difference.
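To put "a couple of decibels" in physical terms: the decibel scale is logarithmic, so a fixed dB increase corresponds to a fixed multiplicative jump in sound power. A short sketch (the conversion formula is standard acoustics, not something taken from the paper):

```python
# Convert a decibel increase into a ratio of sound power:
# an increase of L dB multiplies the power by 10**(L/10).
def db_to_power_ratio(db):
    return 10 ** (db / 10)

for db in (1, 2, 3):
    print(f"+{db} dB -> {db_to_power_ratio(db):.2f}x the power")
# +3 dB is very nearly a doubling of sound power
```

So "a couple of decibels" means roughly 1.6 to 2 times the ambient sound power reaching those deep, otherwise quiet waters.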

It's not known what this will do to sea life, but marine mammals have been shown to be highly sensitive to the sounds humans have already introduced via shipping and naval activity. Whales have been known to beach themselves in the presence of intense noise, and these normally vocal species often stop communicating with each other when boats are nearby. However, there are many other sea creatures whose hearing hasn't been closely examined.

It's worth noting here that the research was sponsored by the U.S. Office of Naval Research.

Monday, August 23, 2010

According to a BBC article today, microbes taken from the cliffs of Beer -- a small fishing village in the UK (I know, sorry) -- were left outside the ISS for a year and a half to see how well they could survive the harsh conditions of space. When the astronauts finally retrieved them, they found that many of the microbes were still alive and well (click for a video). Bacteria have survived in space for several years before, but this is the longest that any photosynthesizing microbes are known to have survived.

Interestingly, the microbes weren't selected for any special properties. The researchers in charge of the experiment guessed that if they sent extremophiles outside for this mission, they would do fine with the intense UV radiation and large temperature swings. So instead they just sent the community of rather ordinary tiny creatures that happened to be living on this rock already. Once the samples were examined back on Earth, the researchers found that the survivors had thick cell walls and were similar to species known to live in Antarctica as well as hot desert climes. The group suspects such a correlation might mean they have a strong ability to repair their DNA.

"The experiment is part of a quest to find microbes that could be useful to future astronauts who venture beyond low-Earth orbit to explore the rest of the Solar System."

To which I say, 'wouldn't studying how actual beer lasts in space be more useful to colonizing the rest of the solar system?' But I digress...

Beyond the applications for exploration that the researchers cite, the study has some serious implications for panspermia, the controversial theory explored by astronomer Fred Hoyle (who also coined the term "Big Bang") as well as Carl Sagan. The theory holds that living things could have traveled on space rocks from other worlds to seed life on Earth.

Friday, August 20, 2010

In an effort to win back that very vocal Star Wars segment of the Physics Buzz audience, which I so recklessly upset last week by revealing that Death Star style lasers were impossible, I thought I'd share this vision of awesomeness with you.

Thursday, August 19, 2010

Solid Rocket Boosters have made routine commutes to and from space since the Space Shuttle program first began. For decades they've been used, beaten back into shape, and used again. Here's something that might throw a little respect their way. In what David Levin over at PBS's Inside NOVA has described as "a film even Stanley Kubrick would be proud of," NASA strapped a camera and mic to an SRB on STS-124. The result is seven minutes of surreal anticipation, launch, floating, free-fall and then splashdown. Enjoy, and make sure you've got the sound on.

Among the multi-billion-dollar wish list astronomers released last week in their sixth decadal survey was $421 million for the Large Synoptic Survey Telescope, marking the second time the scientific community has established the instrument as a priority. While the telescope's primary mirror is a relatively large 8.4 meters, it's not so much the size that matters here. What's different about the LSST is its exceptionally wide field of view, seven times larger than the diameter of the moon as seen from Earth. That broad view will allow it to scan the entire night sky in the Southern Hemisphere every 72 hours, creating 30 terabytes of data per night, from its perch high in the foothills of the Andes. Astronomers say it could help unravel the mysteries of dark energy, dark matter, time-variable phenomena, supernovas and even asteroids. To help process this enormous amount of data, the LSST has partnered with Google, which plans to make much of the data easily accessible to the public and use it to create an up-to-date map of the night sky.

Pulling straight from the report:

-New Worlds, New Horizons in Astronomy and Astrophysics

"Over a 10-year lifetime, LSST will be a unique facility that, building on the success of the Sloan Digital Sky Survey, will produce a 100 billion megabyte publicly accessible database. The project is relatively mature in its design... The committee recommends that LSST be submitted immediately for NSF’s Major Research Equipment and Facilities Construction (MREFC) consideration with a view to achieving first light before the end of the decade."

Astronomers have proposed that the National Science Foundation and the Department of Energy split most of the bill, with tens of millions more coming from philanthropists like Bill Gates and Charles Simonyi. If my tabulations are correct, when built it would be the most expensive telescope on Earth, coming in well over the cost of the Keck Observatory (~$188 million), the Gran Telescopio Canarias (~$175 million) and the Large Binocular Telescope ($120 million). The above photo shows the LSST primary mirror spinning as it cools in the Steward Mirror Lab under the football stadium at the University of Arizona. It was taken by Alex Attanasio on a tour we were given in April 2008.

Wednesday, August 18, 2010

"I baptize you a Frenchman, daring child," French poet Paul Claudel once wrote, "with a dewdrop of champagne on your lips.”

I've always thought of physicists as beer drinkers, but I suppose if you live in France, you're a Champagne drinker whatever your profession. In a paper titled "On the Losses of Dissolved CO2 during Champagne Serving," some French physicists have examined the proper way to pour a glass of bubbly. Apparently these mimosa-loving researchers at the Université de Reims thought that determining the best technique for pouring champagne would be a fine use of their lab funds (to which I say, 'hooray for populist scientists!'). It turns out that, counter to the French custom of pouring bubbly into perfectly vertical glasses, physics shows that the same slightly tilted tactic beer lovers have used for millennia is best for champagne too. The conclusion might seem intuitive, but it flies in the face of hundreds of years of French tradition, and it certainly isn't the way your bartender would pour it.

The "méthode traditionnelle" holds that 9 grams of dissolved CO2 should go into every standard bottle of champagne, but that dissolved amount is equal to 5 liters of gaseous CO2, which must escape once you've popped the top. That's more than six times the actual volume of the bottle fleeing from your drink. And because that carbon dioxide is responsible for releasing the aroma as well as giving your mouth the effervescent feeling bubbly is known for (or, in science speak, "the chemosensory excitation of nociceptors in the oral cavity"), the more bubbles you keep in the glass the better.
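Those figures check out with the ideal gas law. A rough sketch (the bottle volume, temperature and pressure here are my assumptions, not numbers from the paper):

```python
# Volume occupied by 9 g of CO2 once it escapes the bottle as gas,
# estimated with the ideal gas law V = nRT/P at about 20 C and 1 atm.
R = 0.082057             # ideal gas constant, L*atm/(mol*K)
molar_mass_co2 = 44.01   # g/mol

n = 9.0 / molar_mass_co2            # moles of CO2 per bottle
volume_l = n * R * 293.15 / 1.0     # liters of gas at 20 C, 1 atm

bottle_l = 0.75                     # standard champagne bottle (assumed)
print(f"{volume_l:.1f} L of gas, {volume_l / bottle_l:.1f}x the bottle")
```

Roughly 5 liters of gas from a 0.75-liter bottle, so "more than six times the volume of the bottle" is right on the mark.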

According to the paper, which appears in the ACS Journal of Agricultural and Food Chemistry, the researchers used an infrared thermography technique to image champagne being poured and showed that a standard vertical pour loses twice as many bubbles as a 'beer-like' tilted pour. They also showed that serving your champagne well chilled significantly helps it retain its bubbles.

I doubt this will overthrow years of tradition, but next time you're in France, or some fancy restaurant where they pour your $15 split into a vertical glass, feel free to tell them 'No! You're doing it all wrong.'

On a side note, this is my nominee for the upcoming Ig Nobel Prize in physics.

Monday, August 16, 2010

If your Monday is dragging on too long, you might try blaming it on cosmic rays. In a paper published Friday by the journal Geophysical Research Letters, physicists from Paris and Moscow propose that these high-energy protons and nuclei might have a surprising influence on the length of Earth's day. The team claims that a previously noticed relationship between fluctuations in the length of day and the 11-year solar cycle is actually caused by cosmic rays.

One of the team members, Vincent Courtillot of the Institute of Geophysics of Paris, says they examined the length of day -- as defined by the speed of Earth's rotation in a reference frame fixed with respect to the stars -- using a series of daily values over a 40-year period. They claim that up to 30 percent of its changes can be directly related to the 11-year sunspot cycle. Of course, 30 percent of that change only amounts to a few tenths of a millisecond, so you'd never actually notice it, but what's more compelling (read: 'very highly controversial') is the potential for cosmic rays to have such a profound effect.

Courtillot and his colleagues have been among those championing a radical theory that cosmic rays can impact the formation of clouds and in turn, play a major part in climate changes. But how could cosmic rays possibly change the speed of our planet's rotation?

Here's how Courtillot explained it to me in an email:

"The causal chain is the following: the changes in Earth rotation are simply reflecting the changes in angular momentum of the Earth's atmosphere, more precisely the integral of zonal winds. And it has been suggested that cosmic rays influence cloud condensation nuclei formation. If you change the cloud cover by say 10-percent, you change the amount of energy reflected by cloud tops by 8 Watts per square meter, which is very significant in the Earth's radiative budget. So this is the suggested link: cosmic rays affect cloud cover, which affects the atmosphere's energy budget, which may alter the wind speeds and organization, which changes the Earth's angular momentum hence (length of day)."

It may sound like a reach (and correlations don't prove causation), but other physicists have claimed that the sun's magnetic field can beat back cosmic rays and slow the rate at which they reach Earth. So when solar activity decreases, there's less to deflect the cosmic rays and they reach Earth in greater numbers, potentially leading to a change in winds substantial enough to affect Earth's angular momentum.

Friday, August 13, 2010

WASHINGTON (ISNS) -- Astronomers announced Thursday the discovery of a new star, found with help from a most unusual source -- a screen saver.

Chris and Helen Colvin, owners of the personal computer running the screen saver, are participants in a project called Einstein@home, an experiment in distributed computing that uses donated idle time from hundreds of thousands of home computers across the globe in lieu of more expensive supercomputers. The June 11 discovery, made in Ames, Iowa, of a pulsar -- a dense, rotating star that appears to pulse like a lighthouse beacon -- was confirmed on June 14 by another user's computer in Germany. It marks the first time an astronomical body has been discovered this way.

"We've both been users since the beginning, but it was his desktop that got the golden packet," said Helen Colvin, who has a doctorate in human-computer interaction and attributes much of the find to luck. "It's just something that runs in the background and we don't think about it very much." Both Helen and her husband Chris work as computer professionals for Wells Fargo.

The research team hopes to use the donated time from these computers to eventually find the gravitational waves which were predicted by Albert Einstein's theory of relativity but have never been directly detected. However, because of the huge number of volunteer contributors, the program also processes data from the Arecibo radio telescope in Puerto Rico looking for possible pulsars -- which are what's left after a massive star uses up its nuclear fuel and collapses under its own immense weight.

"Volunteer computing takes advantage of the fact that there are more than a billion computers around the world," said Einstein@home director Bruce Allen. "The collaborative computing power of these computers is substantially greater than those from a supercomputer."

With more than 50 distributed computing projects in existence and dozens more in development, the technology has become an increasingly popular way for scientists to process vast amounts of data on the cheap. Einstein@home takes data from Arecibo and LIGO and splits it up into individual work units to be processed by computers around the world. In this case, the data was actually collected in February of 2007 by Arecibo and sent out 157 times to volunteers before the pulsar was found hidden inside of it.

Other distributed computing projects have been able to create simulations of physical systems using volunteered computers in this way, but Einstein@home claims this is the first time anyone has made a physical discovery with it.

"There has been a lot of interesting research done by people like Rosetta@home and they have been finding a lot of things," said Michela Taufer, project lead for another distributed computing project called Docking@home. Taufer added that while other projects may not have found anything new, the ability to model what you see happening in a physical experiment is important in itself. "You start with trying to validate your simulation results and you try to confirm what you see in the lab. So, it's a sort of validation and discovery."

While the pulsar found was a somewhat rare type that rotates multiple times a second, discovering a pulsar is nothing new in astronomy. But the team said the Colvins' discovery, published in this week's Science, not only demonstrates the power of distributed computing but also helps the search for gravitational waves -- which scientists believe may be emitted by pulsars.

"We built it (the computer) ourselves with parts from a local computer shop, and anytime a part breaks we'll replace it with another," said Chris Colvin, who is a computer systems architect. "It's just one PC that's on a few hours a day. I'm sure there are a lot more people out there doing more than that."

The Colvins said they've been donating time on their computers since they were first dating as undergraduates at Iowa State and living in the dorms. Helen was a computer engineering major, and many of the other students in her all-female dorm were also science and engineering students. Many of their friends were running another distributed computing program called SETI@home, which uses volunteer computers to search the skies for radio signals from alien civilizations. After seeing a story about Einstein@home on the popular tech news website Slashdot, the couple volunteered their computers to that effort as well.

The team says it now expects the rate of discoveries to increase, thanks to software upgrades made in March of this year and the publicity they expect from this find.

Though Allen doesn't expect any gravitational waves to be detected by Einstein@home for at least another 5 to 10 years, he says the project plans to continue processing Arecibo data to allow the public to make more discoveries. The team also announced at their press conference that another discovery has already been made on computers in the U.K. and Russia, but the group plans to withhold the details until the finders of this golden ticket have been informed.

By Eric Betz, ISNS Contributor, Inside Science News Service

Note: Einstein@Home was a project of the World Year of Physics, which was partially funded by the American Physical Society and the American Institute of Physics, publishers of Inside Science News Service.

Thursday, August 12, 2010

Legislation prepares for electrical emergencies caused by events outside our control

WASHINGTON (ISNS) -- Protection plans for the nation's complex electrical grid, passed by lawmakers in July, address emergency events that are outside our control.

Electricity is all around us. It lifts elevators, pumps gas, lights rooms, cooks food, and even powers a growing fleet of cars. We generally take the vast electric grid for granted until it turns off. Only then do we realize how important it is. Blackouts owing to technical foul-ups are bad enough, but new hazards, some malicious and some from nature, threaten to create electrical disturbances on an unprecedented scale.

Legislation that Congress passed in July aims to strengthen the grid's robustness against attacks of many kinds. The immediate aim of the Grid Reliability and Infrastructure Defense Act is to direct the Federal Energy Regulatory Commission, the main federal agency responsible for electricity matters, to establish security rules for utilities and other energy companies.

The GRID Act amends the old power law by recognizing several threats to the grid. One of these is an attack that tampers with grid computer control systems. Some utilities report fending off thousands of such cyber-attacks per day. Another is infrequent but potent geomagnetic storms, which can happen when eruptions of material from the sun send cascades of particles into Earth's atmosphere. These particles can cause beautiful auroral displays ("northern lights"), but can burn out the wiring in orbiting satellites and induce short-lived but large voltage surges in grid equipment on the ground. Past such storms have burned out expensive equipment and left millions in the dark. A carefully detonated nuclear bomb could emit radiation pulses that could do some of the same damage.

"The electric grid's vulnerability to cyber and other attacks is one of the single greatest threats to our national security," said Rep. Edward Markey, D-Mass., chairman of the Energy and Environment subcommittee and one of the sponsors of the bill. "Every one of our nation’s critical systems -- defense, water, healthcare, telecommunications, transportation, law enforcement, and financial services -- depends on the grid. This bipartisan legislation is critical to protecting the United States against this emerging threat."

One of the chief fears addressed by the GRID Act is that a major power outage might be long-lasting, especially if critical components were affected. Even "a small disruption in the power supply can wreak havoc on our economy, while an extended blackout of months would be catastrophic," said Rep. Fred Upton, R- Mich., another sponsor of the bill.

The GRID Act stipulates that energy companies take more precautions to guard against the highlighted threats. This would include having more spare parts on hand to deal with breakdowns. Transformers, the bulky devices that change electricity from one voltage to another, are particularly vulnerable to disturbances. Companies might pool their resources, and if necessary pass along the cost of extra equipment directly to consumers.

The act also creates a category of "protected" technology security information that is exempt from the Freedom of Information Act, the better to foil those who would plan terror attacks on the grid.

Wednesday, August 11, 2010

Somebody is going to have to break the news to Darth Vader: his Death Star's planet-destroying potential is going to be way behind schedule. Research scheduled to be published in an upcoming issue of the journal Physical Review Letters shows that lasers are already close to reaching their maximum possible intensity, and that the next generation of lasers currently being developed might reach that limit.

In 1997, an experiment at SLAC sent 47-billion-electron-volt electrons down its two-mile-long accelerator and collided them with a one-trillion-watt green laser to create a monstrous electromagnetic field. When the electrons struck the laser's photons, they produced higher-energy gamma-ray photons; these gamma rays then collided with photons in the laser beam again and shattered the vacuum as matter was spontaneously created from light within the experiment.

Creating light from matter is rather ordinary physics, as nuclear explosions show. But the SLAC experiment was the first to produce the opposite, and while the effect had been expected for some 50 years, the equipment to test it experimentally hadn't existed. Physicists call it creating a "spark in the vacuum": when the electromagnetic field has enough energy, light becomes matter as an electron-positron pair is produced.

The SLAC experiment produced just a single such event, but as lasers reach higher intensities the electric fields they produce will grow as well, and the team says that at a critical intensity a cascade effect will set in. The laser field itself accelerates the electron-positron pair to such high energies that the particles emit photons capable of spawning new pairs, continuing the process. In fact, their estimates indicate that even a single such pair can completely destroy the laser field, since the energy drawn off by the pairs can equal that of the laser.

"At high laser intensities interaction of the created electron and positron with the laser field can lead to production of multiple new particles and thus to formation of an avalanche-like electromagnetic cascade."

Physicists have suspected this cascade effect could limit the intensity of lasers, but these new calculations show that the effect will likely be seen in lasers already being built, like the European projects ELI (Extreme Light Infrastructure) and XFEL (X-ray Free-Electron Laser).
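To get a feel for why such a runaway cascade would cap laser intensity, here's a back-of-the-envelope sketch. The numbers (a 10-joule pulse, roughly 1 GeV carried per pair, one pair-doubling per optical cycle) are illustrative assumptions, not figures from the paper:

```python
import math

# Illustrative assumptions (not from the paper):
PULSE_ENERGY_J = 10.0      # energy in a petawatt-class laser pulse
ENERGY_PER_PAIR_EV = 1e9   # ~1 GeV carried by each electron-positron pair
WAVELENGTH_M = 1e-6        # near-infrared laser light
EV_TO_J = 1.602e-19
C = 3e8                    # speed of light, m/s

# How many pairs would it take to soak up the whole pulse?
pairs_to_drain = PULSE_ENERGY_J / (ENERGY_PER_PAIR_EV * EV_TO_J)

# If the pair population doubles once per optical cycle,
# how many cycles pass before the cascade drains the pulse?
doublings = math.log2(pairs_to_drain)
cycle_s = WAVELENGTH_M / C            # one optical period, ~3.3 femtoseconds
drain_time_s = doublings * cycle_s

print(f"pairs needed to drain pulse: {pairs_to_drain:.1e}")
print(f"doublings required:          {doublings:.0f}")
print(f"time to drain pulse:         {drain_time_s * 1e15:.0f} fs")
```

Under these toy numbers a single seed pair multiplies to tens of billions of pairs in on the order of a hundred femtoseconds, shorter than many high-power pulses, which is the sense in which one pair can "completely destroy the laser field."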

Tuesday, August 10, 2010

The unstoppable Hubble Space Telescope, which celebrated its 20th anniversary in April, released a new image today titled "Island Universe." The 28-hour exposure combined data gathered in 2006, 2007, and 2009 of a spiral galaxy some 320 million light-years away in the Coma Cluster. The cluster is home to one of the densest populations of galaxies in our cosmic neighborhood, and this close proximity means the galaxies often interact violently. The long, wispy formations seen along the arms of the main galaxy are the result of the nearby galaxy in the upper right stripping material off as the two pass perilously close to each other. Click on the image for the high-resolution version.

From the NASA press release:

"The galaxy, known as NGC 4911, contains rich lanes of dust and gas near its center. These are silhouetted against glowing newborn star clusters and iridescent pink clouds of hydrogen, the existence of which indicates ongoing star formation. Hubble has also captured the outer spiral arms of NGC 4911, along with thousands of other galaxies of varying sizes. The high resolution of Hubble's cameras, paired with considerably long exposures, made it possible to observe these faint details."

Monday, August 09, 2010

When an ultra-high-energy cosmic ray enters the upper atmosphere, it splinters into an elaborate shower of billions of secondary particles that head for Earth's surface. While ordinary cosmic rays are relatively common, astrophysicists regard these ultra-high-energy cosmic rays as the most elusive of particles because of their rarity and mysterious origins.

They strike so infrequently that one will hit any given square mile only about once a century. To capture them, the Pierre Auger Observatory distributed 1,600 5' x 12' plastic water tanks across a 3,000-square-kilometer section of Argentina near the Andes. When a shower particle crosses a tank, it is moving faster than light travels in water, creating a shock analogous to a supersonic jet's sonic boom, except with light. The result is a flash of light in the water, called Cherenkov radiation, that the detectors pick up.
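Putting rough numbers on both effects, using the standard refractive index of water (n ≈ 1.33) and the article's one-strike-per-square-mile-per-century rate:

```python
import math

N_WATER = 1.33   # refractive index of water
C = 3e8          # speed of light in vacuum, m/s

# Light slows to c/n in water, so a near-light-speed particle outruns it.
light_speed_in_water = C / N_WATER   # ~2.26e8 m/s, about 0.75c

# Cherenkov cone angle for an ultra-relativistic particle (beta ~ 1):
# cos(theta) = 1 / (n * beta)
theta_deg = math.degrees(math.acos(1.0 / N_WATER))   # ~41 degrees

# Expected event rate over the whole array:
# one strike per square mile per century, across 3,000 km^2
SQ_MILES_PER_KM2 = 1.0 / 2.59
area_sq_miles = 3000 * SQ_MILES_PER_KM2    # ~1,160 square miles
events_per_year = area_sq_miles / 100.0    # strikes per year

print(f"light speed in water:  {light_speed_in_water:.2e} m/s")
print(f"Cherenkov cone angle:  {theta_deg:.1f} degrees")
print(f"expected events/year:  {events_per_year:.1f}")
```

So even at one hit per square mile per century, covering 3,000 square kilometers buys the observatory on the order of a dozen of these rare events a year.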
Supermassive black holes at the centers of galaxies are among the suspected origins of ultra-high-energy cosmic rays; other proposed sources are more exotic, such as dark matter or even extra dimensions. But earlier this year the physicists at the Pierre Auger Observatory uncovered what they now say is a source much closer to home.

The detectors had been finding that many of these ultra-high-energy particles are nuclei, not protons as had been previously thought. Nuclei are much more fragile than protons and tend to disintegrate into protons as they travel through space, so the discovery was quite unexpected. But in a paper due to be published in Physical Review Letters, the researchers describe the surprising reason for the disparity: cosmic accelerators in our own galaxy.

The resolution came when the group realized that while stars exploding in the Milky Way accelerate nuclei and protons alike, the protons escape our galaxy almost immediately, whereas the much heavier nuclei get stuck in its magnetic field. As a result there are more nuclei than protons in our vicinity, and the Pierre Auger Observatory detects them in much greater numbers.
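A rough gyroradius estimate shows why the two species behave so differently in the galactic field. The particle energy (10^18 eV) and field strength (~3 microgauss) below are illustrative assumptions, not figures from the article:

```python
E_EV = 1e18          # assumed particle energy, eV
B_T = 3e-10          # assumed galactic magnetic field, ~3 microgauss, in tesla
E_CHARGE = 1.602e-19 # elementary charge, C
C = 3e8              # speed of light, m/s
PARSEC_M = 3.086e16  # one parsec in meters

def gyroradius_pc(energy_ev, charge_number, b_tesla):
    """Gyroradius r = E / (Z e c B) for an ultra-relativistic particle."""
    energy_j = energy_ev * E_CHARGE
    return energy_j / (charge_number * E_CHARGE * C * b_tesla) / PARSEC_M

r_proton = gyroradius_pc(E_EV, 1, B_T)   # Z = 1 for a proton
r_iron = gyroradius_pc(E_EV, 26, B_T)    # Z = 26 for an iron nucleus

print(f"proton gyroradius: {r_proton:.0f} pc")
print(f"iron gyroradius:   {r_iron:.0f} pc")
```

At these assumed values the proton's gyroradius comes out to a few hundred parsecs, comparable to the thickness of the galactic disk, so it leaks out quickly; an iron nucleus of the same energy, carrying 26 times the charge, turns in a circle 26 times tighter and stays trapped in the galaxy's magnetic web.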

While people have observed the sorts of stellar explosions that would create these high-energy particles in other galaxies, the group says that their discovery shows these explosions happen in our galaxy at least a few times every million years. They also say that the high energy nuclei they see now have been knocking around the galaxy for perhaps millions of years, trapped in a web of galactic magnetic fields.

While the research doesn't rule out all the other exotic sources of high-energy cosmic rays, it does explain the overabundance of nuclei among the cosmic rays detected on Earth.

Here's a great video by the University of Chicago on the observatory and its research:

Friday, August 06, 2010

Sixty-five years ago, a blinding flash of light leveled the city of Hiroshima, Japan. The United States dropped the atomic bomb, dubbed "Little Boy," over the city's downtown. Three days later, the United States dropped "Fat Man" on the outskirts of Nagasaki. Japan surrendered six days later, ending the most devastating war the world had ever seen.

Hiroshima was completely destroyed. Before-and-after photos show that the buildings in the city were almost totally razed. Over 70,000 people died the first day, with roughly 70,000 more perishing from radiation poisoning in the following months.

But why was Hiroshima chosen in the first place? Its strategic importance was only part of the reason. There were several military camps around the city, including the headquarters of the Fifth Division and the 2nd General Army. Yet up to that point, the U.S. Army Air Forces hadn't deemed the city important enough to bomb in the preceding months of the war.

It sounds twisted, but in a way the city was chosen to be a massive science experiment. As far back as May, the bomb planning committee, which included Manhattan Project leader Robert Oppenheimer, picked Hiroshima as one of five potential sites precisely because of its pristine state, so that the bomb's effects on a metropolitan area could be cleanly studied. Dr. Joyce C. Stearns, who was in charge of recommending targets to the air force, looked specifically for cities untouched by Allied bombing runs.

Hiroshima Before the Bomb

Hiroshima After the Bomb

Before Little Boy was dropped, the only nuclear explosion the world had ever seen was the Trinity test in the middle of the New Mexico desert. The test showed the bomb worked, and showcased its devastating effects over the desolate sands of the American Southwest. (To this day, large sections of ground are still covered in "Trinitite," a kind of glass formed when the intense heat and pressure of that first mushroom cloud fused the desert sand.)

Nuclear bombs were a completely new weapon, and no one knew for sure what devastating effects they possessed. The scientists who created the bomb knew the explosion would be huge and that radiation would likely spread through the surrounding area, though how much was anything but certain.

After the bombs fell, Japanese and U.S. researchers descended upon Hiroshima and Nagasaki to study the aftereffects. These researchers examined everything from the blast patterns of the bombs to the effects of radiation on human tissue. Much of what is known today about nuclear fallout came from studies of the survivors of Hiroshima and Nagasaki.

The city of Hiroshima has long since been rebuilt, but it still bears the marks of the bombing. Though almost the entire city was leveled, the ruins of the Hiroshima Prefectural Industrial Promotion Hall, located almost directly below the detonation, remained standing. Its ghostly shell still stands amidst the city today as a reminder and a warning of the awesome power of nuclear weapons. Renamed the Hiroshima Peace Memorial, or the Genbaku Dome, it is now part of the Hiroshima Peace Memorial Park.

Wednesday, August 04, 2010

After a recent trip to the Air and Space Museum had me laughing hysterically at the shark repellent displayed in a U2 pilot survival kit, I did the natural thing and googled it. I was seriously surprised by what I found. So, in honor of Shark Week - the annual holiday as declared by Discovery Communications - I wanted to harp on some seriously suspect physics I noticed while looking up shark repellent bat sprays.

Two separate devices are being sold to this effect, and from the looks of it, people are really buying. The first, called Aquashield, uses physics that seems about as likely to save you from a great white as a magnetic bracelet is to save you from Magnetic Field Deficiency Syndrome. And as near as I can tell from their Wikipedia regurgitation of irrelevant facts and the following email correspondence, it is a magnet.

Our inquiring email to Aquashield:

Our Question:

"So, it's a magnet?"

Aquashield Answer:

"Thank you for your interest in the Aquashield unit.

Yes you are correct, the unit utilizes integrated magnetic technology as part of the process to achieve this excited state of electrons. The flow through ports allow for the ionization process to take place. The field arrangement is such that the exothermic polymer allows for a stabilized field state so this field structure is never compromised. There are 24 sequential magnetic points that deliver this field state.

Hope this helps"

The magnet belt-buckle does have a pretty sweet logo on it, and if I were a hipster I might procure one just to put all other hipster belt-buckles to shame. It's funny at first, and then you realize people actually think this makes them safe. Snake oil is one thing, but if you sell someone snake oil, the risk of it not working doesn't involve being eaten by a snake.

The other similarly named device, called Shark Shield, is at least more active, though $500 more expensive. This one supposedly works by sending out an electromagnetic wave that causes uncontrollable muscle spasms in any shark in your area.

According to Shark Shield's website:

Shark Shield is a three-dimensional electrical wave form which creates an unpleasant sensation impacting the shark’s ‘Ampullae of Lorenzini’. When the shark comes into proximity of the electrical wave form (around 8 meters in diameter) it experiences non-damaging but uncontrollable muscular spasms causing it to flee the area.

The field is projected from the unit by two electrodes, which create an elliptical field that surrounds the user. Both electrodes must be immersed in the water for the field to be created. The electrode configuration depends on the model of the Shark Shield unit.

From testing, the closer the shark is to the Shark Shield field, the more spasms occur in the sharks’ snouts. This becomes intolerable and the shark then veers away, and usually doesn’t return.

Sadly, a person was actually killed while wearing the device several years ago, according to this article. The same article also reports that a raft in South Africa was moving along with a Shark Shield attached to it when a 12-foot shark came up and ate the device.

One interesting thing about this though, is that sharks really do have the ability to detect electrical impulses. Sharks - as well as stingrays and some other aquatic creatures - have a number of electroreceptors in pores on their heads called ampullae of Lorenzini. This is how they hunt and possibly how they can navigate such large distances.

Does this translate to a shark being repelled by a magnet or an EM field? Neither company presents very compelling evidence. Shark Shield is supposed to cause uncontrollable muscle spasms in sharks, but in none of the videos posted on their website did I see that happen. Just because you survived a shark swimming near you doesn't mean the device worked. People swim by sharks all the time. They rarely attack.

However, one good thing about working in a building full of physicists is that you always have materials on hand for experiments -- yes, even bountiful strong magnets and sharks. I think our experiment is about at the level of their science, but I'll let you be the judge. Clearly the results show that a Bala shark would be no less likely to eat you if you were wearing magnets. If the maker of either of these products cares to post any sort of evidence for the validity of their products, I more than welcome it.

Physics Buzz: We put magnets in a fish tank with our boss' Bala Sharks, you decide.

Tuesday, August 03, 2010

Rumors of Spirit's demise have been greatly exaggerated for years; it seems every six months or so we see one of these stories. Unfortunately, this time it looks like NASA is for real. In the continuing saga of the little rover that could, NASA announced Saturday (to much press) that Spirit may never call home again from the surface of Mars. The story pretty much blanketed the news this weekend, but I think we're all a little tone-deaf after seeing the same headline for so many years. I also think NASA was a little insensitive to poor Spirit yesterday by announcing its plans for new and improved rovers. So to remind us all of the scope of the tragedy, I thought I'd dig out my favorite xkcd.

Moving on to bigger and better rover plans, NASA announced in a press release yesterday that it had selected the new instruments for its upcoming joint Mars mission with ESA. The cooperative is a three mission series that will play out over the coming decade, as the pair hunt for traces of life on the surface of the Red Planet.

Included in the plans is the 2016 ExoMars orbiter and lander mission, specifically designed to map the planet's methane sources and help pick a landing site for the 2018 ExoMars rover that follows. The rover would have a drilling capability so it could scour its immediate area and try to determine whether the planet's methane is a result of life. The ExoMars rover would then store select (potentially life-containing) samples, which the final mission in the series would retrieve and return to Earth sometime in "the 2020s." That would mark the first time humans have returned a sample from another planet.

Of course, the selected U.S. institutions are no surprise, but details on the instruments are still scarce. If anyone has seen the proposals for these, please do share; while I can guess at most of them from existing technologies, I failed to find any good links or in-depth descriptions for most of the instruments.

ExoMars Climate Sounder (link is for the current MRO instrument) -- An infrared radiometer that provides daily global data on dust, water vapor and other materials to provide the context for data analysis from the spectrometers: John Schofield, NASA's Jet Propulsion Laboratory (JPL), Pasadena, Calif.

High Resolution Stereo Color Imager -- A camera that provides four-color stereo imaging at a resolution of two million pixels over an 8.5 km swath: Alfred McEwen, University of Arizona.

Mars Atmospheric Global Imaging Experiment -- A wide-angle, multi-spectral camera to provide global images of Mars in support of the other instruments: Bruce Cantor, Malin Space Science Systems, San Diego, Calif.

Monday, August 02, 2010

Accounting for well-to-wheel efficiency in electric cars: Do electrics have lower emission levels than hybrids?

WASHINGTON (ISNS) -- As the Senate struggles with energy legislation this week, one of the few fixes with bipartisan support is a bill that would invest billions in putting electric-powered cars and trucks on the road. But it's not clear whether doing so would be environmentally beneficial. That debate has played out in an open conflict between electric vehicle proponents, whose proposals would be implemented in the bill, and auto industry executives pushing for funding of alternative technologies.

The measure, as approved by the Committee on Energy and Natural Resources, would provide an additional $3.6 billion for electric vehicles if passed by the full Senate and put into effect several proposals in the Electrification Coalition's roadmap, including $1.5 billion to lower battery costs and help link the vehicles to the electric grid. Also in the bill is $2 billion in funds to put 400,000 electric cars on the road in the next three years, plus funds to develop specific communities in a few regions throughout the country that will rely on electric cars. It also creates a $10 million prize for the first commercially viable battery with at least a 500-mile range.

"Republicans and Democrats agree that electrifying our cars and trucks is the single best way to reduce our dependence on oil," said Sen. Lamar Alexander (R-TN) in a recent statement. “Our goal should be to electrify half our cars and trucks within 20 years, which would reduce our dependence on petroleum products by about a third.”

While nearly every major auto manufacturer in the world plans to debut an electric vehicle in the next two years, scientists are divided on their estimates of the electric car's impact on the environment. Industry scientists have argued that an electric car is only as clean as the power plant it's plugged into, while proponents of electrics -- including Electrification Coalition member and FedEx CEO Fred Smith -- argue they produce lower greenhouse gas emissions than a conventional hybrid even when the power source is a dirty coal-fired plant.

Conflicting Studies

"Until we significantly alter how we produce electricity in our nation," Kathryn Clay, director of research at the industry group Auto Alliance said in Senate hearings on the bill, "including upstream emissions in the vehicle greenhouse gas standards will mean that electric vehicles will rate only marginally better than conventional internal combustion engines and comparatively worse than the conventional hybrids we have on the road today."

A study by the Sloan Automotive Laboratory at the Massachusetts Institute of Technology in Cambridge, funded by Ford, found that electric vehicles plugged into nuclear or renewable sources would result in drastic reductions in emissions; however, vehicles powered by electricity from coal plants would have larger carbon footprints than conventional automobiles. In June hearings on vehicle electrification legislation, the Auto Alliance stated it did not support the bill because it believed the government was unfairly favoring one technology over others. The Alliance represents most major car companies in the world with the exception of Nissan, Honda and Hyundai. Of the group's member companies, only GM received relatively substantial electric vehicle funds from the Recovery Act.

Auto Alliance members are also worried that emission standards on electric cars will leave auto makers uniquely responsible for upstream emissions from power plants -- a source which they have no control over.

"Including upstream emissions creates a huge disincentive for producing electric vehicles versus less costly and less game changing technology," said Clay.

Many non-industry researchers claim that there is a net drop in greenhouse gas emissions no matter what the power source is. Studies by the Natural Resources Defense Council and the Electric Power Research Institute found that plug-in hybrid electric vehicles -- even those plugged into a dirty coal-fired plant -- would offer dramatic reductions in greenhouse gas emissions. And a Tesla Motors analysis found that even when considering the average sources of electricity in the United States, its fully electric Roadster is significantly more efficient than the Toyota Prius or other hybrids.

"Our studies would indicate that plug-in electric vehicles, even if powered by coal power plants that have not been modified to clean up the emissions … produce significantly less CO2 emissions than conventionally powered vehicles," said Smith.

As an additional benefit, the Electrification Coalition says it would be easier to regulate emissions from a few power plants than the hundreds of millions of cars on the road. And the cars will only become more efficient with time as the grid shifts towards renewable sources of electricity.

EPRI's report, which the Electrification Coalition relies on in its claims, says that previous studies have relied on limited information from the electricity and transportation industries. "We stand by our study with the NRDC, … that was the bellwether study," said the group's media relations manager Clay Perry. "We examined all the power sources throughout the country and those went into the study. We had access to a lot of data."

The administration and many politicians on both sides of the aisle also see the electrification of vehicles as a step toward reducing greenhouse gas emissions and as a path to recovery for a nation addicted to foreign oil. The United States currently spends $380 billion a year on imported oil -- 70 percent of which is used for transportation -- and President Obama hopes to reduce that number by increasing the number of electric cars on the road from essentially none to one million in the next few years. The Recovery Act has already invested more than $5 billion in electric vehicles, with half of that going to loans to the Nissan, Tesla and Fisker motor companies.

"This is an enormous national security problem," said Smith, "we have two shooting wars going on and there's no question at least in part they were precipitated by our dependence on imported foreign petroleum."

China, Denmark and Israel are among the countries that have also chosen to focus on electric cars, and that has many concerned China will gain an edge on the U.S. in the market.

A study released in May showed that 60 percent of Chinese citizens would be willing to buy an electric car -- five times the share of Americans who said they were ready to convert in the same study -- and China has already made significant investments in electric vehicle technology and infrastructure. The country produces 20 million electric scooters a year and plans to shift that manufacturing base to cars in the coming years, in part due to the success of its own trial electric-car communities program, which has already nearly doubled in size, growing from 13 to 22 cities.

The Alternative: Fuel Cells

While the United States has shifted its focus from fuel cell vehicles to electrics under the leadership of the current administration, other countries like Japan, Germany and South Korea have ramped up their efforts to produce fuel cell technology.

Fuel cells -- which use hydrogen to produce electricity, releasing water and heat as by-products -- are not widely considered ready for prime time, and a production-model car would cost around $1 million. Fuel cell vehicles would also require a hydrogen fuel infrastructure that doesn't yet exist, whereas the electric infrastructure is already in place and needs only to work out certain accessibility problems. The Senate believes it has addressed those problems in the current bill, and legislators have been quick to point out that most people would charge their vehicles at night during off-peak hours, making it much cheaper.

However, many engineers and policy makers in the United States still argue that fuel cell vehicles provide a better solution to reducing greenhouse gases without being limited by the short range of battery power and say the U.S. will be left behind in the long-term by focusing on electrics. The administration went as far as to cut funding for the technology in its last two annual budgets. These funds were eventually restored in the Senate last year, but the Auto Alliance would like to see the proposed electric vehicles bill include funding for fuel cell research.

"Trying to prejudge the market brings tremendous risks, and the problem is compounded if we make just a few large bets," said Clay.

For the time being, the Senate is showing some agreement with the Obama administration by focusing on battery-electric vehicles in the short term, though it continues to fund fuel cell research as well. Sen. Byron Dorgan (D-N.D.), who co-sponsored the bill, believes fuel cells will be important in the future but thinks electrification is the solution in the near term.

"Last year the administration cut out $190-million of hydrogen fuel cell research that's going on, I put it all back in," said Dorgan. "Hydrogen and fuel cells are important, but that is not the rapid deployment, the near term deployment is electric vehicles."