We sat down with Brookhaven theoretical physicist Raju Venugopalan for a conversation about “color glass condensate” and the structure of visible matter in the universe.

Q. We’ve heard a lot recently about a “new form of matter” possibly seen at the Large Hadron Collider (LHC) in Europe — a state of saturated gluons called “color glass condensate.” Brookhaven Lab, and you in particular, have a long history with this idea. Can you tell me a bit about that history?

A. The idea for the color glass condensate arose to help us understand heavy ion collisions at our own collider here at Brookhaven, the Relativistic Heavy Ion Collider (RHIC)—even before RHIC turned on in 2000, and long before the LHC was built. These machines are designed to look at the most fundamental constituents of matter and the forces through which they interact—the same kinds of studies that a century ago led to huge advances in our understanding of electrons and magnetism. Only now instead of studying the behavior of the electrons that surround atomic nuclei, we are probing the subatomic particles that make up the nuclei themselves, and studying how they interact via nature’s strongest force to “give shape” to the universe today.

We do that by colliding nuclei at very high energies to recreate the conditions of the early universe so we can study these particles and their interactions under the most extreme conditions. But when you collide two nuclei and produce matter at RHIC, and also at the LHC, you have to think about the matter that makes up the nuclei you are colliding. What is the structure of nuclei before they collide?

We all know the nuclei are made of protons and neutrons, and those are each made of quarks and gluons. There were hints in data from the HERA collider in Germany and other experiments that the number of gluons increases dramatically as you accelerate particles to high energy. Nuclear physics theorists predicted that the ions accelerated to near the speed of light at RHIC (and later at LHC) would reach an upper limit of gluon concentration—a state of gluon saturation we call color glass condensate.* The collision of these super-dense gluon force fields is what produces the matter at RHIC, so learning more about this state would help us understand how the matter is created in the collisions. The theory we developed to describe the color glass condensate also allowed us to make calculations and predictions we could test with experiments.

Q. Have we seen hints that this color glass condensate exists at RHIC?

A. The very first experimental hints of color glass condensate came from early collisions of gold ions at RHIC in 2000 and, more significantly, later from collisions of light deuterium ions with the heavier gold ions. The precursor for the LHC phenomenon was seen around 2006 by scientists from RHIC’s STAR collaboration, and subsequently by PHENIX and PHOBOS. They all saw signs that particles streaming out of the collisions were correlated in an interesting and surprising way that showed up as a little bump on the graph—which we called a “ridge” because it looked like a mountain ridge. RHIC and LHC scientists now use sophisticated analyses to break down this signal into subtle wiggles of varying strengths, which can be further analyzed.

Key aspects of the wiggles in particle correlations could be explained by the “flow” of the hot dense matter produced when the ions collide—which we now know is a liquid-like plasma of quarks and gluons. But the surprising correlations also carried important information about the very earliest stages of matter formation, telling us how gluons inside the colliding nuclei were creating this matter in the first place. The experimental information was consistent with the structures being generated by very strong gluon force fields at very short distances within the colliding nuclei—distances predicted by gluon saturation to be much smaller than the size of a proton.

The other strong piece of evidence for color glass condensate and gluon saturation we alluded to came from deuteron-gold collisions at RHIC in 2003, which do not create quark-gluon plasma. Certain particles streaming out in the “forward” direction, which the BRAHMS experiment was particularly designed to detect, were suppressed. That is, fewer particles with a given momentum were coming out at this particular angle than had been expected. It appeared that instead of the deuteron colliding and interacting with individual protons or neutrons in the gold nucleus, the smaller particle was hitting a bunch of protons simultaneously—or a dense field of gluons that acts like sticky molasses, making it harder for particles with a given momentum to be produced. PHOBOS, STAR and PHENIX also saw similar suppressions. This was a genuine prediction of the color glass condensate picture. Further experiments at RHIC by STAR and PHENIX during the 2008 deuteron-gold run drew out more details on particle correlation patterns predicted by the CGC theory.

Q. Do the “ridge” correlations have any significance aside from being possible indications of gluon saturation?

A. All the extra wiggles give you much more information about the structure of the flow—similar to the way astronomers have learned how subtle fluctuations in the cosmic microwave background radiation have left their fingerprints on the structure of the universe today. So the discovery of the “ridge effect” at RHIC led us to understand how the details of the initial conditions could lead to detailed variations in the flow of the matter produced at both RHIC and the LHC. To understand the properties of the quark-gluon plasma, we need to understand the initial conditions in detail.

In addition, the ridge may be imaging how strong force field lines behave between color charges, just as the distribution of iron filings around a magnet tells us about the magnetic flux around a magnet. If this analogy is borne out, that would be quite fundamental information about the strong force.

Q. How did these findings affect the development of theory?

A. There were still other possible explanations. That’s how science works. You need to accumulate more and more evidence. With each experimental finding, your model gets tested and refined. The next piece of data can fracture it. Some people tried to disprove the model; others did a lot of work to refine the theory and make it stronger. So we made some predictions about what we might see in future experiments, both at RHIC and the LHC. Thus far the heavy ion results from the LHC are consistent with our expectations.

Then, physicists from the CMS collaboration at the LHC, several of whom had worked on RHIC’s PHOBOS detector, used their experience with RHIC collisions to look for the same thing in proton-proton collisions at LHC—at 14 times the energy of the highest-energy proton-proton collisions at RHIC. Because of the higher energy level, they were able to look at extremely rare events where more than 110 charged particles come out in a single collision of two protons. By picking the events with lots of particles, they are essentially choosing the events where gluons are at their highest concentration in the colliding protons. In these rare events, they saw a tiny ridge, just like the one in gold-gold collisions at RHIC.
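As a toy illustration of that selection step (the multiplicities below are made up, not CMS data), picking out the rare high-multiplicity events is just a filter on the charged-particle count:

```python
# Hypothetical charged-particle counts for a handful of proton-proton
# events; the >110 threshold is the one quoted in the interview.
events = [23, 48, 112, 9, 131, 75, 118, 42]

# Keep only the rare, high-multiplicity collisions, the regime where
# gluon concentrations in the colliding protons are highest.
high_multiplicity = [n for n in events if n > 110]
print(high_multiplicity)  # [112, 131, 118]
```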

We spent half a year trying to understand this. We had developed this theory to predict and explain the ridge, but we thought it would only be observed in heavy ion collisions. Yet one of the theorists had predicted it would be there in proton-proton collisions at the LHC, and there it was. It couldn’t be due to quark-gluon plasma, because you don’t have a big enough system with proton-proton collisions. It had to be caused by gluon saturation, not flow of QGP. When we looked in detail at the LHC data, we were able to explain how these effects would change with various conditions, and we were able to explain things more quantitatively.

Then, when we knew they were going to be doing a very short preliminary run of proton-lead collisions at LHC in late 2012, we made some predictions about what would be seen there.

Q. What did the LHC proton-lead experiments observe? Did these data match your predictions?

A. So the LHC did these proton-lead collisions in a pilot run for just four hours, but they got an immense amount of data—sufficient to see something really dramatic. They observed the same ridge effect we had seen in gold-gold collisions at RHIC. These collisions were at much higher energy—25 times the energy of the deuteron-gold collisions at RHIC. So you get more of a gluon shockwave and a lot more particles coming out. There was enough data to do much more stringent tests of the idea of gluon saturation by looking for these correlations.

In proton-lead collisions they saw a bigger signal than in proton-proton collisions—about six times larger, even in collisions with the same number of particles coming out. The QGP flow explanation would have given you roughly the same signal size for the same number of particles produced, so that seems unlikely to me. Instead, the result is something of a “smoking gun” for gluon saturation, because the bigger signal at the same particle number has to come from more gluons at the initial stage, before the collision.

Q. What kind of further tests can you do?

A. The LHC will get a lot more data to test these ideas from the proton-lead run coming up this winter. Our models are detailed enough that a significant deviation in the data could knock them down. That’s a good thing. A sign of a good model is that its predictions are sufficiently detailed and clear that it can be tested and even disproved, and one learns something in the process. Even if the idea of the color glass condensate fails, we would learn a great deal from that failure, and we would have to think deeply about what could replace it.

Even though our model works, there are a number of fundamental things we do not understand and are forced to model imperfectly. That is why we really need an electron-ion collider, where we could collide electrons with heavy ions to probe the structure of the gluon fields directly. Though it may not have the tremendous energy reach of the LHC, an electron-ion collider would allow us to explore the structure of matter with much greater precision. Subtle details of the properties of these extreme states of matter will give us a much more detailed picture, at the most fundamental level, of the structure of matter.

But in the meantime we can do some very interesting things at RHIC. One thing we’d like to do is collide polarized protons with heavy ions at RHIC. RHIC is the world’s only polarized proton collider, where the spins of the particles (and thus the quarks and gluons within) can be aligned in a chosen direction. When the spin of the polarized quarks and gluons in the proton interacts with the gluon shockwave of the nucleus, the spin can be changed in different ways, depending on where the proton travels through the nucleus. Teasing out how the scattering depends on spin direction will help determine how dense the gluon field is in different parts of the nucleus.

These studies may help us understand how the orbital motion of the quarks and gluons within the proton contributes to proton spin. And they give us a different way of probing gluon saturation.

Q. What will confirmation of gluon saturation mean for physics—and the rest of us?

A. We are studying the structure of matter in its most fundamental form to learn something very deep about the structure of the proton, the most fundamental stable piece of matter we know of in the universe. We are going further than we ever imagined possible. What we once thought of as fundamental objects are turning out to be much more complex. What is the origin of visible matter in the universe? It is the quarks and gluons. And we are probing the complexity of those particles as much as possible under the most extreme conditions.

We don’t really know where that will lead. It could open up completely new directions. A hundred years ago, people were asking the same kinds of questions about electrons and photons, which we now use in so many ways in our everyday lives. If you had told them then that there would be something like our National Synchrotron Light Source (NSLS) accelerating electrons and using photons to look at atomic-level structures of things like superconductors, proteins, and ribosomes—to make better materials for energy applications or drugs to treat disease—they would never have believed you. But those are the kinds of advances that come out of in-depth studies of subatomic particles and their interactions.

The Future of Subatomic Glue-Gazing
June 14, 2012

RHIC, the Relativistic Heavy Ion Collider at Brookhaven Lab, found it first: a “perfect” liquid of strongly interacting quarks and gluons – a quark-gluon plasma (QGP) – produced by slamming heavy ions together at close to the speed of light. The fact that the QGP produced in these particle smashups was a liquid and not the expected gas, and that it flowed like a nearly frictionless fluid, took the physics world by surprise. These findings, now confirmed by heavy-ion experiments at the Large Hadron Collider (LHC) in Europe, have raised compelling new questions about the nature of matter and the strong force that holds the visible universe together.

Similarly, searches for the source of “missing” proton spin at RHIC have opened a deeper mystery: So far, it’s nowhere to be found.

To probe these and other puzzles, nuclear physicists would like to build a new machine: an electron-ion collider (EIC) designed to shine a very bright “light” on both protons and heavy ions to reveal their inner secrets.

“An electron-ion collider would be the brightest, highest-intensity ‘femtoscope’ to shine on the structure of matter,” said Brookhaven theoretical physicist Raju Venugopalan, referring to its ability to discern structures at the scale of femtometers – that’s 10⁻¹⁵ meters, a millionth of a nanometer, or a millionth of a billionth of a meter!
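Those scale comparisons are easy to verify; a quick check, purely for the arithmetic:

```python
import math

femtometer = 1e-15   # meters
nanometer = 1e-9     # meters

# A femtometer is a millionth of a nanometer...
assert math.isclose(femtometer, nanometer / 1e6)
# ...and a millionth of a billionth of a meter.
assert math.isclose(femtometer, (1 / 1e9) / 1e6)
print("scale comparisons check out")
```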

“Snapshots” of matter at that scale over a wide range of energies would offer deeper insight into the substructure of the nucleus, its constituents, and particularly its smallest components, the quarks and gluons and how they interact.

“Increasingly, it’s looking as if gluons and their interactions may hold the keys to many of our puzzles,” Venugopalan said. An electron-ion collider would be the ideal tool for gazing at the “glue” under conditions where scientists believe that it completely dominates the structure of neutrons, protons, and nuclei.

Evolution of physicists' understanding of proton spin: from the early view that the spins of the proton's three quarks should make up most if not all of the proton's spin (left), to one in which gluons and the motion of quarks and gluons can also play significant roles (center). Current and planned investigations of the angular motion of quarks and gluons — the latter to be carried out by an electron-ion collider (right) — may help resolve the mystery of the missing source of spin.

If an electron-ion collider becomes a reality, what the physicists learn will offer deeper insight into what holds 99 percent of the matter in the visible universe together. That’s the percentage of everything we see around us – from stars to planets to our own physical forms – that gets its mass from protons and neutrons, and thus ultimately from the quarks and gluons governed by the strong force.

“At the most fundamental level,” Venugopalan said, “we are driven by our curiosity to learn more about what we are made up of.”

Much more about the physics behind an electron-ion collider and unraveling exciting mysteries on the horizon can be found in this feature story at Brookhaven’s website. Anyone interested in relativistic time dilation, missing spin, and super-saturated color glass condensate should check it out.

-Karen McNulty Walsh, BNL Media & Communications Office

Replication Tango
March 15, 2012

There are many complex steps to the dance of DNA replication. And scientists must learn to sway along in order to understand how both healthy and cancerous cells divide.

Scientists at Brookhaven National Laboratory have begun to learn how to follow the complex molecular choreography by which intricate cellular proteins recognize and bind to DNA to start the replication process.

The replication process starts off the same way in every cell. In the cell’s DNA, there are defined sites, each called an “origin of replication.” The cell in which the DNA is housed uses a protein called the “origin recognition complex,” or ORC, to begin replication. Unlike bacteria, whose genomes of several million base pairs have only a single origin, more complex eukaryotic organisms such as humans, with a genome of 3.4 billion base pairs, have tens of thousands of replication origins, so that DNA replication can be carried out simultaneously at these sites to duplicate the genome quickly.
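A back-of-the-envelope calculation shows why those extra origins matter. This sketch is not from the press release; it assumes an illustrative eukaryotic fork speed of about 50 base pairs per second and evenly spaced origins that all fire at once:

```python
def replication_time_hours(genome_bp, num_origins, fork_speed_bp_s=50):
    """Idealized copy time: each origin fires two replication forks that
    move outward in opposite directions, so it copies 2 * fork_speed."""
    bp_per_origin = genome_bp / num_origins
    seconds = bp_per_origin / (2 * fork_speed_bp_s)
    return seconds / 3600

one_origin = replication_time_hours(3.4e9, 1)         # roughly 9,400 hours
many_origins = replication_time_hours(3.4e9, 30_000)  # well under an hour
print(f"1 origin: {one_origin:,.0f} h; 30,000 origins: {many_origins:.2f} h")
```

With a single origin the copy would take over a year; tens of thousands of origins working in parallel finish in well under an hour, consistent with the scale of a cell cycle.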

The DNA replication origin recognition complex (ORC) is a six-protein machine with a slightly twisted half-ring structure (yellow). ORC is proposed to wrap around and bend approximately 70 base pairs of double stranded DNA (red and blue). When a replication initiator Cdc6 (green) joins ORC, the partial ring is now complete and ready to load another protein onto the DNA. This last protein (not shown) is the enzyme that unwinds the double stranded DNA so each strand can be replicated.

The Brookhaven scientists’ goal in this study was to understand the first “moves” of eukaryotic genome replication: specifically, how the ORC recognizes and binds to the origin DNA, and how the origin-bound ORC enables the attachment of additional protein machinery that unfurls the DNA double helix into two single strands in preparation for DNA copying. The work has strong implications for health and disease, because unregulated or dysregulated chromosomal duplication and uncontrolled cellular proliferation are hallmarks of cancer.

Whereas previous studies have approached these topics in simpler, prokaryotic organisms, this study is examining the eukaryotic protein’s complex processes in detail.

Scientists used an imaging method known as cryo-electron microscopy to make high-resolution images of ORC in yeast, a model eukaryote, in isolation, as it binds to DNA, and later in the process when another protein unit binds to activate the entire structure.

This imaging produced a map of the entire ORC structure as it changes during the activation process.

The scientists then turned to atomic-level x-ray crystal structures of small protein subunits that had been produced by other scientists to explore the details of ORC’s behavior. Basically, ORC is a two-lobed, crescent-shaped protein complex that wraps around and bends the DNA strand along the interior curve of the crescent. Sequential binding of a “replication initiator” then induces a significant shape change in the origin-bound ORC structure.

This structural alteration is likely what opens the way for the attachment of the next piece of protein machinery essential to the DNA-replication process — the one that unwinds the two strands of the DNA double helix so that each can be copied…so the replication tango can begin again.

Read the original press release from Brookhaven National Laboratory here.
The study is published in the March 7, 2012, issue of the journal Structure.

This guest post was written by Natalie Crnosija, a science-writing intern at Brookhaven National Laboratory, and Karen McNulty Walsh, the Lab’s principal science writer.

This guest post was written by Ernie Lewis, an atmospheric scientist at Brookhaven Lab, who is leading a year-long climate study aboard two Horizon Lines cargo ships, the Spirit and Reliance. He recently returned from a preliminary “cruise” from L.A. to Hawaii and back aimed at assessing conditions for deploying instruments aboard the ships during the actual study, dubbed MAGIC, which will run from October 2012 through September 2013.

Hawaii was wonderful, even though I only had a short time in Waikiki Beach (I’m getting no sympathy from my friends at work on this point, so I’m trying a wider audience). Our team of scientists left Los Angeles on MAGIC Leg00a on Saturday, February 11, at 5:30 a.m., along with nearly 1,000 cargo containers aboard the Horizon Spirit. We spent the previous two days installing a meteorological (met) system on the mast and a navigation system to characterize ship motion to aid in determining how to keep one of the radars and other instruments pointed vertically when the ship rolls. Both the met system and the navigation system worked very well for the entire cruise. We arrived in Honolulu on Wednesday, February 15, at 9:30 p.m., after a mostly cloudy and cool trip. Fortunately the seas weren’t too rough and there was no seasickness (hurray!).

Thursday in Hawaii we checked the instruments, met with personnel at the Horizon office in Honolulu, picked up some supplies, met colleagues for lunch, and did a cursory check of email to see if there were any emergencies. We had just enough time after our errands to buy a few Hawaiian shirts and have one Mai Tai at Waikiki Beach before turning in the rental car and returning to the Spirit. We departed at 11 p.m. on MAGIC Leg00b, and arrived back in Los Angeles on Thursday, February 23, at 7 a.m.

Preparing to launch a real weather balloon with vegetable ballast

The trip back was much windier! The ship was traveling at 15 knots and we had headwinds of 25 knots most of the way home. That’s a relative wind speed of more than 45 miles per hour! One of the goals of the trip was to investigate the feasibility of weather balloon launches from different locations around the ship. Filling balloons on deck in those conditions is, needless to say, quite challenging, and there were strong downdrafts around the ship, which added another challenge.
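The relative-wind arithmetic checks out; a quick calculation using the standard knot-to-mph conversion (the ship speed and headwind are the values quoted above):

```python
KNOT_TO_MPH = 1.15078  # standard conversion: 1 knot = 1.15078 mph

ship_speed_kn = 15  # ship's speed through the water
headwind_kn = 25    # wind opposing the direction of travel

# Steaming into a headwind, the two speeds add on deck.
relative_wind_kn = ship_speed_kn + headwind_kn
relative_wind_mph = relative_wind_kn * KNOT_TO_MPH
print(f"{relative_wind_kn} knots is about {relative_wind_mph:.1f} mph")
```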

I am proud to report that during the cruise we successfully launched five potatoes, two turnips, one avocado, four sweet potatoes, one rutabaga, and three squash. These vegetables served as stand-ins for the meteorological sensing packages we’ll launch during the real study. They’re about the same weight, biodegradable, and don’t cost nearly as much as the actual instruments. We sent only one squash into the ocean, had one explode in hand, and popped only two balloons.

A successful launch

Though these launches were a source of amusement for the Spirit crew (and for us), they were nevertheless important, as launches will be made four times daily during the MAGIC deployment. Nearly 1,000 sites around the world routinely launch two weather balloons daily to collect data used for weather forecasts. The balloons are filled with helium to a diameter of about one meter, with “sondes” containing sensors attached below the balloons. From measurements of temperature, pressure, location (via GPS), and relative humidity, a detailed profile of atmospheric structure (i.e., temperature and relative humidity as a function of height) can be obtained. During the actual study, these measurements will form part of the data set that helps us construct a long-term scientific picture of clouds over the ocean with the goal of improving how these climate drivers are incorporated into future climate models.
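A minimal sketch (not MAGIC's actual data processing) of how a stream of sonde readings becomes such a profile: each record carries the GPS-derived altitude at which it was taken, so ordering the records by height yields temperature and humidity as functions of altitude:

```python
# Made-up sonde records: (altitude_m, temperature_C, relative_humidity_pct).
raw_records = [
    (1500, 7.2, 88.0),
    (10, 21.5, 71.0),
    (800, 13.4, 80.5),
    (3000, -2.1, 34.0),
]

# Sorting by altitude (the first tuple element) turns the stream of
# readings into a vertical profile of the atmosphere.
profile = sorted(raw_records)
for alt, temp, rh in profile:
    print(f"{alt:>5} m  {temp:6.1f} C  {rh:5.1f} % RH")
```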

Today’s public seminar at CERN, where the ATLAS and CMS collaborations presented the preliminary results of their searches for the Standard Model (SM) Higgs boson with the full dataset collected during 2011, is a landmark for high-energy physics!

The Higgs boson is a still-hypothetical particle, postulated in the mid-1960s to complete the SM of particle interactions. Its role within the SM is to provide other particles with mass. Specifically, the mass of elementary particles is the result of their interaction with the Higgs field. The Higgs boson’s properties are defined in the SM, apart from its mass, which is a free parameter of the theory.

Scientists are looking for signs of the Higgs boson by searching for the products of its decay. Two of the most prominent decay channels, or ways the Higgs can decay, are to form two photons or to form a pair of Z bosons, each of which subsequently decays to a pair of leptons (electrons or muons). Brookhaven National Laboratory (BNL) has played and continues to play a key role in the design, construction, and operation of the detectors of the ATLAS experiment that are used to observe electrons and photons (the liquid argon electromagnetic calorimeter) and muons (the muon spectrometer). Major contributions are also made in the data analysis, where Brookhaven scientists have leading roles. BNL also significantly contributes to the trigger — deciding which events to analyze in detail — and to computing.

The ATLAS Detector at CERN

Owing to the excellent performance of the Large Hadron Collider (LHC) and the stable operation of the ATLAS and CMS detectors, the two collaborations have achieved a five-fold increase over the dataset presented at the summer conferences only a few months ago. The new result excludes the vast majority of the range where the Higgs boson mass could potentially lie, and leaves very little hiding space for the elusive boson.

Furthermore, both experiments observed in several channels an intriguing upward fluctuation of the data. Is this the first glimpse of the Higgs boson or just a statistical fluctuation? Only improved analysis and more data will tell!

Scientists at the LHC look eagerly forward to next year’s LHC run period starting in early spring 2012. If the LHC performance projections work out as expected — and the LHC crew has been very good at keeping its promises — we should be able to double the available dataset in time for the summer conferences and reach a conclusion on whether the last missing piece of the Standard Model of particle physics exists.

For more on the story, see the US LHC press release issued jointly by Brookhaven Lab and Fermi National Accelerator Laboratory.

The Horizon Spirit, a 272-meter cargo ship, makes the round trip between Los Angeles and Hawaii every two weeks.

This is not a story about the latest mega cruise ship, with five swimming pools, 10 restaurants, a rock-climbing wall, and a casino. The vessel we’re talking about, the Horizon Spirit, will be outfitted instead with radars, aerosol sampling devices, and other high-tech tools. But even without the fancy umbrella drinks, Ernie Lewis, an atmospheric scientist at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory, can’t wait to set sail.

Last month, he and several colleagues traveled to California to visit the Spirit, a cargo carrier owned by Horizon Lines that makes regular runs to and from Hawaii. In January, they’ll embark on a round-trip voyage to investigate how to get the ship ready for a yearlong mission gathering data to improve climate models, a project funded by DOE’s Atmospheric Radiation Measurement (ARM) Climate Research Facility.

The idea is to take a long-term look at the clouds over the ocean. These clouds have a large influence on Earth’s climate, but climate models have a tough time accurately representing them and the transitions among their different types — stratocumulus, cumulus, and so on. So Lewis proposed deploying the ARM Facility’s sophisticated scientific instruments aboard a cargo vessel that already plies a route across an area of the Pacific where these cloud-type transitions are an ever-present phenomenon, gathering data to improve the models.

The project — dubbed MAGIC, for the Marine ARM GPCI Investigation of Clouds, where GPCI is a project comparing data from the major climate models — was just approved and will begin next October and run through September 2013.

Using X-rays to Peel Back the Layers of a Purported Rembrandt
December 2, 2011

This guest post was written by Mona S. Rowe, science writer for Brookhaven National Laboratory’s National Synchrotron Light Source (NSLS) and NSLS-II.

The quest to authenticate an unknown Rembrandt painting, titled “Old Man with a Beard,” hit a dramatic high at the National Synchrotron Light Source (NSLS) at Brookhaven National Laboratory. Using an advanced x-ray detector developed at NSLS, scientists found compelling evidence that the famous Dutch master did indeed have his own hand on the painting.

“After doing the experiments at NSLS, I felt that the painting I held in my hands was a genuine Rembrandt,” said D. Peter Siddons, physicist with the Photon Sciences Directorate. “We had identified hidden paint layers, which the art historians considered critical to determining attribution.”

Siddons explained that art historian Ernst van de Wetering and his colleagues — University of Delft materials scientist Joris Dik, art restorer Martin Bijl, and University of Antwerp chemist Koen Janssens — had all been working closely together to answer questions about the painting’s attribution and to probe beneath the surface for what they believed was a second image. The Europeans were eager to see what more they could learn using a specialized detector at the New York facility an ocean away.

Rembrandt’s “Old Man With a Beard.”
Courtesy Rembrandt House.

The detector, named Maia, produced high-definition maps of the spatial distribution of different chemical elements in the painting, at speeds up to 100 times faster than previously achievable. Those results gave scientific support to the declaration of authentication just announced by van de Wetering at the Rembrandt House Museum in Amsterdam. Van de Wetering is chair of the Rembrandt Research Project and considered a preeminent authority on Rembrandt.

Technology supported by the U.S. Department of Energy at Brookhaven is developed and optimized to study energy-relevant materials. New techniques and capabilities, however, can often shed light on other problems. According to Siddons, most of the light sources around the world now support a significant amount of “cultural heritage” research, both in art and archaeology. “This field poses interesting scientific questions, particularly since many objects are structurally and chemically very complex, challenging our most sophisticated instruments,” he said.

In the case of a painting, what lies underneath could be just as interesting, if not more so, than what the eye sees on the surface. But how do you peel back the layers on a fragile — and valuable — canvas without destroying the original image? Art historians and scientists have found an extremely effective way using penetrating beams of x-rays and powerful detectors such as Maia at NSLS.

Quite a few paintings have made appearances on the NSLS experimental floor — ranging from those created by 20th-century artists Bertha Lum and Edward Hopper to the portrait in the news now, declared to be painted nearly 400 years ago by a young Rembrandt Harmenszoon van Rijn.

“At the time, Rembrandt’s many students perfected their technique by copying their teacher’s works,” said Dik. “As a result, there’s a whole complex group of paintings that are exactly the same, slightly modified, or in some way related to each other,” he said. “The goal is to understand more about how they are related and where the hand of the master begins and ends.”

Initial work was done at the European Synchrotron Radiation Facility, where heavy elements such as lead and mercury could be visualized. Then Dik, in collaboration with the Rembrandt Research Project and the painting’s owner, a private collector, discovered some puzzling information: Studies with a portable x-ray fluorescence (XRF) scanner revealed high concentrations of copper, not evident on the painting’s surface.

To investigate this anomaly further, the group teamed up with researchers from Cornell University, Australia’s Commonwealth Scientific and Industrial Research Organisation (CSIRO), and NSLS to use XRF through the Maia detector.

In x-ray fluorescence, a tiny x-ray beam is focused onto a sample, causing some electrons to be ejected from their “orbits” around atomic nuclei, leaving an ionized atom behind. Other electrons in the atom “fall” into the newly created vacancy and, in doing so, emit x-rays at energies characteristic of specific elements. As the sample is scanned through a highly focused x-ray beam, a 2D map of elemental composition vs. position is obtained.
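The scan-and-bin logic described above can be sketched in a few lines. This is an illustrative toy, not Maia’s actual software; the element list, emission-line energies (K-alpha for Cu and Fe, L-alpha for Pb and Hg, from standard reference tables), and the energy tolerance are assumptions chosen for the example:

```python
# Illustrative sketch of scanning XRF: each detected photon's energy is
# matched against characteristic emission lines and binned at the current
# scan position, building one 2D count map per element.

# Characteristic emission energies in keV (Cu/Fe K-alpha, Pb/Hg L-alpha)
EMISSION_LINES_KEV = {"Cu": 8.05, "Fe": 6.40, "Pb": 10.55, "Hg": 9.99}
TOLERANCE_KEV = 0.15  # energy window for assigning a photon to a line

def build_elemental_maps(events, nx, ny):
    """events: iterable of (x, y, photon_energy_keV) detections.
    Returns {element: 2D list of counts} maps of size ny x nx."""
    maps = {el: [[0] * nx for _ in range(ny)] for el in EMISSION_LINES_KEV}
    for x, y, energy in events:
        for el, line in EMISSION_LINES_KEV.items():
            if abs(energy - line) <= TOLERANCE_KEV:
                maps[el][y][x] += 1
                break
    return maps

# A copper-rich pixel at (1, 0): three photons near the Cu K-alpha line,
# plus one iron photon at the neighboring pixel
events = [(1, 0, 8.04), (1, 0, 8.06), (1, 0, 8.02), (0, 0, 6.41)]
maps = build_elemental_maps(events, nx=2, ny=2)
print(maps["Cu"][0][1])  # 3 counts of Cu at pixel (1, 0)
print(maps["Fe"][0][0])  # 1 count of Fe at pixel (0, 0)
```

In a real instrument the events arrive while the sample is in continuous motion, which is why detector speed, discussed next, is the limiting factor.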

“This is an application that benefits greatly from a synchrotron,” said Arthur Woll, a researcher at the Cornell High Energy Synchrotron Source and an expert in a variant of scanning XRF, not used for this painting, that allows the XRF signal to be resolved in 3D. “Yet, like all fluorescence experiments, one of the limitations is detector speed.”

That roadblock is primarily why the Rembrandt group crossed an ocean to use NSLS, where Siddons and colleagues have an international reputation for developing advanced x-ray detectors like Maia that are much faster and more precise than the traditional variety.

“We developed Maia to address challenges experienced by all x-ray microprobes with poor detector performance, including the inability to examine statistically significant areas of highly inhomogeneous samples, like those found in environmental samples or complex materials used in energy conversion,” Siddons said. The application of the technique to the analysis of a potentially famous artwork gave the researchers a chance to demonstrate the detector’s prowess.

“I knew the kind of data they’d get from the NSLS fluorescence detector would knock their socks off,” Woll said.

And it did.

The Maia detector — based on a massively parallel detector array composed of 384 individual detectors, an “on-the-fly” scanning system, and advanced analysis software — imaged the entire painting in about eight hours, a job that would normally take about 30 days.
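The quoted times imply a speedup consistent with the “up to 100 times faster” figure mentioned earlier:

```python
# Back-of-the-envelope check of the quoted Maia speedup
conventional_scan_hours = 30 * 24   # "about 30 days"
maia_scan_hours = 8                 # "about eight hours"
speedup = conventional_scan_hours / maia_scan_hours
print(speedup)  # 90.0 -- consistent with "up to 100 times faster"
```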

The copper mapping by Maia revealed contour lines of a beardless, seemingly younger male figure wearing a collar and beret, characteristic of Rembrandt’s early self-portraits. It appears that Rembrandt started a self-portrait, left the painting unfinished, and later painted over the earlier work.

The newly authenticated Rembrandt painting under analysis in the Maia detector at Brookhaven Lab’s NSLS beamline X7A. Image from Out of the Shadows, a new documentary on the scientific analysis of artworks.

“The copper revealed by x-ray fluorescence corresponds to the underpainting, the first monochrome layer that Rembrandt traditionally put down when creating a new work,” Dik said. Cross-sectional analysis later confirmed the presence of copper in the lower layer of paint.

Infrared photography revealed a third image: what appears to be a carbon sketch of a figure wearing a turban with a feather. “One thing was clear after the images Maia produced — this was no student copy,” said Dik. “You can’t always judge a painting by its surface!”

Given access to the underlying layers through these high-tech imaging techniques, the team’s art historians were able to draw enough similarities with other Rembrandt works and studio techniques to conclude that this painting was indeed by the master himself. According to Dik, someone copying a painting – such as a student in Rembrandt’s studio – would not change anything; they would simply replicate the work. Reworking an original painting into another, by contrast, is characteristic of Rembrandt’s experimental approach, and the buried paint is evidence of that, he said.

Dik added, “When I brought the painting to NSLS with my assistant, Matthias Alfeld, I knew the Maia detector was fast and sensitive. But the images far exceeded my expectations. We were very, very impressed.”

Siddons said that the Maia detector was originally inspired by a custom-made integrated circuit designed by Brookhaven Lab’s Instrumentation Division for a different x-ray technique, extended x-ray absorption fine structure. Maia was developed during an eight-year collaboration with CSIRO. With their expertise in high-speed custom computers, CSIRO added a real-time computer, which performs a sophisticated algorithm to extract the elemental contributions from the raw data as they are acquired. The algorithm allows reliable separation of weak fluorescence peaks that would otherwise be buried by nearby strong peaks from abundant elements. (See http://nmp.csiro.au/dynamic.html) Because of its speed and precision, the Maia detector and its associated analysis techniques will enhance studies that use x-ray fluorescence in the biological, environmental, geological, and materials sciences for measuring trace element concentrations in everything from soil deposits to the microstructure of exotic energy materials — and works of art.
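The algorithm itself is CSIRO’s, but the core idea of separating a weak peak from a strong overlapping neighbor can be illustrated with a minimal stand-in: fit all elemental reference peaks to the measured spectrum simultaneously by linear least squares, rather than reading off peak heights one at a time. The peak shapes, positions, and amplitudes below are invented for the example:

```python
import math

def gaussian(x, center, width):
    """Unit-height Gaussian peak, standing in for an elemental reference line."""
    return math.exp(-((x - center) ** 2) / (2 * width ** 2))

# Energy grid from 6.0 to 10.0 keV and two overlapping reference peaks:
# a strong line from an abundant element and a weak neighbor it nearly buries.
energies = [e / 10 for e in range(60, 101)]
ref_strong = [gaussian(e, 8.0, 0.2) for e in energies]
ref_weak = [gaussian(e, 8.3, 0.2) for e in energies]

# Synthetic "measured" spectrum: 100 parts strong element, 5 parts weak one
true_strong, true_weak = 100.0, 5.0
measured = [true_strong * a + true_weak * b for a, b in zip(ref_strong, ref_weak)]

# Fit both contributions at once by solving the 2x2 normal equations
# (A^T A) c = A^T y of the linear least-squares problem.
s11 = sum(a * a for a in ref_strong)
s12 = sum(a * b for a, b in zip(ref_strong, ref_weak))
s22 = sum(b * b for b in ref_weak)
y1 = sum(a * m for a, m in zip(ref_strong, measured))
y2 = sum(b * m for b, m in zip(ref_weak, measured))
det = s11 * s22 - s12 * s12
c_strong = (s22 * y1 - s12 * y2) / det
c_weak = (s11 * y2 - s12 * y1) / det
print(round(c_strong, 3), round(c_weak, 3))  # recovers 100.0 and 5.0
```

Because both peaks are fit together, the weak component is recovered even where the strong one dominates the raw counts — a naive peak-height reading at 8.3 keV would be swamped by the tail of the 8.0 keV line.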

The Maia detector recently won an R&D 100 Award, which recognizes the year’s 100 most significant high-tech innovations.

Siddons is funded by the Office of Basic Energy Sciences (BES) within the U.S. Department of Energy Office of Science. BES plays a critical role in supporting detector development for research in a range of scientific disciplines, including materials science, physics, chemistry, and medicine. New detectors are essential to sustaining innovative science in the U.S.

The Long Island Solar Farm at Brookhaven Lab is Generating Electricity

November 18, 2011

This guest post was written by Pat Looney, chair of the Sustainable Energy Technology Department at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory.

If the sun is shining over Long Island, NY, as you read this article, the Long Island Solar Farm (LISF) is generating enough clean solar energy to power as many as 4,500 homes for the Long Island Power Authority (LIPA).

Construction of the LISF at Brookhaven National Laboratory (BNL) began in the fall of 2010 and officially concluded this month when the array began commercial operation. LIPA hosted a formal commissioning ceremony today, November 18.

LISF is the largest solar power plant in the eastern United States. It sits atop nearly 200 acres at the southeast end of the Laboratory site and consists of 164,000 solar panels that provide LIPA with up to 32 megawatts of alternating current electricity.
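A quick sanity check on the quoted figures (the per-panel wattage is derived here, not stated in the article):

```python
# Implied per-panel rating of the Long Island Solar Farm
total_watts = 32e6          # 32 MW AC quoted capacity
num_panels = 164_000
watts_per_panel = total_watts / num_panels
print(round(watts_per_panel))  # about 195 W per panel
```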

Some of the 164,000 solar panels that make up the Long Island Solar Farm.

The LISF was developed by BP Solar and is privately owned; however, BNL will have access to data from the array as a condition of the easement agreement granted by DOE for use of the land. So as the solar panels at LISF collect energy from the sun, researchers at Brookhaven are busy installing sensors and imagers to collect large amounts of data from LISF systems. The data will be used by researchers at the Lab and across the country to address the key issues facing deployment of large-scale solar power plants.

One of the main challenges we are focused on is the role of distributed generation — power generated by multiple sources distributed across a region — and how it will affect the efficiency and reliability of electricity delivery. For the past hundred years or so, electricity has been generated at large centralized power sources, such as dams and power plants, and then delivered to customers through the existing distribution grid of transformers, power lines, etc. With solar energy sources as large as the LISF and as small as arrays installed on roofs of homes, electricity from multiple power sources is now being integrated into that same grid, and the impacts of integrating this distributed generation are not well understood. Data from the LISF will help researchers develop and validate models that can be used to study this issue and develop new technologies to address it.

The Long Island Solar Farm will provide up to 32 megawatts of electricity — enough to power some 4,500 homes and businesses.

Another issue to be studied is how the intermittent nature of solar energy will affect operation of the electric grid. The sun rises, shines brightly at times, is blocked by clouds at others, and sets in the evening, so the amount of electricity a solar array generates varies throughout the day. These fluctuations could adversely affect the electric grid. With innovations and upgrades developed as a result of our quantitative observations, we want to make the grid and the electricity it provides as efficient, secure, and resilient as possible.

Since the LISF is privately owned, new technologies cannot be tested there. That is why construction of a second array on site — the Northeast Solar Energy Research Center (NSERC) — will begin in early 2012 and is expected to be complete next summer. This research array will be a DOE-owned user facility and a proving ground for Brookhaven Lab and our industrial partners to test new solar system technologies, including electrical inverters, storage devices, solar modules, and other technologies.

The five-acre NSERC array will generate approximately 700 kilowatts to one megawatt of electricity at full power, which will be distributed to the Laboratory’s electrical network for our own use.

Many different groups have worked together to develop both LISF and NSERC. Here on site, the Environment & Life Sciences, Facilities & Operations, and Global and Regional Solutions directorates, Information Technology Division, and the DOE Brookhaven Site Office have been instrumental in preparing our research agenda, and developing a plan to bring it to fruition. Collaborators from off site include American Superconductor, Blue Oak Energy, BP Solar, Electric Power Research Institute, General Electric, LIPA, National Renewable Energy Laboratory, New York State Energy Research and Development Authority, Stony Brook University, and University of California at San Diego. We gratefully acknowledge the DOE Energy Efficiency and Renewable Energy’s Solar Energy Technology Program for providing funding for this research.

]]>http://scienceblogs.com/brookhaven/2011/11/18/the-long-island-solar-farm-at/feed/0How I Learned to Start Worrying and Hate the Tick Bombhttp://scienceblogs.com/brookhaven/2011/10/18/how-i-learned-to-start-worryin/
http://scienceblogs.com/brookhaven/2011/10/18/how-i-learned-to-start-worryin/#commentsTue, 18 Oct 2011 15:52:32 +0000http://scienceblogs.com/brookhaven/2011/10/18/how-i-learned-to-start-worryin/This guest post was written by Brookhaven Lab science writing intern Kenrick Vezina, who will be sharing Brookhaven science stories from inside and outside laboratories on site through mid December.

I’m about to enter the well-worn, vegetation-free (read: tick-free) pathway that cuts through the forest near my dorm. I’m about two steps down the trail when I hear a screech from somewhere in the canopy overhead. It’s not the full-out war cry of a red-tailed hawk — the sound we’ve been trained by television to expect from the beak of every bird of prey — but it definitely sounds like a raptor. On my honor as a naturalist, I must investigate.

I can’t spot the bird, but it continues making furtive, rasping calls, as though taunting me to step off the trail to find it. It’s moving further into the woods. I look at the edge of the forest. White-tailed deer have eliminated most of the undergrowth, but there’s still enough low vegetation to make a haven for ticks. I shouldn’t.

Still, I think, I’ll be careful — just a few steps, and I’ll check myself for any unwanted hangers-on in just a moment.

Long story short, I don’t find the hawk, and, mildly disappointed, I resume my walk home. An idle glance downward reveals some mud stains on the bottom of my pants. Odd, I didn’t think the ground was wet.

Then, I watch in equal parts horror and awe as the “stains” begin to disassemble into hundreds of individual specks.

Two words: tick bomb!

A female dog tick (right) and blacklegged tick (left) side-by-side on an inch-scale ruler. Larval ticks are much smaller even than this — about the size of the period at the end of this sentence.

Each speck turned out to be a tick larva, smaller than the head of a pin. This time of year, says BNL’s Cultural and Natural Resource Manager Tim Green, tick egg masses are hatching, which means there are gobs of tiny, newborn ticks clustered on low-lying plants, just waiting for their first blood meal.

You can imagine my dismay at realizing I was a potential host for more ticks than I could count, but rest assured that tossing my pants into the dryer for an hour was more than enough to kill all the larvae. (Ticks are very sensitive to humidity, and they die quickly if they dry out.)

Let my experience be a warning: if you’re going into the woods, don’t take chances — try these tips:

• Wear long pants and long sleeves.
• Use insect repellent.
• Be sure to check yourself for ticks.
• If you do get bitten, follow up with a doctor.

Thanks to everyone who provided warm words in response to my first post. If you’re a BNLer, or if you’re just familiar with the ecology on Long Island, I’d love to hear about your own experiences in the comments. Perhaps you can tip me off to interesting things to investigate. And don’t hesitate to share any questions.

Jason Graetz, left, and Jiajun Chen at NSLS beamline X14A with their transparent reactor for viewing chemistry in real time.

Here’s a recipe for basic chemistry: Mix a bunch of stuff in a reaction vessel and see what happens. Only you don’t really see the action taking place — unless you have some way to visualize the molecular magic.

Researchers at Brookhaven National Laboratory have developed just such a technique: They’ve fabricated a transparent chemical reactor vessel that allows x-rays to pass through and capture the chemical changes as they take place.

They recently used this real-time reaction monitoring setup to study the synthesis of lithium iron phosphate and pinpoint the best conditions for producing a defect-free material for rechargeable batteries.

Jason Graetz, a materials scientist and leader of Brookhaven’s energy storage group, explains the benefits this way:

Generally we make battery materials in a stainless steel reactor. There’s no window, no way to see the reaction — we just see what goes in and what comes out. So we designed a reactor made out of a glass capillary and, using synchrotron x-ray diffraction, we can not only probe the precursors — the initial parts of the reaction — but we can also track what happens as the reaction takes place.

The scientists started with a slurry of both solid and liquid precursors, loaded them into the glass capillary reaction vessel, and placed the whole setup at beamline X14A at the National Synchrotron Light Source (NSLS), a source of extremely bright x-rays and other forms of light for probing materials’ structure and properties. As the x-rays pass through the transparent reaction vessel, they bounce off, or get diffracted by, the atoms in the reactor, producing a pattern that reveals the atomic structure of the various materials in the reactor and how they change as the reaction takes place.
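The relation connecting a diffraction peak’s position to atomic structure is Bragg’s law, n·λ = 2d·sin θ. As a sketch of the conversion (the peak angle and the Cu K-alpha wavelength below are hypothetical numbers for illustration, not values from this experiment):

```python
import math

def bragg_d_spacing(two_theta_deg, wavelength_angstrom, order=1):
    """Interplanar spacing d from Bragg's law: n * lambda = 2 * d * sin(theta).
    two_theta_deg is the full scattering angle reported by a diffractometer."""
    theta = math.radians(two_theta_deg / 2)
    return order * wavelength_angstrom / (2 * math.sin(theta))

# Hypothetical example: a diffraction peak at 2-theta = 25.5 degrees
# measured with 1.54 angstrom (Cu K-alpha) x-rays
d = bragg_d_spacing(25.5, 1.54)
print(round(d, 2))  # about 3.49 angstroms
```

Tracking how such d-spacings and peak intensities shift during the reaction is what lets the researchers watch the structure evolve in real time.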

Says Graetz:

Because we’re getting the diffraction pattern, we can learn something about the structure. By analyzing these diffraction patterns, we can also learn about the defect concentration in the material and can track the defects in real time as a function of temperature or time in the reaction.

By doing a series of experiments at different temperatures and reaction times, the scientists can identify where defects form and where they start to disappear, allowing them to pinpoint the lowest temperature and the simplest reaction that produce a defect-free material. This approach could eliminate the need for further processing, reducing the cost of the most expensive component of lithium-ion batteries.