Brookhaven | Long Island, NY | USA

With the discovery of the long-sought Higgs boson at the Large Hadron Collider (LHC), the world’s largest and most powerful particle collider, folks unfamiliar with the intricacies of particle physics might think the field has reached its end. But physicists gathered at the Large Hadron Collider Physics Conference in New York City June 2-7 say they are eager to move forward. Even amid discussions of tight budgets that make some proposed projects appear impossible, the general tenor, as expressed by leaders in the field, is that the future holds great potential for even more significant discoveries.

Physicists joined New York Times science correspondent Dennis Overbye for a discussion on the future of the field.

At a session devoted to reflection and the future of the field, held Friday, June 6, Fabiola Gianotti, a particle physicist at Europe’s CERN laboratory (home of the LHC) and spokesperson for the LHC’s ATLAS experiment at the time of the Higgs discovery, said, “There is challenging work for everyone to make the impossible possible.” In fact, said James Siegrist, Associate Director of the Office of High Energy Physics within the U.S. Department of Energy’s (DOE) Office of Science, “I think the promise of the physics has never been greater.”

Co-sponsored by DOE’s Brookhaven National Laboratory and Columbia University, the week-long meeting featured updates on key findings from the LHC’s four experiments (including a possible hint of new physics), advances in theory, plans for future upgrades, and even future colliders—as well as a panel discussion moderated by Dennis Overbye, a science correspondent for the New York Times.

“We had a very successful conference with more than 300 participants discussing an impressive array of results from the recent LHC run,” said Brookhaven physicist Srini Rajagopalan, U.S. ATLAS Operations Program Manager and a co-organizer of the meeting. He also noted the extremely positive response to an open-to-the-public screening of Particle Fever, a documentary film that follows six scientists during the years leading up to the discovery of the Higgs boson. “I was simply amazed at the public interest in what we do. From young school students to senior citizens, people thronged to watch the movie and continued to ask questions late into the night.”

What keeps you up at night?

At Friday’s panel session, the Times’ Overbye had some questions of his own, perhaps more pointed than the public’s. He asked whether particle physicists’ streak of discoveries could be continued, whether the “glory days” for the U.S. were over, and what keeps physicists up at night. The panelists were realistic about challenges and the need for smart choices and greater globalization. But a spirit of optimism prevailed.

Natalie Roe, Director of the Physics Division at DOE’s Lawrence Berkeley National Laboratory—the first to respond—said, “I’m going to flip the question [of what keeps me up at night] and answer what gets me up in the morning.” Following a long period of experimental and theoretical successes, including the discovery of the Higgs, she said, “this is a very exciting time. There are still a few remaining details … dark matter and dark energy. And these are more than details; they are 95 percent of the universe!” With a long list of techniques available to get answers, she said, there is much work to be done.

University of California, Santa Cruz, physicist Steve Ritz, who recently chaired the Particle Physics Project Prioritization Panel (P5) and presented its recommendations for the future of the field, emphasized the importance of “telling our story,” staging and prioritizing future projects, and “aspiring to a greater program” that continues investments in crucial research and development to lay the foundation for future facilities.

Great technology progress, great challenges

In an overview talk that preceded the panel discussion, Gianotti presented a range of such future projects, including two possible linear colliders, one in Japan and the other at CERN, and two possible circular colliders, one in China and one at CERN. The latter, dubbed FCC, would be a proton-proton collider 80-100 kilometers in circumference—on the scale of the Superconducting Super Collider (SSC) once planned for, and later cancelled in, the U.S. Such a machine would push beyond the research limits of even the most ambitious upgrades proposed for the LHC.

Those upgrades, planned for data taking in Phase I in 2020 and Phase II in 2025, will begin probing the Higgs boson’s couplings to other particles in order to explore “electroweak symmetry breaking,” the mechanism by which the Higgs generates mass, and will extend searches for new physics beyond the standard model, including into the realm of dark matter.

But, to really get at the heart of those questions and possibly reveal unknown physics, the scientists say the need for even higher precision and higher energy is clear.

Journey to the dark side

“Our elders had it easy compared to our students,” said Siegrist, describing the physics challenges now open to exploration. He likened this moment in time to the end of a video game his son had played where, “at the end of the game, you end up on ‘the dark side’ and have to start again.” In physics, he said, the dark sector—exploring dark matter and dark energy—will be every bit as challenging as everything that has come before.

To those who say building the future machines needed for this journey is impossible, Gianotti says, “didn’t the LHC also look close to impossible in the 1980s?” The path forward, she emphasized, is to innovate.

“Accelerator R&D is very important,” said Ritz, noting that, “anything we can do to design these machines to cost less” in terms of construction and operation should be done. “We need to be impatient about this,” he said. “We need to ask more and jump in more.”

Panelist Nima Arkani-Hamed, a theorist at the Institute for Advanced Study in Princeton, New Jersey, and Director of the Center for Future High Energy Physics in Beijing, China, likely agrees. He acknowledged the difficult task facing U.S. leadership in high-energy physics. “They are trying to make do with a budget that’s two or three times less than what our vision and this country deserves, and they are doing a good job,” he said. “But I worry that our generation will be viewed as the one that dropped the ball.”

“The sequence of steps for the next few decades is possible,” he added later. “It’s just a matter of will, not technology.”

But because of the scale and cost of future projects, he, like others, emphasized that “we will need the whole world and new pockets of resources and talent.”

The value of collaboration, competition, and globalization

Sergio Bertolucci, Director for Research and Computing at CERN, agreed. “We have been international, but we need to be truly global.”

Such cooperation and competition among nations is good for the field, Ritz emphasized. “We are intensely competitive. We want to be the ones to discover [something new]. But we are also cooperative because we can’t do it alone.”

Panelist Jerry Blazey, Assistant Director for Physical Sciences in the Office of Science and Technology Policy (OSTP), DOE’s Siegrist, and others agreed that the LHC is a great model for the field to stand behind and emulate for future collaborative projects. Blazey and Siegrist said OSTP and DOE would work together to discuss ways to smooth the process for such future multinational collaborations and to implement the recommendations of the P5 report.

These include future U.S. work at the LHC, an internationalized Long Baseline Neutrino Facility located at Fermi National Accelerator Laboratory, and a role in Japan’s proposed linear collider, as well as continued investments in the technologies needed for future experiments. According to University of California, Irvine, physicist Andrew Lankford, chair of the High Energy Physics Advisory Panel (HEPAP), to which the report was delivered, the P5 report describes a field optimized for scientific progress. “It’s a ten-year strategic plan—way more than a collection of cool experiments,” he said.

And it emphasizes the value of international competition and cooperation—perhaps one of the biggest successes of particle physics, aside from the breathtaking discoveries. Turning again to the example of the LHC collaborations, Ritz said, “50 years ago some of these people were in countries that were trying to kill one another. Now we don’t even think about what country they are from.”

As Brookhaven’s Rajagopalan summed up, “It is an exciting time for our field as we plan to move forward with ambitious global projects to address the fundamental questions of nature.”

Brookhaven Lab’s particle physics research is supported by the DOE Office of Science.

Advances in accelerators built for fundamental physics research have inspired improved cancer treatment facilities. But will one of the most promising—a carbon ion treatment facility—be built in the U.S.? Participants at a symposium organized by Brookhaven Lab for the 2014 AAAS meeting explored the science and surrounding issues.

by Karen McNulty Walsh

Accelerator physicists are natural-born problem solvers, finding ever more powerful ways to generate and steer particle beams for research into the mysteries of physics, materials, and matter. And from the very beginning, this field born at the dawn of the atomic age has actively sought ways to apply advanced technologies to tackle more practical problems. At the top of the list—even in those early days—was taking aim at cancer, the second leading cause of death in the U.S. today, affecting one in two men and one in three women.

Using beams of accelerated protons or heavier ions such as carbon, oncologists can deliver cell-killing energy to precisely targeted tumors—and do so without causing extensive damage to surrounding healthy tissue, eliminating the major drawback of conventional radiation therapy using x-rays.

“This is cancer care aimed at curing cancer, not just treating it,” said Ken Peach, a physicist and professor at the Particle Therapy Cancer Research Institute at Oxford University.

Peach was one of six participants in a symposium exploring the latest advances and challenges in this field—and a related press briefing attended by more than 30 science journalists—at the 2014 meeting of the American Association for the Advancement of Science in Chicago on February 16. The session, “Targeting Tumors: Ion Beam Accelerators Take Aim at Cancer,” was organized by the U.S. Department of Energy’s (DOE’s) Brookhaven National Laboratory, an active partner in an effort to build a prototype carbon-ion accelerator for medical research and therapy. Brookhaven Lab is also currently the only place in the U.S. where scientists can conduct fundamental radiobiological studies of how beams of ions heavier than protons, such as carbon ions, affect cells and DNA.

“We could cure a very high percentage of tumors if we could give sufficiently high doses of radiation, but we can’t because of the damage to healthy tissue,” said radiation biologist Kathryn Held of Harvard Medical School and Massachusetts General Hospital during her presentation. “That’s the advantage of particles. We can tailor the dose to the tumor and limit the amount of damage in the critical surrounding normal tissues.”

Yet despite the promise of this approach and the emergence of encouraging clinical results from carbon treatment facilities in Asia and Europe, there are currently no carbon therapy centers operating in the U.S.

Participants in the Brookhaven-organized session agreed: That situation has to change—especially since the very idea of particle therapy was born in the U.S.

Physicists as pioneers

“When Harvard physicist Robert Wilson, who later became the first director of Fermilab, was asked to explore the potential dangers of proton particle radiation [just after World War II], he flipped the problem on its head and described how proton beams might be extremely useful—as effective killers of cancer cells,” said Stephen Peggs, an accelerator physicist at Brookhaven Lab and adjunct professor at Stony Brook University.

As Peggs explained, the reason is simple: Unlike conventional x-rays, which deposit energy—and cause damage—all along their path as they travel through healthy tissue en route to a tumor (and beyond it), protons and other ions deposit most of their energy where the beam stops. Using magnets, accelerators can steer these charged particles left, right, up, and down and vary the energy of the beam to precisely place the cell-killing energy right where it’s needed: in the tumor.
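The dose-vs.-depth contrast Peggs describes can be sketched with a toy model. Everything below (the functional forms, the attenuation constant, the 15 cm range) is purely illustrative and made up for this sketch; it is not clinical dosimetry, just a way to see why the "Bragg peak" matters:

```python
import math

def xray_dose(depth_cm, mu=0.05):
    """X-rays attenuate roughly exponentially with depth: most energy
    lands along the entrance path through healthy tissue, not at the tumor."""
    return math.exp(-mu * depth_cm)

def proton_dose(depth_cm, range_cm=15.0, width_cm=0.5):
    """Protons deposit a small, nearly flat dose until they slow down, then
    release most of their energy in a sharp Bragg peak near the end of their
    range. Modeled here as a low plateau plus a Gaussian spike (a toy model)."""
    plateau = 0.3 if depth_cm < range_cm else 0.0
    peak = math.exp(-((depth_cm - range_cm) ** 2) / (2 * width_cm ** 2))
    return plateau + peak

depths = [d / 2 for d in range(0, 41)]  # 0 to 20 cm in 0.5 cm steps
xray_peak_depth = max(depths, key=xray_dose)
proton_peak_depth = max(depths, key=proton_dose)
print(f"x-ray dose peaks at {xray_peak_depth} cm (the skin)")     # 0.0 cm
print(f"proton dose peaks at {proton_peak_depth} cm (the tumor)")  # 15.0 cm
```

Tuning the beam energy in this picture amounts to moving `range_cm`, which is exactly the knob accelerator physicists turn to place the peak inside the tumor.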

The first implementation of particle therapy used helium and other ions generated by the Bevatron at Berkeley Lab. Those spin-off studies “established a foundation for all subsequent ion therapy,” Peggs said. And as accelerators for physics research grew in size, pioneering experiments in particle therapy continued, operating “parasitically” until the very first accelerator built for hospital-based proton therapy was completed with the help of DOE scientists at Fermilab in 1990.

But even before that machine left Illinois for Loma Linda University Medical Center in California, physicists were thinking about how it could be made better. The mantra of making machines smaller, faster, cheaper—and capable of accelerating more kinds of ions—has driven the field since then.

Advances in magnet technology, including compact superconducting magnets and beam-delivery systems developed at Brookhaven Lab, hold great promise for new machines. Peggs is working to incorporate these technologies in a prototype ‘ion Rapid Cycling Medical Synchrotron’ (iRCMS) capable of delivering protons and/or carbon ions for radiobiology research and for treating patients.

The benefits of using charged particles heavier than protons (e.g., carbon ions) stem not only from their physical properties—they stop and deposit their energy over an even smaller and better-targeted tumor volume than protons—but also from a range of biological advantages they have over x-rays.

As Kathryn Held elaborated in her talk, compared with x-ray photons, “carbon ions are much more effective at killing tumor cells. They put a huge hole through DNA compared to the small pinprick caused by x-rays, which causes clustered or complex DNA damage that is less accurately repaired between treatments—less repaired, period—and thus more lethal [to the tumor].” Carbon ions also appear to be more effective than x-rays at killing oxygen-deprived tumor cells, and might be most effective in fewer higher doses, “but we need more basic biological studies to really understand these effects,” Held said.

Different types of radiation treatment cause different kinds of damage to the DNA in a tumor cell. X-ray photons (top arrow) cause fairly simple damage (purple area) that cancer cells can sometimes repair between treatments. Charged particles—particularly ions heavier than protons (bottom arrow)—cause more and more complex forms of damage, resulting in less repair and a more lethal effect on the tumor. (Credit: NASA)

Held conducts research at the NASA Space Radiation Laboratory (NSRL) at Brookhaven Lab, an accelerator-based facility built to understand the risks radiation poses to future astronauts and to help design protections for them. Much of that research is relevant to the basic radiobiological mechanisms and responses that apply to the treatment of cancer. But additional facilities and funding are needed for research specifically aimed at understanding the radiobiological effects of heavier ions for potential cancer therapies, Held emphasized.

Hak Choy, a radiation oncologist and chair in the Department of Radiation Oncology at the University of Texas Southwestern Medical Center, presented compelling clinical data on the benefits of proton particle therapy, including improved outcomes and reduced side effects when compared with conventional radiation, particularly for treating tumors in sensitive areas such as the brain and spine and in children. “When you can target the tumor and spare critical tissue you get fewer side effects,” he said.

Data from Japan and Europe suggest that carbon ions could be three or four times more biologically potent than protons, Choy said, backing that claim with impressive survival statistics for certain types of cancers where carbon therapy surpassed protons, and was even better than surgery for one type of salivary gland cancer. “And carbon therapy is noninvasive,” he emphasized.

To learn more about this promising technology and the challenges of building a carbon ion treatment/research facility in the U.S., including perspectives from the National Cancer Institute, DOE and a discussion about economics, read the full summary of the AAAS symposium here: http://www.bnl.gov/newsroom/news.php?a=24672.

Karen McNulty Walsh is a science writer in the Media & Communications Office at Brookhaven National Laboratory.

This post was written by Brookhaven Lab scientists Shigeki Misawa and Ofer Rind.

Run 13 at the Relativistic Heavy Ion Collider (RHIC) began one month ago today, and the first particles collided in the STAR and PHENIX detectors nearly two weeks ago. As of late this past Saturday evening, preparations are complete and polarized protons are colliding, with the machine and detectors operating in “physics mode,” which means gigabytes of data are pouring into the RHIC & ATLAS Computing Facility (RACF) every few seconds.

Today, we store data and provide the computing power for about 2,500 RHIC scientists here at Brookhaven Lab and institutions around the world. Approximately 30 people work at the RACF, which is located about one mile south of RHIC and connected to both the Physics and Information Technology Division buildings on site. There are four main parts to the RACF: computers that crunch the data, online storage containing data ready for further analysis, tape storage containing archived data from collisions past, and the network glue that holds it all together. Computing resources at the RACF are split about equally between the RHIC collaborations and the ATLAS experiment running at the Large Hadron Collider in Europe.

For RHIC, the data comes from heavy ions or polarized protons that smash into each other inside PHENIX and STAR. These detectors catch the subatomic particles that emerge from the collisions to capture information—particle species, trajectories, momenta, etc.—in the form of electrical signals. Most signals aren’t relevant to what physicists are looking for, so only the signals that trip predetermined triggers are recorded. For example, with the main focus for Run 13 being the proton’s “missing” spin, physicists are particularly interested in finding decay electrons from particles called W bosons, because these can be used as probes to quantify spin contributions from a proton’s antiquarks and different “flavors” of quarks.

Computers in the “counting houses” at STAR and PHENIX package the raw data collected from selected electrical signals and send it all to the RACF via dedicated fiber-optic cables. The RACF then archives the data and makes it available to experimenters running analysis jobs on any of our 20,000 computing cores.
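The trigger selection described above can be sketched in a few lines. This is a toy Python stand-in with hypothetical field names and thresholds; real triggers run in the detectors' electronics and dedicated data-acquisition software, not on Python dictionaries:

```python
# Toy sketch of trigger-style event selection. The 25 GeV threshold and the
# event/track fields are invented for illustration only.

def passes_w_electron_trigger(event, min_et_gev=25.0):
    """Keep events with at least one isolated electron candidate above a
    transverse-energy threshold -- a crude stand-in for selecting the
    W-boson decay electrons mentioned above."""
    return any(
        trk["species"] == "electron"
        and trk["et_gev"] >= min_et_gev
        and trk["isolated"]
        for trk in event["tracks"]
    )

events = [
    {"id": 1, "tracks": [{"species": "pion", "et_gev": 3.1, "isolated": False}]},
    {"id": 2, "tracks": [{"species": "electron", "et_gev": 41.7, "isolated": True}]},
    {"id": 3, "tracks": [{"species": "electron", "et_gev": 9.8, "isolated": True}]},
]

# Only events that trip the trigger are recorded; the rest are discarded.
recorded = [e["id"] for e in events if passes_w_electron_trigger(e)]
print(recorded)  # [2]
```

The point of the filter is volume: discarding the uninteresting signals before they are written out is what keeps the data rate at the counting houses manageable.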

Recent Upgrades at the RACF

Polarized protons are far smaller than heavy ions, so they produce considerably less data when they collide, but even so, when we talk about data at the RACF, we’re talking about a lot of data. During Run 12 last year, we began using a new tape library to increase storage capacity by 25 percent for a total of 40 petabytes—the equivalent of 655,360 of the largest iPhones available today. We also more than doubled our ability to archive data for STAR last year (in order to meet the needs of a data acquisition upgrade) so we can now sustain 700 megabytes of incoming data every second for both PHENIX and STAR. Part of this is due to new fiber-optic cables connecting the counting houses to the RACF, which provide both increased data rates and redundancy.
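The iPhone comparison checks out if you assume binary units and a 64-gigabyte phone, the largest model available at the time (an assumption; the post doesn't spell it out):

```python
# Back-of-envelope check of the storage figures above, assuming binary units
# (pebibytes/gibibytes) and a 64 GB iPhone.

PETABYTE = 2**50  # bytes, binary petabyte
GIGABYTE = 2**30  # bytes, binary gigabyte

library_bytes = 40 * PETABYTE
iphone_bytes = 64 * GIGABYTE
print(library_bytes // iphone_bytes)  # 655360 iPhones

# At the sustained 700 MB/s archiving rate, one day of continuous data taking:
per_day_tb = 700 * 2**20 * 86400 / 2**40
print(round(per_day_tb, 1))  # 57.7 (terabytes per day)
```

At that rate the four petabytes expected from a full run corresponds to a couple of months of continuous archiving, which is why capacity planning with the experiments happens week by week.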

With all this in place, along with those 20,000 processing cores (most computers today have two or four cores), certain operations that used to require six months of computer time can now often be completed in less than one week.

Looking Ahead

If pending budgets allow for the full 15-week run planned, we expect to collect approximately four petabytes of data from this run alone. During the run, we meet formally with liaisons from the PHENIX and STAR collaborations each week to discuss the amount of data expected in the coming weeks and to assess their operational needs. Beyond these meetings, we are in continual communication with our users, as we monitor and improve system functionality, troubleshoot, and provide first-line user support.

We’ll also continue to work with experimenters to evaluate computing trends, plan for future upgrades, and test the latest equipment—all in an effort to minimize bottlenecks that slow the data from getting to users and to get the most bang for the buck.

Heat: Adventures in the World's Fiery Places (Little, Brown, 2013). If you haven't already fallen in love with the groundbreaking science that's taking place at RHIC, this book about all things hot is sure to ignite your passion.

Bill Streever, a biologist and best-selling author of Cold: Adventures in the World’s Frozen Places, has just published his second scientific survey, which takes place at the opposite end of the temperature spectrum. Heat: Adventures in the World’s Fiery Places features flames, firewalking, and notably, a journey into the heart of the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory.

I accompanied Streever for a full-day visit in July 2011 with physicist Barbara Jacak of Stony Brook University, then spokesperson of the PHENIX Collaboration at RHIC. The intrepid reporter (who’d already tagged along with woodland firefighters and walked across newly formed, still-hot volcanic lava—among other adventures described in the book) met with RHIC physicists at STAR and PHENIX, descended into the accelerator tunnel, and toured the refrigeration system that keeps RHIC’s magnets supercold. He also interviewed staff at the RHIC/ATLAS Computing Facility—who face the challenge of dissipating unwanted heat while accumulating and processing reams of RHIC data—as well as theorists and even climate scientists, all in a quest for understanding the ultrawarm.

The result is an enormously engaging, entertaining, and informative portrayal of heat in a wide range of settings, including the 7-trillion-degree “perfect” liquid quark-gluon plasma created at RHIC, and physicists’ pursuit of new knowledge about the fundamental forces and interactions of matter. But Streever’s book does more: It presents the compelling story of creating and measuring the world’s hottest temperature within the broader context of the Lab’s history, including its role as an induction center during both World Wars, and the breadth and depth of our current research—from atoms to energy and climate research, and even the Long Island Solar Farm.

“Brookhaven has become an IQ magnet, where smart people congregate to work on things that excite geniuses,” he writes.

Streever’s own passion for science comes across clearly throughout the book. But being at “the top of the thermometer” (the title of his final chapter, dedicated in part to describing RHIC) has its privileges. RHIC’s innermost beam pipes—at the hearts of its detectors, inside which head-on ion collisions create the highest temperature ever measured in a laboratory—have clearly left an impression:

“… I am forever enthralled by Brookhaven’s pipes. At the top of the thermometer, beyond any temperature that I could possibly imagine, those pipes explore conditions near the beginning of the universe … In my day-to-day life, bundled in a thick coat or standing before my woodstove or moving along a snow-covered trail, I find myself thinking of those pipes. And when I think of them, I remember that at the top of the thermometer lies matter with the audacity to behave as though it were absolutely cold, flowing like a perfect liquid…”

There’s more, a wonderful bit more that conveys the pure essence of science. But I don’t want to spoil it. Please read and share this book. The final word is awe.

The book is available for purchase through major online retailers and in stores.

We sat down with Brookhaven theoretical physicist Raju Venugopalan for a conversation about “color glass condensate” and the structure of visible matter in the universe.

Q. We’ve heard a lot recently about a “new form of matter” possibly seen at the Large Hadron Collider (LHC) in Europe — a state of saturated gluons called “color glass condensate.” Brookhaven Lab, and you in particular, have a long history with this idea. Can you tell me a bit about that history?

A. The idea for the color glass condensate arose to help us understand heavy ion collisions at our own collider here at Brookhaven, the Relativistic Heavy Ion Collider (RHIC)—even before RHIC turned on in 2000, and long before the LHC was built. These machines are designed to look at the most fundamental constituents of matter and the forces through which they interact—the same kinds of studies that a century ago led to huge advances in our understanding of electrons and magnetism. Only now instead of studying the behavior of the electrons that surround atomic nuclei, we are probing the subatomic particles that make up the nuclei themselves, and studying how they interact via nature’s strongest force to “give shape” to the universe today.

We do that by colliding nuclei at very high energies to recreate the conditions of the early universe so we can study these particles and their interactions under the most extreme conditions. But when you collide two nuclei and produce matter at RHIC, and also at the LHC, you have to think about the matter that makes up the nuclei you are colliding. What is the structure of nuclei before they collide?

We all know the nuclei are made of protons and neutrons, and those are each made of quarks and gluons. There were hints in data from the HERA collider in Germany and other experiments that the number of gluons increases dramatically as you accelerate particles to high energy. Nuclear physics theorists predicted that the ions accelerated to near the speed of light at RHIC (and later at LHC) would reach an upper limit of gluon concentration—a state of gluon saturation we call color glass condensate. The collision of these super-dense gluon force fields is what produces the matter at RHIC, so learning more about this state would help us understand how the matter is created in the collisions. The theory we developed to describe the color glass condensate also allowed us to make calculations and predictions we could test with experiments.

The art of data mining is about searching for the extraordinary within a vast ocean of regularity. This can be a painful process in any field, but especially in particle physics, where the amount of data can be enormous, and ‘extraordinary’ means a new understanding about the fundamental underpinnings of our universe. Now, a tool first conceived in 2005 to manage data from the world’s largest particle accelerator may soon push the boundaries of other disciplines. When repurposed, it could bring the immense power of data mining to a variety of fields, effectively cracking open the possibility for more discoveries to be pulled up from ever-increasing mountains of scientific data.

Advanced data management tools offer scientists a way to cut through the noise by analyzing information across a vast network. The result is a searchable pool that software can sift through and use for a specific purpose. One such hunt was for the Higgs boson, the last remaining elementary particle of the Standard Model that, in theory, endows other particles with mass.

With the help of a system called PanDA, or Production and Distributed Analysis, researchers at CERN’s Large Hadron Collider (LHC) in Geneva, Switzerland, discovered such a particle by slamming protons together at relativistic speeds hundreds of millions of times per second. The data produced from those trillions of collisions—roughly 13 million gigabytes worth of raw information—was processed by the PanDA system across a worldwide network and made available to thousands of scientists around the globe. From there, they were able to pinpoint a previously unknown boson with a mass between 125 and 127 GeV, a value consistent with the long-sought Higgs.

The sheer amount of data arises from the fact that each particle collision carries unique signatures that compete for attention with the millions of other collisions happening nanoseconds later. These must be recorded, processed, and analyzed as distinct events in a steady stream of information.

RHIC, the Relativistic Heavy Ion Collider at Brookhaven Lab, found it first: a “perfect” liquid of strongly interacting quarks and gluons – a quark-gluon plasma (QGP) – produced by slamming heavy ions together at close to the speed of light. The fact that the QGP produced in these particle smashups was a liquid and not the expected gas, and that it flowed like a nearly frictionless fluid, took the physics world by surprise. These findings, now confirmed by heavy-ion experiments at the Large Hadron Collider (LHC) in Europe, have raised compelling new questions about the nature of matter and the strong force that holds the visible universe together.

Similarly, searches for the source of “missing” proton spin at RHIC have opened a deeper mystery: So far, it’s nowhere to be found.

To probe these and other puzzles, nuclear physicists would like to build a new machine: an electron-ion collider (EIC) designed to shine a very bright “light” on both protons and heavy ions to reveal their inner secrets.

Born in the hearts of stars and nuclear reactors, almost undetectable, nearly as fast as light, able to pass unhindered through everything from planets to people, and confirmed shapeshifters. That roll call describes what makes the particles known as neutrinos both exciting and perpetually challenging for physicists on the hunt.

A series of brilliant experiments designed and executed since the 1950s have managed to detect these slippery subatomic wonders, revealing much about their origins, travels, and presence as one of the most abundant particles in the cosmos.

Earlier this week, an international collaboration led by China and the United States at the Daya Bay Reactor Neutrino Experiment in the south of China pinpointed the action behind one of the neutrino’s signature magic tricks: its ability to seemingly vanish entirely. The disappearing act is the product of neutrino oscillations, and the Daya Bay team measured the final unknown transformation type. The 5-sigma discovery not only helps demystify the neutrino, but it will also guide future experiments in exposing more fundamental mysteries – such as how we exist.

Sensitive photomultiplier tubes line the Daya Bay detector walls, designed to amplify and record the faint flashes that signify an antineutrino interaction. (Courtesy of Roy Kaltschmidt, Lawrence Berkeley National Laboratory)

“It’s surprising and exciting that this result came so quickly and precisely,” said Brookhaven Lab’s Steve Kettell, who is Chief Scientist for the U.S. at Daya Bay. “It has been very gratifying to be able to work with such an outstanding international collaboration at the world’s most sensitive reactor neutrino experiment.”

Working with an international team, three physicists from Brookhaven Lab have helped to demonstrate the feasibility of a new kind of particle accelerator that may be used in future physics research, medical applications, and power-generating reactors. The team reported the first successful acceleration of particles in a small-scale model of the accelerator in a paper published in Nature Physics.

The device, named EMMA and constructed at the Daresbury Laboratory in the UK, is the first non-scaling fixed field alternating gradient accelerator, or non-scaling FFAG, ever built. It combines features of several other accelerator types to achieve rapid acceleration of subatomic particles while keeping the scale — and therefore, the cost — of the accelerator relatively low.

Today’s public seminar at CERN, where the ATLAS and CMS collaborations presented the preliminary results of their searches for the Standard Model (SM) Higgs boson with the full dataset collected during 2011, is a landmark for high-energy physics!

The Higgs boson is a still-hypothetical particle postulated in the mid-1960s to complete what is considered the SM of particle interactions. Its role within the SM is to provide other particles with mass. Specifically, the mass of elementary particles is the result of their interaction with the Higgs field. The Higgs boson’s properties are defined in the SM, apart from its mass, which is a free parameter of the theory.