News / Highlights / Colloquium

A general statistical analysis (classical statistics) is the common experimental procedure for determining how photon statistics limit the uncertainty in measuring a line shift and width. Given the importance of accounting for the background as well as the measured signal in any photon measurement, the paper treats both the perfect spectrometer with zero and with nonzero background, as well as the case of an imperfect spectrometer.
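The role of the background can be illustrated with a toy Monte Carlo (a minimal sketch of our own, not the paper's analysis): photons are drawn from a Gaussian line plus a flat background, and the scatter of the line centroid over many trials shows how background counts inflate the statistical uncertainty. All parameters (line center, width, window) are illustrative assumptions.

```python
import random
import statistics

# Toy model: a Gaussian emission line sampled photon-by-photon, plus a
# flat background over the detection window. The centroid of all detected
# photons estimates the line position.
def simulate_centroid(n_signal, n_background, line_center=5.0,
                      line_sigma=0.1, window=(0.0, 10.0), seed=None):
    rng = random.Random(seed)
    photons = [rng.gauss(line_center, line_sigma) for _ in range(n_signal)]
    photons += [rng.uniform(*window) for _ in range(n_background)]
    return statistics.fmean(photons)

# Spread of the centroid estimate over many independent trials.
def centroid_scatter(n_signal, n_background, trials=500):
    centers = [simulate_centroid(n_signal, n_background, seed=i)
               for i in range(trials)]
    return statistics.stdev(centers)

no_bg = centroid_scatter(1000, 0)      # pure signal
with_bg = centroid_scatter(1000, 200)  # same signal plus flat background
# Background photons, spread uniformly over the window, make the centroid
# markedly noisier than in the background-free case.
```

The flat background is symmetric about the line center here, so it adds no bias, only variance; an asymmetric background would shift the centroid as well.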

Distribution of δt = t_LVD − t_OPERA for corrected events. All the events of each year are grouped into a single point, with the exception of 2008, which is subdivided into three periods.

The halls of the INFN Gran Sasso Laboratory (LNGS) were built in the 1980s, based on a design by A. Zichichi, and oriented towards CERN for experiments on neutrino beams. In 2006 the CERN Neutrinos to Gran Sasso (CNGS) beam was ready, and the search for tau-neutrino appearance in the muon-neutrino beam produced at CERN could begin. The OPERA detector was designed and built for this purpose.

What is the general relativistic version of Navier-Stokes-Fourier dissipative hydrodynamics? Surprisingly, no satisfactory answer to this question is known today. Eckart's early solution [Eckart, Phys. Rev. 58, 919 (1940)] is considered outdated on many grounds: the instability of its equilibrium states, an ill-posed initial-value formulation, inconsistency with linear irreversible thermodynamics, etc. Although alternative theories have been proposed recently, none appears to have won consensus.

An interesting feature of black holes is the existence of quasi-normal modes, arising because the system has a peak in the wave potential (for scalar, electromagnetic, or gravitational waves). A quasi-normal mode is excited when a disturbance is placed in the field near but outside the black hole (like a wave packet roughly in a circular orbit near the peak). The excitation then propagates outward and inward and decays.
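Far from the black hole, the decaying excitation of a single dominant mode looks like an exponentially damped sinusoid. A minimal sketch (a toy signal with assumed frequency and damping time, not a black-hole computation) shows how the damping time can be read off from the drop in amplitude over one oscillation period:

```python
import math

# Toy ringdown signal: one dominant quasi-normal mode,
#   psi(t) = exp(-t/tau) * cos(omega*t + phi)
def ringdown(t, omega=1.0, tau=4.0, phi=0.0):
    return math.exp(-t / tau) * math.cos(omega * t + phi)

# Successive amplitude maxima are one period T = 2*pi/omega apart and
# shrink by a factor exp(-T/tau), so tau follows from their ratio.
omega, tau = 1.0, 4.0
T = 2 * math.pi / omega
a1 = abs(ringdown(0.0, omega, tau))
a2 = abs(ringdown(T, omega, tau))
tau_est = T / math.log(a1 / a2)  # recovers the assumed damping time tau
```

In practice a measured waveform contains several overtones and noise, so the frequency and damping time are extracted by fitting rather than from a single amplitude ratio.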

At present there are no known elementary scalar fields. A possible candidate is the as yet undiscovered Higgs particle; however, it could well be that this elusive particle is instead composite. This possibility is exhaustively examined in this article, which is both a tutorial and an extensive review, classifying the diverse technicolor models as extensions of the Standard Model of particle physics.

To date the most successful efforts to solve the Helmholtz equation on irregular boundaries with Neumann boundary conditions have been computational, but even this general method has its drawbacks. Panda et al. provide a new analytic approach, which solves the irregular-boundary problem via a perturbative series. As the authors show by working out several nontrivial examples, the benefits of this approach include a precise understanding of how the solution behaves as the amplitude of the boundary distortion is increased, as well as control over the analytic precision of the computed terms, with corresponding analytic error estimates.
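The flavor of such a perturbative series can be conveyed with a one-dimensional caricature (our own toy, not the method of Panda et al.): Neumann modes on an interval [0, L] have wavenumbers k_n = nπ/L, so distorting the boundary to L = 1 + ε gives the convergent series k_n = nπ(1 − ε + ε² − …), where each truncation order carries an explicit O(ε^(m+1)) error estimate.

```python
import math

# Exact Neumann wavenumbers on the distorted interval [0, 1 + eps].
def k_exact(n, eps):
    return n * math.pi / (1.0 + eps)

# Perturbative series truncated at the given order: partial sum of the
# geometric series for 1/(1 + eps).
def k_series(n, eps, order):
    return n * math.pi * sum((-eps) ** m for m in range(order + 1))

eps = 0.05
err1 = abs(k_series(1, eps, 1) - k_exact(1, eps))  # truncation error O(eps**2)
err2 = abs(k_series(1, eps, 2) - k_exact(1, eps))  # truncation error O(eps**3)
# Each additional order shrinks the error by roughly a factor of eps,
# mirroring the controlled analytic error estimates described above.
```

For a genuinely irregular two-dimensional boundary the expansion is in the distortion amplitude rather than in the domain length, but the same structure of order-by-order corrections with explicit error bounds applies.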

Oncological hadron therapy was first proposed 65 years ago by Robert R. Wilson, and it took more than 40 years to build the first dedicated facility, at Loma Linda in the nineties. The growth of new facilities since then has been exponential, and thousands of patients are now treated every year. Close collaboration between research institutes, clinical centers and industry is the basis and the future of this field. This EPJ Plus focus issue spotlights the status of hadron therapy in Europe, where several centers are already in operation, some are just now ready to start patient treatments, and new ones are being planned.

While the hunt is on for the Higgs at the LHC, model building continues to explore other scenarios as well. Here, an ultraviolet-complete electroweak model is presented that assumes running coupling constants described by energy-dependent entire functions. Contrary to the conventional formulation, the action contains no physical scalar fields and no Higgs particle, yet the predicted particle masses are compatible with known experimental values.

The need to store, distribute and analyze the 15 million gigabytes of data generated annually by the Large Hadron Collider (LHC) at CERN has driven a revolutionary development of innovative software tools. Under CERN coordination, leading IT teams have tested and validated cutting-edge software technologies aimed at operating distributed computing and data storage infrastructures, based on a worldwide network of hundreds of computing centers, on an unprecedented scale.

Thank you. We are indebted to the reviewers and the Editor for their time and their precise comments and suggestions. We are also grateful to the Editorial Office: the speed of manuscript processing and the regular updates were excellent.