Friday, 27 February 2015

Wherein I list some (mostly) recent happenings, ramble a bit, and provide links, in an order roughly determined by importance and relevance to particle physics.

Doing the rounds lately has been a new mechanism for leptogenesis: (post-inflationary) Higgs relaxation leptogenesis. APS has a synopsis and Scientific American has a story (never let difficult physics stop a title containing both "Higgs" and "antimatter"). Kusenko, Pearce and Yang achieve the observed matter-antimatter asymmetry with this mechanism in nothing more than the minimal Type I see-saw model. The Type I see-saw is a very nice model for neutrino masses, and it can explain leptogenesis via the Fukugita-Yanagida mechanism (1986), but it cannot do both without ceding naturalness. For me this provokes an interesting question: is there any way to achieve natural leptogenesis in the Type I see-saw without adding new particles? The only ways I am aware of are resonant leptogenesis and leptogenesis via neutrino oscillations. Higgs relaxation could be another (and that would be a nice selling point for the mechanism) if it can work with $M_N\lesssim 10^7$ GeV... but this requires further thought and investigation; in the examples they give (Figure 2) the right-handed neutrinos are much heavier than that.
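For reference, here is where that $M_N\lesssim 10^7$ GeV number comes from, sketched only up to $O(1)$ and logarithmic factors: the see-saw relation and the right-handed-neutrino loop correction to the Higgs mass parameter give

$$ m_\nu \sim \frac{y_\nu^2 v^2}{M_N}, \qquad \delta m_h^2 \sim \frac{y_\nu^2 M_N^2}{(4\pi)^2} \sim \frac{m_\nu M_N^3}{(4\pi)^2 v^2}, $$

so demanding $\delta m_h^2 \lesssim m_h^2$ with $m_\nu\sim 0.05$ eV and $v\simeq 174$ GeV gives $M_N\lesssim \text{few}\times 10^7$ GeV.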

Tom Kibble's recollections on the history of electroweak symmetry breaking are online. I don't think there is anything in there that is particularly new, but it is a nice short document written at an accessible level. Here is an excerpt on the three-year gap between the Higgs mechanism and the electroweak theory:

The 1964 papers [on avoiding the Goldstone theorem] from the three groups attracted very little attention at the time. Talks on the subject were often greeted with scepticism. By the end of that year, the mechanism was known, and Glashow’s (and Salam and Ward’s) SU(2)×U(1) model was known. But, surprisingly perhaps, it still took three more years for anyone to put the two together. This may have been in part at least because many of us were still thinking primarily of a gauge theory of strong interactions, not weak... A unified gauge theory of weak and electromagnetic interactions of leptons was first proposed by Weinberg later that year. Essentially the same model was presented independently by Salam in lectures he gave at Imperial College in the autumn of 1967 — he called it the electroweak theory... Weinberg and Salam both speculated that their theory was renormalizable, but they could not prove it... Renormalizability was finally proved by a young student, Gerard ’t Hooft, in 1971, a real tour de force using methods developed by his supervisor, Martinus Veltman.

I'm seeing a bunch of articles linking mass extinctions to disk dark matter again this week. The origin appears to be a press release from NYU on a paper by a Professor of Biology who appears to be an expert in extinction events. You can read it in full for free here. The paper, which is in a Monthly Notices journal, seems to me to be largely a review, perhaps with the new suggestion of linking the work of Abbas-Abbas (clumpy dark matter capture causing large-scale volcanism) with the Randall-Reece disk dark matter scenario to explain the ~35 Myr period in mass-extinction events. For that reason it is disappointing not to see any reference to those authors in the press release or in the related articles circulating. Perhaps as a consolation, given the buzz around those articles, it is clear that Randall's new book should sell on its title alone!

A new arXiv preprint shows that in elliptical galaxies, the central stellar velocity dispersion (a surrogate for the central black hole mass) is more tightly correlated with the total gravitating mass of the galaxy than with the stellar mass, connecting the central black holes of elliptical galaxies with their dark matter haloes.

A separate preprint has observed nearly-spherical streams of ionized gas winds coming from an active central black hole. Bad Astronomy has a nice article about it. The winds transfer energy into the galaxy well beyond the gravitational influence of the black hole, affecting the galaxy's evolution.

I learned this week that it is actually fairly easy to extract data from a vector plot. This is a common task for the phenomenologist, so I will explain two examples below for reference, since I haven't seen it written up anywhere in detail; skip this point if you're not interested.

Usually, if I need to scrape a plot (often exclusion curves), I just use WebPlotDigitizer; however, some plots have too many points or indiscernible lines and are not amenable to that method. For example, the BaBar limit on dark photons is a nightmare:

But it is a vector plot, so in principle it should be possible to extract the information on how to draw the curve. Here is one way...

First save the page of the .pdf as an .svg file (I used inkscape). When you open the .svg file in a text editor you will find some intuitive syntax describing how to draw the plot. We want to pick out the BaBar line, so I looked for an occurrence of stroke:#000000 (a black line) associated with a list of numbers. If you do this yourself you will find those numbers in the following context inside a "path" element:

d="m 146.558,372.935 0.639,-122.326 1.668,4.96 1.625,1.169 ...

According to the SVG syntax, a path element contains a d="(path data)" attribute, where the 'd' attribute holds the moveto, lineto, curve (cubic and quadratic Béziers), arc and closepath instructions. In the above case we have a lower-case m, which indicates a relative moveto: the first pair is the (absolute) starting point, and each subsequent pair is treated as a relative lineto, i.e. an offset from the previous point, tracing out the line of interest. To get the points we then just have to transcribe the list into our favourite format and write a short loop to accumulate the offsets into absolute positions.
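For instance, a minimal sketch of that loop in Python (using the truncated path data quoted above; the real string is of course much longer):

```python
def path_to_points(d):
    """Convert SVG path data of the form 'm x,y dx,dy dx,dy ...'
    (a relative moveto followed by implicit relative linetos)
    into absolute (x, y) coordinates. Assumes the path contains
    no other command letters (c, l, z, ...)."""
    pairs = [tuple(float(v) for v in tok.split(","))
             for tok in d.replace("m", "").split()]
    points = [pairs[0]]            # the first pair is absolute
    for dx, dy in pairs[1:]:       # the rest are offsets
        x, y = points[-1]
        points.append((x + dx, y + dy))
    return points

# The start of the BaBar curve quoted above:
print(path_to_points("m 146.558,372.935 0.639,-122.326 1.668,4.96 1.625,1.169"))
```

Note these are still page coordinates; the final step is to map them onto the plot axes using two reference points per axis (taking logs first if the axis is logarithmic).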

In Mathematica, this can be done by replacing the spaces with "},{", adding opening and closing brackets to obtain a list of pairs, and then applying Accumulate to turn the relative offsets into absolute positions.

All the points you want are the coordinates after the m; you just need to extract them. There are no doubt multiple approaches to this. In bash you could grep or awk or sed your way there. For this example I used the command

$ sed -n '/fill:#00ffff/{n;p;}' BaBar1502.svg > cyanpts.dat

to print to file every line that follows a line matching fill:#00ffff (a cyan curve, in this case). If you paste that file into a spreadsheet program you can then pick off the numbers of interest into columns.
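If you'd rather skip the spreadsheet step, the same filter can be done in a few lines of Python (a sketch; the filename and colour are the ones from the sed example above):

```python
def lines_after_match(svg_file, needle):
    """Mimic `sed -n '/needle/{n;p;}'`: return each line that
    immediately follows a line containing `needle`."""
    with open(svg_file) as f:
        lines = f.read().splitlines()
    return [lines[i + 1]
            for i, line in enumerate(lines)
            if needle in line and i + 1 < len(lines)]

# for d in lines_after_match("BaBar1502.svg", "fill:#00ffff"):
#     print(d)
```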

I missed this talk by Dan Hooper from a few weeks ago. Discussion of the galactic centre excess is at 28-46 mins. There is also a fun anecdote about astrophysicist Troy Porter. [1 hour]

Katie Mack talking about Mars plumes, the star that passed through the Oort Cloud 70,000 years ago, the Europa Clipper mission, and Ceres on breakfast radio. Begins at 2:13. [9 minutes]

SmarterEveryday is starting a space series with a personal touch. In the first video astronaut Don Pettit shows how he turns off the streetlight at the end of his driveway with a laser so that he can better view the night sky. [4 minutes]

Mary Somerville, Alan Turing, and Stephen Hawking autographs in The Royal Society Charter Book at Objectivity. [5 minutes]

Interstellar's rapidly spinning black hole was slowed down for aesthetic reasons during rendering, to reduce the resulting brightness asymmetry and blueshift.

As we know, the arXiv surpassed $10^6$ papers last month and increased its article-identifier sequence number to 5 digits to cope with the fact that it is nearing $10^4$ monthly submissions. Hot on arXiv's tail, viXra has now surpassed $10^4$ submissions! The real winner though is the snarXiv, with well over $\infty$ submissions...

The bright spot on Ceres is actually two bright spots! (Ceres is opening its other eye...)

This week we won't end on space images. Instead I reproduce a poem by Thomas Otto called "LHC" (I enjoyed the final stanza).

In Hobart we heard an update on geological backgrounds. In short: neutron flux okay, gamma flux okay, radon flux high but manageable with surface air / adsorption on activated carbon / shielding. All good news.

A detector in the southern hemisphere has the capacity to settle the case of the DAMA/LIBRA annual modulation signal. If the DAMA/LIBRA signal is real and originates from Earth's passage through the dark matter halo, then the signal should look the same for a similar experiment in the southern hemisphere. If the signal is spurious, due to some local Gran Sasso background or an unaccounted-for seasonal effect, then the signal will disappear or flip phase. A diurnal modulation signal might even be observed if dark matter is captured within the Earth. Any result would be an interesting result, and a new-generation twin experiment would be particularly interesting.

Related is a Wall Street Journal article about direct detection at Gran Sasso, specifically the Darkside-50 experiment. It includes a two minute video.

ICARUS has updated and revised their comparison to the MiniBooNE $\nu_\mu$-$\nu_e$ oscillation anomaly, still suggesting an "unexplained nature or an otherwise instrumental effect for the MiniBooNE low energy event excess." The paper is worth reading if only for the maligning undertones... ICARUS already presented their results in July 2013 which appeared to rule out MiniBooNE at >99% CL:

MiniBooNE posted a critical reply which "explains and corrects the mistaken analysis published by the ICARUS collaboration." The confusion seems to be in the translation between the reconstructed energy $E^{QE}_\nu$ and $E_{True}$. Let me quote an excerpt from the new ICARUS paper to give you a feel for their thoughts on the issue:

It appears that the reconstructed energy is affected by a huge non-Gaussian smearing compared with the true neutrino energy, as clearly stated in [the MiniBooNE reply] (see Figure 2), in contrast with the much better 11% resolution on $\nu_e$ event energy quoted in a previous paper. This difference between $E_{True}$ and $E^{QE}_\nu$, for which MiniBooNE gave a quite elliptical explanation, is the major cause of the problem in using the L/E (or E/L) to compare data with expectations...

They make two further remarks. First they argue that the MiniBooNE reply implies that MiniBooNE's own results were represented in a misleading way in the original paper, by being directly compared to LSND data. Second they point out that the MiniBooNE quoted errors are inconsistent from paper to paper, and they even have a plot comparing the errors to make the point.

Anyway, upon updating the comparison, the MiniBooNE anomaly appears excluded at 90% CL but no longer at 99% CL.

The LHC PR machine has really started to gear up. Checking the In the News section at Interactions.org I count 26 news articles in the last week. It's the usual sell: the big bang, Higgs, SUSY, dark matter, and baryogenesis.

Old news, but I just learned that since the end of September last year you can search for papers in Inspire by copy-pasting a reference from a paper and using: find rawref "..." . That will save us a bit of time.

Natalie Wolchover at Quanta magazine always produces well-written and well-balanced articles (even Luboš agrees), and has delivered another one, this time on string theory; the byline asks, "Researchers are demonstrating that, in certain contexts, string theory is the only consistent theory of quantum gravity. Might this make it true?"

It appears that Lisa Randall will be releasing a book "Dark Matter and the Dinosaurs" at the end of October this year. If you're wondering about the connection, Randall has an arXiv paper which links comet impacts to periodic transits of the Oort cloud through the galactic disk, hypothesised to align with a dark matter disk (if some component of dark matter is dissipative). Actually, The Economist picked up this story yesterday, though there's no mention of Randall.

The Pale Blue Dot photograph taken by Voyager 1 is 25 years old. Read the famous reflections by Sagan here. Voyager 1 entered interstellar space in mid-2012; it's still sending back data, but by 2030 will be unable to power any instrument. Click here to see where it is now.

Lastly, images from space...

Mysterious plumes are erupting on Mars (gif here), and as Dawn edges closer, the nature of the white spots on Ceres is still unknown: "We expected to be surprised; we did not expect to be this puzzled"; Dawn will enter orbit on March 6. Are we living at the beginning of a sci-fi novel?

As for the rest of the 2015 Planck release, the base cosmological parameters have hardly changed (see e.g. Table 1 of this paper). ΛCDM of course continues to explain everything extremely well. The $\Sigma m_\nu$ upper limit is down to 0.17 eV (Planck TT,TE,EE+lowP+BAO) at best, almost at a level to rule out an inverted hierarchy ($\Sigma m_\nu\gtrsim 0.1$ eV). $N_{\rm eff}$ is still at 3.04 ± 0.18 (Planck TT,TE,EE+lowP+BAO), consistent with the standard value of 3.046; there is no longer a hint of an excess there. As for inflation, Jester weighed in, and Planck present their results in this paper:

It looks like the canonical Starobinsky $R^2$ inflation (1980) is taking the lead, but time will tell... experiments will need to probe at the level $r\sim 0.001$ to test it.
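Back to the $\Sigma m_\nu$ limit for a moment: the $\gtrsim 0.1$ eV floor for the inverted hierarchy follows directly from the measured mass-squared splittings. A quick back-of-the-envelope (the splittings below are representative global-fit values I've assumed, not numbers from the Planck papers):

```python
from math import sqrt

# Mass-squared splittings in eV^2 (representative global-fit values)
dm2_sol = 7.5e-5   # solar
dm2_atm = 2.4e-3   # atmospheric (magnitude)

# Inverted hierarchy with a massless lightest state: m3 = 0,
# while m1 and m2 both sit near the atmospheric scale.
m3 = 0.0
m1 = sqrt(dm2_atm)
m2 = sqrt(dm2_atm + dm2_sol)

print(f"minimal IH sum: {m1 + m2 + m3:.3f} eV")  # → minimal IH sum: 0.099 eV
```

(The normal hierarchy floor, by contrast, is roughly $\sqrt{\Delta m^2_{\rm atm}}\approx 0.05$ eV, which is why the 0.17 eV limit bites the inverted case first.)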

BaBar released a new paper, "Search for Long-Lived Particles in e+e− Collisions". No significant signal is observed. It seems worth mentioning that they see a high local significance of events in the dimuon mode at $m_{\mu\mu}\approx 212$ MeV, which interestingly is very close to the HyperCP anomaly, but they say it is consistent with background from photon conversions. Anyway, from the point of view of a phenomenologist I especially like the effort BaBar has put into presenting their results in a model independent way. For example:

1. They provide upper limits on $\sigma(e^+e^-\to LX)\mathcal{B}(L\to f)\epsilon(f)$, where $L$ is the long-lived particle and $f$ is the final state, which are completely agnostic of the production mechanism.
2. They provide limits on $\mathcal{B}(B\to LX_s)\mathcal{B}(L\to f)$, where $X_s$ is "strange stuff", which significantly reduces some theoretical uncertainties for exotic $B$ decays and increases the possible signal yield.
3. They look at six different two-body final states.
4. On top of this, they will provide the full efficiency as a function of $m$, $c\tau$ and $p_T$ in supplementary material, so that their results can be reinterpreted.

Very well done from BaBar. Their measurement has implications for a very simple extension of the Standard Model with a real singlet scalar mixing with the Higgs (a Higgs portal). I have quickly reinterpreted their Fig. 3 bounds (1 cm < $c\tau_L$ < 100 cm) on the $\mu^+\mu^-$, $\pi^+\pi^-$ and $K^+K^-$ final states, assuming the limit lines are $\mathcal{B}(B\to LX_s)\mathcal{B}(L\to f)\lesssim 10^{-6.5}$...

The above plot shows exclusions (within solid lines) for the model as a function of the light scalar mass and mixing (see a previous paper of ours). The incremental shadings represent different lifetime regions, and the shaded band between masses of 0.28 GeV and 4 GeV indicates a region where the branching predictions are very uncertain; there we use the most recent calculation (still >20 years old!) below 1.4 GeV and a perturbative calculation above. The approximate exclusion from BaBar is shown as the purple dashed line. Indeed, they are exploring previously unexplored parameter space of interest for the inflationary model of Bezrukov and Gorbunov (between the black dashed lines). They are limited by difficult backgrounds between masses of 0.37 GeV and 0.86 GeV. The purple dotted line indicates the region they would have excluded had they been able to limit the $\mu^+\mu^-$ and $\pi^+\pi^-$ final states to the same level in that region. I am keen to get my hands on the supplementary material when it is released.

A new paper published in Nature Physics (not on the arXiv [edit: now it is]) is doing the rounds this week; it infers that there must be dark matter in the inner region (within the solar circle) of our own galaxy. It's behind a paywall, but there is a press release here. The main plot is below.

The authors have compiled an exhaustive list of rotation curve measurements (red), and gathered a set of models for the baryonic contribution from which they form their "baryonic bracket" (grey). The lower panel shows that all baryon-only models are already ruled out at 5σ by the time we reach our own galactocentric radius. Hence there must be a dark matter component in the inner Milky Way. They do not appeal to any particular dark matter density profile, so in that sense it is a model-independent result (though of course it is not independent of the baryonic models!). Nevertheless you can see that a typical Navarro-Frenk-White profile models the residuals extremely well. This is nice to know.

It is interesting that you can already infer dark matter just with measurements of the rotation curve within the solar circle, and I suppose the result may also be useful to constrain dark matter distributions of interest to direct and indirect detection. I do not think it is "the first observational proof of the presence of dark matter in the innermost part of the Milky Way" as the press release claims (the paper does not claim this). It is obvious that you need some dark matter component in the Milky Way to explain the rotation curve measurements at large galactic radii, and nobody thinks that all that dark matter is accumulated beyond 8 kpc! Those measurements are already observational evidence for dark matter in the inner Milky Way. Regardless, Hooper et al. already saw dark matter at the galactic centre...

IceCube have released their results on the flavour ratio of astrophysical neutrinos above 35 TeV. They have released a very nice plot with the fit to the observed flavour ratio at Earth:

Averaging neutrino oscillations over astronomical distances would give a value in the blue triangle for any flavour ratio at the source. The blue circle, $\approx(1:1:1)_{\rm Earth}$, marks the expected value for pion decays as the dominant source. Any measurement inconsistent with the (very thin) blue triangle would be a signal of new neutrino physics, such as neutrino decay, sterile neutrinos, or CPT violation (see for example here and here). Tommaso discusses it a little more here. [Edit: there is a short article at the IceCube web site also.]
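To see the averaging statement concretely: over astronomical baselines the oscillatory terms wash out, leaving $P_{\alpha\beta}=\sum_i |U_{\alpha i}|^2|U_{\beta i}|^2$. A sketch with representative PMNS angles (the angles are assumed values and the CP phase is set to zero for simplicity):

```python
from math import sin, cos, radians

# Representative mixing angles (assumed values; delta_CP = 0)
t12, t23, t13 = radians(33.5), radians(45.0), radians(8.5)
s12, c12 = sin(t12), cos(t12)
s23, c23 = sin(t23), cos(t23)
s13, c13 = sin(t13), cos(t13)

# PMNS matrix in the standard parameterisation, all real (delta = 0)
U = [[c12*c13,                   s12*c13,                  s13],
     [-s12*c23 - c12*s23*s13,    c12*c23 - s12*s23*s13,    s23*c13],
     [s12*s23 - c12*c23*s13,    -c12*s23 - s12*c23*s13,    c23*c13]]

def earth_ratio(source):
    """Flavour ratio at Earth after oscillations averaged over
    astronomical baselines: P_ab = sum_i |U_ai|^2 |U_bi|^2."""
    f = [x / sum(source) for x in source]
    return [sum(f[a] * U[a][i]**2 * U[b][i]**2
                for a in range(3) for i in range(3))
            for b in range(3)]

# A pion-decay source (1:2:0) arrives close to (1:1:1):
print(earth_ratio([1, 2, 0]))
```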

If you'd like to read some more on the "Firewall Phenomenology with Astrophysical Neutrinos" paper that appeared on hep-ph last week, Bee at Backreaction has written a nice summary. The paper shows that IceCube's PeV neutrinos could be explained by a suitable emission spectrum from black hole firewalls.

Physicist Val Fitch, whose discovery (along with James Cronin) of CP violation in a 1964 experiment won him the 1980 Nobel Prize, has died at the age of 91. Read about his life and contributions here.

Steven Weinberg has written a history book, released a few weeks ago. According to the About, "To Explain the World is a sweeping, ambitious account of how difficult it was to discover the goals and methods of modern science, and the impact of this discovery on human knowledge and development." For some reason it even garnered praise from Ian McEwan.

IPMU has recently published a 26 page transcript [pdf] of a conversation with Edward Witten after he was awarded a Kyoto Prize last year for "Outstanding Contributions to the Development of Mathematical Sciences through the Exploration of Superstring Theory". Peter Woit highlights some excerpts and talks about it on his blog.

A moderately sized meteor (~0.5m) created a fireball (bolide) above New Zealand on Wednesday and was captured on dashcam. You can read about bolides here, with graphs! If you want to see what a bigger one can do, try the Chelyabinsk event from two years ago, perhaps the biggest meteor to enter the atmosphere since Tunguska (1908).

This blog post has a gif showing the history of planet detection in 1 minute, and the exoplanet gold rush starting at the year ~2000, with some discussion. Here's where we were by the end of last year:

SpaceX launched the Deep Space Climate Observatory (DSCOVR) on Wednesday. You can watch the launch and read a little about it at space.com. DSCOVR will sit at the L1 Lagrange point between the Earth and the Sun, observe the climate, and serve as an early alert "buoy" for geomagnetic storms. The satellite is the resurrection of a previous project championed by Al Gore; he wrote, "DSCOVR has embarked on its mission to further our understanding of Earth and enable citizens and scientists alike to better understand the reality of the climate crisis and envision its solutions. DSCOVR will also give us a wonderful opportunity to see the beauty and fragility of our planet and, in doing so, remind us of the duty to protect our only home."

Friday, 6 February 2015

Wherein I list some (mostly) recent happenings, ramble a bit, and provide links, in an order roughly determined by importance and relevance to particle physics.

Obvious big news item of the week is the release of the Joint Analysis of BICEP2/Keck Array and Planck Data (also on arXiv now), which derives an upper limit on the tensor-to-scalar ratio of r<0.12 at 95% C.L., perfectly consistent with r=0. So, no evidence for primordial gravitational waves (yet). This is somewhat different from the original BICEP2 result, $r=0.20^{+0.07}_{-0.05}$ with r=0 disfavoured at 7.0σ (lest we forget the YouTube reveal). That paper (from March last year) can be found on the arXiv. It is less than a year old and already has almost 1000 citations! (Nature has a rundown on that.) The important caveat can be found at the end of the abstract, emphasised after peer review and acceptance by Physical Review Letters: "Accounting for the contribution of foreground dust will shift this value downward by an amount which will be better constrained with upcoming data sets." Well, now we know that amount...

Of course, this basic conclusion has been known for a while. Rumours began to circulate by May that the effect of polarised emission from the galactic dust foreground was problematic, and that it had been estimated (and misinterpreted) from a preliminary figure in a slide shown at a conference. That month a couple of papers appeared on the arXiv arguing the point. Nevertheless the BICEP2 paper was accepted in June with the added caveat I mentioned above. In September, Planck released their study of the polarised dust emission, showing that the effect of dust was likely of the same order of magnitude as the signal measured by BICEP2. There are some very good blog entries on this; see Sean Carroll, Katie Mack, In The Dark, Blank On The Map, Excursionset, Resonaances, etc. The plot below was enough to convince almost everyone that dust could account for all of the signal; it shows the "amount of B-mode polarisation" versus multipole moment, with the expected dust component in blue and the best-fit theoretical prediction from gravitational waves claimed by BICEP2 in black.

The book was almost shut, but Planck reminded us that this was an extrapolation from a high frequency region to the lower frequency which BICEP2 observed. We were told to be patient physicists until the joint analysis was complete. The release was pushed back and back, but now here we are... primordial gravitational waves at r=0.2 are dead. So it goes.

The upshot is that we do have a measurement of gravitational lensing B-modes at 7.0σ! But wait, that number is familiar... Also, the following image was released, which is just stunning and certainly worth a stare. The colour scale shows dust emission, and the texture shows the orientation of the Galactic magnetic field. The outlined region is the BICEP2 patch.

So what's next? Resonaances has a blog post about that. Any non-zero measurement of r in the future will still be big news, and there are many experiments which will soon be sensitive to $r\sim 0.01$. Certainly we could still see a primordial gravitational wave signal in the coming few years. Once again, we must sit and be patient physicists...

Today was the 2015 release of Planck full mission data products and scientific papers. The press release is here. The result they are spinning is that Planck measures the beginning of reionisation at 560 million years after the big bang, significantly later than the WMAP measurement of 420 million years. This is more consistent with observations from Hubble of the earliest galaxies (300-400 million years); now there's enough time for these structures alone to inject the energy needed to end the dark ages. I'm sure we will hear more about all their results in the coming week(s).

They also released the full map in hi-res of the polarised emission from Milky Way dust, reminiscent of Van Gogh:

I missed this last week, but ATLAS has released evidence for the Higgs-boson Yukawa coupling to tau leptons (there were actually quite a few releases, part of the wrapping up of the remaining Run-I analyses). They measure a signal strength $\mu=1.43^{+0.43}_{-0.37}$, an excess of events over the expected background from other Standard Model processes with an observed (expected) significance of 4.5 (3.4) standard deviations. So they got somewhat lucky. Here's the plot:

CoEPP has been involved in some of this analysis and I have seen a few talks on it in the past. I am always amazed, when I see the histograms before the BDT (and even after the BDT in each channel) that they are able to dig out this signal at all. Just look at this histogram of an important BDT input variable from one of the better channels, τlep+τhad:

Doesn't look too bad, until you see that the signal histogram is presented 50x larger than it really is, just so you can see it. Obviously the experimentalists have plenty of tricks up their sleeves which are especially powerful when you know exactly what you're looking for. It's a remarkable analysis. And to be honest, if we required a local p-value of 5σ to "discover" the Higgs, when we didn't know its mass, then h→ττ with a significance of 4.5σ, when we know exactly where it should be... that's discovery in my book.

DZERO submitted the more detailed documentation on their top mass measurement originally published as a letter back in May. Their measurement is 174.98±0.76 GeV. They say in the abstract, "This constitutes the most precise single measurement of the top-quark mass," but that is no longer true. As far as I am aware that honour goes to CMS, with a measurement of 172.38±0.10(stat)±0.65(syst) GeV. (Evidently the LHC has a lot of top statistics!) Read more about that at Tommaso's blog post from four months ago. The Tevatron continues to pull the world average measurement up.

It is cool to see these razor variables in action! They're a very nice method for isolating new-physics signals with pair-produced particles, each decaying to something visible plus something invisible. Other variables (which may be more familiar) are MT2, MCT and MCT⊥ (ATLAS used MT2 for their top squark search). The difficulty for such searches is that, in a given event, we cannot know the momentum vectors of the invisible particles individually; we can only reconstruct the total missing transverse momentum vector. So the general idea has been to construct kinematic variables which approximate the mass scale of the underlying event and which have a kinematic end-point (a maximum possible value) determined by the masses of the particles involved. The end-point is exact at parton truth-level but ends up being smeared by showering and detector effects. Above some cut, usually MT2cut ≈ mW or mt, very few SM events are expected, while signal events will accumulate beyond the cut since new physics is expected to involve larger masses. Thus these variables are a very nice way to eliminate SM background in these kinds of searches, applicable to e.g. R-parity-conserving SUSY models with neutralino dark matter candidates, and to leptoquarks.
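To make the MT2 idea concrete, here is a brute-force numerical version: minimise over all ways of splitting the missing transverse momentum between the two invisible particles (massless visibles and invisibles assumed; the event numbers are made up):

```python
from math import hypot, sqrt

def mt(v, q):
    """Transverse mass of a massless visible particle v = (px, py)
    paired with a massless invisible candidate q = (px, py)."""
    m2 = 2.0 * (hypot(*v) * hypot(*q) - v[0]*q[0] - v[1]*q[1])
    return sqrt(max(m2, 0.0))

def mt2(v1, v2, met, n=120, scale=120.0):
    """Approximate MT2 by a grid scan over splittings q1 + q2 = met,
    minimising max(MT(v1, q1), MT(v2, q2))."""
    best = float("inf")
    for i in range(-n, n + 1):
        for j in range(-n, n + 1):
            q1 = (scale * i / n, scale * j / n)
            q2 = (met[0] - q1[0], met[1] - q1[1])
            best = min(best, max(mt(v1, q1), mt(v2, q2)))
    return best

# A made-up dileptonic event: two leptons plus missing momentum
print(mt2((55.0, 10.0), (-40.0, 25.0), (-15.0, -35.0)))
```

For a real W-pair or top sample this distribution falls off sharply above ~mW, which is exactly the end-point behaviour described above; dedicated implementations (e.g. the Cheng-Han bisection algorithm) are far faster than this naive scan.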

But they are not perfect. A problem with MCT, for example, is that its kinematic end-point actually depends on the centre-of-mass frame of the pair-produced particle system (this was eliminated with MCT⊥). Also, a lot of events end up piled near zero for both MCT and MT2. The razor approach avoids this problem by boosting from the lab frame to an approximation of the particle-pair centre-of-mass frame with a "best guess" boost, and constructing a kinematic variable there. In general it performs at least as well as MCT⊥ and MT2 (see below).

And they now have a successor: super-razor variables. Super-razor variables are designed to increase sensitivity in searches for specific decay topologies. To construct these variables one iteratively boosts from reference frame to reference frame with "best guesses", and at each stage extracts some information about the masses (and mass splittings) involved. For example, in slepton pair production where each slepton decays to a lepton and a neutralino, there are three interesting frames to boost to. One advantage of this approach is that, along the way, you reconstruct extra information, such as decay angles, which can then be used to help discriminate the signal. Click and look at the figure below to see its power...

I am not aware of any ATLAS/CMS search which has employed the super-razor variables yet, but I look forward to seeing it in Run-II.

My supervisors and I have a letter paper out on the arXiv today. It is a short analysis of naturalness in the three-flavour Type I see-saw model, with the following take-home message: standard hierarchical thermal leptogenesis is unnatural, and there's no way out in the minimal model. I will write a short post about it next week some time.

According to a new arXiv preprint, there is no indirect dark matter signal from the Large Magellanic Cloud. The following figure tells the tale:

The investigators expected to begin probing the regions (see the Brazil-band lines) of parameter space interesting for the galactic centre excess (the four marked areas), but they aren't quite there yet. Matthew Buckley (one of the investigators) has some tweets about it (reading upwards, beginning Feb 5).

[Edit 16/02: I read this paper in a little more detail last week. It is the first indirect dark matter search in the LMC, certainly worth doing as it is potentially the second brightest (after the galactic centre) annihilation source in our sky. However, unlike dwarf galaxies, there is a lot of baryonic matter to contend with; the authors use a data-driven method to model this. They end up seeing a broad excess which is consistent within systematic limitations of the background model (they stress this point, so it is not to be taken as further evidence for the Hooperon!). It should be noted that the above plot (their Fig. 22) is conservative in terms of the statistical analysis and choice of LMC centre, but optimistic in the halo profile: assuming an NFW or isothermal (cored) profile weakens these limits by an order of magnitude (see their Figs. 17 and 18). The analysis appears to be difficult yet worth attempting; unfortunately it cannot compete with the limits from dwarf galaxies, especially after Fermi Pass 8.]

Could the missing satellite problem be solved with just dark energy? This arXiv paper suggests the possibility, and New Scientist ran a story. This is surprising, since I would have thought this is already taken into account in simulations? What am I missing?

James D. "BJ" Bjorken was one of the two winners of the Wolf Prize in Physics this year. It is known as somewhat of a predictor for the Nobel (Brout, Englert and Higgs were recipients in 2004). You can read about it at the official site, and also at Tommaso's blog. The former writes, "in retrospective, Bjorken's scaling not only led to the discovery of quarks, but also pointed the direction toward the mathematical framework governing all fundamental interactions."

On Wednesday, Roman prosecutors closed the case on the disappearance of Ettore Majorana. Majorana, who disappeared at sea in 1938 at 32 years of age, is now believed to have been alive and well, living in Valencia, Venezuela, between 1955 and 1959.

I only just read that as of January 1, Physical Review journals and Physical Review Letters will allow article titles in the reference list. Huge.

Jester noted that the Symposium on Lepton Photon Interactions resembles a vomiting dragon. I'll let you make up your own mind...

The Crayfis app for detecting ultra-high energy cosmic rays using a network of smartphones is already in beta testing. Read more about it from Kyle Cranmer in a blog post, or see the original paper from October last year.

The exciting news: there is a request for $30 million to begin planning a mission to Europa! Bad Astronomy reckons the request has a decent shot, too, since it has a champion in Congress. Looks like this has been in the works for a while; JPL released somewhat of a promotional video for such a mission back in November.

NASA has successfully launched the SMAP (Soil Moisture Active Passive) satellite observatory, which will gather three years of data on global soil moisture levels via a very cool-looking 6m rotating reflector. You can read about it at space.com or at NASA, and watch the launch here. It is the last of five Earth-observing space missions to be launched in the past year by NASA (including: Orbiting Carbon Observatory-2, Global Precipitation Measurement Core Observatory, ISS-RapidScat, and Cloud-Aerosol Transport mission).

The US (Republican-led) Senate on 21 January passed an amendment to a bill, 98-1, which stated: "It is the sense of the Senate that climate change is real and not a hoax." However they rejected, 50-49 with a requirement for 60, the stronger amendment: "It is the sense of Congress that 1) climate change is real, and 2) human activity significantly contributes to climate change." Senator Inhofe, who once claimed that global warming was "the greatest hoax ever perpetrated on the American people", claimed: "The hoax is that there are some people who are so arrogant to think that they are so powerful they can change climate. Man can't change climate." I'll leave that alone...

"Computational Linguistics Reveals How Wikipedia Articles Are Biased Against Women", as an article here or on the arXiv.

Last but certainly not least, here is some old news that I only recently discovered... an arXiv paper on predicting the length of winter in the world of Westeros. From the abstract: Thus, by speculating that the planet under scrutiny is orbiting a pair of stars, we utilize the power of numerical three-body dynamics to predict that, unfortunately, it is not possible to predict either the length, or the severity of any coming winter. We conclude that, alas, the Maesters were right -- one can only throw their hands in the air in frustration and, defeated by non-analytic solutions, mumble "Coming winter? May be long and nasty (~850 days, T<268K) or may be short and sweet (~600 days, T~273K). Who knows..."

About Me

Jackson Clarke, PhD candidate in phenomenological particle physics at CoEPP, University of Melbourne. Collider phenomenology, neutrino masses, and some naturalness. Science enthusiast, among many other things. Blogging accordingly.

Views are my own. Content very definitely skewed by my own leanings and by papers getting attention. So it goes.