
eldavojohn writes "Both the EU and US are using a strategy to merge what used to be two separate searches: the search for exoplanets that may harbor life and the search for dark energy. In an effort to develop 'robust, low-risk missions that maximize the scientific return,' the article analyzes how, without any changes, a space-based dark energy telescope could also check for microlensing events indicating an exoplanet."
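For those unfamiliar with the technique, the microlensing signal mentioned in the summary comes from a foreground star's gravity magnifying a background star; a planet around the lens adds a short spike to the light curve. The standard point-lens magnification formula is easy to sketch (the formula is textbook material; the numbers below are purely illustrative, not from the article):

```python
import math

def magnification(u):
    """Point-source, point-lens microlensing magnification as a
    function of impact parameter u, in units of the Einstein radius."""
    return (u**2 + 2) / (u * math.sqrt(u**2 + 4))

# As the background star drifts behind the lens, u(t) shrinks and the
# magnification rises; a planet near the lens briefly perturbs u(t),
# producing the short anomaly an exoplanet search looks for.
print(magnification(1.0))  # u = 1 gives A ~ 1.34, the classic detection threshold
print(magnification(0.1))  # smaller impact parameter, much brighter event
```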

From my layman's POV, it seems like we have telescopes all over the spectrum, from X-rays to long radio waves, constantly gathering enormous amounts of data which could easily be mined for dark energy detection, SETI, and just about anything else conceivable. So while I think it's very cool that two such different applications can share data and techniques, I'd like to know what the reasons are that this doesn't just happen all the time. Is it a reluctance to share data, differences in the type of data needed, or something else entirely?

I'd also like to point out that while exoplanets are obviously relevant to people seeking extraterrestrial life, they are scientifically interesting in other ways too. They can suggest things about the history or future of our solar system, and about star systems in general. Additionally, the more we learn about them, the easier it becomes to find additional ones, and the easier it is to create and test hypotheses about them.

From the title, you'd expect that these telescopes were listening for signals that

Even though exoplanets could provide useful knowledge, the search for dark matter is just as fruitless as trying to prove that aether exists (http://en.wikipedia.org/wiki/Luminiferous_aether#End_of_aether.3F). It's time we developed another theory, instead of conforming what we see to a theory we know to be flawed.

Well, I've got a couple of comments on that. Firstly, the article isn't talking about dark matter, which is an entirely separate issue from dark energy -- effectively, dark matter is gravitationally attractive and appears on scales ranging from galactic to cosmological, while dark energy is gravitationally repulsive and appears on cosmological scales. Secondly, the search for dark matter cannot (yet) be compared to the search for the aether, and any such comment, without being qualified and backed up with reasonable arguments, is uneducated. The evidence for "dark matter" (meaning an apparent weakness of gravity on galactic and supergalactic scales) is extremely strong -- the problem is there. The next question is simply whether something is wrong with the theory of gravity, which is certainly possible, or whether there is actually missing matter of some form out there.

Taking the second view first, which is more or less the current view, there are plenty of ways that we can get "dark matter". The simplest is matter that is merely difficult to see -- but there are problems with this and it is unlikely to account for a significant amount of the problem. Then you get more interesting models. The immediate particle candidate is the neutrino. Neutrinos exist, of this we can be sure, and they appear to have a small but non-vanishing mass -- of this we can be very confident. (This immediately tells us, by the way, that the standard model of particle physics in its simplest form is flawed, since it predicts neutrinos of vanishing mass.) Since neutrinos interact only very weakly with matter, they are an immediate dark matter candidate. The problem is that if you make all the dark matter in the universe the result of massive neutrinos, you wash out cosmic structure -- it just doesn't fit observations. So it can't be (entirely) neutrinos. The most common candidate at the minute would be the neutralino, a mixture of the supersymmetric partners of the neutral gauge and Higgs bosons. If you believe in supersymmetry (which personally I don't, quite, but plenty of people do and there are very good reasons to believe that there's something in it) then you believe in supersymmetric partners; and you also must believe in a *lightest* supersymmetric partner. This particle will be stable, since it can't easily decay into non-supersymmetric particles. In most models, this particle is the neutralino. The hope is that its mass may be such that we can detect it at the LHC. If we do, much of the dark matter problem will be immediately solved -- there would be neutralinos in enough numbers to fit the observations.

The first view, that of modifying gravity itself, is an old one -- and a current one too. It is entirely wrong of you to suggest that people aren't "developing" the theory, since they are and have been ever since Einstein proposed general relativity in the first place. The main ways of modifying relativity come from adding an extra (scalar) degree of freedom into the theory; this can either be done by literally adding in a scalar degree of freedom (which ultimately makes Newton's "constant" time-dependent) or by modifying the "action", the function that generates the equations of motion, such that the Einstein action, linearly dependent on the Ricci scalar, becomes instead an arbitrary function of the Ricci scalar.
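For concreteness, the modification described above can be written in one line. This is the standard textbook form of the so-called f(R) generalisation, not anything specific to the article:

```latex
% Einstein-Hilbert action, linear in the Ricci scalar R:
S_{\mathrm{EH}} = \frac{1}{16\pi G} \int d^4x \, \sqrt{-g} \, R
% f(R) generalisation: replace R by an arbitrary function of it:
S_{f(R)} = \frac{1}{16\pi G} \int d^4x \, \sqrt{-g} \, f(R)
```

Varying the second action gives fourth-order field equations in general, which is where the extra scalar degree of freedom comes from.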

You can struggle to get dark *matter* out of such theories, but if you go one step further and also add in additional vector degrees of freedom, then you have dark matter along with dark energy. (And a really ugly theory.) The other advantage is that you can tune the theory such that in the non-relativistic limit it matches the predictions of "MOND" (MOdified Newtonian Dynamics), which is a purely phenomenological "theory" that aims to predict galactic rotation curves without recourse to dark matter. Effectively, in MOND there is a minimum acceleration below which the nature of gravity changes. With this simple idea, you get startlingly good agreements with many observations (and truly rubbish ones with others, it must be said). The benefit of the
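To make the MOND idea concrete numerically -- this is a back-of-the-envelope sketch of the standard deep-MOND result, not anything from the article -- below the acceleration scale a0, circular velocities become independent of radius, v^4 = G*M*a0, which is exactly what produces flat rotation curves:

```python
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
A0 = 1.2e-10        # MOND acceleration scale, m s^-2 (commonly fitted value)
M_SUN = 1.989e30    # solar mass, kg

def mond_flat_velocity(mass_kg):
    """Asymptotic rotation velocity (m/s) in the deep-MOND regime,
    where v**4 = G * M * a0 independent of radius."""
    return (G * mass_kg * A0) ** 0.25

# A galaxy of ~1e11 solar masses:
v = mond_flat_velocity(1e11 * M_SUN)
print(v / 1000)  # roughly 200 km/s, a realistic galactic rotation speed
```

The striking part is that one fitted constant, a0, reproduces many observed rotation curves -- which is why even dark-matter proponents take the phenomenology seriously.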

The Michelson-Morley experiment was an attempt to find out more about the luminiferous aether - specifically, how we were moving relative to it. Are you arguing that it was fruitless? (For the history-challenged: it was one of the driving forces behind relativity. One of the important features of special relativity was that it explained Michelson-Morley in a way that was consistent with observed facts, if not with the way scientists thought about the world.)

From my layman's POV, it seems like we have telescopes all over the spectrum, from X-rays to long radio waves, constantly gathering enormous amounts of data which could easily be mined for dark energy detection, SETI, and just about anything else conceivable. So while I think it's very cool that two such different applications can share data and techniques, I'd like to know what the reasons are that this doesn't just happen all the time. Is it a reluctance to share data, differences in the type of data needed, or something else entirely?

I have a huge number of programs on my hard drive for everything from web browsing and word processing to Java IDE's and hard disk defragmenting. Why can't I get by with just one program which can do everything?

I suspect you overestimate the amount of data being collected and underestimate how unsuitable some data may be for any other purpose.

Ignoring the obvious -- that space is huge -- you can hardly expect optical telescope sky scans used to detect, say, Kuiper belt objects in visible light to be suitable for detecting dark matter. Quasar signals won't be useful for detecting the slight wobble a planet induces in a star's motion.

Everywhere you look on this planet there are cameras and cell phones and radios, seismograph

I'd like to know what the reasons are that this doesn't just happen all the time. Is it a reluctance to share data, differences in the type of data needed, or something else entirely?

Actually, it's primarily lack of funding to build archives and, therefore, lack of access to the data. I think most astronomers have no problem sharing data, as long as they're properly credited and the data is used for something other than the original purpose (e.g. detecting exoplanets instead of dark energy). There are, of course, differences in the optimum observing strategy, and obviously you can't figure everything out from the same instrument/observation, but additional observations are always good and there are usually ways to use the data.

What most people probably don't know is that the majority of data from ground telescopes (except for a few roboticized telescopes) is kept only by the observer. Observations often yield something "weird", in which case the "standard" procedure is to ask colleagues if they know what the "weird" thing is. If no one has any clue, the data is often put aside for later analysis, and typically forgotten about. Everyone is guilty of it, but it is entirely possible that what is one person's trash is another's gold mine.

Amusingly enough, at the last AAS (American Astronomical Society) meeting, there was a grad student discussion session with some of the higher-ups in AAS. One of the things we grad students were very much in favor of was an observation archive with exclusivity for the PI for 12-18 months (this is the standard for NASA space missions). The reason we were given that this would never happen is funding.

Ah, that makes sense, in a sad way. In the bioinformatics world, we're used to the existence of large open archives, but of course this is because the funding agencies have specifically made grants for the purpose. Perhaps NASA or the NSF could be persuaded to think about such a thing, at some point.

Well, I suspect this is a rather North American perspective. All data taken with the largest optical/IR observatory in the world, the European Southern Observatory's Very Large Telescope in Chile, are archived and are freely available to researchers after a (typically) 12 month proprietary period. This is true for observations made on site by the astronomers who proposed them (so-called "visitor mode") and for observations made by ESO staff on behalf of the successful proposer, saving them the need to travel.

I think you answer your own question: the enormous amounts of data involved -- that, plus the fact that different projects use different and probably incompatible ways of storing the data. And that is just the technical side of the problem; someone has to translate the data into actual knowledge first, so we know what to combine -- and we still don't have computers that can do that.

This has always been the problem in scientific research; an enormous amount of knowledge is being generated, and nobody can keep up

One issue is how deeply you have to look into space to get a large number of supernovae in your field of vision. Generally, the integration time to image a supernova is extraordinarily large. The radio telescopes famous for SETI experiments are non-directional, and therefore cannot find supernovae. You have to look at a specific region of space and look HARD to find them. That is why various space missions are proposed to find supernovae in a sky survey, then look at them hard once they're found.

It's a pity that they cannot smuggle SETI into the pack. Anyway, if I were a SETI researcher, I'd save some of my radiotelescope time to look into those exoplanets deemed as suitable for life by research such as this.

If there is a planet with intelligent life that has its own SETI, they would not know Earth had intelligent life unless they were within about 120 light-years, because beyond that none of our radio signals would have reached them yet. SETI is incredibly limited.
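The 120 light-year figure is just light-travel arithmetic: signals expand outward at one light-year per year, so the detectable bubble's radius is the number of years we've been broadcasting. (The start year here is a rough assumption -- broadcast-strength radio dates from around the 1890s -- and the comment was presumably written around 2010.)

```python
def detection_radius_ly(first_broadcast_year, current_year):
    """Maximum distance, in light-years, at which Earth's earliest
    radio leakage could have arrived by current_year."""
    return current_year - first_broadcast_year

# Leakage starting ~1890, evaluated circa 2010:
print(detection_radius_ly(1890, 2010))  # -> 120
```

Note a reply would take just as long to come back, so two-way contact is limited to half that radius.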

It's a pity that they cannot smuggle SETI into the pack. Anyway, if I was a SETI researcher I'd save some of my radiotelescope time to look into the region of space occupied by exoplanets deemed suitable for life by research such as this.

It's a pity that they cannot smuggle SETI into the pack. Anyway, if I was a SETI researcher I'd save some of my radiotelescope time to look into the region of space occupied by exoplanets deemed suitable for life by borg such as this.

I thought you were making some witty reference to the search for dark matter and the SETI experiments together. That would have been deserving of your current "funny" mod on both posts, mistake or not.

The fact that you posted the same thing twice, and that these two sets of astronomers are doing the same thing, is an unintended witticism. That is probably why you're modded funny rather than redundant.

Interestingly (to me anyway), I am currently reading Asimov's "Of Time and Space and Other Things" from 1968, where he posits that if the night sky were not dark, life would probably not have evolved. To be honest it wasn't his idea (and he doesn't claim it is). In 1826 a German scientist called Heinrich Wilhelm Matthias Olbers (b. 1758), who also discovered the asteroids Pallas and Vesta, started an investigation which became known as Olbers' Paradox.

To briefly summarise the idea: if you take the estimated number of stars in the galaxy and add up all the light they are emitting, there should be no dark night on Earth, as the cumulative effect of hundreds of millions of stars would ensure a blinding sky, not to mention an amazing amount of heat. When you add the light from other galaxies, the situation becomes even more untenable for life. But trying to solve this paradox led Olbers, and later Hubble, to discover that stars and galaxies are not uniformly spread throughout the universe, and then to discover further that the universe is expanding, so that due to redshift we never receive the energy from the objects most distant from us. Hence the dark skies.
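The shell argument behind the paradox is easy to check numerically: in a static, uniform, infinite universe, each concentric shell of stars contributes the same total flux (the 1/r^2 dimming exactly cancels the r^2 growth in the number of stars per shell), so the sky brightness diverges. A toy sketch in arbitrary units -- my own illustration, not from the comment:

```python
import math

N_DENSITY = 1.0  # star number density (arbitrary units)
L_STAR = 1.0     # luminosity per star (arbitrary units)
DR = 1.0         # shell thickness

def total_flux(max_radius):
    """Sum the flux from concentric shells of stars out to max_radius."""
    flux = 0.0
    r = DR
    while r <= max_radius:
        stars_in_shell = N_DENSITY * 4 * math.pi * r**2 * DR
        # Each star's flux falls as 1/(4*pi*r^2), so the r^2 factors
        # cancel and every shell contributes N_DENSITY * L_STAR * DR.
        flux += stars_in_shell * L_STAR / (4 * math.pi * r**2)
        r += DR
    return flux

print(total_flux(100))   # 100.0
print(total_flux(1000))  # 1000.0 -- grows linearly with depth, never converges
```

A finite stellar lifetime, a finite universe age, or cosmological redshift each cuts the integral off, which is the modern resolution sketched above.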

I did say it was a brief summary! An interesting read, in a Connections type of way. It also shows how long these topics were studied before the truth was known. Olbers didn't know what he was looking for; he just thought the situation needed some thought. When he started, the existence of other galaxies wasn't known, let alone the size of the universe. Most of his work was theoretical, as the technology to see the problem first-hand didn't exist. Logic abounds in science!

I guess the general population of the world doesn't really take much time to consider the Swiss and Norwegians. (Or the Belarusians. And should Iceland really count as part of Europe? How far from the continent do you have to be?)

What kind of question is that? Don't you know that America came that close to getting a vice-president who wouldn't be able to find the US on a map of North America? I think you need to adjust your expectations a bit.

The National Geographic-Roper Public Affairs 2006 Geographic Literacy Study paints a dismal picture of the geographic knowledge of the most recent graduates of the U.S. education system

Thirty-three percent of respondents couldn't pinpoint Louisiana on a map. Fewer than three in 10 think it important to know the locations of countries in the news, and just 14 percent believe speaking another language is a necessary skill. Two-thirds didn't know that the earthquake that killed 70,000 people in October 2005 occur

As already pointed out, the European project (Euclid) in question is being studied by the European Space Agency, not the EU: these are simply not synonymous.

Euclid is one of six missions currently under study for two so-called M-class mission slots within the first round of ESA's Cosmic Vision programme. The first two-year long study phase is over now and the results will be made public at a meeting in Paris on December 1, prior to ESA's scientific advisory working groups and committees coming up with a