Prompted by advances in computer-controlled fabrication and testing, NASA engineers are now using freeform optics to explore cost-effective alternatives — such as CubeSats and other small satellites — to more traditional space telescope missions.

"If you want to put these telescopes into a smaller box, you need to let the mirrors bend like a potato chip," said Joseph Howard, an optical engineer at NASA's Goddard Space Flight Center.

Traditional two-mirror telescopes consist of a primary light-gathering mirror and a smaller secondary mirror, which relays the incoming light and directs it onto a detector. The rotationally symmetric (i.e., round) mirrors need to be aligned along the axis of the system to reduce optical aberrations that produce blurry images.

Asymmetric mirrors produced using freeform optics can better correct for these aberrations to provide a larger usable field of view, as well as dramatically reduce the light path, or package size.

As part of their research effort, Howard and Goddard engineer Garrett West evaluated the optical system of a coastal measurement instrument originally equipped with nine symmetrical mirrors. By replacing the mirrors with freeform optics, they reduced the size of the mirrors and decreased their number to six, shrinking the telescope's overall packaging more than tenfold.

Next year, the team plans to continue testing a two-mirror instrument, which includes a 3D-printed freeform mirror. With this additive manufacturing technique, a computer-controlled laser melts material in precise locations as indicated by a computer-aided design. Because the mirror is built up layer by layer, it can be given virtually any shape.

The technology could prove to be game-changing for a number of future missions, including instruments for imaging exoplanets. Howard and West have established the Freeform Optics Research Group Endeavor to oversee freeform optics research carried out by private industry under NASA's Small Business Innovation Research program and by Goddard scientists and engineers.

Understanding Our Universe, The Future of Photonics Is Written in the Stars

We have long looked to the stars and wondered what was out there, but only in the past 50 years have we traveled to the moon. Now, the continuing development of light-based technologies — from on-chip spectrometers to laser communications systems — will allow us to explore ever-farther reaches of space.

Since NASA was established nearly 60 years ago, the average person’s understanding of the universe has grown, thanks to the efforts of untold numbers of scientists, researchers, engineers and others around the world. From America's first image of the moon — taken by NASA’s Ranger 7 spacecraft in 1964 — to the remarkably clear pictures of Pluto’s cratered, mountainous and glacial terrains — acquired by the New Horizons spacecraft within the last year — optics and photonics technologies are bringing us ever closer to the planets in our own solar system and all that lies beyond.

Global stereo mapping of Pluto’s surface is now possible, as images taken from multiple directions are downlinked from NASA’s New Horizons spacecraft.

Measuring stellar atmospheres

Today, light-based technologies are helping to improve our understanding of the universe. One such technology — optical spectrometry — is playing a key role, according to Francesco Marsili, a microdevices engineer at NASA’s Jet Propulsion Laboratory. It allows engineers to measure the spectral content of a beam of light — specifically, how many photons of a certain color it contains.

“Applying this technique to astronomy, we’re able to gain a formidable amount of information about the universe, for instance by measuring what elements stars and planetary atmospheres contain and at which temperature and pressure they are,” he said. “Astronomy and photonics are now merging in the field of astrophotonics, which aims at using photonics to enhance astronomical instruments.”

This could mean that bulky free-space-coupled spectrometers may be replaced by miniaturized fiber-coupled, on-chip spectrometers.

Dark, narrow streaks on Martian slopes such as these at Hale Crater are inferred to be formed by seasonal flow of water on present-day Mars. Photo courtesy of NASA.

Hamamatsu Photonics KK in Japan has also been working with optical detection and sensing relating to aerospace. Koei Yamamoto, director of the company’s Solid State Division, said they are investigating low light-level detection in wavelengths that extend into the infrared. They have already developed — in conjunction with the National Astronomical Observatory of Japan (NAOJ), Osaka University and Kyoto University — CCD image sensors for use in the Hyper Suprime-Cam, which is an ultrawide field of view prime focus camera installed in the Subaru Telescope. Subaru is an 8.2-meter optical infrared telescope positioned at Mauna Kea, Hawaii, and operated by NAOJ, an association under Japan’s National Institutes of Natural Sciences.

Hamamatsu also developed and manufactured an optical sensor for the JAXA Hayabusa (formerly MUSES-C) mission, which was used to observe the surface condition of the asteroid Itokawa using light. According to Yamamoto, work such as this will drive the future of space exploration.

“New photonics technology is continuously required for research in this infinite space,” he said.

Live streaming from Mars

The dream of exploring Mars continues to captivate astronomers and engineers. NASA rovers are already on the Red Planet examining its surface and producing amazing images. However, further study will require updated communications systems incorporating optics technologies.

“The achievement that I am most excited to see in my lifetime is human exploration of Mars,” Marsili said. “Human exploration will rely on optical communications, which can support tens of times higher data rates than radio communications. With optical communications we could live stream from Mars.”

“Think of the pictures we are getting from the Mars rover,” he said. “Why are we not getting videos? The data rate that the [existing] DSN [Deep Space Network] can support is too low.”

NASA took an important step toward such innovation in 2013 when it demonstrated optical communication beyond Earth’s orbit with the Lunar Laser Communication Demonstration (LLCD). In it, the Lunar Atmosphere and Dust Environment Explorer (LADEE) spacecraft orbiting the Moon was able to downlink data to two receiver terminals using a laser beam.

Today, NASA communicates with its spacecraft primarily via radio waves, and the DSN relies on large antenna arrays around the globe. But engineers are working to switch to optical frequencies, which Marsili said will increase the data rate of the communication links; he compared the impending improvement to the difference between dial-up and high-speed Internet. A future optical DSN will rely on technology such as lasers mounted on spacecraft and pointed at optical telescopes on Earth to downlink data.

Ball Aerospace & Technologies Corp. also cited laser communications as an important tool for understanding the universe. The Colorado-based company has developed and manufactured light-based components for many NASA initiatives, including Landsat 8, the Hubble Space Telescope, the New Horizons mission, the Mars Reconnaissance Orbiter, the Kepler/K2 mission and the James Webb Space Telescope. Their work with space exploration and Earth imaging has been recognized twice by the Colorado Photonics Industry Association.

“With the huge quantities of data from missions that are going farther and farther away from the Earth, we need higher bandwidth communications systems,” said Chip Barnes, chief engineer for the Ball Aerospace Civil Space business unit.

Advancing such communications systems will require development of optics and photonics technologies both on spacecraft and in ground terminals, according to Marsili.

But improving photonic technologies to better understand our universe doesn’t end there. Marsili says the future of space exploration will require not only advanced communications systems, but also more efficient lasers, vibration isolation systems, large-dish telescopes and single-photon detectors. He currently is developing ground receiver single-photon detectors that he says “are the most sensitive light detectors to date.”

Ball Aerospace has designed and manufactured many optical components for NASA, including the 18 beryllium primary mirror segments, secondary and tertiary mirrors, a fine steering mirror and several engineering development units for the James Webb Space Telescope, according to Barnes.

The Ralph camera is prepared for vibration testing to measure the instrument’s response to the launch of NASA’s New Horizons spacecraft on the mission to explore Pluto. White wires attached to Ralph’s detectors lead to accelerometers that will measure vibrations within the instrument. Photo courtesy of Ball Aerospace.

Technologies for exploration

Ball Aerospace has had a hand in NASA’s New Horizons mission that is exploring Pluto, by designing and building the Ralph camera installed on the New Horizons spacecraft. Ralph features seven charge-coupled devices and an infrared array detector, all of which provide color and black-and-white maps of the planet’s surface with a resolution of 250 meters (800 feet) per pixel. It is coupled with Alice, an ultraviolet imaging spectrometer created by the Southwest Research Institute in Texas — a device designed to image ultraviolet emissions and provide spectral images in the extreme- and far-ultraviolet passbands. The Ralph/Alice imaging system can also map the presence of nitrogen, methane, carbon monoxide, water and other materials across the surface of Pluto.

Together, these “honeymooners” are bringing us far beyond the moon, producing the closest, clearest images ever obtained of the dwarf planet.

Ball Aerospace also built the HiRISE camera on the Mars Reconnaissance Orbiter, which operates in visible wavelengths with a telescopic lens, producing images at resolutions “never before seen in planetary exploration missions,” according to NASA. HiRISE works at near-infrared wavelengths, as well, studying mineral groups that exist on the Mars surface. From as high as 400 km (about 250 miles), the camera acquires high-resolution images of layered materials, gullies and channels, and is identifying potential future landing sites.

The HiRISE camera, built by Ball Aerospace, is installed on the Mars Reconnaissance Orbiter and is currently providing images of both predicted and unexpected features on the Mars surface. Photo courtesy of Ball Aerospace.

“There is nothing more inspiring than the photographs,” Barnes said. “[And we can see things like] black holes by observing the impacts of photons around them.”

Space exploration and our understanding of the universe will continue to grow, thanks to the development of light-based technologies, Marsili said. An increasing number of discoveries in space have been made possible by optical techniques, including discoveries of exoplanets (planets orbiting other stars in our galaxy) and their properties. These can be detected and understood with optics technologies — from infrared, UV and visible imaging systems to spectrometers, telescopes and single-photon detectors — as they measure the periodic dimming in the starlight that happens when an exoplanet passes in front of a star.
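The transit method described above comes down to simple geometry: the fractional dimming equals the square of the planet-to-star radius ratio. A quick sketch (the radii are rounded published values; the function name is mine):

```python
# Transit photometry: the fraction of starlight blocked when a planet
# crosses the face of its star is (planet radius / star radius) ** 2.

def transit_depth(planet_radius_km: float, star_radius_km: float) -> float:
    """Fractional dimming during a central transit."""
    return (planet_radius_km / star_radius_km) ** 2

SUN_RADIUS_KM = 696_000  # approximate solar radius

earth = transit_depth(6_371, SUN_RADIUS_KM)     # Earth-sized planet
jupiter = transit_depth(69_911, SUN_RADIUS_KM)  # Jupiter-sized planet

print(f"Earth-like:   {earth * 100:.4f}% dimming")
print(f"Jupiter-like: {jupiter * 100:.2f}% dimming")
```

An Earth-sized planet blocks only about 0.008% of a Sun-like star's light, while a Jupiter-sized one blocks about 1% — which is why the earliest transit detections were all giant planets and why exoplanet photometry demands extremely stable detectors.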

Marsili may get his wish to see human exploration of Mars. Courtesy of NASA’s Orion spacecraft and its advanced optical systems, Mars is expected to get visitors in the coming decades. The agency plans to fly its first crew aboard the Orion in 2021, an early step toward that goal.

Photonics technologies also extend beyond studying planets. Advancements could ultimately determine if there is life beyond Earth, something that Barnes and his team are eager to explore. But for now, there is an entire universe of information yet to be gathered and examined, and photonics is taking the front seat on the spacecraft.

LOFAR (the Low Frequency Array), the world's largest radio telescope, uses some 7,000 small antennas to do astronomical research in the frequency range between 10 and 240 MHz ... it is a range in which one can examine some of the cosmic magnetic fields that arose in the "dark ages" of the universe, the time of the so-called reionisation ... [1]

Radio interferometric imaging constitutes a strongly ill-posed inverse problem. In addition, the next-generation radio telescopes, such as the Low Frequency Array (LOFAR) and the Square Kilometre Array (SKA), introduce additional direction-dependent effects which impact the image restoration. In the compressed sensing framework, we used the analysis and synthesis formulations of the problem and solved them using proximal algorithms. A simple version of our method has been implemented within the LOFAR imager and has been validated on simulated and real LOFAR data. It demonstrated its capability to super-resolve radio sources, to provide correct photometry of point sources in a large field of view, and to image extended emission with enhanced quality compared to classical deconvolution methods. One extension of our method is to use the temporal information of the data to build a 2D-1D sparse imager enabling the detection of transient sources. [1]
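The abstract above doesn't include the imager itself, but the core idea — a sparsity-regularized inverse problem solved by a proximal algorithm — can be sketched with ISTA (iterative soft-thresholding). The blur matrix, step size, and regularization weight below are toy illustrative assumptions, not the LOFAR imager's actual measurement operator:

```python
# ISTA sketch: minimise 0.5*||A x - y||^2 + lam*||x||_1 by alternating a
# gradient step on the data-fit term with the proximal operator of the
# l1 penalty (soft-thresholding), which promotes sparse solutions.

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1."""
    return [max(abs(v) - t, 0.0) * (1.0 if v >= 0 else -1.0) for v in x]

def ista(A, y, lam=0.01, step=0.1, iters=500):
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # residual r = A x - y
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        # gradient g = A^T r
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        x = soft_threshold([x[j] - step * g[j] for j in range(n)], step * lam)
    return x

# Sparse "sky" [1, 0, 0, 0.5] observed through a mild made-up blur.
A = [[1.0, 0.2, 0.0, 0.0],
     [0.2, 1.0, 0.2, 0.0],
     [0.0, 0.2, 1.0, 0.2],
     [0.0, 0.0, 0.2, 1.0]]
truth = [1.0, 0.0, 0.0, 0.5]
y = [sum(A[i][j] * truth[j] for j in range(4)) for i in range(4)]
x = ista(A, y)
print([round(v, 2) for v in x])  # recovers something close to the truth
```

The step size must stay below 2 divided by the largest eigenvalue of AᵀA for the iteration to converge; real imagers use accelerated variants (FISTA) and operators built from the telescope's uv-coverage rather than an explicit matrix.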

On the opening day I presented the first low-frequency limit on the rate of Fast Radio Bursts (recently published as Coenen, van Leeuwen et al. 2014). Those bursts, very recently discovered, originate from deep in the Universe, and are very much not understood. The limit is a result of the two LOFAR pilot surveys, which were largely processed on the large grid-compute cluster, operated by SURFsara and part of the European Grid Infrastructure coordinated by EGI.eu. Finding more of these bursts is one of the major goals of SKA; and my presentation made clear we can process such data on the Grid. My colleague Jason Hessels then presented the current tally of 11 (!) new LOFAR pulsar discoveries, all made possible through the large grid storage facility.

Today, the instrumentation setup and science results of several other ongoing projects were presented. One project that struck me in particular is RAPID by the Haystack Observatory at MIT. That is a set of easily deployable, self-powered, self-contained, low-frequency antennas, that will process their downstream data, through OODT , on the Grid.

Overall this meeting shows that while the data and compute demands for SKA remain challenging, the pathfinder telescopes are succeeding well. [1]

LOFAR is the Low Frequency Array for radio astronomy, built by the Netherlands astronomical foundation ASTRON and operated by ASTRON’s radio observatory. LOFAR will be the largest connected radio telescope ever built, using a new concept based on a vast array of omni-directional antennas. … LOFAR was officially opened on 12 June 2010 by Queen Beatrix of the Netherlands. Regular observations started in December 2012. [6]

The possibility that the ionosphere could be modified by powerful radio waves was first noted by Ginzburg and Gurevich [1]. The early theoretical work concentrated on the heating caused by the powerful radio wave, but later the emphasis gradually changed to plasma instabilities, turbulence, and plasma structuring. The first ionospheric modification facility was built in 1961 near Moscow, Russia, followed by facilities in Colorado, in Puerto Rico, at several additional sites in the former Soviet Union, in Norway, and in Alaska. AIT is currently being studied at research facilities located at middle (Sura, Russia) and high (EISCAT, Norway; HAARP and HIPAS, Alaska, USA) latitudes. In addition, a low latitude facility (Arecibo, Puerto Rico, USA) was active until 1998 and is now being rebuilt. Under construction in Europe is the huge LOFAR (Low Frequency Array), financed by the Dutch government. This 10–240 MHz radio telescope is of a new digital type which ensures maximum flexibility and cost effectiveness, allowing it to become the world’s largest and most efficient instrument for low-frequency radio studies of space. LOFAR is being supplemented by a likewise digital and cost effective infrastructure in Southern Sweden called LOIS (LOFAR Outrigger in Scandinavia). [7]

Of particular interest is to use LOFAR in combination with so called ionospheric HF interaction facilities. Such facilities are relatively simple to build, using commercially available HF radio transmitters and antennas. Existing systems today include the high-latitude facilities HAARP and HIPAS, Alaska, and EISCAT/Heating (Tromso), Norway, and the mid-latitude Sura facility, Russia. For nearly 30 years, a low-latitude facility was available at the Arecibo Observatory, Puerto Rico. A few years ago it was destroyed in a hurricane. There are now advanced plans to build a new HF interaction facility at Arecibo. Similar facilities have been proposed for equatorial latitudes both in Africa and in Asia.

We emphasize that a major objective for future space physics is to investigate further the possibility that human activities near the Earth may give rise to hitherto unidentified anthropogenic effects. [8]

These experiments are the tip of a very large iceberg. Even though HAARP’s future may be in jeopardy, new Sky Heaters like the 10-megawatt EISCAT 3D upgrade to the Tromso heater are on the not-too-distant horizon. Welcome to the wild world of science non-fiction. [9]

From a philosophical point of view, we have no experience of inanimate objects attaining consciousness on a level similar to humans. But this view may be biased, because we are essentially using our own consciousness to judge what counts as consciousness — and the reasoning might just as well work in reverse.

If we follow the line of argument that we ourselves have no consciousness and that reality is a skewed illusion of our own making, we would be advancing a claim for inanimate consciousness that extends beyond rocks to other entities on the life/non-life boundary that cannot project their consciousness onto human life (hence why we have had no experience of rocks expressing their consciousness to human beings).

There is no knowing whether rocks, plants or other naturally occurring objects have consciousness, because that is beyond the realm of human knowledge. We can philosophize about rocks having consciousness and being aware of their surroundings, because we have no means to reject the possibility that we are the imaginative figments of rocks and no more conscious than a mouse pad.

But since the official term 'consciousness' requires a relationship between mind and body, we would first have to establish the existence of a 'mind'. And even having established that, we would need the mind to express the fact that it exists and can be used for wakefulness, feeling and acquiring a sense of selfhood. All of this is impossible to establish unless the rock can consciously express itself to human beings; paired with the fact that it is not biologically 'living' and cannot carry out all or most life processes, it is likely, from a logical/scientific point of view, that rocks do not have consciousness. [1]

The philosophical systems that arose in India early on were meant to help one find clues to the nature of consciousness. It was recognized that a complementarity existed between different approaches to reality, presenting contradictory perspectives. That is why philosophies of logic (nyāya) and physics (vaiśeṣika), cosmology and self (sāṅkhya) and psychology (yoga), and language (mīmāṃsā) and reality (vedānta) were grouped together in pairs.

The system of Sāṅkhya considered a representation of matter and mind in different enumerative categories. The actual analysis of the physical world was continued outside of the cognitive tradition of Sāṅkhya in the sister system of Vaiśeṣika, which deals with further characteristics of the gross elements. The atomic doctrine of Vaiśeṣika can be seen as an extension of the method of counting in terms of categories and relationships. Reality in itself was taken to be complex, continuous and beyond logical explanation. However, its representation in terms of gross elements like space, mass (earth), energy (fire) and so on, which are cognitively apprehendable, can be analyzed in discrete categories leading to atomicity. The cosmology of Sāṅkhya [1][2] is really a reflection of the development of the mind, represented in cognitive categories.
...
[http://www.ece.lsu.edu/kak/cons.pdf]

in Samkhya, the form of Hinduism from which Chopra apparently draws heavily (if not exclusively), the very word “Samkhya” means “enumeration” and has to do with the correct explication and recognition – for the purpose of enlightenment – of the categories of reality [3]

still, it sounds good

What Is Cosmic Consciousness? The Quest for Hidden Reality

Is there such a thing as higher consciousness? For a tiny fraction of the population who believe they have experienced God directly, this is a spiritual question with a definite answer. But for most people the question is hypothetical. Every spiritual tradition has asserted that there is a hidden reality which can be uncovered through transcending — or going beyond — the five senses. There are elaborate directions for accomplishing this leap, in the form of prayer, meditation, renunciation, and faith — the religious history of humankind has never stopped directing its aspirations to a higher plane. But everyday life consumes our attention, and in a skeptical age the erosion of belief makes higher consciousness seem very far away if not irrelevant.

On a separate track, or so it seems, quantum physics has altered the universe in radical ways. Solid matter has been reduced to invisible waves existing in a field of mathematical probabilities. Time and space form a background in which relativistic quantum fields float, completely different from the reliable time ticked off by clocks, and the space enclosed inside rooms where solid objects find a place. Yet as with the higher dimensions aspired to by religion, quantum space remains hidden from the five senses. For the vast majority of physicists, quantum reality is about intricate mathematical constructs and experiments that validate them using billion-dollar particle accelerators.

Standing back a little, the resulting picture is quite startling. The two most important ways of explaining creation, science and spirituality, both depend on a hidden dimension. Without this dimension there would be no human existence. Shouldn’t that knowledge revolutionize our lives, here and now? Somehow it doesn’t. A missing link needs to be filled in. Otherwise, the world we inhabit will be disconnected from its source, as it largely is right now.

One proposition, which we strongly endorse, is that the missing link is consciousness. Because so many people relegate spirituality to faith, assuming that nothing about God or the soul can be proved, let’s set that aside for the moment. The link has to be scientific. We must thread a path from quantum theory to higher consciousness. This takes some hard thinking, but a huge reward awaits. Hidden reality will reveal itself for what it actually is. Higher consciousness may well become an everyday experience.

To begin, quantum theory, which has been called the most successful scientific theory in history, unequivocally states that we live in a participatory universe — what we consider as an independent, external reality is in fact tied to how we observe it. The late physicist John Wheeler of Princeton and the University of Texas campaigned for the importance of our participation, pushing against the notion that the universe was simply “out there,” like a bakeshop, he said, that we look at with our noses pressed against the window.

Yet how strange to think that when a physicist makes observations and measurements, the quanta that constitute everything in the cosmos change; indeed, it is meaningless to talk of their properties without presupposing an observer. The universe is tied to conscious acts of observation all the way from the most elementary particles to vast galaxies. Moreover, quantum theory assigns a primary role to the quantum vacuum, the emptiness that precedes observable phenomena like atoms and molecules. Unlike the common sense notion of empty space, the quantum vacuum is abundantly full of dynamic potential. Wheeler, besides coining the term “participatory universe,” also held that the quantum vacuum is primary in all physics, a view that has gained wide acceptance. The quantum vacuum is a vast plenum (fullness) of spacetime “foam,” beyond which time, space — and physics — come to an end.

Cosmology, which is based on the other most successful theory we have, Einstein’s general relativity, states that the universe emerged from this plenum of quantum foam at the time of the Big Bang and has been evolving ever since, for some 13.8 billion years. Everything we consider real, either to our senses or to scientific investigation, first passed through the so-called Planck era (a state so minuscule, brief, and turbulent that it cannot be penetrated — it’s a mathematical formulation that describes the limit of what we can know) and entered the phase of general expansion that created matter, energy, stars, galaxies, and biological life.

With a definite limit set on space and time, science has to wrestle with the fact that the human brain operates in space and time. But this doesn’t need to be discouraging. If the universe is in fact participatory, then the human brain must be participating on the quantum level. Why is this so? Because the quantum foam, which is the source of every particle in existence and its oppositely charged anti-particle, must be the brain’s source, too. It isn’t tenable to posit a universe where quantum reality is divorced from everyday reality. The micro and macro worlds derive from the same origin, not just billions of years ago, but at this very minute — reality bubbles up from the quantum vacuum continuously.

The quantum foam allows entangled quanta to emerge from it, while the vast majority of them fall back onto it. Thus the creation, maintenance, and re-absorption of virtual particles occur at all times and at all space points. Our senses force us to see one sunrise at a time, one birthday party at a time, one person at a time. Yet without a doubt reality isn’t confined to linear experience in space and time. One can even say that beyond our confined perceptions, the Big Bang is happening everywhere at once, in an eternal now. Creation is a single process, and we are totally immersed in it.

This is where the missing link is most urgent. Physics needs the quantum vacuum for various reasons, most of them mathematical, but everyday life seems to chug along quite nicely without it. However, since everything in existence depends upon the quantum vacuum, including all living beings, our belief that we live outside it must be false. There’s no logical escape from this fact, so the burden lies with changing our sense of reality — the “going beyond” that supposedly belongs to saints and mystics actually applies to everyone (perhaps saints and mystics are just the ones who caught on first).

Here a subtle point arises. The strangeness of the quantum world, which approaches legendary status, grew out of its contradictions with prevailing theories of long standing. (For example, the accepted idea of cause-and-effect isn’t consistent with the quantum possibility of time and causation going backwards.) Yet no matter what model you use to explain reality, the map is never the territory. Whatever secrets it reveals, reality remains what it is, unanswerable, irrefutable, and inconceivable.

This, too, doesn’t have to be a discouraging thought. It’s liberating to realize that we are part of this inconceivable reality, navigating through it with all kinds of questions yet sustained by it no matter how wrong, limited, or misguided our answers may be. Participating in the quantum field makes a serious difference in how life can be led. For it turns out that all the spooky phenomena in the quantum world are perfectly human and familiar — once you stop comparing them to old, worn-out explanations.

In the next post we’ll restore the missing link by an act of destruction and creation, knocking down the perceptions that limit everyday life and replacing them with perceptions that allow hidden reality to come alive.

(To be cont.)

Deepak Chopra, MD, FACP, is the author of more than 75 books, including twenty-two New York Times bestsellers.

One possible finite geometry is donut space, more properly known as the Euclidean 2-torus: a flat square whose opposite sides are connected. Anything crossing one edge reenters from the opposite edge
...
A finite hyperbolic space is formed by an octagon whose opposite sides are connected, so that anything crossing one edge reenters from the opposite edge
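The gluing rule for the flat torus described above is easy to make concrete: positions simply wrap modulo the square's side length. A minimal sketch (unit square assumed; the function name is mine):

```python
# Flat 2-torus: a square with opposite edges identified.  Moving off one
# edge re-enters from the opposite edge, which is just coordinate
# arithmetic modulo the square's side length.

def torus_step(x, y, dx, dy, side=1.0):
    """Move a point (x, y) by (dx, dy) on a flat square torus."""
    return ((x + dx) % side, (y + dy) % side)

# Walking 0.3 to the right from x = 0.9 crosses the right edge and
# re-enters on the left, near x = 0.2.
px, py = torus_step(0.9, 0.5, 0.3, 0.0)
print(round(px, 2), py)
```

The same modular arithmetic is why a straight-line path on a torus can return to its starting point without ever meeting a boundary — the space is finite yet has no edge.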
...
http://abyss.uoregon.edu/~js/21st_century_science/lectures/lec21.html

Does this mean we should panic? I mean, sure, go ahead, if you’ll feel better afterwards. But really, what it means is this: if we can agree that it makes sense to allocate resources towards reducing the risk of people getting killed by things in proportion to the likelihood of those things actually occurring, then asteroid detection and intervention is definitely worth our attention. Probably more so than some other things. Like terrorism. But I digress.

The European Space Agency and NASA have a good understanding of the importance of finding and (hopefully) avoiding asteroids, and they’re joining forces on an Asteroid Impact & Deflection Assessment mission called AIDA. The objective of this mission is awesome: to slam a spacecraft into an asteroid with as much force as possible, and see what happens.

As you may remember, this is not the first time that we’ve deliberately crashed a spacecraft into an asteroid just to see what would happen. In 2005, the Deep Impact spacecraft launched a 370-kilogram copper-cored impact vehicle at the 6-km-wide comet Tempel-1, and then hung back to watch the show.

That impact changed the orbit of the comet by something like 10 centimeters. Not much, obviously, but it’s a start.

The AIDA mission has a similar structure to Deep Impact. It has two primary components: an observatory called AIM (Asteroid Impact Mission) being developed by ESA, and an impactor called DART (Double Asteroid Redirection Test) being developed by NASA. The AIM observatory, which will be launched first, won't be hugely complicated or expensive. It's expected to carry some cameras and radar and communications gear and stuff like that, and also a boxy little lander based on DLR’s MASCOT. There will be room for a few cubesats, too.

AIM will launch out towards a near-Earth asteroid called Didymos. Didymos was chosen because it's super easy to reach from Earth, and at about 800 meters in diameter, it’s the size of something that we'd probably want to try and avoid. Didymos also has a little moon (technically, it’s a binary asteroid) which has a diameter of 150 m. The moon, known as Didymoon, is going to be the target asteroid, because any change in its orbit will be easier to measure, and it’s actually more effective to move the moon, and then let the moon’s new orbit shift the orbit of the binary system as a whole.

NASA will be sending DART on a much more direct course, since it won't have to sneak into orbit around Didymos. Instead, it’ll smack into Didymoon at just over 6 kilometers per second (13,400 miles per hour), abruptly dumping the kinetic energy of its roughly 300 kg of mass in a stupendous burst. The hope is that when all the dust settles, AIM will be able to measure a change in Didymoon’s orbital velocity of about half a millimeter per second. Given enough time, even that minuscule a shift could be significant.
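The figures in that paragraph can be cross-checked with two lines of physics: the impact energy is the impactor's kinetic energy, and the moon's velocity change follows from momentum conservation (assuming, conservatively, no extra push from ejecta). Didymoon's mass below is a round illustrative assumption, not a mission figure:

```python
# Back-of-envelope check of the DART impact numbers.

M_IMPACTOR = 300.0   # kg, spacecraft mass at impact (from the text)
V_IMPACT = 6_000.0   # m/s, impact speed (from the text)
M_MOON = 5e9         # kg, ASSUMED round mass for the 150 m moon

kinetic_energy = 0.5 * M_IMPACTOR * V_IMPACT ** 2   # joules
delta_v = M_IMPACTOR * V_IMPACT / M_MOON            # m/s, pure momentum transfer

print(f"Impact energy:   {kinetic_energy / 1e9:.1f} GJ "
      f"(~{kinetic_energy / 4.184e9:.1f} tons of TNT)")
print(f"Delta-v of moon: {delta_v * 1e3:.2f} mm/s")
```

With these assumptions the simple momentum transfer alone gives a few tenths of a millimeter per second, the same order as the half millimeter per second quoted above; material blasted off the surface carries extra momentum and can push the real change higher.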

You don’t have to worry about this tinkering with Didymos’ orbit, though. As NASA points out, “it is important to note that the target Didymos is not an Earth-crossing asteroid, and there is no possibility that the DART deflection experiment would create an impact hazard.” Good to hear. If everything runs on schedule, AIM will launch in 2020, and the DART impact will happen in 2022.

This direct approach is just one way that we might be able to deflect an asteroid from Earth impact; NASA is going to try a gravity tractor in 2020, for example. If (when) we need to put these techniques into practice, the one we choose will depend on how long we have before impact, the size of the asteroid, where it is, what it’s made of, and all kinds of other factors that we can’t know in advance. All we can do is experiment and practice, because when the time comes, a failure of the technologies that NASA, ESA, and others are developing could mean anything from the obliteration of an entire city to the collapse of civilization, to the total extinction of the human race.
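To see why warning time leads that list of factors, consider how far even a half-millimeter-per-second nudge carries an asteroid when applied years in advance. This is a deliberately crude sketch (straight-line drift only; the real deflection also compounds as the altered orbital period shifts the asteroid's position every revolution):

```python
# Along-track drift from a tiny velocity change: distance ~ delta_v * time.
delta_v_ms = 0.0005  # the ~0.5 mm/s figure discussed above

seconds_per_year = 365.25 * 24 * 3600
for years in (1, 5, 10):
    drift_km = delta_v_ms * years * seconds_per_year / 1000.0
    print(f"{years:2d} years of lead time -> ~{drift_km:.0f} km of drift")
```

Tens to hundreds of kilometers of drift won't clear Earth's roughly 6,400 km radius on its own; it's the compounding change in orbital period that does the heavy lifting, which is exactly why decades of lead time beat years.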

In 2029, the asteroid Apophis will pass very close to the Earth: within the orbits of our communication satellites. It won't hit; however, there is a slight chance that this close pass will shift its orbit by exactly the right amount to cause it to hit Earth on a second pass in 2036.

The resulting spot on Jupiter is presently about twice the length of the whole of Europe. Simon-Miller estimated that the diameter of the object that slammed into Jupiter was at least twice the size of several football fields. The force of the explosion on Jupiter was thousands of times more powerful than that of the suspected comet or asteroid that exploded over the Tunguska River Valley in Siberia in June 1908 - http://spacetelescope.org/news/heic0909/

What a cool observation ~ big brother takes the hits for all of us ~ It's not surprising that such things happen, as Jupiter is the largest and most massive planet and acts as a sort of shepherd of small solar system bodies

Yup, landing on another celestial body with less total electronic computing power than most of us carry in our pockets daily now (not just in the vehicles, but across the whole program). A testament to dedicated engineers with slide rules.

It is a sad reality that some spacecraft, like Esa's Philae lander, don't last as long as they should, while others, such as the agency's Venus Express, continue to surprise us.

In its recent swansong, as the probe plummeted toward the planet, Venus Express gathered new and unexpected information about the planet's polar atmosphere.

This data has now been studied and reveals the poles are colder than any place on Earth.

They are also covered with rippling atmospheric waves.

Venus Express arrived at Venus in 2006.

It spent eight years orbiting the planet, greatly exceeding the mission's planned duration of 500 days, before it ran out of fuel.

The probe then began its descent into Venus' atmosphere, before contact with Earth was lost in November 2014 and the mission officially ended the following month.

Before it plummeted down through the planet's atmosphere, the probe's measurements showed the polar atmosphere to be rippling with atmospheric waves and, at an average temperature of -157°C (116 K), colder than anywhere on Earth.

These recent observations show the planet's atmosphere is much more interesting than first thought.

During the final months of its mission, Venus Express orbited the planet low enough to measure drag from the atmosphere.

Our previous understanding of Venus' polar atmosphere was based on observations gathered by Nasa's Pioneer Venus probe in the late 1970s.

These were of other parts of Venus' atmosphere, near the equator, but extrapolated to the poles to form a complete atmospheric reference model.
