These transient features got their otherworldly names because they last only about 20 milliseconds.

The pulsating blue jet from the top of the northern cloud. Frame 1, the first of the time sequence, serves as a reference frame to illustrate the structure of the cloud; frames 2 through 8 show the pulsating blue jet. (Credit: Geophysical Research Letters)

Mogensen was told to take photos of a cloud turret like this one, photographed from the International Space Station. (Credit: NASA)

Transient luminous events discovered in Earth's atmosphere since the first observations in 1989. (Credit: ESA/IAA-CSIC)

A red sprite captured on camera from the International Space Station on August 10, 2015, over southern Mexico. (Credit: NASA)

The red color of sprites is caused by nitrogen interacting with electricity in the upper atmosphere. (Credit: ESA/Jason Ahrns)

Scientists don't know much about the mysterious, powerful electric discharges that sometimes occur in the upper levels of the atmosphere in conjunction with thunderstorms. These phenomena, which can occur as high as about 90km above the surface of the Earth and are known variously as sprites, pixies, elves, or jets, were first photographed from Earth only in 1989.

Fortunately for scientists interested in these storms, the International Space Station offers an excellent vantage point at an altitude of about 400km. So Danish researchers devised the Thor experiment, named after the hammer-wielding Norse god, to study the phenomenon. As part of the experiment, an astronaut on board the station would image thunderstorms under certain conditions, and these observations would be correlated with data collected by satellites and ground-based radar and lightning detection systems.

It may sound easy to catch a few quick snaps of electrical storms, but given the station's speed of 28,000km/hour and the ephemeral nature of these events, it's actually quite difficult. Sprites and other features got their otherworldly names precisely because they are so short-lived, lasting on the order of 20 milliseconds.

When Danish astronaut Andreas Mogensen spent 10 days on the station in September 2015 as part of an ESA-Roscosmos contract that designated him a visiting crew member, one of his primary tasks was to complete the Thor experiment. Perched in the station's cupola, with a Nikon D4 set at 6400 ISO and recording 24 frames per second, Mogensen readied himself to capture images at locations where forecasters predicted thunderstorm activity would occur below.

Mogensen succeeded. As the station passed over the Bay of Bengal, between India and Myanmar, Mogensen shot a 160-second video of the cloud turret of a storm that produced 245 blue flashes. Meanwhile, Doppler radar at a station in Machilipatnam, India, and elsewhere tracked the storm from the ground. Scientists recently reported their results in Geophysical Research Letters.

The study observed a majority of discharges between 18km and 40km, or as high as the upper stratosphere, and provided a new perspective on the electrical activity that occurs at the top of tropical thunderstorms, the authors say.

Notably, the discharges appear to play a significant role in the exchange of gases between the troposphere and the stratosphere. Scientists had previously understood that the red color of sprites is caused by nitrogen interacting with electricity in the upper atmosphere, but with the new data they were able to study the discharges further and found that they release comparatively large amounts of various nitrogen oxides into the upper atmosphere. "They underscore that thunderstorm discharges directly perturb the chemistry of the stratosphere with possible implications for the Earth's radiation balance," the authors write.

A professor of mine worked out that when you have a phase change, like steam condensing into water or a metal solidifying from the melt, you get a Peltier-like effect at the interface of the two phases. He thought it might have something to do with the formation of lightning, since the two phases can induce currents that, at least in metals, could affect the growth of crystals from the melt. So when water condenses in the atmosphere, the Peltier effect, in addition to wind shear, could lead to charge separation and the build-up of voltage.

Why does the ISS not have a camera system pointed at Earth to capture these photos all the time, without an astronaut having to point a camera manually? Like the current live stream, but better: 4K, preferably with multiple cameras pointed at different areas of the surface and controllable by mission control, with lens and aperture adjustment capabilities that would show surface light when on the dark side. That seems like it would not only be awesome, but super useful too. Or would something like that be just wayyyyy too expensive to do?

Capt. Picard: I understand what you've done here, Q. But I think the lesson could have been learned without the loss of 18 members of my crew.

Q: If you can't take a little bloody nose, maybe you ought to go back home and crawl under your bed. It's not safe out here. It's wondrous, with treasures to satiate desires both subtle and gross. But it's not for the timid.

That was exactly what came to my mind too. I was disappointed when I found out part of the appearance is because it was the top of a cloud...

What is it?

In the image? It's a wormhole, as depicted in Star Trek Deep Space 9 (sure, nothing like what a wormhole should actually look like, but the sci-fi geek in me won't let a little detail like that get in the way of wishing it was).

Picked up a Nova show recently titled "At the Edge of Space." Pretty good; if you can see it online or get it from your library, it has a lot of good photos and information. I learned quite a bit about these events from the show.

Wow, thanks Eric for bringing these strange thunderstorm events to my attention! Interesting stuff. And thanks, Ars, for sending me onto yet another multi-hour research binge on the net while I have work to do. Yeah, thank you very much ;-).

An 8.3-megapixel image of the whole Earth isn't very useful; that's pixels the size of islands. And with a shutter time on the order of the length of a sprite, chances are you'll miss most of the light anyway.
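For what it's worth, the pixel-size claim checks out on the back of an envelope. A minimal Python sketch, assuming a hypothetical 3840x2160 (~8.3 MP) frame in which the Earth disc spans the short axis; real ISS views cover far less, so this is the best case for whole-Earth coverage:

```python
# Quick pixel-size check, assuming an 8.3 MP (3840 x 2160) frame whose
# short axis spans the full Earth disc. These are illustrative numbers.

EARTH_DIAMETER_KM = 12_742
FRAME_HEIGHT_PX = 2160  # short axis of a 4K UHD frame (~8.3 MP total)

km_per_px = EARTH_DIAMETER_KM / FRAME_HEIGHT_PX
print(f"~{km_per_px:.1f} km per pixel")  # ~5.9 km: island-sized pixels
```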

Why does the ISS not have a camera system pointed at Earth to capture these photos all the time, without an astronaut having to point a camera manually? Like the current live stream, but better: 4K, preferably with multiple cameras pointed at different areas of the surface and controllable by mission control, with lens and aperture adjustment capabilities that would show surface light when on the dark side. That seems like it would not only be awesome, but super useful too. Or would something like that be just wayyyyy too expensive to do?

Take a look at those photos again, and ponder that they're taken from about 300km away. The angle of view is very small; you would need hundreds or thousands of cameras to cover the whole Earth at such resolution in real time, and then you'd have to figure out a way to handle the petabytes of data that would come falling out of the sky.
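A rough sanity check on that data-volume claim, with all numbers assumed purely for illustration (1,000 cameras, 8.3 MP sensors, 24 fps, 12 bits per pixel, uncompressed):

```python
# Back-of-envelope data rate for a hypothetical many-camera system.
# Every figure here is an assumption, not a real system spec.

cameras = 1000
pixels = 8.3e6       # per frame
fps = 24
bits_per_px = 12     # typical raw sensor depth (assumed)

bytes_per_s = cameras * pixels * fps * bits_per_px / 8
petabytes_per_day = bytes_per_s * 86_400 / 1e15
print(f"~{petabytes_per_day:.0f} PB/day")  # ~26 PB/day before compression
```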

There are plenty of satellites - including the ISS - that do broad angle imaging more or less continuously. But for small features like these, the only practical way to record them is as was done here - careful planning and highly targeted imaging.

It would probably also result in a massive torrent of data that would have to be transmitted and analyzed, too. The analysis time and computation aren't free, and neither is the bandwidth!

Hence picking thunderstorms and focusing both the ISS and surface tracking systems on them at the same time: way easier to deal with, and probably much more awesome given the available resources!

TLDR: Totally setting aside whether it's a research priority, I think one could plausibly design a distributed system to handle the torrent of information from an ultrafast camera setup. To be clear, this is a back-of-the-envelope "what would it take" analysis, not a "we can definitely do this" one.

The fact that this single storm led to several hundred observations of the phenomenon leads me to think there might be a feasible compromise: some manner of (semi-)automated ultrafast imaging. Suppose one can detect and discriminate characteristic transients (with standard imaging or perhaps spectroscopy) with fair probability at a particular latency. The goal is to turn on a fast camera (or other instruments) when there is a high probability of detecting events, point it in the right direction, and carefully ignore almost everything it records in close to real time.

Big particle physics experiments work much like this now, with real-time filtering throwing away the vast majority of events within moments and only saving the ones that look more interesting. It's still a flood of data, but one that distributed terrestrial systems can be built to handle. There is a similarity to real-time supernova detection, like SNEWS, as well. The constraints in space will be different, but maybe the idea could be adapted.

I will now perform extensive hand waves.

A solution along these lines would probably need quite different space and terrestrial systems. The space part would have very low-latency access to the raw data, but limited storage and a strict power envelope that limits the quality of event detection and analysis. I'd imagine it has two modes. The first is a really low-power, wide-angle one that runs continuously (say 60-120 fps, enough to see the flashes) and occasionally sends frames to Earth that might be interesting but are computationally intensive to analyze well. The second is a high-power, narrow-angle one that is only activated when events are very likely to occur in the near future and uses a much higher frame rate (thousands of fps?), but preserves station bandwidth (albeit using more than the low-power mode) by sending just enough information to Earth to identify the probable beginning and end of events. The local storage would be high-bandwidth but limited in capacity. It needs a large enough buffer to hold all recent frames until Earth says what to keep and what to throw away, as well as events that were sufficiently interesting. As these events are only ~20 ms long, even at a high frame rate this is probably tractable. When things calm down, one can make decisions about which events to send to Earth directly and which will be sent back on a drive.
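To make the two-mode idea concrete, here is a minimal Python sketch of the controller logic. Every name, rate, and threshold below is an assumption for illustration, not a description of any real flight system:

```python
# Minimal sketch of the two-mode imaging controller described above.
# All names, rates, and thresholds are illustrative assumptions.

import time
from enum import Enum, auto

class Mode(Enum):
    WIDE_LOW_POWER = auto()   # continuous 60-120 fps survey mode
    HIGH_SPEED = auto()       # triggered, thousands-of-fps mode

class ImagingController:
    def __init__(self, trigger_score=0.8, calm_seconds=120):
        self.mode = Mode.WIDE_LOW_POWER
        self.trigger_score = trigger_score   # ground-supplied likelihood cut
        self.calm_seconds = calm_seconds     # quiet time before standing down
        self.last_event_time = 0.0

    def on_ground_update(self, event_likelihood):
        # Ground analysis of downlinked survey frames says events are
        # likely soon: switch to the high-power, high-frame-rate mode.
        if event_likelihood >= self.trigger_score:
            self.mode = Mode.HIGH_SPEED
            self.last_event_time = time.monotonic()

    def on_local_event(self):
        # The crude onboard filter saw a candidate flash; keep
        # high-speed mode alive.
        self.last_event_time = time.monotonic()

    def tick(self):
        # Drop back to the low-power survey once the storm goes quiet,
        # freeing the buffer and the power budget.
        if (self.mode is Mode.HIGH_SPEED and
                time.monotonic() - self.last_event_time > self.calm_seconds):
            self.mode = Mode.WIDE_LOW_POWER
```

The key design choice is that mode switches are driven both by ground analysis (high latency, high quality) and by the crude onboard filter (low latency, low quality).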

On Earth, the power and computational limits are comparatively negligible, so the goal is to limit the total latency (travel plus computation) subject to reasonable bandwidth constraints with the station. The intensity of analysis can be much higher than is possible on the station, and this is especially important when the station is taking high-frame-rate data, since only a tiny fraction of the total data can be sent.

Now, some thoughts about the time scales and the constraints they place on processing times. The video in the article was 160 s and averaged about 1.5 events/s, so the time scale for observing a storm from a high angle is on the order of minutes. The total latency to go into high-speed mode should be low enough to observe, with high probability, at least a few events. With events on the order of 1 per second, a latency of up to order minutes will still allow at least seconds of imaging on a good fraction of storms. If 0.1 per second is more typical, a latency on the order of tens of seconds would be ample, and so on. The communication latency to the ISS is probably on the order of 1 second, so if the event rate is > 0.1/s, the processing time is the major factor.
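If the flashes arrive roughly as a Poisson process at rate r, the chance of catching at least one during T seconds of high-speed imaging is 1 - exp(-rT), which makes the latency budget above easy to check (the rates below are from the article plus one assumed pessimistic case):

```python
# Poisson-arrival check of the latency argument: with rate r events/s
# and T seconds of imaging left after all latency, P(>=1) = 1 - e^(-rT).

import math

def p_at_least_one(rate_per_s, window_s):
    return 1.0 - math.exp(-rate_per_s * window_s)

# The storm in the article averaged ~1.5 events/s over 160 s.
print(p_at_least_one(1.5, 10))   # ~1.00 -> even 10 s of imaging suffices
print(p_at_least_one(0.1, 10))   # ~0.63 -> marginal at 0.1 events/s
print(p_at_least_one(0.1, 60))   # ~1.00 -> a minute of imaging recovers it
```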

Once in high-speed mode, the major constraint is the available storage buffer on the ISS and how quickly it can be cleared. That will obviously depend on what data is being taken, so I looked at some cameras near the upper end of what's possible. The Phantom VEO 410 has up to 72 GB of memory and can take 720p at 5,800 fps for about 12 s. (For reference, the Phantom v2012 shoots 720p at 25,000 fps, about 10 s with 288 GB of RAM.) For a 20 ms event, that's 116 frames of sciency goodness. If we want to record for several minutes without some real-time way to throw frames out, we're talking several terabytes of RAM, compared to which the space required to store interesting events is negligible. Now, I don't know what a truly space-hardened high-speed camera with 10 TB of memory costs, and I don't want to know. (Well, that's a lie, I do want to know.) However, one can prioritize the core functions and accept a higher rate of faults in memory that will never be used for anything but the video, most of which you plan to toss anyway. In any case, I'm trying to adroitly avoid research priority and budget considerations beyond "not totally crazy."

Obviously, if we could do all the classifying on the station, that would be super, but let's proceed under the assumption that it takes more power than is practical. Due to communication latency, Earth-bound decisions about which frames to keep (pre-filtered by the station to cut down on bandwidth) take no less than order seconds; to be safe, let's say 10 s is the bare minimum with local filtering and terrestrial processing. For this frame rate and resolution, a 100 GB buffer would probably be about right; for 1,000 fps, about 20 GB. While still considerable, these numbers are achievable with real devices, not just hypothetical ones.
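Those buffer figures are easy to reproduce. A quick check, assuming uncompressed 720p frames at 12 bits per pixel (the bit depth is an assumption, not a quoted camera spec):

```python
# Reproducing the buffer estimates above, assuming uncompressed 720p
# frames at an assumed 12 bits per pixel of raw sensor data.

def buffer_gb(width, height, bits_per_px, fps, seconds):
    bytes_per_frame = width * height * bits_per_px / 8
    return bytes_per_frame * fps * seconds / 1e9

# ~10 s of retention while Earth decides what to keep:
print(buffer_gb(1280, 720, 12, 5800, 10))  # ~80 GB -> "about 100 GB"
print(buffer_gb(1280, 720, 12, 1000, 10))  # ~14 GB -> "about 20 GB"
```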

The big question, then, is what kind of processing can be done, and whether it can be done fast enough. I imagine station filtering has to be pretty simple: running a very fast image classifier, or even just looking for sudden changes in intensity. If I remember right, LHCb does the bulk of this work in fixed-latency dedicated hardware, so I don't want to underestimate the difficulty. Still, back on Earth one has the freedom to go fully parallel to distinguish something like lightning from the events we're actually interested in. While I have experience in scientific modeling, computer vision is largely outside my expertise. That said, this isn't a self-driving car: the tolerances for "good enough" classification are pretty loose once you point the camera at a storm generating a fair number of events, and the latency requirements are considerably less strict. If having this information at better than guy-with-a-DSLR frame rates were a scientific priority, I'm inclined to think it could be done.
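The "sudden changes in intensity" filter really can be that simple. A minimal sketch, assuming grayscale frames arriving as numpy arrays; the smoothing factor and threshold are illustrative, not tuned values:

```python
# Simplest plausible onboard filter: flag a frame whose mean intensity
# jumps well above a running baseline. All parameters are illustrative.

import numpy as np

class FlashTrigger:
    def __init__(self, alpha=0.05, sigma_threshold=5.0):
        self.alpha = alpha                    # EWMA smoothing factor
        self.sigma_threshold = sigma_threshold
        self.baseline = None                  # running mean intensity
        self.var = 1.0                        # running variance estimate

    def __call__(self, frame: np.ndarray) -> bool:
        mean = float(frame.mean())
        if self.baseline is None:
            self.baseline = mean
            return False
        deviation = mean - self.baseline
        triggered = deviation > self.sigma_threshold * self.var ** 0.5
        # Update the running baseline and variance (EWMA).
        self.baseline += self.alpha * deviation
        self.var = (1 - self.alpha) * self.var + self.alpha * deviation ** 2
        return triggered
```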

I want to know more about how these amazing photos were obtained. Did the astronaut simply hold the camera in his hands, stand at a window, and zoom in on the likeliest storm? Or did he have some sort of tripod or mount, with the camera connected to a computer, and then manually locate and zoom in and let the computer take thousands of shots? Considering the ISS moves pretty fast, were the mount and zoom motorized and remote controlled from Earth, but still just shooting through a window? (So here the astronaut's role would be to set it up when conditions were good, and take it down to make more space at other times?)

In other words, he got paid to sit in the cupola and sightsee: to soak in the 3D-ness of the Earth and the clouds beneath him.

As someone who has hiked to the South Rim in Big Bend National Park at 3 am just to take pictures and watch the landscape (think of Mars but with clouds and cacti) change as the sun rises and moves (very 3d-ey) I am extremely jelly.

Once in high-speed mode, the major constraint is the available storage buffer on the ISS and how quickly it can be cleared. That will obviously depend on what data is being taken, so I looked at some cameras near the upper end of what's possible. The Phantom VEO 410 has up to 72 GB of memory and can take 720p at 5,800 fps for about 12 s. (For reference, the Phantom v2012 shoots 720p at 25,000 fps, about 10 s with 288 GB of RAM.) For a 20 ms event, that's 116 frames of sciency goodness. If we want to record for several minutes without some real-time way to throw frames out, we're talking several terabytes of RAM, compared to which the space required to store interesting events is negligible.

While I don't have specific experience with the VEO, most of Phantom's camera line-up has the capability to buffer continuously, only saving frames when the record button is hit. The fun part is that you can set it to save frames from BEFORE the record button is hit. I'll bet it would be super simple to detect the flash and trigger the high-speed camera to record a few thousand frames before and after. There are so many motion- and light-activated camera triggers out there that I'll bet you could get something working with completely off-the-shelf parts.
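That pre/post-trigger behavior is easy to sketch with a ring buffer. A minimal example, where the frame source, trigger function, and buffer sizes are all assumptions for illustration:

```python
# Sketch of pre/post-trigger recording: continuously keep the last N
# frames in a ring buffer, and on trigger retain those plus the next M.
# Frame source, trigger, and sizes are illustrative assumptions.

from collections import deque

def record_event(frames, trigger, pre_frames=2000, post_frames=2000):
    """frames: iterable of frames; trigger: callable(frame) -> bool."""
    ring = deque(maxlen=pre_frames)   # overwrites the oldest automatically
    frames = iter(frames)
    for frame in frames:
        if trigger(frame):
            # Keep the pre-trigger history plus the frames that follow.
            captured = list(ring) + [frame]
            for _ in range(post_frames):
                try:
                    captured.append(next(frames))
                except StopIteration:
                    break
            return captured
        ring.append(frame)
    return []
```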

Why does the ISS not have a camera system pointed at Earth to capture these photos all the time, without an astronaut having to point a camera manually? Like the current live stream, but better: 4K, preferably with multiple cameras pointed at different areas of the surface and controllable by mission control, with lens and aperture adjustment capabilities that would show surface light when on the dark side. That seems like it would not only be awesome, but super useful too. Or would something like that be just wayyyyy too expensive to do?

There is a live stream (http://www.ustream.tv/channel/iss-hdev-payload); it's just not aimed at anything in particular. It's probably a lot easier to send up a Nikon and tell an astronaut where to point it than to make a space-hardened, remote-controlled surveillance system.

There was a great magazine in the '90s called "Earth"; I still remember the article that told all about blue jets, red sprites, and green elves. The thought of all sorts of otherworldly things like that above thunderstorms was the coolest thing; it made such an impression on me that I never forgot it.

Why does the ISS not have a camera system pointed at Earth to capture these photos all the time, without an astronaut having to point a camera manually? Like the current live stream, but better: 4K, preferably with multiple cameras pointed at different areas of the surface and controllable by mission control, with lens and aperture adjustment capabilities that would show surface light when on the dark side. That seems like it would not only be awesome, but super useful too. Or would something like that be just wayyyyy too expensive to do?

I think the reasons are:

1) The cameras used are specially designed to handle the conditions of space; as such, the two cameras used on that video site cost $17 million.
2) There are major bandwidth limitations on the ISS, so more video, let alone 4K, would be a major hurdle.
3) The current live stream site is the first TEST of being able to stream live video to Earth and measure camera life. After they prove it works, they might do exactly what you propose.