CosmosVR

Inspiration

When you think big, you might think of our Solar System, or maybe even our galaxy - but our cosmos is far, far more vast than that. Our home galaxy, the Milky Way, hosts hundreds of billions of stars, enough gas to make billions more, and five times as much mass again in a mysterious substance called dark matter. At least trillions - that is, millions of millions - of galaxies fill the visible Universe. Together, they form the structure you see here, called the Cosmic Web.

Some galaxies, like our Milky Way, live in groups with a few dozen partners. The Milky Way is on a collision course with our nearest neighbour, Andromeda. Don't worry, the collision isn't due for another 4 billion years! But when it happens, it will be spectacular. Groups live in long chains called cosmic filaments, which eventually feed the largest groups, called galaxy clusters.

What it does

Look around you. Navigate your way through filaments, groups, and individual galaxies. When you find an interesting object, give it a little attention. If you look long enough, you'll get to zoom in and see it in much more detail, and learn more about that type of object. What do you see? How do the stars, gas and dark matter live in relation to each other?

The Data

Every point you see in this experience is a real data point from the Illustris cosmological simulation. The simulation starts with a Universe merely 51 million years old and runs it all the way to the present day, at an age of 13.7 billion years. Astronomers know that in the early days the Universe was largely smooth, just one big hot soup of protons, electrons, light particles, and dark matter. If one part of this soup is a little denser, it attracts other parts to it by gravity and grows even denser over time; regions that were less dense in the beginning grow emptier and emptier. As gas gets denser, some of it forms clumps that eventually form stars, and the Universe starts to light up. Finally, by the present day - which is what we show - the large-scale structure looks like a giant web, called the Cosmic Web. The zoom-in at the end takes you to high-resolution data on a single galaxy.

This simulation thus represents our best understanding to date of how the Universe evolves on the largest scales. Urmila published her first academic paper on measuring the dark matter distribution in large groups of galaxies (galaxy clusters) by looking at the light from the stars in the cluster galaxies, using this very simulation! Dozens of papers have used this simulation to pin down exactly what our current model of dark matter and dark energy predicts, and how those predictions compare to observations. Now, anyone can experience this.

User Experience

We designed the experience around the user, starting with personas for the audiences we wanted to reach.

Then, for each user, we mapped out their "User Journey" through our application, imagining what would matter most at each stage.

Our conclusion from this study was that we primarily wanted to target an audience like "Bob" - someone without much technical knowledge who is curious and wants an exciting experience! We also considered later expanding the experience to target researchers like Alex as a secondary audience, who might use it for data analysis that isn't possible in traditional media.

Design

Based on our user-story conclusions, we designed an experience to fit our primary audience. Part of this was a fun, "cool" heads-up display that makes the user feel like they're inside an astronaut's helmet. This design choice also helps counteract the nausea that flying through space can induce. We also wanted a system for changing helmets to see different types of information, such as gas and stars. These designs were first sketched on paper, shown below.

Next, the heads-up display was implemented in Inkscape and imported into Unity.

Implementation

We downloaded snapshots from the openly accessible Illustris simulation (www.illustris-project.org/data). Like most astrophysical simulations, this was in HDF5 format. We used Python to select the information we wanted to visualize - density of dark matter and gas, the colors of star particles, and positions for everything - and convert it into .xyz files for Meshlab (http://www.meshlab.net/). Meshlab converted these into PLY files, and we found an open-source library to load these into Unity. This gave us our first view of the Universe in 3D!
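The conversion step above can be sketched roughly as follows. This is an illustrative Python sketch, not our exact pipeline script: the HDF5 group names in the comment follow the Illustris snapshot convention, and the small synthetic array stands in for data that would normally be read with `h5py`.

```python
# Sketch of the snapshot -> .xyz conversion step (illustrative, not our exact script).
# With real data, the positions would come from an Illustris HDF5 snapshot, e.g.:
#   import h5py
#   with h5py.File("snap_135.hdf5") as f:
#       positions = f["PartType1/Coordinates"][:]   # dark matter particle positions

def to_xyz(positions):
    """Format an iterable of (x, y, z) tuples as Meshlab-readable .xyz text,
    one whitespace-separated point per line."""
    return "\n".join(f"{x:.6f} {y:.6f} {z:.6f}" for x, y, z in positions) + "\n"

# Tiny synthetic stand-in for particle coordinates
positions = [(0.0, 0.0, 0.0), (12.5, 3.0, 7.25)]
xyz_text = to_xyz(positions)
print(xyz_text, end="")
```

Meshlab can then open the resulting .xyz file directly and export it as PLY for the Unity loader.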

We then used the Vive SDK to allow users to fly through the simulation using the controllers.
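Conceptually, the fly-around locomotion is just per-frame integration of the view direction scaled by how far the trigger is pressed. Our actual implementation is C# against the Vive SDK; the sketch below restates the same update rule in Python, with `speed`, the vectors, and the function name all illustrative stand-ins rather than real SDK calls.

```python
def fly_step(position, forward, trigger, speed, dt):
    """One frame of fly-around movement: move along the current view
    direction, proportionally to the trigger value (0.0 to 1.0)."""
    return tuple(p + f * trigger * speed * dt for p, f in zip(position, forward))

# Holding the trigger fully (1.0) while looking along +x for one 0.1 s frame:
pos = fly_step((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), trigger=1.0, speed=5.0, dt=0.1)
```

Scaling by the analogue trigger value rather than a fixed speed gives the user fine control, which also helps reduce motion sickness.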

Challenges we ran into

Loading the particle data into Unity was a huge challenge! We first considered reading in JSON files and making a ParticleSystem, but couldn't figure out how to do this. The Meshlab workaround really saved the day.

We seem to be hitting a Unity sound-driver issue that prevents any sounds from playing in Unity. We worked with a mentor to debug this, but unfortunately ran out of time. For this project we developed a simple "fly-around" movement system using the Vive hand-controller triggers. We would like to expand the interactions to support grab-and-move locomotion, more advanced selection of objects (galaxy groups, clusters, etc.) using both hand and eye controllers, and an in-depth HUD.

Accomplishments that we're proud of

We thought a lot about the user experience and the pedagogical content. We are really proud that we could show an accurate dataset without compromising performance.

We are also really proud that our team (all strangers before this weekend) came together and utilised our diverse skillsets to create a unique and engaging experience! We come from a variety of backgrounds - Astronomy, Biology, UI Research, Design, and Software.

What we learned

Beste, Diego and Paul learned about extragalactic astronomy (including the term extragalactic astronomy), the large scale structure of the Universe, and how astronomers simulate and observe the various components of the Universe. Urmila learned to work with HDF5 files instead of relying on other people's highly specialized wrappers. We all got a really deep dive into Unity and the Vive SDK, including how to load in and render large datasets, how to move around the space and various ways to trigger actions. We quickly gelled as team members and worked like a well-oiled machine!

What's next for CosmosVR

We want to add a lot more sound and information to the app to make it a powerful educational tool for consumers at various levels, ranging from curious school kids to scientific researchers. We also want to support more zoom levels and the ability to dynamically change scale without loading new scenes. Since a typical dataset in this space has multiple time slices, we'd also like to visualise how these particles move and change over the course of billions of years. We'd also like to continue implementing our interaction points.

One avenue of future work is porting this experience to a standalone headset to prove out feasibility on reduced-performance hardware (like the Oculus Go or GearVR). We believe even a simplified standalone port could increase adoption in museums and schools that don't want to spend excess resources on an HTC Vive.

Created by

I am an undergraduate student at the University of Michigan who is studying Computer Engineering. I worked on the Unity development and the overall design. While working on the project, I learned a lot more about Unity and developing for VR.

I'm a computational biologist with deep experience processing and visualizing large biological datasets. I helped identify the best way to process and transform the data into the right format to load into Unity and begin manipulating it. I definitely intend to extend and leverage what we have created to understand the data that I manipulate daily.

I'm a graduate student in computational cosmology, so I run and analyse simulations like this one for a living. I love communicating science to a general audience, and write for Yale Scientific, Science in the News and AstroBites.

I'm a software developer in the self-driving automotive space, so I was really happy to work on a completely different type of project than my day-to-day work! In this project I wrote most of the C# scripting for Unity. I also led the User Story development to drive product definition. I learned a ton about rendering in Unity and hope to take some of that knowledge back to my work to render point-cloud data for Radar and Lidar sensors!