We’re going to have a conversation this morning about the whole idea of flows in computing, and who needs them and why. This started with – Larry and I have this no-longer-secret favorite thing that we do, which is the Smarr-Anderson Road Trip. I have this little silver convertible, and we get into it and we go places, every year. And all we do is just share ideas for about 48 hours. It’s the most fun thing I think I do all year long, other than FiRe itself.

And this year … You can picture the little silver car, and there’s Larry, there’s Mark, and we’re driving along, it’s very beautiful, it’s on an island up in the Northwest, and almost from the very beginning – I think we were in the parking lot, still, getting onto this ferry – and we start talking about flows, and it just builds.

Smarr: Mm-hmm.

Anderson: And Larry’s telling stories about the increasing amount of data in the world, and we’re talking about streams of data that are flowing, and what’s going to happen, and who’s going to do what about it, and it immediately becomes clear that we don’t have – today – we don’t have the proper ways of dealing with all that.

So I think what we’re going to try and do this morning is essentially re-create that conversation, and I’m going to ask Larry to give us some examples of the problem and what we have to do to meet it. So, would you like to tell a few stories about flows of data?

Smarr: Well, as you all know, I’m the principal investigator for an NSF grant called the Pacific Research Platform. And what that is, is realizing that with this exponential increase in big data and big data flows, the ordinary commodity internet isn’t gonna cut it. And so we really need something new, just as the interstate highway system was needed alongside Route 66. So we’ve been working on this for about 15 years, using optical fibers, which have the ability to carry – you know, a trillion bits per second, actually, in the glass. And to be able to establish links between scientific instruments, investigators, supercomputers, over the whole West Coast – and we take a broad view of the West Coast, that goes to Australia and to Chicago, and actually to the Netherlands [laughter]. It’s like that New Yorker [cartoon] view of the world.

Essentially, we picked scientific teams – and I’m going to give you an example of two of them – that typify where we’re going over the next few years. And this is hooking up at about 10 to 100 gigabits a second, which is about 1,000 times the speed through the normal internet.

As a person who’s spent 25 years doing relativistic astrophysics and observational astronomy, that’s close to my heart. So, one of them is the Large Synoptic Survey Telescope, the LSST, which the NSF is funding. It’s a giant telescope being built in Chile, up in the mountains, at the very top [of the] mountains, and it is looking at the sky. Every night, it reads out about one-and-a-half billion pixels a second, and it has two 40-gigabit-per-second links – for those of you who have home cable modems at, say, 10 megabits, this is 40,000 megabits a second – to the National Center for Supercomputing Applications in Illinois, that I founded 30 years ago.
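As a back-of-envelope check on those bandwidth figures, here is a short sketch using only the numbers quoted in the conversation (the 10-megabit home modem and the 40-gigabit LSST link); the script is just illustrative arithmetic, not anything from the actual LSST data system:

```python
# Illustrative arithmetic only, using the figures quoted above.
home_modem_mbps = 10              # typical home cable modem, in megabits/s
lsst_link_gbps = 40               # one of the LSST's two links, in gigabits/s

lsst_link_mbps = lsst_link_gbps * 1000     # 40 Gb/s = 40,000 Mb/s
speedup = lsst_link_mbps // home_modem_mbps

print(f"LSST link: {lsst_link_mbps} Mb/s")          # 40000 Mb/s
print(f"vs. home modem: {speedup}x faster")         # 4000x faster
```

So each of the two links alone is roughly 4,000 times the quoted home connection, which is why Smarr compares the effort to building an interstate highway system next to Route 66.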

It follows most of the observable universe. In particular, it follows 30 billion objects in the sky and looks at their changes. So it essentially turns the universe into a video stream. And then it’s computing anything that moves, gets brighter, gets dimmer, changes color … How many of those do you think it’s going to see a night? Because it has to send an alert out to all the telescopes on Earth and in space. One to 10 million. And, by the way, those alerts go out from the telescope, through the fiber optics and the supercomputer, one minute after the observation in Chile.

So that’s a level of flow that is literally turning the whole universe into a movie. That’s never been done before.