Saturday, August 31, 2013

In the last few days, I've built a browser based peer-to-peer message passing system. It was easier than I thought.

I've also been building control panels, now that all the tech is working. I solved the texture curvature problem with a little trig math and storing polar texture coordinates instead of linear spatial. They look nicer.

Incidentally, I tweaked a few things, and the entire interface is now running in-browser (Chrome on desktop) at 60fps, which means that technically I am redrawing the consoles (and entire 3D environment) at a faster framerate than ST:TNG was actually broadcast at. Welcome to the future.

Thanks to PeerJS, I now have a functioning peer-to-peer network between the clients using WebRTC. So far it's just carrying 'chat' messages, but it's a very short step to fully encrypted video streams.

Well, I say 'peer-to-peer network', but currently it's the worst kind - a "complete N-way mesh" where every client connects to every other client and sends everything to everyone. To say this is an area of active research is an understatement, but you have to start somewhere.

Given my fairly low needs, that will probably be good for up to a hundred clients, which technically exceeds the number of concurrent PeerJS connections I can have on their free developer account, but the software is open-source, so I can scale up to my own server when needed.
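The arithmetic behind that "up to a hundred clients" guess is worth making explicit. In a complete mesh every pair of clients holds one connection, so the totals grow quadratically (a quick sketch, nothing project-specific assumed):

```javascript
// Complete N-way mesh: every pair of clients holds one WebRTC
// connection, so the network as a whole needs N*(N-1)/2 connections,
// and each individual client maintains N-1 of them.
function meshConnections(clients) {
  return clients * (clients - 1) / 2;
}

function perClientConnections(clients) {
  return clients - 1;
}

console.log(meshConnections(10));   // 45 connections network-wide
console.log(meshConnections(100));  // 4950 - why complete meshes don't scale
```

At a hundred clients each browser is juggling 99 open connections and the network nearly five thousand, which is why anything bigger needs the smarter topologies mentioned above.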

I'm still using Pusher here and there (just another ODN Conduit type, from inside the code), but it's clear that solution has a different niche. It's more of a 'fallback' for highly constrained mobile devices that can't do all the latest WebRTC networking (or even WebSockets). Pusher sometimes takes a second or two to get messages across the network, but it's reliable.

When you have two systems with different strengths and weaknesses, the right answer is often to choose both. In fact, if more peer-to-peer style IM services become available, I'll do my best to fold them in too. Heterogeneity is good for networks.

Once again, I'm impressed by the quality of work the W3C demonstrated in building the spec, and the talented people at Mozilla and Google who implemented the hard parts while the ink was still barely dry. You might not realize it yet, but your browser has been granted all kinds of new superpowers.

My only gripe is with the quality of the tutorials; too many of the 'official' demos/tutorials are simply incomprehensible. They look more like test cases. Confusing (repetitive and overly long) variable names, poor documentation, no comments, and one-line functions that call other one-line functions instead of inlining for readability.

That's probably why no-one knows about this stuff - the awful tutorials. A similar situation exists for WebGL. I don't blame the W3C or the devs for this: the W3C specs aren't really meant to be normal-human readable. They have to use a very precise language where terms like "must", "should" and "may" determine what your IF statements need to do. (Although I'd like to find the person who thinks "candidate" is a noun, and give them a piece of my vocabulary.) And the devs are too busy implementing the code and asking for clarifications as needed.

The poor tutorials are really our fault. Mine too, for not writing better expositions of the tech. 'Newcomers' have to document their journey, and discover the common pitfalls and misconceptions that the developers are no longer capable of making, or of realizing that we might make. That hasn't happened yet, so being an early adopter means confusion and headaches. WebGL at least has the NeHe tutorials, originally written for pure OpenGL and 'ported' across. The best part was their 'lesson structure', not the code. There's a difference between documenting a standard and teaching a class.

While I don't have the time to write a comprehensive tutorial, I can at least say that WebRTC seems to be worth it. Browsers can't go fully peer-to-peer yet - some 'signalling' by a server is required to set up the connection between consenting browsers. That specifically prevents the general 'try random peers until we get a connection' algorithms, which could potentially turn your computer into a botnet slave if it has a single insecure listener running in a web page. It's a balance, really.
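The shape of that signalling dance is easier to see with the WebRTC machinery stubbed out. This is a toy sketch, not real RTCPeerConnection code: the `signalling` array stands in for whatever server channel relays the messages (WebSocket, Pusher, even copy-and-paste), and the offer/answer objects are simplified placeholders.

```javascript
// Toy model of why a signalling server is required: neither browser can
// dial the other directly, so the connection descriptions must be
// relayed by a mutually reachable third party.
var signalling = [];  // stand-in for the server-relayed mailbox

function makePeer(name) {
  return {
    name: name,
    createOffer: function () { return { type: 'offer', from: this.name }; },
    createAnswer: function (offer) {
      return { type: 'answer', from: this.name, to: offer.from };
    },
    connected: false
  };
}

var alice = makePeer('alice');
var bob = makePeer('bob');

// Alice can't reach Bob directly - she posts an offer to the server...
signalling.push(alice.createOffer());

// ...Bob fetches it and posts an answer back through the same server...
var offer = signalling.shift();
signalling.push(bob.createAnswer(offer));

// ...and only once both sides hold each other's description does the
// direct peer-to-peer link come up.
var answer = signalling.shift();
if (answer.to === alice.name) { alice.connected = true; bob.connected = true; }

console.log(alice.connected, bob.connected);  // true true
```

In the real API the offers and answers are session descriptions full of codec and ICE candidate details, but the choreography is exactly this.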

But even this mediated peer-to-peer ability is going to change the web in ways I can't entirely contemplate. The platform is now ready. Let's see what we build upon it.

Sunday, August 25, 2013

It's been going so well I've barely had time to update. Here's a more recent screenshot:

So, there's a lot to explain in that image, and it's only the second floor. I've moved the old signal processing consoles down into the 'basement', and the upper floor is left for the planetarium/system map.

The earth and moon should be obvious. Both datasets are from NASA. Earth is the 2013 bathymetric "blue marble" image, and the moon is albedo corrected. The Sun is textured with the 2012 STEREO image (the first simultaneous full-sun image map). And far off to the left is a pair of Planck Microwave Anisotropy datasets: background radiation in red, and matter density in blue.

Oh yeah, and there's a skybox too, but it's fake. Looks nice though. There'll be an option to turn it off.

The spheres are not actually spheres - in model space they're cubes. I suppose technically they are 'voxels'. Each sphere is raytraced within the cube with correct perspective, but without creating z-buffer geometry, so "hologram" might also be a good word. The spheres look geometrically perfect, but won't intersect correctly.

Why go to the trouble, compared with simply creating a triangular tessellation mesh for the sphere? Well, each "sphere" voxel is eight vertices, with no normal or surface buffers. I can have thousands of spheres, millions, so long as they are stacked in non-overlapping minecraft rows.
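The per-pixel test the shader performs can be sketched in plain JS (the actual fragment shader is GLSL, of course, and the names here are illustrative). Each fragment fires a ray from the eye through the cube and solves a quadratic against a sphere centred in it:

```javascript
// Ray-sphere intersection, the heart of the "hologram" trick. Ray
// origin (ox,oy,oz), normalized direction (dx,dy,dz), sphere of radius
// r at the origin. Solve |o + t*d|^2 = r^2, a quadratic in t; a
// negative discriminant means the ray misses (the fragment is
// discarded), otherwise the nearest root gives the visible hit point.
function raySphere(ox, oy, oz, dx, dy, dz, r) {
  var b = 2 * (ox * dx + oy * dy + oz * dz);
  var c = ox * ox + oy * oy + oz * oz - r * r;
  var disc = b * b - 4 * c;            // a == 1 for a normalized direction
  if (disc < 0) return null;           // miss: discard the fragment
  var t = (-b - Math.sqrt(disc)) / 2;  // nearest of the two roots
  return [ox + t * dx, oy + t * dy, oz + t * dz];
}

// A ray fired straight down the z-axis hits the unit sphere at z = 1:
console.log(raySphere(0, 0, 5, 0, 0, -1, 1));  // [0, 0, 1]
```

The hit point doubles as the surface normal (scaled by 1/r), which is why no normal buffer is needed - lighting falls out of the intersection for free.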

I'm essentially ready to tackle the next major algorithm, the one which should elevate Astromech to a whole new level: Ray Bundle Adjustment.

What's the application? Given data coming from multiple viewpoints (such as a ring of solar observatory satellites like SOHO, SDO, or STEREO-A and B), one should be able to reconstruct the 3D structure of the volume.

Near-real-time 3D maps of the Sun and its near volume (flares, mass ejections) sounds like a neat thing. I expect to have something working within the month.
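The core geometric step, in miniature: two observers each see the same feature along a known direction, and the feature sits where the sight lines cross. Real bundle adjustment does this in 3D, for many rays at once, as a least-squares fit over noisy data - the sketch below is only the two-ray, 2D, noise-free case, with made-up example positions.

```javascript
// Triangulate a point from two sight lines: solve p1 + t*d1 = p2 + s*d2
// for t via Cramer's rule, then walk t along the first ray.
function triangulate2D(p1, d1, p2, d2) {
  var det = d1[0] * -d2[1] - d1[1] * -d2[0];
  if (Math.abs(det) < 1e-12) return null;  // parallel sight lines: no fix
  var rx = p2[0] - p1[0], ry = p2[1] - p1[1];
  var t = (rx * -d2[1] - ry * -d2[0]) / det;
  return [p1[0] + t * d1[0], p1[1] + t * d1[1]];
}

// Observer A at the origin looks along (+1,+1); observer B at (2,0)
// looks along (-1,+1); both must be seeing a feature at (1,1):
console.log(triangulate2D([0, 0], [1, 1], [2, 0], [-1, 1]));  // [1, 1]
```

With noisy real-world rays the lines never quite intersect, which is where the "adjustment" part - minimising the miss distance across the whole bundle - comes in.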

Another load off my mind is that I found an excellent solution for one of my other problems: sensible rendering of cluster data containing thousands or millions of points. This is some beautiful work, based on the brilliant idea of leaving a step out. Expect to see something very similar in Astromech, soon as I get to it.

Lastly, I'm investigating going nearly fully peer-to-peer for the networking layer using WebRTC. In terms of spec and capabilities for moving video and data streams around, I would be an idiot not to. I'm putting together a small signalling server, and thinking about adaptive mesh "gossip" networks.

Google gave an excellent talk on what WebRTC is all about which I recommend you see, given how much impact that particular spec is going to have on the peer-to-peer internet. Without this technology, I would need some heavy central servers to videoconference a dozen telescopes. Now it can go point-to-point.

That changes many things. Assuming it all works. And it's already in your browser, most likely. There are still Chrome/Moz compatibility issues, but only because the spec is still moving.

So, progress is good. Problems are falling one after another, and the roadmap to release stretches ahead.

Saturday, August 10, 2013

So I thought, "keplerian elements are pretty straightforward once you get past the jargon", but there's different jargon for comets than for asteroids, so that's still a work in progress. And I assumed that plotting the position of the moon would be a similar exercise.

Oh boy. Utterly not.

Let's just say that the best and easiest way is probably to ping NASA and just ask their big ephemeris computer, called HORIZONS.

I did it the masochistic way, which is to port the ELP/MPP02 code out of GAL (the 'General Astrodynamics Library') and into Javascript. The amazing thing is that it seems to have worked.

See how all the lines are off from each other? Like it's a couple of rounds of a strange attractor? Well, that's actually correct. The "orbit of the moon" is chaotic. Not a neat hoop. Practically everything in the solar system perturbs it. The size of the ocean (and hence tide) it's currently over affects it. Its orbit shrinks when the Earth is farthest from the sun, and expands again when we're closest.

And since Earth is the Moon's dancing partner, anything it does, we do too, but considerably less so. Plotting Earth's true position (compared to the stable "earth-moon barycenter", which is what most programs approximate down to) means adding a suitably scaled opposing vector.

The equation which computes this orbit is literally the largest I have ever dealt with. It is a polynomial with upwards of 20,000 terms. The code which runs is barely 50 lines long, but the 'coefficients' file it accesses is 10Mb of data.

This 'Lunar Equation' is not a physical computation of any kind. The coefficients represent a 'best fit' curve to the JPL ephemeris computations. (which are physically based) The JPL supercomputer crunched the numbers to find the position over time (ignore that part) and then the ELP computer "compressed" that path down to a best-fit curve. The code then "uncompresses" the coordinates for a given point in time.
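The "uncompression" step has a simple shape once you strip the scale away: each of the ~20,000 terms is an amplitude times the sine of a linear combination of time-varying angles, and the answer is just the sum. The coefficients below are invented for illustration - the real ones are the 10Mb file.

```javascript
// Evaluate a truncated trigonometric series of the ELP kind.
// Each term is [amplitude, phase, frequency]: A * sin(phase + freq * t).
// The real series uses combinations of several fundamental lunar/solar
// angles per term; one angle is enough to show the structure.
function evaluateSeries(terms, t) {
  var sum = 0;
  for (var i = 0; i < terms.length; i++) {
    sum += terms[i][0] * Math.sin(terms[i][1] + terms[i][2] * t);
  }
  return sum;
}

var fakeLongitudeTerms = [   // hypothetical values, NOT real ELP coefficients
  [1.27, 0.00, 83.2],
  [0.43, 1.10, 72.1],
  [0.09, 2.73, 155.4]
];

console.log(evaluateSeries(fakeLongitudeTerms, 0.5));
```

Run that loop three times per epoch (longitude, latitude, distance) with ~20,000 real terms each and you have the 250 points-per-second figure above: the cost is almost entirely those trig calls.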

Even that 10Mb of data is not the true 'best fit' curve: they discovered that adding any more terms actually degrades the computation through rounding errors more than the accuracy gained. That's how big it is.

Once again, I'm surprised at the numerical stability I'm getting out of Javascript's native floats. It's agreeing with the test cases down to the meter, over hundreds of years. Again, it would be good to re-implement the native math with a BigDecimal class, but the performance hit will be intense. Applying those 20,000 trig coefficients means I can only compute about 250 points-per-second on a beefy modern machine. That year's worth of orbit took about four seconds to calculate.

So, I can compute moon positions client-side in the browser now. At the cost of downloading 10Mb of javascript code and intense processing. That's a tradeoff that only makes sense in some circumstances. A pre-computed table of a year's moon positions, three times a day, would only take 3,000 numbers, and would be instant-lookup.

Then again, the GAL table contains _two_ lunar models. It might be possible to split those out. 5Mb (on demand) starts being a very manageable quantity. And the code can surely be optimized.

Friday, August 9, 2013

I spent yesterday implementing a digital orrery in my WebGL environment. (New furniture is nice.) Just the major planetary barycenters, so far. All the numbers and equations come from this JPL document, and frankly I think I spent more time typing and rechecking all the numbers than actually coding the algorithm.

(Yes, I tried copying and pasting the table at the end, in various PDF readers, and invariably got a screen full of Zapf Dingbats.)

Then there was a good half-hour of staring at random messy lines, while I figured out that all the equations assume angles in degrees rather than radians. Then I noticed that I had the whole solar system upside-down.
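The central step of the JPL procedure is a good place to see both gotchas at once. The element tables hand you a mean anomaly in degrees; getting to x,y coordinates means solving Kepler's equation M = E - e·sin(E), which has no closed form, so you iterate - and `Math.sin` wants radians. A sketch (the function name and tolerance are my own choices):

```javascript
// Solve Kepler's equation M = E - e*sin(E) for the eccentric anomaly E,
// given a mean anomaly in degrees (as the element tables supply it)
// and an eccentricity e. Newton's method converges in a few steps for
// planetary eccentricities.
var DEG = Math.PI / 180;

function solveKepler(Mdeg, e) {
  var M = Mdeg * DEG;               // convert ONCE, up front
  var E = M + e * Math.sin(M);      // decent starting guess
  for (var i = 0; i < 20; i++) {
    var dE = (E - e * Math.sin(E) - M) / (1 - e * Math.cos(E));  // Newton step
    E -= dE;
    if (Math.abs(dE) < 1e-12) break;
  }
  return E;  // radians
}

// Sanity check: with zero eccentricity the orbit is a circle and E == M.
console.log(solveKepler(90, 0) / DEG);  // 90
```

Forget the `DEG` conversion and you get exactly the "random messy lines" described above - plausible-looking curves in entirely the wrong places.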

You know why Orbital Mechanics has a reputation for being hard? Because it's deliberately made so. Even professional astronomers can't remember the terms or actually do the math. They just use the on-line calculators like HORIZONS, and have to look up what "argument of perihelion" means, just like everyone else.

There's a philosophical question hidden in there: when a jargon becomes so dense and archaic that modern experts in the field don't understand it, what use does it have?

The other reason is the complete lack of a sensible reference frame. Orbital mechanics specified in "Heliocentric" co-ordinates don't actually have the Sun at the origin. (There are side-tables of where the sun is actually located) "Equatorial" co-ordinates are based on the Earth's position, (literally extending the plane of the equator out into space) which sounds easy, until you hear the word "precession", let alone "nutation", or "three body problem".

Earth is probably the worst place to define as a reference point, because it's locked in a three-way chaotic dance with the moon and sun. Yet pretty much all intra-solar-system co-ordinates are based on the "longitude of vernal equinox" and the "ecliptic plane" from Earth. Both of which are constantly shifting.

OK, it makes the math simple and convenient for Earth-based astronomy. If you're doing it with pencil and paper. Which no-one does anymore. You can tell because all the math is presented as trigonometry, not vector/matrix forms. What's the difference? Quadrants. Inverse trig functions lead to quadrant ambiguity. Which a human must sanity-check along the way with their pencil.
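The quadrant problem fits in three lines. A point in the third quadrant and a point in the first quadrant have the same y/x ratio, so a plain inverse tangent can't tell them apart; `atan2` keeps the signs separate, which is exactly why the vector formulations don't need a human sanity check:

```javascript
// Quadrant ambiguity: atan() collapses (-1,-1) and (1,1) onto the same
// angle because -1/-1 == 1/1. atan2() takes y and x separately and
// recovers the true quadrant.
console.log(Math.atan(-1 / -1));   // 0.785... - wrong quadrant for (-1,-1)
console.log(Math.atan2(-1, -1));   // -2.356... - correct: third quadrant
console.log(Math.atan2(1, 1));     // 0.785... - correct: first quadrant
```

Every time a classical formula says "take the inverse tangent, then adjust the quadrant by inspection", a vector-form rewrite replaces it with one `atan2` call.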

Now I'm on to the task of plotting comets and asteroids, which is fun because their orbits are specified in a different format. Yup, different classes of objects get different co-ordinate formats. Why? No good reason.

Perhaps my biggest surprise in all of this is that Javascript seems quite capable of handling the precision required to crunch the numbers. I was all set to use the 'big.js' arbitrary precision library (and still should implement that in parallel as a numerical check), but I'm not seeing any numeric instability yet. Granted, if the precision of the keplerian elements goes up another few decimal places then Javascript clearly won't have the mantissa to encode it, but for the moment it matches or exceeds the precision of the data.

Since I like to post code, here's my table of keplerian orbital elements. I've double-checked it, but a triple-check probably wouldn't go astray before you plan a space mission. And they'll slightly improve the numbers in a year or so, so they're always provisional.