Wearing my hat as a NASA/JPL Solar System Ambassador, I gave a presentation on space exploration and robots at Jefferson Elementary School in Carlsbad, CA. I also visited and listened to a presentation from their “Space Junkies” FIRST Robotics team.

I came early to do an aerial mapping of the school with an enthusiastic group of kids and parents, including their FIRST Robotics team. After clearing the area so everyone kept a safe distance from my DJI Phantom 3 Professional, I brought up Map Pilot on an iPad. I simply touched the 4 corners of the area I wanted to map, and the program produced a flight plan, calculating the time, number of images, and battery requirements for the run. Here is the flight plan it generated:

Map Pilot flight plan to map Jefferson Elementary School in Carlsbad, CA.

Since the school was 4 miles from the Carlsbad Palomar Airport, I followed the FAA rules for Small Unmanned Aircraft Systems and called the airport’s control tower, reassuring them that I would be flying less than 200 feet above the ground. I explained the many safety and privacy considerations about drones, then lifted off to about 20 feet. I pressed Map Pilot’s Upload button, then Start, and my Phantom climbed to altitude and zipped off to follow the flight plan exactly. It took 79 photos, then returned to land automatically within a foot of its takeoff spot. Quite an amazing sight.

Then I uploaded the images to Maps Made Easy, which stitched the 79 images together to make a single view:

This is a composite of 79 images captured by the Phantom 3 flying at 180 feet above the school.

Maps Made Easy also produced a Digital Elevation Model (DEM) of the school. Because Map Pilot programmed the appropriate overlap of images, the software was able to see each point of the school from multiple angles. It then calculated the elevation of each point in the image, and added colors to make them visible at about 4″ resolution.
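The resolution can be sanity-checked with a back-of-the-envelope ground sample distance (GSD) estimate. The camera numbers below are my assumptions for a Phantom 3 Professional (1/2.3″ sensor, 4000-pixel-wide images, 3.61 mm focal length), not figures from Map Pilot or Maps Made Easy:

```python
# GSD: the ground width covered by one camera pixel.
SENSOR_WIDTH_MM = 6.17    # 1/2.3" sensor width (assumed)
IMAGE_WIDTH_PX = 4000     # image width in pixels (assumed)
FOCAL_LENGTH_MM = 3.61    # lens focal length (assumed)

def gsd_inches(altitude_ft):
    """Ground distance covered by one pixel, in inches."""
    altitude_mm = altitude_ft * 12 * 25.4
    pixel_mm = SENSOR_WIDTH_MM / IMAGE_WIDTH_PX
    return pixel_mm * altitude_mm / FOCAL_LENGTH_MM / 25.4

print(f"GSD at 180 ft: {gsd_inches(180):.2f} in/pixel")
```

Under those assumptions the raw imagery comes out near an inch per pixel; an elevation model derived from matching overlapping views is naturally coarser than the raw pixels, consistent with a ~4″ DEM resolution.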

This is a Normalized Difference Vegetation Index view of the school, which can be used to study the growth of plants at the school. It was made with only the visual spectrum (instead of near-infrared), but it shows the basic principle.

When I host star parties, one of the most enjoyable experiences is to introduce people to Saturn for the first time. “Saturn Virgins” they are called. People oooh! and ahhh! and walk around to the front of the telescope to see if I’m fooling them. They can’t believe that the solar system appears with such three-dimensional depth and reality.

This is inevitably followed up with “let’s look at the other planets.” Jupiter is pretty cool, and occasionally shows moon shadows moving across it. Mars and Venus can be very bright, but Neptune and Uranus are just small dots, barely discernible as disks instead of single points (as stars look).

But Pluto is a different story. Besides its demotion from planet to dwarf planet (a topic which generates immense debate, but on which I’m firmly agnostic), it is really far away. It is only visible by light reflected from the sun. The light from the sun diminishes according to the inverse square law: if a planet is 10 times as far from the sun as another, it gets 1/100th the light. But that is just the light falling on Pluto. That light has to reflect and come back to Earth, which is another inverse square relationship, making the overall falloff an inverse fourth-power law. Moving a planet twice as far away makes it 1/16th as bright. Pluto is very far away, as far as solar system metrics go. It takes light about 4.5 hours to travel from Pluto to Earth.
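The inverse fourth-power falloff is easy to check numerically. The 9.5 AU and 39.5 AU figures below are rough distances I'm assuming for Saturn and Pluto; they are not from the post:

```python
def reflected_brightness(distance_au):
    """Relative brightness of a sunlit body seen from (roughly) the
    sun's distance: 1/d^2 for sunlight going out, times another
    1/d^2 for the reflected light coming back to Earth."""
    return 1.0 / distance_au ** 4

# Doubling the distance cuts reflected brightness to 1/16th:
ratio = reflected_brightness(2.0) / reflected_brightness(1.0)
print(ratio)  # 0.0625

# Assumed distances: Saturn ~9.5 AU, Pluto ~39.5 AU.
factor = reflected_brightness(9.5) / reflected_brightness(39.5)
print(f"Pluto is ~{factor:.0f}x dimmer than Saturn from distance alone")
```

Note that distance geometry alone only makes Pluto a few hundred times dimmer than Saturn; Saturn's far larger, ring-bearing disk accounts for the rest of the roughly million-fold difference.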

I have a scale model of the solar system in my back yard. I shrunk the sun to the size of a golf ball. To scale, Earth is then 12 feet away. Pluto is 330 feet away. This is seriously Far Away with Not Much In Between. And the light we see from Pluto is magnitude +14 – requiring a serious telescope to see. Pluto is about 1 million times dimmer than Saturn.

Just before the New Horizons encounter with Pluto, I took some time lapse images of Pluto moving across the sky. It was impossible for me to see the spacecraft, and even detecting Pluto was a challenge. I used my backyard observatory, the Cosmos Research Center, to photograph the sky around Pluto. This is what I saw:

This image is about 1 degree wide, about as wide as your index finger held at arm’s length. For those of you who can’t see Pluto yet, here is a close up, showing a zoom area around Pluto:

And for those of you who are still missing Pluto, here is a closeup showing the motion of Pluto over 4.5 hours – the same time that it takes for New Horizons to send information back to Earth. Pluto’s motion is shown as a sequence of dots, making a thin line across the middle of the frame. This shows where Pluto was when New Horizons sent a message (on the left), and where it was when we received it (on the right).

And here is an animated image, showing the motion of Pluto over 4.5 hours. Look in the center for the dot moving across the image.
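As a sanity check on how long that trail should be, here is a rough parallax estimate. The orbital speeds and the Earth–Pluto distance are my assumed round numbers for mid-2015, not figures from the post; near opposition, most of Pluto's apparent motion comes from Earth overtaking it:

```python
import math

EARTH_V_KMS = 29.8    # Earth's orbital speed (assumed)
PLUTO_V_KMS = 4.7     # Pluto's orbital speed (assumed)
DIST_AU = 31.9        # Earth-Pluto distance (assumed)
AU_KM = 1.496e8

# Angular rate = relative transverse speed / distance
omega_rad_s = (EARTH_V_KMS - PLUTO_V_KMS) / (DIST_AU * AU_KM)
motion_arcsec = math.degrees(omega_rad_s * 4.5 * 3600) * 3600
print(f"~{motion_arcsec:.0f} arcseconds in 4.5 hours")
```

That works out to something on the order of 20 arcseconds – about half a percent of the 1-degree-wide field above, which is why the trail is such a short line.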

I have a modest telescope set up on a homemade concrete pier in my back yard, using an Orion Atlas EQ-G mount carrying a William Optics 110mm f/7 APO refractor. I use an SBIG ST-8300C camera with a Starlight Xpress Lodestar guider. The mount is controlled by EQMOD and Maxim DL, and I do my astrometric analysis using PinPoint.

With all the attention given to VA scheduling nowadays, I thought I’d relate some of my experiences with a commercial, off-the-shelf (COTS) scheduling program that I use as a Kaiser Permanente member. Kaiser has a multi-billion dollar agreement with Epic Systems, which provides their MyChart patient portal system.

Bottom Line: Even COTS software does not necessarily lead to happy patients. The Kaiser scheduling system is seriously messed up. Figuring out when to call the central 800 number, the department, or submit an online request is quite a frustrating challenge. And trying to book an online appointment for optometry, for example, is nearly impossible.

Here is what it looks like to do an online optometry appointment request.

First you have to choose a facility:

So far, so good. I chose the Carlsbad offices. The fun begins on the next screen, where I have to choose specific providers. I can’t ask for “any” provider, nor can I just ask for the next available appointment. I have to choose 1-3 providers specifically, and I have to choose a specific date range:

After about a dozen attempts to figure out providers and dates, I keep getting the same message:

No other feedback, just try again. When I go back, I don’t see the providers or dates I tried, just blank fields to try again. There are no hints, no “next available appointment” slots, no Help tab, no customer support access, and no ability to browse. Just a blank screen to try again.

So, I had to call their 800 number to wade through the key presses to select my language, listen to a “this call may be recorded” message (multiple times), then select my way through the appointment menu. Depending on the particular clinic, I might get routed to another 800 number to call. Some appointments I can make directly with my provider, others through the department, others through the central service.

Of course, the problems with the VA scheduling system are far more serious than what I describe here. But this also shows that, even with billions of dollars of software expenses and Commercial Off The Shelf products, things can be far from perfect (or even usable).

I got these photos of some of the original membership cards issued to Clayton Curtis, MD, who is currently with the VA’s Health Informatics Knowledge Based Systems group, as well as the VHA–Indian Health Service Interagency Liaison for Health IT Sharing.

Long story to be told here, but the bottom line is that the VA and the Indian Health Service have been collaborating for 30 years now, while DoD worked really hard to make its systems incompatible. I was an informal consultant to the IHS while I was at the VA, and found them to be a very dedicated, but underfunded, agency. So, the cooperation made a lot of sense.

When I went to work on the DoD version of the software (called Composite Health Care System), things were completely different. They stripped out the communication capabilities I wanted to use to coordinate systems, and made other changes that would make the DoD version incompatible with the VA.

I wish this could be written off as ancient history, but I don’t think so. DoD is continuing to do its thing with an $11B “rip and replace” waterfall effort, while VA seeks to take an evolutionary approach.

There has to be a better way. For starters, the folks in Washington should recognize the power of the informal organizations at work within their formal organization charts.

The most remarkable aspect of the 2007 wildfire evacuations of San Diego was how orderly everything was. Drivers seemed to be more polite with each other than normal. Talk to someone who has been through a wildfire evacuation and they will likely have some stories to tell about community cooperation, a sense of pulling together. For example, in the May 2014 fires, the Helen Woodward Animal shelter put out a request for horse trailers to move their horses to safety, and was immediately met with a flood of volunteers with horse trailers. The San Diego Union Tribune (June 7: Cocos Fire Jam to be evaluated) quoted a San Elijo resident about his experiences trying to evacuate during the recent Cocos fire.

Longtime resident Dustin Smith said he packed up his pets and headed off about 4:15 p.m., but couldn’t leave his gated Promontory Ridge community. In front of him was a line of vehicles backed up even before the gate…. He said he gave up, tried again an hour later but found the same situation. Tried again shortly after 6 p.m. and finally found roads clear enough to leave.

Being blocked from leaving your home for 2 hours under any circumstances is a really bad thing, but it is particularly terrifying when there is a wildfire raging nearby and you don’t know where it will go. My daughter was evacuated earlier in the day from her office at the corner of El Camino Real and Palomar Airport Road. Traffic on the road was so bad that it took her 30 minutes just to get out of her parking lot. She called 911 to see if they could get some traffic control police to help, but the dispatcher just said, “sorry, all of our officers are busy with other aspects of the fire.”

Google Maps traffic reporting was very helpful, and gave citizens a great way to see what was happening and adapt to the traffic flow dynamically. For example, my wife and I were babysitting our grandchildren on May 14th, and I was driving to our normal rendezvous with my son-in-law for a 4:00 handoff in San Marcos. I was planning on driving over San Elijo Road to Twin Oaks when I got a call from him, saying he saw a fire starting near Twin Oaks (that would become the Cocos fire). We both knew that this could block the road and cause havoc with the traffic flow between us. So, we both turned around and went home, watching the fire expand, but also noticing on Google Maps that the traffic on Del Dios Highway was clear. My daughter came by around 8 that night, and we had an uneventful handoff.

Unfortunately, the past few decades have seen an upsurge in NIMBY activists who fight roads in their area, creating a patchwork of unconnected roads with long cul-de-sacs. Perhaps the risk of traffic gridlock might reverse some of these attitudes, or at least give fire safety folks a stronger position from which to demand better ingress and egress for fire safety.

I live in an area where wildfires are part of nature. I am also an “early evacuator” – happy to get out of the way of any potential fire hazards if one is coming my way. I know that there are others who want to stay back and defend their homes, typically with a garden hose.

Here’s a photo sequence of the recent fires that proves my point. It shows an ember gaining a foothold on a hillside, which engulfs the whole hillside and creates a 100′ tall fire tornado within 15 minutes.

Jeff Anderson, Elfin Forest Recreational Reserve park ranger, took these remarkable images during the recent “Cocos Fire” in North San Diego County. The fire started quickly in the late afternoon of May 14. The next morning (May 15), it seemed to be fairly tame until about noon. Then it flared up with a vengeance. My home was about 1000 feet downwind of the evacuation zone, and we could smell the smoke passing over us, so we were intensely focused on what was happening.

Jeff was on the ridge of the Elfin Forest reserve looking north towards Harmony Grove when he snapped this photo of an ember burning at 12:25:24pm on May 15:

Just two minutes later, at 12:27:30pm, the fire from the original ember had spread considerably, and another ember jumped up the hill:

Four and a half minutes later, at 12:32:07, the fire had engulfed the whole side of the hill. At this point, the flames were probably burning at 1200 – 1600 degrees F.

Eight minutes later, at 12:40:05, the fire had generated a “fire tornado” about 100 feet high, with winds of 50-80 mph. The temperature at the base of the tornado was probably about 2000 degrees F, about one fifth the temperature of the surface of the sun, and the fire was generating its own wind:

I would ask those who would try to defend their house by playing Rambo with a garden hose, how long they think they would last in the midst of that inferno. It’s not a matter of your skill or machismo, it’s simply a recognition of the overwhelming power of nature.

We also need to recognize that wildfires are a natural part of the ecosystem. We even have a flower, the Fire Poppy, that germinates after wildfires. Here is a picture I took of a fire poppy at Lake Poway, six months after the area had been burned in the 2007 Witch Creek Fire:

Note that, even in 1986, the Committee on Veterans’ Affairs was savvy to, and advocating, the use of metadata (then called the “data dictionary” – a roadmap to the database). It understood its use in VistA (then called DHCP), its role in portability (then with the Indian Health Service), and hoped to use it for the Department of Defense’s Composite Health Care System.

Today, metadata is a household word, given the NSA’s use of it. But metadata also reflects an entirely different perspective on how we view complex systems.

Imagine a complex system, represented by millions of dots, with even more connectors between the dots. We can think of the dots as representing the “data” in the system, and the connectors (links) representing the “metadata” in the system.

This perspective generates an overwhelming number of dots and links, well beyond any human capacity to understand.

One way to approach this complexity I’ll call the “Dots-first” approach. This approach tries to categorize the dots, pigeonholing them into a predefined hierarchy of terms: “A place for every dot, and every dot in its place.” This goes back to Aristotle, and the law of the excluded middle. Something is either A or Not A, but not both. We just keep applying this “law” progressively until we get a tidy Aristotelian hierarchy of categories. Libraries filed their books this way, according to the Dewey Decimal system. If you wanted to find a book, you could look in a card catalog for title, author, and subject, then just go to the shelves to find the book. The links between the dots are largely ignored. For example, it would be impossible to maintain the card catalog by all the subjects referenced in all the books, or all of the references to other books and papers. Order is maintained by ignoring links that don’t fit the cataloging/indexing system.

An alternative approach I’ll call the “Links-first” approach. This approach focuses on the links, not the dots. It revels in lots of links, and manages them at a meta-data level, maintaining the context of the information. It can work with the Dots-first categorization schemes, but it doesn’t need them. This is the approach taken by Google. It scans the web, indexing information, growing the context of the dot with every new link established.
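The contrast can be sketched in a few lines of Python. The book title and tags here are invented for illustration; the point is the shape of the two data structures, not the content:

```python
from collections import defaultdict

# Dots-first: each book is forced into exactly one pigeonhole.
catalog = {"Moby-Dick": "813 Fiction"}

# Links-first: every association is just one more link into the book,
# so one book can be reached from any of its contexts.
index = defaultdict(set)
for tag in ("whales", "seafaring", "Melville", "813 Fiction"):
    index[tag].add("Moby-Dick")

# The pigeonhole still works as one link among many, but it is no
# longer the only way in:
print(catalog["Moby-Dick"])
print(sorted(tag for tag, books in index.items() if "Moby-Dick" in books))
```

The dots-first structure stays tidy by discarding every association that doesn't fit its single hierarchy; the links-first structure simply accumulates them, and searching it gets richer with every link added.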

If a book had a Dewey Decimal System number assigned to it, Google would pick it up as just another piece of metadata. Users could search for the book using it, but why would they? Why revert to the “every dot in its place and a place for every dot” scheme when you can use the much richer contextual search that Google provides?

Sonny Montgomery – in 1986 – was advocating the “Links-first” approach that we pioneered in VistA. This approach came up again in the metadata discussions of the PCAST report.

Bureaucracies typically prefer to focus on the dots. If a Dewey Decimal System isn’t working well enough, the solution is to add more digits of precision to it, more librarians to catalog the books, and larger staffs, standards committees, and regulations to ensure that the dots all stay in their assigned pigeonholes.

This is what is happening with ICD-10 today. After the October 2014 rollout, we will have the ability to differentiate “W59.21 Bitten by turtle” and “W59.22 Struck by turtle” as two distinct dots in the medical information universe. Unfortunately, we lack dots to name tortoises, armadillos, or possums. Struck By Orca (both the name of a book and an ICD-10 code) provides some artistic insight into the new coding system.

The continued expectation that we can understand medicine from a “Dots-first” approach is a travesty in today’s world of interconnection, rapidly growing knowledge and life-science discoveries, and the world of personalization. People use Google, not card-catalogs, to find their information, and do so in a much richer, quicker, and informative way than anything before in human history.

The “Dots-first” thinkers will panic at the emergence of a “Links-first” metadata approach. How can we establish order if we don’t have experts reviewing the books, applying international standards, and librarians carefully typing and filing the catalogs?

One of the criticisms in the early days of VistA was that its metadata-driven model would lead to “Helter Skelter” development, and that only centralization could make things orderly. (Helter Skelter was the name of the Charles Manson murder movie at the time, so the term carried a lot of linguistic baggage.) The critics could see only the Dots-first framework, and the ensuing failures of the centralized, waterfall development of $100M+ megaprojects have continually proven that their approach doesn’t work. Yet they continue to blame their failures on the decentralized, metadata-driven core of the system.

There are technologies that address this, such as the Semantic Web or Linked Data initiatives. But I’m afraid that there is so much money to be made “improving” the medical Dewey Decimal Systems and patching up all the holes in the Dots-first kludges that it seems to be a tremendous uphill battle.