Quantum computing is still in its infancy, even though the idea of a quantum computer was developed some thirty years ago. But a whole load of pioneering organizations (like Google) are exploring how this potentially revolutionary technology could help them solve complex problems that modern-day computers just can’t handle at any useful speed.

One such organization is NASA, whose use of D-Wave Systems’ quantum computing machines is helping it research better and safer methods of space travel, air traffic control and missions that involve sending robots to far-off places, explained Davide Venturelli, a science operations manager at the Universities Space Research Association who works at NASA Ames Research Center. I’ll be speaking with Venturelli on stage at Structure Data 2015, March 18-19 in New York City, and we’ll be sure to cover how NASA envisions the future of quantum computing.

The basic idea of quantum computing is that quantum bits, or qubits — which can exist in a superposition of states, representing both a 0 and a 1 simultaneously — can be used to greatly boost computing power compared to even today’s most powerful supercomputers. This contrasts with the modern-day binary computing model, in which the many transistors contained in silicon chips are either switched on or off and can thus only exist in two states, expressed as a 0 or 1.
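To make that contrast concrete, here’s a minimal, illustrative sketch (mine, not NASA’s or D-Wave’s) of how a single qubit is usually represented in a software simulation: a classical bit is exactly one of two values, while a qubit is a pair of complex amplitudes whose squared magnitudes give the odds of reading out a 0 or a 1. It’s a toy only; D-Wave’s annealing hardware doesn’t run code like this.

```python
import numpy as np

# A classical bit: exactly one of two states.
classical_bit = 0  # or 1

# A qubit: two complex amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1,
# representing the superposition alpha*|0> + beta*|1>.
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)  # equal superposition

# Measurement probabilities come from the squared magnitudes of the amplitudes.
p0, p1 = np.abs(qubit) ** 2
print(f"P(read 0) = {p0:.2f}, P(read 1) = {p1:.2f}")

# Simulating a measurement: the superposition collapses to a single outcome.
print("Measured:", np.random.choice([0, 1], p=[p0, p1]))
```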

With the development of D-Wave Systems machines that have quantum computing capabilities (although researchers argue they are not true quantum computers along the lines of the ones dreamed up with pen and paper in the early 1980s), scientists and engineers can now attempt much more complex tasks without having to perform the type of experiments used to generate quantum phenomena, explained Venturelli. However, these machines are just the tip of the quantum iceberg, and Venturelli still pays attention to ground-breaking research that may lead to better quantum devices.

NASA hopes to use the machines to solve optimization problems, which in the most basic terms means finding the best solution out of many possible solutions. One optimization problem NASA has focused on deals with air-traffic management, in which scientists try to “optimize the routes” of planes in order to “make sure the landing and taking off of airplanes in terminals are as efficient as possible,” said Venturelli. If the scientists are able to route air traffic in the best possible way, there’s a good chance they can reduce the dangers of congested skies.
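The D-Wave machines NASA works with are built for problems expressed as minimizing a cost over many binary yes/no variables. As a rough, hypothetical illustration of what “finding the best solution out of many” means (my toy example, not NASA’s formulation), the sketch below brute-forces runway assignments for a handful of flights while penalizing scheduling conflicts. A real air-traffic problem would involve vastly more variables and would be sampled by the annealer rather than enumerated.

```python
from itertools import product

# Toy optimization problem (hypothetical): assign four flights to one of two
# runways (0 or 1) so that flights with overlapping arrival windows don't
# share a runway. The "cost" counts how many conflicts an assignment creates.
flights = ["A", "B", "C", "D"]
conflicts = {("A", "B"), ("B", "C"), ("C", "D")}  # pairs that overlap in time

def cost(assignment):
    return sum(1 for f1, f2 in conflicts if assignment[f1] == assignment[f2])

# Brute force: score every possible assignment and keep the cheapest one.
best = min(
    (dict(zip(flights, bits)) for bits in product([0, 1], repeat=len(flights))),
    key=cost,
)
print("Best runway assignment:", best, "with", cost(best), "conflicts")
```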

Davide Venturelli

NASA also wants to use quantum computing to help with automated planning and scheduling, a subset of artificial intelligence that NASA uses to plan out robotic missions to other planets. NASA typically plans out these types of endeavors ten years in advance, said Venturelli.

The goal is to plan out the robots’ missions far in advance because real-time communication with the robots just isn’t feasible given how far other planets are from Earth. Using quantum optimization, NASA scientists will have new tools to forecast what may occur during a mission and determine the best possible plan of attack for the robots to do their work.

“We have some missions where we imagine sending multiple robots to planets and these robots will need to coordinate and will need to do operations like landing and such without realtime communication,” said Venturelli.

Scientists need to “maximize the lifetime of the batteries” the robots use as they perform tasks on the planets, which may include drilling or using infrared thermometers to record temperatures, so careful planning of how the robots do their tasks is needed to ensure that no time is wasted. All of this involves a lot of variables that normal computers just can’t process quickly enough, and it could be a fit for quantum computing.

“[The robot] has to figure out what is the best schedule and figure out if he can recharge and when to go in a region where it is dark and a region where there is water,” said Venturelli. “We need to preplan the mission.”
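To give a flavor of the kind of pre-planning Venturelli describes, here’s a drastically simplified, hypothetical sketch: choosing which science tasks a rover performs within a fixed battery budget. The task names, energy costs and science values are all made up; the real planning problems also juggle timing, recharging, lighting and coordination between multiple robots.

```python
from itertools import combinations

# Hypothetical rover tasks: (name, battery cost in watt-hours, science value).
tasks = [
    ("drill_sample", 40, 10),
    ("ir_temperature_scan", 10, 4),
    ("drive_to_dark_region", 30, 6),
    ("image_water_ice_site", 20, 7),
]
BATTERY_BUDGET = 70  # watt-hours available before the next recharge

def plan(tasks, budget):
    """Brute-force the subset of tasks with the most science value that fits."""
    best_value, best_plan = 0, ()
    for r in range(len(tasks) + 1):
        for subset in combinations(tasks, r):
            used = sum(cost for _, cost, _ in subset)
            value = sum(val for _, _, val in subset)
            if used <= budget and value > best_value:
                best_value, best_plan = value, subset
    return best_plan, best_value

chosen, value = plan(tasks, BATTERY_BUDGET)
print("Planned tasks:", [name for name, _, _ in chosen], "| science value:", value)
```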

SpaceX is on its way to delivering NASA’s DSCOVR satellite to orbit after a successful rocket launch today, a feat that will make SpaceX the first private company to travel beyond the inner ring of Earth’s orbit. The satellite will travel 1 million miles to a location between the Earth and the Sun, where it will spot solar flares up to an hour before they hit Earth and take daily images of the planet.

Among U.S. government agencies, the adoption of cloud computing hasn’t been moving full steam ahead, to say the least. Even though the Obama administration unveiled its cloud-first initiative in 2011, calling for government agencies to move their old legacy IT systems to the cloud, these agencies have not made great strides in modernizing their infrastructure.

In fact, a September 2014 U.S. Government Accountability Office report on federal agencies and cloud computing found that while several agencies have boosted the share of their IT budgets spent on cloud services since 2012 (the GAO studied seven agencies in 2012 and followed up on them in 2014), “the overall increase was just 1 percent.” The report attributed the agencies’ small increase in cloud spending, relative to their overall budgets, to the fact that they had “legacy investments in operations and maintenance” and were not going to move those over to the cloud unless they were slated to be either replaced or upgraded.

But there are at least a few diamonds in the rough. The CIA recently found a home for its cloud on Amazon Web Services. And in 2012, NASA contracted with cloud service broker InfoZen on a five-year, $40 million project to migrate NASA’s web infrastructure — including NASA.gov — to the Amazon cloud and maintain it there.

This particular initiative, known as the NASA Web Enterprise Services Technology (WestPrime) contract, was singled out in July 2013 as a successful cloud-migration project in an otherwise scathing NASA Office of Inspector General audit report on NASA’s progress in moving to cloud technology.

Moving to the cloud

In August, InfoZen detailed the specifics of its project and claimed it took 22 weeks to migrate 110 NASA websites and applications to the cloud. As a result of the project’s success, the Office of Inspector General recommended that NASA departments use the WestPrime contract or a similar contract in order to meet policy requirements and move to the cloud.

The WestPrime contract primarily deals with NASA’s web applications and doesn’t take into account high-performance computing endeavors like rocket launches, explained Jonathan Davila, the InfoZen cloud architect and DevOps lead who helped with the migration. However, don’t let that lead you to believe that migrating NASA’s web services was a simple endeavor.

Just moving NASA’s “flagship portal,” NASA.gov, which contains roughly 150 applications and around 200,000 pages of content, took about 13 weeks, said Roopangi Kadakia, a web services executive at NASA. And not only did NASA.gov and its related applications have to be moved, they also had to be upgraded from old technology.

NASA was previously using an out-of-support proprietary content management system and used InfoZen to help move that over to a “cloudy Drupal open-source system,” she said, which helped modernize the website so it could withstand periods of heavy traffic.

“NASA.gov has been one of the top visited places in the world from a visitor perspective,” said Kadakia. When a big event like the landing of the Mars rover occurs, NASA can experience traffic that “would match or go above CNN or other large, highly trafficked sites,” she said.

NASA’s Rover Curiosity lands on Mars

NASA has three cable channels that the agency runs continually on its site, so it wasn’t just looking for a cloud infrastructure tailored to handle only worst-case scenarios; it needed something that could keep up with the media-rich content NASA consistently streams, she said.

The contract vehicle takes into account that the cost of paying for cloud services can fluctuate based on needs and performance (a site might get a spike in traffic one day and then see it drop the next). Kadakia estimates that NASA could end up spending around $700,000 to $1 million on AWS for the year; the agency can put $1.5 million into the account to cover any unforeseen costs, and any money not spent can be saved.

“I think of it like my service card,” she said. “I can put 50 bucks in it. I may not use it all and I won’t lose that money.”

Updating the old

NASA also had to sift through old applications on its system that were “probably not updated from a tech perspective for seven-to-ten years,” said Kadakia. Some of the older applications’ underlying architecture and security risks weren’t properly documented, so NASA had to do an audit of these applications to “mitigate all critical vulnerabilities,” some of which its users didn’t even know about.

“They didn’t know all of the functionalities of the app,” said Kadakia. “Do we assume it works [well]? That the algorithms are working well? That was a costly part of the migration.”

After moving those apps, NASA had to define a change-management process for its applications so that each time something got altered or updated, there was documentation to help keep track of the changes.

To help with the nitty-gritty details of transferring those applications to AWS and setting up new servers, NASA used the Ansible configuration-management tool, said Davila. When InfoZen came on board, the apps were stored in a co-located data center where they weren’t being managed well, he explained, and many server operating systems weren’t being updated, leaving them vulnerable to security threats.

Without the configuration-management tool, Davila said, it would “probably take us a few days to patch every server in the environment” using shell scripts. Now, the team can “patch all Linux servers in, like, 15 minutes.”
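For a rough sense of where that speed-up comes from (this is my own illustrative sketch in Python, not InfoZen’s Ansible playbooks, and the hostnames and patch command are placeholders), the difference is essentially between looping over servers one at a time with shell scripts and fanning the same change out to every machine at once from a central inventory.

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

# Placeholder inventory and patch command; a real setup would use an Ansible
# inventory file and playbook rather than raw SSH calls like these.
HOSTS = ["web01.example.gov", "web02.example.gov", "web03.example.gov"]
PATCH_CMD = "sudo yum -y update"

def patch(host):
    # Run the patch command on one host over SSH and report how it went.
    result = subprocess.run(["ssh", host, PATCH_CMD], capture_output=True, text=True)
    return host, result.returncode

# The old way: patch hosts serially, one after another (days for a big fleet).
# for host in HOSTS:
#     patch(host)

# The configuration-management way, roughly: push the change everywhere at once.
with ThreadPoolExecutor(max_workers=len(HOSTS)) as pool:
    for host, code in pool.map(patch, HOSTS):
        print(f"{host}: {'patched' if code == 0 else 'failed'}")
```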

NASA now has a streamlined DevOps environment in which spinning up new servers is faster than before, he explained. Whereas it used to take NASA roughly one to two hours to load up an application stack, it now takes around ten minutes.

What about the rest of the government?

Kadakia claimed that moving to the cloud has saved NASA money, especially as the agency cleaned out its system and took a hard look at how old applications were originally set up.

The agency is also looking at optimizing its applications to fit a more modern, integrated approach to application development, she explained. This could include updating or developing applications that share the same data sets, something that would previously have been a burden, if not impossible, to do.

A historical photo of the quad, showing Hangar One in the back before its shell was removed. Photo courtesy of NASA.

Larry Sweet, NASA’s CIO, has taken notice of the cloud-migration project’s success and sent a memo to the entire NASA organization urging other NASA properties to consider the WestPrime contract first if they want to move to the cloud, Kadakia said.

While it’s clear that NASA’s web services have benefited from being upgraded and moved to the cloud, it still remains hazy how other government agencies will follow suit.

David Linthicum, a senior vice president at Cloud Technology Partners and a Gigaom analyst, said he believes there isn’t a sense of urgency for these agencies to convert to cloud infrastructure.

“The problem is that there has to be a political will,” said Linthicum. “I just don’t think it exists.”

Much as President Obama appointed an Ebola czar during this fall’s Ebola outbreak, there should be a cloud czar responsible for overseeing the rejiggering of agency IT systems, he said.

“A lot of [government] IT leaders don’t really like the cloud right now,” said Linthicum. “They don’t believe it will move them in the right direction.”

Part of the problem stems from the contractors the government is used to working with. Organizations like Lockheed Martin and Northrop Grumman “don’t have cloud talent,” he said, and are not particularly suited to guiding agencies looking to move to the cloud.

Still, now that NASA’s web services and big sites are running in the cloud, perhaps other agencies will begin to take notice.

When the producers of The Amazing Race decided to make a Canadian version of the reality TV show, they were delighted to discover that they could use unmanned aircraft to film teams of contestants scrambling from place to place.

In the U.S., that quite literally would not fly. Federal Aviation Administration rules forbid any commercial use of such drones, which can weigh under five pounds and offer new and useful opportunities for photography.

The result is that the Canadian approach, which is based on a simple permit system, is allowing hundreds of businesses to integrate the technology in a range of industries, while U.S. companies are grounded awaiting regulation.

A ticket to fly

In the United States, everyone from media organizations to photographers to search-and-rescue crews is in a legal dogfight with the Federal Aviation Administration over a controversial policy that says only hobbyists can use drones.

Photo by Funky Frog Stock/Shutterstock

While the courts sort this out, the FAA continues to crack down on any business that flies them.

It works differently to the north. In Canada, anyone who wants to use a drone for commercial purposes simply asks the country’s aviation regulator for a permit to do so and, in most cases, receives one in 10 to 20 business days.

According to Maryse Durette, a spokesperson for Transport Canada, the government has granted nearly 1,500 “Special Flight Operations Certificates” in the last three years, including 945 in 2013 alone.

In order to receive one, an applicant must explain in detail how they intend to use the drone, and outline what precautions will be taken to fly it safely. After reviewing the application, and consulting with the business, the federal agency then issues a permit with a distinct set of stipulations tailored to that particular business.

The upshot is that there are now hundreds of companies and institutions in Canada that have made drones part of their day-to-day operations.

Drones on set and on the farm

Canada’s decision to set up a green-light system for commercial drone use came in 2010, as the government came to recognize an emerging aviation-based economy. Since then, companies large and small have been gaining expertise in drone-related technology.

A startup called Resson Aerospace, for instance, is transforming drone footage into customized images and analytics for large agriculture companies.

CEO Peter Goggin said the imagery represents an early version of the company’s proprietary software, and that it is developing more sophisticated versions as it gains more experience working with drones and clients.

Farming is not the only industry supporting drone-based businesses, according to Transport Canada. The agency declined to provide a specific breakdown, but did name TV production, law enforcement and real estate photography as common examples of commercial drone use.

All of this activity is not only intriguing from a photography and technology standpoint. It may also be giving Canada a first-mover advantage when it comes to the emerging drone economy. As the National Post reported last month, one Montreal-based start-up believes “Canadian startups have an advantage over their U.S. counterparts, because in that country drone use is illegal for commercial uses.”

Waiting for U.S. air support

The apparent success of the nascent Canadian drone industry raises the question of whether a similar permit-style system might work in the U.S.

“Drone operators in the United States would be happy to see a permit system that provides responsible companies with that kind of streamlined and efficient system for granting permits in safe work scenarios such as agriculture and the energy sector,” said Brendan Schulman, an aviation lawyer who is currently representing a number of the groups suing the FAA.

For now, however, the FAA is still working on long-overdue rules for integrating unmanned aircraft into U.S. airspace. In the meantime, the agency insists that its “guidelines,” which have been rejected by one administrative law judge, continue to apply.

The good news is that the FAA may finally be getting its act in gear. In response to an email question about permits, a spokesperson did not refer to a Canadian-style system, but did suggest the burden on businesses would soon be lifted:

“We expect to publish a proposed rule on small unmanned aircraft before the end of this year. We can’t discuss specifics because the language isn’t finalized, but the rule will make a start on allowing more routine UAS operations.”

In the meantime, initiatives outside the FAA could also help break the regulatory logjam. This month, for instance, it emerged that NASA and Airware, a San Francisco-based drone startup, are working on a new air-traffic control system for unmanned aircraft.

Just a bit of information has trickled out about five-year-old startup Algae Systems over the years. It’s been working on growing algae offshore in big plastic bags using waste water, a concept that originally came out of NASA. It later picked up assets and patents from the defunct algae startup GreenFuel.

The company also turned some heads early on because it’s home to an interesting group of characters. CEO Matt Atwood is a young chemist, entrepreneur and avid Burner, and Vice President John Perry Barlow is the John Perry Barlow, the Grateful Dead lyricist and co-founder of the Electronic Frontier Foundation. The company raised some angel funding from billionaire Edgar Bronfman Jr.

But this week, the stealthy startup finally came out of quiet mode to talk about a pilot plant it just completed in Mobile Bay in Daphne, Alabama. The 25-person team started building it last summer, working with the local utility Daphne Utilities, and just finished it in June of this year.

Part of Algae Systems’ plant in Daphne, Alabama. Image courtesy of Algae Systems.

The plant takes disinfected waste water from the city of Daphne, combines it with CO2, and fills up large plastic bags offshore in nearby Mobile Bay. The waves naturally mix the substances around, the sun shines on the bags, and after about four days, algae grows.

The algae is then harvested, and the leftover water in the bags is cleaned. The result is that Algae Systems can sell the harvested algae to make diesel and jet fuel, while at the same time cleaning waste water that can then be reused. “It’s impossible to have a biofuels company and be profitable and stable only off of fuels because it’s such a low value commodity,” Atwood told me.

The plant in Mobile Bay is at demo scale right now, treating 40,000 gallons of waste water per acre per day. In terms of fuel production, that’s about 3,000 gallons per acre per year at the company’s current productivity numbers. Atwood expects that to increase.

At the heart of the plant is Algae Systems’ “hydrothermal liquefaction” tech, which, at 550 degrees Fahrenheit, turns the algae and sewage into a liquid similar to crude oil. By adding other substances, the company can make different types of fuels.

When I ask why the company chose the city of Daphne and Mobile Bay, Atwood explains that for the first plant, it needed the “right partner.” Daphne Utilities is small but progressive and forward-thinking. Atwood and the team also managed to recruit Daphne Utilities general manager Rob McElroy to join their group now that the pilot plant is up and running in the bay. Barlow was responsible for siting the plant.

The pilot plant is the first step for Algae Systems. The next step is to raise funding and build a commercial plant. As many of our readers know, this is a point in a startup’s life commonly called “the valley of death,” because many startups languish between initially proving their tech and building it at commercial scale.

Algae Systems is looking to raise a Series B round, and a commercial plant could cost between $80 million and $100 million. In addition to the angel funding from Edgar Bronfman Jr., Japanese conglomerate IHI has invested $15 million in the startup.

Funding for cleantech and biofuel companies in Silicon Valley has been tight for several years now. Strategic investors will probably be a lot more open to this type of investment.

Algae Systems’ process. Image courtesy of Algae Systems.

Algae Systems has other helpful partners. The Department of Energy delivered a $4 million grant to a partnership led by SRI International to work with Algae Systems’ tech. Algae Systems also has a longtime relationship with another startup, Global Thermostat, which bills itself as a carbon-negative capture company. Atwood said that Algae Systems plans to work with Global Thermostat to integrate the two companies’ tech and build a plant.

Ways to turn algae into fuel have long been under development by large companies and startups alike. Solazyme is one of the startups that managed to break through and scale up its tech. Craig Venter’s Synthetic Genomics ended up moving away from and downgrading its original algae fuel research. Startup GreenFuel is now living a second life through Algae Systems’ deployment.

It’s a hard business to be in, to be sure. But given Algae Systems’ multiple revenue streams and multi-purpose plants, it could have an advantage in specific environments that need low-cost clean water and biofuels.

After a 24-hour delay, NASA launched a rocket from Vandenberg Air Force Base in California Wednesday morning carrying a satellite with a spectrometer that will monitor carbon emissions from on high. The tool measures the colors of sunlight that bounce off of the earth — the intensity of the colors indicates how much carbon dioxide is present as the light passes through the atmosphere. Check out this article for more on how cutting-edge tech in the skies is seeking answers to the world’s changing climate.

Scientists and governments around the world are leaning on the latest tech advances for gathering information from on high — from drones to satellite and rocket systems to big data tools — to fill in the gaps in knowledge about the planet’s changing climate. The trend is being pushed not just by cheaper and better technology, thanks to Moore’s Law, but also by the desire of organizations, countries and scientists to provide data that can support policy decisions.

Cheaper and more accurate systems are enabling more advanced remote monitoring of the human activity that could be contributing to the globe’s rising temperatures, while also keeping close tabs on the world’s overall atmospheric changes. More sophisticated analytics are teasing out trends, drawing new conclusions and helping identify ways to lower greenhouse gas emissions.

On Tuesday, NASA plans to launch a satellite called the Orbiting Carbon Observatory-2 from Vandenberg Air Force Base in California to help monitor the carbon emissions released into the atmosphere around Earth. The satellite will pass over the North and South Poles at a height of 438 miles, observe the same locations every 16 days and take a million measurements a day. It will carry a tool called a spectrometer that measures the colors of sunlight that bounce off of the earth — the intensity of the colors indicates how much carbon dioxide is present as the light passes through the atmosphere.
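The underlying measurement principle is absorption: carbon dioxide soaks up particular colors of sunlight, so the dimmer those colors look to the spectrometer, the more CO2 the light passed through. Here’s a minimal sketch of that idea using the Beer-Lambert law; the numbers are made up for illustration, and OCO-2’s actual retrieval algorithms are far more involved.

```python
import numpy as np

# Beer-Lambert law: I = I0 * exp(-sigma * N), where sigma is an absorption
# cross-section and N is the number of absorbing molecules along the light path.
# All values below are illustrative, not OCO-2 calibration numbers.
sigma = 2.0e-23      # cm^2 per molecule (made up for this sketch)
I0 = 1.0             # sunlight intensity before it passes through the atmosphere
I_observed = 0.72    # intensity the spectrometer sees in a CO2 absorption band

# Invert the law: more dimming in the CO2 band implies a larger CO2 column.
N = -np.log(I_observed / I0) / sigma
print(f"Estimated CO2 column: {N:.2e} molecules per cm^2")
```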

The information — which will provide much more data than land-based carbon emissions units — could help scientists learn more about how carbon emissions ebb and flow with the seasons (for a good explanation of this phenomenon watch this Cosmos episode) and also help provide answers to the complex way that plants absorb carbon emissions around the world. Will the oceans and plants continue to absorb carbon emissions at the same rate going forward? And how do droughts and floods affect how this works?

The satellite is being launched now after two previous failed attempts, in which the rockets crashed. This time, the satellite and its 300-pound instrument are being launched on a bigger rocket with a longer history of successful launches. Japan also has a satellite that tracks carbon emissions, launched in 2009, and NASA has 17 other satellites in orbit that collect Earth data.

The Obama administration has been increasingly interested in implementing policies that can reduce carbon emissions. Recently the EPA announced that it plans to reduce power plant emissions by 30 percent by 2030.

The Orbiting Carbon Observatory-2 will cost $467.5 million this time around. But expect satellite and rocket technology to get even cheaper in the coming years. SpaceX is hard at work testing reusable rockets. The idea is that the boost stage of a rocket — the part that returns to Earth after launch instead of traveling into space — can land gently and be recovered and reused for future launches. The boost stage can account for 70 percent of the cost of a rocket launch.

Startups and tech companies in Silicon Valley are focused on reducing the cost of satellites and satellite data. San Francisco-based Planet Labs, founded in 2010 by NASA alums, started launching its constellation of tiny satellites earlier this year. Its “Dove” satellites are meant to be low-cost, quickly deployable and able to take pictures of Earth at resolutions down to three to five meters. One of their major applications could be monitoring deforestation, a key contributor to climate change.

Images and environmental data taken from lower heights are also being used to monitor carbon emissions and answer questions about climate change. That would be data from aerial drones. Drone technology is quickly advancing with large Internet companies and startups alike investing heavily in innovations and standards.

Drones, image courtesy of Pond5/boscorelli.

The Chinese government has been using drones to monitor and discover illegal pollution and emissions from some of China’s industrial giants. The government is trying to considerably reduce air pollution in certain regions and is using the drones to back up those policies.

More sophisticated big data tools are enabling the smart use of these monitoring technologies. Every hour, NASA’s various missions compile hundreds of terabytes of information. NASA is now turning to analytics to make better predictions, like estimating weather patterns and forecasting rates of melting ice caps.

Digital eyes, of course, aren’t always positioned high and pointed down. IBM Research has developed “sky cameras,” high-resolution fish-eye lens cameras that can be pinned onto poles or the rooftops of buildings and pointed upwards. The cameras continuously stream visual data about the atmosphere and cloud density, and can feed that data to solar power systems to help predict how well they will operate under various weather conditions.

NASA launched a new contest Tuesday for imagining, and then building, new uses for the space agency’s trove of Earth sciences data. The challenge — actually two of them, broken down into imagination and building stages — kicks off on July 1, runs through Nov. 15, and uses the NASA Open Earth Exchange (OpenNEX) platform. The exchange’s datasets and informational material, as well as the computing resources for the challenge, are hosted on Amazon Web Services.

This is not the first time NASA has opened its data to the public as part of a challenge. It had previously partnered with TopCoder, a platform for crowdsourced challenges, on a number of competitions. The OpenNEX challenge is in partnership with Innocentive.

Although agencies like NASA, other research institutions and universities still maintain some of the smartest people and best tools in their respective fields, there’s a growing acceptance of what’s possible when the public can access certain types of data and the computing resources to analyze it. Competitions on the Kaggle platform, for example, are routinely won by teams or individuals with little subject-matter expertise but lots of general experience in building predictive models.

Likewise, GDELT creator Kalev Leetaru recently partnered with Google to open that massive socio-political database, as well as tools for analyzing it, to the public. He told me at the time that he’s excited to see what happens when data scientists in all fields can experiment on the data with minimal effort on their part. “You’ve got all this pent-up [analytic] expertise out there,” he said. “Go run these big queries. Tell us what’s possible.”

Cloud computing is the thing that makes these types of competitions possible. Researchers in fields such as genomics hope the cloud can transform their spaces, too, and lead to new scientific breakthroughs. The power comes from giving scientists access to data and computing resources in a centralized location, instead of having them send huge datasets over the network and process them locally using — in the best-case scenario — high-performance computers.