
By Brian Robinson

Reports are emerging that the Commission on Cybersecurity for the 44th Presidency will recommend to the Obama administration that government and contractor employees involved in cybersecurity be formally certified.

The commission, a group overseen by the Center for Strategic and International Studies, published its first report on securing cyberspace in December 2008. It's expected to issue its follow-up report later this month or in early July.

That's likely to spur a debate about just what a cybersecurity credential is, versus one for regular IT security. Many people involved in cybersecurity don't have any kind of formal qualifications but are nevertheless leaders in the nascent field.

It will also stir up debate about the needs of organizations such as the Homeland Security Department, which has said it aims to hire around 1,000 cybersecurity professionals. If it wants a cybersecurity workforce of that size, it may have to become involved in defining just what a cybersecurity professional is. If new Senate legislation becomes law, DHS could become the lead on this issue for the federal government.

And where will the military's new Cyber Command stand on this? It has its own view on what cybersecurity means, which includes offensive as well as defensive capabilities.

Maybe it’s time for another acronym to be thrown into the mix. We already have the CIO, CTO and CSO. Perhaps there should also be a CCSO. No?

If you Washington types have ever wondered what it would be like to actually work for Twitter, rather than just experience it through your thumbs, now is the time to find out. The company is looking for a "government liaison."

This will be the company’s first DC employee — in fact, the first outside of Twitter’s small San Francisco office — and “the closest point of contact with a variety of important people and organizations looking to get the most out of Twitter on both strategic and highly tactical levels,” according to the job announcement.

The reverse side of that is that the person appointed will also be expected to feed Twitter execs ideas for spreading the microblogging service among politicos. It could be an important base for the company's future, given the current mania for all things Twitter.

Again, according to the announcement, the successful candidate will “help set the culture and approach of a fledgling public policy department and be an important part of our very small company.”

Take the "very small" bit with a pinch of salt. Yes, Twitter only employs about 200 people, but headcount doesn't mean much in the world of the Internet. More importantly, take a look at the growth of the company as reported by GigaOM: 65 million tweets a day, for a total of 2 billion so far.

Suffice to say, the eventual Twitter liaison will be a very sought-after contact in DC.

The administration is trying to take cybersecurity to the next level with an R&D program aimed at producing what it sees as game-changing technologies that will “significantly enhance the trustworthiness of cyberspace.”

It will kick off the new program at an event May 19 in Berkeley, Calif., where people from agencies that make up the Federal Networking and Information Technology Research and Development (NITRD) program will explain the program's goals. The event will include a webcast.

Basically, it will split the research into three areas: tailored trustworthy spaces, which are "sub-spaces" in cyberspace that support different security policies and services for specific kinds of interactions; something called moving target, which will increase the cost of mounting an asymmetric attack, presumably to make attackers think twice before they act; and cyber economic incentives, which will look at the economic principles needed to encourage good practices.

The NITRD is one of the older continuing R&D efforts in government, going all the way back to the 1991 High Performance Computing Act, and it’s had a good success rate. Cybersecurity was a focus for it well before it became the hot issue it now is.

The NITRD isn’t proposing this three-step program as a be-all for cybersecurity, but it does expect it to be a precursor for different ways of thinking about the problems of cybersecurity, and a way of “provoking” novel solutions.

I miss the days of the old Office of Technology Assessment (OTA), which regularly put out reports about science and technology issues that were well researched, informative and managed at the same time to irk many people on Capitol Hill and elsewhere whose political and ideological tilts they upset. Fun times!

So, of course, Congress had to get rid of it. Certainly the usual suspects, such as cost, redundancy and so on, were rounded up to defend the decision. But there was no doubt that the OTA-as-irritant was the biggest factor.

Given the generally lousy track record that Congress has had since those days in debating and deciding on science and technology issues, I think the case has been made many times over for the return of the OTA, or something similar. And indeed, many have tried to argue for that, so far without success.

Here's one more, which tries to take into account the Obama administration's push for open government. The Woodrow Wilson International Center for Scholars is suggesting in a new report that the U.S. take up the European experience with Participatory Technology Assessment (pTA).

The new-agey moniker notwithstanding, the center says that 18 European technology assessment agencies are now flourishing using pTA, which takes account of the views of lay people as well as those of experts. It also points out that the use of pTA has already been proven in the U.S. by various university groups and other nonprofit organizations.

This kind of thing was impossible during the age of the OTA because the means just didn't exist to collect these lay views, at least not easily. With social media and the Internet, those barriers have disappeared.

The report's author, Dr. Richard Sclove, is pushing for the creation of a nationwide network that will incorporate pTA. Count me in.

Social networking isn’t rocket science, but the eggheads at NASA are showing how it can be used to help them pursue the most bleeding edge applications.

In May, the space agency launches a new research network named NASA Earth Exchange (NEX), which combines high-end supercomputing resources with Earth system modeling and decades of NASA remote sensing data to provide a new way of analyzing the planet's climate and land use patterns.

According to a story at Supercomputing Online, NASA scientists think they can use the new network to slash the time needed to gather and analyze the massive, global-scale datasets they use -- from the months it takes now to just hours.

The key is that the online collaboration will incorporate social networking, enabling NASA scientists and science teams scattered around the world to easily share datasets, algorithms, complex codes and research results.

They haven't been able to do that before because it required physically transferring huge amounts of data to one another. With NEX, all of the data and codes will reside on the social networking platform itself, so there will be no need for all of that laborious shuffling.

Using NEX, the scientists will apparently be able to build custom project environments using virtualization technology that will automatically capture the entire analysis process. Those environments will also be reusable by other scientists who can add their own data to the remote sensing data, throwing open all kinds of new avenues for research into such things as urbanization, deforestation and biodiversity.

Separately, but as another example of how it is using the new Web technologies, NASA said it would start using semantic search to enable its employees to search through the more than 50 years of information it has collected on its space program.

Perhaps not surprisingly, Chris Kemp, the thought leader behind this and other IT efforts at NASA Ames, such as the Nebula cloud computing platform, has reportedly been bumped up to the new position of chief technology officer for IT at NASA headquarters.

It's not usual for me, or anyone else I know for that matter, to recommend a government document as a great piece of reading, but I'll stray from that norm and point to the U.S. Army Roadmap for UAS 2010-2035.

I'm not saying it's great literature, and there are sections of it that would serve well as a sleep aid, but for overall entertainment for the techie-minded among you, I think it makes the grade.

Turn first to the far-term section of the Army's roadmap for its unmanned aircraft systems, which covers what the Army sees happening in the 2026-2035 timeframe. Among other things, it talks about using bug-like nano UAVs to survey buildings before soldiers enter them, and about clouds of the critters operating as an interlinked smart warfighting array of reconfigurable modules (SWARM) to do the same for area reconnaissance.

There's lots of similar acronymic stuff in the document (this is the Army, after all), but most of it has a similar heft, particularly in the far-term and mid-term (2016-2025) chapters.

Taken individually, none of the technologies will come as a surprise, as they've been talked of before. But the roadmap provides a context for how they will work together, and it promotes a grand vision that's pretty sweeping. Twenty years from now, battlefields and the skies over America itself could be filled with these flying robots.

There are provisos, of course, as there always are. The military overall wants as many unmanned aircraft as it can get its hands on, but the Government Accountability Office recently warned that the military's ability to handle the number it already has is stretched. And then there's the small matter of making sense of all of the data these things produce.