This is a personal blog updated regularly by Dr. Daniel Reed, Vice President for Research and Economic Development at the University of Iowa.
These musings on the current and future state of technology, scientific research and innovation are my own and don’t necessarily represent the University of Iowa's positions, strategies or opinions.


April 25, 2012

Today, in Cambridge, UK, we celebrated ten months of experience and success from the Cambridge white spaces trial, which was organized by a consortium of companies, including Microsoft. The event brought together telecom regulators from around the world; hardware, software and content providers; and other interested parties to discuss the potential and the reality of dynamic spectrum management. The "white spaces" – the unused channels in the TV bands at each location – can be used for wireless communication as long as there is no interference with the primary (television broadcast) usage.

This white spaces spectrum is particularly valuable because of its signal propagation characteristics. Sometimes called "Super Wi-Fi," it can provide broad wireless coverage with relatively few access points: for rural areas that are often digitally disenfranchised, for machine-to-machine communication that can enable smart grids, smart cities and intelligent transportation systems, and for mobile data traffic offload in urban areas.

White spaces use has been approved by the U.S. Federal Communications Commission (FCC), and approval by the U.K. regulator (Ofcom) is near. Other regional and national bodies are also moving forward. In this spirit, today we also announced the formation of a new consortium to launch a trial in Singapore. Details on all of this can be found on the Microsoft on the Issues blog, where I wrote about Broadband White Spaces – Ready to Go Global. For more details and news coverage, type the words "white spaces" and "Cambridge" into your favorite search engine.
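The core mechanism behind dynamic spectrum management is simple to state: a white-space device consults a geolocation database of protected (primary) broadcast channels for its location and transmits only on channels that are unused there. A minimal sketch of that idea follows; the channel range and the `available_white_space` function are illustrative assumptions for this post, not any regulator's actual API.

```python
# Hypothetical sketch of dynamic spectrum management for TV white spaces:
# given the channels protected at a location (e.g., from a geolocation
# database), a device may use only the remaining, unused channels.

US_TV_CHANNELS = range(2, 52)  # illustrative: U.S. TV channels 2-51

def available_white_space(protected_channels):
    """Return the TV channels free for secondary (white-space) use."""
    return sorted(set(US_TV_CHANNELS) - set(protected_channels))

# If channels 7, 9 and 13 carry licensed broadcasts at this location,
# every other channel is potential white space here.
free = available_white_space({7, 9, 13})
```

The real systems are, of course, far richer: databases account for transmitter contours, adjacent-channel constraints and wireless microphone registrations, and devices must re-query as they move. But the "check first, then transmit" pattern above is the essence of using spectrum without interfering with its primary users.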

My complete written testimony will ultimately appear on the Subcommittee web site here, and the hearing charter, the subcommittee press release, and the hearing video capture most of the context. However, there are a few points from my testimony that seem appropriate to repeat here. These concern research empowerment via the cloud, support for basic research and education, and broadband expansion.

Accelerating Scientific Discovery for Research via the Cloud

Throughout the history of science, data has been scarce and precious. Indeed, the modern scientific method is defined by a careful cycle of hypothesis and experiment, in which experimental data is gathered to test the hypothesis. In a few short years, scientists and engineers have gone from data scarcity to incredible richness, necessitating a significant change in how they manage and extract insight from all this data. In a parallel shift, many of our scientific, engineering and societal questions increasingly lie at the intersections of traditional disciplines.

Increasing data volumes and the complexity of collaboration on interdisciplinary problems are challenging our historical approaches to discovery and innovation via computing. Most researchers and research institutions are ill-prepared for the large-scale computing infrastructure management challenges posed by large data sets and complex models. The cloud and associated applications and tools offer a possible solution to this challenge by letting scientists be scientists.

The U.S. government can accelerate this transition by encouraging the purchase of cloud services, in addition to the acquisition of local IT infrastructure, and by supporting new tools that facilitate distributed collaboration and simplify access to multidisciplinary scientific data. As I have noted before, Microsoft is acting on this belief, working in partnership with the National Science Foundation.

Fostering Continued Support for Computing Research and Education

Today's cloud technology is derived from basic computing research conducted over the past four decades. To ensure that the U.S. continues to remain at the forefront of cloud technology, continued investment in basic research is critical. There are deep and open questions in areas as diverse as the future of silicon scaling and system-on-a-chip design, energy-efficient systems, primary and secondary storage, data mining and analytics, wired and wireless networks, system resilience and reliability, privacy and security, and user interfaces and accessibility, to name just a few. Insights and innovations from this research will spawn new companies, create jobs and reshape our future.

In addition to continued research investment, it is critical to support the pipeline that produces researchers and others who will be able to invent new uses of the cloud and information technology. The U.S. Bureau of Labor Statistics estimates that the computing sector will have 1.5 million job openings over the next 10 years, yet the number of graduates receiving bachelor's, master's or Ph.D. degrees in computer science falls far short of that. In addition, we must strengthen the quality of and access to computing education at all levels. Consistent with these concerns about the IT workforce and computing education, Microsoft is a founding member of the Computing in the Core coalition, which supports computer science education, particularly at the K-12 level.

Broadband: The Need for Speed

The web and cloud services depend on broadband communications. Without them, service and information sharing are impossible. Concomitantly, ensuring reliable wired and wireless connectivity, with adequate bandwidth and latency, is critical to ensuring successful adoption of the cloud and realization of its benefits. The phenomenal growth of digital data, the rise of streaming media services, and the explosive growth of Internet-connected devices are all straining our nation's broadband infrastructure.

It is critical that we continue to design and deploy new backbone networks that support higher data rates, develop and deploy new protocols and infrastructure for the next generation of wireless networks and define the global standards that will shape the future of the globe-encircling cloud. We must also remember that digital access to information and services is increasingly the enabler of economic competitiveness, of lifelong education in a rapidly changing world, and of government efficiency and service delivery.

21st Century Innovation

This is an exciting time, when the future becomes the present. Computing can be a great equalizer, providing access to the world's knowledge base to individuals, anywhere, anytime; empowering entrepreneurs and companies large and small to sell their products and ideas globally; and enabling scientists and engineers to discover and innovate in ways that will define the future.

June 27, 2011

I have been spending quite a bit of time on approaches to address the spectrum crisis – the seemingly insatiable demand for more wireless communication as the number of Internet-connected devices continues to grow. We are working with a consortium in the UK to host a major white spaces trial. Details can be found on the Microsoft blog, along with a video explaining the challenges and opportunities.

June 21, 2011

I just spent an all too brief two days in Hamburg, attending the International Supercomputing Conference (ISC). It was a chance to catch up with some old friends, discuss the ever changing nature of computing technology and reflect on the Weltanschauung that is the HPC community. In addition to the buzz around the latest Top500 list and the new Japanese machine atop the list, there were many debates about the intersection of HPC and cloud computing.

My friend and former collaborator in North Carolina, Wolfgang Gentzsch, organized a session at ISC on HPC and the Cloud, where I was delighted to participate. In my talk, I made four points, several of which were echoed by Josh Simons from VMWare in a great talk that followed mine.

First, HPC has experienced several punctuated equilibria as we transitioned from one parallel computing model to another. From instruction-level parallelism in the mainframe era through vector systems, MPPs, SMPs, clusters and clusters plus accelerators, each transition has brought technical and cultural challenges. The technical challenges have been well documented; the cultural challenges are less often discussed, for they affected the technology providers (vendors), the technology operators (HPC centers and facilities) and the technology users (scientists, engineers and other researchers). Clouds bring another set of technical and cultural challenges.

When any new technology appears, there is a great temptation to see it through the lens of the old, either in nomenclature or behavior (see "horseless carriage"), or to attempt to converge the new into a variant of the old (see assimilation by the Borg). Many corporate acquisitions fail for just this reason, failing to recognize that one is acquiring something precisely because it is not like the acquirer. Successful adoption and exploitation occur when all parties are willing to work together to help create the ecosystem of capabilities needed. I believe we are moving down the path of successful adoption with cloud-based HPC.

Second, I talked about the need to continue to democratize access to HPC, targeting the "excluded middle" – those potential and actual technical computing users who need more computing than is available on their desktops but who find traditional HPC too difficult to use. They are the majority of scientists and engineers. Client plus cloud acceleration is one possible solution to this problem, and it could empower a new set of HPC users. See HPC and the Excluded Middle.

Third, I reminded the audience that there is another burgeoning challenge and opportunity – big data. We have big problems with big data, and the scientific and engineering communities are drowning in the data produced by a new generation of sensors and scientific instruments. We need simple, easy-to-use tools that allow researchers to mine and extract insight from that data. In that spirit, I discussed our continuing work on Excel DataScope, which allows one to use the familiar Excel interface while applying rich analytics algorithms in the cloud. See The Zeros Matter: Bigger Is Different.

Finally, I highlighted the worldwide progress of Microsoft's research cloud partnerships. We now have over 75 active projects, launched in partnership with research agencies in the U.S., the EU, Japan, Australia and Taiwan. (See Cloud Seeding: Stimulating Discovery and Innovation.) The partnerships are an opportunity for us to work together to understand how clouds can best support technical computing and discovery, driving change among cloud infrastructure and service providers while also educating researchers and HPC providers about this technical and cultural transition.

May 25, 2011

All organizations and cultures develop their own idiosyncratic acronyms, abbreviations, lingo and metaphors. (Yes, there are probably a few similes and alliterations as well.) One need look no further than the social networks and Twitter to find a veritable cornucopia of such phrases and acronyms (e.g., LOL or OMG). The hacker community (the original one) also gifted us with the gems RTFM and UTSL.

I was once offered my first full-time job in the software industry because I knew what JCL was needed to apply a PTF to BTAM. That's IBM speak for applying a software patch to telecommunications software. In this sense, Microsoft is not unique, for its worldwide employees have developed their own lexicon of metaphors and abbreviations, as well as appropriating others where useful. As a relative newcomer to the corporate world after twenty-five years in academia, I found that several of these abbreviations, words and phrases immediately caught my ear.

Licking the Cookies

Not to be confused with the joy of Double Stuf Oreos or even licking the spoon, licking the cookies refers to claiming action or skill in an area in a way that prevents anything from happening. It is intended to conjure an image of a cookie plate, where one licks the cookies and then returns them to the plate, rendering them inedible by all. As the phrase suggests, you have not eaten the cookie (i.e., taken action), but you have prevented anyone else from doing so.

Cookie licking is endemic to any large organization or culture where multiple teams compete for credit and accomplishment. We have all seen this behavior in our organizations and occasionally been guilty of it ourselves. This is one of the more pernicious forms of non-collaborative, non-team play.

After understanding the phrase, I realized that subtle and not so subtle forms of cookie licking occur in academia as well. I have seen it in debates over academic directions, discussions of research priorities and in paper and grant proposal reviewing. Of course, academia has its own set of arcane phrases.

Eating Your Own Dog Food (aka Dogfooding)

Dogfooding is a hoary classic, well used and abused across the computing milieu. It refers to using early (alpha or beta) versions of a system you and others have developed before it is released, often as an informal tester. It's not haute cuisine or even nouvelle cuisine; it's dog food – edible and nutritious, but neither elegant nor tasty.

Learnings

Finally, there are learnings, as in "Our learnings on this suggest optimization is our first priority." The first time I heard this linguistic oddity, I was nonplussed and puzzled, confused and uncertain, perhaps even mortified, but I was not chagrined. After hearing it several times, I finally asked if the speaker really meant insights, understanding, observations or even experience. The answer, of course, was yes. Nobody I have yet asked, however, knows the origin of the usage, though I am sure the exegesis would be fascinating.

As part of this agreement with the NSF, Microsoft is providing access to Windows Azure, Microsoft's cloud computing platform. In addition, a support team of Microsoft researchers and developers is working with grant recipients to equip them with a set of common tools, applications and data collections that can be shared with the broad academic community, while also providing expertise in research, science and cloud computing. Broader details on the program and its research tools and software for Windows Azure can be found here.

Diverse Projects

Increasingly, the important scientific questions lie at the intersections of traditional disciplines, and insights from multidisciplinary collaborations drive innovation, economic development and our response to complex problems and natural disasters. This is one of many reasons I am so pleased with the technical diversity among the list of NSF-Microsoft award recipients.

All of the award recipients were selected via NSF's rigorous peer review process, which emphasized the scientific merit of the proposed work. Projects range from exploring scientific applications of cloud computing to extending cloud computing with new tools and techniques.

Worldwide Reach

The Microsoft/NSF partnership is but one part of a broader international program that the eXtreme Computing Group (XCG) and Microsoft Research are building with the scientific research agencies worldwide. We believe cloud computing, coupled with powerful client tools, can transform the nature of research. This worldwide program currently targets collaborations with several national and international research partnerships, including

Power via Simplicity

This Microsoft/government partnership is about so much more than access to cloud computing resources. The deep partnership between academic and Microsoft researchers, the creation and release of easy-to-use client tools and an exploration of the new world of massive data are all key elements of our shared journey toward a new model of data-rich analysis enabled by powerful client tools and cloud services.

Any successful technology ultimately becomes invisible, enabling and empowering without requiring the user to focus on the idiosyncrasies of the technology itself. Technical computing can and should be an invisible intellectual amplifier, as easy to use as any other successful consumer technology. As I wrote last year when we first launched this program, it is really about transforming how we conduct research by broadening researcher capabilities, fostering collaborative research communities, and accelerating scientific discovery by shifting the focus from infrastructure to empowerment.

March 14, 2011

We all remember some variant of the fairy tale of Goldilocks and the Three Bears. It is a tale of the search for a good meal and a nap, though it also involves breaking and entering, culinary theft and accidental vandalism. If, perchance, you feel a worrisome gap in your erudition and erstwhile encyclopedic knowledge of this element of the Western cultural lexicon, fear not, for I will summarize the key elements of the tale. And please bear with me, this does relate to science! (Pun intended.)

Porridge and Bears

The young girl, Goldilocks, visits the home of the bears and samples the porridge of Papa, Mama and Baby Bear, pronouncing each, in turn, too hot, too cold and just right. Sated with porridge, she then seeks a place to sit, finding the chairs of Papa and Mama Bear too large. As she settles comfortably into Baby Bear's chair, it collapses. She then explores each of the beds before falling asleep in Baby Bear's bed, where she is found by the returning bear family. On discovery, she flees.

This tale, of course, raises worrisome questions. Do bears really like porridge? Could they sit in chairs, and would they sleep in beds? More tellingly, are they underwater on their cottage mortgage? Though all are worthy of whimsical rumination, let us defer literal scrutiny of the fairy tale's purported ursine facts and its anthropomorphized bears and focus on the porridge bowls and their metaphorical implications for science.

Science: Matching Need and Availability

Like porridge bowls, science comes in many sizes, from research conducted by a single investigator (e.g., a theoretical computer scientist studying computational complexity) to experimental projects that involve thousands of people (e.g., the ATLAS and CMS detectors at CERN). More to the point, there is no "right size" for science, other than matching the resources to the nature of the problem and the approach, albeit with the usual assessment of investment versus potential payoff. Nor is there, a priori, differential value between large and small science. Groundbreaking discoveries with broad and transformative effects have occurred across the entire spectrum of project sizes.

Scientific culture does vary widely with discipline and scale, however. Just as the ethos and metrics of biology differ markedly from those in physics, so too do the approaches and politics of small and large science. Large-scale experimental science typically involves long-term planning, geo-political negotiations, and infrastructure construction that may span multiple years before data can be captured and analyzed.

Big Infrastructure for Small Science

Historically, these two worlds – small-scale science (both theoretical and experimental) and large-scale science – have largely depended on separate and distinct infrastructure. That historical separation is now disappearing, mediated by the tsunami of data now being produced by new generations of scientific instruments and computational models. (See The Zeros Matter: Bigger Is Different and Language Shapes Behavior: Our Poor Cousin, Data.)

Historically, the researcher who had unique experimental infrastructure also maintained a competitive advantage, for he or she could conduct experiments and capture unique data. With the rise of large-scale, shared instrumentation in a host of disciplines, most notably in biology, astronomy and physics, and open access to the resulting research data, advantage instead accrues to the researcher who can ask and answer more interesting questions. This is a profound shift in scientific culture with deep implications.

It is now incumbent upon us to rethink how we facilitate discovery and innovation in this brave new world of large data, for practitioners of both small and large science. Simply put, we must reconsider how we fund, construct, manage and operate scientific data repositories.

As I have learned at Microsoft, there are lessons that can be drawn from the construction of cloud data centers, web search engines and tools for analyzing ill-structured data that could both accelerate and simplify scientific discovery. These tools and practices are not only applicable to individual scientific domains, they are also especially relevant to the problems that lie at the intersection of multiple disciplines, where scientific cultural divides and divergent terminologies often inhibit collaboration and exploration. We will also need new models of public/private partnership to realize this vision, something we are pursuing with Microsoft's worldwide engagements on client plus cloud infrastructure.

Remember the porridge, the bears and Goldilocks. We can match the needs of all scientists, working together. Whether you like it hot or cold, large or small, it can be "just right" for everyone.

February 28, 2011

I have posted a few thoughts on the coming Internet of Things and the implications for cognitive communication over on the Microsoft on the Issues blog. As I have written in other pieces, I believe we have to rethink how we manage spectrum allocation and adopt much more nimble and flexible policies that leverage the emerging capabilities of cognitive radio. (See Spectrum Future Shock.)

Everywhere, anytime communication is a notable result of recent computing advances, but it's dependent upon available bandwidth, and that bandwidth is finite. Spectrum is, in many ways, like a natural resource that has to be managed judiciously, especially if we are to continue to advance the digital economy and leverage technology to drive innovation—to create new services, new business models, new ways of communicating and living for the betterment of society.

The America COMPETES Act outlines target funding levels and priorities for the research and education programs at several federal agencies, including the National Science Foundation and the Department of Energy. As Elizabeth notes, "Robust federal support for breakthrough research conducted throughout the U.S. is critical to fueling the ecosystems of government, industry and universities, allowing the U.S. to make discoveries and turn them into products that improve our nation's ability to compete globally."

After initial crafting, the bill faced a long and arduous process that involved many negotiations and compromises. Hence, the final passage of America COMPETES is a holiday event worthy of celebration.