2011

Computer science is often divided into two camps, systems and theory, but of course the reality is more complicated and more interesting than that. One example is the area of "experimental algorithmics," also termed "empirical algorithmics." This fascinating discipline marries algorithm analysis, traditionally done with mathematical proofs, with experimentation on real programs running on real machines.

Untrustworthy information is an increasing threat to decision making in information environments. Rick Hayes-Roth has been studying how to detect and filter out untrustworthy information and base decisions on well-grounded claims that can improve outcomes. In last week's installment of this two-part interview, we focused on the problem and the principles that help ameliorate it. In this installment, we focus on the means to implement those principles in our information environments.

Untrustworthy information is an increasing threat to decision making in information environments. Rick Hayes-Roth has been studying how to detect and filter out untrustworthy information and base decisions on well-grounded claims that can improve outcomes. We interviewed him to find out more about this problem and to get advice for our readers. Although there are many subtleties in the shades of truth and the intentions of speakers and listeners, Hayes-Roth distills the essential core of what you can do to ward off untrustworthy information.

Richard John is a professor at the Graduate School of Journalism, Columbia University, and a historian of communications networks in the United States. His most recent book, Network Nation, won the inaugural Ralph Gomory Prize from the Business History Conference and the AEJMC prize for the best book in the history of journalism and mass communications.

Punched cards were already obsolete when I began my studies at the Technical University of Munich in 1971. Instead, we had the luxury of an interactive, line-oriented editor for typing our programs. Doug Engelbart had already invented the mouse, but the device was not yet available. With line editors, users had to identify lines by numbers and type in awkward substitution commands just to add missing semicolons. Though cumbersome by today's standards, line-oriented editors were obviously far better than punched cards. Not long after, screen-oriented editors such as Vi and Emacs appeared. Again, these editors were obvious improvements, and everybody quickly made the switch. No detailed usability studies were needed. "Try it and you'll like it" was enough. (Brian Reid at CMU likened screen editors to handing out free cocaine in the schoolyard.) Switching from assembler to Fortran, Algol, or Pascal was also a no-brainer. But in the late '70s, the acceptance of new technologies for building software seemed to slow down, even though more people were building software tools. Debates raged over whether Pascal was superior to C, without a clear winner. Object-oriented programming, invented back in the '60s with Simula, took decades to be widely adopted. Functional programming is languishing to this day. The debate about whether agile methods are better than plan-driven methods has not led to a consensus. Literally hundreds of software development technologies and programming languages have been invented, written about, and demoed over the years, only to be forgotten. What went wrong?

Bob Metcalfe thinks we are in a bubble, an innovation bubble, seeing that the word "innovation" is on everybody's lips. To help ensure that this bubble does not burst, he has embarked on a new career path as Professor of Innovation and Murchison Fellow of Free Enterprise at the University of Texas at Austin. This is his fifth career, building on his work as an engineer-scientist leading the invention of Ethernet in the 1970s, entrepreneur-executive and founder of 3Com in the 1980s, publisher-pundit and CEO of InfoWorld in the 1990s, and venture capitalist in the 2000s. As General Partner with Polaris Venture Partners, he has invested primarily in cleantech and currently serves on the boards of five companies: Ember, Sun Catalyx, 1366 Technologies, Infinite Power, and SiOnyx.

Ubiquity is dedicated to the future of computing and the people who are creating it. What exactly does this mean for readers, for contributors, and for editors soliciting and reviewing contributions? We decided to ask the editor in chief, Peter Denning, how he approaches the future, and how his philosophy is reflected in the design and execution of the Ubiquity mission. He had a surprisingly rich set of answers to our questions. We believe his answers may be helpful to all our readers as they shape their own approaches to their own futures.

Melanie Mitchell, a Professor of Computer Science at Portland State University and an External Professor at the Santa Fe Institute, has written a compelling and engaging book entitled Complexity: A Guided Tour, published in 2009 by Oxford University Press. The book was named by Amazon.com as one of the 10 best science books of 2009. Her research interests include artificial intelligence, machine learning, biologically inspired computing, cognitive science, and complex systems.

Joseph F. Traub is the Edwin Howard Armstrong Professor of Computer Science at Columbia University and an External Professor at the Santa Fe Institute. In this wide-ranging interview, he discusses his early research and the organizations and other entities he has created, and offers his views on several open-ended topics concerning the future of computing. --Editor

This is the second of a two-part interview with Professor Erol Gelenbe, conducted by Professor Cristian Calude. It appeared in print in the October 2010 issue of the Bulletin of the European Association for Computer Science and is reprinted here with permission. --Editor

Mark Guzdial is a Professor in the School of Interactive Computing at Georgia Institute of Technology (Georgia Tech). His research focuses on the intersection of computing and education, from the role of computing in facilitating education to how we educate about computing. In this interview, he discusses how we teach computing and to whom, especially his contention that a contextualized approach is a powerful tool for teaching everyone about computing. --Editor