A collection of observations, news and resources on the changing nature of innovation, technology, leadership, and other subjects.

September 24, 2012

MIT professor Richard Larson recently wrote an excellent opinion column - STEM is for Everyone. In the column, he succinctly made the case for widespread STEM (Science, Technology, Engineering and Math) literacy, and explained why it is as important to our 21st century information economy as basic reading-writing literacy has been to the industrial economy of the past two centuries. According to Larson, STEM literacy is a way of thinking and doing:

“A person has STEM literacy if she can understand the world around her in a logical way guided by the principles of scientific thought. A STEM-literate person can think for herself. She asks critical questions. She can form hypotheses and seek data to confirm or deny them. She sees the beauty and complexity in nature and seeks to understand. She sees the modern world that mankind has created and hopes to use her STEM-related skills and knowledge to improve it.”

Larson is Professor of Engineering Systems and Civil and Environmental Engineering at MIT. He is a pioneer in applying STEM capabilities to a wide variety of problems in services industries, from urban systems to online learning. He is the founder and director of LINC - Learning International Network Consortium, a collaboration of educators from around the world who share best practices for organizing higher education distance learning projects in developing countries. Among LINC’s key projects is Blossoms - Blended Learning Open Source Science or Math Studies, which is developing a large, free repository of video modules, created by gifted volunteers around the world, to assist school teachers with math and science courses.

September 17, 2012

At the end of August, IBM announced the latest member of its mainframe family, the zEnterprise EC12. As pointed out in the announcement, the new system is the result of more than $1B in R&D investments over the past four years, bringing major improvements in performance, security, availability and other key enterprise features. But perhaps what is most impressive about this announcement is the longevity of the IBM mainframe, now in its 48th year. Few computer families with major announcements in 2012 can trace their vintage to the 1980s, let alone the 1960s. There is something remarkable about the mainframe being not only alive but well after all these years.

I have had a long association with mainframes. I was a second-year college student at the University of Chicago when the System 360 was first announced in April of 1964. At the time, I was also working part time at the university’s computation center, which used IBM computers. I still remember attending a presentation on the announcement given by a visiting IBM technical executive. Later on, as a physics graduate student, I used high-end S/360 models for my thesis research. At IBM, I was closely associated with mainframes through major portions of my 37-year career, in particular the period from 1977 to 1992, when I was involved in a number of R&D initiatives on the future of the mainframe.

To a large extent, the mainframe’s longevity is a result of two major architectural innovations first introduced with S/360. The first was the notion of a family of computers, from low to high performance, all based on the same instruction set, which allowed customers to upgrade to larger systems, as well as to future system models, without having to rewrite their applications. The second was OS/360, a common operating system that supported the various members of the S/360 family, except for the smaller ones, which ran DOS/360, a subset with more limited capabilities. Today’s z/Architecture and z/OS are direct descendants of the original S/360 and OS/360.
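
To make this compatibility idea concrete, here is a minimal sketch in Python, not actual S/360 code: two hypothetical family members share one instruction set and differ only in speed, so the same unmodified program produces the same result on both. The instruction names, cycle counts and models are all invented for illustration.

    # Minimal sketch (not actual S/360 code): one shared instruction set,
    # two hypothetical "models" that differ only in speed, never in results.
    PROGRAM = [                # a tiny program in the shared instruction set
        ("LOAD", "R1", 6),     # R1 <- 6
        ("LOAD", "R2", 7),     # R2 <- 7
        ("MUL",  "R1", "R2"),  # R1 <- R1 * R2
    ]

    def run(program, cycles_per_instr):
        """Execute the shared instruction set; only timing varies by model."""
        regs, cycles = {}, 0
        for op, dst, src in program:
            if op == "LOAD":
                regs[dst] = src
            elif op == "MUL":
                regs[dst] = regs[dst] * regs[src]
            cycles += cycles_per_instr[op]
        return regs, cycles

    # Invented cycle counts for a low-end and a high-end family member.
    for name, model in [("low-end",  {"LOAD": 10, "MUL": 40}),
                        ("high-end", {"LOAD": 1,  "MUL": 4})]:
        regs, cycles = run(PROGRAM, model)
        print(f"{name}: R1 = {regs['R1']} after {cycles} cycles")

The last line is the whole point: the answer is identical on both models and only the cycle count changes, which is what let customers move up the family, or forward to future models, without rewriting anything.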

Through the 1960s, 1970s and 1980s, mainframes were built using highly sophisticated and expensive technologies. It was important to get the maximum efficiency and price-performance out of each machine, so the hardware architecture and operating system were carefully designed together to achieve high processor utilization and fast response times for high-volume transactions. Over the years, IBM has continued to design the mainframe’s hardware and operating system software together to optimize performance and industrial strength, e.g., security, systems management, availability and the other ilities.

September 10, 2012

The corporate research lab has radically changed over the past several decades. Corporate labs reached their peak in the 1960s and 1970s. As technologies and markets changed, most such labs declined in importance over the next twenty years. But they are being reincarnated in our 21st century information economy as market-facing innovation labs, with significantly different and broader scopes than those of the original industrial economy research labs.

Most of these labs were established by large, successful industrial companies in the years after World War II. Their job was to push the frontiers of knowledge by conducting both basic and applied research which, it was hoped, would over time lead to new commercial technologies and products.

I joined the computer science department at IBM’s Thomas J. Watson Research Center when I finished my university studies in 1970, and worked there for fifteen years. In those days, R&D were two very distinct activities: the R conducted by scientists in research labs, and the D conducted by engineers in product development labs. Marketing was yet another distinct activity in a product’s life cycle. These various handoffs resulted in large time gaps between research, product development and marketing, but since both technology and markets advanced at a relatively slow pace, there was little pressure to reduce the transition times across the gaps.

A very good 2007 Economist article, The Rise and Fall of Corporate R&D - Out of the Dusty Labs, observed that this strong separation between R&D&M was common across all industries. John Seely Brown, former director of Xerox PARC, succinctly captured the prevailing attitude of research labs in those days with this quote in the Economist article: “When I started out running PARC, I thought 99% of the work was creating the innovation, and then throwing it over the transom for dumb marketers to figure out how to market it.”

September 03, 2012

I recently read an intriguing article in the NY Times: In the Ordinary, Silicon Valley is Finding the Next Big Thing, by columnist and Ohio State professor Steven Davidoff. The article focused on the recently announced deal between Starbucks and Square, the San Francisco-based mobile payments startup. Under the deal, customers will be able to pay for their purchases at any of Starbucks’ 7,000 US coffee houses using Square’s mobile phone app by simply giving their name to the cashier, who then verifies the customer’s picture appearing next to that name on the iPad-based digital cash register.
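
As a rough illustration of that flow, here is a minimal Python sketch; it is not Square’s actual protocol or API, and every class, field and function name in it is invented. The customer’s app opens a tab at the store, the register lists the open tabs, and the cashier’s visual match of the photo is the authentication step before charging.

    # Illustrative sketch only -- not Square's actual protocol or API.
    from dataclasses import dataclass

    @dataclass
    class Tab:                    # hypothetical open tab at one store
        name: str
        photo_url: str            # picture the cashier checks visually
        account_id: str

    class Register:
        """Hypothetical iPad register holding the store's open tabs."""
        def __init__(self):
            self.tabs = {}

        def open_tab(self, tab):  # customer's app checks in at the store
            self.tabs[tab.name.lower()] = tab

        def charge(self, spoken_name, amount_cents):
            tab = self.tabs.get(spoken_name.lower())
            if tab is None:
                raise LookupError("no open tab under that name")
            # The cashier compares tab.photo_url to the customer before
            # this call; the photo match is the authentication step.
            return f"charged {amount_cents}c to account {tab.account_id}"

    register = Register()
    register.open_tab(Tab("Ada Lovelace", "https://example.com/ada.jpg", "acct-42"))
    print(register.charge("Ada Lovelace", 375))  # customer just says their name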

While the Square technology is very cool indeed, Davidoff’s key point is the ordinary nature of the innovation in question.

“The deal not only has the potential to change the way people pay for coffee and everything else, it also shows how small innovation applied to everyday tasks may be the next new thing for venture capital. Call it the rise of the ordinary innovators. . . And while it may be revolutionary, Square is really just an example of what Silicon Valley is increasingly about: process and networking [i.e. VC ecosystems] at its finest. . .

“Square shows how these venture capital networks can be used to transform the ordinary. Its idea was not particularly new, but Square designed a simple but usable device, patented some of the underlying technology, leveraged other technology like the iPad and added some marketing. With the Starbucks partnership, Square is also showing the opportunity for old-line companies to capitalize on their large customer bases.”