Design

Programming With Reason: Where Are We Going?

By Eric J. Bruno, September 03, 2010

The solution is a paradigm shift in how we develop software

Computer Technology of 1984

Let's take a look at the state of technology in 1984. For starters, Richard Stallman began work on the GNU project and his vision of free and open software, and Apple Computer introduced the Macintosh -- both in January of that year. Other popular home computers of the time included the Commodore 64, the IBM PC and PCjr (which was released in 1984), and the Apple II.

The home video game market seemed to be on the decline, as the popularity of the Atari 2600 was waning and the Nintendo NES was not destined to launch until the following year. However, Atari would go on to release a series of home computers that, along with the Commodore Amiga, would create a cult following. And with the Nintendo launch the following year, the home video game market was re-energized. For more information on that subject, I recommend Racing the Beam, published by MIT Press. I just finished reading it and found it fascinating and inspiring as a programmer.

In terms of big iron, by 1984, Digital Equipment Corporation's (DEC) VAX systems were in full swing and had largely supplanted DEC's work on its PDP systems. Although there were VAX systems that qualified as mainframes, and competed with IBM's mainframes that were still popular at the time, the VAX helped usher in the age of the minicomputer. This era also includes Sun Microsystems, which around this time was taking the industry by storm with its Unix-based Sun workstations. Workstations and minicomputers running different flavors of Unix would eventually supplant DEC's VMS OS, but in 1984 the computer hardware and software landscape was quite varied -- perhaps the most varied in the history of computer technology.

The Big '90s

Yes, I know the real phrase is "the big '80s," but for me, in terms of computer technology, it was the 1990s that were big. This is when I began my career as a programmer, and I had access to seemingly unlimited computing resources. I started at Reuters on Long Island, developing foreign exchange trading system software. The development labs I had access to were filled with PCs of all types, Digital workstations and minicomputers, and even exotic machines such as NeXT Cubes and Silicon Graphics workstations.

The trading system I worked on had a large DEC VAX mainframe at its heart that performed all of the central processing and matching -- we referred to it simply as The Host. One day I asked how much memory was in the Host, and was truly shocked when I was told it was 512MB. The size, the cost, the potential -- all truly shocking. Soon after, I was allowed to go into the data center to see the monster in person, and was in awe at the power of this computer in such a pristine environment. The sheer size, the clustered mainframe next to it for redundancy, the networking equipment, and the supporting computers around the cluster made for a very impressive, and inspiring, sight for a young programmer. Truly a "hulking giant" experience, for those of you who have read Steven Levy's book, Hackers: Heroes of the Computer Revolution.

Once again, going back to the iPhone 4: I now hold more processing power, with an equal amount of memory, in my pocket -- all running on a battery. The Host cluster at the time had huge generators installed next door to keep it running in case of even a momentary power outage. The iPhone may not have quite the bandwidth the Digital mainframe had, but it has it beat in every other category.

The code for the system was written in a combination of Ada and C, with parts moving to C++ by the mid-1990s. This is where things slow down: whereas computer hardware has changed enormously since then, programming languages have changed much more slowly.

