With the growing popularity of MakerBot, the reduced transaction costs of interfacing with sensors and digital circuits that the Arduino allows, and the emergence of modular prototyping platforms like Liquidware's BeagleBoard-based gadget packs and Bug Labs, it feels like there's been a fairly dramatic increase in single-programmer productivity.

I re-read one of my favorite books of all time, "The Mythical Man-Month" by Frederick Brooks. Normally I don't get too excited about dense, heavy, cerebral books with no practical advice unless they teach a new programming language (or algorithm). But in this case I make an exception, because it's a genuinely good book that questions the process of engineering itself.

Well, lo and behold, when I decided to Amazon around for comparable books, I discovered that the author has written a new one, which sounded even more cerebral and pie-in-the-sky: "The Design of Design". As an aside, I wonder how much more "meta" you can possibly go. How about "The Process of Thinking about Writing about the Design of Design"? I mean, even the Greek philosophers had a limit. (Actually, then you'd need to write a book about a "Formalized Grammar and Metaphysics to Document the Process of Thinking about Writing about the Design of Design.")

At some point, for the sake of humanity, I just hope someone remembers how to actually do something tangible! But I digress. Turns out, I enjoyed the book.

It got me thinking. The Matrix got me thinking, as did Inception (actually The 13th Floor did too, but fewer people saw that one). So the fact that a book keeps such high company in my mind as The Matrix, Inception, and The 13th Floor is high praise coming from me. Much of the book focused on the process of designing design processes.

How many designers are too many? How should they work together? How do you organize to solve problems? I had a thought while reading the book: design exists because planning is an important and valuable step in communicating about an engineering problem and preparing to solve it. This is largely because engineering takes time.

Engineering, or building, or solving problems takes time.

But what happens if you break into a new plateau of productivity? What if that time is reduced significantly?

I've hit a personal plateau and breakthrough in productivity twice: once in software, once in hardware. The software one happened a few years ago, and the hardware one happened about two weeks ago.

Software Productivity

My personal software productivity jump came when I moved from C programming to pure Perl. I realized that I could write an algorithm faster in Perl, and get to a functional program faster, than I could in C. Because of that, I could iterate on the program faster and add new features in less time. The next bump came when I moved from Perl to R. In R I could do almost everything I could do in C, except that R also gave me access to tons of higher-level math building blocks. I became really fast at writing code in R... and although the code took longer to execute, I spent my time optimizing the algorithm or the way I wrote a function, as opposed to debugging Perl data structures or C memory leaks.

The common thread was that R allowed a higher level of functional modularity, but still exposed the lower level functions and data types for me to use when I needed them.

Hardware Productivity

Recently, I've felt similarly faster at building hardware than I used to be. I used to have to wire PIC chips into protoboards manually, with wires I stripped and cut myself. Then I got excited about BASIC Stamp boards, because they let me focus more on the code and on a few simple digital IO pins. Then the Arduino completely changed the way I thought about accessing sensors, switches, and digital interfaces in general - it significantly lowered the "hardware access barrier," if such a thing exists. In practical terms, it let me sit down and hack the E-Ink screen on the Esquire magazine cover in a matter of hours rather than days. Now I'm hacking away on Arduino gadget shields and BeagleBoard Gadget Packs... and the time it takes to go from "I have an idea for gadget hardware that does XYZ" to actually having one in front of me is measured in minutes.

I think I've found the critical pattern... just like in software, the biggest productivity improvement came as the hardware allowed a higher level of functional modularity. I'm interfacing with sensors now, as opposed to I2C buses, so I can build a device around a new sensor even faster. But the important part is that as the hardware gets higher and higher level, it still gives me access to the basic bit-banged serial and data IO ports and buses, just in case.

Open vs. Closed Design Philosophy

That's the biggest difference between my design philosophy and Apple's. While Apple *hides* digital IO and obscures interfaces, everything I've ever built *opens* the raw digital interface, and keeps it exposed and really easy to access from Perl, R, and C, even as the modules get higher level.

The result is much faster hardware and software development.

As this continues, the time it takes to prototype decreases.

At some point, the time it takes me to prototype a device might reach the time it actually takes for me to just build it anyway.

And maybe that's some new concept or field of "extreme" or "agile" instant hardware prototyping.

...and naturally, this is accelerated by the existence of Open Source Hardware...

Why? Because Open Source Hardware is about lowering design barriers and exposing the underlying schematics and functional blocks, and the result is that prototyping with Open Source Hardware - in my experience - is orders of magnitude faster than with traditional dev kits and proprietary hardware.

...

Wow. Where did that come from? I suppose this is an example of the kind of high-level, head-in-the-clouds thoughts you walk away with after reading "The Design of Design". I feel like the Reading Rainbow guy: I recommend that book for any design engineer. I think the kind of design discipline I learned will make me a better hardware hacker - or at least a more efficient one. (I wonder if anyone's ever done a study on hacker efficiency?) And the honest-to-goodness truth is that I'm not paid to endorse it. I'm not getting any kickbacks (ha), nothing. I simply enjoyed the book, and it made me realize something about my own design process I hadn't thought about before...

5 comments:

What you are really talking about here is raising the level of abstraction. That's precisely what books such as "The Design of Design" aim to do. Perhaps you should actually read it before you form an opinion about it...

@Jim - no no! My main point is precisely this: it's an *excellent* book. It made me realize the science *behind* design, as opposed to simply blindly following a process I didn't even know I was following.

Experience is valuable, and I have a great respect for Brooks and his writing style (it is a compliment that he can write succinctly about such high-level concepts!). His first book really changed the way I coded. This book changed the way I *think* about circuit and PCB design, which is closer to what I enjoy, designing and hacking hardware...

I actually think Brooks's age has a lot to do with it - in a good way. The world has changed dramatically since he started managing design, and in the opening chapters he specifically points out how technology was *supposed* to change the way problems were solved and designs were made. This type of perspective is impossible to get unless you truly have a multi-decade view.

Take video-conferencing, for instance ("telecollaboration," as Brooks calls it - I loved the picture on page 88). Five years ago, it was going to change the way meetings happened. Conference rooms all over the world installed it. And yet today, many people I've talked to regard it as a pain: it takes longer to set up than it's really worth. I think we need more of this forward-looking meta-perspective about the technologies we create. Where are they going? What are they really good for? Do they solve the collaboration problems they were intended to address?

I wonder: is Twitter going to help teams of hackers and engineers build better systems? Probably not, because, according to Brooks, it's in some ways extremely disruptive to a cognitive design trajectory. It's a decent navigation tool, and it can add some random input, but it remains to be seen whether broadcast communications can really solve engineering problems.

Sorry Matt. I guess I just misinterpreted your post because of the phrase "pie-in-the-sky" and your reference to the age of the author.

Some people (including experienced programmers) just don't understand the true value of abstraction. If it wasn't for the abstractionists (such as Brooks), we'd still be knapping flints and prodding animals with pointy sticks!

In my view, the future of hardware (and software) development is the move to higher and higher levels of abstraction. This empowers hardware hackers (if that's the right term) such as you and to a lesser extent me, by hiding complexity and allowing us to achieve much with relatively little input.

I've personally been involved with Model Driven Architecture, in which you generate code directly from quite abstract UML models. Think "compiled language," where UML is the language, the code generator is the compiler, and the machine code is Java, Python, C++, etc.

Several of my clients generate 100% of their code from UML models. And this isn't code you could easily write manually - it is instrumented, documented, and comes complete with test and deployment suites.

If you are interested, you might like to check out my book "Enterprise Patterns and MDA" for some ideas about abstraction, code generation, components (patterns) etc. Although it's not stated explicitly anywhere in the book, some of the ideas are inspired by a consideration of hardware. Here's a link: