9 Things Engineers Need to Know About Embedded Development

1. It's all about the software.
For mechanical and electrical engineers, whose education and experience revolve around hardware, it's hard to imagine that software could matter more than anything else. But in the embedded world, it does. Whether the project involves avionics, automobiles, phones, computers, thermostats, automation systems, or myriad other products, the brunt of the engineering will emanate from the software effort.

Ganssle estimates that 70 to 80 percent of electronic engineering time is devoted to software. Many products incorporate hundreds of thousands of lines of software code, and some use more than a million.

To figure out how that translates into engineering work hours, consider this: The average programmer writes about 200 lines of code per month. At that rate, a staff of 50 would need 100 months -- more than eight years -- to write a million lines of code.

Low-cost embedded applications that are less code-intensive can be served by microcontrollers such as Texas Instruments' MSP430 line. (Source: Texas Instruments)

"When it comes to embedded systems, software is the big engineering effort, period," said Bill Graham, product line manager for tools and lifecycle solutions at Wind River Systems, a maker of embedded operating systems. "Most experienced companies have a hardware group, which is a handful of people. And then they have a software group, which is a tremendous number of people." However, because newcomers to the embedded world almost universally underestimate the software contribution, more than a third of embedded projects are late or never delivered.

"We still hear about managers who say, 'Just change the code,' and then expect it to get done in an afternoon," Ganssle said. "But months go by, and they're still struggling with it."

2. It costs more than you think.
Cost is one factor that trips up even the most experienced embedded developers. After years of watching design teams underestimate costs, Ganssle cites this rule of thumb, which puts it in perspective: Each line of code costs between $20 and $40 to create. As a result, development of a product with a million lines of software code could cost $20 million to $40 million.

"Even people who are in the field tell me they don't believe it," Ganssle said. "But there's been way too much data collected and far too many studies done on this to argue it." Development teams that ignore such data do so at their own peril. And because so many of today's top engineers came of age in an era when software wasn't fully understood, the risks of underestimating software costs are great.

"Software is a relatively new field -- only about 50 or 60 years old -- and we're really still figuring it out," he said. "But the truth is that it's probably the most expensive thing in the history of engineering."

I third ttemple and RogerD's comments. In my 30 years of embedded hardware/software development, the most successful efforts in terms of product quality and timely delivery were the result of small teams of very talented individuals who worked well together. They got the product off on the right foot with a great architecture, kept it on track with quality code, and got it tested, debugged, and delivered by supporting and helping one another. Barring management interference, the quality of the code and the pace of development tend to match the least-capable member of the team. One person is fastest, but can get stuck occasionally. That's where a few more team members help. More than 3-5, though, and the project can rapidly enter the death-by-committee/meeting realm. Larger projects can only succeed if a great architect can break them down into manageable subprojects that integrate well, or so many engineers are thrown at them that they get done by sheer overwhelming numbers.

"The average programmer writes about 200 lines of code per month. At that rate, a staff of 50 would need 100 months -- more than eight years -- to write a million lines of code."

Knowing the author is simplifying for the sake of brevity, it may not be obvious to some that the total lines of 'good' project code per unit of time is never linear in the number of people devoted to the task. There is definitely a point of diminishing returns, and a point at which adding more people does the project a disservice by making the overall task unwieldy, if not outright unmanageable. Microsoft used to blame IBM for ruining OS/2 because non-technical managers relied on the 'masses of asses' principle as a means of (erroneously) getting it done faster, then ran roughshod over the coders when the simple arithmetic did not hold up. The people who were a party to the overall 'vision' at the outset can become disconnected from what is actually emerging, as new people are added at the back end to expedite certain tasks or address new requirements. Moreover, the newcomers may have a completely different view of what the goal posts look like. If you start out with a few people who all know C well, and then, for example, marketing decides the thing needs Android, bringing in Java experts who've never seen a pointer in their life may cause the team to split into two camps, and they may end up competing more often than working together.

Regarding operating systems, I agree they should be avoided wherever prudent and possible. For some projects, however, there's no getting away from it. For example, if you're targeting a high-end MPU like a Cortex-A8 or above, you NEED an OS, or else you'll get bogged down in the minutiae of writing drivers and the like. The first rule of thumb is that you should abandon all rules of thumb. The second might be that if you're using an MCU like an MSP430 or a Cortex-M0 through M3, you can probably get away without an OS. As stated in the article, concurrency beyond all but the simplest of requirements dictates that you need an OS to manage access to shared objects across tasks.

Thanks to the author for making a software guy feel important for an afternoon. It's time to go home so my teenager can ruthlessly burst that delicate bubble.

Great article, Charles. I am one of those mechanical types and have very limited experience with software, embedded or otherwise. This field fascinates me, and I am certainly appreciative of your article highlighting the difficulties with the technology. Your comments about the time-consuming efforts and costs to produce the code were revelations. My experience in programming is with C++, Pascal, and Visual Basic, which are basically "learning types" of software. If I may, I write an educational blog published through WordPress; i.e. www.cielotech/wordpress.com. Would you mind giving me permission to reference your article in an upcoming blog discussing embedded systems? I think my readers would also be fascinated by the subject. Many thanks, Bob J.

ChasChas: It can't really be boiled down any further than to say that writing and de-bugging code is a very slow, tedious, complex process and many products have hundreds of thousands, or even over a million, lines of code. As RogerD accurately points out here, the numbers cited here refer more to larger projects. Still, the stories I've heard seem to indicate that many, many teams don't have a full appreciation for the scale of the software portion, and that misunderstanding (or lack of understanding) gets them into trouble. As for your reference to eccentric behavior by programmers, we'll need some deeper insight from some of our readers on that one.

I couldn't have said it better. I completely agree that the smallest possible software team will usually minimize development time.

I have seen a one man "team" design, debug, and program a very high performance FPGA/DSP/Microcontroller based motion control and data acquisition system in about a year (hardware and software). I doubt if a whole team could have done it in five years. A government funded team would probably never have finished it. I'm not saying that anybody could have done what he did, but he was the right man for the job, and adding more people to the mix would have only slowed him down.

Unless a software project can be very distinctly divided and conquered, the fewer programmers the better.

One additional thought: when a project takes any COTS module and places it onto a product host PCB, any agency approvals (FCC, UL, CE, etc.) are streamlined because the COTS module's existing approvals "grandfather" the host product's approval process. By contrast, embedding the solution eliminates that shortcut, and you must face the full scrutiny of the agencies. Plan on adding at least another 8-12 weeks before approvals are granted.

To point #1 (all about SW): truer words were never spoken. A recent contract assignment involved placement of a standardized (COTS) transceiver on a motherboard. One staff meeting discussion entertained the topic of eliminating the COTS transceiver in favor of a direct chipset embedded solution. Easier for the EEs; easier for the MEs. But the SW guys hit the ceiling, citing months of recoding development. All the points of your article are great checkpoints for whole teams, and especially program managers, to post on their walls for continuous awareness.
