
An anonymous reader writes "In a similar vein to the previous discussion about the New York professors taking Java to task for damaging Computer Science education, Mike Anderson of the PTR group wonders why it's so hard to find good embedded developers these days. 'As for today's CS programs, it seems that long gone are the computer architecture classes, writing code in assembly language (or even C at this point) and engineering software economics. In fact, a large number of CS majors apparently believe that everything can be implemented in a virtual machine and that both memory and [CPU] cycles are infinite.'"

I'm in my late 30s and have been doing embedded development since college. Back then we built finite state machines in PALs and complete embedded systems using microsequencers and UV-erasable ROMs. We both built (hand-wired) and programmed the systems from scratch. Doing that gives you a very good appreciation for what's going on. That kind of stuff just isn't done much anymore in college. From what I can see, there aren't many good embedded programmers more than a few years younger than me. It looks like good embedded programmers (whatever their age) will be in demand and well paid for many years to come. :)

I wonder, is this a problem in the USA only, or is the trend spreading to other countries?
Here in Sweden there doesn't seem to be any bigger shortage of embedded developers than of other types.
Still, embedded development is a lot more fun than doing some ordinary desktop app, or even worse, webapps.

UK too. We were the last year through at my uni (one of the UK's top 10 for CS, 10 years ago) that started by learning C. Java replaced it as the starter language.

I think that's a bit of a mistake.

They were already telling us then not to worry too much about memory or speed, unless you took the specialist courses in optimisation. From where I'm sitting now, as an open systems engineer working on high-throughput memory-resident databases, that seems like criminal negligence. I should imagine an embedded engineer

Not too bad to start with Java, but it's not really close to the hardware at all.

A starting language shouldn't be too unforgiving, and Java has one small advantage: it's very C-like, at least as long as you don't use a lot of built-in classes. If a student is limited to the "java.lang" package and isn't allowed to use anything else from the Java library, it can actually provide a good learning base. But I assume that many teachers aren't into this... The really great disadvantage with

It is a problem here in Sweden too. Many of the colleges and universities spend too much time teaching high-level stuff, and just a bare minimum on low-level. So what we get out here is people who need a ton of libraries to get anything done, who believe that Getting It Right The First Time is a waste and it's better to patch early and patch often (rather anathema to embedded work, don't you think?), and who believe that if something doesn't quite work with the hardware available, you can just get MORE hardware... And

Really? 'Cause I seem to remember creating FSMs using PALs just last year in my undergraduate CS program (in a required 300-level class). We also had a class where we put together a microprocessor, put it on an FPGA, and wrote low-level code to run on it.

A number of my friends took the hardware capstone where you did even more of this type of work. It still goes on, perhaps you just aren't looking in the right place? It seems the CE and EE folks take more embedded classes than CS students, so that is where you should look. CS is a huge field, and not everyone is going to follow the story's author's preference of study.

I have to agree. I'm a computer engineer at the University of Illinois. It's the computer engineers that are doing the lower-level development. I've had extensive systems programming (in assembly and C), and every CE here makes both an OS and a 16-bit processor. CS students don't see that stuff unless they elect to take it.

I haven't seen a speck of java (or even c++) in my ECE (Electrical & computer engineering) courses, but that's all I've used in my CS courses. Furthermore, I have a lot of friends in CS that know very little about what the actual computers are doing on a bitwise level. (I had to help one of them work on bit masking last weekend.)

Santa Clara University requires all CS majors to take the introductory course to Embedded Systems as well as the Operating Systems course offered by the Computer Engineering department (I was a CS major and graduated a couple of years ago). We also had to take an introductory course in Logic Design from the Electronic Engineering department as well. The philosophy behind this, as explained by my faculty adviser, was that computer scientists should at least have some background in lower-level programming. Co

The trouble is, a great deal of CS graduates feel that Computer Science is the fastest way not to becoming an engineer or scientist, but to becoming a programmer. A disturbing number of my fellow students avoid classes perceived to be lots of work, which is exactly what this guy is asking for. Debugging embedded hardware is not easy or simple, and requires massive amounts of attention to detail to get anything to work at all.

Anderson's question might have been equivalent to "where are all the graphics programmers?" or "where are all the operating systems programmers?" but for one thing: this article presupposes a shortage to convince readers they need embedded skills, because PTR offers training in Linux embedded system programming [theptrgroup.com]. Frankly, the more important skill in embedded systems isn't pipeline stalls, but the Chinese language; most of the work has gone to where the embedded hardware is made: East Asia. Case in point: the only work this guy appears to get is defense contracting, where clients can't outsource / offshore the work.

I'm not willing to create things that have the sole purpose of killing other human beings. Sorry. In addition, while I'd love to continue advancing my CS skills, it has become obvious to me that the only way to make real money in our society is to get a degree in a fake science like economics/business in the form of an MBA. That way you can bullshit like a pro, cheat shareholders out of millions of dollars (maybe even billions!) and provide absolutely no value to anybody but yourself.

CS courses are supposed to be about computer science. If I wanted to learn how to make a webpage I'd just get a good reference and hack at it (or take a summer class). I don't meet many CS graduates with any grasp of computer fundamentals. I was actually arguing with someone last week about why Java is the WRONG language to use for a one-off program to edit a column in a database. His response was "but Java is common and everyone knows it AND it's a database language, who knows Perl?". Shows how much he

Of course, these engineers had one thing in common. Like many schools, they took their programming and algorithms classes in the CS faculty because the schools were trying to save money. The CS schools are doing their best to churn out people who know all the buzzword technologies (Java, HTML, AJAX, etc) but have little marketable skill. CS courses seem mostly a left-over from the dot bomb era.

I've yet to meet a CS grad who properly understands the difference between TCP, UDP and IP. I haven't met one wh

This is what always bothers me about these "shortage" complaints: why do you feel the sample population you've encountered is a valid cross-section? If you are a hiring manager, it could be that your HR dept. is failing you. Or it could be that your company's pay scales immediately dissuade talented programmers from applying. Can anyone honestly say they've accurately surveyed the graduating class of '0x?

Some schools might offer good CS courses but the majority of grads I deal with know nothing more than buzzword garbage, and I am not alone in my views; I regularly hear associates complain that they can't get a CS person who knows their ass from an infinite loop.

I hear you, man. I am a recent graduate myself (UIUC), so I see this from the other side. Our program is pretty good compared to most things that I've heard on the net (and from various people I've met from CMU/MIT/Stanford/...), but a couple of t

I'm in about the same position as you, late 30s and having done about the same, but I didn't do it in CE classes but in EE. At that time, at least here in Brazil, CE was very incipient; most of the work was done on mainframes and the embedded software was created by electronics engineers doing microprocessor programming. Well, I still do embedded for a living, right now for Canonical, and I have to agree that it's damn hard to find good embedded developers.

I agree, however for a much different reason. I feel that C++, .NET, Java, etc. are the entire reason that computers need to be consistently faster and have more cores. The reason being object-oriented programming. The companies putting this crappy software out (all of the big names go here, with etc. at the end) don't have a clue how to make things efficient; they simply are paid to get the programs out the door as quickly as possible, regardless of the performance. Embedded engineers have to do things

Used to be an embedded developer for devices, but then I got interested in Ruby on Rails (it was much more fun).

I think there are probably more than a few people out there that have the skills, but have been moved into other paths by market forces. They will come back if you pay them enough, but that is unlikely to happen, so they will probably stay where they are.

Not only are there probably more than a few people that already have the skills, but I also think that it isn't necessarily too difficult to get people into the mindset needed. I learned programming in C++ but without any real concern for efficiency or memory usage. On a job much later on, I had to write a browser for very limited j2me devices (total memory of 64KB). The browser downloaded, parsed a subset of xhtml, displayed (including images), navigated according to user action and even accepted text inpu

True! When comparing x86, PIC or (shudder) 8051 ASM to some more modern Python programming, I don't really want to go back anymore. Even though I maintain Python code with a curses GUI. Yuck! The last thing I worked on was a GPRS/GPS combo on an ARM device, which was quite fun.

Why such a long prelude? Well, I just can't find serious employment in the field of embedded devices in Eastern Europe. Most guys who need a developer are of the type with "A Good Idea" and hiring a dumbass to do the work; you just can't feed a

I've done the coursework, and been trained in embedded programming. I'd be a junior at it, and would need experience but I could be one pretty easily.

But like the parent said, are you going to pay me for it? I get a good salary working in high-level languages that let me create things that *do* so much more, unless I was overwhelmingly interested in embedded systems what motivation do I have to do it?

I'm one of them. I did embedded for many years. I loved it, but 90% of the work was defense systems and it was so cyclical. And most of the projects were poorly managed and over budget; software always got screwed for hardware. I got my MBA, left for technology consulting and never went back. I get offers all the time for a 25-30% pay cut to go back. Seems to me that programmers who know C and assembler and can write tight code, know OS internals, and can interface to hardware are almost as extinct as IBM mainfr

Correct! There are many people who know how, but when you compare it to the projects where Ruby (or Java, or...) is used, what's the point? Headache and less pay. I'm not on the level where the microcode people are (next level, more headache), but I have done my share of low-level coding (at the time when systems programmers were real) for anything from controllers and channel drivers to multi-processor memory managers and schedulers. Once you get it, you can code to any hardware. But who would like to do that with what is

The new graduates are uncomfortable with: "Klingon multitasking systems do not support "time-sharing". When a Klingon program wants to run, it challenges the scheduler in hand-to-hand combat and owns the machine." They have to use Java, schedulers, VM protection, etc.

On a more serious note, to do real embedded programming you need to know data representation inside and out, because you tend to manipulate your data directly; no band-aids allowed. Until embedded systems support the band-aids used in today's colleges, it will be a profession for the myopic geeks with grey ponytails or the ones who are well on their way to well-developed pattern baldness.

That's what I rejoined school for. My dad is an EE, and does embedded work for vehicle systems. My granddad was an EE, and did the early work on embedded missile guidance systems. He was a ham operator also.

I'm also a ham operator, program in C fluently, and have a knack for getting electronics to do what it wasn't intended for.

I'm going for my EE at IUPUI. I'm also learning Japanese, and also working on my commercial FCC license.

My plan: there's a large amount of Japanese factories located here (Columbus, IN

I wish I could jump on something like that too. As it is, I have a CS instead of an EE, and no real embedded experience. I did want to pursue embedded systems early on, but as my first job was a .NET-related job, I'm pigeon-holed into .NET jobs.

As it is, I'm looking at doing an English-teaching-in-Japan stint. I wholeheartedly agree with your sentiment.

This is a hard lesson to learn in technical/engineering jobs: be extremely careful what job you first take, because it will go a long way in determining what you do for the rest of your career. Employers always pigeon-hole people based on their previous experience. So if your last job was a .NET programming job, your next one probably will be too, because it'll be extremely difficult to convince anyone that you'll make a great Linux kernel programmer, for instance.

Maybe as a CS. I'm an MAE (Mechanical and Aerospace Engineer) and I've gone from a simulationist (6DOF modeling and simulation) to a guy who writes seeker models and guidance models to working for NASA as an aerothermodynamicist (heat transfer on rocket re-entry). In under three years' time. The key is to keep on learning. Get that master's, and don't stop till your business cards say "Ph.D." on them. And learn outside the classroom too. :) It's good for you and good for your career.

Employers always pigeon-hole people based on their previous experience.

Not so sure about that, I'd written loads of software for PCs - interactive graphics on an 8086 including some assembler - in my sandwich year (internship) and when I got my first proper job did they put me in their PC dev team? No, they put me in the mainframe division, doing C080|_.

>> In fact, a large number of CS majors apparently believe that everything can be
>> implemented in a virtual machine and that both memory and [CPU] cycles are infinite.

Good for them; Alan Turing believed it too. And now it is a near reality. Memory and CPU cycles are abundant and do not play much of a role in slowing down today's programs. Incompetent programmers who don't know how to program IO, or how to make their code small, slow down today's programs. As for virtual machines, there is no

And why do the incompetent programmers not have a clue how to do IO right, or how to write efficient, compact code? It doesn't matter that they might have gotten As in their algorithms classes; they've never had the point driven home, because they've never had to write programs for machines that run fewer than several billion instructions per second from a store of gigabytes of memory.

I'm neither surprised by this nor do I necessarily find it something to get up in arms about.

There's not really time in a 4-year degree (well, along with all the other crap that goes into it) to teach someone the kinds of things you need to know to be a good business application developer and also the kinds of things you need to know to be a good embedded applications developer.

A good embedded developer needs experience with languages that run "close to the metal" of the machine, needs to know how to manage memory, needs to know how the machine architecture works, needs to know how to perform optimizations within that world, etc.

A good business applications developer needs experience with languages that abstract or hide a lot of the above details in order to let them focus on business logic, needs to know a decent amount about databases, needs to know about software architecture and design patterns, needs to know about networking, generally needs to know something about UI design, etc.

Yes, there's some overlap.

Speaking as someone with a college education emphasizing the former and a career emphasizing the latter, I'm not convinced this is a terrible thing. There are a lot more business style applications that need writing in the world than embedded applications. That specialization and the need for it I don't see going away any time soon, but it's the exception rather than the rule, and I'm not convinced there's something holier about understanding the guts of the machine than in understanding how to design a complex system for extensibility, maintainability, high availability, or whatever best suits the project.

Actually, there is time - it's just that most students decide to skip some of the technical classes for an "easy A" class and there are limits to the number of courses that a program can require students to take.

Since students are supposed to expand their horizons, most schools have a limit on how many credits can be required by a specific program. This means programs require their "core" classes and then require students to take 3 out of 5, (or however many, in some sort of pattern), of the other courses

Sounds like your undergrad experience was pretty different from mine (UIUC). I didn't really have any flexibility to pick 'easy A' classes per se -- I mean, yes, I did have to take some humanities / social sciences / etc. classes that were generally easier than my math / physics / engineering / CS classes, but I also needed X specified amount of each of those to graduate. All my technical classes had to fall within a fairly narrow set of choices and roughly zero of the ones I took were applicable to what

That is to say, after picking a set of classes to take over 4 years that would satisfy all the requirements, there were few if any additional classes necessary to make up the hours required for graduation, thus, not much latitude to pick classes for interest or grade padding outside of that list.

The reason for Java/C#'s popularity isn't because of the VM, it's because of the huge accompanying frameworks that allow for rapid development which is in most cases much more important than efficient cpu/memory usage these days. Build one of these frameworks for C/C++ and you will find it much easier to compete with newer langs.

And if you use the Linux/Unix platform as it was originally meant to be used, most of your library work is already done. You build C programs to handle the real data crunching, and bash scripts for data-passing. One has all of this running in a console, and the console hooked to an X program which simply passes GUI data. One can change the GUI without touching the engine, and vice versa.

Unlike Windows, where one doesn't know exactly what the system is doing, hence we require trials and testing. In Linux, we know what does what,

C# and Java are inherently easier to code in. This is not due to the framework; it is due to things like the more difficult memory management in C/C++, the more difficult-to-use container classes and the arcane nature of C++ template syntax. I'm no Java bigot. Quite the opposite: I've been doing C/C++ for twenty years and have only a year and a half of C# and Java combined. My experience with C# was that I could code four times faster and ended up with a program that used four times the memory and ran at

Unfortunately, it also boils down to time to market. As you stated, you code four times faster, and with a decent framework and unit tests, you can get working code out the door four times faster (presumably). To my clients at least, that's worth way more than saving some clock cycles. No one really cares about that stuff unless you're working within some serious hardware limitations. They just want the stuff out there and working and earning. It's all about the bottom line.

Except that we are talking about embedded here, the kind of environment where the closest thing to a framework is called the BSP (board support package, basically a set of drivers), which is specific to each variant of chip/board you might encounter. Anyway, most of the time the requirements are also so specific/tight that you end up rewriting your most critical parts over and over anyway. Embedded is not the kind of environment where you seek rapid development; it's more the old-school mentality of spending we

You must be daft. Apparently you are too anxious to plug your own agenda rather than actually listening. Qtopia [trolltech.com] has been around longer than you obviously realize. I ran it on my 200 MHz ARM Zaurus PDA... as fast/complete as you could want.

Your embedded system is a 200MHz ARM that can be recharged at the wall every few days as necessary. Mine is a 100MHz Zf86 that depends exclusively on solar power, has no VGA card at all, has only a 115200 baud serial port and 11-megabit wireless for communication, and would use up a significant chunk of its 32MB RAM just loading the client-side X11 libraries. Hell, /usr/lib/libQtGui.so.4.2.1 (stripped) by itself is 7 megabytes. Turbo Vision OTOH (which BTW I am NOT a developer for, I just use it sometimes

I'm an old school UNIX hacker, I worked at Bell Labs in the '70's. These days I hack software that controls robots used for rehab of stroke patients. It's not exactly embedded, since the code runs on an Ubuntu/Xenomai COTS PC, but it's similar in nature to embedded hacking.

The relevant question is, how many embedded-system hackers are needed? If only 0.1% of job opportunities are for embedded-system hackers, then there really isn't much incentive for people to learn to hack embedded systems. If embedded hacking is a lucrative field with attractive opportunities, then hackers will follow. We saw it happen with other forms of hacking; we even saw it happen with web-page hacking. If there is a need for embedded hackers, it will be filled (guided by Adam Smith's Invisible Hand).

That's what I'm thinking. If you train people too much in niche fields, they're more likely to be attracted to these niche fields... and then you have market saturation. There's a reason why every time there's an article about IT jobs, you have a billion CS majors on Slashdot posting about how they can't find one (even though the market is currently starved and needing developers like never before): too many people trained in things that simply don't have a market. It's nice (and even a must) to know t

Well, while it is a niche market, there are many industries that use some embedded developers. I've personally worked in that field for mobile phones, medical imaging equipment and police equipment, and I never had any difficulty finding a job. I also know people doing embedded dev for cars, commercial and military planes, trains, highway toll booths, Visa card readers or home appliances. So I would say that there are a lot of niche markets, but it usually requires a different mindset than the regular CS j

Heh, I could probably keep a dozen of them busy right now! Wouldn't be Slashdot if I actually RTFA, but I do know from my own experience that part of the reason students aren't learning embedded concepts is that there aren't a lot of instructors who are willing to go through the pain required to teach them.

To demonstrate an embedded concept, oftentimes you want to build a working model. And as soon as you bring hardware into the room, there's the chance it could fail and you'd look bad. Or you'd get a q

Yo. (Hopefully) future embedded software engineer right here. Currently majoring in Computer Engineering with a CompSci minor.
I actually just enrolled in an embedded applications class this semester. Seems like a popular class, too - offered fairly often, and good enrollment each time from what I hear.
Even though the majority of the CS classes at my university are Java-based, there're still required architecture classes that use assembly and C, along with several EE courses. Just because many colleges ar

That's right. EEs are the teachers of the embedded discipline. We have to know algorithms from CS, pure mathematics (enough for a minor at least), the physics of light/energy/magnetism (as with math, enough for a minor), circuit design, OS design in theory and practice, and a bunch of other things.

All together, that's what an EE is. If it's not on fire, we can program it. ;D

I taught myself to code on an Amstrad CPC 6128 (128k of ram in two 64k banks, z80a processor - so not exactly "embedded".. but optimization was important), but since then it's been VB, Java, and Delphi.

I'd love to get my teeth into something embedded. But what? Can anyone recommend a fun device to play with?

'As for today's CS programs, it seems that long gone are the computer architecture classes, writing code in assembly language (or even C at this point) and engineering software economics'

And IMO, they still teach these TOO MUCH. From my personal survey (probably meaningless, but I need a point of reference), they still teach that quite a heck of a lot in schools. The demand for people in these fields is significant, but still quite low. It is a LOT easier to find someone who knows how to c

The desktop machine I learned to program on was an Acorn RiscPC. Later in life, I helped write the embedded firmware in the Rio Karma portable MP3 player. Rio Karma had two 90MHz ARM CPUs, 16Mbytes of RAM, and 20Gbytes of disk. That would have been one kick-ass RiscPC. If anything, programming and optimisation techniques learned on late-80s and early-90s desktops are too "embedded" for a lot of today's embedded programming.

And to those advocating hiring EE graduates for embedded programming: unless your device isn't far north of a toaster on the embeddedness scale, please also hire people with broader or higher-level software expertise for system design and architecture. It's not the same skill as squeezing one more instruction out of a loop, you find people with one skill and not the other, and any medium- or large-scale software project needs both skills (whether in the same person, or through good collaboration).

People go on about the spectrum of computing, software/assembler/logic/gates/transistors; what some don't realise is that there's a spectrum even within software. Some really good, tight, expert coders just can't see the bigger picture. I've seen medium-scale software architected by the toaster contingent: it wasn't pretty.

I more or less drifted into embedded systems work from a desktop/server background. I frequently point out to people that there's a lot of confusion about what an "embedded" system is. Some people use the term to refer to 8-bit microcontrollers that have a whopping 4K of memory and no persistent storage. Others use the term to refer to 32-bit devices with 2-4 MB of memory and 256 MB of flash for storage. There's yet another

Almost everyone is focusing on the small memory and storage of embedded but leaving out the other key part. An embedded device has no rescue disk. On the production device, you turn it on and it either gets at least to a useful state for diagnostics and firmware reflash or it's a paperweight. When developing, the dev environment may include JTAG, serial, various debugging interfaces, perhaps a dev board with video (sometimes), but in other cases you have a debugging port (more or less serial) and a romulato

There is not a huge business need for this type of programming knowledge, so guess what - it's not taught as much in school.
There are still plenty of people interested in this stuff. We regularly see articles about developers from all over the world working on the Linux kernel. Doesn't that require knowledge of computer architecture, assembly and C programming skills? The OSDev scene is full of people like this (see my homepage). I'm sure there are lots of embedded hobbyists that have regular coding jobs

I certainly would have liked to do embedded programming. I really enjoyed my OS and Embedded Systems classes in college. As a result, when I was in the market for a new job, I was specifically looking for embedded systems work. The problem was that pretty much any job wanted several years of experience, and greatly preferred an EE or CompE degree. I had a CS degree and no formal experience in embedded. Any experience I had was through my classes, as my personal projects tended to focus on other areas.

Forget that A/C fool down below. My advice: learn as much as you can as a side hobby. Have some code to take in to show them. Know the toolkits and environments. Show them that not only are you willing to learn, but that you have already taken care of the "Hello World" stage and are moving past the trivial. Come equipped with real questions about your side project if you are stuck on something.

Then, creatively craft a resume around these skills (but again, don't lie). Be honest. Tell them you are will

After 17 years of writing application code, I just got sick and tired of sockets, threads, and shitty toolkits like MFC, GTK, and Qt. Not that these constructs will ever truly disappear from my life, but Christ, that got old.

So I took the plunge, dusted off my (very dated) assembler and architecture texts, revisited a couple of device drivers I wrote in the course of getting my master's, then convinced an employer to hire me at my current salary level to write device drivers and learn FPGA programming (Verilog

As someone who finished my BEng in Computer Engineering this year, I have to ask: where are the embedded jobs?

I hunted around looking for any assembly/VHDL/C job I could find and only found one which didn't require 2 years' experience, and they didn't get any message back to me until 2 months after I'd applied. I applied to roughly 35 "entry level" jobs dealing with embedded devices. None were interested because I lacked two years' experience. In the time it took the only one to get back to me, I'd been contacted by a C++/Java company, had a phone interview, gone up and been personally interviewed, and been offered the job.

The real kicker is the embedded jobs paid less, didn't mention any benefits and wanted more work out of me. I think the real issue is most companies aren't willing to invest in graduates unless it's for business positions. It is difficult to get 2 years' experience when everyone demands 2 years for their entry-level jobs. Which leads to a shortage: since no one gets offered an embedded job, universities don't bother going into too much detail, since 99% of their students will never use it.

You have to realise how recruiting goes... Aside from the college "job fairs", jobs that can be offered directly to students won't be posted. The jobs that get posted are the ones that cannot be offered directly to a student. Simple as that.

So there probably were a lot of companies "willing to invest in students". They just contacted the students directly looking at their resumes, hired them up, and you never heard about it... Happens all the time.

I currently have an embedded Linux job. Note that I have over 25 years of programming experience in general, though not so much in embedded systems specifically. I'd gotten rather badly pigeon-holed as a Windows programmer in the last decade because my last several jobs were Windows programming--very frustrating, because I detest programming in Windows and very much wanted a Linux/Unix/etc programming job. Unfortunately, when your 20-year-old jobs say System V Unix and your 5-year-old jobs on the resume say Windows, that's all anyone sees.

That's actually fairly common. I mean, let's say I want to hire someone for Ruby on Rails or ASP.NET... they SERIOUSLY don't teach that in any college in the mainstream... so if you want someone with experience with it, what are your options? The only ones I can think of are:

A) The person was an XYZ programmer and got hired as a RoR or ASP.NET dev because of extensive general development experience, or experience in the field... so they got experience in those technologies from scratch in the workplace.

I think a large part of the problem is the job market. Most job postings go by the "X years of Y" rigamarole, with very few people looking to hire what they should be looking for: talented new blood. I am one of many, many computer engineers I know who opted to go into programming rather than suck down some totally shit entry-level job for lack of on-the-job experience, even though I've been building microcontroller systems for years. I've heard that outside the US the embedded market is a lot less corporate.

The guy who wrote the article is either an idiot or is outright lying. The reason so many people stay away from the kind of degrees he wants them to get is that there are very few jobs in those fields, and the jobs that do exist are poorly paid when you look at them on an hourly basis. Not to mention that the smart young people he wants to con into being his cheap labor are smart enough to look around and see the thousands of unemployed and unemployable 40-something engineers.

What happened to the embedded developers? The industry got rid of them...

So if it doesn't have any now, then it really can't look elsewhere to blame anyone.

I started out as an R&D engineer working with video game technology, but essentially it was all embedded work... I lived and breathed machine code and logic; to me, software and hardware were one and the same, a symphony of technology with blurred distinction between the two. I remember sitting down with six spare GAL16v8s, a couple of low-power walkie-talkies, and a spare afternoon, and building myself a radio modem for fun. That was the sort of work I used to do.

But there weren't many people like me. Assembly programmers were hard to find, even back in the 80s, and most engineers fresh out of university just didn't know how to write real-time code in assembly language properly - didn't know how to write fault-tolerant code or build a spinlock as the starting point for an application. Didn't understand the necessity of knowing how many cycles an instruction took, or how to watch for errors by measuring the duty cycle of the interrupt pin with a logic probe.

So the people who employed hardware designers (back then, if you knew machine code, you usually had a hand in the design of the system as well) found that it was difficult to get replacement engineers. As a result, they couldn't hire replacements at similar salaries, and as the old engineers got tired of being mistreated and poorly paid, they simply left and went off to do something different.

Industry responded to the lack of engineers by eliminating the need for the machine code engineers - they moved away from embedded design in assembly to embedded design in C, or even to outsourcing the product they needed, and the hardware got designed by dedicated hardware engineers.

Once again, any real skill in the area was lost as employers wouldn't pay for experience and the best engineers realised they would never be paid what they were worth, so left to do something else.

Then the industry got around this constraint by using really powerful embedded devices - basically a complete PC ready to run whatever PC programmers could write for it.

That's where we're at now. The skills left the industry because the industry wouldn't pay what they were worth... If you can make more money at another job (in my case at the time, selling PCs and journalism), then why would you keep on developing hardware for a company that doesn't want to pay what you're worth?

I'm seeing the same thing now in Network Analysis... The world is full of network technicians (and I include many people who consider themselves engineers in that description, but don't really know how to actually measure things or understand the technology they work on) but has very few network engineers.

The solution for me? I got smart and moved to management.

I'm a lousy manager (really, I suck at it), but I try hard, and for once my contribution is financially recognised by the company I work for... And I have a family to look after.

Would I even go back to engineering or even embedded engineering?

I would love to go back, I really would. I can sit in front of circuits all day and build something and I enjoy every second of it, but I can't afford to do company critical work that won't feed my family or pay my bills.

So unless the industry is prepared to pay for skilled people, and by pay I mean pay them more than they would get being a manager or an accountant or even a journalist, then they will leave.

The other embedded engineers I know all did the same thing... One works on an offshore oilrig, another as a miner, one went to a call centre. One even opened a grocery business. These are all smart people and although they all miss working with electronics and embedded designs, they have families to feed too.

If you want to do embedded systems development, it's cheap. If you know what you're doing. You can get an Atmel ATMega128 board with a little LCD display and a few pushbuttons for about $50, a JTAG programmer cable for about $20, and a complete development environment with simulator, debugger, and C compiler for free. Even C++ works; GCC supports the thing.

Unfortunately, you can't get all this stuff in one box with a nice little "Embedded Development for Dummies" book. There are environments that are well supported.

Right now I'm looking for entry level positions doing embedded systems work. This is what I want to do. The problem is that every job position I've looked at wants at least 2 years of professional experience. If you look up the article author's company, his most basic level positions say "The candidate should have a minimum of 2-5 years experience of embedded software development" and their FAQ page says "We are pretty fussy about who we hire. That is to say that if you are a true real-time, or embedded, ..."

If you want someone to program the low-level guts, look for an EE with some programming experience, or a Computer Engineer. CS programs just aren't designed for that kind of work.

My company understood this ten years ago. I'm an EE major with a computer "concentration", which is typical of who we were hiring back then for programming work; we hired very few pure CS majors at the time.

When I started, we made home-grown OSes (or lived without them!) and had to really try for efficiency. Now, with modern processing power, we run Linux or an off-the-shelf RTOS, and program applications in much the same way as a PC programmer. Aside from interfacing with the peripheral hardware, we don't do much that's uniquely "embedded" anymore, and efficiency only really matters in a few specific areas. So now we can hire CS majors to do most kinds of work.

What truly time/space-crunched design we do have mostly occurs in the specialty FPGAs, which are still largely the domain of EEs.

... large number of CS majors apparently believe that everything can be implemented in a virtual machine and that both memory and [CPU] cycles are infinite.

Not an embedded system, per se (more like a "headless system"), but I was involved in a project where certain aspects of the design (aspects that I had no input into) more or less assumed infinite memory and CPU.

But when implemented, the design SUCKED because it ran like molasses on a cold day. Everything that could be done to increase its performance was tried, and in the end its performance still sucked. The whole thing had to be re-designed, and rebuilt from scratch. The original designers (C.S. degrees, of course) howled about "engineering hacks" the whole time.

The final design was uglier on paper, but it ran several orders of magnitude faster than the original.

Yet CS is the department in which programming is taught. Hence the tendency to equate the two (CS and application programming). Not long ago, "computer science" didn't even exist: it was all taught in math departments. Once it became quite clear that most developer types can't handle vector calculus, that changed.

Yet I wonder: what is computer science without the ability to actually program the computer? Discrete math, algorithms, and some theory. Interesting, but (not quite) useless.

Maybe it used to be that way. For all I know, it might still be that way in a few places. It's pretty silly for a school to expect that of their students, though, as that's not what the industry needs. You should pick math as a minor if that's something you really love. There are always plenty of jobs for people in various math-centric areas like DSP, and if that's your thing, more power to you.

This is a troll, right? A subtle reversal of the old "tell people to go get vocational training if they want Java monkeys" line?

It should be self-evident that it takes more knowledge, skill, and ingenuity to write a correct program in a low-level language and have it run sufficiently fast on underpowered hardware than to write a correct program in a high-level language on leading-edge hardware.

On the flip side, unless you fully understand the hardware, how can you write good optimised C code? In C you might use a long (assume long is 32 bits for now!) to hold a counter. But the assembler programmer will know to stay clear of longs, because he's programming a 16-bit CPU, or maybe because longs take a performance hit.

It seemed to me that a lot of the "computer engineering" degrees were mostly for IT people. Admittedly, this comes from back in the 90s, when there were only a few departments with this name. Has this changed? I felt like my Computer Science degree (which was the only game in town back when I was in school) gave me a pretty good background for the working world. Admittedly, I also took some electives that focused on digital electronics (including building a computer from chips and then programming it).

You're confusing computer engineering with software engineering. Computer engineering is a hybrid software/hardware degree: depending on your electives, it's like minoring in EE and majoring in CS, or majoring in EE with an emphasis on digital circuits and minoring in CS. Quite frequently with tougher versions of some courses: the asm course taught to CS students at UIUC was a joke taught on simulators. By the end of the ECE asm class, we were programming video games on x86 without using external libraries.

It's an actual no-BS engineering degree; they study the engineering core curriculum for their first two years. Your comment about majoring in EE with a digital emphasis and minoring in CS is pretty close. Majoring in CS and minoring in EE would be a cakewalk in comparison.

The GP is just plain wrong regarding CompE being an IT track. IT is where the bottom half of the CS students wind up along with a lot of two year tech school grads.

Sorry to be a cynic, but learning these things in classes does not make you an embedded developer. It is experience in a variety of fields that makes an embedded programmer. For what it is worth, experience in that variety makes you a programmer for nearly all fields. Let me give you a simple example: programming an interrupt service routine is probably one of the most difficult things to do the Right Way(TM). When things get smaller, it gets harder. That is where the experience kicks in and you "see" how it should be done.

It can be hard to get real experience when all the jobs out there want several years of experience in the field. Of course you can do personal projects, but it's a coin toss whether or not the company will consider that actual experience. Most HR departments I've dealt with would consider 1 year of embedded experience plus 3 years of embedded experience through personal projects as 1 year of "real" experience.