I appreciate the repetition for effect - it works to drive home critical points. I would, though, like to see a little consolidation, and maybe introduce a half dozen slides (interspersed over three days if needed) on the particulars of development boards and RTOSes to solidify some key points ... altogether, a worthwhile class from a knowledgeable instructor.

If development boards are indicators, I'm thinking some vendors may want to hire me to architect, design, and implement boards for them ... cheaper than an applications engineer running around from client to client ...

Embedded Linux -- I have actually configured Slackware 7 through 10 for embedded use, as I said the other day.

Will look at it again shortly with a high-end ARM -- just want to finish current projects first. Maybe a Raspberry Pi -- just about bought in -- too much on the go -- and then there is the $200 for a real working system for my purpose.

@KentJ: if you're coming from hardware, you stand a better chance at being great at it. You know all about Xc, etc., so just understanding how to reach registers is a breeze for you. Unlike us programmers...

@UART I can't architect a firmware solution without flow diagrams. I need to think it through at a higher level before I start coding. Maybe it's just me. BTW, I've done medical products too. Life-safety products, and then your everyday thermostats.

@gaun I can see that as being useful. My problem has always been that each new project differs too much from the last. I might pull up code I wrote 5 years ago and revamp it, but I've never had the chance to work on consistent enough projects to reuse code at the library level.

I've been turned down (or not even interviewed) for not having a degree many a time. I've always looked at it as I wouldn't want to work for a company who sees the value of its people as a piece of paper first, and accomplishments second.

@gaun Not really. It seems every new project is a new project. I sometimes reuse algorithms I've written previously (always with an eye for improving them) but have never taken the time to structure them into a library.

Thanks for the lectures, Bill. I'm off to class. Teaching a fledgling group of students how to take over their world with code.

Guess no one wanted to touch that question about my 2-year program. I know... there still is that stigma in the HR departments. The serious issue is that most of my brightest students don't want to, or can't afford, a 4-year school. All that talent is going unused. I employ a few of them on some embedded projects that are on government contracts, but I'm not ready to hire them all.

What a shame. Embedded programming is one of the few sciences where you can be self-taught and still be a professional.

@Bill I do work for a family resource company -- am the Science Officer -- they claim I cast spells to make machinery/scientific equipment work... maybe Luminary would have been a good home if they are that strange... rotfl

@DaveWR LOL! Ain't that the truth. I've recently returned to a company I previously worked at for 13 years. All the products I designed previously are still in the market with no issues. So what do I do now? Fix all the products that were released after I left! Sheesh! It takes not only an understanding of code or design, but a fundamental understanding of the market needs to implement a proper user interface, or to avoid the pitfalls of "when things go wrong".

@Bill For the record, most people think of me as an EE. I just cover all aspects of it. That's what happens after a few (ha ha) years of doing it. I design the hardware, system architecture, and firmware architecture, and do coding depending upon the specific project requirements.

For learning Ethernet and USB, try the Microchip 32-bit 500- to 700-series microcontrollers. They have free stacks for both that you can download, which makes it easier to learn. Also, those microcontrollers have a built-in MAC address, so you don't have to buy one.

@BillGioino: That's encouraging. OK, I have a serious one for everyone. I'm in the process of designing a 2-year degree specific to embedded programming and design. No, it's not a 4-year degree... I know. What it does is cut out most of the B.S. that students have to go through to get to the meat. Sad side note: most CS students that I see in our state don't touch a word of code until their 3rd year! So the idea is to cut out those first 2 years and go straight for the gusto. The question is: how attractive do you think that would be to employers?

If you are trying to teach down to the hardware interface level of the controller, then assembly is a must. It's the best way to show the data pathways and how the internals of the part truly operate. If you are approaching the microcontroller as a "black box", then it is not required. In those cases, assembly would only be needed for some special optimization. At least that has been my experience so far.

Very true, especially if you want to optimize your code. It's even more important when you are running out of memory - write your own optimized assembly-language routines.

Another question I have is: What does the employment field look like for students who are capable programmers and decent embedded system designer/programmers?

Very good. Build a few projects that you can use to show your experience. I suggest learning USB and Ethernet, as well as WiFi. Build some projects with these technologies and put them on your resume.

We look at each new product development from two angles: can it capitalize on previous work, and does it need new capability? Then we always rescan the market to see what new parts are out there that might best fit the need. That's why I've used Microchip, TI, Renesas, and Samsung, to name a few.

I am a professor of Computer Science, but I'm trying to blend microcontrollers into the mix. We have had great success with C, but I'm wondering how much assembly is really needed. Is it important for my students to know?

@gaun, to tell you the truth, I don't remember now. We didn't buy the processor directly; we partnered with the manufacturer of the radio module in creating the application-layer interface so we could embed our code into the same chip. The actual processor was one of the TI affiliate chips specifically used in the cellular market. This was 4-5 years ago now.

@UART and Bill, I'd agree. When TI first introduced the MSP430, I came across a couple of "undocumented features" that I had to implement external hardware solutions for. Later TI introduced internal brown-out circuitry, fixing the issue.

@Bill: Actually yes, the past program we made had some troubles. The main application breaks sometimes and does not support as many tasks as we expected... That's why I don't want to use the same micro and operating system.

@gaun We ended up purchasing a radio module, so I'm sure there were multiple dies contained within, but yes, there was only one processor in the module; it contained the RTOS that ran the radio code, with our application code sitting on top.

At lower voltages, yes, due to line noise. Back then we weren't dealing with the speeds we're dealing with today, not to mention the registers. Of course it's almost a catch-22, but if you weigh it all out I'd say yes.

Our first pass at embedding a cellular radio in a product had a separate processor for the radio and the application. Later we embedded our application code into the radio core (a major cost reduction). In this case they were both TI parts.

I've been working at this company for about 2 years. I've done too many different tasks, so I don't have much experience developing new products. Now we're developing a new motherboard for all the products we sell... We're using an S3C2440 microcontroller and WinCE 6.0 as the operating system.

I think we could use better resources (micro and OS), but I don't know what kind of micro and OS to suggest... Any ideas on where I can start looking to see if there are better resources we could use? I mean, what kind of micro and operating system would be better than the ones we're using now?

Interesting. If you are using WinCE, you definitely need a 32-bit part, which you have with the S3C2440. Make sure you are maxed out on memory. Are you having performance problems with that board? Is the UI lagging? You would probably be better served with a main 32-bit plus a 16-bit to do the user interface.

@UART, I agree. The only exception I've run into is when you have a process-intensive subtask that is best delegated to a slave processor, so your main processor can focus on the top-level application. I don't run into it as much nowadays, but I can still see it happening.

The soft core microcontrollers for FPGAs could be a whole week, certainly more than one 30-minute presentation. Include soft and hard core processors in FPGAs. Advantages of each. Comparison of those to microcontrollers, and best applications for each.

@UART, I basically agree with you, but the question was which single one. If I were to teach with the idea of covering as much ground in the least time, I'd start with 16-bit. There aren't a lot of architectural differences between 16- and 8-bit, while there are between them and 32-bit.

On 32-bit "successful" architectures: IBM System/360 and its 32-bit successors (loosely the base of the Intel 32-bit family), the Intel IA-32 32-bit version of the x86 architecture, the DEC VAX, the Motorola 68k, and the 32-bit versions of the ARM, SPARC, MIPS, PowerPC, and PA-RISC architectures. 32-bit instruction set architectures used for embedded computing include the 68k and ColdFire, x86, ARM, MIPS, PowerPC, and Infineon TriCore architectures. Seems like more than six.

Any comment would be helpful =)...

P.S. Sorry if my English is not so good. My native language is Spanish.

Intel might beg to differ with you on the x86 Atom in embedded designs.

The Atom is relatively new. It is still cutting its teeth in the industry, so it may be too early to call it a success or not. Especially since Intel has a history of quickly cutting loose architectures that aren't successful.

I started off on 8-bit and found it lovely, then moved to 16-bit. This helped me understand the direction the architecture was going. I would probably recommend both: 8- and 16-bit together, for more depth of understanding, IMHO.

For those checking out ARM for the first time, be aware that there are multiple ARM cores, each targeted for different applications. Even the ARM Cortex core has Cortex-A, Cortex-R, and Cortex-M variations. (note the "A" "R" "M" marketing slant).

Also, the definition of "embedded" can cause discussions that would drain more than one keg. Large communications infrastructure and military systems are considered embedded, yet you won't find most of the devices we are discussing in them.

Microcontrollers or EmbSys, I didn't know there was a difference. Microcontrollers represent embedded systems, or so I thought. Therefore I suppose Embedded Systems would be the core curriculum, and from there you learn microcontrollers, right?

Well, Embedded Systems is basically anything that is NOT user-programmable. So a PC, which is very user-programmable, is not an embedded system. Microcontrollers are used in embedded systems.

I think the "Internet of Things", putting things like light switches, thermostats, smoke alarms, etc. online, will push a number of traditionally 8-bit implementations to 16 or 32 bits because of complicated communication stacks like ZigBee.

Gaun - I couldn't locate the reference you mentioned on DSP for engineers and scientists, but I did find a tutorial on the Analog Devices website called "Mixed Signal and DSP Design Techniques." Thanks

From yesterday: the PIC24 does not have built-in Ethernet, but it works very well with the ENC624J600 PHY, also from Microchip. The ENC624J600 has a built-in buffer for TX and RX and works with an SPI or parallel interface. Microchip has examples of using the PIC24 in their Ethernet stack, and FreeRTOS has support for the PIC24 as well.

I was taking a microcontroller class this week and wasn't able to see your other presentations. I did get to see today's in its entirety. VERY much looking forward to seeing the other ones! Stuff like this is worth paying for. Thanks for the freebie ;)

Dev boards should be used to determine software library support for peripherals, especially complex communications peripherals (USB, Ethernet, graphic displays). Better to check things out with multiple dev boards (in the same vendor's family) than to wait until your PCB is fabricated.

Focus on Fundamentals consists of 45-minute on-line classes that cover a host of technologies. You learn without leaving the comfort of your desk. All classes are taught by subject-matter experts and all are archived. So if you can't attend live, attend at your convenience.