Car Hacking: NXP Pushes Flexible Security

MADISON, Wis. — Talking about the vulnerabilities of the electronics in automobiles is a risky business.

On one hand, as long as the auto industry hasn't yet experienced any real-life disasters as a result of car hacking, why even bring it up? Such talk starts to sound like "fear-mongering."

On the other hand, automotive security is a relatively new issue even for many people working in the industry -- chip suppliers, module developers, and, of course, carmakers. While reliability has always been high on vendors' minds, car security flew below the radar for a long time. For decades, cars weren't nearly as interconnected with the outside world as they are now.

But over the last few years the mindset of the automotive industry has changed.

For Dirk Besenbruch, engineer and group leader of Systems & Applications, Automotive, at NXP Semiconductors, a turning point -- the trigger for his work on NXP's automotive security solutions -- came when he read a 2011 paper by researchers at the University of Washington and the University of California at San Diego, commonly known among experts as the "Savage" paper. Stefan Savage of UC San Diego was an author of the paper, which detailed experimental analyses of automotive attack surfaces.

To be clear, the automotive industry didn't entirely dismiss the risks interconnected cars might face. Nor did it stand still.

Several automotive companies, including BMW and Audi, have gotten together to develop a spec called SHE (Secure Hardware Extension). SHE offers protocols for secure communication among different modules inside the car, explained Richard Soja, a distinguished member of Freescale Semiconductor's technical staff. Soja is responsible for the company's 32-bit automotive SoC architecture.

More than a few players also worked on EVITA (E-Safety Vehicle Intrusion Protected Applications), an EU-sponsored project, to create "a set of guidelines to allow manufacturers to satisfy security features," Soja told us. The EVITA project was completed at the end of 2011.

It might be a while, though, before an appreciable number of cars with newly minted security features hit the road, especially considering the lengthy (about five years) development cycle of a car.

Still, automotive security is a boon for semiconductor companies. It affords an opportunity to demonstrate security expertise, pitch the idea to add secure elements to cars, or even convince carmakers to replace current MCUs with completely new secure SoCs.

NXP realized early on that automotive security could benefit from the company's experience and expertise in developing a "secure element" -- successfully deployed in millions of smartcards. NXP's Besenbruch says his company's approach to automotive security is to leverage that "field-proven" smartcard knowledge and offer "separated secure elements."

NXP's approach creates a clear contrast to the strategy some competitors -- to wit, Infineon -- are pursuing. Infineon is redesigning the entire MCU to create embedded secure modules. While an embedded secure module might be a good solution for high-end cars, "changing micro [in its entirety] means getting locked into a certain type of MCU," argues Besenbruch.

NXP, in contrast, hopes to sell the flexibility of its separated-secure-element approach. Given the lifecycles and reliability demanded in the automotive industry, NXP believes its flexible approach gives auto companies more options to start protecting against certain attacks sooner.

I came across an interesting company recently that may offer a solution -- albeit a bit extreme -- for securing the automotive IC supply chain, Junko, particularly in the context of the MCU.

I was talking with Olek Cymbalski, owner of OPC Technologies (www.opct.com). He described a service that secures the supply chain by taking in the MCUs to be used in a particular design, programming them here in the US with the required code, removing the ability to recode them (securing them), and then shipping them to the production line -- anywhere in the world. This, according to Cymbalski, takes the programming details out of the engineers' hands, while at the same time ensuring the ICs aren't tampered with on the way from the MCU manufacturer to the production line.

There aren't many ways someone could connect to your car... actually, none. The only way to hack your car is if you do it yourself or have someone do it for you, since you would have to physically modify it. It looks like the real issue here is that the car companies don't want you to be able to make modifications to your car.

Reminds me of when I was waiting for some friends after work at a brew pub in Austin in 1999. It was crowded, and some women offered to share their table with me. One of them was explaining that she had quit her job and started a company to fix the year-2000 issue with cars. She was convinced that cars would stop working on 1/1/2000. She said it was the microcontrollers. I was a design manager for microcontrollers in the automotive division at the time. I told her that there was only one microcontroller in the car that knew what time it was, and it didn't know if it was a.m. or p.m., much less what year it was. She wouldn't listen to me, so I moved to another table. I wonder how her company did.

I've read the Savage report you mentioned with a lot of interest, as it's my job to design such electronics. I designed one of the first MP3 players for car radios in Europe (OEM and aftermarket). I can tell you the type of attack (MP3 buffer overflow) mentioned in the report is simply impossible in that case, because the MP3 decoder was hardware. I guess it's possible to do a buffer overflow with a software MP3 decoder, but I seriously doubt it could be used to hack the car itself (maybe the car radio alone, and even that would be quite time-consuming for poor impact). Was there a real demo of what they could do on an unmodified car with this type of attack?

What makes me think it's impossible to hack a car from the car radio is this: the only network between the car radio and the rest of the car's controllers is the CAN bus, often through gateways (the body network is physically independent of the engine network). CAN reliability is based on hardware message filtering, so a controller cannot be flooded via the CAN bus. It's part of the validation process of any well-designed controller to check that it cannot crash because of a CAN bus overflow -- not out of fear of hackers' attacks, but because a controller could go crazy on the bus and flood it (this kind of bug has already happened in real life).
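The hardware message filtering the commenter describes can be sketched as follows. In real silicon the comparison happens inside the CAN peripheral, so frames that match no filter never raise an interrupt or consume a receive buffer; the struct and function names below are hypothetical, not any vendor's register map.

```c
#include <stdbool.h>
#include <stdint.h>

/* One acceptance filter: an identifier pattern plus a mask
 * selecting which bits of the incoming ID must match. */
typedef struct {
    uint32_t filter; /* expected identifier bits */
    uint32_t mask;   /* 1 = bit must match, 0 = don't care */
} can_acceptance_t;

/* Return true if the incoming frame ID passes at least one filter.
 * In hardware, a "false" result means the frame is silently dropped
 * before software ever sees it. */
static bool can_id_accepted(uint32_t id,
                            const can_acceptance_t *filters, int n)
{
    for (int i = 0; i < n; i++) {
        if ((id & filters[i].mask) ==
            (filters[i].filter & filters[i].mask))
            return true;
    }
    return false;
}
```

For example, an engine controller configured with the pair `{0x100, 0x7F0}` would accept only the 16 identifiers 0x100 through 0x10F and ignore everything else on the bus, which is why a radio flooding the bus with unrelated IDs cannot exhaust that controller's buffers.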

Today I design engine and body controllers for different car manufacturers, and we have had security schemes in the bootloaders for about 10 years or more. They are mostly based on encrypted keys to allow calibration changes (it's easy to do a buffer-overflow attack with a calibration change) and updated software downloads. There are also CRC checks and the like (not to mention key(less) authentication). I know some people could get around these, mostly because of the weakest link: the garages. We need the ability to update the software of the most important controllers in the car; it's a requirement of the carmakers. These updates are done at the brand's repair shops, and those will always remain the weakest link.
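The key-based gate described above is often done as a diagnostic seed-key ("security access") exchange: the ECU hands out a random seed, the service tool answers with a key derived from the seed and a shared secret, and the bootloader unlocks calibration or download only if the answer matches its own computation. A minimal sketch, with a toy mixing function and secret standing in for the OEM-specific algorithm:

```c
#include <stdint.h>

/* Hypothetical shared secret, provisioned in both ECU and tool.
 * A real OEM secret and transform would of course differ. */
static const uint32_t SECRET = 0xC0FFEE42u;

/* Toy key-derivation function: XOR with the secret, rotate, and
 * multiply by an odd constant. Placeholder for the real algorithm. */
static uint32_t compute_key(uint32_t seed)
{
    uint32_t k = seed ^ SECRET;
    k = (k << 7) | (k >> 25);  /* rotate left by 7 */
    return k * 2654435761u;    /* odd multiplier, so invertible */
}

typedef enum { ECU_LOCKED, ECU_UNLOCKED } ecu_state_t;

/* The ECU recomputes the key from the seed it issued and unlocks
 * only if the tool's answer matches. */
static ecu_state_t try_unlock(uint32_t seed, uint32_t key_from_tool)
{
    return (key_from_tool == compute_key(seed)) ? ECU_UNLOCKED
                                                : ECU_LOCKED;
}
```

This also illustrates the commenter's "weakest link" point: any repair-shop tool that can answer the challenge necessarily holds the secret (or can reach a server that does), so the scheme is only as strong as the custody of those tools.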

The solution of making the controller chip non-reprogrammable (mentioned by Patrick) is not applicable in that case. For the controllers that don't need reprogrammability, we just use OTP (one-time programmable) microcontrollers, which are cheaper than Flash µCs and physically impossible to recode.

Junko, good for you and EETimes for surfacing these issues. I have to say, it is scary to read the comments on this article and your other one by some (I am assuming, by the fact that they are at the EETimes web site) knowledgeable and educated engineers:

The ones I refer to are those who are in total denial that cars being developed today are hackable, and/or who argue that if the car is hackable, why go to the trouble -- just run into it or cut the brake line...

Are none of these engineers aware of, or following, the massive outcry about the security holes in our existing infrastructures? Have they not followed Stuxnet?

Very sad and scary! It shows how much education (or how many quick retirements) is needed NOW.

But here's the thing. I have been told there are instances where users try to modify their own cars (or, in the case of car sharing, shared cars) to change the mileage. Some people also change engine parameters (say, manipulating a 100-horsepower engine into a 120-horsepower one).

Such manipulation of engine parameters can be done in software, according to my source. And such actions could directly affect the reliability of a car, for example.

Yes, that makes sense. The automotive companies have always worried about their vehicles being modified. They have made it harder for us to work on our own cars. Making a modification should void the warranty, but they worry about the liability if something happens due to the modification.