According to Wikipedia, around 32,000 people die each year in auto accidents in the United States. That's roughly 90 per day, give or take. To date, as far as we know, one of those people has died in a Tesla while its "autopilot" mode was engaged.

Naturally, people are furiously debating self-driving cars and automation. The National Highway Traffic Safety Administration (NHTSA) is investigating the crash.

The information we have right now suggests that the driver was pushing the envelope, recording videos with the autopilot engaged, and may even have been watching a DVD at the time of the accident. (The "autopilot" is, as I understand it, not meant to be fully self-driving; the driver is supposed to remain engaged at all times.) The accident seems to have been a perfect storm: the car's software may have interpreted the tractor trailer as an overhead sign, and the driver of the tractor trailer appears to be mostly at fault. The autopilot simply failed to correct for the situation.

Everything about this situation was predictable. We know that the software for these cars will be imperfect. It will improve drastically over time, but it will always be imperfect, because humans are imperfect and the real world is damn hard to predict anyway.