Founded in 1993 by brothers Tom and David Gardner, The Motley Fool helps millions of people attain financial freedom through our website, podcasts, books, newspaper column, radio show, and premium investing services.

Reevaluating some of Tesla’s choices after the first death linked to its semi-autonomous driving system.

This past May, a 40-year-old man was killed in a car accident while his Tesla's (NASDAQ:TSLA) autopilot system was engaged.

In this industrials segment of Industry Focus, Sean O'Reilly, Taylor Muckerman, and Motley Fool senior auto specialist John Rosevear explain what happened, why it happened, and how regulatory bodies like the National Highway Traffic Safety Administration are responding to the accident. They also take a look at some of the unsettling reasons that Tesla is the only company on the market with a system as advanced as its autopilot software, while other automakers in the space are still in the research phase.

A transcript follows the video.

This podcast was recorded on Jul. 14, 2016.

Sean O'Reilly: Just to rehash, a Tesla owner named Joshua Brown died in a collision on May 7th with an 18-wheeler tractor trailer down in Florida. But I actually wanted to take a step back. Can you take us to school and tell us: what the heck is autopilot?

John Rosevear: OK. Autopilot is essentially an enhanced cruise control. It's a set of several driver assistance systems that together work like an enhanced cruise control. It uses cameras, radar, ultrasonic sensors, and some fancy Tesla software. In Tesla's words, what it does is automatically steer down the highway, change lanes, and adjust speed in response to traffic. It's intended to, again these are Tesla's words, help the car avoid hazards and reduce the driver's workload. So, it's not really a self-driving system. It's kind of one step past the latest and greatest adaptive cruise control and lane keeping systems you might get on, really, any mainstream luxury car, and even some more mass market models. You can load up a Ford Fusion with most of this stuff, for instance.

O'Reilly: There's an idea right there. Did you read that Dan Sparks piece last year? He drove, what, 60 or 90 miles -- do you remember, Taylor? -- in his Tesla in Colorado?

Rosevear: I did. He's far from the only one doing this kind of thing. That's kind of the issue. Tesla will tell you officially, you're not supposed to take your hands off the wheel. But Tesla owners are early adopters--

O'Reilly: Just check YouTube. (laughs)

Taylor Muckerman: Yeah, I was going to say ...

Rosevear: I was just going to say! They have access to webcams! (laughs) It's all out there, it really is.

O'Reilly: Can you run down, really quickly, the basic details of the accident?

Rosevear: OK. This gentleman, Mr. Joshua Brown, was 40 years old, lived in Florida, and owned a Tesla Model S. On May 7th, he was cruising at some speed on a road in Florida, and a tractor trailer made a left turn in front of him. It appears that his car never slowed down and hit the trailer broadside with very bad results. Essentially, the top of the car was sheared off; I've seen pictures of it, poor Mr. Brown. Our condolences to his family, etc. That was a really nasty one.

Tesla was not able, they say, to access the car's computer remotely because of the damage. They eventually sent an investigator to Florida to look at it, took all of the data from the car that they could download, took it home to headquarters, processed it, and discovered in late May that autopilot was engaged. They had reported the accident to the National Highway Traffic Safety Administration some time before, and they gave the agency that information. Then, toward the end of June, the federal agency said it was opening an investigation into this. And that's when Tesla disclosed it.

O'Reilly: Got it. This actually occurred before that big stock raise that Tesla did at the end of May. We're not going to talk too much about that. Obviously, that's up in the air, and we could spend days talking about SEC disclosure rules. So, what's going on with the investigations? Of note, this is obviously the first major death that I've heard of for a driverless or semi-driverless system.

Rosevear: It is the first major death, the first fatal accident in which a system that does this much is implicated. It's certainly the first one for Tesla's autopilot system, which is really, like I said, a bit of a step ahead of everybody else, mostly because it will steer for you and it will change lanes automatically, which the other systems will not yet do. The investigation is in fairly early stages. The National Highway Traffic Safety Administration, which is part of the Department of Transportation and is the main safety regulator when it comes to cars in the U.S., sent Tesla a letter earlier this week or at the end of last week demanding a whole ton of information. Basically, they want details for every car Tesla has sold in the U.S. with autopilot enabled: when the various safety systems have been employed, when and how many times the cars have had to brake automatically, when and how many times they've had to remind the owner to put hands on the wheel, what percentage of the miles the cars have driven have been under autopilot, etc. They want it all in a spreadsheet, and they want it all soon.

So, Tesla's going to have to do that. And I think this is the first time the federal government has tried to really grapple with a system that's actually shipping. They've been talking about setting rules and guidelines going forward with self-driving systems as we move toward that, but Tesla shipped this thing, and it's out there. There's something like 70,000 cars out there with this system operational, that have activated autopilot. And now the feds are coming in and saying: "We need to take a look at this and maybe start to set some rules and give some guidance and maybe put some limits on what Tesla's done."

O'Reilly: Taylor, what did you think when you first heard about the story, and the stock offering, and all that?

Muckerman: My first thought was that it's already been scaled back to some degree, because last year, like we mentioned, there were YouTube videos of people climbing into the back seat of their Tesla--

O'Reilly: Oh my God!

Muckerman: --leaving the driver's seat completely empty on the highway.

O'Reilly: Is that a ghost riding the whip thing?

Muckerman: Pretty much. Elon Musk actually came out in the public sphere and voiced his opinion that humans shouldn't necessarily be trusted just yet with fully autonomous cars. I believe they scaled back on what they had made available to folks in terms of the vehicle's capabilities. I'm still a big believer that autonomous driving is going to take place in the next one or two decades. I hope it does. But this is obviously proof that we're just not quite ready for it to be fully autonomous. And maybe there just needs to be a little bit more training or instruction involved in how to use it. But then again, Elon Musk isn't riding shotgun with every single person with a Tesla. There has to be some level of trust involved with the purchaser and their decision making. Unfortunately, humans are humans, and we all operate a little bit differently when it comes to this.

O'Reilly: John, you know a bunch of auto executives and players in the industry. What has been your feel for the rest of the auto industry's opinion about what happened, and the advent of driverless cars, and what it's going to take to prove the viability of actually having a driverless automobile?

Rosevear: First of all, if you go to a big auto maker and you talk to the people who are doing their self-driving research and development and so forth--

O'Reilly: And they all are.

Muckerman: If they aren't, they won't be a car company for long.

Rosevear: Yeah. That's mostly true. So, if you go there, and you get them away from a camera and you start asking them, they say, "Tesla released this thing too early." I mean, it's a beta system. The transitions between the different parts of the system aren't seamless. It's a beta system, it has holes in it, as you would expect. There's a concept here with cars that have partial self-driving capability called the hand-off problem, meaning that when the car goes, "Bing! Driver, you need to take over!" it takes several seconds for somebody who's maybe been reading a book or whatever to refocus on the task at hand.

Muckerman: Just like texting and driving. That's the main crux of that whole issue: you think you can look up fast enough, but you still have to get your bearings.

O'Reilly: Right, and those few seconds could mean the difference between life and death.

Rosevear: Right. But the thinking is, they have to build the system to anticipate this, so that the car gives you enough time to switch your attention back to the road. This is something Consumer Reports talked about. They came out this morning and said, "We think Tesla needs to dial this down." They had a bunch of concerns, which we can talk about. But, that's the issue. We aren't going to get to full autonomy for years. It's going to take billions of miles of testing. It's really going to take cars that talk to each other, super detailed maps of roads, all of that to get fully autonomous cars. But we will see partial autonomy systems, systems that will automate highway driving, and Tesla's autopilot is maybe the first rough step in that direction -- there will be many more over the next few years.