For Tesla enthusiasts, 2017 may be remembered as the year semi-autonomous vehicles faced heavy backlash after a fatal accident involving one of the company's electric cars.

While the automaker was exonerated in that accident (investigators found the driver relied too heavily on the autonomous features and ignored repeated warnings), a string of recent accidents has put Tesla back in the spotlight.

First, a Tesla Model S rear-ended a parked fire truck at a reported speed of 65 mph (no injuries were reported). The Autopilot feature was engaged, and as a result, the National Transportation Safety Board (NTSB) will be launching an investigation.

Second, highway patrol officers in San Francisco had to intervene when a driver fell asleep behind the wheel of his Tesla while it was on Autopilot; the car had stopped in the middle of a five-lane highway.

Tesla has repeatedly warned drivers that Autopilot should only be used while the driver is alert and prepared to intervene. But for many, the temptation to check out altogether is too great.

The Stages of Autonomous Driving

Most of the Tesla vehicles involved in accidents operate at what experts refer to as “stage 3” of autonomous driving. Comparable models from Mercedes-Benz and BMW are at “stage 2,” and the eventual goal, a fully autonomous car, will be “stage 4.”

Unfortunately, the current “stage 3” is in some ways more dangerous than both “stage 2” and the eventual “stage 4.” That’s because “stage 3” cars are autonomous enough to convince drivers that they will be fine if they don’t pay attention (and they often are!). Eventually, though, a driver who continually ignores the road will run out of luck and end up in an accident.

Google’s Waymo is Avoiding Stage 3 Altogether

Waymo, Google’s self-driving car affiliate, has decided to bypass “stage 3” altogether. It seems like a smart move for a company under no pressure to turn a profit in the near future (Google certainly has the funds to prop Waymo up). Rather than deal with accidents caused by distracted drivers, Waymo will simply wait until its cars are safely and completely autonomous.

The Insurance Institute for Highway Safety (IIHS) has tested the semi-autonomous vehicles on the market and discovered various glitches. One Mercedes, for instance, had a tendency to veer toward exit lanes rather than stay straight, according to IIHS senior research scientist Ian Reagan.

Tesla vehicles, on the other hand, had difficulty when cresting hills. Glitches like these will be ironed out over time, and some have likely been fixed already. But a distracted driver could still be injured or killed by relying too heavily on Tesla’s autonomous technology.

Liability Still Falls on the Driver

Unfortunately for these drivers, litigation probably isn’t on the table. Because they operated outside Tesla’s guidelines, they can’t hold the automaker responsible for injuries or damage. Drivers involved in an accident caused by a malfunction or manufacturer error should, of course, contact a car accident lawyer. In short, though, drivers must remain alert and in control of the vehicle even while it is driving “autonomously.”

As for the drivers in the recent string of Tesla accidents, each has claimed that Tesla’s autonomous driving features were responsible. The truth, though, is that the drivers are at fault, and they may need a criminal defense lawyer to avoid a harsh ruling in court.