California Driver Charged With Vehicular Manslaughter for Using Autopilot

Autonomous driving technology has been touted by manufacturers as a potential solution to the problem of serious motor vehicle collisions. As several manufacturers continue to test systems on the nation's roadways, the likelihood of autonomous vehicles becoming widely available has increased. However, this technology cannot prevent all accidents, and drivers who misuse semi-autonomous features can cause serious collisions in which people are injured or killed. A pending criminal case in California demonstrates the potential criminal liability of motorists who cause serious injury or fatal collisions while driving semi-autonomous vehicles. It also raises potential product liability issues.

Fatal California Tesla Autopilot Collision Results in Criminal Charges

In 2019, Kevin George Aziz Riad, a 27-year-old man, was driving his Tesla in Autopilot mode. Tesla Autopilot is a driver-assist technology classified as Level 2 automation on the Society of Automotive Engineers (SAE) six-level scale of driving automation. While Tesla Autopilot is not fully autonomous, its features include automatic lane changes, traffic-aware cruise control, lane centering, self-parking, semi-autonomous navigation, and the ability to summon the vehicle from a parking spot or garage. However, a driver who has Autopilot engaged must constantly supervise the vehicle and be prepared to take over at any time. Tesla recommends that drivers keep at least one hand on the wheel while using Autopilot.

Riad reportedly had Autopilot engaged in his Tesla Model S on Dec. 29, 2019, when the car exited a freeway at a high rate of speed, ran a red light in Gardena, and crashed into a Honda Civic traveling through the intersection. The crash killed the two people inside the Honda, Maria Guadalupe Nieves-Lopez and Gilberto Alcazar Lopez, both of whom were pronounced dead at the scene. Riad and his female passenger suffered non-life-threatening injuries and were hospitalized following the wreck.

Following a nearly two-year investigation, prosecutors filed two felony counts of vehicular manslaughter against Riad in October 2021. Under Cal. Penal Code § 192(c), vehicular manslaughter may be charged as a misdemeanor or a felony, depending on whether the driver's conduct amounted to gross negligence. In Riad's case, prosecutors allege that he was grossly negligent in his misuse of his vehicle's Autopilot features in the moments leading up to the collision. Each felony count carries a maximum sentence of six years in state prison if Riad is convicted.

Riad entered a plea of not guilty and is currently free on bail while the case is pending. This is the first time a driver in the U.S. has been charged with a felony for a fatal collision caused by misusing a widely available partially automated driving system.

Other Incidents Involving Automated Driving Technology

Riad's 2019 crash is not the first incident in which a motorist has been criminally charged for misusing automated driving technology, but he is the first person to face felony charges for misusing a widely available driver-assist technology. In 2020, the backup driver of an Uber test vehicle was charged with negligent homicide in Arizona after the fully autonomous vehicle being tested struck and killed a pedestrian.
The misuse of Autopilot, which can control steering, speed, and braking, has occurred on numerous occasions and is the subject of investigations by two federal agencies. The filing of charges in the California crash could serve as notice to drivers who use systems like Autopilot that they cannot rely on those systems to control their vehicles. While fully autonomous Uber vehicles are not widely available, approximately 765,000 Tesla vehicles in the U.S. are equipped with Autopilot.

Multiple prior incidents involving Tesla Autopilot have been reported. In 2018, a Tesla on Autopilot crashed into a firetruck; no injuries were reported, but authorities said the driver was not actively controlling the vehicle at the time of the accident. In May 2020, another California man was arrested after police officers saw him sitting in his Tesla's back seat while the vehicle moved along a freeway with no one behind the steering wheel.

The National Highway Traffic Safety Administration (NHTSA) has investigated 26 crashes involving Tesla Autopilot since 2016, including collisions with tractor-trailers and highway barriers, which have resulted in at least 11 fatalities. In Riad's case, the NHTSA confirmed that Autopilot was engaged at the time of the collision.

The National Transportation Safety Board (NTSB) and the NHTSA have been investigating widespread misuse of Tesla Autopilot. The agencies refer to driver inattention and overconfidence in driver-assist systems as "automation complacency" and blame it for numerous collisions.

Potential Product Liability Issues

In addition to potential criminal liability for drivers who fail to maintain control of their vehicles while using Autopilot, Tesla might also face civil liability through lawsuits filed by the victims’ families.

The surviving family members of Nieves-Lopez and Lopez have filed lawsuits against both Tesla and Riad. Their lawsuits allege that Tesla's vehicles are defective because they can suddenly accelerate and lack an effective automatic emergency braking system. They also allege that Riad was negligent in failing to take control of the vehicle from Autopilot to avoid the crash. The cases are set for trial in 2023.

According to the lawsuits, Riad's vehicle suddenly accelerated to an uncontrollable and excessive speed before the crash. The plaintiffs also argue that Riad had multiple traffic violations on his record and was not capable of safely operating the vehicle.

In light of the history of Autopilot crashes, Tesla has updated its software in an attempt to make the technology more difficult for drivers to misuse. It has also worked to improve Autopilot's ability to identify emergency vehicles and avoid collisions with them.

Despite the numerous Autopilot crashes, Tesla is currently testing a Full Self-Driving system in hundreds of vehicles driven by Tesla owners on U.S. roads. The company has stated that drivers must be prepared to react at any time when using either Autopilot or the Full Self-Driving system.

In April 2020, the family of a pedestrian who was killed by a sleeping motorist's Tesla Model X while the vehicle was on Autopilot filed a lawsuit against Tesla in San Jose, California. The plaintiffs alleged that the driving system contained a patent defect in its design. According to the complaint, the Tesla suddenly accelerated after a vehicle in front of it changed lanes, striking a van and pedestrians who were on the side of the road following an earlier accident.

The plaintiffs alleged that Autopilot did not alert the driver that he was inattentive because his hands were still on the wheel, and that it failed to detect the pedestrians and the van or engage the emergency braking system. Because both the motorist and the victim were Japanese citizens and the motorist had purchased the Tesla in Japan, the court later dismissed the case on a forum non conveniens motion filed by Tesla, which argued that the proper forum was Japan rather than California. Nevertheless, the case illustrates some of the potential product liability issues Tesla could face in other lawsuits involving crashes with its semi-autonomous driving systems.

Design Defects and Strict Liability

Several different types of product defects can result in liability claims against the parties involved in bringing a product to market. Design defects are flaws in a product's design itself, meaning they affect every unit produced. If Tesla's software is found to contain a design defect, the company could be strictly liable for damages to any parties injured as a result.

Manufacturing defects, by contrast, occur during the manufacturing process and affect only some of the products. The third type of defect involves a failure to warn. Product liability lawsuits against Tesla might include both design defect and failure-to-warn claims if the company knew about a defect in its system but failed to warn consumers about the risks.

In previous lawsuits, Tesla has tended to blame drivers rather than admit any responsibility, and this will likely be the defense any manufacturer of autonomous vehicles raises in similar lawsuits in the future. However, if plaintiffs can show a defect in the design, Tesla and similar carmakers might be held strictly liable and forced to pay damages to people who are injured, and to the families of those who are killed, because of the defect.

Talk to Steven M. Sweat, Personal Injury Lawyers, APC

If you have been injured in a crash with a Tesla on Autopilot, you should speak to an attorney at Steven M. Sweat, Personal Injury Lawyers, APC. We can investigate what happened and explain your legal options. Call us today for a free consultation at 866.966.5240.
