
IS TESLA AUTOPILOT SAFE?


When Elon Musk began suggesting that he and Tesla would create the first self-driving electric vehicles, you weren’t alone if you felt a rush of excitement. People around the world were thrilled at the prospect. Self-driving vehicles could solve myriad problems. From elderly drivers seeking increased independence to parents who could send the car to pick the kids up from school during a meeting, to people who could drink responsibly and let the car drive, the promise of fully autonomous vehicles fascinated people of all ages. So what went wrong?

Tesla’s Autopilot, Advanced Autopilot, and “Full Self-Driving” Software

One of the biggest issues with Tesla’s self-driving car is that it just isn’t autonomous yet. However, some features seem to imply that it is.

Tesla released its Autopilot driver assistance feature in 2016 and its “Full Self-Driving” (FSD) feature in November 2022. Tesla’s website describes Autopilot as an “advanced driver assistance system.” The company created this feature to increase convenience and safety while simultaneously reducing the driver’s workload.

Today, Autopilot is a standard feature in all new Teslas. Enhanced Autopilot and FSD are upgrades that Tesla owners can purchase. Tesla’s website clearly states that all three options are designed to be used by a driver who has their hands on the wheel, is fully attentive, and is prepared to take full control at a moment’s notice.

Tesla’s driver assistance options come with the following features:

| Feature | Autopilot | Enhanced Autopilot | Full Self-Driving Capability |
|---|---|---|---|
| Traffic-aware cruise control | X | X | X |
| Autosteer | X | X | X |
| Auto lane change | | X | X |
| Navigate on Autopilot | | X | X |
| Autopark | | X | X |
| Summon | | X | X |
| Smart Summon | | X | X |
| Traffic and stop sign control | | | X (beta testing) |
| Autosteer on city streets | | | (X) coming soon |

Problems With Tesla’s Autopilot Driving

There have been numerous reports of problems with Tesla’s Autopilot and FSD driving. These issues range from the Autopilot system causing crashes to reports of phantom braking. Some people believe the problem is an inherently unsafe product. Others believe the primary cause is Tesla’s misleading marketing.

Tesla’s Autopilot and FSD Crashes

In 2021, the U.S. National Highway Traffic Safety Administration (NHTSA) ordered specific car manufacturers to report any crashes involving certain advanced driver assistance systems (ADAS), such as Tesla’s Full Self-Driving vehicles. The order covered vehicles with SAE Level 2 automation, defined as “partial driving automation.”

SAE defines six levels of driving automation:

| Level | Description | Explanation | Examples |
|---|---|---|---|
| Level 0 | No driving automation | Full manual control. | A car with fixed cruise control or crash warnings fits into this category. |
| Level 1 | Driver assistance | The vehicle has a single automated system, such as adaptive cruise control. | Most modern cars fall into this category. Level 1 requires one or more driver assistance features, such as adaptive cruise control or lane-keeping technology. |
| Level 2 | Partial driving automation | An advanced driver assistance system controls steering and acceleration, but the human driver must monitor all tasks continuously. | Level 2 vehicles have at least two ADAS features that can control steering, braking, or acceleration. Tesla Full Self-Driving vehicles are an example. |
| Level 3 | Conditional driving automation | The vehicle can handle most driving tasks, but a human must be ready to take over. | Level 3 vehicles can take full control of portions of a journey that meet certain conditions. Mercedes-Benz DRIVE PILOT is the first example of a Level 3 vehicle. |
| Level 4 | High driving automation | The vehicle handles all driving tasks under certain circumstances, but human override remains an option. | Vehicles in this category can complete a full journey without driver intervention. No Level 4 vehicles are available to consumers. |
| Level 5 | Full driving automation | No human attention or interaction is needed. | Level 5 vehicles are fully capable of operating in all circumstances and will have no options for human control: no steering wheel or pedals. These vehicles do not yet exist. |

You may be surprised to learn that Tesla’s Full Self-Driving vehicle is only Level 2 on this six-level scale. A year after ordering the reporting of ADAS accidents, NHTSA found that 70% of these automated driving accidents involved Tesla vehicles. The top four reporting Level 2 entities and their incidents were:

  • Tesla: 273 incidents
  • Honda: 90 incidents
  • Subaru: 10 incidents
  • Ford: 5 incidents

Recent examples of FSD and Autopilot malfunction accidents include the following:

  • In FSD summon mode, a Tesla Model Y drives into a jet (2022): A Tesla drove into a Cirrus Vision jet, kept going, and spun the aircraft around.
  • Tesla in FSD mode nearly runs over cyclist, requiring driver intervention (2022): Two people were filming a video touting Tesla’s safety when their Model 3 veered to the right, directly at a nearby bicyclist.
  • Tesla Model 3 wrecks into an overturned truck while on Autopilot (2020): A driver in Taiwan was using Autopilot when his car failed to notice an overturned truck blocking two lanes of the freeway. The driver noticed at the last moment and hit the brakes, overriding autopilot, but not in time to prevent an accident.
  • Tesla Model 3 drives into a tractor-trailer (2019): The Tesla crashed into the tractor-trailer, which sheared off the car’s roof and killed the driver. The car came to a stop about 0.3 miles after the automated driving accident.

Tesla’s Phantom Braking Issues

Another issue with Tesla’s Autopilot and FSD systems is phantom braking, in which the vehicle’s partially automated driving system slows or suddenly stops the car for no apparent reason. These phantom braking incidents can be frightening, dangerous, and even fatal.

In 2022, more than 750 Tesla owners lodged complaints regarding phantom braking to the U.S. Department of Transportation’s NHTSA.

One example of Tesla’s phantom braking issue occurred on San Francisco’s Bay Bridge, where a Tesla FSD vehicle caused an eight-car crash:

FSD phantom braking causes an eight-car crash on Bay Bridge: A Tesla Model S abruptly switched into the left lane, hit the brakes, and attempted to stop in the middle of a tunnel on the Bay Bridge. The vehicles behind it couldn’t stop in time, causing an eight-car pileup. Nine people were injured.

False Advertising and Misleading Claims?

One of the biggest issues with Tesla’s cars may be the company’s words. Tesla sells a semiautonomous vehicle under the label “Full Self-Driving,” using words and phrases such as “automatic,” “autopilot,” and “full self-driving” to describe its products.

Some believe that Tesla’s exaggerated claims may be responsible for accidents related to its self-driving technology. If people believe that Tesla’s Full Self-Driving vehicle is fully self-driving, they may be less likely to give the road the attention it deserves. For example, take the North Carolina doctor who admitted to watching a movie when he crashed his Tesla Model S into a police car.

Are Tesla’s branding practices aspirational or false advertising?

California’s Department of Motor Vehicles (DMV) argues it is the latter. In July 2022, the California DMV filed a lawsuit against Tesla. The state agency claims the company made misleading claims about its technology. The agency has lodged two complaints against the “Full Self-Driving” vehicle manufacturer. These complaints could impact Tesla’s ability to sell its cars in California.

California is not alone in its apparent exasperation with the company’s claims. In January 2023, the South Korean Fair Trade Commission levied a $2.2 million fine against Tesla for exaggerating the benefits of its vehicles, including statements regarding the following:

  • The distance a Tesla car can travel on a single charge
  • The cost-effectiveness of a Tesla vehicle versus a gasoline vehicle
  • The performance of Tesla’s Superchargers

In 2020, a Munich court ruled that Tesla’s claims of a self-driving vehicle are misleading. The court stated that Tesla made exaggerated claims on its German website that misled people to believe Tesla vehicles can drive themselves.

In 2022, vzbv, Germany’s largest consumer protection group, filed another misleading advertising lawsuit against Tesla, alleging that the company’s statements regarding carbon-dioxide emissions were misleading.

Unrelated to false marketing, vzbv also claimed Tesla’s Sentry Mode violates data protection laws by recording its surroundings.

Some Tesla drivers have filed a class-action lawsuit against the company, claiming that Tesla and Musk misrepresented the vehicle’s self-driving and autopilot abilities.

Other Issues with Tesla Cars

Other issues with Tesla vehicles include their propensity to catch fire. More than 180 Tesla car fire cases were recorded between 2013 and February 2023. These fires are related to charging station malfunctions, thermal runaway, and external factors. Fifty-three fatalities have been associated with these nearly 200 incidents.

Overall, 356 fatalities have been linked to Tesla Autopilot accidents.

Tesla Recalls

Tesla has issued numerous recalls for its vehicles. These recalls cover Tesla vehicles manufactured between 2016 and 2023.

Recalled vehicles include:

  • The 2016 to 2023 Model S
  • The 2016 to 2023 Model X
  • The 2017 to 2023 Model 3
  • The 2020 to 2023 Model Y

Legal Incidents With Tesla

Aside from South Korea and California lodging lawsuits and complaints against Tesla, many individuals have filed claims. Currently, there is an ongoing class-action federal lawsuit against the company for misleading Tesla customers.

Class Action Regarding Misleading Remarks

The lawsuit claims that Tesla misled the public regarding its vehicles’ self-driving capabilities, specifically the abilities of the following Tesla features:

  • Autopilot
  • Enhanced Autopilot
  • Full Self-Driving Capabilities

An example of misleading remarks includes the text introduction to a Tesla video from 2016. It reads,

“THE PERSON IN THE DRIVER’S SEAT IS ONLY THERE FOR LEGAL REASONS.

“HE IS NOT DOING ANYTHING.

“THE CAR IS DRIVING ITSELF.”

Class Action Regarding Tesla’s Phantom Braking Issues

Another class-action lawsuit threatens Tesla. This suit targets the phantom braking issues that have been associated with Autopilot for several years.

A California Tesla owner has filed a class-action lawsuit in the federal court of northern California. His lawsuit seeks compensatory and punitive damages related to the cost of vehicle repairs, his car’s diminished value, and a refund for the upgrade to the Autopilot feature.

Tesla Shareholder Lawsuit

Yet another recent lawsuit has been filed against Tesla and Elon Musk. This time, shareholders accuse the company and its CEO of exaggerating the safety and effectiveness of Tesla’s Autopilot and Full Self-Driving features.

The State of Litigation with Tesla

There is ongoing litigation between Tesla and Tesla vehicle owners. If you have suffered injury or damages resulting from owning a Tesla, you may be eligible to join a class-action lawsuit or file an individual lawsuit in your local court.

Injuries to Tesla vehicle owners may include physical, property, or mental and emotional damages. These damages could also be connected to monetary loss related to Tesla recalls, repairs, or lost wages.

If you have been in a Tesla accident, or if you are a Tesla owner in Texas or Louisiana who believes you have been harmed by Tesla or your vehicle, you may be eligible to file a legal claim to recover damages and monetary compensation.

Contact the attorneys at Morris & Dewett Injury Lawyers to discuss your claim in a free consultation.

Morris & Dewett provides this information to the public for general education and interest. The firm does not represent clients in every topic discussed in legal & injury news. The information is curated and produced based on trends in law, governance, and society to present relevant issues to the general public. Every effort is made to provide accurate information. Do not make any decision solely based on the information provided; please seek relevant counsel for each topic area. Consult an attorney before making any legal decision, consult a doctor before making any medical decision, and consult a financial advisor before making any fiscal decision. If you have any legal needs that we can assist you with, please do not hesitate to contact us.

Morris & Dewett Will Answer Your Questions and Help You Recover