== Regulation ==
In 2015, a spokesman for the US
National Highway Traffic Safety Administration (NHTSA) said that "any autonomous vehicle would need to meet applicable federal motor vehicle safety standards" and the NHTSA "will have the appropriate policies and regulations in place to ensure the safety of this type of vehicles". On February 1, 2021,
Robert Sumwalt, chair of the NTSB, wrote a letter to NHTSA regarding that agency's "Framework for Automated Driving System Safety", which had been published for comment in December 2020. In the letter, Sumwalt recommended that NHTSA include user monitoring as part of the safety framework, and reiterated that "Tesla's lack of appropriate safeguards and NHTSA's inaction" on the NTSB's recommendation "that NHTSA develop a method to verify that manufacturers of vehicles equipped with Level 2 [systems] incorporate system safeguards that limit the use of automated vehicle control systems to the conditions for which they were designed" was a contributing cause of a fatal crash in Delray Beach, Florida, in 2019. In June 2021, NHTSA issued Standing General Order (SGO) 2021-01, requiring manufacturers of vehicles equipped with advanced driver-assistance systems (ADAS) or automated driving systems (ADS) to report certain crashes. Reporting is limited to crashes in which the ADAS or ADS was engaged within 30 seconds prior to the crash and which involve an injury requiring hospitalization, a fatality, a vehicle being towed from the scene, an air bag deployment, or a "vulnerable road user" (e.g., a pedestrian or bicyclist); these crashes must be reported to NHTSA within one calendar day, and an updated report is required within 10 calendar days.
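The SGO 2021-01 reporting triggers described above amount to a simple decision rule. The sketch below is only an illustration of that rule as paraphrased here; the class and function names are hypothetical and are not NHTSA terminology.

```python
from dataclasses import dataclass

# Hypothetical record of a single crash event (field names are illustrative).
@dataclass
class CrashRecord:
    seconds_since_engagement: float   # time from last ADAS/ADS engagement to crash
    hospitalized_injury: bool
    fatality: bool
    vehicle_towed: bool
    airbag_deployed: bool
    vulnerable_road_user_involved: bool  # e.g., pedestrian or bicyclist

def is_sgo_reportable(crash: CrashRecord) -> bool:
    """True if the crash meets the triggers paraphrased above: the system
    was engaged within 30 seconds of the crash AND at least one severity
    condition applies. An initial report is then due within one calendar
    day, with an updated report within 10 calendar days."""
    engaged_recently = crash.seconds_since_engagement <= 30
    severe = (crash.hospitalized_injury or crash.fatality
              or crash.vehicle_towed or crash.airbag_deployed
              or crash.vulnerable_road_user_involved)
    return engaged_recently and severe

# Example: a towed vehicle with the system engaged 5 seconds before impact
print(is_sgo_reportable(CrashRecord(5, False, False, True, False, False)))  # True
```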
On August 16, 2021, after reports of 17 injuries and one death in car crashes involving emergency vehicles, US auto safety regulators opened a formal safety probe (PE 21-020) into Tesla's driver assistance system Autopilot. Initial data from SGO 2021-01 were released in June 2022; 12 manufacturers reported 392 crashes involving ADAS (Level 2) systems between July 2021 and May 15, 2022. Of those 392 crashes, 273 involved Tesla vehicles, out of approximately 830,000 Tesla vehicles equipped with ADAS. Honda had the next highest total, with 90 crashes reported out of approximately 6 million Honda vehicles equipped with ADAS. NHTSA said Tesla's numbers may appear high because it receives crash reports in real time, whereas other automakers do not, so their crash reports may be delivered more slowly or not at all. Collectively, five people were killed and six more were seriously hurt in the 392 reported ADAS crashes. SGO 2021-01 also applies to manufacturers of vehicles equipped with ADS (Levels 3 through 5); 25 ADS manufacturers reported 130 crashes in total in the initial June 2022 data release, led by Waymo (62), Transdev Alternative Services (34), and Cruise LLC (23). In most cases, these crashes involved the ADS vehicle being struck from the rear; only one serious injury was reported, and 108 of the 130 crashes resulted in no injury.

== Lawsuits ==
In 2017, Tesla owners filed a class action lawsuit alleging that the company had failed to deliver the Enhanced Autopilot (Autopilot 2.0) features they had paid for. The suit was settled in 2018; owners who in 2016 and 2017 paid to equip their cars with the updated Autopilot software were compensated between $20 and $280 for the delay in implementing Autopilot 2.0. In 2020, a German court ruled in a lawsuit brought in 2019 that Tesla had violated advertising regulations with its marketing of Autopilot. On appeal, that decision was reversed in 2021 by a higher court on the condition that Tesla clarify the capabilities of Autopilot on its website. In July 2022, a German court awarded a plaintiff most of the purchase price she had paid for a Model X, based in part on a technical report demonstrating that Autopilot did not reliably recognize obstacles and would unnecessarily activate its brakes, which could cause a "massive hazard" in cities; Tesla's lawyers argued unsuccessfully that Autopilot was not designed for city traffic. In September 2022, a class action lawsuit was filed in the
U.S. District Court for the Northern District of California, alleging that "for years, Tesla has deceptively and misleadingly marketed its ADAS technology as autonomous driving technology under various names, including 'Autopilot,' 'Enhanced Autopilot,' and 'Full Self-Driving Capability'", adding that Tesla represented "that it was perpetually on the cusp of perfecting that technology and finally fulfilling its promise of producing a fully self-driving car", while "Tesla knew for years its statements regarding its ADAS technology were deceptive and misleading, but the company made them anyway." Tesla filed a motion in November 2022 to dismiss the case, defending the company's actions as a "mere failure to realize a long-term, aspirational goal [of a fully self-driving car] [and] not fraud" and basing the motion on the private arbitration clause in the purchase contract signed by each buyer. A second class action lawsuit was filed in the same court by Tesla shareholders in late February 2023. The complaint alleges the defendants "had significantly overstated the efficacy, viability, and safety of [Tesla's] Autopilot and FSD technologies" and that those same systems "created a serious risk of accident and injury", which "subjected Tesla to an increased risk of regulatory and governmental scrutiny and enforcement action", linking multiple specific accidents to documented decreases in the share price. The suit was dismissed without prejudice in September 2024, as the judge ruled that Musk's claims were "corporate puffery". Tesla's lawyers had argued that puffery covered statements such as "[A]utopilot is 'superhuman'" and "we want to get as close to perfection as possible"; as the judge wrote in the order granting the dismissal, "these vague statements of corporate optimism are not objectively verifiable". Tesla's lawyers also argued that other statements, including that safety is "paramount" and that Tesla cars are "absurdly safe", were puffery as well, which the judge rejected, as those statements were objectively verifiable. In April 2023, Tesla was found not liable in a lawsuit filed in 2020 by a driver who sued for damages after claiming the Autopilot system guided her Tesla Model S into a curb, resulting in an airbag deployment and facial injuries. Jurors explained in post-trial interviews that "Autopilot never confessed to be self pilot. It's not a self-driving car ... [Tesla] were adamant about a driver needing to always be aware." Additional lawsuits have been filed by the estates of two drivers killed in 2019 while using Autopilot, one in California and one in Florida. In the California case, which had not previously been reported, Tesla argued that the driver had consumed alcohol and that it was not clear whether Autopilot was engaged; the plaintiff's lawyers alleged that a known defect in the Autopilot system had caused the vehicle to veer off a highway and strike a palm tree. Tesla prevailed in that case, with the jury finding 9–3 in October 2023 that there was no manufacturing defect. In the Florida case, the judge rejected Tesla's motion to dismiss, concluding that he could not "imagine how some ordinary consumers would not have some belief that the Tesla vehicles were capable of driving themselves hands free", citing "reasonable evidence" demonstrating that Tesla had "engaged in a marketing strategy that painted the products as autonomous" and that Musk's statements "had a significant effect on the belief about the capabilities of the products".
In the first federal case involving Tesla's Autopilot to go to trial, a jury found on August 1, 2025, that Tesla was partially responsible for a 2019 crash that killed one pedestrian and seriously injured another, and ordered the company to pay US$243 million in compensatory and punitive damages. Tesla has appealed the verdict. The lawsuit also accused Tesla of hiding data that the car produced in the moments leading up to the crash.
== False or misleading advertising ==
The
Center for Auto Safety and
Consumer Watchdog wrote to the
Federal Trade Commission (FTC) in 2018, asking it to open an investigation into the marketing of Autopilot. The letter stated "the marketing and advertising practices of Tesla, combined with Elon Musk's public statements, have made it reasonable for Tesla owners to believe, and act on that belief, that a Tesla with Autopilot is an autonomous vehicle capable of 'self-driving'." The groups renewed their appeal to the FTC and added the California DMV in 2019, noting that "Tesla continues to be the only automaker to describe its Level 2 vehicles as 'self-driving' and the name of its driver assistance suite of features, Autopilot, connotes full autonomy." U.S. Senators
Ed Markey (D-MA) and
Richard Blumenthal (D-CT) echoed these concerns to the FTC in 2021. A 2019 IIHS study showed that the name "Autopilot" leads more drivers to misperceive behaviors such as texting or taking a nap as safe, compared with similar Level 2 driver-assistance systems from other car companies. In 2020, UK safety experts called Tesla's Autopilot "especially misleading". While
Euro NCAP's testing of Autopilot on a 2020 Model 3 found that the system excelled in the level of vehicle assistance provided, the association noted the misleading nature of the system's name and a risk of over-reliance on the system. In 2020, usability engineer Liza Dixon published a paper calling Tesla's descriptions of Autopilot and FSD capabilities exaggerated. In 2021, following more than a dozen Autopilot crashes (some fatal), the
U.S. Department of Justice (DOJ) started a criminal investigation to determine if Tesla misled consumers, investors, and regulators about Autopilot. Tesla confirmed the DOJ had requested Autopilot and FSD-related documents in its
10-K filing for 2022. The
Securities and Exchange Commission also opened an independent civil probe into statements made by Tesla and its executives about Autopilot. In July 2022, the California DMV filed two complaints with the state Office of Administrative Hearings alleging that Tesla "made or disseminated statements that are untrue or misleading, and not based on facts" relating to both its "Autopilot and Full Self-Driving technologies". In August 2022, Tesla requested a hearing to present its defense. In September 2022, California governor Gavin Newsom signed state bill SB 1398, which took effect on January 1, 2023, and prohibits any manufacturer or dealer of cars with partial driving automation features from using misleading language to advertise their vehicles as autonomous, such as by naming the system "Full Self-Driving".
== Deceptive promotion of Full Self-Driving ==
In October 2016, concurrent with the release of HW2, Tesla released a video entitled "Full Self-Driving Hardware on All Teslas" that claimed to demonstrate Full Self-Driving, the system designed to extend automated driving to local roads; Musk later tweeted a link to a longer version in November 2016. Throughout the video, the driver does not touch the steering wheel or pedals, and the video also shows perspectives from the vehicle's cameras and image recognition system. At Musk's suggestion, the
title card states "The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself." It was nicknamed the "
Paint It Black" video, after the 1966 Rolling Stones song used as its soundtrack. In later interviews, former members of the Autopilot team stated the vehicle had been following a route mapped in advance with detailed scanning cameras, a technology that is not available in Tesla production cars. Even with these augmentations in place, human drivers had to intervene to take control; the vehicle allegedly struck "a roadside barrier" on the Tesla grounds during filming, requiring repairs, and crashed into a fence while attempting to park itself automatically. In January 2024,
Bloomberg published an exposé based on internal Tesla emails revealing that Musk had personally overseen the editing and post-production of the video.
Motor Trend and
Jalopnik compared what Tesla had showcased to the deceptive video depicting a
Nikola One EV truck, which was actually powered by gravity;
Jalopnik commented "[the Tesla video] may be worse, because this video was used to deceptively suggest capabilities of a system deployed into real people's hands and used on public roads." In June 2022, Ashok Elluswamy, director of Autopilot software, gave a deposition in a civil lawsuit filed against Tesla by the family of a driver who was killed in 2018 after the Model X he was driving on Autopilot crashed into a concrete barrier in Mountain View, California. Elluswamy stated the video was not originally intended "to accurately portray what was available for customers in 2016. It was to portray what was possible to build into the system"; the video as released carried no such disclaimer. A
Florida circuit court judge also cited the video as part of Tesla's marketing strategy in rejecting Tesla's motion to dismiss a lawsuit over a 2019 death, writing that "absent from this video is any indication that the video is aspirational or that this technology doesn't currently exist in the market."

== NHTSA investigations ==
In August 2021, the NHTSA Office of Defects Investigation (ODI) opened a preliminary evaluation (PE), designated PE 21-020, and released a list of eleven crashes in which Tesla vehicles struck stationary emergency vehicles; in each instance, NHTSA confirmed that Autopilot or Traffic-Aware Cruise Control was active during the approach to the crash. Of the eleven crashes, seven resulted in seventeen total injuries, and one resulted in a fatality. The planned evaluation of the Autopilot system specifically addressed the systems used to monitor and enforce driver engagement. In September 2021, NHTSA added a twelfth crash, which had occurred in Orlando in August 2021, to the investigation list. NHTSA sent a request for information relating to PE 21-020 to Tesla's director of field quality in August 2021, with a response due by October 22, 2021. In September 2021, NHTSA sent requests for information to Tesla and other automobile manufacturers for comparative ADAS data. After Tesla deployed its Emergency Light Detection Update in September 2021, NHTSA sent a follow-up letter to Tesla in October 2021 asking for "a chronology of events, internal investigations, and studies" that led to the deployment of the update, as it potentially addressed a safety defect, which requires a formal recall.

In February 2022, NHTSA ODI opened a second preliminary evaluation (PE 22-002) for "phantom braking" in 2021–2022 Tesla Model 3 and Model Y vehicles. PE 22-002 was correlated with the removal of radar hardware from those vehicles in May 2021; at the time PE 22-002 was opened, NHTSA was not aware of any crashes or injuries resulting from the complaints. According to some complaints, while using Autopilot, "rapid deceleration can occur without warning, at random, and often repeatedly in a single drive cycle." By May 2022, NHTSA had received 758 reports of unexpected braking when Autopilot was in use and requested that Tesla respond to questions by June 20, 2022.

Also in June 2022, NHTSA ODI upgraded PE 21-020 to an engineering analysis (EA), designated EA 22-002, covering an estimated 830,000 Tesla vehicles sold between 2014 and 2022. Data for PE 21-020 had been supplemented by prior information requests to Tesla (April 19, 2021) and by Standing General Order (SGO) 2021-01, issued June 29, 2021 and amended on August 5, 2021. The investigation was expanded to an engineering analysis after NHTSA reviewed data from 191 crashes involving the use of Autopilot or related ADAS Level 2 technologies (Traffic-Aware Cruise Control, Autosteer, Navigate on Autopilot, or Auto Lane Change). In many of these crashes, Autopilot aborted vehicle control less than one second prior to the first impact, which may not have been enough time for the driver to assume full control. In addition, the data suggest that Tesla's requirement that Autopilot drivers keep their hands on the wheel at all times may not be sufficient to ensure the driver is paying attention to the driving task.
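To put a sub-one-second handover window in perspective, the following back-of-the-envelope calculation (an illustration only, not part of the NHTSA analysis) shows how far a vehicle travels in one second at common road speeds.

```python
# Illustrative kinematics only; speeds and the one-second window are
# assumptions for the example, not figures from the NHTSA report.
MPH_TO_MPS = 0.44704  # 1 mph expressed in meters per second

for speed_mph in (25, 45, 65):
    speed_mps = speed_mph * MPH_TO_MPS
    # Distance covered during a one-second takeover window
    print(f"{speed_mph} mph: ~{speed_mps:.0f} m traveled per second")
```

At highway speed (65 mph), the car covers roughly 29 meters in one second, which illustrates why investigators questioned whether such a window leaves a driver enough time to assume full control.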
NHTSA sent a second letter for EA 22-002 to Tesla in August 2022, which included requests for a description of the role of the driver-facing camera, identification of all lawsuits or arbitrations resulting from Autopilot use (including complete transcripts of depositions), and "the engineering and safety explanation and evidence for design decisions regarding enforcement of driver engagement / attentiveness". Tesla submitted a response in September 2022. A follow-up letter was sent in July 2023, asking for current data and updates to the prior response. Starting in October 2023, NHTSA conveyed its preliminary conclusions to Tesla during several meetings; Tesla then conducted a voluntary recall on December 5, 2023, providing an over-the-air software update to "incorporate additional controls and alerts ... to further encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged ... [and providing] additional checks upon engaging Autosteer and while using the feature outside controlled access highways", while not concurring with NHTSA's analysis. EA 22-002 was closed in April 2024. ODI concluded that its "analysis of crash data indicates that, prior to [the December 2023 recall], Autopilot's design was not sufficient to maintain drivers' engagement", citing data showing that in 59 of 109 crashes, hazards were visible for at least five seconds prior to the collision. Based on vehicle telemetry, ODI added that "the warnings provided by Autopilot when Autosteer was engaged did not adequately ensure that drivers maintained their attention on the driving task", with braking and/or steering not occurring until less than one second before the collision in approximately 80% of 135 incidents. Concurrent with the closure of EA 22-002, ODI opened a recall query to evaluate the adequacy of the December 2023 remedy, noting "concerns [identified] due to post-remedy crash events and results from preliminary NHTSA tests of remedied vehicles. Also, Tesla has stated that a portion of the remedy both requires the owner to opt in and allows a driver to readily reverse it. Tesla has also deployed non-remedy updates to address issues that appear related to ODI's concerns under EA22002." In October 2024, NHTSA opened a further investigation into FSD after reports of crashes in low-visibility conditions, examining whether the system fails to detect hazards, or to alert drivers to deteriorated camera performance, until immediately before a crash.
== Recalls ==
Tesla issued an "Emergency Light Detection Update" for Autopilot in September 2021, which was intended to detect "flashing emergency vehicle lights in low light conditions and then [respond] to said detection with driver alerts and changes to the vehicle speed while Autopilot is engaged", after NHTSA had opened PE 21-020 the previous month. After the update was issued, NHTSA sent a letter to Tesla asking why the update had not been performed under the recall process, as "any manufacturer issuing an over-the-air update that mitigates a defect that poses an unreasonable risk to motor vehicle safety is required to timely file an accompanying recall notice to NHTSA." Tesla recalled 11,728 vehicles in October 2021 due to a communication error that could cause false forward-collision warnings or unexpected activations of the automatic emergency braking system. The error had been introduced by the Full Self-Driving beta version 10.3 over-the-air firmware update and was reversed by another over-the-air update the same month; the recalled vehicles were reverted to version 10.2, then updated to 10.3.1. On February 1, 2022, after NHTSA advised Tesla that failing to stop for a stop sign can increase the risk of a crash and threatened "immediate action" over "intentional design choices that are unsafe", Tesla recalled nearly 54,000 vehicles to disable the rolling-stop behavior, removing the feature with an over-the-air software update. On February 16, 2023, Tesla issued a
recall notice for all vehicles equipped with the Full Self-Driving beta software, including 2016–2023 Model S and Model X, 2017–2023 Model 3, and 2020–2023 Model Y, covering 362,758 vehicles in total. NHTSA had identified four specific traffic situations in a letter sent to Tesla on January 25, 2023, and Tesla voluntarily chose to pursue a recall to address those situations. In December 2023, Tesla issued a wider recall of all vehicles equipped with any version of Autosteer, including 2012–2023 Model S, 2016–2023 Model X, 2017–2023 Model 3, and 2020–2023 Model Y, covering 2,031,220 vehicles in total. NHTSA had concluded that Autosteer's controls were not sufficient to prevent misuse and did not ensure that drivers maintained "continuous and sustained responsibility for vehicle operation".