
Tesla Autopilot

Tesla Autopilot is an advanced driver-assistance system (ADAS) developed by Tesla, Inc. that provides partial vehicle automation, corresponding to Level 2 automation as defined by SAE International. All Tesla vehicles produced after April 2019 include Autopilot, which provides autosteer and traffic-aware cruise control. As of February 2026, customers can subscribe to an optional Level 2 package called "Full Self-Driving (Supervised)" (FSD), which adds semi-autonomous navigation on nearly all roads, self-parking, and the ability to summon the car from a parking space. In January 2026, MotorTrend said FSD was the best ADAS on the market.

History
Elon Musk first discussed the Tesla Autopilot system publicly in 2013, noting that "Autopilot is a good thing to have in planes, and we should have it in cars." At the time, no autopilot system in aircraft rendered them fully autonomous. Over the ensuing decade, Autopilot went through a series of hardware and software enhancements, gradually approaching the goal of full autonomy, which remains unmet. Autopilot, as initially introduced in 2014, referred to automatic parking and low-speed summoning on private property; the underlying hardware was developed in partnership with Mobileye, but Tesla and Mobileye dissolved their partnership in July 2016. Enhanced Autopilot (EAP) was announced in late 2016 as an extra-cost option that used a new hardware suite developed by Tesla. The release of the key EAP feature "Navigate on Autopilot", which uses the new hardware suite to guide the vehicle on controlled-access roads from on-ramp to off-ramp, was delayed until 2018. When EAP was announced in 2016, Tesla also offered for purchase Full Self-Driving (FSD) as an upgrade option to EAP, which extended machine-guided driving capabilities to local roads. During a January 2019 earnings call, Elon Musk reiterated that "full self-driving capability is there", referring to "Navigate on Autopilot", an EAP feature limited to controlled-access highways. In September 2020, Tesla reintroduced the term Enhanced Autopilot to distinguish the existing subset of features, which included high-speed highway travel and low-speed parking and summoning, from FSD, which would add medium-speed city road travel. Tesla released a "beta" version of its FSD software (which extended "Navigate on Autopilot"-like machine-controlled driving and navigation to local roads) in the United States in October 2020 to early access program testers. The EAP option tier was made available to all buyers by June 2022. In November 2022, the FSD beta was extended to all owners in North America who had purchased the option.
In April 2024, EAP was removed from the North American design pages, although it is still available for purchase in other markets.
Hardware iterations
Hardware 1 and Autopilot (Mobileye)

In October 2014, Tesla offered customers the ability to pre-purchase Autopilot, which was not designed for self-driving. Initial versions were built in partnership with Mobileye, but Mobileye ended the partnership in July 2016 because Tesla "was pushing the envelope in terms of safety". Vehicles manufactured after September 2014 included Hardware 1 (HW1), which supported Autopilot. The first Autopilot software release came in October 2015 as part of Tesla software version 7.0. Version 7.1 removed some features to discourage risky driving. Version 8.0 processed radar signals to create a point cloud, similar to lidar, to help navigate in low visibility. In November 2016, Autopilot 8.0 was updated to encourage drivers to grip the steering wheel. By November 2016, Autopilot had operated for 300 million miles (500 million km).

Hardware 2

In October 2016, Autopilot sensors and computing hardware transitioned to Hardware 2 (HW2) for new cars; the upgraded hardware was collectively called Autopilot 2.0 to distinguish it from the original Autopilot/HW1 vehicles. At launch, Autopilot 2.0 vehicles with HW2 had fewer features than HW1 vehicles; for example, HW2 vehicles could not be summoned in 2016. Tesla also used the term Enhanced Autopilot (EAP) to refer to planned capabilities that would be coming to HW2 vehicles. The signature EAP feature, announced in December 2016, was "Navigate on Autopilot", which allows machine-controlled driving on controlled-access highways from on-ramp to off-ramp, including the abilities to change lanes without driver input, transition from one freeway to another, and exit. HW2 vehicles were updated in January and February 2017 with software version 8.0, which included Traffic-Aware Cruise Control and Autosteer (lane centering) on divided highways and local roads at limited speeds.
Version 8.0 also put more emphasis on the radar system, in an attempt to avoid problems like the fatal 2016 Autopilot crash in Florida. Software version 8.1 for HW2 arrived in March 2017, providing HW2 cars features on par with HW1 cars, but did not include "Navigate on Autopilot". In August 2017, Tesla announced Hardware 2.5 (HW2.5), which upgraded the on-board processor and added redundant systems. Software version 9.0 was released in October 2018 in preparation for the release of "Navigate on Autopilot" for HW2/HW2.5 vehicles with EAP, which was implemented later that month. Simultaneously, Tesla removed the option to purchase the "Full Self-Driving" upgrade. In a November 2018 test drive, The Verge reporter Andrew J. Hawkins called the beta of Navigate on Autopilot "the feature that could give Tesla an edge as it grows from niche company to global powerhouse". As initially released, Navigate on Autopilot would suggest lane changes, but could not change lanes until the driver had confirmed the suggestion through the turn-signal stalk.

Hardware 3

In March 2019, Tesla transitioned to Hardware 3 (HW3) for new cars. Completely automated lane changes without driver confirmation using "Navigate on Autopilot" were added as an option in an April software update, although Consumer Reports called the feature "far less competent" than a human driver. To comply with a new United Nations Economic Commission for Europe regulation on automatically commanded steering functions, Tesla provided an updated Autopilot in May, limited to Europe. In September, Tesla released software version 10 to Early Access Program testers, citing improvements in driving visualization and automatic lane changes. In 2021, Tesla began transitioning from radar to Tesla Vision, its camera-only system; in October 2022 it provided its reasoning, citing "safety". Vehicles manufactured after 2022 do not include radar or ultrasonic sensors.
In April 2026, Musk stated that, contrary to previous company promises, HW3 cars do not have the capability for unsupervised full self-driving. Musk said Tesla will offer hardware upgrades to HW3 cars to bring them up to HW4, without specifying a timeframe.

Hardware 4

Tesla started shipping cars with Hardware 4 (HW4) in January 2023, starting with the refreshed Model S and Model X; however, FSD was not available initially. The HW4 board has 16 GB of RAM and 256 GB of storage, two and four times the RAM and storage in HW3, respectively. Musk stated that HW4's computational capabilities are three to eight times those of HW3. It took six months before HW4-based cars ran camera-based software. Despite the increased image-sensor resolution of HW4-equipped cars, HW4 initially ran the FSD software by emulating HW3, including downsizing the camera images, a result of Tesla postponing training based on the new HW4 cameras. Tesla's next-generation computer, AI5, was scheduled for release in January 2026; Musk said it would be ten times more powerful than HW4 and use up to 800 watts when processing complex environments, versus a maximum of 300 watts for HW3 and HW4. In April 2026, however, Musk said that AI5 will be used for "Optimus and [Tesla's] supercomputer clusters. AI4 is enough to achieve much better than human safety for FSD."
Autopilot packages
Autopilot

Autopilot is the basic package included on the Models S, 3, X, and Y. Autopilot features adaptive cruise control (named Traffic-Aware Cruise Control, or TACC) and lane centering (Autosteer). The package also includes minor features such as the "green light chime" and standard safety systems such as automatic emergency braking, lane and roadway-edge departure warning and correction, and blind-spot indicators. The Cybertruck and standard variants of the Models 3/Y do not include Autosteer. In January 2026, Tesla announced that new vehicles sold would not include Autosteer.

Enhanced Autopilot (EAP)

Enhanced Autopilot is a middle-ground package, offering Summon, Auto Lane Change, Navigate on Autopilot, and Autopark. Vehicles that support EAP are the Models S, 3, X, and Y. North American customers are currently unable to purchase this package, although it remains active on cars for which it was purchased.

Navigate on Autopilot

Navigate on Autopilot allows the vehicle to maneuver on the highway from on-ramp to off-ramp, making lane changes and navigating interchanges as needed, similar to FSD. Upon exiting the highway, the vehicle notifies the driver and returns to basic Autosteer.

Summon

Summon is separated into three categories: Dumb Summon, Smart Summon, and Actual Smart Summon. It can be activated through the Tesla mobile app or the key fob. Dumb Summon is used to move the vehicle forwards or backwards. Smart Summon, now deprecated, drives the car to either the user or a designated location, relying on the vehicle's ultrasonic sensors (USS) to navigate and avoid collisions. Actual Smart Summon (ASS) performs the same job as its predecessor but uses the onboard cameras instead of USS. Actual Smart Summon can only be used through the app and features a larger summon radius than Smart Summon. Within the Tesla app's user interface, ASS displays the vehicle's cameras during summoning, while the previous iteration did not.
Autopark

Autopark is capable of parking the vehicle for the driver. Users are given the option to use Autopark when the vehicle detects an empty parking space, and Autopark maneuvers the vehicle into the spot. At first, to activate the system, drivers were required to drive slowly past an empty spot until the system detected the space. In 2024, Tesla released a redesigned Autopark, which uses the vehicle's cameras and introduced a "tap-to-park" system: users are shown all possible parking spots and can choose a specific one. Autopark is currently only able to back into spaces.

Full Self-Driving (FSD)

Full Self-Driving is the top-end package, featuring traffic-light and stop-sign recognition and "Autosteer on City Streets". Visualizations displayed on the vehicle's screen are more detailed, and the vehicle is able to navigate local roads, similar to Navigate on Autopilot. FSD is currently available in North America for all current Tesla models, including the Cybertruck.

Pricing

In 2015, Autopilot was US$2,500 on a Model S. In 2016, Enhanced Autopilot (EAP) was $5,000, and FSD was an add-on for $3,000. In April 2019, basic Autopilot became included in every Tesla car, and FSD was $5,000, growing to $10,000 in October 2020 and $15,000 in September 2022. As the price of FSD increased, the fraction of buyers who purchased it steadily declined, from an estimated 37% in 2019 to 22% in 2020 and 12% in 2021. Starting in 2021, the company offered an FSD subscription for $199 per month, or $99 per month if the customer had already purchased EAP. In September 2023, the price of FSD was reduced to $12,000. In April 2024, with the removal of EAP, Tesla reduced the FSD subscription price to $99 per month for both new users and users who had already purchased EAP, and reduced the purchase price of FSD to $8,000.
The reduction ran counter to Musk's earlier statements that the price of FSD would continue to increase, and angered existing FSD users who had paid the higher prices. In February 2026, Tesla ended the sale of FSD to consumers, leaving subscription as the only option. Musk also stated that as FSD becomes more capable, the subscription price will increase. For EAP owners, the subscription price was reduced to $49 per month.
Full Self-Driving capability
Approach

Tesla's approach to achieving SAE Level 5 is to train a neural network on the behavior of more than six million Tesla drivers, using chiefly visible-light cameras and the coarse-grained two-dimensional maps used for navigation. Tesla has made a deliberate decision not to use lidar, which Elon Musk has called "stupid, expensive and unnecessary". This makes Tesla's approach markedly different from that of other companies like Waymo and Cruise, which train their neural networks on the behavior of a small number of highly trained drivers and additionally rely on highly detailed (centimeter-scale) three-dimensional maps and lidar in their autonomous vehicles. According to Elon Musk, full autonomy is "really a software limitation: The hardware exists to create full autonomy, so it's really about developing advanced, narrow AI for the car to operate on." The Autopilot development focus is on "increasingly sophisticated neural nets that can operate in reasonably sized computers in the car". Tesla's software has been trained on three billion miles driven by Tesla vehicles on public roads. Alongside tens of millions of miles on public roads, competitors have trained their software on tens of billions of miles in computer simulations. In terms of computing hardware, Tesla designed a self-driving computer chip that has been installed in its cars since March 2019, and also designed and built in-house a neural-network training supercomputer ("Tesla Dojo"); other vehicle-automation companies such as Waymo regularly use custom chipsets and neural networks as well.

Predictions

In December 2015, Musk predicted that "complete autonomy" would be implemented by 2018. At the end of 2016, Tesla expected to demonstrate full autonomy by the end of 2017, and in April 2017, Musk predicted that in around two years, drivers would be able to sleep in their vehicle while it drives itself. In 2018, Tesla revised the date for demonstrating full autonomy to the end of 2019.
In 2019 and 2020, Tesla's order page for "Full Self-Driving Capability" stated:

Coming later this year:
• Recognize and respond to traffic lights and stop signs
• Automatic driving on city streets

In January 2020, Musk claimed the FSD software would be "feature complete" by the end of 2020, adding that feature complete "doesn't mean that features are working well". In August 2020, Musk stated that 200 software engineers, 100 hardware engineers and 500 "labelers" were working on Autopilot and FSD. In early 2021, Musk stated that Tesla would provide SAE Level 5 autonomy by the end of 2021. In a March 2021 conference call between Tesla and the California Department of Motor Vehicles (DMV), Tesla's director of Autopilot software revealed that Musk's comments "did not reflect engineering reality." Details of the call were made public via a Freedom of Information Act request by PlainSite. Speaking via video call at a 2023 AI conference held in Shanghai, Musk admitted that his former predictions were overly optimistic, and predicted that Tesla would finally realize fully autonomous vehicles at some point "later this year". During the Q1 2024 investors meeting in early 2024, Musk announced that he would reveal a new robotaxi product in August 2024, but Tesla only received a permit to operate autonomous vehicles in Texas in August 2025.

Full Self-Driving (beta)

In October 2020, Tesla first released a beta version of its FSD software to early access program testers, a small group of users in the United States. In January 2021, the number of employees and customers testing the beta FSD software was "nearly 1,000", expanding in May 2021 to several thousand employees and customers. In October 2021, Tesla began the wide release of the FSD beta to approximately 1,000 more drivers in the US, and the beta became accessible to Tesla drivers who achieved 100/100 on a proprietary safety-scoring system.
By November 2021 there were about 11,700 FSD beta testers and about 150,000 vehicles using Tesla's safety-score system; participation in the FSD beta grew to 60,000 users by January 2022 and 100,000 users by April 2022. In February 2023, 362,758 vehicles equipped with the FSD beta were recalled by the U.S. National Highway Traffic Safety Administration (NHTSA), and the company halted the addition of new participants. In March 2023, FSD Beta v11.3.1, which also merged Autopilot code with FSD, was released as a fix for the issues. In July 2023, NHTSA asked Tesla to clarify which changes had been made and when they were implemented. The NHTSA later reported 60 crashes and one fatality involving the use of FSD beta during the period August 2022 to August 2023. In August 2023, Musk livestreamed a 45-minute demo of the upcoming version 12 of FSD, which he claimed used machine learning and not any human-written code. There was one intervention: the vehicle misinterpreted a green left-turn arrow as allowing forward traffic and nearly ran the red light before Musk intervened.

Full Self-Driving (Supervised)

In April 2024, FSD version 12.3.3 officially replaced the word "beta" with "supervised" in its naming, and Tesla announced that users had driven over 1 billion miles on the FSD beta. Subsequently, Tesla announced a free one-month trial of FSD, and Musk mandated demonstrating FSD to all prospective buyers in the US. The wide release of version 12.4.3 introduced a vision-based monitoring system for cars with interior cameras, removing the need for torque-based attention monitoring. In early September 2024, the wide release of FSD version 12.5.3 introduced Actual Smart Summon and sunglasses support for the vision-based monitoring system. Shortly after, Tesla changed the name of its FSD package from "Full Self-Driving Capability" to "Full Self-Driving (Supervised)", along with its description.
At the end of September, Tesla released FSD version 12.5.5 for the Cybertruck, the defining feature of the release being the merging of the city and highway stacks. The release of version 12.5.3 deviated from previous software releases: previously, updates would roll out to HW3 vehicles before HW4 vehicles, but post-12.5.3, HW4 vehicles received updates in line with HW3 vehicles. In late October 2024, version 12.5.6.1 was rolled out to HW4 vehicles with general improvements, such as an end-to-end highway network, earlier and more natural lane-change decisions, and new speed profiles. In late November 2024, Tesla started to release different versions for HW3 and HW4 vehicles; HW3 vehicles stayed on version 12, while HW4 vehicles received FSD version 13.2. HW3 vehicles have been on FSD version 12.6.4 since early 2025. Version 12.6 features include speed profiles on higher-speed roads, end-to-end highway driving, and an improved controller, a feature from version 13. In April 2026, Tesla announced plans to upgrade HW3 vehicles to a version 14 "lite" in late June 2026. In January 2025, Tesla said its customers had driven 3 billion miles on FSD (Supervised), and that it had increased its artificial-intelligence training compute by 400% in 2024. In October 2025, Tesla released FSD version 14.1.3 to the public. New features include adjusted speed profiles, the removal of the maximum set speed, and new arrival options, which allow users to pick whether FSD should park curbside, in a parking lot, or in a driveway. A new profile called "Mad Max" was also added, which provides higher speeds and more aggressive lane changes than the existing "Hurry" mode. In May 2026, Tesla said that vehicles had driven 10 billion miles with FSD (Supervised), of which 3.7 billion miles were on city streets.
Full Self-Driving (Robotaxi)

On June 22, 2025, Tesla launched its commercial taxi service, Robotaxi, for a small group of invited users in Austin, Texas. Tesla said the vehicles were unmodified cars from its factory, with "Robotaxi" written on the front doors. Rides were priced at a flat rate of $4.20 within a geofenced area. While no one was in the driver's seat, a Tesla employee was still present in the front passenger seat for safety reasons. The service area in Austin has expanded four times since the initial launch. In late January 2026, Tesla launched Robotaxi services within Austin without a Tesla employee in the car. Tesla currently operates 22 fully driverless cars in Austin, six in Houston, and two in Dallas. On August 1, 2025, Tesla launched Robotaxi in San Francisco, although an employee is present in the driver's seat due to legal requirements; the service area covers the entire Bay Area. Tesla currently only has a permit in California for cars that have a Tesla-employed driver. In September 2025, Tesla received regulatory approval to begin testing Robotaxi in Nevada and Arizona, and in November 2025, Tesla received permits to begin operating Robotaxi in Arizona.
Regional availability
Outside of North America, Autopilot capabilities differ. While Enhanced Autopilot and Full Self-Driving are offered to customers, their feature set is more limited. Most regions offer Summon, Smart Summon, and Autopark with EAP and FSD. The Tesla AI team released a roadmap noting a Q1 2025 FSD release for China and Europe.

Australia and New Zealand

In Australia and New Zealand, Autopilot, EAP, and FSD are available. FSD includes the Enhanced Autopilot features and Traffic Light and Stop Sign Control. As of late 2025, FSD (Supervised) is available for HW4-equipped vehicles, while Autosteer on City Streets is listed as upcoming for HW3-equipped cars. In 2025, Tesla launched FSD v13 in Australia and New Zealand; the release is notable as the first in a right-hand-drive market in which the package is enabled.

Asia

China

In China, Autopilot, EAP, and FSD are available. As of 2025, FSD (Supervised) is available for HW4-equipped vehicles. For map data, Baidu Maps is used, and data collected within China is currently required to remain in the country. In 2024, Tesla began testing FSD in China following preliminary approval, and FSD was released to the public the next year. The Chinese version of FSD is unique in that it uses a separate data set specific to China.

Europe

In Europe, including the United Kingdom and Ireland, Autopilot, EAP, and FSD are available. FSD includes the Enhanced Autopilot features and Traffic Light and Stop Sign Control. Some features, such as auto lane change, require driver confirmation. Since the end of 2022, FSD has been in internal testing. In April 2024, a Swedish Transport Administration official received a demonstration of FSD in Germany. In September 2024, the UK's Department for Transport raised concerns, stating, "While [a driver assistance system] may help reduce collisions, it may also introduce new safety risks."
In November 2025, Tesla announced it was working with the Netherlands' vehicle authority, Dienst Wegverkeer (RDW), to gain approval for FSD's rollout in the country. On April 10, 2026, the RDW approved FSD (Supervised) for general traffic operation, making the Netherlands the first European country to do so.
Tesla Dojo
Tesla Dojo is a supercomputer designed from the ground up by Tesla for computer-vision video processing and recognition. It was planned to be used to train Tesla's machine-learning models to improve FSD. In August 2025, Bloomberg News reported that Tesla had shut down the Dojo project, although it was restarted in January 2026. Dojo was first mentioned by Musk in April 2019 and August 2020. In September 2021, a Tesla Dojo whitepaper was released. In August 2023, Tesla said that it had started production use of Dojo, configured with 10,000 Nvidia chips. Dojo consists of multiple cabinets; each cabinet holds multiple, vertically arranged training tiles, and each tile holds multiple Tesla-designed D1 processing chips with associated memory. According to Tesla's senior director of Autopilot hardware, Ganesh Venkataramanan, "Tesla places 25 of these chips onto a single 'training tile', and 120 of these tiles come together... amounting to over an exaflop [a million teraflops] of power". (For comparison, Nvidia stated that the pre-Dojo Tesla AI-training center used 720 nodes of eight Nvidia A100 Tensor Core graphics processing units (GPUs), 5,760 GPUs in total, for up to 1.8 exaflops of performance.) In April 2024, Musk said Tesla was using 35,000 Nvidia H100 chips and was on track to have invested $10 billion cumulatively by the end of the year to train the neural-network model for FSD.
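Venkataramanan's "over an exaflop" figure can be sanity-checked with simple arithmetic. The sketch below assumes roughly 362 teraflops (BF16) per D1 chip, a figure from Tesla's 2021 AI Day presentation that is not stated above:

```python
# Back-of-the-envelope check of the "over an exaflop" claim for Dojo.
# Assumption: ~362 TFLOPS (BF16) per D1 chip (Tesla AI Day 2021 figure).
TFLOP = 1e12

d1_chip_flops = 362 * TFLOP   # assumed per-chip throughput
chips_per_tile = 25           # "25 of these chips onto a single 'training tile'"
tiles = 120                   # "120 of these tiles come together"

tile_flops = chips_per_tile * d1_chip_flops   # ~9 petaflops per training tile
total_flops = tiles * tile_flops              # ~1.09e18, i.e. just over 1 exaflop

print(f"per tile:  {tile_flops / 1e15:.2f} PFLOPS")
print(f"combined:  {total_flops / 1e18:.3f} EFLOPS")
```

On these assumptions, 25 chips × 120 tiles works out to about 1.09 exaflops, consistent with the quoted claim.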
Driving features
Tesla's Autopilot is classified as Level 2 under the SAE's six levels (0 to 5) of vehicle automation. At this level, the car can act autonomously, but requires the driver to monitor the driving at all times and be prepared to take control at a moment's notice. Tesla's owner's manual states that Autopilot should not be used on city streets or on roads where traffic conditions are constantly changing; however, some FSD capabilities, such as "Traffic and Stop Sign Control (Beta)" and "Autosteer on City Streets", are advertised for urban driving.
Comparisons and evaluations
• In 2018, Consumer Reports rated Tesla Autopilot second best out of four "partially automated driving systems" (GM, Tesla, Nissan, Volvo). Autopilot scored highly for its capabilities and ease of use, but was worse at keeping the driver engaged than the other manufacturers' systems.
• In 2018, the Insurance Institute for Highway Safety (IIHS) compared Tesla, BMW, Mercedes and Volvo "advanced driver assistance systems" and stated that the Tesla Model 3 experienced the fewest incidents of crossing over a lane line, touching a lane line, or disengaging.
• In February 2020, Car and Driver compared GM's Super Cruise, comma.ai and Autopilot. They called Autopilot "one of the best", highlighting its user interface and versatility, but criticized it for swerving abruptly.
• In June 2020, Digital Trends compared GM's Super Cruise and Tesla Autopilot, concluding: "Super Cruise is more advanced, while Autopilot is more comprehensive."
• In October 2020, the European New Car Assessment Programme gave the Tesla Model 3 Autopilot a score of "moderate".
• Also in October 2020, Consumer Reports evaluated 17 driver-assistance systems and concluded that Tesla Autopilot was "a distant second" behind GM's Super Cruise, although Autopilot was ranked first in the "Capabilities and Performance" and "Ease of Use" categories.
• In February 2021, a MotorTrend review compared GM's Super Cruise and Autopilot and said Super Cruise was better, primarily due to safety.
• In May 2021, consulting firm Guidehouse Insights ranked Tesla Full Self-Driving last in strategy and execution among 15 companies.
• In January 2023, Consumer Reports rated "active driving assistance systems" and ranked Tesla Autopilot 7th out of 12. The Full Self-Driving package was not tested.
• In October 2023, Consumer Reports rated "active driving assistance systems" and ranked Tesla Autopilot 8th out of 17. The Full Self-Driving package was not tested.
• In December 2023, TechCrunch ranked Full Self-Driving version 11 last out of five systems evaluated, saying "it's pretty easy to choose a loser. Three years after its initial beta release, Tesla's supposed Full Self-Driving still doesn't live up to its name", adding "the FSD beta software [was] frequently confused on urban and rural streets" and "Tesla's driver monitoring was by far the most lax of those tested".
• In March 2024, IIHS reported its first "partial automation safeguard ratings", ranking Tesla Autopilot and Full Self-Driving version 11 as "poor", along with 9 of 12 other systems.
• In January 2026, MotorTrend evaluated Ford BlueCruise, GM Super Cruise, Hyundai Drive Assist, and BMW Highway Assistant, and said that Tesla Full Self-Driving was the best driver assistance system on the market, "and it isn't very close ... v14 ... is vastly improved over ... v12 ... No other ADAS on the market today can ... drive on most roads".
• In May 2027, the NHTSA said that Tesla Model Ys built on or after November 12, 2025 were the first vehicles to pass the agency's new model year (MY) 2027 benchmark for advanced driver assistance systems.
Criticism
Tesla's self-driving strategy has been criticized as dangerous and obsolete, having been abandoned by other companies years ago. Most experts believe that Tesla's approach of trying to achieve autonomous vehicles by eschewing high-definition maps and lidar is not feasible. Auto analyst Brad Templeton has criticized Tesla's approach, arguing, "The no-map approach involves forgetting what was learned before and doing it all again." In a May 2021 study by Guidehouse Insights, Tesla was ranked last for both strategy and execution in the autonomous driving sector. Of lidar, which Tesla has rejected, "experts and proponents say it adds depth and vision where camera and radar alone fall short." Similarly, in 2026, former Waymo CEO John Krafcik stated that Tesla's camera-only approach, along with the cameras' limited resolution and wide field of view, gave the system myopia and made it worse than a human driver. An August 2021 study conducted by Missy Cummings et al. found three Tesla Model 3 cars exhibited "significant between and within vehicle variation on a number of metrics related to driver monitoring, alerting, and safe operation of the underlying autonomy... suggest[ing] that the performance of the underlying artificial intelligence and computer vision systems was extremely variable." In September 2021, legal scholars William Widen and Philip Koopman argued that Tesla's advertising of FSD as an SAE Level 2 system was misleading, intended to "avoid regulatory oversight and permitting processes required of more highly automated vehicles". Instead, they argued FSD should be considered an SAE Level 4 technology and urged state departments of transportation in the U.S. to classify it as such, since publicly available videos show that "beta test drivers operate their vehicles as if to validate SAE Level 4 (high driving automation) features, often revealing dramatically risky situations created by use of the vehicles in this manner."
Protestors against Tesla's deployment of its trial robotaxi service in Austin, Texas, attended a June 2025 demonstration of the limitations of FSD (Supervised) hosted by Tesla Takedown and The Dawn Project. Some have criticized these tests, noting issues such as the vehicle's older hardware and the video's misleading title: the title implied testing of the vehicle's "Full Self-Driving" features, but the demonstration instead used Autopilot and its older code.
Safety statistics and concerns
Safety statistics

Tesla claims that its driver-assistance features improve safety and reduce accidents caused by driver fatigue or inattention. However, collisions and fatalities involving Autopilot have attracted scrutiny from media and regulators. Industry experts and safety advocates have raised concerns about the deployment of beta software to the general public, calling the practice risky and potentially irresponsible. In April 2016, Elon Musk stated, without citing any references, that the probability of an accident was at least 50% lower when using Autopilot. At the time, it was estimated that collectively, Teslas had been driven for 47 million miles in Autopilot mode. After the first widely publicized fatal Autopilot crash in May 2016, which occurred in Williston, Florida, Tesla acknowledged the death and published a blog post in June comparing the average fatality rate in the United States (at the time, one per 94 million miles) and worldwide (one per 60 million miles) with that of Tesla Autopilot (one per 130 million miles); Tesla stated in July that "customers using Autopilot are statistically safer than those not using it at all", "the [Autopilot] system provided a net safety benefit to society", and "the 'better-than-human' threshold had been crossed and robustly validated internally". Tesla's statistical approach was criticized for comparing two different datasets: while Autopilot is limited to highway driving, the overall death rate for the United States includes more varied driving conditions. In addition, Tesla's vehicles were larger and more expensive than most vehicles on the road, making them generally safer in a crash. Other factors that could have affected the data include weather conditions and Tesla owner demographics.
Fortune criticized Tesla's sale of US$2 billion in stock, noting the sale occurred less than two weeks after Tesla "immediately" reported the fatal early-May crash to the NHTSA, but before Tesla posted its public acknowledgement of the crash in late June; the article stated that "Tesla and Musk did not disclose the very material fact that a man had died while using an auto-pilot technology that Tesla had marketed vigorously as safe and important to its customers." Musk responded to the article with a statistical argument in an email to the reporter, saying "Indeed, if anyone bothered to do the math (obviously, you did not) they would realize that of the over 1M auto deaths per year worldwide, approximately half a million people would have been saved if the Tesla autopilot was universally available. Please, take 5 mins and do the bloody math before you write an article that misleads the public." Following the Williston crash, NHTSA released a preliminary report in January 2017 stating "the Tesla vehicles' crash rate dropped by almost 40 percent after Autosteer installation." NHTSA did not release the underlying data until November 2018. A private company, Quality Control Systems, released a report in February 2019 analyzing the same data and concluding that the NHTSA finding was "not well-founded". Part of the data verification included scrutiny of the 43,781 vehicles NHTSA claimed had Autosteer installed; of those, only 5,714 had both an exact odometer reading at the time Autosteer was installed and airbag deployment data. Collectively, the data for those 5,714 vehicles showed 32 airbag deployments in the miles traveled before installation and 64 in the miles traveled after, meaning the crash rate, as measured by airbag deployments per million miles of travel, actually increased by 59% after the installation of Autosteer. Despite this analysis, Tesla continued using the NHTSA values to demonstrate decreased accident rates while using Autopilot.
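Musk's "do the math" rebuttal above rests on a simple extrapolation: applying his April 2016 claim of an at-least-50% accident reduction to the worldwide death toll. A sketch of that arithmetic (figures from the text; the extrapolation is Musk's, not an established result):

```python
# Musk's implied calculation: over 1M road deaths per year worldwide,
# times his claimed >=50% accident reduction with Autopilot, yields
# the "approximately half a million people saved" figure.
annual_road_deaths = 1_000_000  # "over 1M auto deaths per year worldwide"
claimed_reduction = 0.50        # Musk's April 2016 claim, uncited
deaths_avoided = int(annual_road_deaths * claimed_reduction)
print(f"Implied lives saved per year: {deaths_avoided:,}")
```

As critics noted, this extrapolates a highway-only Level 2 system to all driving conditions worldwide.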
In February 2020, Andrej Karpathy, Tesla's head of AI and computer vision, stated that Tesla cars had driven 3 billion miles on Autopilot, of which 1 billion had been driven using Navigate on Autopilot; that Tesla cars had performed 200,000 automated lane changes; and that 1.2 million Smart Summon sessions had been initiated. He also stated that Tesla cars were avoiding pedestrian accidents at a rate of tens to hundreds per day. The company stopped publishing safety statistics in 2022, but resumed in January 2023. The first comparable safety statistics using Full Self-Driving were released in March 2023; Tesla stated that vehicles operating under FSD experienced a crash that deployed the airbag approximately every 3.2 million miles, compared with all crashes with airbag deployment reported to the police, which occur approximately every 0.6 million miles. Additionally, a statistical analysis first published as a preprint in 2021 and in final form in 2023 criticized Tesla's self-reported crash rate data for failing to account for vehicle owner demographics and the types of roads on which Autopilot was being operated.

Fatal and nonfatal crashes

After Tesla software version 7.0 was released in October 2015 and Tesla claimed Autopilot would "[relieve] drivers of the most tedious and potentially dangerous aspects of road travel", the first fatal crashes involving Autopilot occurred less than a year later, in China (January 2016) and the United States (May 2016). Tesla stated it had immediately informed the NHTSA about the US crash in May, but the NHTSA did not announce it until June 30, 2016, when it was widely reported as the first Autopilot-related fatality; the death in China was not reported in the US until that September. The first Autopilot death in the US occurred after the Tesla collided with the side of a semi-trailer truck and underrode that vehicle's trailer, shearing off the car's greenhouse.
After the fatality, Tesla stated that Autopilot failed to recognize the white trailer against a bright sky. Musk reported that improvements to Autopilot in September 2016 would "[make] much more effective use of radar" and "very likely" would have prevented the fatal accident. Despite these improvements, the shift to a different hardware platform, and additional updates to Autopilot, another fatal crash occurred in May 2019 when a Tesla again underrode the side of a trailer. Five years after the first fatalities, in 2021, Tesla began transitioning to "Tesla Vision" by removing the radar from new Model 3 and Model Y vehicles; in 2023, The Washington Post reported that Musk had pushed for a camera-only approach over the objections of Tesla engineers. Many of these incidents have received varying degrees of attention from news publications. Additionally, the NHTSA has cited hundreds of nonfatal accidents in official documents. In addition to failing to recognize the side of a trailer, Autopilot crashes have been blamed on driver distraction, inability to detect stationary emergency vehicles, and misuse outside the stated operational design domain of "controlled-access highways [...] with a center divider, clear lane markings, and no cross traffic".

General concerns

The National Transportation Safety Board (NTSB) criticized Tesla's lack of system safeguards in a fatal 2018 Autopilot crash in California, and for failing to foresee and prevent "predictable abuse" of Autopilot. Following this collective criticism, amid increased regulatory scrutiny of ADAS technology, especially Tesla Autopilot, in June 2021 the NHTSA announced an order requiring automakers to report crashes involving vehicles equipped with ADAS features in the United States.
In April 2024, the NHTSA released the findings of a three-year investigation of 956 vehicle collisions in which Tesla Autopilot was thought to have been in use; it found that the system had contributed to at least 467 collisions, including 13 that resulted in fatalities.

Driver monitoring

Musk stated in October 2015 that "we're advising drivers to keep their hands on the wheel [when Autopilot is engaged]". Despite this, multiple videos were posted to YouTube at the time showing drivers using Autopilot hands-free, including Musk's ex-wife Talulah Riley. As initially released, the Autopilot system used a torque sensor to detect whether the driver's hands were on the steering wheel and gave audible and visual warnings for the driver to take the wheel when no torque was detected, but several owners confirmed they could drive for several minutes hands-free before receiving a warning. At least one device designed to defeat the torque sensor was ordered by NHTSA to discontinue sales in 2018. Initially, Tesla decided against adding more advanced driver monitoring options that would ensure drivers remained engaged with the driving task. In late May 2021, a new version of the software enabled the driver-facing cameras inside new Model 3 and Model Y vehicles (the first cars in the switch to Tesla Vision) to monitor driver attentiveness while using Autopilot. Model S and Model X cars made before 2021 do not have an interior camera and therefore physically cannot offer such capabilities, although the refreshed versions are expected to have one. A review of the in-cabin camera-based monitoring system by Consumer Reports found that drivers could still use Autopilot even when looking away from the road or using their phones, and could also enable FSD beta software "with the camera covered."
In 2022, Musk agreed to a proposal on Twitter that "users with more than 10,000 miles on FSD Beta should be given the option to turn off the steering wheel nag", saying the system would be updated in January 2023. In April 2023, Musk confirmed the nag was being reduced gradually. That June, a hacker discovered that FSD Beta had an undocumented mode which disables all driver monitoring. The NHTSA wrote a letter to Tesla under the authority of EA 22-002 on July 26, noting the new mode "could lead to greater driver inattention and failure of the driver to properly supervise Autopilot". Attached to the letter was a Special Order requesting the date the software was updated to include the hidden mode, the detailed steps or conditions required to unlock that mode, and the reasons why Tesla issued the updates. Tesla responded by August 25; the response was considered confidential and no public version is available. A "nag elimination" module sold as an aftermarket accessory automatically adjusts the audio volume via the steering wheel controls, which is registered as steering wheel input, allowing drivers to take their hands off the wheel. Anecdotal evidence suggests the module is effective only for Tesla vehicles sold in the United States and Canada, leading to speculation that the driver monitoring software differs by region.

Detecting stationary vehicles at speed

Autopilot may not detect stationary vehicles; the manual states: "Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over [...] and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead." This has led to numerous crashes with stopped emergency vehicles.

Dangerous and unexpected behavior

In a 2019 Bloomberg survey, hundreds of Tesla owners reported dangerous behaviors with Autopilot, such as phantom braking, veering out of lane, or failing to stop for road hazards.
Autopilot users have also reported the software crashing and turning off suddenly, collisions with off-ramp barriers, radar failures, unexpected swerving, tailgating, and uneven speed changes. Ars Technica notes that the brake system tends to initiate later than some drivers expect. One driver claimed that Tesla's Autopilot failed to brake, resulting in collisions, but Tesla pointed out that the driver had deactivated the car's cruise control prior to the crash. The automatic emergency braking (AEB) system has also initiated sooner than some drivers expect due to a software error, which led to a recall in 2021 for false activation of the AEB system. Ars Technica also noted that while lane changes may be semi-automatic (if Autopilot is on and the vehicle detects slow-moving cars, or if a lane change is required to stay on route, the car may change lanes without driver input), the driver must show the car that they are paying attention by touching the steering wheel before the car makes the change. In 2019, Consumer Reports described Tesla's automatic lane-change feature as "far less competent than a human driver".

Data collection and privacy

Most modern vehicles, including Teslas, are equipped with event data recorders which collect vehicle data to aid investigations and diagnostics. Data collected includes speed, acceleration, brake use, steering input, and driver-assistance feature status. Tesla vehicles permanently record this data as "gateway log" files onto a microSD card in the Media Control Unit, at a rate of approximately 5 times per second (5 Hz). Gateway log files are uploaded to Tesla when the vehicle connects to a Wi-Fi network. The Autopilot computer stores images (for all vehicles) and video (for model years 2016 and later), along with driving data similar to that captured in gateway log files but at a higher temporal resolution (up to 50 Hz), and uploads these to Tesla periodically. These "snapshots" are deleted locally after being uploaded.
Tesla has been silent about its data retention policies. When the control inputs generated by the shadow mode Autopilot do not match those of the human driver, the vehicle may record a snapshot to assist in training the system, after which the data may be reviewed by the Autopilot team. As explained by Karpathy, Tesla can deploy additional software "detectors" triggered by specific situations identified by snapshot data, which then upload camera and other data to Tesla when similar situations are detected. These data are used to revise the existing detectors. Under Tesla's privacy policies, the company does not sell customer and vehicle data, but may share the data with government entities. Tesla Vision relies on the "Autopilot labeling team", who view short video clips recorded by vehicle cameras and label visible signs and objects, training the machine vision interpreter. Data labeling was first handled by a non-profit outsourcing company named Samasource, which initially provided 20 workers in Nairobi, Kenya. In April 2023, it was revealed that San Mateo labeling employees had shared clips internally among themselves, including recordings of privately owned areas such as garages, as well as crashes, road-rage incidents, and meme videos annotated with "amusing captions or commentary". Former Tesla employees described the San Mateo office atmosphere as "free-wheeling" and noted "people who got promoted to lead positions shared a lot of these funny [clips] and gained notoriety for being funny." In one case, the submersible Lotus Esprit prop featured in the James Bond film The Spy Who Loved Me, which had been purchased by Elon Musk in 2013 and stored in his garage, was recorded and shared by labeling team members. Because sharing these clips was apparently for entertainment and not related to Autopilot training, Carlo Plitz, a data privacy lawyer, noted "it would be difficult to find a legal justification" for doing so. 
After the San Mateo office was closed in June 2022, the labeling team moved to Buffalo, New York, where Tesla has a total of 675 employees, hundreds of whom are labelers.
Regulatory and legal actions
Regulation

In 2015, a spokesman for the US National Highway Traffic Safety Administration (NHTSA) said that "any autonomous vehicle would need to meet applicable federal motor vehicle safety standards" and that the NHTSA "will have the appropriate policies and regulations in place to ensure the safety of this type of vehicles". On February 1, 2021, Robert Sumwalt, chair of the NTSB, wrote a letter to NHTSA regarding that agency's "Framework for Automated Driving System Safety", which had been published for comment in December 2020. In the letter, Sumwalt recommended that NHTSA include user monitoring as part of the safety framework and reiterated that "Tesla's lack of appropriate safeguards and NHTSA's inaction" on the NTSB's recommendation "that NHTSA develop a method to verify that manufacturers of vehicles equipped with Level 2 incorporate system safeguards that limit the use of automated vehicle control systems to the conditions for which they were designed" was a contributing cause of a fatal crash in Delray Beach, Florida, in 2019. Under NHTSA's subsequent reporting order, reporting is limited to crashes where the ADAS or ADS was engaged within 30 seconds prior to the crash and that involve an injury requiring hospitalization, a fatality, a vehicle towed from the scene, an air bag deployment, or a "vulnerable road user" (e.g., a pedestrian or bicyclist); these crashes must be reported to NHTSA within one calendar day, with an updated report required within 10 calendar days. On August 16, 2021, after reports of 17 injuries and one death in car crashes involving emergency vehicles, US auto safety regulators opened a formal safety probe (PE 21-020) into Tesla's driver assistance system Autopilot. Initial data from SGO 2021-01 were released in June 2022; 12 manufacturers reported 392 crashes involving ADAS (Level 2) between July 2021 and May 15, 2022. Of those 392 crashes, 273 involved Tesla vehicles, out of approximately 830,000 Tesla vehicles equipped with ADAS.
Honda had the next highest total, with 90 crashes reported out of approximately 6 million Honda vehicles equipped with ADAS. The NHTSA said Tesla's numbers may appear high because it has real-time crash reports, whereas other automakers do not, so their crash reports may be delivered more slowly or not reported at all. Collectively, five people were killed and six more were seriously hurt in the 392 reported ADAS crashes. SGO 2021-01 also applied to manufacturers of vehicles equipped with ADS (Levels 3 through 5); 25 ADS manufacturers reported 130 crashes in total in the initial June 2022 data release, led by Waymo (62), Transdev Alternative Services (34), and Cruise LLC (23). In most cases, these crashes involved the ADS vehicle being struck from the rear; only one serious injury was reported, and 108 of the 130 crashes resulted in no injury.

Lawsuits

A class action lawsuit over delays in delivering promised Autopilot features was settled in 2018; owners who in 2016 and 2017 paid to equip their cars with the updated Autopilot software were compensated between $20 and $280 for the delay in implementing Autopilot 2.0. In 2020, a German court ruled, in a lawsuit brought in 2019, that Tesla had violated advertising regulations with its marketing of Autopilot. On appeal, that decision was reversed in 2021 by a higher court, on the condition that Tesla clarify the capabilities of Autopilot on its website. In July 2022, a German court awarded a plaintiff most of the purchase price she had paid for a Model X, based in part on a technical report demonstrating that Autopilot did not reliably recognize obstacles and would unnecessarily activate its brakes, which could cause a "massive hazard" in cities; Tesla's lawyers argued unsuccessfully that Autopilot was not designed for city traffic. In September 2022, a class action lawsuit was filed in the U.S.
District Court for the Northern District of California alleging that "for years, Tesla has deceptively and misleadingly marketed its ADAS technology as autonomous driving technology under various names, including 'Autopilot,' 'Enhanced Autopilot,' and 'Full Self-Driving Capability'", adding that Tesla represented "that it was perpetually on the cusp of perfecting that technology and finally fulfilling its promise of producing a fully self-driving car", while "Tesla knew for years its statements regarding its ADAS technology were deceptive and misleading, but the company made them anyway." Tesla filed a motion in November 2022 to dismiss the case, defending the company's actions as "mere failure to realize a long-term, aspirational goal [of a fully self-driving car] [and] not fraud", and basing the motion on the private arbitration clause in the purchasing contract signed by each buyer. A second class action lawsuit was filed in the same court by Tesla shareholders in late February 2023. The complaint alleges the defendants "had significantly overstated the efficacy, viability, and safety of [Tesla's] Autopilot and FSD technologies" and that those same systems "created a serious risk of accident and injury", which "subjected Tesla to an increased risk of regulatory and governmental scrutiny and enforcement action", linking multiple specific accidents to documented decreases in share prices. The suit was dismissed without prejudice in September 2024, as the judge ruled that Musk's claims were "corporate puffery". Tesla's lawyers had argued that puffery covered statements such as "[A]utopilot is 'superhuman' and we want to get as close to perfection as possible"; as the judge wrote in the order granting the dismissal, "these vague statements of corporate optimism are not objectively verifiable". However, Tesla's lawyers also argued that other statements, including that safety is "paramount" and that Tesla cars are "absurdly safe", were also puffery, which the judge rejected, as those statements were objectively verifiable.
In April 2023, Tesla was found not liable in a lawsuit filed in 2020 by a driver who sued for damages after she claimed the Autopilot system guided her Tesla Model S into a curb, resulting in an airbag deployment and facial injuries. Jurors explained in post-trial interviews that "Autopilot never confessed to be self pilot. It's not a self-driving car ... [Tesla] were adamant about a driver needing to always be aware." Additional lawsuits have been filed by the estates of two drivers killed in 2019 while using Autopilot, one in California and one in Florida. In the California case, which had not previously been reported, Tesla argued that the driver had consumed alcohol and that it was not clear Autopilot was engaged; the plaintiff's lawyers alleged that a known defect in the Autopilot system had caused the vehicle to veer off a highway and strike a palm tree. Tesla prevailed in that case, with the jury voting 9–3 in October 2023 that there was no manufacturing defect. In the Florida case, the judge rejected Tesla's motion to dismiss, concluding that he could not "imagine how some ordinary consumers would not have some belief that the Tesla vehicles were capable of driving themselves hands free", citing "reasonable evidence" demonstrating that Tesla had "engaged in a marketing strategy that painted the products as autonomous" and that Musk's statements "had a significant effect on the belief about the capabilities of the products". In the first federal case involving Tesla's Autopilot to go to trial, Tesla was found partially responsible on August 1, 2025, for a 2019 crash which killed one pedestrian and seriously injured another; the lawsuit also accused Tesla of hiding data that the car produced in the moments leading up to the crash. Tesla was ordered to pay US$243 million in compensatory and punitive damages and has appealed the verdict.
False or misleading advertising

The Center for Auto Safety and Consumer Watchdog wrote to the Federal Trade Commission (FTC) in 2018, asking it to open an investigation into the marketing of Autopilot. The letter stated that "the marketing and advertising practices of Tesla, combined with Elon Musk's public statements, have made it reasonable for Tesla owners to believe, and act on that belief, that a Tesla with Autopilot is an autonomous vehicle capable of 'self-driving'". The groups renewed their appeal to the FTC, and added the California DMV, in 2019, noting that "Tesla continues to be the only automaker to describe its Level 2 vehicles as 'self-driving' and the name of its driver assistance suite of features, Autopilot, connotes full autonomy." U.S. Senators Ed Markey (D-MA) and Richard Blumenthal (D-CT) echoed these concerns to the FTC in 2021. A 2019 IIHS study showed that the name "Autopilot" causes more drivers to misperceive behaviors such as texting or taking a nap as safe, compared with similar Level 2 driver-assistance systems from other car companies. In 2020, UK safety experts called Tesla's Autopilot "especially misleading". While Euro NCAP's testing of Autopilot on a 2020 Model 3 found that the system excelled in the level of vehicle assistance provided, the association noted the misleading nature of the system's name and a risk of over-reliance on the system. In 2020, usability engineer Dixon published a paper which called Tesla's descriptions of Autopilot and FSD capabilities exaggerated. In 2021, following more than a dozen Autopilot crashes (some fatal), the U.S. Department of Justice (DOJ) started a criminal investigation to determine whether Tesla had misled consumers, investors, and regulators about Autopilot. Tesla confirmed the DOJ had requested Autopilot- and FSD-related documents in its 10-K filing for 2022. The Securities and Exchange Commission also opened an independent civil probe into statements made by Tesla and its executives about Autopilot.
In July 2022, the California DMV filed two complaints with the state Office of Administrative Hearings alleging that Tesla "made or disseminated statements that are untrue or misleading, and not based on facts" relating to both "Autopilot and Full Self-Driving technologies". In August 2022, Tesla requested a hearing to present its defense. In September 2022, California governor Gavin Newsom signed state bill SB 1398, which took effect January 1, 2023, and prohibits any manufacturer or dealer of cars with partial driver automation features from using misleading language to advertise their vehicles as autonomous, such as by naming the system "Full Self-Driving".

Deceptive promotion of Full Self-Driving

In October 2016, at the same time as the release of HW2, Tesla released a video entitled "Full Self-Driving Hardware on All Teslas" that claimed to demonstrate Full Self-Driving, the system designed to extend automated driving to local roads. Musk later tweeted a link to a longer version in November 2016. Throughout the video, the driver does not touch the steering wheel or pedals. The video also shows perspectives from the vehicle's cameras and image recognition system. At Musk's suggestion, the title card states "The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself." It was nicknamed the "Paint It Black" video, after the 1966 Rolling Stones song used as its soundtrack. Former Tesla employees later stated the vehicle was following a route that had been mapped with detailed scanning cameras, a technology not available in Tesla production cars. Even with these augmentations in place, human drivers had to intervene to take control; the vehicle allegedly struck "a roadside barrier" on the Tesla grounds during filming, requiring repairs, and the car crashed into a fence while trying to park automatically.
In January 2024, Bloomberg published an exposé based on internal Tesla emails revealing that Musk personally oversaw the editing and post-production of the video. Motor Trend and Jalopnik compared what Tesla had showcased to the deceptive video depicting a Nikola One EV truck that was actually powered by gravity; Jalopnik commented "[the Tesla video] may be worse, because this video was used to deceptively suggest capabilities of a system deployed into real people's hands and used on public roads." In June 2022, Ashok Elluswamy, director of Autopilot software, made a statement during a deposition taken for a civil lawsuit filed against Tesla by the family of a driver who was killed in 2018 after the Model X he was driving using Autopilot crashed into a concrete barrier in Mountain View, California. Elluswamy stated the video was not originally intended "to accurately portray what was available for customers in 2016. It was to portray what was possible to build into the system," while the final video carried no such disclaimer. A Florida circuit court judge also cited the final video as part of Tesla's marketing strategy in rejecting Tesla's motion to dismiss a lawsuit over a 2019 death, writing that "absent from this video is any indication that the video is aspirational or that this technology doesn't currently exist in the market."

In August 2021, the NHTSA Office of Defects Investigation (ODI) opened a preliminary evaluation (PE) designated PE 21-020 and released a list of eleven crashes involving Tesla vehicles striking stationary emergency vehicles; in each instance, NHTSA confirmed that Autopilot or Traffic-Aware Cruise Control was active during the approach to the crash. Of the eleven crashes, seven resulted in seventeen total injuries, and one resulted in one fatality. The scope of the planned evaluation of the Autopilot system specifically addressed the systems used to monitor and enforce driver engagement.
In September 2021, NHTSA added a twelfth accident, which occurred in Orlando in August 2021, to the investigation list. NHTSA sent a request for information relating to PE 21-020 to Tesla's director of field quality in August 2021, with a response due by October 22, 2021. In September 2021, NHTSA sent a request for information to Tesla and other automobile manufacturers for comparative ADAS data. After Tesla deployed its Emergency Light Detection Update in September 2021, NHTSA sent a follow-up letter to Tesla in October 2021 asking for "a chronology of events, internal investigations, and studies" that led to the deployment of the update, as it potentially addressed a safety defect, which requires a formal recall. In February 2022, NHTSA ODI opened a second preliminary evaluation (PE 22-002) for "phantom braking" in 2021–2022 Tesla Model 3 and Model Y vehicles. PE 22-002 was correlated with the removal of radar hardware from those vehicles in May 2021; at the time PE 22-002 was opened, the NHTSA was not aware of any crashes or injuries resulting from the complaints. According to some complaints, while using Autopilot, "rapid deceleration can occur without warning, at random, and often repeatedly in a single drive cycle." By May 2022, NHTSA had received 758 reports of unexpected braking when Autopilot was in use and requested that Tesla respond to questions by June 20, 2022. Also in June 2022, NHTSA ODI upgraded PE 21-020 to an engineering analysis (EA), designating it EA 22-002 and covering an estimated 830,000 Tesla vehicles sold between 2014 and 2022.
Data for PE 21-020 had been supplemented by prior information requests to Tesla (April 19, 2021) and by Standing General Order (SGO) 2021-01, issued June 29, 2021, and amended on August 5, 2021. The investigation was expanded to an engineering analysis after NHTSA reviewed data from 191 crashes involving the use of Autopilot or related ADAS Level 2 technologies (Traffic-Aware Cruise Control, Autosteer, Navigate on Autopilot, or Auto Lane Change). In many of those crashes, Autopilot relinquished vehicle control less than one second before the first impact, which may not have been enough time for the driver to assume full control. In addition, the data suggest that Tesla's requirement for Autopilot drivers to have their hands on the wheel at all times may not be sufficient to ensure the driver is paying attention to the driving task. NHTSA sent a second letter for EA 22-002 to Tesla in August 2022, which included requests for a description of the role of the driver-facing camera, identification of all lawsuits or arbitration resulting from Autopilot use (including complete transcripts of depositions), and "the engineering and safety explanation and evidence for design decisions regarding enforcement of driver engagement / attentiveness". Tesla submitted a response in September. A follow-up letter was sent in July 2023, asking for current data and updates to the prior response. Starting in October 2023, NHTSA conveyed its preliminary conclusions to Tesla during several meetings; Tesla, while not concurring with NHTSA's analysis, then conducted a voluntary recall on December 5, 2023, providing an over-the-air software update to "incorporate additional controls and alerts ... to further encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged ... [and providing] additional checks upon engaging Autosteer and while using the feature outside controlled access highways". EA 22-002 was closed in April 2024.
ODI concluded that its "analysis of crash data indicates that, prior to [the December 2023 recall], Autopilot's design was not sufficient to maintain drivers' engagement", citing data showing that in 59 of 109 crashes, hazards were visible for at least five seconds prior to the collision. Based on vehicle telemetry, ODI added that "the warnings provided by Autopilot when Autosteer was engaged did not adequately ensure that drivers maintained their attention on the driving task", showing that in approximately 80% of 135 incidents, braking and/or steering did not occur until less than one second before a collision. Upon closing EA 22-002, ODI opened a new investigation into the adequacy of the recall remedy, noting "concerns [identified] due to post-remedy crash events and results from preliminary NHTSA tests of remedied vehicles. Also, Tesla has stated that a portion of the remedy both requires the owner to opt in and allows a driver to readily reverse it. Tesla has also deployed non-remedy updates to address issues that appear related to ODI's concerns under EA22002." In March 2026, the NHTSA stated that FSD in low-visibility conditions fails to detect hazards and/or alert drivers to deteriorated camera performance until immediately before a crash.

Recalls

Tesla issued an "Emergency Light Detection Update" for Autopilot in September 2021, which was intended to detect "flashing emergency vehicle lights in low light conditions and then [respond] to said detection with driver alerts and changes to the vehicle speed while Autopilot is engaged", after NHTSA had opened PE 21-020 the previous month. After the update was issued, NHTSA sent a letter to Tesla asking why the update had not been performed under the recall process, as "any manufacturer issuing an over-the-air update that mitigates a defect that poses an unreasonable risk to motor vehicle safety is required to timely file an accompanying recall notice to NHTSA."
Tesla issued a recall of 11,728 vehicles in October 2021 due to a communication error that could lead to false forward-collision warnings or unexpected activations of the automatic emergency braking system. The error had been introduced by the Full Self-Driving beta software version 10.3 over-the-air firmware update and was reversed by another over-the-air update the same month; the recalled vehicles were reverted to version 10.2, then updated to 10.3.1. On February 1, 2022, after the NHTSA advised Tesla that failing to stop for a stop sign can increase the risk of a crash and threatened "immediate action" over "intentional design choices that are unsafe", Tesla recalled nearly 54,000 vehicles to disable the rolling-stop behavior, removing the feature with an over-the-air software update. On February 16, 2023, Tesla issued a recall notice for all vehicles equipped with the Full Self-Driving beta software, including 2016–23 Model S and X, 2017–23 Model 3, and 2020–23 Model Y, covering 362,758 vehicles in total; NHTSA had identified four specific traffic situations in a letter sent to Tesla on January 25, and Tesla voluntarily chose to pursue a recall to address those situations. In December 2023, Tesla issued a wider recall of all vehicles equipped with any version of Autosteer, including 2012–2023 Model S, 2016–2023 Model X, 2017–2023 Model 3, and 2020–2023 Model Y, covering 2,031,220 vehicles in total. The NHTSA concluded that Autosteer's controls were not sufficient to prevent misuse and did not ensure that drivers maintained "continuous and sustained responsibility for vehicle operation".