A driver told authorities that his Tesla’s “Full Self-Driving” software unexpectedly braked, causing an eight-car pileup in the San Francisco Bay Area last month. Nine people were treated for minor injuries, including one juvenile who was hospitalized, according to a California Highway Patrol crash report. In the report, released Wednesday, the driver of the 2021 Tesla Model S involved in the crash told police the car was in Full Self-Driving (FSD) mode and that the software malfunctioned.
CEO Elon Musk has touted Tesla’s self-driving software as a potential cash cow for the world’s largest electric car maker. But Tesla’s advanced driver assistance systems — and Musk’s claims about them — are under increasing legal, regulatory and public scrutiny.
Last month, Tesla told U.S. auto safety regulators it had received reports of two new fatalities linked to advanced driver assistance systems in its Model 3 cars, according to data the government published Tuesday.

The National Highway Traffic Safety Administration (NHTSA) in June began publishing data provided by automakers on crash reports related to driver assistance systems such as Tesla’s Autopilot. The agency issued a standing general order in June 2021 requiring automakers and technology companies to promptly report all crashes involving advanced driver assistance systems (ADAS) or vehicles equipped with automated driving systems being tested on public roads.
In a statement, the agency said: “NHTSA’s oversight is not limited to the specified crashes discussed in the order or the information submitted as part of its reporting obligations. NHTSA’s review and analysis will include all information and incidents related to any potential safety deficiencies. In addition, NHTSA may take other actions in the event of an individual crash, including dispatching a special crash investigation team and requiring the company to provide additional information. NHTSA may also open defect investigations if appropriate.”
In the case of the Thanksgiving Day crash on Interstate 80 near Treasure Island, the California Highway Patrol reviewed video showing the Tesla changing lanes and braking abruptly. Two lanes of the highway were closed for about 90 minutes as many people were traveling to holiday events. Four ambulances were called to the scene.
The pileup came just hours after Tesla CEO Elon Musk announced that Tesla’s “Full Self-Driving” driver assistance software was available to anyone in North America who requests it. Tesla had previously limited access to drivers with high safety scores in its rating system.
“Tesla’s Full Self-Driving Beta is now available to anyone in North America who requests it,” Musk tweeted, “provided you have purchased this option.” Musk’s statement came on the heels of accusations by the California Department of Motor Vehicles, which accused Tesla of making “false or misleading” claims about the capabilities of its self-driving cars.
The beta rollout started in 2020 with a small number of customers and gradually expanded to reach around 160,000 drivers by October of this year. To gain access to the beta, drivers typically had to meet a minimum safety threshold with Tesla’s built-in Safety Score feature, as well as log 100 miles using the company’s Autopilot driver assistance feature.
The automaker sells the $15,000 add-on software, which enables its vehicles to change lanes and park autonomously. It complements the standard Autopilot feature, which allows cars to steer, accelerate and brake within their lane without driver input.
Full Self-Driving is designed to follow traffic, stay in its lane and respect road signs. It requires an alert human driver ready to take full control of the car at all times. The system has pleased some drivers but alarmed others with its limitations. When installing Full Self-Driving, Tesla warns drivers that the software “could make the wrong decision at the worst time.”
The report says the Tesla Model S was traveling at about 55 mph and moved into the far left lane, but then braked hard, slowing the car to about 20 mph. This set off a chain reaction that eventually caused eight vehicles to crash, all traveling at normal highway speeds.
National Transportation Safety Board Chair Jennifer Homendy has criticized Tesla’s marketing of the feature, saying it is presented as fully autonomous when it is not capable of that.
And you?
What is your opinion on the subject?
Also see:
Tesla reports two new fatalities involving driver assistance systems, out of 18 fatalities reported since July 2021, nearly all involving Tesla vehicles
Beta version of Tesla’s “Full Self-Driving” mode is now available across North America, with no mention of the previous obligation to meet minimum safety requirements