Ralph Nader, a leading U.S. consumer advocate, called Tesla’s deployment of Full Self-Driving (FSD) on its electric cars “one of the most dangerous and irresponsible actions by a car company in decades” and called on regulators to recall Tesla’s advanced driver assistance feature.
The National Highway Traffic Safety Administration (NHTSA) must use its recall authority to require Tesla not to deploy the FSD system in its electric vehicles, Nader said.
In a statement, Nader said, “I call on federal regulators to take immediate action to prevent the growing number of injuries and deaths caused by negligent accidents resulting from this Tesla technology.”
There are growing calls for regulators to decide whether to recall Tesla’s FSD system. Critics say the system is not fit for use by drivers, and NHTSA is currently investigating 16 collisions in which Teslas operating on Autopilot, the company’s advanced driver assistance system, struck parked emergency vehicles, resulting in 15 injuries and one death. Most of these crashes occurred after dark, with the system failing to respond to warning lights, traffic cones, arrow boards and other scene-control measures. The investigation was recently upgraded to its second phase, an “engineering analysis,” which is the final step before a recall can be ordered.
In the statement, Nader noted that Tesla recently reported that more than 100,000 owners are currently testing the FSD system on public roads.
The Autopilot advanced driver assistance system currently comes standard on Tesla electric vehicles. For an additional $12,000, owners can purchase the FSD system, and Tesla CEO Elon Musk has repeatedly promised that one day Tesla electric cars will be fully self-driving. So far, however, FSD remains a Level 2 (SAE L2) driver assistance system, which means the driver must stay fully attentive and in control of the vehicle at all times while it is in motion.
In addition to investigating collisions with emergency vehicles, NHTSA maintains a Special Crash Investigations (SCI) list covering accidents involving Tesla Autopilot and other driver assistance systems.
As of July 26, NHTSA’s SCI list included 48 crashes, 39 of them involving Tesla electric vehicles with Autopilot activated, in which 19 people were killed, including drivers, passengers, pedestrians, other motorists and motorcyclists.
Last week, the California Department of Motor Vehicles (DMV) accused Tesla of misrepresenting the capabilities of its Autopilot and FSD systems, saying the company made “untrue or misleading” claims about the self-driving capabilities of its electric vehicles. The DMV’s allegations could lead to the suspension of Tesla’s license to manufacture and sell electric vehicles in California.
Tesla has faced similar complaints in the past. In 2016, the German government asked Tesla to stop using the term Autopilot, citing concerns that it could mislead the public into believing Tesla cars are capable of fully autonomous driving. Last year, U.S. Senators Ed Markey and Richard Blumenthal asked the Federal Trade Commission to investigate the way Tesla advertises its Autopilot and FSD systems, claiming that Tesla “exaggerates the performance of its vehicles,” which could “pose a threat to drivers and other participants on the road.”
Nader called on NHTSA to take action before more injuries and fatalities occur.
“This malfunctioning software should not be allowed, and Tesla itself has warned that it could ‘do the wrong thing at the worst possible time’ while driving on streets where children walk to school,” he said. “Together, we need to send an urgent message to regulators concerned about fatalities and injuries that people must not be the test subjects of a large, high-profile company and its well-known CEO. No one is above the law of manslaughter.”