A recent report in the Washington Post has once again put Tesla’s Autopilot system in the spotlight. After personally experiencing Tesla’s Autopilot software update, which was pushed to 2 million vehicles, reporter Geoffrey Fowler published an article titled “Testing Tesla’s Autopilot update, I still don’t feel safer. You should do the same,” which sparked heated discussion.
The article raises the central controversy at the outset: did Tesla’s update really resolve the safety issues previously raised by the National Highway Traffic Safety Administration (NHTSA)? Fowler tested his Tesla Model Y on the streets of San Francisco and found that the updated system still had worrying flaws.
One of the most serious problems is that the Autopilot system takes over driving on stretches of road it was never designed for. Fowler emphasized that he remained alert throughout, with his hands hovering over the steering wheel, yet the system still drove autonomously on city streets, sometimes for more than a minute at a time, even though the update was supposed to verify that the driver’s hands were on the wheel and eyes were on the road. Even more disturbing, the Autopilot feature could still be activated after he blocked the in-car camera (used to monitor driver attention) with a sticker.
Fowler noted that the update appears aimed more at appeasing regulators than at actually improving safety. In his view, Tesla has not genuinely restricted the Autopilot function: he could still activate it on roads where the owner’s manual explicitly prohibits its use. Before the update, he could keep his hands off the steering wheel for 75 seconds on a secondary highway; after the update, that was reduced only to 60 seconds. Tesla’s other changes, such as increasing font sizes, were inconsequential. Such perfunctory fixes are not merely meaningless; they may mislead drivers into believing the system is safer and more reliable than it actually is.
In addition, his car did not slow down for speed bumps and ran a red light, even though the red signal was clearly displayed on the on-board touchscreen. After consulting the owner’s manual, Fowler discovered that the vehicle would only obey red lights if he purchased the FSD package. Herein lies the problem: Tesla assumes drivers thoroughly understand what each software package does, an assumption that exceeds the grasp of many owners and does nothing to promote safe driving.
Fowler also found it baffling that he could still activate Autopilot after covering his car’s camera with a smiley-face sticker. If the camera is not working properly, or has been deliberately blocked, why does the system allow Autopilot to be activated at all? Isn’t that a huge safety risk?
Fowler reported his concerns to NHTSA, and the agency’s communications director, Veronica Morales, said the investigation into Tesla is ongoing and that the agency will continue to monitor the performance of updated vehicles. She did not comment directly on Fowler’s testing but highlighted the responsibility the Vehicle Safety Act places on manufacturers to develop safety fixes. Morales also said NHTSA will test multiple Teslas at its Vehicle Research and Testing Center in Ohio.
Although NHTSA stresses that consumers should not test vehicle technology on their own, Fowler argues that every Tesla driver using the updated Autopilot has unwittingly become a tester, and that NHTSA’s after-the-fact review mechanism is too slow to detect and resolve safety issues in a timely manner. He drew a comparison with phones: mobile apps must pass strict review by Apple and Google before release and must meet transparency requirements. Why should cars enjoy looser regulation than phones?
Fowler’s doubts are not unreasonable, and the controversy surrounding Tesla’s Autopilot continues to this day. As CleanTechnica notes, the name “Autopilot” is itself misleading, promising capabilities far beyond what the system actually delivers. Moreover, safety assurance should not rest on the personal judgment of a billionaire computer genius; ordinary people should be able to clearly understand the functions and flaws of Tesla’s cars, so that tragedies caused by information asymmetry can be avoided. Whether other road users have knowingly consented to participate in this test is also a question worth pondering.