Tesla Autopilot Issue Archives - TechGoing
https://www.techgoing.com/tag/tesla-autopilot-issue/

Tesla is under new investigation related to Autopilot safety and false advertising issues
https://www.techgoing.com/tesla-is-under-new-investigation-related-to-autopilot-safety-and-false-advertising-issues/ (Thu, 27 Jul 2023)

According to CNBC, Tesla is facing a new investigation: the California Attorney General’s Office is examining the company’s Autopilot safety issues and allegedly “misleading” marketing practices.

Since 2016, Elon Musk has claimed that Tesla vehicles would become fully self-driving, saying at the time that every new Tesla electric vehicle would ship with the necessary software and hardware. To date, that promise has not been fulfilled, drawing strong dissatisfaction from owners in Europe and the United States.


▲ Image source: CNBC

In California, Tesla is also facing scrutiny over an Autopilot flaw known as “phantom braking,” in which vehicles suddenly brake for no apparent reason while Autopilot is engaged.

The California Attorney General’s Office recently contacted former Tesla owner Greg Wester about the complaint he filed with the Federal Trade Commission in August 2022 over “phantom braking” and his claim that the company’s marketing of the Full Self-Driving (FSD) system was misleading.

“Tesla should provide customers with the option to get a full refund on the Autopilot feature if they are not satisfied with the product,” Wester said in an earlier interview, adding that FSD buyers “spent money, but did not get the experience Elon Musk promised.”

Tesla stated in its second-quarter financial filings that it had received “information disclosure requests from regulators.” While the company has confirmed it would hand over “documentation related to Tesla’s Autopilot and FSD functions requested by the Department of Justice,” it had not previously disclosed that the California Attorney General’s Office was investigating it.

Tesla sued again for allegedly misrepresenting Autopilot features
https://www.techgoing.com/tesla-sued-again-for-allegedly-misrepresenting-autopilot-features/ (Thu, 15 Sep 2022)

Elon Musk’s electric car giant Tesla became a defendant in a proposed class action lawsuit on Wednesday, local U.S. time. The lawsuit alleges that the company misrepresented its Autopilot and Full Self-Driving features and, as a result, misled the public.

According to the complaint, Tesla and Elon Musk have since 2016 deceptively promoted their self-driving technology as fully functional or “just around the corner,” despite knowing it did not work as claimed or did not exist, making their vehicles unsafe.

The plaintiff, Briggs Matsko, said Tesla did this to get people “excited” about its cars, thereby attracting investment, boosting sales, avoiding bankruptcy, and driving up its stock price to become the “dominant player” in the electric vehicle space.

Matsko said, “Tesla has not yet produced anything close to a fully self-driving car.”

The lawsuit, filed in federal court in San Francisco, seeks compensation for users who have bought or leased Tesla vehicles with Autopilot, Enhanced Autopilot, or Full Self-Driving since 2016, but does not specify the amount of damages sought.

Tesla, which disbanded its media relations department in 2020, has not responded to requests for comment.

The lawsuit follows a July 28 complaint from the California Department of Motor Vehicles alleging that Tesla exaggerated the capabilities of its advanced driver-assistance systems (ADAS). That complaint proposes remedies that could include revoking Tesla’s licenses to make and sell cars in California, and seeks compensation for drivers.

Tesla has previously said that Autopilot lets the vehicle steer, accelerate, and brake within its lane, while Full Self-Driving additionally lets the vehicle obey traffic signals and change lanes.

The company has also said that both technologies “require active driver supervision,” with drivers keeping their hands on the wheel and remaining “fully attentive,” and that they do not make the vehicle autonomous.

Matsko, of Rancho Murieta, California, says he paid a $5,000 premium for Enhanced Autopilot when he bought his 2018 Tesla Model X.

He also said that Tesla drivers who received the software updates have effectively acted as “untrained test engineers,” discovering “countless problems” while driving, including vehicles veering into oncoming lanes, running red lights, and failing to make routine turns.

Since 2016, the National Highway Traffic Safety Administration (NHTSA) has opened 38 special investigations into Tesla crashes believed to have involved advanced driver-assistance systems. In total, these crashes have reportedly resulted in 19 fatalities.

German regulator finds Tesla Autopilot function abnormal: ordered to improve
https://www.techgoing.com/german-regulator-finds-tesla-autopilot-function-abnormal-ordered-to-improve/ (Sat, 10 Sep 2022)

After more than six months of investigating Tesla’s driver-assistance system Autopilot, Germany’s road traffic safety regulator KBA has found “anomalies” in the software. The agency has ordered Tesla to make improvements and has restricted some assisted-driving features. It is not clear what the “anomalies” are, what changes Tesla has made through over-the-air (OTA) updates, or how many vehicles are affected. So far, the KBA has not responded to these questions.

▲ File photo (from Tesla’s official website)

One of the first issues the KBA investigated appears to have been the automatic lane-change feature, which does not comply with European law. The agency required Tesla to limit the feature and to make drivers always use the turn signal.

At the same time, the KBA is investigating Tesla’s practice of granting owners access to its “Full Self-Driving” (FSD) beta software based on safety scores. The agency’s position is that “the vehicle must be safe enough for all drivers to drive.” FSD is currently not available in Europe.

In an interview, a KBA representative said the agency wants to know what changes Tesla makes before deploying these updates: “If we have not received any information, we cannot exclude the possibility that the system does not comply with the rules.”

A KBA spokesperson confirmed that the “anomaly” was partially fixed by an update limiting the automatic lane-change feature, but said that “further remedies are still being tested and validated.” The agency gave no specific details and no timeline for a full fix.

If the “anomalies” are a safety risk, why has the KBA not warned customers? If they are not dangerous, why is the German safety regulator asking Tesla to improve Autopilot, and what must be changed? What else does Tesla need to adjust? Do the improvements affect only Teslas sold in Germany, or those in Europe or worldwide? None of these questions has been answered yet.

The fact is that Tesla’s driver-assistance system is not identical from region to region. That is partly because Tesla’s improvement program relies on data from its on-road fleet, much of which comes from North America, and specifically California.

This has led to the system performing better in some places than in others. But regulatory requirements have also prevented Tesla from deploying its Assisted Driving feature in certain markets, including Europe.

Tesla exec: Autopilot could prevent about 40 accidents a day caused by sudden acceleration
https://www.techgoing.com/tesla-exec-autopilot-could-prevent-about-40-accidents-a-day-caused-by-sudden-acceleration/ (Mon, 22 Aug 2022)

Tesla Autopilot software director Ashok Elluswamy has reportedly said that Autopilot prevents about 40 accidents caused by sudden unintended acceleration (SUA) every day.

In those incidents, he said, human drivers mistakenly press the accelerator instead of the brake. Autopilot recognizes that this is happening and that a collision is imminent, automatically cuts the acceleration, and applies the brakes to prevent a crash.

Autopilot is Tesla’s own technology, but it fundamentally works the same way as rival systems such as GM’s Super Cruise or Ford’s BlueCruise.

The Autopilot system has been considered the culprit in past crashes involving Tesla vehicles. In June, foreign media reported that the National Highway Traffic Safety Administration (NHTSA) had stepped up its investigation into whether Autopilot was defective and revealed that they had reviewed 191 accidents involving vehicles using Autopilot.

In reality, the system is only partially automated, such as keeping the car in its lane and at a safe distance from the vehicle in front of it. The system is only designed to assist the driver, who must be ready to intervene.

U.S. Congressman Eyes Tesla, Asks Regulators About Its Autopilot Accidents
https://www.techgoing.com/u-s-congressman-eyes-tesla-asks-regulators-about-its-autopilot-accidents/ (Fri, 12 Aug 2022)

BEIJING, Aug. 12 – Two U.S. lawmakers who oversee auto safety have reportedly asked federal regulators to brief them on the investigation into crashes involving Tesla’s Autopilot and other advanced driver-assistance systems.

In a letter to the National Highway Traffic Safety Administration (NHTSA), Democratic U.S. Senator Gary Peters and Representative Jan Schakowsky expressed concern that “federal investigations and recent reports have uncovered troubling safety issues at Tesla.”

The lawmakers asked, “In light of the increasing number of deaths from Tesla cars crashing into tractor trailers … is NHTSA considering opening a defect investigation into this issue?”

The letter goes on to ask, “Has NHTSA struck a balance between a thorough investigation and responding to urgent, emerging motor vehicle safety risks?” and whether the agency has sufficient resources and legal authority to properly investigate advanced driver-assistance systems.

NHTSA was not immediately available for comment. In July, NHTSA Administrator Steve Cliff told the press that he hopes to complete the investigation into Tesla’s advanced driver-assistance system Autopilot as soon as possible, “but I also want to get things right. There’s a lot of information for us to sort through.”

Tesla, which has disbanded its press office, had no immediate comment. Tesla’s website says Autopilot allows the vehicle to steer, accelerate and brake automatically, “but requires active driver oversight and does not allow the vehicle to drive itself.”

Since 2016, the NHTSA has launched 38 special investigations into accidents involving Tesla vehicles that allegedly used advanced driver assistance systems such as Autopilot. A total of 19 people have died in crashes in those Tesla-related investigations.

Last month, the NHTSA said it opened a special investigation into a crash involving a 2020 Tesla Model 3 vehicle in Utah that killed a motorcyclist.

In June, NHTSA upgraded its defect probe covering 830,000 Tesla vehicles equipped with Autopilot that had crashed into parked emergency vehicles to an engineering analysis, a necessary step before the agency can demand a recall. The investigation was first opened in August 2021.

On June 15, NHTSA said Tesla has reported 273 crashes involving advanced driver assistance systems since July 2021, more than any other automaker.

The lawmakers also asked whether NHTSA has determined whether Tesla implemented safety measures to prevent advanced driver-assistance systems “from being activated when the vehicle is not properly equipped to operate.”

The National Transportation Safety Board and other agencies have questioned whether Tesla does enough to ensure that drivers stay attentive while using Autopilot.

In their letter, the lawmakers wrote: “Did NHTSA’s investigation find that allowing advanced driver assistance systems to operate in unsuitable conditions constituted a design defect?”

German court rules Tesla Autopilot has problems, refunds 760,000 yuan
https://www.techgoing.com/german-court-rules-tesla-autopilot-has-problems-refunds-760000-yuan/ (Sat, 16 Jul 2022)

July 16 (Xinhua) — On Friday, local time, a court in Munich, Germany, ordered U.S. electric car maker Tesla to refund a consumer who had bought a Model X electric SUV because of problems with its Autopilot driver-assistance system. The consumer reportedly paid 112,000 euros (about 760,000 yuan) for the vehicle. A technical report found that the Model X could not reliably recognize obstacles such as narrow construction zones and sometimes applied the brakes unnecessarily.

The court ruled that this behavior could pose a “great danger” in city centers and lead to crashes.

Tesla’s lawyers reportedly argued that the Autopilot system was not designed for urban traffic. The court countered that it was impractical for drivers to manually turn the feature on and off in different driving situations, because doing so would distract them.

U.S. safety regulators are also investigating Tesla’s Autopilot feature. Tesla vehicles with Autopilot active have reportedly crashed into stationary emergency and road maintenance vehicles in a total of 16 crashes, resulting in 15 injuries and one fatality.

Tesla has said the Autopilot system enables the vehicle to automatically brake and keep its distance within the lane, but does not allow the vehicle to drive itself.

Musk said in March this year that Tesla may launch its “Full Self-Driving” beta software in the European market later this year, depending on local regulatory approval.

At the time, he told employees at the new Berlin Gigafactory, “It’s very difficult to achieve fully autonomous driving in Europe.” Musk said a lot of work remains to handle Europe’s tricky road conditions, which vary greatly from country to country.
