Tesla Autopilot Archives - TechGoing

Tesla Autopilot is accused of still having safety risks after update (January 1, 2024)

Recently, a report in the Washington Post has once again put Tesla’s Autopilot system in the spotlight. After personally testing the Autopilot software update that Tesla pushed to 2 million vehicles, reporter Geoffrey Fowler published an article titled “Testing Tesla’s Autopilot update, I still don’t feel safer, and neither should you,” which sparked heated discussion.

The article identifies the heart of the controversy at the outset: whether Tesla’s update actually resolved the safety issues previously raised by the National Highway Traffic Safety Administration (NHTSA). Fowler tested his Tesla Model Y on the streets of San Francisco and found that the updated system still had worrying flaws.

One of the most serious problems is that Autopilot can take over driving on roads it is not designed for. Fowler emphasized that he remained alert throughout, with his hands resting on the steering wheel, yet the system still drove autonomously on city streets for more than a minute at a time, even though the update was supposed to keep Autopilot enabled only when it detects the driver’s hands on the wheel and eyes on the road. Even more disturbing, he could still activate Autopilot after blocking the in-car camera (used to monitor driver attention) with a sticker.

Fowler noted that the update appears aimed more at appeasing regulators than at actually improving safety. He believes Tesla has not truly restricted Autopilot: it can still be activated on roads where the owner’s manual clearly prohibits its use. Before the update, he could take his hands off the steering wheel for 75 seconds on a secondary highway; after the update, that was reduced only to 60 seconds. Tesla’s other changes, such as increasing the font size of warnings, are inconsequential. This approach is not just meaningless; it may mislead drivers into believing the system is safer and more reliable than it is.

In addition, his car did not slow down for a speed bump and ran a red light, even though the red signal was clearly displayed on the onboard touchscreen. After consulting the owner’s manual, Fowler discovered the vehicle will only obey red lights if the owner has purchased the Full Self-Driving (FSD) package. Herein lies the problem: Tesla assumes drivers thoroughly understand what each software package does, which is beyond many owners and does nothing for safe driving.

Fowler also said he was baffled that he could still activate Autopilot after covering the car’s cabin camera with a smiley-face sticker. Why does the system allow Autopilot to be activated when the camera is not working properly or is deliberately blocked? Isn’t that a huge safety risk?

Fowler reported his concerns to NHTSA, and the agency’s communications director, Veronica Morales, said the investigation into Tesla is ongoing and the agency will continue to monitor the performance of updated vehicles. She did not comment directly on Fowler’s testing but highlighted the responsibility the Vehicle Safety Act places on manufacturers to develop safety fixes. Morales also said NHTSA will test multiple Teslas at its Vehicle Research and Test Center in Ohio.

Although NHTSA stresses that consumers should not test vehicle technology on their own, Fowler argues that every Tesla driver using the updated Autopilot has effectively become a tester, and that NHTSA’s after-the-fact review mechanism is too slow to detect and resolve safety issues in time. He compared cars with phones, pointing out that mobile apps must undergo strict review by Apple and Google before release and must meet transparency requirements. Why should cars enjoy looser regulation than phones?

Fowler’s doubts are not unreasonable, and the controversy surrounding Tesla’s Autopilot continues to this day. As CleanTechnica notes, the word “Autopilot” is itself misleading, promising far more than the feature actually delivers. Moreover, safety assurance should not rest on the personal judgment of a billionaire computer genius; ordinary people should be able to clearly understand the capabilities and flaws of Tesla’s cars, to avoid tragedies caused by information asymmetry. Whether other road users have, in effect, been enrolled in this test without any consent process is also worth pondering.

Tesla won the first U.S. Autopilot car accident trial (November 1, 2023)

In a trial accusing Tesla of causing a death, the jury found in its verdict that Tesla’s software was not defective.

In 2019, Micah Lee’s Tesla Model 3 suddenly veered off the road on a Los Angeles freeway. The vehicle struck a palm tree and caught fire.

The crash killed Lee and seriously injured two passengers, one of whom was an 8-year-old boy. The lawsuit claims that Tesla knew about the defects in Autopilot and other active safety features when it sold the Model 3 to Lee.

Tesla denies any fault and says Lee had been drinking before driving the vehicle. Tesla also said it could not confirm whether Autopilot was enabled at the time of the accident.

The jury ultimately found that Tesla was not responsible for the accident because they did not find the Autopilot system defective.

Opening statements in the case took place on September 28, and just one month later the jury returned a verdict finding that Tesla was not responsible for the accident.

This is the second Autopilot lawsuit against Tesla to reach a verdict this year. In April 2023, a jury heard the case of Justine Hsu, whose Tesla crashed into a curb, fracturing her jaw, damaging facial nerves and knocking out several teeth. Hsu claimed that Autopilot abruptly swerved the car into the curb and sought $3 million in damages; the California jury ultimately found Tesla not liable.

Elon Musk furious with engineers after Tesla Autopilot nearly killed him (September 17, 2023)

According to Walter Isaacson’s new biography “Elon Musk”, as early as 2015 Tesla CEO Elon Musk nearly died several times during test drives because of the Autopilot driver-assistance technology, leaving him furious with his engineers.

In the biography’s chapter on driver-assistance technology, the author recounts an incident Musk personally experienced: there was a curve on Interstate 405 that Autopilot could not recognize because the lane markings had faded, causing the car to swerve and “nearly hit” oncoming traffic.

Whenever this happened, Musk would storm into Tesla’s offices and rage at his engineers, repeatedly demanding that they improve the software: “Do something to program this thing properly.”

Musk insisted that Tesla’s cars need only optical sensors, just as humans drive mainly with their eyes. He rejected other technologies, such as LiDAR, as too expensive and unnecessary. According to the biography, his engineers believed LiDAR was the best option for improving safety, but they could not convince Musk to change his mind.

Tesla driver collided with a police car (August 13, 2023)

A Tesla Model X in Autopilot mode hit a police car while traveling at 54 miles per hour (about 86.9 kilometers per hour). Five police officers conducting a routine traffic stop were injured in the crash, as was the driver who had been pulled over.

Reports say the driver of the Model X was intoxicated at the time of the crash. Even so, the five injured police officers filed a lawsuit against Tesla, accusing the company of failing to adequately address problems with its Autopilot driver-assistance system and seeking between $1 million and $20 million in damages for injuries that include permanent disability.

A Wall Street Journal investigation obtained footage from the 2019 Model X showing that the driver received 150 warnings from Autopilot over the 34 minutes before the crash, each advising him to take over.

The 2019 Model X doesn’t have an interior camera, so its Autopilot driver-monitoring system relies primarily on detecting torque on the steering wheel. Given that the driver received 150 warnings, he was apparently applying enough torque to keep Autopilot active.
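
A minimal sketch of how such torque-based hands-on monitoring with escalating warnings might work is below; the threshold, timings, strike-out limit and names are illustrative assumptions, not Tesla’s actual design.

```python
from dataclasses import dataclass

TORQUE_THRESHOLD_NM = 0.3      # assumed minimum wheel torque that counts as "hands on"
WARN_AFTER_S = 15.0            # assumed hands-off time before a warning is issued
DISENGAGE_AFTER_WARNINGS = 3   # assumed limit before control is handed back

@dataclass
class MonitorState:
    hands_off_s: float = 0.0
    warnings: int = 0

def update(state: MonitorState, torque_nm: float, dt_s: float) -> str:
    """Advance the monitor by one control tick and return the action to take."""
    if abs(torque_nm) >= TORQUE_THRESHOLD_NM:
        # Any detectable wheel torque resets the timer -- which is why a driver
        # can keep Autopilot active without actually watching the road.
        state.hands_off_s = 0.0
        return "ok"
    state.hands_off_s += dt_s
    if state.hands_off_s >= WARN_AFTER_S:
        state.hands_off_s = 0.0
        state.warnings += 1
        if state.warnings >= DISENGAGE_AFTER_WARNINGS:
            return "disengage"  # expect the driver to take over
        return "warn"
    return "ok"
```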

According to the reports, after the 150th warning the intoxicated driver of the Model X finally followed Autopilot’s advice and took over. By that moment, however, the Model X was only 2.5 seconds and 33.8 meters from the parked police car. The Wall Street Journal noted that Autopilot attempted to slow the Model X but then disengaged, expecting the driver to take over.

Tesla has always insisted that responsibility lies with the allegedly drunk driver, and to some extent it has a point: in a vehicle without Autopilot, a driver who pays no attention to the road, or falls asleep, would likely crash sooner and injure more people. That said, when Autopilot issues 150 warnings, the driver is clearly not concentrating on the road, and a safer design would arguably have Autopilot pull the car over to the side of the road.

Tesla’s driver monitoring function will be upgraded and the number of blinks will be counted (May 14, 2023)

Tesla is stepping up its monitoring of drivers to prevent distracted and fatigued driving, regardless of whether Autopilot is engaged.

Tesla cars have a driver-facing camera that assesses the driver’s attention and state from signals such as eye movement and phone use. If the driver is found to be acting unsafely, Tesla will disable Autopilot and remind the driver to concentrate on driving.

According to the hacker Greentheonly, Tesla is upgrading the monitoring system with new indicators, such as how often the driver yawns, the number and duration of blinks, and sitting posture, to estimate whether the driver is showing signs of fatigue. These checks apply not only when Autopilot is in use but also when the driver is fully in control of the vehicle, the aim being to ensure driving safety in all situations.
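
As a rough illustration, the signals Greentheonly lists could feed a simple weighted drowsiness score; the weights, normalization constants and warning threshold below are invented for this sketch and are not Tesla’s.

```python
def fatigue_score(yawns_per_min: float, blinks_per_min: float,
                  mean_blink_s: float, slouch_ratio: float) -> float:
    """Combine per-minute cabin-camera statistics into a 0..1 drowsiness estimate."""
    score = 0.0
    score += 0.35 * min(yawns_per_min / 3.0, 1.0)    # frequent yawning
    score += 0.25 * min(blinks_per_min / 30.0, 1.0)  # rapid blinking
    score += 0.25 * min(mean_blink_s / 0.5, 1.0)     # long eyelid closures
    score += 0.15 * min(slouch_ratio, 1.0)           # degraded sitting posture
    return min(score, 1.0)

# Example: a driver yawning twice a minute with long, frequent blinks and a
# slouched posture crosses an (assumed) warning threshold of 0.6.
if fatigue_score(yawns_per_min=2, blinks_per_min=25,
                 mean_blink_s=0.4, slouch_ratio=0.6) > 0.6:
    print("Driver fatigue suspected: please take a break")
```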

These improvements are not yet live, likely because Tesla is still determining which behaviors make a driver unfit to drive. They are among several safety features Tesla has introduced recently, alongside upgrades to the automatic emergency braking system.

Tesla Autopilot team loses a key figure as the project’s product director leaves (April 19, 2023)

Kate Park, product director of Tesla’s Autopilot project, recently announced that she has left Tesla to join the AI data platform Scale AI as product director.

Kate Park tweeted that she had enjoyed building the Autopilot data engine at Tesla and was excited to bring the product to multiple industries, from computer vision to large language models.

Kate Park, a former computer science student at Stanford University, joined Tesla in April 2018 as an Autopilot computer vision intern. Prior to that, she had internships at Uber, Palantir and Google. Having worked at Tesla for almost five years, she has held several roles on the Autopilot team, contributing to the development of the project.

Many Tesla fans may remember that at Tesla’s AI Day late last year, Kate Park demonstrated the Autopilot data engine and how it improves the neural network to enhance Autopilot’s performance. She is also one of the inventors of a patent granted to Tesla in 2020, “Autonomous and user-controlled vehicle summoning target”, alongside Elon Musk and Andrej Karpathy, among others.

Kate Park is moving to Scale AI in San Francisco. Scale AI describes its mission this way: “At Scale AI, our mission is to accelerate the development of AI applications. We believe that to make the best models, you need the best data.” Scale AI works with companies such as Microsoft and Meta, as well as AI firms such as OpenAI and Cohere.

According to her LinkedIn profile, Kate Park now serves as Director of Product Management at Scale AI.

Tesla removed radar sensors and NHTSA has received hundreds of false-braking complaints (March 21, 2023)

Tesla CEO Elon Musk reportedly announced nearly two years ago that Tesla would stop installing radar sensors in its cars. Data shows that accidents and near-misses involving Tesla cars have increased since then.

Interviews with dozens of former Tesla employees, test drivers and other experts indicate that after the 2021 change, Tesla cars driven with Autopilot or FSD braked more often for non-existent obstacles, misidentified street signs, and had difficulty recognizing emergency vehicles.

Some sources say the increasing number of false braking cases is related to Tesla’s decision to remove radar sensors from its vehicles. Data from the National Highway Traffic Safety Administration (NHTSA), which is investigating the issue, shows that the agency has received hundreds of complaints about false braking in the past nine months. Last year, more than 750 Tesla owners complained that their cars suddenly and inexplicably braked while driving.

Meanwhile, NHTSA also stepped up its investigation of Tesla’s Autopilot feature in 2022 after more than a dozen accidents in which Tesla cars crashed into emergency vehicles. NHTSA said the driver-assist feature had difficulty identifying parked vehicles.

Musk announced that Tesla would stop fitting radar sensors to its cars starting in 2021. Some engineers were reportedly “taken aback” by the announcement and contacted a former Tesla executive in hopes of convincing Musk to reverse the decision. Musk has also said in the past that he wants Tesla’s FSD and Autopilot software to mimic the senses of a human driver through cameras rather than radar.

Currently, all Tesla cars come with Autopilot driver assistance. Users can also pay a one-time $15,000 fee, or $199 a month, to enable FSD, which helps the car recognize stop signs and traffic lights, change lanes automatically and park itself. But Tesla says neither Autopilot nor FSD can replace a licensed driver.

Until 2021, Tesla cars used radar sensors in addition to cameras to identify obstacles. Today, Tesla relies on eight cameras and human image taggers: employees label videos captured by the cars’ cameras to train the software to recognize and respond to different obstacles.

Tesla’s competitors also use other sensors, such as LiDAR, to digitally map the environment and avoid errors even when the onboard cameras are obscured by rain, snow or fog. Musk, however, has previously said that LiDAR is too costly and therefore “doomed to fail.”

Since 2016, Musk has been promising that Tesla will soon launch a truly self-driving car, but experts are not optimistic.

Earlier this year, several experts said Tesla’s FSD is still a long way from true self-driving. In February, Tesla released an over-the-air software update to 362,000 vehicles to address a problem with FSD that NHTSA said could cause cars to “behave unsafely at intersections.”

A Tesla spokesperson did not comment. In a voluntarily released vehicle safety report, Tesla said its cars had the lowest overall probability of injury among all vehicles tested by the U.S. government’s New Car Assessment Program.

In January, Tesla also said that in the third quarter of 2022 its accident rate was as low as one accident every 6.26 million miles (about 10.07 million kilometers).

Tesla Releases 2022 Q4 Safety Report: Autopilot Effective in Reducing Crash Rates (March 14, 2023)

Tesla has released its safety report for the fourth quarter of 2022. Miles driven between accidents on Autopilot decreased compared with the previous quarter, but improved compared with the fourth quarter of 2021.

According to Tesla’s records, users of Autopilot technology averaged one crash every 4.85 million miles (approximately 7.853 million kilometers) in the fourth quarter, while for drivers not using Autopilot technology, Tesla recorded one crash every 1.4 million miles.

By comparison, recent data from NHTSA and the FHWA (from 2021) show that a crash occurs approximately every 652,000 miles in the United States.

While these figures suggest that Autopilot is effective in reducing crash rates, a quarter-over-quarter comparison shows Tesla’s number falling from 6.26 million miles between accidents in the third quarter to 4.85 million in the fourth, meaning Tesla owners on Autopilot crashed more frequently during the fourth quarter.
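
The comparisons follow directly from the figures above; a quick computation, using only the numbers quoted in this report, makes the gap explicit.

```python
# Miles between accidents, as quoted above.
miles_per_crash = {
    "Autopilot, Q4 2022": 4.85e6,
    "Autopilot, Q3 2022": 6.26e6,
    "Tesla without Autopilot": 1.4e6,
    "U.S. average (NHTSA/FHWA, 2021)": 652_000,
}

baseline = miles_per_crash["U.S. average (NHTSA/FHWA, 2021)"]
for label, miles in miles_per_crash.items():
    # More miles per crash means a lower crash rate.
    print(f"{label}: {1e6 / miles:.2f} crashes per million miles "
          f"({miles / baseline:.1f}x the U.S. average interval)")

# 6.26e6 / 4.85e6 ~= 1.29, so the Q4 crash rate on Autopilot was roughly
# 29% higher than in Q3, even though both figures beat the U.S. average.
```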

Tesla confirms Autopilot investigation by U.S. Justice Department, denies wrongdoing (February 1, 2023)

Tesla Inc. confirmed in a regulatory filing that the U.S. Department of Justice has asked the company to provide documents related to its advanced driver-assistance system after launching a criminal investigation.

Tesla said in a regulatory filing that the U.S. Department of Justice requested information about its Autopilot system, which helps owners complete tasks such as steering and maintaining a safe distance from other vehicles on the highway.

Foreign media previously reported that the U.S. Department of Justice and the U.S. Securities and Exchange Commission are investigating whether Tesla misled consumers and investors about Autopilot’s performance. Federal prosecutors are reviewing statements made by Tesla and its executives about Autopilot’s safety and functionality. The U.S. Department of Justice’s criminal investigation involves authorities in Washington and San Francisco.

In its regulatory filing, Tesla said, “We are not aware of any ongoing investigation in which any government agency has concluded that (we) engaged in any misconduct.” Tesla added that it cannot predict the outcome of any ongoing matters and that “our business, results of operations, prospects, cash flow and financial condition could be materially and adversely affected if the government decides to take enforcement action.”

Tesla has previously said it is safer to drive with Autopilot than without the system. Tesla says its internal data shows that accidents are less common when owners use the Autopilot feature. However, some researchers have criticized the methods used by Tesla.

Neither Tesla nor the U.S. Department of Justice would comment. Tesla shares rose 3.9 percent yesterday and have gained 41 percent this year; last year the stock plunged 65 percent, the largest annual decline in its history.

Tesla senior engineer reveals: 2016 Autopilot demo video was faked (January 18, 2023)

Testimony from a senior Tesla engineer reveals that the company’s much-touted 2016 demo video of Autopilot, its driver-assistance system, was staged.

In the video, a Tesla Model X drives around cities, suburbs and highways, automatically stopping at red lights and accelerating through green ones. The ad is still displayed on Tesla’s website with the tagline: “The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.”

▲ Tesla’s 2016 Autopilot demo video (Source: Netease Technology Report)

Tesla CEO Elon Musk used the video as evidence that the vehicle could drive itself using its many built-in sensors and driver-assistance software.

However, Ashok Elluswamy, director of Autopilot software at Tesla, said in newly revealed testimony that the Model X in the video did not drive itself using the technology Tesla shipped; it navigated a predetermined route using 3D maps. In other words, Autopilot at the time lacked dynamic route-planning capability, so company engineers had to map out the driving route in advance for the filming of the promotional video.

Elluswamy’s testimony was used as evidence in a lawsuit against Tesla over the 2018 fatal crash of Apple engineer Walter Huang. The testimony, which had not been publicly reported before, is the first time a Tesla employee has confirmed and detailed how the video was made.

In his testimony, Elluswamy said that at Musk’s request the Tesla Autopilot team set out to document “the capabilities of the system” and demonstrate them. To create the video, Tesla used 3D maps of a predetermined route, and a human driver intervened during the demonstration. While attempting to show that the Model X could park itself without a driver, a test car crashed into a fence in a Tesla parking lot.

Elluswamy explained: “The video was not made to accurately depict the features available to consumers in 2016, but rather to depict what might be built into the system.”

When Tesla released the video, Musk tweeted, “Tesla’s vehicles can drive themselves on city streets, highways, completely without human driver intervention, and then find parking spaces automatically.”

The New York Times reported in 2021, citing anonymous sources, that Tesla engineers had produced the 2016 demo video to promote Autopilot without disclosing that the route had been mapped in advance or that a test vehicle had crashed during filming. Asked whether the 2016 video demonstrated the performance of the Autopilot available in production cars at the time, Elluswamy also answered no.

Andrew McDevitt, the attorney representing Walter Huang’s wife, challenged Elluswamy’s testimony in July, saying that “a video without any disclaimer or explanation is clearly misleading.”

The National Transportation Safety Board (NTSB) concluded in 2020 that Walter Huang’s fatal crash may have been caused by a combination of distraction and limitations in the Autopilot feature. The agency said Tesla’s “ineffective monitoring of the driver’s attention” contributed to the crash.

Elluswamy said drivers could “fool the monitoring system” into believing they were paying attention. But he said he does not think Autopilot is a safety issue as long as the driver actually stays focused.

Neither Elluswamy, Musk nor Tesla responded to requests for comment. The company has, however, warned drivers that they must keep their hands on the wheel and maintain control of the vehicle when using Autopilot. On its website, Tesla emphasizes that the technology is designed to help vehicles steer, brake, accelerate and change lanes automatically, but that it “does not make the vehicle self-driving.”

The testimony comes as Tesla faces multiple lawsuits and regulatory scrutiny over allegations that it misrepresented its driver-assistance system. In 2021, following a number of crashes involving Autopilot, the U.S. Department of Justice reportedly opened a criminal investigation into Tesla’s claims that its vehicles could drive themselves.
