Deposition reveals programming flaws in Tesla Autopilot

By prosperplanetpulse.com, April 7, 2024


Tesla’s Autopilot driver-assistance system uses “sophisticated cameras, sensors, and computing power” to automatically steer, accelerate, and brake, Tesla’s marketing materials say, and has been billed as technology that can even change lanes so drivers don’t get stuck behind slow cars or trucks.

But under oath last year, Tesla engineer Akshay Fatak described the software as fairly basic in at least one respect: the way it steers on its own.

“If there is a clearly marked lane, the system will follow the lane,” Fatak said in a July 2023 deposition. Tesla’s revolutionary system, he said, was simply “designed” to follow painted lane lines.

Fatak’s testimony, obtained by The Washington Post, was part of a deposition taken in a wrongful death lawsuit scheduled to go to trial Tuesday. The case stems from a fatal March 2018 crash in which a Tesla on Autopilot veered into a highway barrier near Mountain View, California, after being confused by what the company’s lawyers described in court documents as “faded and nearly obliterated” lane lines.

The driver, Walter Huang, 38, was killed. A subsequent investigation by the National Transportation Safety Board cited Tesla’s failure to limit Autopilot’s use in such conditions as a contributing factor. The company has told the NTSB that Autopilot is designed for areas with “clear lane markings.”

Fatak’s testimony marks the first time Tesla has publicly explained these design decisions, peeling back the curtain on a system shrouded in secrecy by the company and its controversial CEO, Elon Musk. Musk, Fatak and Tesla did not respond to requests for comment.

Lane-keeping technology is not unique to Tesla; many modern cars use systems that alert drivers when they drift. But by marketing the technology as “Autopilot,” Tesla may be misleading drivers about the car’s capabilities, a central claim in a number of lawsuits headed to trial this year and a major concern of federal safety officials.

For years, Tesla and federal regulators have been aware of problems with Autopilot’s lane-following, including cars being steered in the wrong direction, in some cases with fatal consequences. Unlike vehicles designed to be fully autonomous, such as Waymo and Cruise cars, Teslas do not currently use sensors such as radar or lidar to detect obstacles. Instead, Tesla relies on cameras.

After the crash that killed Huang, Tesla told authorities that it had updated its software to better recognize “poor and faded” lane markings and to audibly warn drivers when a vehicle might be losing track of a fading lane. The updates stopped short, however, of automatically disabling the feature in such situations. Nearly two years after Huang’s death, federal investigators said they could not determine whether those updates were sufficient to “accurately and consistently detect abnormal or worn lane markings” and thus prevent a crash like Huang’s.

Huang, an Apple engineer, had bought a Tesla Model X. On the day of the crash, the lane lines faded and his car began to drift. It then picked up a clearer line to its left, placing the car between lanes and on a direct path toward the safety barrier separating the highway from the State Route 85 exit.

Huang’s car hit the barrier at 111 mph, shattering its front end and twisting it into an unrecognizable heap. Huang was pronounced dead several hours later, according to court documents.

In the months before the crash, Huang’s car swerved in a similar location 11 times, according to internal Tesla data discussed by Huang’s lawyers during a court hearing last month. According to the data, the car corrected itself seven times; the other four times required Huang’s intervention. Huang is believed to have been playing a game on his phone when the crash occurred.

The NTSB concluded that driver distraction and Autopilot’s “system limitations” likely contributed to Huang’s death. In a report released nearly two years after the crash, investigators said Tesla’s “ineffective oversight” of driver involvement also “reinforced driver complacency and carelessness.”

Investigators also said the California Highway Patrol’s failure to report damage to a crash barrier that had been destroyed in an earlier crash contributed to the severity of Huang’s injuries.

Huang’s family sued Tesla, alleging wrongful death, and sued the state of California over the damaged crash barrier. The Post obtained copies of several depositions in the case, including previously unreported testimony. Reuters also recently reported on depositions in the case.

The documents highlight one of the biggest complaints federal regulators and safety officials have about Tesla: why Autopilot can be engaged at all on roads that Tesla’s manual says it is not designed for. Those include roads with cross traffic, urban streets with frequent traffic lights and stop signs, and roads without clear lane markings.

In his sworn testimony, Fatak said Autopilot works wherever the car’s camera detects lines on the road: “As long as the lanes are drawn, the system will follow them.”

Asked about another crash involving the software, in which driver Jeremy Banner was killed in 2019 when his Tesla struck a semi-truck and slid under the trailer, Fatak disputed the NTSB’s assertion that Autopilot should not have been able to operate on the road where the crash occurred. “If I remember correctly, there were lane lines painted on that road,” Fatak said. Banner’s family has filed a wrongful death lawsuit; the case has not yet gone to trial.

Musk has said cars operating on Autopilot are safer than those controlled by humans, a message that multiple plaintiffs, and some experts, say creates a false sense of complacency among Tesla drivers. The company argues that it is not responsible for crashes because its user manuals and dashboard screens make clear to Tesla drivers that they are solely responsible for maintaining control of the vehicle at all times. So far, that argument has prevailed in court, with a California jury recently finding Tesla not liable for a fatal crash that occurred while Autopilot was allegedly engaged.

Autopilot is included in nearly every Tesla. It can steer on public roads, follow a set course on the highway, and maintain a set speed and following distance without human input. Depending on the driving mode selected, it can also change lanes to pass slower cars and maneuver more aggressively in traffic. It does not stop at stop signs or traffic lights. For an additional $12,000, drivers can buy a package called Full Self-Driving, which can respond to traffic signals and follow turn-by-turn directions on the road.

Since 2017, NTSB officials have urged Tesla to limit the use of Autopilot to highways without cross traffic, the areas described in the company’s user manual as Autopilot’s intended operating environment. Asked by a lawyer for Huang’s family whether Tesla had “decided not to do anything” about that recommendation, Fatak argued that the company was already following the NTSB’s guidance by limiting Autopilot’s use to roads with lane markings.

“In my opinion, we are already doing that,” Fatak said. “We have already restricted the use of Autopilot.”

A Washington Post investigation last year detailed at least eight fatal or serious Tesla accidents that occurred with Autopilot engaged on busy roads.

Last month, the Government Accountability Office asked the National Highway Traffic Safety Administration, the top auto safety regulator, to require additional information about driver-assistance systems “to clarify the scope of intended use and the driver’s responsibility to monitor the system and the driving environment while such systems are in operation.”

Fatak’s testimony also shed light on other design choices, such as Tesla’s decision to monitor driver attention through sensors that measure pressure on the steering wheel. Asked repeatedly by lawyers for the Huang family what tests or studies Tesla had conducted to confirm the effectiveness of this method, Fatak answered that it had simply been tested on employees.

Other Tesla design decisions differ from those of competitors pursuing self-driving cars. For one, Tesla sells its system to consumers, while other companies tend to deploy their vehicles as taxis. Tesla also uses a unique camera-based system and places fewer restrictions on where the software can be used. A spokesperson for Waymo, Alphabet’s self-driving car company, said, for example, that its vehicles drive only in areas that have been rigorously mapped and where the cars have been tested in conditions such as fog and rain, a practice known as “geofencing.”

“We designed the system recognizing that lanes and their markings can change, become temporarily blocked, move, or even disappear completely,” Waymo spokeswoman Katherine Barna said.

California regulators are also limiting where these self-driving cars can drive and how fast they can drive.

Asked whether Autopilot uses GPS or other mapping systems to check whether a road is suitable for the technology, Fatak said it does not. “It’s not map-based,” he said, a contrast with statements Musk made in a 2016 call with reporters, in which he said Tesla could rely on GPS as a backup when road markings disappear. In an audio recording of the call cited by the Huang family’s lawyers, Musk said the cars could rely on satellite navigation for “a few seconds” while searching for lane markings.

Tesla’s heavy reliance on lane markings reflects a broader lack of redundancy in its system compared with competitors. The Post has previously reported that Tesla’s decision to omit radar from new models, at Musk’s direction, led to an increase in crashes.

Rachel Lerman contributed to this report.


