DETROIT (AP) — A US investigation into Teslas operating on partially automated driving systems that have crashed into parked emergency vehicles has moved one step closer to a recall.
The National Highway Traffic Safety Administration said Thursday it has upgraded the probe to an engineering analysis, another sign of increased scrutiny of the electric vehicle manufacturer and of automated systems that perform at least some driving tasks.
An engineering analysis is the final stage of an investigation, and in most cases NHTSA decides within a year whether to seek a recall or close the investigation.
Documents released by the agency on Thursday raise some serious issues about Tesla’s Autopilot system. The agency found that it was used in areas where its capabilities were limited, and that many drivers did not take action to prevent accidents despite warnings from the vehicle.
The agency reported 16 crashes in which Teslas hit emergency vehicles or trucks with warning signs, injuring 15 people and killing one.
The probe now covers 830,000 vehicles, nearly everything the Austin, Texas, automaker has sold in the US since the start of the 2014 model year.
Investigators will assess additional data and vehicle performance, and will "explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks, undermining the effectiveness of the driver's supervision," the agency said.
A message was left Thursday seeking comment from Tesla.
In most of the 16 crashes, the Teslas issued forward collision warnings to drivers just before impact. Automatic emergency braking intervened in about half of the cases to at least slow the cars. According to NHTSA documents, Autopilot gave up control of the Teslas less than a second before the crash.
In documents detailing its engineering analysis, NHTSA wrote that it also investigated crashes with similar patterns that did not involve emergency vehicles or trucks with warning signs.
The agency found that in most cases, drivers had their hands on the wheel but failed to take action to avoid a crash. "This suggests that drivers may be complying with the driver engagement strategy as designed," the agency said.
Investigators also wrote that a driver's use or misuse of the driver monitoring system "or operation of a vehicle in an unintended manner does not necessarily preclude a system defect."
The agency must determine whether there is a safety defect before it can seek a recall.
The agency looked at 191 crashes in total, but removed 85 because other drivers were involved or because there wasn't enough information to make a definitive assessment. In many of the remaining 106, the main cause of the crash appears to be operating Autopilot in areas where it has limitations, or in conditions that can interfere with its operation. "For example, operating on roads other than limited-access highways, or operating in low-traction or low-visibility environments such as rain, snow or ice," the agency wrote.
NHTSA began its investigation in August of last year after a series of crashes dating to 2018 in which Teslas using the company's Autopilot or Traffic Aware Cruise Control systems hit vehicles at scenes where first responders had used flashing lights, flares, an illuminated arrow board or cones warning of hazards.