Every three months, Tesla publishes a safety report that provides the number of miles between accidents when drivers use Autopilot, the company’s driver assistance system, and the number of miles between accidents when they don’t.
These numbers consistently show fewer accidents with Autopilot, a collection of technologies that can steer, brake and accelerate Tesla vehicles on their own.
But the numbers are misleading. Autopilot is used mainly for highway driving, which is typically twice as safe as driving on city streets, according to the Department of Transportation. Fewer accidents may occur with Autopilot simply because it is generally used in safer situations.
Tesla has not provided data that would allow a comparison of Autopilot’s safety on the same kinds of roads. Nor have other automakers that offer similar systems.
Autopilot has been on public roads since 2015. General Motors introduced Super Cruise in 2017, and Ford Motor brought out BlueCruise last year. However, there is not enough publicly available data to reliably measure the safety of these technologies. American drivers, whether they use these systems or share the road with them, are effectively guinea pigs in an experiment whose results have not yet been revealed.
Automakers and tech companies are adding more vehicle features that they claim improve safety, but these claims are hard to verify. Meanwhile, deaths on the country’s highways and streets have been climbing in recent years, reaching a 16-year high in 2021. Any additional safety afforded by technological advances does not appear to be offsetting poor decisions by drivers behind the wheel.
“There is a lack of data to give the public confidence that these systems are delivering the expected safety benefits as they are deployed,” said J. Christian Gerdes, a professor of mechanical engineering and co-director of Stanford University’s Center for Automotive Research, who was the first chief innovation officer of the Department of Transportation.
GM collaborated with the University of Michigan on a study that explored the potential safety benefits of Super Cruise but concluded that there was not enough data to determine whether the system reduced crashes.
A year ago, the National Highway Traffic Safety Administration, the government’s auto safety regulator, ordered companies to report potentially serious accidents involving advanced driver assistance systems such as Autopilot within one day of learning about them. The order said the agency would make the reports public, but it has yet to do so.
The safety agency declined to comment on what information it has collected so far, but said in a statement that the data would be released “in the near future.”
Tesla and its CEO, Elon Musk, did not respond to requests for comment. GM said it reported two Super Cruise-related incidents to NHTSA: one in 2018 and one in 2020. Ford declined to comment.
The agency’s data is unlikely to give a complete picture of the situation, but it could encourage legislators and drivers to take a closer look at these technologies and ultimately change the way they are marketed and regulated.
“To solve a problem, you must first understand it,” said Bryant Walker Smith, an associate professor in the University of South Carolina’s law and engineering schools who specializes in emerging transportation technologies. “It’s a way to get more ground truth as a basis for investigations, regulations and other actions.”
Despite its capabilities, Autopilot does not remove responsibility from the driver. Tesla tells drivers to stay alert and always be ready to take control of the car. The same goes for BlueCruise and Super Cruise.
But many experts worry that these systems, because they allow drivers to relinquish active control of the car, may lead them to believe that their cars are driving themselves. Then, when the technology malfunctions or cannot handle a situation on its own, drivers may be unprepared to take control as quickly as needed.
Older technologies such as automatic emergency braking and lane departure warning have long provided safety nets for drivers by slowing or stopping the car or warning drivers when they drift out of their lane. But newer driver assistance systems flip that arrangement, making the driver the safety net for the technology.
Safety experts are particularly concerned about Autopilot because of the way it is marketed. For years, Mr. Musk has said the company’s cars are on the verge of true autonomy, able to drive themselves in almost every situation. The system’s name also implies a level of automation that the technology has not yet achieved.
This can breed complacency among drivers. Autopilot has played a role in many fatal crashes, in some cases because drivers were not prepared to take control of the car.
Mr. Musk has long promoted Autopilot as a way to improve safety, and Tesla’s quarterly safety reports seem to back him up. But a recent study from the Virginia Transportation Research Council, an arm of the Virginia Department of Transportation, shows that these reports are not what they seem.
“We know that cars using Autopilot have fewer crashes than when not using Autopilot,” said Noah Goodall, a researcher at the council who studies the safety and operational issues surrounding autonomous vehicles. “But are they driven by the same drivers, on the same roads, at the same time of day, in the same way?”
In analyzing police and insurance data, the Insurance Institute for Highway Safety, a nonprofit research organization funded by the insurance industry, has found that older technologies such as automatic emergency braking and lane departure warning have improved safety. But the organization says studies have yet to show that driver assistance systems offer similar benefits.
Part of the problem is that police and insurance data don’t always indicate whether these systems were in use at the time of an accident.
The federal auto safety agency has ordered companies to provide data on crashes in which driver assistance technologies were in use within 30 seconds of impact. This could provide a broader picture of how these systems are performing.
But safety experts said that even with this data, it will be difficult to determine whether using these systems is safer than turning them off in the same situations.
The Alliance for Automotive Innovation, a trade group for auto companies, has warned that the federal safety agency’s data could be misinterpreted or misrepresented. Some independent experts have expressed similar concerns.
“My biggest concern is that we will have detailed data on crashes involving these technologies, without comparable data on crashes involving conventional cars,” said Matthew Wansley, who specializes in emerging automotive technologies at the Cardozo School of Law in New York and was formerly general counsel at an autonomous vehicle start-up called nuTonomy. “Potentially, these systems could appear to be much less safe than they actually are.”
For these and other reasons, automakers may be reluctant to share some data with the agency. Under the order, companies can ask the agency to withhold certain data by claiming that it would reveal trade secrets.
The agency also collects crash data on automated driving systems, which are more advanced technologies aimed at completely removing drivers from cars. These systems are often referred to as “self-driving cars”.
For the most part, this technology is still being tested in a relatively small number of cars, with drivers behind the wheel as a backup. Waymo, a company owned by Google’s parent, Alphabet, operates a driverless service in the suburbs of Phoenix, and similar services are planned in cities like San Francisco and Miami.
In some states, companies are already required to report accidents involving automated driving systems. The federal safety agency’s data, which will cover the whole country, should provide additional insight in this area as well.
But the more pressing concern is the safety of Autopilot and other driver assistance systems installed in hundreds of thousands of vehicles.
“There is an open question: Does Autopilot increase or decrease the frequency of crashes?” said Mr. Wansley. “We may not get a complete answer, but we will get some useful information.”