Over the course of 10 months, nearly 400 car crashes in the United States involved advanced driver assistance technologies, the federal government’s top auto safety regulator revealed on Wednesday in its first-ever release of large-scale data on these burgeoning systems.
In 392 incidents recorded by the National Highway Traffic Safety Administration from July 1 last year to May 15, six people died and five were seriously injured. Teslas running on Autopilot, the more ambitious Full Self-Driving mode, or one of their associated features were involved in 273 of those crashes.
The disclosures are part of a broad effort by the federal agency to determine the safety of advanced driving systems as they become increasingly common. Beyond the futuristic allure of self-driving cars, dozens of automakers have rolled out automated components in recent years, including features that let drivers take their hands off the wheel in certain conditions and that help them parallel park.
In Wednesday’s statement, NHTSA said Honda vehicles were involved in 90 incidents and Subarus in 10. Ford Motor, General Motors, BMW, Volkswagen, Toyota, Hyundai and Porsche each reported five or fewer.
“These technologies hold great promise for improving safety, but we need to understand how these vehicles perform in real-world situations,” said Steven Cliff, the agency’s administrator. “This will help our investigators quickly identify potential emerging defect patterns.”
Speaking to reporters ahead of Wednesday’s publication, Dr. Cliff also cautioned against drawing conclusions from the data collected so far, noting that it does not account for factors such as the number of cars from each manufacturer that are on the road and equipped with these types of technologies.
“Data can raise more questions than it answers,” he said.
About 830,000 Tesla cars in the United States are equipped with Autopilot or the company’s other driver assistance technologies, which helps explain why Tesla vehicles account for nearly 70 percent of the reported crashes.
Ford, GM, BMW and others have similar advanced systems that allow hands-free driving in certain highway conditions, but far fewer of these models have been sold. These companies, however, have sold millions of cars over the past two decades that are equipped with individual components of driver assistance systems. Components include something called lane keeping, which helps drivers stay in their lanes, and adaptive cruise control, which maintains a car’s speed and automatically brakes when traffic slows.
Dr. Cliff said NHTSA would continue to collect data on crashes involving these types of features and technologies, noting that the agency would use them as guidance in establishing rules or requirements for how they should be designed and used.
The data was collected as part of an order issued by NHTSA a year ago that required automakers to report crashes involving cars equipped with advanced driver assistance systems, also known as ADAS or level 2 automated driving systems.
The order was prompted in part by crashes and deaths over the past six years that involved Teslas operating on Autopilot. Last week, NHTSA expanded an investigation to determine whether Autopilot has technological and design flaws that pose safety risks. The agency has investigated 35 crashes that occurred while Autopilot was engaged, including nine that have resulted in the deaths of 14 people since 2014. It had also opened a preliminary investigation into 16 incidents in which Teslas on Autopilot crashed into emergency vehicles that had pulled over with their lights flashing.
As part of the order issued last year, NHTSA also collected data on crashes or incidents involving fully automated vehicles that are mostly still in development but are being tested on public roads. Manufacturers of these vehicles include GM, Ford and other traditional automakers as well as technology companies such as Waymo, which is owned by Google’s parent company.
These types of vehicles have been involved in 130 incidents, according to NHTSA. One resulted in a serious injury, 15 in minor or moderate injuries, and 108 in no injuries. Many of the crashes involving automated vehicles were fender benders or bumper taps, because the vehicles are mainly operated at low speeds and in city driving.
Waymo, which runs a fleet of driverless taxis in Arizona, was part of 62 incidents. GM’s Cruise division, which just started offering driverless taxi rides in San Francisco, was involved in 23. One minor crash involving an automated test vehicle made by the start-up Pony.ai resulted in the recall of three of the company’s test vehicles to correct their software.
NHTSA’s order was an unusually bold step for the regulator, which has come under fire in recent years for not being more assertive with automakers.
“The agency is collecting information to determine whether, in the field, these systems pose an unreasonable safety risk,” said J. Christian Gerdes, a professor of mechanical engineering and director of the Center for Automotive Research at Stanford University.
An advanced driver assistance system can steer, brake and accelerate a vehicle on its own, although drivers must remain alert and ready to take control of the vehicle at all times.
Safety experts are concerned that these systems allow drivers to relinquish active control of the car and could lull them into thinking their car is driving itself. When technology malfunctions or cannot handle a particular situation, drivers may not be ready to take control quickly.
The NHTSA order required companies to provide data on crashes in which advanced driver assistance systems and automated technologies were in use within 30 seconds of impact. Although this data provides a broader picture than ever before of how these systems behave, it is still difficult to determine whether they reduce crashes or otherwise improve safety.
The agency has not collected data that would allow researchers to easily determine whether using these systems is safer than turning them off in the same situations.
“The question is: What is the baseline against which we are comparing this data?” said Dr. Gerdes, the Stanford professor, who from 2016 to 2017 served as the first director of innovation for the Department of Transportation, of which NHTSA is a part.
But some experts say comparing these systems to human driving shouldn’t be the goal.
“When a Boeing 737 falls out of the sky, we don’t ask, ‘Is it falling out of the sky more or less than other airplanes?’” said Bryant Walker Smith, an associate professor in the University of South Carolina’s law and engineering schools who specializes in emerging transportation technologies.
“Crashes on our roads are equivalent to several plane crashes each week,” he added. “Comparison is not necessarily what we want. If there are any crashes these driving systems are contributing to – crashes that otherwise wouldn’t have happened – that’s a potentially fixable problem we need to know about.”