How Safe Are Systems Like Tesla’s Autopilot? No One Knows.

Every three months, Tesla publishes a safety report that provides the number of miles driven per crash when drivers use the company’s driver-assistance system, Autopilot, and the number of miles driven per crash when they don’t.

These figures always show that accidents are less frequent with Autopilot, a collection of technologies that can steer, brake and accelerate Tesla vehicles on their own.

But the numbers are misleading. Autopilot is used mainly for highway driving, which is generally twice as safe as driving on city streets, according to the Department of Transportation. Fewer crashes may occur with Autopilot simply because it is typically used in safer situations.
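
To see why the raw comparison can mislead, here is a minimal sketch with invented numbers. Every figure below is hypothetical, chosen only to illustrate the statistical point: even if a system changes nothing about per-mile crash risk, using it mostly on highways makes its headline “miles per crash” figure look better.

```python
# Hypothetical illustration of how road mix can skew "miles per crash"
# comparisons. All numbers are invented for the sake of the example;
# they are not Tesla's, NHTSA's, or anyone else's actual figures.

# Assume highway driving is twice as safe as city driving, and that the
# per-mile risk is IDENTICAL whether or not the system is switched on.
MILES_PER_CRASH_HIGHWAY = 2_000_000  # hypothetical baseline
MILES_PER_CRASH_CITY = 1_000_000     # hypothetically half as safe

def miles_per_crash(highway_miles: float, city_miles: float) -> float:
    """Overall miles driven per crash, given miles on each road type."""
    expected_crashes = (highway_miles / MILES_PER_CRASH_HIGHWAY
                        + city_miles / MILES_PER_CRASH_CITY)
    return (highway_miles + city_miles) / expected_crashes

# System on: 90% of miles on highways. System off: 30% on highways.
with_system = miles_per_crash(highway_miles=9e6, city_miles=1e6)
without_system = miles_per_crash(highway_miles=3e6, city_miles=7e6)

print(f"With system:    {with_system:,.0f} miles per crash")   # ~1,818,182
print(f"Without system: {without_system:,.0f} miles per crash") # ~1,176,471
# The "with system" figure comes out roughly 55% better even though, by
# construction, the system changes nothing about crash risk: the gap is
# driven entirely by where the car is being driven.
```

A like-for-like comparison would require stratifying the data by road type (and ideally by time of day and driver), which is exactly the data the article says has not been provided.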

Tesla has not provided data that would allow a comparison of Autopilot’s safety on the same kinds of roads. Neither have other carmakers that offer similar systems.

Autopilot has been used on public roads since 2015. General Motors introduced Super Cruise in 2017, and Ford Motor brought out BlueCruise last year. But publicly available data that reliably measures the safety of these technologies is scant. American drivers, whether using these systems or sharing the road with them, are effectively guinea pigs in an experiment whose results have not yet been revealed.

Carmakers and tech companies are adding more vehicle features that they claim improve safety, but it is difficult to verify these claims. All the while, fatalities on the nation’s highways and streets have been climbing in recent years, reaching a 16-year high in 2021. It would seem that any additional safety provided by technological advances is not offsetting poor decisions by drivers behind the wheel.

“There is a lack of data that would give the public the confidence that these systems, as deployed, will live up to their expected safety benefits,” said J. Christian Gerdes, a professor of mechanical engineering and co-director of Stanford University’s Center for Automotive Research, who was the first chief innovation officer for the Department of Transportation.

GM collaborated with the University of Michigan on a study that explored the potential safety benefits of Super Cruise, but the researchers concluded that they did not have enough data to determine whether the system reduced crashes.

A year ago, the National Highway Traffic Safety Administration, the government’s auto safety regulator, ordered companies to report serious crashes involving advanced driver-assistance systems along the lines of Autopilot within a day of learning about them. The order said the agency would make the reports public, but it has not yet done so.

The safety agency declined to comment on what information it had collected so far but said in a statement that the data would be released “in the near future.”

Tesla and its chief executive, Elon Musk, did not respond to requests for comment. GM said it had reported two incidents involving Super Cruise to NHTSA: one in 2018 and one in 2020. Ford declined to comment.

The agency’s data is unlikely to provide a complete picture of the situation, but it could encourage lawmakers and drivers to take a closer look at these technologies and ultimately change the way they are marketed and regulated.

“To solve a problem, you first have to understand it,” said Bryant Walker Smith, an associate professor at the University of South Carolina’s law and engineering schools who specializes in emerging transportation technologies. “This is a way of getting more ground truth as a basis for investigations, regulations and other actions.”

Despite its abilities, Autopilot does not remove responsibility from the driver. Tesla tells drivers to stay alert and be ready to take control of the car at all times. The same is true of BlueCruise and Super Cruise.

But many experts worry that these systems, because they enable drivers to relinquish active control of the car, may lull them into thinking that their cars are driving themselves. Then, when the technology malfunctions or can’t handle a situation on its own, drivers may be unprepared to take control as quickly as needed.

Older technologies, such as automatic emergency braking and lane departure warning, have long provided safety nets by slowing or stopping the car or warning drivers when they drift out of their lane. But newer driver-assistance systems flip that arrangement by making the driver the safety net for the technology.

Safety experts are particularly concerned about Autopilot because of the way it is marketed. For years, Mr. Musk has said the company’s cars were on the verge of true autonomy, able to drive themselves in practically any situation. The system’s name also implies a level of automation that the technology has not yet achieved.

This may lead to driver complacency. Autopilot has played a role in many fatal crashes, in some cases because drivers were not prepared to take control of the car.

Mr. Musk has long promoted Autopilot as a way of improving safety, and Tesla’s quarterly safety reports seem to back him up. But a recent study from the Virginia Transportation Research Council, an arm of the Virginia Department of Transportation, shows that these reports are not what they seem.

“We know cars using Autopilot are crashing less often than when Autopilot is not used,” said Noah Goodall, a researcher at the council who explores safety and operational issues surrounding autonomous vehicles. “But are they being driven in the same way, on the same roads, at the same time of day, by the same drivers?”

Analyzing police and insurance data, the Insurance Institute for Highway Safety, a nonprofit research organization funded by the insurance industry, has found that older technologies like automatic emergency braking and lane departure warning have improved safety. But the organization says studies have not yet shown that driver-assistance systems provide similar benefits.

Part of the problem is that police and insurance data do not always indicate whether these systems were in use at the time of a crash.

The federal auto safety agency has ordered companies to provide data on crashes in which driver-assistance technologies were in use within 30 seconds of impact. This could provide a broader picture of how these systems are performing.

But even with that data, safety experts said, it would be difficult to determine if using these systems is safer than turning them off in the same situation.

The Alliance for Automotive Innovation, a trade group for car companies, warned that the federal safety agency’s data could be misconstrued or misrepresented. Some independent experts express similar concerns.

“My big worry is that we will have detailed data on crashes involving these technologies, without comparable data on crashes involving conventional cars,” said Matthew Wansley, a professor at the Cardozo School of Law in New York who specializes in emerging automotive technologies and was previously general counsel at an autonomous vehicle start-up called nuTonomy. “It might look like these systems are a lot less safe than they really are.”

For this and other reasons, carmakers may be reluctant to share some data with the agency. Under its order, companies can ask it to withhold certain data by claiming it would reveal business secrets.

The agency is also collecting crash data on automated driving systems, more advanced technologies that aim to remove drivers from cars entirely. These systems are often referred to as “self-driving cars.”

For the most part, this technology is still being tested in a relatively small number of cars with drivers behind the wheel as a backup. Waymo, a company owned by Google’s parent, Alphabet, operates a driverless service in the suburbs of Phoenix, and similar services are planned in cities like San Francisco and Miami.

Companies are already required to report crashes involving automated driving systems in some states. The federal safety agency’s data, which will cover the entire country, should provide additional insight into this area, too.

But of more immediate concern is the safety of Autopilot and other driver-assistance systems, which are installed on hundreds of thousands of vehicles.

“There is an open question: Is Autopilot increasing crash frequency or decreasing it?” Mr. Wansley said. “We may not get a complete answer, but we will get some useful information.”