Tesla crash casts doubt on safety of self-driving cars


BERKELEY, California, April 20 - The fatal crash of a Tesla with no one apparently behind the wheel has cast new light on the safety of semi-autonomous vehicles and the nebulous U.S. regulatory terrain they navigate.

On Saturday, police in Harris County, Texas, said a Tesla Model S smashed into a tree at high speed after failing to negotiate a bend and burst into flames, killing one occupant found in the front passenger seat and the owner in the back seat.

On Monday, Tesla CEO Elon Musk tweeted that preliminary data retrieved by Tesla indicated the vehicle was not operating on Autopilot and was not part of the automaker's Full Self-Driving (FSD) system.

Autopilot and FSD, along with the growing number of similar semi-autonomous driving functions in cars made by other automakers, present a challenge to officials responsible for motor vehicle and highway safety.

The National Highway Traffic Safety Administration (NHTSA) has yet to issue regulations or performance standards for semi-autonomous systems such as Autopilot, or for fully autonomous vehicles.

There are no NHTSA rules requiring carmakers to ensure the systems are used as intended or to stop drivers from misusing them. The only significant federal limitation is that vehicles have steering wheels and human controls required under federal rules.

With no federal performance or design standards, systems such as Autopilot inhabit a regulatory grey zone.

The Texas crash follows a string of crashes involving Tesla cars driven on Autopilot, its partially automated driving system that performs a range of functions such as helping drivers stay in lanes and accelerate on highways.

Since October, Tesla has also rolled out what it describes as a "beta" version of its FSD system to about 2,000 customers, effectively allowing them to test how well it works on public roads.

Harris County police are now seeking a search warrant for Tesla data and said witnesses told them the victims intended to test the car's automated driving.

Adding to the regulatory confusion, NHTSA has traditionally overseen vehicle safety, while departments of motor vehicles in the individual states regulate drivers.

When it comes to semi-autonomous functions, it may not be apparent whether the onboard computer or the driver is controlling the car, or whether the supervision is shared, the National Transportation Safety Board (NTSB) says.

California has implemented AV regulations, but they apply only to cars equipped with technology that can perform the dynamic driving task without the active physical control or monitoring of a human operator, the state's DMV told Reuters.

It said Tesla's full self-driving system does not yet meet those standards and is considered a type of advanced driver assistance system that it does not regulate.

That leaves Tesla's Autopilot and its FSD system operating in California in regulatory limbo as the automaker rolls out new versions of the systems for its customers to test.

NHTSA, the federal body responsible for vehicle safety, said this week it had opened 28 investigations into crashes of Tesla vehicles, 24 of which remain active; at least four, including the fatal Texas accident, have occurred since March.

NHTSA has repeatedly argued that its broad authority to demand automakers recall any vehicle that poses an unreasonable safety risk is sufficient to address driver assistance systems.

So far, the agency has not taken any enforcement action against Tesla's advanced driving systems.

Jen Psaki, a White House spokeswoman, said the NHTSA was actively involved with Tesla and local law enforcement regarding the Texas crash.

The NTSB, a U.S. government agency charged with investigating road accidents, has criticised the hands-off approach to regulating cars with self-driving features and AVs.

NHTSA refuses to take action for vehicles termed as having partial or lower levels of automation, and continues to wait for higher levels of automation before requiring that AV systems meet minimum national standards, NTSB chairman Robert Sumwalt wrote in a Feb. 1 letter to NHTSA.

Because NHTSA has put in place no requirements, manufacturers can test vehicles virtually anywhere, even if the location exceeds the AV control system's limitations, the letter said.

NHTSA told Reuters that with a new administration in place, it was reviewing the regulations around AVs and welcomed the NTSB's input as it advanced policies on automated driving systems.

It said the most advanced technologies available in cars today require a fully attentive human driver at all times.

Abusing these technologies is, at a minimum, distracted driving. All states hold the driver responsible for the safe operation of the vehicle, NHTSA told Reuters.

NTSB also says that the NHTSA does not have any method to verify whether carmakers have adopted system safeguards. For example, there are no federal regulations that require drivers to touch the steering wheel within a specific time frame.

NHTSA is drafting regulations for semi-autonomous vehicles but has been slow to regulate autonomous vehicles, said Bryant Walker Smith, a law professor at the University of South Carolina. There is a growing awareness that these systems deserve more attention from the regulator and better governance.

New York has a law requiring drivers to keep at least one hand on the wheel at all times, but no other states have legislation that could prevent the use of semi-autonomous cars.

When it comes to AVs, 35 states have enacted legislation or their governors have signed executive orders covering autonomous vehicles, according to the National Conference of State Legislatures.

Such rules allow companies such as Google and General Motors, among others, to test their Waymo and Cruise vehicles on public roads.

AV regulations in Texas state that vehicles must comply with NHTSA processes, though no such federal regulations exist. The Texas Department of Public Safety, the regulator charged with overseeing AVs, did not respond to a request for comment.

Arizona's transportation department requires companies to submit filings, among other things, verifying that vehicles can operate safely if the autonomous technology fails.

While many automakers offer vehicles with semi-autonomous driving features, no fully autonomous vehicles are on sale to customers in the United States.

Concerns about autonomous driving technology, however, have been mounting in recent years and Tesla has warned about its limitations.

In February 2020, Andrej Karpathy, Tesla's director of Autopilot technology, identified a challenge for its autonomous driving system: how to recognize when a parked police car's emergency flashing lights are on.

This is an example of a new task that we want to know about, said Karpathy at a conference during a talk about Tesla's effort to deliver FSD technology.

In just over a year since then, Tesla vehicles have crashed into parked police cars on four occasions, and since 2016 at least three Tesla vehicles operating on Autopilot have been involved in fatal crashes.

All four incidents have been investigated by U.S. safety regulators, the police and local governments, officials told Reuters.

At least three of the vehicles were on Autopilot, police said. In one of the cases, a doctor was watching a movie on a phone when his car crashed into a police vehicle in North Carolina.

Tesla did not immediately respond to a request for comment. The accidents and investigations have not slowed Musk's drive to promote Tesla cars as capable of driving themselves.

In a recent tweet, Musk said Tesla was almost ready with FSD Beta V9.0.

The step improvement is massive, especially for weird corner cases and bad weather. No radar, pure vision, he said. Tesla has also said it uses the roughly 1 million cars on the road to collect image data and improve Autopilot, using machine learning and artificial intelligence.

Karpathy of Tesla said he has ridden in his Tesla for 20 minutes to get coffee in Palo Alto without intervention.

It is not a perfect system but it is getting there, he said in a Robot Brains podcast in March.

I always keep my hands on the wheel, he added.
