Tesla | Safety questions raised over autonomous driving

(Detroit) Three times in the past four months, Truist Securities technology analyst William Stein has accepted Elon Musk's invitation to try out the latest versions of Tesla's Full Self-Driving system.

A Tesla equipped with the technology, the company says, can drive itself from one point to another with minimal human intervention. Yet every time Mr. Stein drove one of the cars, he said, the vehicle made dangerous or illegal maneuvers. His most recent test drive earlier this month left his 16-year-old son, who was with him, “terrified,” he said.

Stein's experiences, along with a crash involving a Tesla in the Seattle area that killed a motorcyclist in April, have drawn the attention of federal regulators.

Tesla's automated driving systems have already been under investigation for more than two years due to dozens of crashes that have raised safety concerns.

The problems have led people who monitor autonomous vehicles to become more skeptical about the ability of Tesla's automated system to operate safely at scale.

Mr. Stein said he doubts Tesla will even come close to deploying a fleet of self-driving robotaxis by next year as Elon Musk has predicted.

A decisive moment

The latest incidents come at a crucial time for Tesla. Elon Musk has told investors that Full Self-Driving may be able to operate more safely than human drivers by the end of this year, or next year at the latest.

And in less than two months, the company is expected to unveil a vehicle built specifically to be a robotaxi. In order for Tesla to put robotaxis on the road, Musk has said the company will have to show regulators that the system can drive more safely than humans.

Under federal rules, Teslas must meet national vehicle safety standards.

Mr. Musk has released data showing the number of miles driven per crash, but only for Tesla’s less sophisticated Autopilot system. Safety experts say the data is invalid because it only takes into account serious crashes with airbag deployments and does not show how often human drivers had to take over to avoid a collision.


PHOTO DAVID SWANSON, REUTERS ARCHIVES: Tesla CEO Elon Musk

Full Self-Driving is used on public roads by about 500,000 Tesla owners, or just over one in five Teslas on the road today. Most of them paid US$8,000 (about CAD$10,500) or more for the optional system.

Human intervention remains required

The company has warned that cars equipped with the system cannot truly drive themselves and that drivers must be prepared to intervene at all times if necessary. Tesla also says it tracks each driver's behavior and will suspend their access to Full Self-Driving if they fail to properly monitor the system. Recently, the company started calling the system “Full Self-Driving (Supervised).”

Elon Musk, who has acknowledged that his past predictions about self-driving cars have proven too optimistic, promised in 2019 a fleet of autonomous vehicles by the end of 2020. Five years later, many technology observers say they doubt the system will work across the United States as promised.

“It's not even close, and it won't be next year,” said Michael Brooks, executive director of the Center for Auto Safety.

The car Mr. Stein was driving was a Tesla Model 3, which he picked up at a Tesla showroom in Westchester County, north of New York City. The car, Tesla’s least expensive vehicle, was equipped with the latest Full Self-Driving software. Mr. Musk says the software now uses artificial intelligence to help control the steering and pedals.

During his drive, Mr. Stein said the Tesla felt smooth and more human-like than previous versions. But on a trip of less than 10 miles, he said the car veered left from a through lane while running a red light.

“It was stunning,” he said.

He said he didn't take control of the car because there was little traffic and the maneuver didn't seem dangerous at the time. Later, however, the car drove down the middle of a roadway, straddling two lanes of traffic moving in the same direction. This time, he said, he intervened.

The latest version of Full Self-Driving, Stein wrote to investors, does not “solve the autonomy problem” as Musk predicted. Nor does it “appear to come close to the capabilities of a robotaxi.” In two previous tests, in April and July, Stein said Tesla vehicles also surprised him with dangerous behavior.

Tesla did not respond to messages seeking comment.

While he believes Tesla will eventually make money from its driving technology, Stein doesn't foresee a driverless robotaxi in the near future. He predicted the system will be significantly delayed or limited in where it can travel.

There is often a big gap, Stein said, between what Musk says and what is likely to happen.

Technology not up to par

Alain Kornhauser, who directs autonomous vehicle studies at Princeton University, said he drove a Tesla borrowed from a friend for two weeks and found that it consistently spotted pedestrians and detected other drivers.

While the car works well most of the time, Kornhauser said he has to take control when the Tesla makes moves that scare him. He cautions that Full Self-Driving isn’t ready to be left unsupervised everywhere. “This thing,” he said, “isn’t at the point where it can go anywhere.”

Mr. Kornhauser thinks the system could operate autonomously in smaller areas of a city where detailed maps help guide vehicles. He wonders why Mr. Musk doesn't start by offering rides on this smaller scale.

“People could really use the mobility that this could bring,” he noted.

For years, experts have warned that Tesla's system of cameras and computers isn't always able to spot objects and determine what they are.

Cameras can’t always see in bad weather or darkness. Most other robotaxi companies, such as Alphabet’s Waymo and General Motors’ Cruise, combine cameras with radar and laser sensors.

“If you can't see the world properly, you can't plan, move and act properly in the world,” said Missy Cummings, a professor of engineering and computer science at George Mason University.

Even cars that are equipped with lasers and radars can't always drive reliably, she said, raising safety questions about Waymo and Cruise. Representatives for Waymo and Cruise declined to comment.

Phil Koopman, a professor at Carnegie Mellon University who studies the safety of autonomous vehicles, believes it will be many years before autonomous vehicles running solely on artificial intelligence are able to handle all real-world situations.

“Machine learning has no common sense and learns narrowly from a large number of examples,” he said. “If the computer that's driving gets into a situation it hasn't been trained to handle, it's likely to crash.”

In April, in Washington state near Seattle, a Tesla operating with Full Self-Driving struck and killed a motorcyclist, authorities said. The Tesla driver, who has not yet been charged, told authorities that he was using Full Self-Driving and looking at his phone when the car hit the motorcyclist from behind. The motorcyclist was pronounced dead at the scene.
