
‘It’s everyone’s nightmare — the robot doesn’t stop’: Tesla recall and robotaxi crash throw self-driving car dream into crisis


It seemed like the ultimate tech dream.

Want to go out to dinner? Order a robotaxi. Road trip? Hop in your Tesla; don't worry, it will drive for you.

But now the mixture of science fiction and marketing boosterism has collided with reality — and some in the car industry believe the dream will be stuck in the garage for a long time to come.

Last week virtually every Tesla on the road was recalled over regulators’ concerns that its “Autopilot” system is unsafe, part of a years-long National Highway Traffic Safety Administration investigation into Elon Musk’s cars.

Musk has been the biggest public booster of the idea of the car doing the driving, promising imminent "full self-driving" as far back as 2016.

The Autopilot system allows Teslas to steer, accelerate and brake on their own, but requires an attentive driver at the wheel. Tesla is updating the software but maintains the system is safe.

Tesla has not gone as far as others, which have launched live trials on city streets of cars with empty driver's seats, among them GM's subsidiary Cruise; Google's Waymo unit; Volkswagen ADMT; Subaru; and Uber.

But a hair-raising crash in San Francisco on Oct. 2 involving a Cruise AV called Panini — a driverless Chevy Bolt — has thrown the “autonomous vehicle” project into crisis.

It started when a woman was hit by a normal car on the city’s Market St. and thrown into Panini’s path, ending up literally sandwiched under it.

Panini stopped — but incredibly, then restarted and dragged her 20 feet toward the curb at 7 mph. Firefighters had to use the jaws of life to free her.

On its own, it might have been dismissed as an error.


But it was just the latest in a long run of crashes, injuries and deaths in states that have allowed driverless trials.

This year alone, self-driving cars crashed 128 times in California. One Cruise AV hit a firetruck; another hit a bus. A Waymo car delayed a firefighter rushing to a 911 call by seven minutes. A Cruise Origin embedded itself in a building in Austin, Tex., and couldn't be moved because it has no steering wheel. And in San Francisco, a Waymo car killed a dog while a Cruise AV got stuck in wet concrete.

Now industry analysts warn there’s no way AV manufacturers can quickly convince the public that driverless transportation is safe.

James Meigs, a senior fellow at the Manhattan Institute and former editor-in-chief of Popular Mechanics, said Panini’s San Francisco crash shows autonomous vehicles — “AVs” — aren’t “ready for the wild.”

“It’s kind of like everyone’s nightmare — you know, the robot doesn’t stop,” Meigs told The Post.

The issue isn't that autonomous cars are objectively less safe than regular ones. They can't be impaired by alcohol or drugs, which a federal study found were a factor for up to half the drivers in serious or fatal crashes, and they have killed nowhere near the 42,795 people that cars claimed in the US in 2022.

In fact, said Jason Stein, former publisher of industry bible Automotive News and host of podcast “Cars & Culture with Jason Stein,” they’re far safer than putting a human behind the wheel.
