Car Makers Take a Dangerous Turn

Auto companies are rolling out "hands-free" systems that can lull drivers into a false sense of security.

At a time when highway deaths in the U.S. are finally in a sustained decline again, after a surge related to COVID recklessness and distracted driving, car makers seem to be set on a dangerous path. Wanting to show off their AI chops, they are rolling out and emphasizing autonomy-lite capabilities that are convenient in the best of all possible worlds but that could well increase accidents and fatalities.

Basically, many car companies seem to be heading in the Tesla direction, bragging about their cars' autonomous capabilities while telling drivers they need to have their hands on the wheel and be constantly alert — a combination that just doesn't work. Once you tell people their cars can do the driving, they let their cars do the driving, and the sort of autonomy that exists this side of Waymo and a few other providers of fully autonomous vehicles simply isn't reliable enough yet. 

I've become what's called a 410-er, and I think insurers should be, too.

410 — unrelated to 420, a coinage celebrating cannabis that seems to amuse Elon Musk — refers to three of the six levels of autonomy: levels 4, 1, and 0, which I believe are the safe levels at the moment. Car makers, meanwhile, are focusing on providing levels 2 and 3 to the mass market, levels that I believe are fraught with danger. 

Level 0, as you can imagine, involves no autonomy. Level 1 is referred to as driver assistance — your car nudges you back toward the center of the lane if you start to drift, or your adaptive cruise control maintains a safe distance from the cars around you, but not both at once. Either feature provides little enough assistance that the driver stays fully engaged.

Level 4 is fully autonomous within a certain area and under certain conditions. As long as you stay within, say, a well-mapped area in good weather in daytime, you could be asleep in the back of the car. This level is safe, too — once the technology is proven, of course.

Combine those three levels, all of them helpful without overpromising to the driver, and you get to 410. You could also add Level 5, which is fully autonomous anywhere at any time, but we're not there yet, even with Waymo et al. I think we'll get to a 5410 paradigm, just not yet.

The problem is that car companies want to brag about levels 2 and 3, which can lull drivers into danger. Level 3 is especially seductive, because it promises "conditional driving automation": The system handles all aspects of driving under certain conditions, but the driver must be available and able to take over.

Google, Waymo's parent, tried a Level 3 approach years ago and quickly gave up. The technology was so good that Google employees who volunteered as test subjects soon zoned out and started checking their phones, playing video games or whatever — but the car wasn't totally reliable. And when the car told the driver to reengage, many seconds passed before they could stop whatever they were doing, size up the situation and act. At 75 mph, 10 seconds means almost 400 yards traveled while the driver is still taking control.
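If you want to sanity-check that figure, here's a quick back-of-the-envelope sketch (the 75 mph speed and the 10-second takeover window come straight from the example above; the conversion constants are standard):

```python
# Rough takeover-distance check: how far does a car travel
# in the time it takes a distracted driver to retake control?
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600
FEET_PER_YARD = 3

speed_mph = 75         # highway speed from the example above
takeover_seconds = 10  # reengagement window from the example above

feet_per_second = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR  # 110 ft/s
distance_yards = feet_per_second * takeover_seconds / FEET_PER_YARD

print(f"{distance_yards:.0f} yards")  # -> 367 yards, i.e. almost 400
```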

Tesla is finding out how dangerous Level 3 autonomy can be. It's faced numerous lawsuits over fatal accidents that occurred while drivers had engaged what Tesla calls Full Self-Driving (FSD) but which is really a Level 3 system. Tesla has mostly escaped liability by arguing that it warned drivers repeatedly that they were responsible for the car's actions, but it did recently lose a $240 million wrongful death judgment that should serve as a warning to Tesla, to other auto makers offering Level 3 systems and to auto insurers.

Tesla, seemingly unchastened, recently told its car owners that if they felt drowsy they should engage FSD — even though a drowsy driver would take even longer to reengage if told to by the AI driving the car. Other car companies are promising what are generally referred to as Level 2+ systems, and the research firm Telemetry says more than half of new cars will be equipped with such "hands-free" systems by 2028.

Car companies will surely promote these systems. Everybody wants to be seen at the cutting edge of technology, and AI can be a real selling point these days. Autonomy is exciting.

But I hope cooler heads will prevail. 

"Customers really love these hands-free systems, especially on longer drives, but God is in the details, and... the worst of these systems may result in preventable injury or worse," Telemetry says.

Traffic deaths in the U.S. had been declining steadily for decades, reaching a low of roughly 35,400 in 2014, but then climbed as smartphones tempted drivers with distractions and surged as COVID somehow made drivers more reckless. Deaths peaked at approximately 47,000 in 2021. They have dropped slowly since, falling to 44,700 last year, and declined a gratifying 8.2% in the first half of this year.

Let's keep the progress going and push back on efforts to promote Level 2+ autonomy as anything more than an occasional convenience. Level 2+ and Level 3 are wildly impressive technology — impressive enough to be truly dangerous.

410, 410, 410....

Cheers,

Paul

P.S. What happens when a driverless car commits a traffic violation? Who gets the ticket?

That was the issue facing police officers when a Waymo vehicle made an illegal U-turn right in front of them in San Bruno, CA, just south of San Francisco, a week ago. They pulled the car over — Waymo cars pull off to the side of the road when a police car turns on its emergency lights — but when they approached the car, they found... no one in it.

Under California law, tickets can't be issued to driverless vehicles until next summer. “Our citation books don’t have a box for ‘robot,’” the San Bruno Police Department noted. Even once tickets can be issued, they will carry no penalty.

Expect the law to change — and expect further oddities as autonomous vehicles become more common.