Viewpoint

Self-driving cars pose risks to drivers

No one likes a backseat driver. Nagging, nannying. Questioning every decision, constantly attempting to correct what he or she considers to be your errors of judgment.

How about an “it” doing the same thing? One you can’t kick to the curb?

The “it” in question being the backseat computer. The one that, in the not-too-distant future, will take over the driving entirely rather than merely second-guessing yours.

It’s the self-driving or autonomous car. In fact, it’s already here. Bits and pieces of it, anyhow. Many new cars have collision avoidance systems that can completely stop the car without the driver even touching the brakes.

Next year, GM’s Cadillac division will debut vehicle-to-vehicle, or V2V, communications in some models.

The system makes it possible for cars so equipped to have electronic conversations among themselves – to be aware of one another’s relative position and velocity – in order to anticipate and hopefully avoid potential collisions.
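
To make that idea concrete, here is a minimal sketch, in Python, of how one car might use a neighbor’s broadcast position and velocity to estimate time to collision. The message format and function names below are invented for illustration; real V2V systems broadcast standardized “basic safety messages” (SAE J2735) roughly ten times per second, with many more fields and far more sophisticated threat assessment.

    from dataclasses import dataclass

    # Hypothetical, simplified safety message. Real V2V broadcasts
    # (SAE J2735 basic safety messages) carry many more fields.
    @dataclass
    class SafetyMessage:
        x: float   # meters east of a shared reference point
        y: float   # meters north
        vx: float  # velocity east, m/s
        vy: float  # velocity north, m/s

    def time_to_collision(me, other):
        """Seconds until closest approach, assuming both cars hold
        their current velocity; infinity if the gap is not shrinking."""
        rx, ry = other.x - me.x, other.y - me.y      # relative position
        vx, vy = other.vx - me.vx, other.vy - me.vy  # relative velocity
        closing = -(rx * vx + ry * vy)   # positive means converging
        speed_sq = vx * vx + vy * vy
        if closing <= 0 or speed_sq == 0:
            return float("inf")
        return closing / speed_sq

    # Example: a car 50 m ahead brakes to 10 m/s while we hold 30 m/s.
    me = SafetyMessage(x=0, y=0, vx=30, vy=0)
    ahead = SafetyMessage(x=50, y=0, vx=10, vy=0)
    if time_to_collision(me, ahead) < 4.0:  # warn about 4 seconds out
        print("forward collision warning")  # fires: TTC is 2.5 s

The point of the exchange is simply that each car can run a check like this against every neighbor it hears from, seconds before either driver could see the danger.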

These are some of the elements of the fully autonomous, self-driving car. And some of it sounds good. But taking the driver out of the equation entirely – or relying too much on technology – can have its downsides, too.

Computers develop glitches sometimes. It’s annoying when it happens at your desk. But it could be lethal when it happens at 75 mph on the freeway.

And it’s probably more likely to happen with an autonomous car, because its computer must endure extreme temperatures, vibration, moisture and so on.

Something’s likely to go on the fritz. If the human driver has become a passenger, what will happen?

And who will be responsible? Legally speaking, the driver is currently responsible for the safe operation of his vehicle.

But how can we hold him responsible when he’s no longer the driver?

Will the manufacturer of the self-driving car be liable in that case?

If the driver no longer is a driver, why should he be required to buy insurance? Or have a license?

An even bigger problem with autonomous cars is how to program them to disregard traffic laws when breaking a rule is the only way to avoid an accident.

For example, it’s illegal to cross the double yellow line – but what if a child runs into the car’s path and the only way to avoid hitting her is to swerve out of the way?

A human driver would do it. An autonomous car wouldn’t.
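
One way to see why this is hard to program: the car’s software cannot treat traffic laws as absolute constraints; it has to weigh a violation against the harm it would prevent. The toy cost-based sketch below is entirely hypothetical, with made-up penalty numbers (no production system publishes its logic in this form), but it shows how a planner could be written to cross the line when the legal option is catastrophic:

    # Toy cost-based maneuver choice. The penalties are made up;
    # the point is that a traffic law is a cost, not an absolute bar,
    # so a large enough hazard can justify breaking it.
    LANE_VIOLATION_COST = 100.0     # crossing the double yellow
    COLLISION_COST = 1_000_000.0    # hitting the child: effectively forbidden

    def choose_maneuver(child_in_lane, oncoming_traffic):
        # Assume braking alone cannot stop the car in time.
        options = {
            "brake_in_lane": COLLISION_COST if child_in_lane else 0.0,
            "swerve_across_line": LANE_VIOLATION_COST
                + (COLLISION_COST if oncoming_traffic else 0.0),
        }
        return min(options, key=options.get)

    print(choose_maneuver(child_in_lane=True, oncoming_traffic=False))
    # "swerve_across_line": the legal choice is catastrophic, so the
    # planner accepts the lesser penalty of crossing the line.

Getting those weights right for every scenario a child, a deer, a stalled truck could present is exactly the open programming problem.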

Also, how will autonomous cars deal with cars that aren’t autonomous? Will people who own human-controlled cars be required to turn them in, or no longer be allowed to drive them?

Technology is usually a good thing, but problems arise when technology is no longer under human control, as could happen here.

Technology that assists human drivers – that’s a great idea. But technology that pre-empts them and dumbs them down – that could be a very bad idea, indeed.

Eric Peters is a veteran automotive journalist and author of “Road Hogs” and “Automotive Atrocities.”