QOTD: When Should Your Self-Driving Car Ask For Help?

by Jack Baruth

Don’t look now, but it would appear that SkyNet has finally arrived — in an expert system designed to make certain judgments during autonomous vehicle operation. NetworkWorld’s breathless report states, “Basically the patented IBM system employs onboard sensors and artificial intelligence to determine potential safety concerns and control whether self-driving vehicles are operated autonomously or by surrendering control to a human driver.” We don’t need to worry about preserving John Connor’s life, or even conceiving that life (with your friend’s mom!) quite yet, however.

The definition of “artificial intelligence” that NetworkWorld is using could just as easily apply to your “smart”phone’s various character-recognition systems. But the problem that this so-called AI purports to solve is one that has far-reaching implications for the timeline, and methods, by which autonomous vehicle operation enters the mainstream.

And it leads to a very simple question.

According to the article, this “smart wingman” would be used to make a particular set of decisions.

“The idea is that if a self-driving vehicle experiences an operational glitch like a faulty braking system, a burned-out headlight, poor visibility, bad road conditions, it could decide whether the on-board self-driving vehicle control processor or a human driver is in a better position to handle that anomaly. If the comparison determines that the vehicle control processor is better able to handle the anomaly, the vehicle is placed in autonomous mode,” IBM stated.
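The patent summary describes, in effect, a comparison function: score how well the computer and the human could each handle the current set of anomalies, then hand control to whichever scores higher. A minimal sketch of that idea in Python follows; the anomaly names and capability scores are purely illustrative assumptions, not anything from IBM's actual patent.

```python
# Hypothetical sketch of the hand-off logic described above.
# Anomaly names and scores are invented for illustration only.

# Per-anomaly capability scores (0.0 = hopeless, 1.0 = unaffected)
# for each candidate controller.
ANOMALY_CAPABILITY = {
    "faulty_brakes":        {"computer": 0.4, "human": 0.7},
    "burned_out_headlight": {"computer": 0.9, "human": 0.8},
    "poor_visibility":      {"computer": 0.6, "human": 0.3},
}

def choose_controller(anomalies):
    """Return 'autonomous' or 'human', whichever party is judged
    more capable of handling the combined set of anomalies."""
    if not anomalies:
        return "autonomous"
    # Combine per-anomaly scores multiplicatively; the side with the
    # higher combined score takes (or keeps) control.
    computer = human = 1.0
    for a in anomalies:
        computer *= ANOMALY_CAPABILITY[a]["computer"]
        human *= ANOMALY_CAPABILITY[a]["human"]
    return "autonomous" if computer >= human else "human"
```

Under these made-up numbers, poor visibility keeps the computer in charge (sensors beat eyeballs), while a brake fault hands off to the human. The real difficulty, of course, is where those scores come from.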

This is the sort of thing that will be critical in autonomous-car adoption, because it has direct bearing on issues of responsibility and, more importantly, liability. We’ve already seen this with Tesla’s “Autopilot” and its inability to completely identify the set of all potential situations where a human being should really be paying attention. Tesla gets around this the same way that fireworks vendors in certain states do: they make you sign a form in which you claim you won’t be doing any of the things that you’re actually going to do. From a legal perspective, Autopilot is a sort of fatigue-mitigation device where you look straight ahead with both hands on or near the wheel, ready to take over at a sub-second’s notice.

This sort of subterfuge works well enough for Veblen goods sold to the inexplicably rich, but it won’t fly when the hoi polloi have to join the party. There will almost certainly be situations where an autonomous vehicle will require some level of human interaction, whether that takes the form of answering yes/no questions or manipulating a pop-up set of emergency controls. So it’s merely a question of when the car should give up and ask for advice or control from the on-board wetware.

The answers to this will not be cut and dried. Is an SCCA Runoffs champion better at handling a car with a disabled brake caliper than the computer would be? Maybe and maybe not. Is a 16-year-old with minimal training better? Almost certainly not. But what about that burned-out headlight? What combinations of problems will require human intervention? Remember that the autonomous vehicle fleet of the future will stop being shiny and perfect about three months after the first sizable batch of cars hits the ground. After that, it’s basically the low road to Children Of Men. What happens when a child in a low-income community is sick and Prole Mom, who can no more drive than she can conjugate Latin, demands that a car take her to a hospital? When the “smart wingman” determines that Onboard Motor #3 is in a suboptimal state, what will it do? Refuse to go?

If you ask me, the road to these answers will be paved in lawsuits. And bankruptcies. And blood. But if you disagree, let’s hear it.


Comments
  • V-Strom rider on Apr 03, 2017

    228 people were killed on Air France 447 when the autopilot handed over to the human crew and they failed to take correct action. The fact that the correct action was "do nothing" doesn't obviate the fact that the crash was caused by the hand-off. This was in 2009 so it's not a new, or unforeseeable, problem.

  • V-Strom rider on Apr 03, 2017

    Bottom line - my own definition of an autonomous vehicle is one where I can get into the passenger seat and go for a ride. Any so-called autonomous vehicle that expects me to be ready to take over will have the "autopilot" switched off and I'll drive myself. If that can't be done I'll ride in something else!
