June 5, 2016

"Here are 5 situations that, for now at least, often confound self-driving cars...."

"In the midst of busy traffic, a ball bounces into the road, pursued by two running children. If a self-driving car’s only options are to hit the children or veer right and strike a telephone pole, potentially injuring or killing the car’s occupants, what does it do? Should its computer give priority to the pedestrians or the passengers?"

Don't you want to decide for yourself just how much, if at all, you're going to sacrifice for the kids? Or is the answer that it depends on whether the engineers give priority to my self-interest? Won't your guilty conscience feel better if the car made the call?

57 comments:

sinz52 said...

It seems to me that this should just be one of the settings that you can set for yourself on the dashboard display screen:

If Accident is inevitable, give priority to:

1. Saving your own life

2. Saving the lives of your passengers

3. Saving the lives of pedestrians

And you just pick one of those options. You can even change your mind about it later.

The good news is that when you choose one of those options, that information can be instantly transmitted to your automobile insurer, who will reset your premium accordingly.
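sinz52's dashboard setting could be sketched roughly like this. To be clear, everything below — the enum, the field names, the risk numbers — is invented for illustration; no real vehicle exposes such an API:

```python
from enum import Enum

class CrashPriority(Enum):
    """Hypothetical dashboard setting: whose safety the planner weights first."""
    DRIVER = 1
    PASSENGERS = 2
    PEDESTRIANS = 3

def pick_maneuver(options, priority):
    """Choose the maneuver with the lowest estimated risk to the protected group.

    `options` is a list of dicts holding per-group risk estimates in [0, 1];
    all of these names and numbers are made up for the sake of the example.
    """
    key = {
        CrashPriority.DRIVER: "driver_risk",
        CrashPriority.PASSENGERS: "passenger_risk",
        CrashPriority.PEDESTRIANS: "pedestrian_risk",
    }[priority]
    return min(options, key=lambda o: o[key])

options = [
    {"name": "brake straight", "driver_risk": 0.1,
     "passenger_risk": 0.1, "pedestrian_risk": 0.7},
    {"name": "swerve to pole", "driver_risk": 0.6,
     "passenger_risk": 0.6, "pedestrian_risk": 0.05},
]
print(pick_maneuver(options, CrashPriority.PEDESTRIANS)["name"])  # swerve to pole
print(pick_maneuver(options, CrashPriority.DRIVER)["name"])       # brake straight
```

Note that this also illustrates sinz52's insurance point: the chosen `CrashPriority` value is exactly the kind of single number an insurer could reprice on.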

Bob Boyd said...

If given the option, most people would set their car somewhere in the middle of the range...in which case it would run over one of the kids and then smash into a pole.

David Begley said...

Are consumers really interested in self-driving cars or is this an engineering project to drive up the price of a car?

Sharc said...

Surely they can come up with an algorithm that instantaneously compares the relative value to society of the children vs. the driver.

rhhardin said...

"Confronted with a situation where you can save the life of an old lady or rescue the complete works of Shakespeare, what do you do?"

"Shakespeare is worth any number of old ladies."

Faulkner, I think.

So carry the works of Shakespeare.

Ignorance is Bliss said...

Don't you want to decide for yourself just how much, if at all, you're going to sacrifice for the kids?

Do passengers in a car with a human driver lose any sleep over this?

Bob Boyd said...

What if your self-driving car rounds a corner and suddenly you find yourself in the middle of an angry mob...and you have a Trump bumper sticker? Does the damn thing stop, at which point you are dragged from your car and beaten to death?
Or does it plow through, killing and injuring a dozen misguided souls, but delivering you safely to your yoga class with time to spare?
Questions for the salesman, I guess.

traditionalguy said...

What is one surplus human that causes CO2 emissions worth? I think we can predict the answer coming from the We Rule the World Digital Taipans who are all disciples of Malthus and want to eliminate most of the world's surplus life.

Rae said...

The solution is a network enabled ball. The car will tell the ball to bounce elsewhere, leading the children to safety.

virgil xenophon said...

Such moral dilemmas and technical problems will never be solved--which is why self-driving cars are non-starters. UNLESS, of course, the Feds require EVERYONE to give up their driver's license and use automated cars scheduled by the government, which would be nothing more than the functional equivalent of a government-controlled bus line, taking away all democratic spontaneity from the public (no more stopping by to say hello to Aunt Elsie on the spur of the moment on the way home from work!). As the philosopher Eric Voegelin once wrote: "The end result of all 'progressive' policies is totalitarianism."

PS: Many "Urban Planners" both in and out of government are ALREADY suggesting that requiring the public to give up their driver's licenses is the only workable solution, given the moral and technical problems involving mixed-use road and highway systems. (We also see the Feds' "encouragement" of building housing/commercial centers along rail nodes in metro areas, eliminating the need (supposedly) for individuals to drive, as naught but a back-door attempt at population control via the (very expensive) carrot rather than the stick of banning privately owned automobiles.)

Ryan Johnson said...

There is an underlying false assumption in this scenario. The self-driving car is not going to identify "children" in its decision making. It is going to identify an "obstacle" and not make a distinction between children, pets or road debris. Given the choice between a large off-road obstacle and a small on-road obstacle, the car is going to mow down the children every time...

Chuck said...

At least the self-driving car is not going to slam into the car in front, or remain stopped at a traffic light when the red light turns green, because it is busy updating its Facebook page.

So there's that. All of which will outweigh the children-versus-telephone pole hypotheticals by a thousand-fold.

Freeman Hunt said...

I think Ryan is right.

Even if he weren't, who would get into a car that was programmed to kill oneself on purpose sometimes? The car might kill you over a dog or a bag of leaves. No thanks.

whswhs said...

If you put a car on the market that is known to be programmed to sacrifice your life to save the lives of people outside the car, who's going to buy the car? The marketing seems impossible.

Bob Boyd said...

How are you going to conceal your steamy affair when the self-driving car will show your wife that it was parked in front of room 14 at the In Out Inn for two hours yesterday when you were supposed to be working?

Laslo Spatula said...

The Guy Who Drives the White Van With No Windows In The Back Says:

I'm driving down a city street when a dog runs out on the road right in front of me -- a little bull terrier.

I slam on the brakes, hard, and I'm sickly sure that there is not enough time to stop, but somehow I end up, tires smoking, just inches from the dog!

For a moment the dog and I share eye contact: there is a flashing sense of the Universe and our Places in it, we see it, Together, and I can see Gratitude in the dog's dark eyes.

Well, when I get out into the forests I find that the girl in the back of the van hit her head from the abrupt stop: she is bleeding from the top of her head, and looks frightened and disoriented: maybe she has a concussion.

"Am I OK?" she asks, tremulously.

"I had to stop to avoid hitting a dog," I say.

"You didn't hit the doggie, did you?" she asks.

"No, no. The dog is fine. Just a little scared."

"Ah Thank God. I hate to think of animals getting hurt."

"There is a Time and a Place for Everything," I tell her.

"Mister?"

"Yes?"

"I'm still a little scared."

"I understand."

"Am I going to be OK?"

"Well," I say, trying to think of the right words. "Today was the DOG'S Lucky Day," I reply.

I try to be as Honest as I can be, even in these uncomfortable situations...

I am Laslo.

Mark said...

Self-driving vehicles are most likely going to replace taxis and long-haul trucks only, in which case we can expect insurance rates to make sacrificing the passenger the only affordable option.

The self-driving car scene in the first season of Silicon Valley on HBO is hilarious, and it will be a long time until anyone matches it, as the setup is so perfect.

rhhardin said...

Self-driving car chase scenes in the movies will be awesome.

rhhardin said...

Self-driving airplanes have been around forever. It's called dynamic stability.

No additional mechanism needed.

Jack Wayne said...

Looks like some young programmer stayed up late one night and watched I, Robot. I wonder if he extrapolated to hacking a car to drive over an enemy? The interesting thing is how many people want to give up the freedom of driving for the "freedom" of riding.

Original Mike said...

" The interesting thing is how many people want to give up the freedom of driving for the "freedom" of riding."

Not me. Fortunately, I doubt the technology will "be there" in my lifetime. While the current level of technology may be impressive, what the developers have achieved so far is the easy stuff.

richlb said...

Or what if it's shoot the gorilla or see if it kills the child? Even now humans are debating that.

Richard said...

I tried to think of how Isaac Asimov's three (four) laws of robotics would address this issue, but they are not directly applicable. You could say that the Zeroth Law may come into play, but then you would have to know the impact on civilization of killing the children vs. killing the driver.

0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

However, it turns out that Asimov dealt with this issue in a short story entitled "Sally," in which cars fitted with positronic brains are apparently able to harm and kill humans in disregard of the First Law. The story portrays a future in which the only cars allowed on the road are those that contain positronic brains; these do not require a human driver.

https://en.wikipedia.org/wiki/Sally_(short_story)

steve uhr said...

Is the car going to take into account legal rules of liability before making a decision? If it is unable to stop in time to avoid the children, it probably has no liability. If it intentionally swerves to avoid the children and runs into bystanders, I would think it is much more likely to be liable for injury to those bystanders.

A swerve could be very costly. Probably best to hit the negligent kids.

SteveR said...

I'm almost 59 and this will never be a reality I have to deal with and the things I imagine are far more pleasant.

FullMoon said...

Whenever I scoff at self driving cars, California bullet trains, and Hyperloops, I realize I am like those who considered the automobile a fad that would never replace the horse.

mikee said...

I taught my kids to NOT swerve for small critters like squirrels, but to continue driving steadily, because hitting a squirrel will not damage the car or its occupants.

There is an old joke from driving school, about a kid who calls his dad from the middle of a farmer's fields, asking for help and a tow truck. The dad, of course, asks the kid why the car is in the middle of the field. The kid says, "Remember when you told me that if a squirrel crossed the road, I was supposed to keep driving and hit it? Well, it ran pretty fast, but I finally caught up to the little bastard!"

Roy Lofquist said...

I am sure that the engineers are well aware of the contingencies and are writing bit buckets full of code to deal with them. The thing is, that's a 1-per-100-billion-miles-driven scenario.

But what about when some yahoo tosses an empty beer can out the window or a tumbleweed suddenly appears right in front of you? That happens a lot, beer cans and tumbleweed both. You can see the plaintiff bar, the ambulance chasers, salivating already.

Some of us codgers remember the Chevrolet Corvair. It was an innovative rear-engine car that was quickly gaining in popularity. Then Ralph Nader wrote "Unsafe At Any Speed". We haven't seen a rear-engine car on the road since.

Original Mike said...

@FullMoon - Not everything scoffed at came to pass. Flying cars come to mind.

I don't doubt self-driving cars will become a reality. But I think proponents are dazzled with the progress so far and ignore that it gets harder from here on out.

Paul Snively said...

Ignorance is Bliss: Do passengers in a car with a human driver lose any sleep over this?

No, and the only reason we're having a conversation about this is because the general populace is unacquainted with the current state of the art in computerized decision making under uncertainty, even though we all engage in a truly staggering number of transactions based on exactly that. The only difference is that most such transactions are not life-and-death affairs.

Ryan Johnson: There is an underlying false assumption in this scenario. The self-driving car is not going to identify "children" in its decision making. It is going to identify an "obstacle" and not make a distinction between children, pets or road debris.

That's already false.

Ignorance is Bliss said...

whswhs said...

If you put a car on the market that is known to be programmed to sacrifice your life to save the lives of people outside the car, who's going to buy the car? The marketing seems impossible.

There will be vastly more cases where the car saves the lives of its occupants compared to a human driver than cases where the car costs the lives of its occupants.

People who avoid self-driving cars over this issue make Darwin smile.

Jupiter said...

There is a very simple solution to all such problems, which is to make the car drive very slowly, and stop frequently. And that is the solution that will eventually be taken, once the government and the lawyers are done. Already, they are causing crashes because they won't speed up to merge on a freeway if that requires exceeding the speed limit.

Original Mike said...

@Jupiter - Yep. And the Feds will outlaw people driving themselves. That's what I object to.

Char Char Binks said...

Ann said,

"Won't your guilty conscience feel better if the car made the call?"

But you still CHOSE to get in the self-driving car and go, and you still chose where it took you, so you're on the hook.

Original Mike said...

"Not everything scoffed at came to pass. Flying cars come to mind."

We have flying cars. They're called "airplanes".

Saint Croix said...

In the midst of busy traffic, a ball bounces into the road, pursued by two running children. If a self-driving car’s only options are to hit the children or veer right and strike a telephone pole

Can our nice liberal friends tell us if it's constitutional to use race in the algorithm?

I can just see HAL going, "illegal immigrant, negative .462 basis points."

Hey, is it 2028 in this hypothetical? It would be kind of ironic if 98-year-old Sandra Day O'Connor gets smacked by a self-driving car because the algorithm came up honky. Plus she's old!

Freeman Hunt said...

You can't make the car sacrifice its occupants to people outside the car. You would screw up the incentive for people to stay out of the road.

JAORE said...

No matter the choice made there will be attorneys ready and willing to sue that the choice was wrong.

Plus we see evidence routinely that, among the millions of lines of code, there are bugs that erupt only when unique, sometimes multiple, scenarios present themselves. No one can be sure the code is clean and green in all possible events, no one.

Plus, hacking.... oh the possibilities with hacking....

But like others, I'm old and it won't happen on my watch.... or to my motorcycle.

Ignorance is Bliss said...

JAORE said...

Plus we see evidence routinely that, among the millions of lines of code, there are bugs that erupt only when unique, sometimes multiple, scenarios present themselves. No one can be sure the code is clean and green in all possible events, no one.

Isn't the same true, one hundred times over, for the wetware currently driving your car?

JAORE said...

"Isn't the same true, one hundred times over, for the wetware currently driving your car?"

No argument. But a single responsible party is no fun. A megaCorp that issues the software is a source of endless fun.

Sammy Finkelman said...

The car will assume its occupants are wearing seat belts.

Original Mike said...

"Isn't the same true, one hundred times over, for the wetware currently driving your car?"

Yes, but tough shit. You can't drive my car, Ig.

Sammy Finkelman said...

Airplanes are like tachyons. They can't travel below a certain speed. A car is not like that.

Ignorance is Bliss said...

Sammy Finkelman said...

The car will assume its occupants are wearing seat belts.

Why would it assume something that it can already check in most cars, at least for the front seats?

rehajm said...

I test drove a Model X yesterday evening and used the autopilot feature. It easily distinguishes between an automobile, bicycle, pedestrian or object. It does an excellent job of maintaining lane position and adjusting at interstate speeds. The wife drove in stop-and-go traffic and it was flawless, though the salesman warned the car will follow the car in front if it drifts to the edge of the lane. It also has trouble with traffic merging into your lane. It is disconcerting to sit in the driver's seat letting the car drive 70 mph, and I would never trust it outside of bumper-to-bumper and other low-speed situations.

The lawyers will likely quash the technology before it gets going.

Bob R said...

I'm 59 too, and a couple of weeks ago, my wife and I had to take the car keys away from my 87-yo father-in-law. He's not happy. A self-driving car would come in real handy right now. I see that as a huge market.

Mike said...

I've seen this before, and these sorts of questions are mostly garbage. Most traffic deaths do not result from these sorts of moral quandaries. They result from people driving drunk or distracted or tired or too fast or too incautiously. An automated car is way, way less likely to face that dilemma in the first place because it is way less likely to be driving too fast and way less likely to be unaware of what's going on around it.

Original Mike said...

I'm kind of hoping to hit the sweet spot. Maybe I'll live long enough that they are available when it's time to take the keys away from me but not so long that I see the day when the government takes them away from its citizens.

Hammond X. Gritzkofe said...

If it ever comes to it, I hope to die in my sleep like my Grandfather - not yelling and screaming like the passengers in his car.

(Somebody had to say it.)

But to answer the question: "Won't your guilty conscience feel better if the car made the call?" No.

Achilles said...

If the car hits the kids it will take little damage.

If the car hits the telephone pole it will take catastrophic damage.

As it is now, a collision with a telephone pole rarely ends in serious injury to the occupants of the vehicle at anything less than extreme speeds. One would hope that in an area where balls bounce into the road followed by children, speeds will not be that high.

The answer to what will happen will have to do with who foots the bill for replacing the vehicle.

exhelodrvr1 said...

Do you feel better if a drone kills someone instead of a bomb from a piloted aircraft?

GRW3 said...

I think this will take a long time to reach the full fruition of the techie mind. Probably limited areas, say SF or New York City, where it will be intense. More likely, we'll see it used for long-distance driving for people, like Ann, who like to drive long distances (instead of flying), or people who do so professionally.

I make the San Antonio to Houston drive fairly often. I'd love to hand over the chore to the car from city limit to city limit. Once a year I drive up to Lubbock. Boy, that would be a great time for a self driving car.

Of real importance, the driverless cars must be polite. Stay out of the left lane, except while passing. Don't pass if it's going to take five minutes because average traffic is running above the speed limit. Give way when a car wants to pull into your lane and uses a turn signal. So on and so forth...

Rusty said...

This is only going to work if everybody has a self driving car or if a highway lane is reserved for self driving cars.

Rusty said...

Sammy Finkelman said...
The car will assume its occupants are wearing seat belts.

The car won't move unless everyone is wearing a seatbelt.

Peter said...

Let us assume that self-driving cars will, due to their faster reaction times and consistent performance, outperform human drivers to the point where injury accidents (including fatalities) are perhaps 50% lower than is the case now and, with improvements, approach 90% lower. In that case, no matter how the moral dilemma is resolved, the result is likely to be fewer children run over and fewer drivers killed by collisions with utility poles and other relatively immovable objects.

Assuming this is achievable, the obvious logical solution is to favor self-driving cars regardless of whether their algorithms favor occupants or pedestrians.

But, can mere logic prevent tort and liability law from bleeding makers of self-driving cars until they become unaffordable for practically everyone? If not, how likely is it that legislators might find sufficient political support to favor logic over bad hard-case legal precedent (and the Trial Lawyers Association of America)?
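Peter's arithmetic can be put in a few lines. The baseline figure below is purely hypothetical; only the 50% and 90% reductions come from his comment:

```python
# Hypothetical baseline: injury accidents per year with human drivers.
baseline = 100_000

# Apply the two reduction levels Peter posits for self-driving fleets.
for reduction in (0.50, 0.90):
    remaining = round(baseline * (1 - reduction))
    print(f"{reduction:.0%} safer fleet -> {remaining:,} injury accidents remain")
```

However the occupant-vs-pedestrian dilemma is resolved, the total pool of harm to allocate shrinks, which is the crux of his argument.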

Witness said...

More realistically, the car detected earlier that there were children playing near the side of the road and slowed down early, rendering the rest of the hypothetical moot.

(This is not unlike how most human drivers behave)

tim in vermont said...

Assuming that there is a probability function that describes the various outcomes, why not simply take the path that provides the lowest probability for any death? The idea that it would ever be a tie is absurd.
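tim in vermont's rule — among candidate paths, take the one whose worst-case individual death probability is lowest — can be sketched like this. The path names and probabilities are invented for illustration:

```python
# Estimated death probability per person, for each candidate maneuver.
# All numbers are made up; a real planner would estimate these from sensors.
paths = {
    "brake straight":   {"child_1": 0.30, "child_2": 0.30, "occupant": 0.01},
    "swerve into pole": {"child_1": 0.00, "child_2": 0.00, "occupant": 0.25},
}

def safest_path(paths):
    """Return the path minimizing the maximum death probability for any one person."""
    return min(paths, key=lambda p: max(paths[p].values()))

print(safest_path(paths))  # swerve into pole
```

With real-valued probability estimates an exact tie between paths is, as he says, vanishingly unlikely, so the argmin is effectively always unique.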

RonF said...

Is my kid in the car? I'll choose someone else's kid over me, but my kid over someone else's kid. The self-driving car won't know that.