Google’s autonomous cars are already a regular sight on California’s highways, Volvo has proposed a commercial self-driving car in Sweden by 2017, and self-driving Teslas could hit the road within the year. But has anyone thought this through?
There will inevitably be life-threatening accidents: a tyre blows, a tree falls across the road, you name it. But think on this. The computer that controls the car you are in will, with lightning speed, work out what to do. In an extreme case, should it let you and your car run over and kill, say, five innocent bystanders, or kill you to save them? A decision has to be made, so I suppose the program that controls your car will have a switch which instructs it either to save you at all costs or to let you die to possibly save others. Which position will you put that kill switch in at the start of your journey?
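To make that concrete, here is a minimal sketch of what such a switch might look like in software. It is purely hypothetical: the sacrifice_self flag, the Outcome record, the choose_action function and the harm figures are all invented for illustration, and no manufacturer has published logic like this.

```python
# Hypothetical sketch only: no manufacturer has published such logic.
# The sacrifice_self flag stands in for the "kill switch" described above.

from dataclasses import dataclass

@dataclass
class Outcome:
    action: str            # e.g. "swerve" or "stay_course"
    occupant_deaths: int   # expected deaths inside the car
    bystander_deaths: int  # expected deaths outside the car

def choose_action(outcomes: list[Outcome], sacrifice_self: bool) -> Outcome:
    """Pick an outcome according to the position of the switch."""
    if sacrifice_self:
        # Switch set to "let me die": minimise bystander deaths first.
        return min(outcomes, key=lambda o: (o.bystander_deaths, o.occupant_deaths))
    # Switch set to "save me at all costs": minimise occupant deaths first.
    return min(outcomes, key=lambda o: (o.occupant_deaths, o.bystander_deaths))

# The extreme case from the text: run over five bystanders, or kill the occupant.
outcomes = [
    Outcome("stay_course", occupant_deaths=0, bystander_deaths=5),
    Outcome("swerve", occupant_deaths=1, bystander_deaths=0),
]
print(choose_action(outcomes, sacrifice_self=True).action)   # swerve
print(choose_action(outcomes, sacrifice_self=False).action)  # stay_course
```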
The dilemma is a modern retelling of the famous Trolley Problem. In one variation, imagine you are the operator of a railway switch and you see a train car hurtling toward a group of five children playing on the tracks. You can flip the switch to divert the train onto a separate track, thereby saving the five children, but you notice that your own child has stumbled onto the alternate track. What do you do? And what would a computer do, when the decision is based purely on logic, unclouded by emotion or compassion? The dilemma has been given some thought at the University of Alabama, but no decision has been reached about what the programmers of these self-driving cars are to do.
Here are the opposing views. Ethicists classify the two viewpoints on this situation as utilitarianism and deontology. Utilitarianism aims for the greatest good for the greatest number: the car kills you to save a larger number of others, just as you flip the switch and kill your own child in order to save the children of five other families. Deontology holds that certain actions are categorically wrong and can never be justified: flipping the switch to kill one child is an unacceptable act of killing, whereas standing by and letting the train hit the five children is passive. By that reasoning, programming a car to choose the death of one person is programming it to be an active killer.
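As a rough illustration of how differently the two policies behave, here is a second hypothetical sketch, again with invented names and made-up numbers. The utilitarian rule simply minimises total expected deaths; the deontological rule refuses any option that actively redirects harm onto someone, even when standing by costs more lives.

```python
# Hypothetical sketch of the two ethical policies; not real vehicle code.

from dataclasses import dataclass

@dataclass
class Option:
    name: str
    deaths: int           # total expected deaths if this option is taken
    actively_kills: bool  # True if the option redirects harm onto someone

def utilitarian(options: list[Option]) -> Option:
    # Minimise total deaths, regardless of who causes them.
    return min(options, key=lambda o: o.deaths)

def deontological(options: list[Option]) -> Option:
    # Never choose an option that actively kills someone; among the
    # permissible options, pick the least harmful. If every option
    # actively kills, fall back to minimising deaths.
    permissible = [o for o in options if not o.actively_kills]
    candidates = permissible if permissible else options
    return min(candidates, key=lambda o: o.deaths)

# The trolley variation from the text.
options = [
    Option("do_nothing", deaths=5, actively_kills=False),   # train hits five children
    Option("flip_switch", deaths=1, actively_kills=True),   # train hits your child
]
print(utilitarian(options).name)     # flip_switch
print(deontological(options).name)   # do_nothing
```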
So, back to the basic question: when you get into your driverless car, which way will you flip the switch, kill or be killed?