Driverless cars must learn to take the ethical route. In some countries (Germany among them), the right to life and dignity could mean that it is illegal to reduce a human life to a value processed by some algorithm's cost function.
So if you are riding in a driverless car that is about to crash and (potentially) kill two people, with only one person in the car, it would be illegal for any algorithm to decide that it is better to kill one person than two.
Would this mean that the injured or killed should be chosen at random?
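To make the distinction concrete, here is a minimal, hypothetical sketch (the crash scenario, option names, and functions are mine, not from the article) contrasting the cost-function decision that would be illegal under this reading with the random choice raised by the question above:

```python
import random

# Hypothetical crash options: each action maps to the group of
# people who would be harmed if the car steers that way.
options = {
    "swerve": ["passenger"],                             # one person in the car
    "stay_on_course": ["pedestrian_1", "pedestrian_2"],  # two people ahead
}

def cost_function_choice(options):
    """Utilitarian choice: minimize the number of lives lost.
    Under the reading above, this is the kind of calculation that
    would be illegal, since it assigns a numeric value to lives."""
    return min(options, key=lambda action: len(options[action]))

def random_choice(options):
    """The alternative raised above: pick who is harmed at random,
    without weighing one life against another."""
    return random.choice(list(options))

print(cost_function_choice(options))  # always "swerve" (1 < 2)
print(random_choice(options))         # "swerve" or "stay_on_course"
```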
Seen at the Financial Times via Reddit
Related: Mercedes' driverless cars will kill pedestrians over drivers