On the http://moralmachine.mit.edu website, you can choose between two different options in each scenario. The site's purpose is to show how complex it is to develop self-driving cars. A self-driving car sounds great, but it raises ethical problems, some of which are presented on the website. As a user, you decide how each situation should unfold and which people should survive. It is usually relatively easy to weigh situations that differ only in the number of people affected: when deciding whether to kill two people or six, the decision is made quickly. It is far more difficult to judge between social aspects.
How can a self-driving car recognize social aspects?
What if it makes mistakes when recognizing people because they wear a mask?
Who bears legal responsibility for self-driving cars?
As humans, we act instinctively and decide without noticing that we have just made a decision. A self-driving car, however, must decide explicitly: developers have to build these algorithms into the vehicles and ensure that they make the right decisions.
But what is the right decision?
Is it the right decision simply to follow the law? Nobody would ride in a car that might kill others by chance. These ethical concerns are also why so many countries have problems licensing self-driving cars.
The right solution to such problems does not exist, because the decision concerns people's lives and not just material things.