@Olivier5,
I think you are demonstrably wrong, Olivier.
1.
Computers don't have to be very "intelligent" to be better than human beings at driving. Humans get distracted. Humans get angry and act irrationally. Humans get tired. Humans get drunk. Humans sometimes just zone out for no reason. Humans are crappy drivers... yet we accept tens of thousands of deaths each year from car crashes, most of which are caused by human error.
2. Driving is a repetitive task; the type of task that can be automated.
Creativity while driving is a bad thing. Ninety percent of driving is simple rule following: stay in your lane, keep distance from other cars, don't hit anything. The difficult part is handling the unexpected... which is a hard programming challenge. But it can be done.
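To make the "simple rule following" point concrete, here is a toy sketch of that kind of logic. All the rule names and thresholds are invented for illustration; real driving software is vastly more sophisticated, but the priority-ordered structure is the same idea:

```python
# Toy sketch of rule-based driving logic. All thresholds are made up
# for illustration; rules are checked in priority order.
def driving_action(lane_offset_m, gap_to_lead_car_m, obstacle_ahead):
    """Return a driving action from a fixed set of hard-coded rules."""
    if obstacle_ahead:
        return "brake"            # don't hit anything
    if gap_to_lead_car_m < 30.0:
        return "slow_down"        # keep distance from other cars
    if abs(lane_offset_m) > 0.5:
        return "steer_to_center"  # stay in your lane
    return "maintain_speed"

print(driving_action(0.1, 50.0, False))  # maintain_speed
```

The point of the sketch is that none of these rules requires creativity; the genuinely hard part is everything that falls outside them.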
3. The technology can be tested, and is being tested. Remember that for self-driving cars to save lives, they only have to perform better than human beings. I think they can do much better.
What is happening right now is that the software is being tested in simulations, in laboratories, and on the roads.
Companies are investing lots of money into the technology because it is passing the tests.
4. As far as the moral questions... these will have to be solved by human programmers (unfortunately long before they will be answered). But machines already make these decisions. A certain number of people die because of seat belts (people who would have survived had they not been wearing them). We make the decision to use seat belts because more people are saved by wearing them than by not wearing them.
We can make these decisions about technology.
If the technology can significantly cut the number of deaths, meaning fewer deaths per mile driven from self-driving cars than from human-driven cars, would that change your mind?