Autonomous cars have been a hot topic for a couple of years now. It is indeed a very interesting and complex technical challenge, connected to the potential resolution of serious problems and to innovative use cases. But it is also about culture and liability, which I believe always take a very long time to cope with.
The tragic news that a car in an autonomous trial in the US killed a woman yesterday can’t be a surprise to anyone. There have been many accidents with autonomous cars before; traffic is very dangerous, and cars kill people every day. Each accident is followed up carefully to understand what happened, why, and who is to blame for what. Traffic rules, laws, policies, insurance, vendor responsibilities and so on are in place to help deal with these tragic events. Here is the core problem with autonomous cars – when a car makes a mistake, who is liable? And do I want to meet self-driving cars on the road, or fly in airplanes without pilots?
Already in the Reuters article we start to see arguments about who owned the software in the car, which brand it was, whether the car actually did anything wrong, and whether a human driver would have been able to avoid the collision. There are also notes that the woman seems to have walked outside the crosswalk, that she had a bicycle, and that she might have been homeless. But the truth is that it is unclear who is responsible, and for what.
I remain convinced that we will not have self-driving cars on normal roads, together with other traffic and at normal speed, until you and I are ready to have our kids walk to school meeting those cars. My best guess is not before 2030. And given that “rain, snow and ice are particularly challenging for autonomous cars”, maybe we should just forget about it in Sweden.