Even then, who wants the liability of coding the decisions when the car has to choose between: hitting a cyclist who will almost certainly die, with little threat to the occupants of its own vehicle; nailing the back end of a pickup, at higher risk to its own occupants but lower overall risk of serious injury or death; or running off the road, which could mean anything from a rough stop in the ditch to flying off a cliff?
Actually, it turns out to be a false premise. Those doing the studies and running the simulations have found that one answer is always the correct solution when a collision is imminent: Hit the brakes.
IMO, liability concerns like that will delay widespread self-driving cars for decades. Right now, collision avoidance isn't really about making complex decisions; it's just hitting the brakes and/or aiming for an empty chunk of road.
The way it's shaping up, pathfinding is the bigger concern, not liability. Self-driving cars still lack the ability to get you from point A to point B efficiently without becoming stuck or 'confused' and needing human assistance to tell them which way to go.
Safety they've actually had nailed for a while. Because guess what? Collision avoidance isn't about complex decisions: hitting the brakes and aiming for an empty chunk of road is relatively easy, and the constant 100% attention makes them better than human drivers, even if you're part of the 90% of drivers who think they're in the top 50%.
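The "brake and/or aim for empty road" rule described above can be sketched as a toy decision function. This is purely illustrative (the function name, the time-to-collision threshold, and the inputs are all made up for this sketch, not any manufacturer's actual logic), but it shows how simple the imminent-collision case is compared to the trolley-problem framing:

```python
from typing import Optional

def avoidance_action(time_to_collision_s: float,
                     open_lane: Optional[str]) -> str:
    """Toy sketch of the simple rule, not real vendor logic.

    time_to_collision_s: estimated seconds until impact at current speed
                         (hypothetical sensor-derived input).
    open_lane: an adjacent lane known to be empty, or None.
    """
    if time_to_collision_s > 3.0:
        # Plenty of margin: no evasive action needed yet.
        return "coast"
    if open_lane is not None:
        # Aim for the empty chunk of road.
        return "steer to " + open_lane
    # No empty road available: always hit the brakes.
    return "full brake"
```

The point is that no ethical weighing of outcomes appears anywhere: the hard part is the perception feeding the inputs, not the decision itself.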
As for the legislators, do you think they'll be able to stop themselves from mandating self-drivers for DUI convicts, the way they mandate ignition interlocks now? Or that the soccer moms won't put Johnny into a self-driver for his first car, for the safety and the insurance discount?