Thanks for doing the math. I've always been of the mind that cars are ALWAYS going to kill people, because people move more dynamically, in more directions, more quickly than any car can. But that math makes me rethink that position, and makes it so I can't fucking wait for fully autonomous vehicles to be the norm, and show just how terrible humans are at navigating in 3D space. (Incidentally, the accident photos do not show any obstructions. They aren't comprehensive, but they don't show any vehicles parked along the curb or anything... so... I need to walk back even further on my potential victim-blaming...)
It's a peeve of mine. veen could comment at greater length and probably more eloquently, but fundamentally there are two viable approaches to autonomous vehicles: turn the whole world into a slot car track, or invent robots capable of driving better than humans.

Google is going the slot car route. This is obvious from a monetization standpoint: they want to sell the track. They've already got heinous map data, and now they're mapping the world to the quarter inch so that when something deviates from their known world, it's a hazard. Full stop. Update the map, upload to the mothership, act accordingly. Google will win this way because the product is the map, which they own entirely.

Tesla and Uber are going the robotaxi route. This is fucking stupid because AI ain't there yet. More than that, humans have a social contract with vehicles: we expect them to act like we do because there are people controlling them. Watch on your way home how much of the traffic around you fundamentally depends on the kindness of strangers; traffic follows flock dynamics, in which a very small set of instructions produces complex collective behavior (like dealing with traffic). AI isn't a part of that social contract. It can ape it, so long as things stay inside the 95th percentile. In the corner cases it falls apart. Accidents are, by definition, corner cases, but so long as you can be as smart as adaptive cruise control you can pretend you have an autonomous vehicle.

Google is pushing from zero straight to "there are no drivers" because in Google's opinion, human intervention is a false panacea. Google's testing shows that the human fuckin' checks out as soon as she's decided the car's got this, so they don't want the car operating in any conditions it can't handle 100% of the time. Which, if you're converting the world to a slot car track, works fine. You run or you don't. Tesla and Uber are basically coming up with a bells'n'whistles cruise control and pretending it's autonomous.
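The flock-dynamics point is easy to see in a toy model. Here's a minimal single-lane ring-road sketch (every number in it is a made-up illustration value, not real traffic data): each driver follows just two rules, keep a safe gap and otherwise speed up, yet slow one car for a moment and the flow behind it goes uneven anyway. A few simple instructions, complex collective behavior.

```python
# Toy "flock dynamics" traffic sketch: cars on a single-lane ring road.
# Each driver obeys only two rules -- brake to hold a minimum gap to the
# car ahead, otherwise accelerate toward a cruising speed. All parameter
# values are arbitrary illustration numbers.

def step(positions, speeds, road_len=1000.0, dt=0.5,
         v_max=30.0, min_gap=10.0, accel=1.5):
    """Advance every car one tick; returns (new_positions, new_speeds)."""
    n = len(positions)
    new_speeds = []
    for i in range(n):
        ahead = (i + 1) % n                          # car in front, on a ring
        gap = (positions[ahead] - positions[i]) % road_len
        v = speeds[i]
        if gap < min_gap + v * dt:
            # Rule 1: brake hard enough that we can't close below min_gap.
            v = max(0.0, min(v, (gap - min_gap) / dt))
        elif v < v_max:
            # Rule 2: otherwise, gently speed up toward cruising speed.
            v = min(v_max, v + accel * dt)
        new_speeds.append(v)
    new_positions = [(p + v * dt) % road_len
                     for p, v in zip(positions, new_speeds)]
    return new_positions, new_speeds
```

Start 20 evenly spaced cars at cruising speed, stop one of them for a single tick, and after a handful of steps the speeds are no longer uniform: the disturbance propagates backward through drivers who never saw the original slow car, which is the flock-like part. No car ever gets told "make a traffic wave"; the wave falls out of the two local rules.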
The fact that a car going 38 mph can clip a pedestrian walking her bike shows that they shouldn't be trusted to play. And I think it's really, really important for the future of transportation for people to understand that.
The video of the accident pushes my victim-blaming meter back up to 15%... https://boingboing.net/2018/03/22/dashcam-video-of-fatal-uber-co.html
C'mon. You know better. Edited: from your own link