ilex  ·  1846 days ago  ·  link  ·    ·  parent  ·  post: Uber's 5.6 Seconds of Incompetence

This is more than the classification software getting confused. This is more than a couple programming mistakes. This is more than bad software design.

Uber failed at even the most basic safety engineering. Not only did they not consider safety when building their own stuff, they even intentionally disabled existing safety measures.

The thing that really gets me is that self-driving cars have a lot of hard problems to solve, but none of these failures are open problems! Engineers in safety-conscious fields have been thinking about and preventing problems like these for fucking decades. We know how to keep drivers paying attention even when the thing they're driving is mostly automated. We know how to alert operators to unusual circumstances early so they can make informed decisions. We know, for christ's sake, how to run several unreliable control systems and take the majority vote of their results if we're worried one might be making a mistake! This shit isn't even new -- someone from 1995 could tell you how to do all these things without knowing anything about the decades of technological advancement since.
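That majority-vote idea is classic redundancy engineering (triple modular redundancy). A minimal sketch of the voting step -- all names here are hypothetical, not anything from Uber's actual system:

```python
from collections import Counter

def majority_vote(outputs):
    """Return the value reported by a strict majority of redundant
    channels, or None if no value wins a majority (a detected fault)."""
    value, count = Counter(outputs).most_common(1)[0]
    if count > len(outputs) // 2:
        return value
    return None  # channels disagree: fail safe (e.g. brake, alert the operator)

# Three redundant classifiers looking at the same obstacle:
print(majority_vote(["pedestrian", "pedestrian", "bicycle"]))  # pedestrian
print(majority_vote(["pedestrian", "bicycle", "unknown"]))     # None -> fail safe
```

The point isn't the ten lines of code; it's the design stance: assume any single channel can be wrong, and make disagreement itself a detectable event that triggers a safe fallback instead of a guess.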

But goddamn it, software engineers, if they think of anything, think of security -- how to stop "bad things" from happening. Or they think in probabilistic terms -- 90% accuracy is pretty good, right? Safety engineers, though, know that no matter what the probabilistic models say, something bad will eventually happen -- so the question becomes how you stop that bad thing from hurting humans.

And then, on top of all that, you have some really fucking stupid programming decisions that should never have made it onto a public road in the first place. Jesus christ.