While I agree that full self-driving in 100% of conditions is still a ways off, it is short-sighted to say it's not going to happen. Right now Tesla Autopilot is like a beefed-up adaptive cruise control. There are plenty of warnings that state to keep your eyes on the road and hands on the wheel. In no way is Tesla's current iteration of Autopilot "self-driving". They are making great strides towards that but are still not there yet. You are judging this accident from a human perspective rather than seeing it as an engineering problem. Are humans currently better at seeing perpendicular semi trucks crossing in front of them in certain circumstances? Apparently yes. Are humans better than computers at keeping their attention on the road 100% of the time? The statistics say no. I can think of loads of things that are going to be difficult to program for, like emergency vehicles, severe weather, and erratic drivers, but I have confidence they will eventually figure it out. In the meantime I will enjoy the progress Autopilot makes with every over-the-air update.

In the fourth quarter of 2018, Tesla reported one accident for every 2.91 million miles driven with Autopilot engaged. The first Tesla safety report of 2019 shows that rate increasing slightly, to one accident every 2.87 million miles. However, both of those figures are better than the statistics for Teslas without Autopilot engaged: one accident for every 1.76 million miles driven in Q4 2018 and one every 1.58 million miles driven in Q1 2019. Teslas with or without Autopilot engaged also appear substantially safer than the average car, based on the new Tesla data. According to the National Highway Traffic Safety Administration's most recent data, there's an auto crash every 436,000 miles driven in the United States.
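To put those quoted figures side by side, here's a quick back-of-the-envelope sketch; it uses only the numbers cited above, and the labels are just my own shorthand:

```python
# Compare the crash-rate figures quoted above (Q1 2019 Tesla safety
# report plus the NHTSA national average), expressed as miles per accident.
MILES_PER_ACCIDENT = {
    "Tesla, Autopilot engaged": 2.87e6,
    "Tesla, Autopilot not engaged": 1.58e6,
    "US average (NHTSA)": 0.436e6,
}

baseline = MILES_PER_ACCIDENT["US average (NHTSA)"]
for label, miles in MILES_PER_ACCIDENT.items():
    # More miles per accident means fewer accidents per mile driven.
    print(f"{label}: {miles / 1e6:.2f}M miles per accident, "
          f"{miles / baseline:.1f}x the national average")
```

By that naive ratio, Autopilot-engaged miles come out to roughly 6.6x the national average and non-Autopilot Tesla miles to roughly 3.6x.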
"You are judging this accident from a human perspective rather than seeing it as an engineering problem."

Actually, I think it's so difficult precisely because it is an engineering problem rather than a human one. It's like ironing and folding clothes: it looks really simple, but it is insanely difficult to build a computer to do it, because the domain is one that our brains are well suited for and computers are not. If we see a squirrel add numbers, we see intelligence; but we are not much impressed by a computer adding numbers. If we see a computer dashing among the treetops, we see intelligence; but we are not much impressed by a squirrel dashing among the treetops. Self-driving cars are imitating human driving, and the imitation is hitting diminishing returns because computers are ill-adapted for the domain. We'd probably get more ROI from changing the roads (the domain) at this point.
"Self-driving cars are imitating human driving, and the imitation is hitting diminishing returns because computers are ill-adapted for the domain. We'd probably get more ROI from changing the roads (the domain) at this point."

I completely agree that there are diminishing returns on the effort that goes into the last few percent of full self-driving. But changing our roads would cost trillions of dollars and would still require new car tech. Full self-driving will happen by the end of this decade.