And yet Elon keeps slinging BS about self-driving cars being just around the corner. It’s not going to happen, because the cars do not understand that the objective is to keep the bag of meat inside alive. Self-driving cars usually act like they don’t want to crash, but in fact they’re indifferent to it.
There are something like 370,000 Teslas on the road, and there have been six self-driving fatalities. Compare and contrast: there were about 350 737 MAXes in the sky, and two of them crashed. Tesla is a thousand times safer than Boeing! We're bad at assessing the risk of systems we don't understand. It's in Elon's best interest to make that risk acceptable to you, and against his interest to accept your irrational, fear-based assessment of the situation, because if he's gotta make a car that chooses correctly 99.99999% of the time instead of 99% of the time, he's gonna spend between now and forever chasing asymptotes, and OBVIOUSLY that's bad for progress, bad for humanity, bad for traffic, bad for shareholders, bad for Grimes, bad for space exploration, etc. etc. etc. Autonomous cars have a different understanding of the road than we do. When you consider traffic to be a system of independent vehicles sharing a basis of understanding, the parameter mismatch becomes obvious.
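Napkin math on that comparison, using only the rough figures in this paragraph (fatal crashes per vehicle, which ignores miles and hours of exposure, so treat it as an illustration rather than a real risk estimate):

```python
# Crude per-vehicle fatal-crash rates, using the rough fleet sizes and crash
# counts cited in this thread -- not authoritative data.
teslas_on_road = 370_000        # approximate Tesla fleet cited above
tesla_fatalities = 6            # self-driving fatalities cited above

max_737_in_service = 350        # approximate 737 MAX fleet cited above
max_737_crashes = 2             # the two fatal MAX crashes

tesla_rate = tesla_fatalities / teslas_on_road      # per vehicle
boeing_rate = max_737_crashes / max_737_in_service  # per airframe

print(f"Tesla:   {tesla_rate:.6f} fatal crashes per vehicle")
print(f"737 MAX: {boeing_rate:.6f} fatal crashes per airframe")
print(f"Ratio:   ~{boeing_rate / tesla_rate:.0f}x")  # comes out around 350x
```

By this naive count the ratio is closer to ~350x than 1,000x, which only reinforces the point: per-vehicle counts are a lousy way to compare risk, and we reach for them anyway.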
Those 370k Teslas aren’t autonomous, and won’t be any time soon. But that’s not what Elon is selling. My point is that the last 10% isn’t only more difficult, it’s nothing like the first 90%. You could spend incredible effort training the car not to drive under semi trailers, only to have it treat a trash bag blowing in the wind like a falling boulder. The cars don’t understand the road at all.
And I am in 100% agreement. But if Elon can make you think that the last 10% doesn't matter, he doesn't have to do it. I personally don't think it will work until the cars are all operating on the same networked system. If there's nothing but GoogleCars on the road, Google won't run into itself. Start mixing Google and Uber and Ford? Yeah, if they're all using the same benchmarked software and they're all networked, maybe. Add humans? Or ripoff Chinese autonomy conversions? It'll never get there.
In an ideal world, I think we'd develop a system that anyone could tie into. I don't really want a Tesla network dominating the road because it feels so anti-competitive, BUT all the technical hurdles of building an openly licensed API that is safe and effective leave proprietary road networks as the only real possibility in my mind. Not because an open system like that can't exist, but because there's no feasible way to implement and maintain it in the status quo, with all of these companies competing against each other toward the same goal.
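To make that concrete, here's a purely hypothetical sketch of the kind of shared "intent broadcast" message an open system like that would need every vendor to speak. Every name and field here is invented for illustration; it isn't any real standard or anyone's actual API:

```python
import math
from dataclasses import dataclass
from enum import Enum


class Maneuver(Enum):
    KEEP_LANE = "keep_lane"
    LANE_CHANGE_LEFT = "lane_change_left"
    LANE_CHANGE_RIGHT = "lane_change_right"
    HARD_BRAKE = "hard_brake"


@dataclass
class IntentBroadcast:
    vehicle_id: str                           # anonymized, rotating identifier
    timestamp_ms: int                         # shared clock, e.g. GPS time
    lat: float                                # current position
    lon: float
    heading_deg: float
    speed_mps: float
    planned_maneuver: Maneuver
    planned_path: list[tuple[float, float]]   # next few seconds of (lat, lon) points


def paths_conflict(a: IntentBroadcast, b: IntentBroadcast, radius_m: float = 3.0) -> bool:
    """Crude check: do the two planned paths ever come within radius_m of each other?

    A real system would align the paths in time and work in a projected frame;
    this only exists to show that a shared, trusted message format is the
    prerequisite for any of it.
    """
    for lat1, lon1 in a.planned_path:
        for lat2, lon2 in b.planned_path:
            # rough equirectangular distance, good enough at street scale
            dx = (lon2 - lon1) * 111_320 * math.cos(math.radians(lat1))
            dy = (lat2 - lat1) * 111_320
            if math.hypot(dx, dy) < radius_m:
                return True
    return False
```

The data structure is the easy part. The hard part is getting every manufacturer, plus the aftermarket conversions, to emit it honestly, on time, and in a compatible coordinate frame, which is exactly where the competing-companies problem bites.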
While I agree that full self-driving in 100% of conditions is still a ways off, it is short-sighted to say it's not going to happen. Right now Tesla Autopilot is like a beefed-up adaptive cruise control. There are plenty of warnings telling you to keep your eyes on the road and hands on the wheel. In no way is Tesla's current iteration of Autopilot "self-driving". They are making great strides toward that, but they're still not there yet. You are judging this accident from such a human perspective rather than seeing it as an engineering problem. Are humans currently better at spotting perpendicular semi trucks crossing in front of them in certain circumstances? Apparently yes. Are humans better at keeping their attention on the road 100% of the time than computers? The statistics say no. I can think of loads of examples of things that are going to be difficult to program for, like emergency vehicles, severe weather, erratic drivers, etc., but I have confidence they will eventually figure it out. In the meantime I will enjoy the progress Autopilot is making with every over-the-air update.

In the fourth quarter of 2018, Tesla reported one accident for every 2.91 million miles driven with Autopilot engaged. The first Tesla safety report of 2019 shows that rate increasing slightly, to one accident every 2.87 million miles. However, both of those figures are better than the statistics for Teslas without Autopilot engaged: one accident for every 1.76 million miles driven in Q4 2018 and one every 1.58 million miles driven in Q1 2019. Teslas with or without Autopilot engaged also appear substantially safer than the average car, based on the new Tesla data. According to the National Highway Traffic Safety Administration’s most recent data, there’s an auto crash every 436,000 miles driven in the United States.
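Putting those quoted figures into one unit makes the comparison easier to eyeball. This quick sketch uses only the numbers cited above and doesn't control for where Autopilot actually gets used:

```python
# Convert the "one accident per X miles" figures quoted above into a common
# unit (accidents per million miles) so they can be compared directly.
miles_per_accident = {
    "Autopilot engaged, Q4 2018": 2_910_000,
    "Autopilot engaged, Q1 2019": 2_870_000,
    "No Autopilot, Q4 2018": 1_760_000,
    "No Autopilot, Q1 2019": 1_580_000,
    "US average (NHTSA)": 436_000,
}

for label, miles in miles_per_accident.items():
    per_million = 1_000_000 / miles
    print(f"{label:28s} {per_million:.2f} accidents per million miles")

# Caveat: these populations aren't directly comparable. Autopilot is mostly
# engaged on highways, Teslas skew newer than the average US car, and
# "accident" isn't defined identically across the sources.
```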
Actually, I think it's so difficult because self-driving cars are an engineering problem rather than a human one. It's like ironing and folding clothes: it looks really simple, but it's insanely difficult to build a computer to do it, because the domain is one that our brain is well suited for but computers are not. If we see a squirrel add numbers, we see intelligence; but we are not much impressed with a computer adding numbers. If we see a computer dashing among the treetops, we see intelligence; but we are not much impressed with a squirrel dashing among the treetops. Self-driving cars are imitating human driving, and the imitation is hitting diminishing returns because computers are ill-adapted for the domain. We'd probably get more ROI changing the roads (the domain) at this point.
Completely agree that there are diminishing returns on effort in that last few percent of full self-driving. But changing our roads would cost trillions of dollars and would still require new car tech. Full self-driving will happen by the end of this decade.