thundara  ·  2953 days ago  ·  post: George Hotz cancels Comma One because of NHTSA letter

    That's an admission that neural networks are unknowable, but an assertion that they are better because, you know, neural networks.

I'd imagined that statement was based on some collection of training and test data run through various simulation algorithms. It's hard to imagine where Comma One would have gotten such a data set, but the problem he's commenting on is universal to machine learning: what algorithm do you use to identify pedestrians from a point cloud? How do you estimate the velocity of neighboring vehicles? Is that debris on the ground roadkill or nails?
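To make the first two questions concrete, here's a toy sketch of the classical-pipeline version (Python, with a hypothetical LIDAR frame file and made-up thresholds, not anything Comma actually does): cluster the point cloud, flag clusters whose bounding box is roughly person-sized, and estimate a neighbor's velocity from consecutive tracked positions.

```python
import numpy as np
from sklearn.cluster import DBSCAN

points = np.load("lidar_frame.npy")        # assumed: (N, 3) array of x, y, z in meters

# Group LIDAR returns into objects, then apply a crude "person-sized" prior.
labels = DBSCAN(eps=0.5, min_samples=10).fit_predict(points)
for label in set(labels) - {-1}:           # -1 is DBSCAN's noise label
    cluster = points[labels == label]
    extent = cluster.max(axis=0) - cluster.min(axis=0)
    looks_like_pedestrian = (1.2 < extent[2] < 2.2            # plausible height
                             and max(extent[0], extent[1]) < 1.0)  # small footprint
    print(label, extent.round(2), looks_like_pedestrian)

# Velocity of a neighboring vehicle: finite difference of tracked centroids.
prev_centroid = np.array([12.0, 3.1, 0.8])
curr_centroid = np.array([11.2, 3.0, 0.8])
dt = 0.1                                   # seconds between sweeps
velocity = (curr_centroid - prev_centroid) / dt
print("estimated neighbor velocity (m/s):", velocity.round(2))
```

Every threshold above is hand-tuned, which is exactly the kind of brittleness that makes the learned alternative tempting.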

At the end of the day you're mapping noisy inputs to an abstracted output. You can try fitting a bipedal model to a person, but that may err on statues near a roadway. Or you can plug the whole thing into a shiny set of general algorithms that integrate over space and time, let them work their magic, and pick the one with the lowest false positive or false negative rate.
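That selection step might look something like this minimal sketch (Python/scikit-learn, with hypothetical feature files and two stand-in candidate models; a real pipeline would score on far more than a single held-out set):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

# Hypothetical data: per-object features (height, aspect ratio, speed, ...)
# and labels (1 = pedestrian, 0 = statue / debris / anything else).
X_train, y_train = np.load("train_X.npy"), np.load("train_y.npy")
X_val, y_val = np.load("val_X.npy"), np.load("val_y.npy")

candidates = {
    "bipedal_model_features": LogisticRegression(max_iter=1000),
    "generic_learner": RandomForestClassifier(n_estimators=200),
}

def fp_fn_rates(model, X, y):
    """False-positive and false-negative rates on a labeled set."""
    tn, fp, fn, tp = confusion_matrix(y, model.predict(X)).ravel()
    return fp / (fp + tn), fn / (fn + tp)

scores = {}
for name, model in candidates.items():
    model.fit(X_train, y_train)
    fp_rate, fn_rate = fp_fn_rates(model, X_val, y_val)
    scores[name] = (fp_rate, fn_rate)
    print(f"{name}: FP={fp_rate:.3f}, FN={fn_rate:.3f}")

# "Pick the one with the lowest false positive or false negative rate" --
# here, naively, the one with the lowest sum of the two.
best = min(scores, key=lambda k: sum(scores[k]))
print("selected:", best)
```

The whole argument about unknowability lives in that last `min()`: you can rank models by their error rates without being able to say why the winner makes any individual call.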

Not saying it's a right or wrong approach, and obviously any tests should include an appropriately large data set, along with added perturbations for all manner of lighting, noise, angles, added vehicles, etc. But it's a common question: do you trust the most accurate model, or the one you understand the best? Ideally the former is the latter, but that can sometimes only come after years of analysis. I think the hip young computer scientist answer would be to make all data and analysis pipelines open, but something tells me Comma wants the quick and easy solution.
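For the perturbation point, a minimal sketch of the kind of augmentation pass you'd run over a test set (pure NumPy/SciPy, hypothetical file name; brightness shifts, sensor noise, and small rotations standing in for lighting, noise, and angle variation):

```python
import numpy as np
from scipy.ndimage import rotate

rng = np.random.default_rng(0)

def perturb(image: np.ndarray) -> np.ndarray:
    """Apply a random lighting shift, sensor noise, and viewing-angle tweak."""
    out = image.astype(np.float32)
    out *= rng.uniform(0.6, 1.4)                    # lighting / exposure change
    out += rng.normal(0.0, 5.0, size=out.shape)     # additive sensor noise
    out = rotate(out, angle=rng.uniform(-5, 5),     # small camera-angle change
                 reshape=False, mode="nearest")
    return np.clip(out, 0, 255).astype(np.uint8)

# Evaluate the detector on both clean frames and perturbed copies, so a model
# that only works in ideal conditions shows up in the numbers.
frames = np.load("test_frames.npy")     # assumed: (N, H, W, C) uint8 array
augmented = np.stack([perturb(f) for f in frames])
```

It doesn't answer the accuracy-versus-interpretability question, but at least it keeps the accuracy number honest.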

kleinbl00  ·  2953 days ago

It would be a herculean labor to convince me that any "startup" has access to or has put the effort into training a car to drive in such a way that won't lead to tragedy.

I include Tesla in that grouping.

I do not believe that on-the-job training for any neural network won't kill way too many people unless you offer people the sensors for free and compare real-time human decisions with projected neural network decisions until you have near-total overlap over millions and millions of miles driven. Tesla could be doing that, but I don't think they would have let the monster loose as early as they did if they'd taken this approach.
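A toy sketch of what that shadow-mode comparison might amount to (hypothetical log format and agreement metric; the point is just "compare the network's proposed action to what the human actually did, and track overlap over miles"):

```python
import numpy as np

# Hypothetical driving log: per-timestep human control commands (e.g. steering,
# brake), the network's proposed commands, and distance covered in that step.
log = np.load("shadow_mode_log.npz")     # assumed arrays: 'human', 'model', 'miles'
human, model, miles = log["human"], log["model"], log["miles"]

# Count a timestep as agreement when every proposed control is within a
# small tolerance of what the human actually did.
tolerance = 0.05
agree = np.all(np.abs(human - model) <= tolerance, axis=-1)

overlap = agree.mean()
total_miles = miles.sum()
print(f"{overlap:.4%} agreement over {total_miles:,.0f} miles")

# Only once overlap is near-total across enough miles (and enough conditions)
# would you even consider letting the network drive.
if overlap > 0.9999 and total_miles > 1e8:
    print("candidate for limited deployment review")
```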

thundara  ·  2953 days ago

    It would be a herculean labor to convince me that any "startup" has access to or has put the effort into training a car to drive in such a way that won't lead to tragedy.

    I include Tesla in that grouping.

Ditto, though it seems like Google has at least taken that approach. I'm a little surprised that Tesla skipped the line on the regulation. Maybe something to do with the legalese of how the feature is offered?

kleinbl00  ·  2952 days ago

Google has no interest in selling cars. Google will license their dataset and path-following technology (because that's what they're building) to anybody who wants to pay the fees, thereby allowing anyone and everyone to hop onto a crowdsourced traffic system rather than building "autonomous vehicles."

Tesla mostly wants to sell batteries and battery-powered cars. They do that by being an innovator and a leader in the industry. They're largely appealing to rich eccentrics (for now), but Tesla's exit is probably a sale to another car company. Elon Musk never wanted to be Henry Ford; his spiritual hero is DD Harriman. I think he's said as much, although I can't find a source at the moment.