coffeesp00ns  ·  3320 days ago  ·  post: Drivers are being idiots with Tesla's new autopilot features

Humans being idiots? I can't imagine that.

edit: In seriousness, Tesla has hit upon the fundamental problem: there can't be an intermediate step between fully human-controlled and fully automated when it comes to passenger cars. Basic autopilot works in planes because pilots have extensive training, and because there is constant monitoring and communication between the pilots and the various air traffic controllers: everyone knows, or can know (mostly), where everyone else is in their local sky area.

Neither of these things is true of cars. As a result, any measure to "bridge the gap" between human and machine is going to fail. Leave emergency measures to the machine? Safer, but it will pull control when drivers "feel" they are still in control, and no one will use it. People will cry nanny state and robot apocalypse. Leave emergency measures to the driver? Accidents increase, because any driver relying on Autopilot is probably focusing on something else. If you ask someone who is not focused on the road to suddenly take the wheel in an emergency, it's gonna be a bad time.

Full automation, or full human autonomy, is what I'm saying.

veen  ·  3320 days ago

Straight from my thesis:

    It takes at least four seconds for a person to take over the wheel, at least eight seconds before the driver can respond adequately and it takes around thirty seconds before the driver has [regained] the same precision as a regular driver.

Yes, it's possible. It is needed for training purposes. Google takes all of the training upon itself, which is extremely expensive: you need to drive thousands of miles, all of them monitored by a human driver, before the system reaches a good enough level. Using beta testers makes sense from a technical and business point of view, but it's quite risky from a human factors engineering standpoint.
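
Those numbers are easier to appreciate as distances. Here's a quick back-of-the-envelope sketch (the speeds and the Python scaffolding are my own assumptions; the four/eight/thirty-second figures are from the thesis quote above):

    # Back-of-the-envelope: distance a car covers during the takeover
    # windows quoted above (4 s to take the wheel, 8 s to respond
    # adequately, ~30 s to regain normal driving precision).
    MPH_TO_MS = 0.44704  # metres per second per mph

    def distance_m(speed_mph, seconds):
        """Metres covered at a constant speed_mph over `seconds`."""
        return speed_mph * MPH_TO_MS * seconds

    for speed in (30, 65):  # roughly city and highway speeds, in mph
        for label, t in (("take the wheel", 4),
                         ("respond adequately", 8),
                         ("regain precision", 30)):
            print(f"{speed} mph, {t:>2} s to {label}: {distance_m(speed, t):5.0f} m")

At highway speed, even the best-case four-second handover means the car covers well over a hundred metres before a human nominally has control again, and it travels nearly a kilometre before that human is driving with normal precision.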

mk  ·  3320 days ago

Google said that it found the same thing in its testing: humans couldn't be trusted not to trust the car too much. That was their reasoning behind removing the steering wheel (and the brake and gas pedals) in their autonomous cars. Somewhat ironically, CA regulators worried that this makes them less safe.

My guess is that Google is right, and that Tesla et al. are wrong. (Tesla's halfway approach is the same one that other manufacturers are taking.)

insomniasexx  ·  3320 days ago

You've hit the nail on the head here. We saw the same thing happen with planes when autopilot and other automated features were first introduced.

There is an excellent podcast about the phenomenon and how some carmakers are handling it; highly recommended:

http://www.npr.org/sections/money/2015/07/29/427467598/episode-642-the-big-red-button

user-inactivated  ·  3320 days ago

    Basic autopilot works in planes because pilots have extensive training, and because there is constant monitoring and communication between the pilots and the various air traffic controllers: everyone knows, or can know (mostly), where everyone else is in their local sky area.

Well, even airline automation has its challenges.