veen  ·  post: When autonomous software breaks the law, who's to blame?

This is pretty much how I understand the culpability issue. Most of the problems with automated vehicles will be accidents, and in most of those cases the blame will fall on the company making the automated vehicle.

Automated vehicles will try their best to avoid accidents, and will make judgements just like a human does (the intent part of culpability). The question is whether those judgements are good or bad ones. If there is a clear misjudgement that could have been avoided by changing some code, then the car (and the company behind it) is to blame. When the car couldn't have done any better (i.e. it had the best intentions), it's not to blame. If you were to fall right in front of an automated vehicle going 100, no, it's not to blame.
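Stated as a toy decision rule (just my own sketch, not anything from the article or an actual liability standard), the test I have in mind looks roughly like this:

```python
# Hypothetical illustration: blame attaches to the manufacturer only when a
# better judgement was actually reachable by the software, i.e. when the
# misjudgement could have been fixed by changing code.

def manufacturer_to_blame(chosen_harm: float, best_possible_harm: float) -> bool:
    """Harm scores: lower is better. If some available decision would have
    caused less harm than the one the car actually made, the outcome was
    avoidable in code, so the company is culpable."""
    return chosen_harm > best_possible_harm

# Someone falls directly in front of a car at speed: no decision avoids harm,
# so the chosen outcome equals the best possible one -> not to blame.
print(manufacturer_to_blame(chosen_harm=1.0, best_possible_harm=1.0))  # False

# A crash that a different, codeable decision would have prevented -> to blame.
print(manufacturer_to_blame(chosen_harm=1.0, best_possible_harm=0.0))  # True
```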

There are already automated vehicles in positions where they could hurt people. ASI is a company that automates mining and farming operations. Out of interest, I've emailed them to ask who they think is responsible when one of their vehicles causes damage, and I've asked them for a license or terms of use to read. Let's hope they respond!