CletiusMaximus
Footballguy
I think there are probably dozens of ethical issues involved in FSD tech, but I haven't seen much discussion of the Trolley Problem recently and wonder whether that's been addressed and a consensus of some sort reached. In other words, if the self-driving vehicle is put in a situation where it has to avoid a hazard and choose among veering into oncoming traffic, driving off the side of a cliff, or plowing through a group of school children, how does it weigh those options and make that decision?

A couple years ago I saw some discussion of all the different possible permutations of this. One of the answers is that, regardless of a few odd instances of glitches that may cause negative outcomes due to the lack of human involvement, self-driving vehicles will undoubtedly save many lives overall.

Someone posed the question to a Mercedes exec at a car show a while back, and his answer was that their self-driving cars would never put the passenger (their customer) at risk. That was a very good, and expected, half-answer to the question.