Yup. The trolley problem is one of ethics and responsibility, not of whether one person or several people die.
The deaths themselves are irrelevant; your responsibility for those deaths is the point.
I didn’t get it either until a good friend and I were discussing it and he said: forget the trolley. How about this: you’re walking down the street after eating at Subway (or some similar shop) with half a sandwich left, and you pass someone begging for food. You can choose to give it to them or not. If you choose not to, and later that same day the person dies of starvation, are you responsible for their death because you didn’t give them the excess food you had?
The dilemma rests on a few questions: if you take action and a person dies, are you responsible for the death you caused? And if you take no action, are you responsible for the deaths you could have avoided by acting, when you chose not to?
In OP’s post, legally, if you are the driver/operator of the vehicle, you are always 100% responsible for anything the vehicle does, whether it is under autonomous control or not. This is the law. Whether you are morally at fault is a matter of debate. You didn’t direct the car to run over people, but you also did not stop it from doing so.
There’s an argument to be made about duty of care, etc.
However, this is the root of the trolley problem.
Thank you for coming to my Ted talk.
With level 3+ autonomous driving, the “driver” is not responsible.
Legally, or morally?
Maybe neither?
IDK. I’m not going to start a philosophical debate here. Just asking for you to clarify.
Fair. I was talking legally, but morally is a whole other thing.
Agreed. I won’t get into it, since the trolley problem has taught me that there are a lot of opinions on it, which makes it seem relevant, but there’s nearly zero consensus on what the correct analysis of the situation is. At the end of the day, the dilemma is nearly impossible to resolve.
The courts have made up their mind on it and that’s all I’m going to concern myself with for the moment.
This issue has been explored previously, and with a better example of the trolley problem that centers the ethical dilemma entirely on the autopilot.
I do agree that in most situations, the driver retains full control over the vehicle, and therefore remains fully responsible, even if there’s a case to be made that the autopilot neglected the safety of others outside the car.
However, I’d also argue that this example leaves a possibility where fault cannot be assigned to the driver: if they became aware of the hazard at a reasonable time (e.g. spotting the pedestrians just around a sharp bend, rather than 200 m down a straightaway) and made every reasonable effort to stop within that time but could not. There are limits to the driver’s responsibility, but the most interesting cases are crashes that the autopilot is capable of preventing (even if the driver reasonably cannot), yet fails to do so.
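A quick stopping-distance sketch makes the sharp-bend case concrete. All the numbers here are illustrative assumptions (50 km/h, a dry-road friction coefficient of 0.7, a 1.5 s human reaction time versus a hypothetical 0.2 s autopilot latency, 30 m of visibility around the bend), not measured values:

```python
# Sketch: compare stopping distances for a human driver vs. an autopilot,
# using the standard stopping-distance formula (reaction distance plus
# braking distance). All numbers are illustrative assumptions.

G = 9.81   # gravitational acceleration, m/s^2
MU = 0.7   # assumed tyre-road friction coefficient (dry asphalt)

def stopping_distance(speed_kmh: float, reaction_time_s: float) -> float:
    """Total distance to stop, in metres: reaction + braking."""
    v = speed_kmh / 3.6                 # convert km/h to m/s
    reaction = v * reaction_time_s      # distance covered before braking starts
    braking = v ** 2 / (2 * MU * G)     # kinematics: v^2 / (2 * deceleration)
    return reaction + braking

sight_line = 30.0  # assumed metres of visibility around the sharp bend

human = stopping_distance(50, reaction_time_s=1.5)      # typical human reaction
autopilot = stopping_distance(50, reaction_time_s=0.2)  # assumed sensor latency

print(f"human:     {human:.1f} m")      # exceeds the 30 m sight line
print(f"autopilot: {autopilot:.1f} m")  # stops within the 30 m sight line
```

Under these assumptions the human driver needs roughly 35 m to stop but only sees the pedestrians at 30 m, while the faster-reacting system stops with room to spare, which is exactly the kind of crash the autopilot could prevent even when the driver reasonably cannot.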
Meme template
Given the amount of distance between that car and the crosswalk, and the fact that it’s a crosswalk (meaning the car is not going to be traveling at freeway speeds), I would hazard a third option: maybe just kind of lightly press on the brakes? ( ͡° ͜ʖ ͡°)
What if the brakes were malfunctioning?
Then the car could just turn to the closest side and grind on the barrier to stop itself ༼ つ ◕_◕ ༽つ
Additionally, the car should honk frantically to alert the pedestrians and fellow drivers to the dangerous situation.
deleted by creator
Bro, is this really the time for that 💀