When backed into a corner, most hard-line utilitarians concede that the standard counter-examples seem extremely persuasive. They know they’re supposed to think that pushing one fat man in front of a trolley to save five skinny kids is morally obligatory. But the opposite moral intuition in their heads refuses to shut up.
Why can’t even utilitarians fully embrace their own theory?
Bryan raises this question to argue that ‘there was evolutionary pressure to avoid activities such as pushing people in front of trolleys’ is not an adequate debunking explanation of the moral intuition, since there was also plenty of evolutionary pressure to like not dying, and other things that we generally think of as legitimately good.
I agree that one can’t easily explain away the intuition that it is bad to push fat men in front of trolleys with evolution, since evolution is presumably largely responsible for all intuitions, and I endorse intuitions that exist solely because of evolutionary pressures.
Bryan’s original question doesn’t seem so hard to answer, though. I don’t know about other utilitarian-leaning people, but while my intuitions do say something like:
‘It is very bad to push the fat man in front of the train, and I don’t want to do it’
They also say something like:
‘It is extremely important to save those five skinny kids! We must find a way!’
So while ‘the opposite intuition refuses to shut up’, the so-called counterexample is not persuasive in the sense that my intuitions unanimously say one should not push the fat man while my moral theory insists on the opposite. My moral intuitions are on both sides.
Given that I have conflicting intuitions, it seems that any account would conflict with some intuitions. So seeing that utilitarianism conflicts with some intuitions here does not seem like much of a mark against utilitarianism.
The closest an account might get to not conflicting with any intuitions would be to say ‘pushing the fat man is terrible, and not saving the kids is terrible too; I will weigh up how terrible each is and choose the least bad option’. Which is what utilitarianism does. An account could probably accord even better with these intuitions than utilitarianism does, if it weighed up the strength of the two intuitions instead of the number of people involved.
I’m not presently opposed to an account like that, I think, but first it would need to take into account some other intuitions I have, some of which are much stronger than the ones above:
- Five is five times larger than one
- People’s lives are in expectation worth roughly the same amount as one another, all else equal
- Youth and girth are not very relevant to the value of life (maybe worth a factor of two, for difference in life expectancy)
- I will be held responsible if I kill anyone, and this will be extremely bad for me
- People often underestimate how good for the world it would be if they did a thing that would be very bad for them.
- I am probably like other people in a given way, in expectation
- I should try to make the future better
- Doing a thing and failing to stop the thing have very similar effects on the future.
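The weighing these intuitions imply can be put as a rough expected-value calculation. Here is a minimal sketch; the specific weights (a hypothetical factor-of-two discount for life expectancy) are illustrative assumptions, not claims from the post:

```python
# Illustrative sketch of the weighing described above.
# All numeric weights are assumptions for illustration only.

def expected_lives_lost(action: str) -> float:
    """Expected (weighted) lives lost under each option, treating
    people's lives as roughly equally valuable, all else equal."""
    # Hypothetical: youth/girth worth at most a factor of two,
    # for difference in remaining life expectancy.
    fat_man_weight = 0.5
    kid_weight = 1.0

    if action == "push":
        return 1 * fat_man_weight    # one (weighted) life lost
    if action == "dont_push":
        return 5 * kid_weight        # five (weighted) lives lost
    raise ValueError(action)

# Even granting the maximum factor-of-two adjustment, five lives
# outweigh one, so the least bad option is to push.
best = min(["push", "dont_push"], key=expected_lives_lost)
print(best)  # → push
```

The point of the sketch is just that the arithmetic is dominated by the five-to-one ratio: no plausible weight on youth or girth reverses it.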
So in the end, this would end up much like utilitarianism.
Do others just have different moral intuitions? Is there anything wrong with this account of utilitarians not ‘fully embracing’ their own theory, and nonetheless having a good, and highly intuitive, theory?