Jen Wright at Experimental Philosophy:
… I’m writing this post because I found something even more interesting…and puzzling. Leaving people’s actual looking behavior aside, I found a very powerful effect — consistent across all the vignettes — for which side of the screen the potential victim (the fat guy or the baby) was on. When the victims were on the right side of the screen, people’s ‘would’ and ‘should’ judgements were significantly higher (i.e., they were more willing to kill the victim to save the others, and thought more strongly that they should) than when the victims were on the left side of the screen.
So, does anyone have any suggestions as to what might explain this finding?
My guess is that it’s related to the previous findings that people tend to place active figures to the left of passive ones in pictures (though this seems to vary across languages). The easiest interpretation is that it seems more moral to sacrifice passive people than active ones. That would also fit with the pattern I pointed out before in our moral intuitions: moral concern is highly contingent on whether we can be rewarded or punished by the beneficiary of our ‘compassion’.
Publicly refuting facts often reinforces their believed truth in the minds of the public, and they will even credit the misinformation to the organisation denying it.
A Washington Post article reports an experiment in which people were given fliers labelling common ideas about influenza ‘true’ or ‘false’. Half an hour later, older people already remembered 28% of the falsities as facts, and three days later 40%, by which time younger people had caught up to the older people’s half-hour figure. It seems that the repetition of the false information helps to ingrain it, while the extra information — that it is false — is soon lost.
So how do you have factual public debate when whoever starts it automatically has a major advantage? Denial and silence can have the same effect as agreeing, but denying is still the best option: a good proportion of people (a few days later, at least) do remember which of their facts are false. Though as TWP discusses, it’s probably best to deny things without actually mentioning them if possible — that is, to fiercely support something mutually exclusive.
As noted in the discussion of Overcoming Bias’ post on this, people might pay more attention if they have something at stake. While that approach has problems of its own (discussed there), one big, obvious gap — where it’s important for people to have accurate information on topics not directly concerning them — is in democracy. This is just another in a long list of problems with the kinds of democratic systems we use, but combined with rational ignorance, it makes the chance of voters having a clue about anything not immediately concerning them both tiny and tied firmly to the chance that whoever first buys lots of ads happens to be right.