I wonder how this relates to a "divide" among people who are, to simplify, "peaceniks" and "warniks". Warniks can talk about things like "collateral damage" (even if it includes thousands of civilian deaths) as being "worth the price to defend our freedoms", whereas peaceniks don't understand how people can think this way.

Collateral damage is a concept that is relatively new, in evolutionary terms. For most of our history, warfare was done the old-fashioned way - by hand, hacking people to bits or bashing their brains in personally. In that kind of warfare, there wouldn't seem to be as much, or any, "collateral" damage - pretty much you only bash in the brains of those you want to bash. And it was all very personal.

Now people look at it as an abstraction - they don't have to kill any of those collaterals personally, they don't have to see the bodies, they don't have to deal with them as human beings at all. But some still choose to see them that way anyway. Thus, the difference between the apparently bloodthirsty warniks and the peaceniks.


The study in question specifically details how people with damage to a certain portion of the frontal cortex are more apt to make utilitarian judgments, such as the one illustrated above. The study found that those with this specific type of damage were more willing to throw someone under the wheels of a bus in order to save five other people. It did not, however, find that people with this damage were more willing to kill for personal gain (in fact, that question was asked and received a negative response; whether the response was truthful is an open question, as sociopaths tend to be quite cognizant of the social consequences of answering otherwise).

What I find striking about this study is that it could easily be read as another criticism of our hard-wired inclinations when it comes to moral dilemmas - yet the press seems to present it as a positive study about how morality/empathy is hardwired.

In the situation presented to the participants, it seems to me that utilitarian thinking is the most appropriate approach. I don't know the exact propositions presented to the subjects, but unlike forced organ donation scenarios, the train problem seemed a clear case in which it would be moral - and perhaps even heroic - to sacrifice one life (especially assuming it could be one's own) to save 5 others. I would hope all people in similar circumstances might be able to overcome their instincts and weigh the consequences rather than act by force of "blind" empathy.

I don't know that there is a "correct" answer. There's certainly one answer you personally prefer.

According to this amusing parody: A utilitarian is "a creature generally believed to be endowed with the propensity to ignore their own drowning children in order to push buttons which will cause mild sexual gratification in a warehouse full of rabbits."

This parody does highlight a substantive problem—one of the many problems—with utilitarianism: It's impossible to come up with The Right Method of calculating the value of outcomes.

We often feel we are more ethically responsible for the actions we actually perform than for the indirect consequences of our actions. Sometimes this individualistic focus seems desirable, sometimes not.

All moral sentiments are personal preferences. I tried to be precise in saying that I thought the utilitarian heuristic seemed the right one for this particular circumstance - I would not want to say utilitarian thinking is always the right moral strategy.

My mention of "forced organ donation" was a nod to such a moral problem with a general application of utilitarianism - so I'll not be sexually gratifying all of Pasadena at the expense of an orphan's life or anything similar. But certainly there are circumstances where a simple assessment of the overall consequences can provide moral insight.

I wasn't able to track down a description of the specific setup used for this scenario, but a situation in which one innocent life is pitted against 5 other innocent lives seems to be one where utilitarian thinking makes the most sense - and one in which our basic instinct to avoid causing death leads to a greater loss.

That isn't to say that sacrifice is the only moral course of action - nor that the person who failed to sacrifice the one would be responsible for the death of the 5 (the psychological repercussions of taking on such moral entanglements - even if one acted in the "right" way - are significant).

But doesn't it seem a little problematic that we are so hard-wired to avoid causing death that we'll watch many of our fellow humans die before we'll resort to drastic measures ourselves? I do not want a world full of lions - but maybe an instinct for sheepishness (lambs to the slaughter?) has negative consequences as well...

God walks into the room and asks you to accompany him to the Death Reduction Control Center. He then whisks you away to a small office with a large console and multiple monitors.

He says he's created this system to make the world more perfect by reducing death. For whatever reason (you decide not to press, because it is God after all) the DRCC can only prevent each death at the cost of increasing the likelihood of a random death by 20%. So for every 5 people saved from a random death by the administrator of the DRCC, one random person will die because you saved those 5.
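The trade-off in this scenario reduces to simple expected-value arithmetic. A minimal sketch, assuming only the 20% figure stated above (the function name and its parameters are my own labels, not anything from the scenario):

```python
def drcc_outcome(lives_saved: int, cost_rate: float = 0.20):
    """Return (expected deaths caused, expected net lives saved)
    under the DRCC rule: each prevented death adds a `cost_rate`
    chance of one random death."""
    deaths_caused = lives_saved * cost_rate
    return deaths_caused, lives_saved - deaths_caused

caused, net = drcc_outcome(5)
# Saving 5 people causes 1 expected death, for a net gain of 4 lives.
```

In pure utilitarian terms the console is always worth operating - every pull of the lever saves more lives than it costs - which is exactly what makes the administrator's position uncomfortable.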

The knowledge that I caused *any* death would weigh heavily on my conscience - even if it was because I had saved 5 others. But still, I would hope God could find people willing to staff the DRCC - don't you?


Certainly, members of the two control groups also chose the utilitarian answers. Just not with as much regularity and alacrity, and not to such a degree. I'll have a post on this in a bit.

I'm reading Marc Hauser's "Moral Minds," which covers this sort of thing in detail. He discusses the evidence from testing lots of babies that shows that an emotional basis to moral judgment is hardwired.

It's interesting to someone like myself who believes in ideas like Natural Law and conscience. I have no problem with emotional response and even moral judgment being hardwired. Obviously, there are ways to overcome that (brain damage, brain-washing), and similarly there are ways that we can improve it. So what it says about the way we form judgments can be fit into a variety of moral systems, from a strict Christian ethics to a loose "do whatever I want" system.

Please pick a handle or moniker for your comment. It's much easier to address someone by a name or pseudonym than simply "hey you". I have the option of requiring a "hard" identity, but I don't want to turn that on... yet.

With few exceptions, I will not respond or reply to anonymous comments, and I may delete them. I keep a copy of all comments; if you want the text of your comment to repost with something vaguely resembling an identity, email me.

No spam, pr0n, commercial advertising, insanity, lies, repetition or off-topic comments. Creationists, Global Warming deniers, anti-vaxers, Randians, and Libertarians are automatically presumed to be idiots; Christians and Muslims might get the benefit of the doubt, if I'm in a good mood.
