This Is Your Brain. This Is Your Brain on Neurotechnology.

New brain research is leading to second thoughts on our morality

Bennett Gordon, Utne.com | May 10, 2007

There is a classic dilemma in studies of ethics: Imagine a trolley is speeding down a track toward five unsuspecting people. You have the power to divert the trolley down a different track so that it kills only one innocent bystander. The question is, would it be morally right to do so?

Questions like the 'trolley problem' have traditionally been the arena of philosophers, but recent advances in technology have allowed psychologists and neurologists to weigh in with a new set of answers. According to Richard Brandt of Technology Review, researchers like Harvard University's Joshua Greene have been using functional magnetic resonance imaging (fMRI) scans to figure out how people's brains react in moral quandaries. What Greene has found, Brandt reports, is that 'either an emotional center or a reasoning center may play the dominant role, depending on the kind of moral decision being pondered.'

William Saletan of Slate puts it this way: 'If you often feel as though two parts of your brain are fighting it out, that's because, in fact, they are.' People tend to see the brain as a single organ, when in reality, Saletan writes, it's 'an assembly of modules that sometimes cooperate and sometimes compete.' Many neuroscientists have concluded that competing tendencies inside the brain -- not some transcendent being or God -- are the true source of morality.

So if morality is reduced to brain chemistry, do we face a scary sci-fi dystopia? Quite possibly. Hugh Gusterson of the Bulletin of the Atomic Scientists writes that the US military is exploring neurotechnology for national defense. Gusterson cites Jonathan Moreno's book Mind Wars: Brain Research and National Defense (Dana Press, 2006), noting that the Department of Defense is already funding 'neuroweapons,' such as neurotoxins, and drugs that would 'repress psychological inhibitions against killing.'


When neurotechnology begins to change 'psychological inhibitions,' the role of morality is called into question. As Saletan points out: 'Once technology manipulates ethics, ethics can no longer judge technology.'

I'm a victim of extremely unethical, violent, non-consensual experimentation that appears to include brain trauma and brain imaging. Here's my take:
I am opposed to this kind of binary thinking, e.g. reason versus emotion. I've seen other articles on this kind of thing. Sure, the brain can provide information about where certain thought processes are taking place *in individuals*, but to stretch that out to behavior seems absurd.
How people respond to questionnaires may not correspond to what they actually do. It's easier to say that one would allow an individual to die than to actually let them die.
"Morality" is to some degree "relative." That is, there are cultural factors in place in one's moral views. These can be explicit rules or implicit rules - but in a multicultural society, morality is also determined by whose rules one follows.
There is another factor to consider - how does the person envision the future? The questions presume that there is a certain future (e.g. all would die if one is not killed) as opposed to an uncertain future, which is more realistic. Finally, I've seen this paired with a concept of "utilitarianism," which then is extended to all kinds of culturally bound ideas. That's where it gets really crazy.
My take: if the one person I had to kill to save the others was the person experimenting on me, I would, but I'd rather see them exposed and PROSECUTED - if the death penalty was involved, that's not my problem. We've had enough.

Anonymous for obvious reasons

10/12/2010 4:14:56 PM
