Tuesday, January 19, 2010

When a soldier is killed in military service, the President writes a condolence letter to the family. However, if a soldier is psychologically injured and then commits suicide, there is no Presidential letter of condolence. There is apparently an unwritten policy that excludes the families of soldiers who have died by suicide.

It is easy to imagine how hurtful that must be for families who are burying a loved one who came back from war with psychological problems and then committed suicide, or who perhaps did so overseas.

One Such Family
After my blog on this subject appeared on PsychiatryTalk.com, I received a comment from Gregg Keesling, the father of one such soldier, and then had a correspondence with him. His story was also written up in the NY Times.

His 25-year-old son Chancellor served two deployments. He committed suicide in Iraq after sending his parents an email telling them of his decision. Mr. Keesling said that “military investigations demonstrated, our son Chancellor was a good soldier. He succumbed to an illness as much as someone who dies in the war theater from food poisoning or infection, and we believe that the President should send condolences and express the country’s appreciation of our family’s sacrifice.”

A spokesperson for President Obama said that the policy in regard to who should receive a letter of condolence is currently undergoing a review.

What Reasons Are Given For Opposing A Letter of Condolence?
I have tried to understand why anyone would advocate that the President should not express condolence to families such as the Keeslings. I heard one view that many soldiers would feel that their comrades’ combat deaths would be demeaned if the families of soldiers who died by suicide were given an equal letter of condolence. Another view is that treating suicide the same as other war deaths might encourage mentally frail soldiers to take their lives by making the act seem honorable.

I believe these ideas are misguided and resurrect the stigmatization of mental illness. Soldiers cannot will themselves to avoid these emotional states any more than a soldier can avoid a bullet or an explosive device. Once you are in a combat zone, you are vulnerable to injury. I know of no evidence that people on the verge of suicide would be driven to it because their family would get a letter of condolence.

They Are All Heroes
If a soldier in Iraq were killed in a car accident, would his death be any less deserving of a letter of condolence than that of a soldier killed in an enemy ambush? Would the family be any less deserving of the letter if the soldier made a tactical error leading to his death, as compared with someone who bravely fell on a grenade to save others’ lives? Similarly, would you compare a soldier who faced many horrific combat situations and developed PTSD with another soldier who became severely depressed shortly after arrival in the combat zone, if both ended up having intolerable suicidal feelings that led to their deaths? I don’t believe that we judge some soldiers’ deaths as being more worthy than others.

Yes, we do give out medals for unusual acts of bravery but this in no way diminishes the sacrifices that others have made.
Every soldier has volunteered and knows that he or she could be exposed to combat. For this they deserve our thanks and when they make the supreme sacrifice, their families deserve a letter of condolence.

Recent Actions To Attempt to Influence The President
On December 23rd a bipartisan coalition of 44 House members, initiated by Reps. Patrick Kennedy and Dan Burton, sent a letter to President Obama. They wrote, “By overturning this policy on letters of condolence to the families of suicide victims you can send a strong signal that you will not tolerate a culture in our Armed Forces that discriminates against those with a mental illness.”

The American Foundation for Suicide Prevention in a letter to the President on January 5th wrote “We agree with members of Congress that you can send a strong signal that you will not tolerate a culture in our military services that discriminates against those with mental illness. Please Mr. President, overturn this policy as soon as possible.”

On January 7th Mental Health America announced the adoption of a position requesting that the President revoke the policy of not writing condolence letters to families of soldiers who have committed suicide. They also started an online petition on Facebook.

Write to the President Now
I continue to urge anyone concerned about this issue to write to the President. Those who are mental health professionals should state this and explain their views based on their understanding of mental illness. The President can be contacted at http://whitehouse.gov/contact

Michael Blumenfield, MD

Dr Blumenfield is The Sidney E. Frank Distinguished Professor Emeritus of Psychiatry and Behavioral Sciences at New York Medical College. He is a recent Past Speaker of the Assembly of the American Psychiatric Association. Dr Blumenfield lives and practices in Woodland Hills, CA, where he also writes a weekly blog, PsychiatryTalk.com.

Tuesday, January 12, 2010

In a very long essay in the Sunday (1/10/10) New York Times Magazine, entitled, “The Americanization of Mental Illness,” Ethan Watters suggests that a kind of psychiatric-cultural imperialism has been foisted on other countries and cultures by “the West.”

Specifically, Watters claims that, “For more than a generation now, we in the West have aggressively spread our modern knowledge of mental illness around the world…. There is now good evidence to suggest that in the process of teaching the rest of the world to think like us, we’ve been exporting our Western “symptom repertoire” as well. That is, we’ve been changing not only the treatments but also the expression of mental illness in other cultures.”

Watters claims, for example, that as the general public and mental health professionals in Hong Kong “…came to understand the American diagnosis of anorexia,” the presentation of the illness in Hong Kong actually became more “virulent.”

Though the Watters thesis has its merits, it is also glib and simplistic in many of its assumptions and conclusions. To be fair: Watters rightly calls attention to the ways that culture and ethnicity can shape both the diagnosis and expression of psychiatric conditions and symptoms. But this is hardly news to psychiatrists: the late Dr Ari Kiev advanced much the same thesis in his 1972 book, Transcultural Psychiatry.

Watters’ more controversial claim is that the exportation of American psychiatric nosology and “biomedical ideas” has changed the way symptoms are diagnosed and expressed in some other cultures. But this claim is very hard to validate. The “American” diagnostic system is, in the first place, not terribly different from the World Health Organization’s International Classification of Diseases (ICD), whose descriptions of “mental and behavioral disorders” evolved almost contemporaneously with those of the last two DSMs. It would be very hard to tease out the cross-cultural influence of the DSM classification from that of the ICD over the past 30 years. More important, Watters fails to consider alternative explanations for his “findings”; for example, rising rates of DSM-type anorexia nervosa in Hong Kong could be due largely to increased recognition of a long-standing, indigenous disorder that heretofore had not been fully appreciated by Chinese clinicians.

An example from American history helps make the point. Many of the basic symptoms of post-traumatic stress disorder (PTSD) have been recognized for centuries—at least since the U.S. Civil War, and probably much earlier—and have gone by various names, such as “soldier’s heart,” “combat fatigue,” “shell-shock,” etc. But it took the efforts of troops returning from the Vietnam War to “push” psychiatry toward recognition of PTSD as a bona fide disorder. Understandably, apparent PTSD prevalence rates have soared since the diagnosis entered American nosology in 1980, with the advent of DSM-III. But it is entirely possible that the actual prevalence of PTSD symptoms in the U.S. has not changed markedly over many generations.

So, to return to Mr. Watters’ thesis: it would not be surprising to find that, as clinicians in other cultures began to familiarize themselves with DSM or ICD psychiatric disease criteria, the apparent prevalence rates of certain psychiatric conditions increased in those countries. It is quite another thing to imply that the actual prevalence of these conditions has increased—and that their morphology has changed—as a result of Western influences. Yet Watters seems to imply just this, when he asserts that

“…a handful of mental-health disorders — depression, post-traumatic stress disorder and anorexia among them — now appear to be spreading across cultures with the speed of contagious diseases. These [Western-based] symptom clusters are becoming the lingua franca of human suffering, replacing indigenous forms of mental illness.”

We would need several generations of very sophisticated epidemiological studies, carried out using identical diagnostic criteria, to substantiate this “contagion-replacement” hypothesis. Anecdotal data, such as those presented in the Watters article, are inadequate. But even if Watters is correct, his claims do not answer the fundamental medical-ethical question: will adopting “Western” diagnostic criteria ultimately lead to a net reduction in suffering, and a net increase in well-being, in other cultures? If, after careful systematic study, the answer to this question turns out to be no, our Western paradigms will have failed. If the answer turns out to be yes, we may conclude that we have been exporting a very valuable commodity.

Ronald Pies, MD

Dr. Pies would like to acknowledge both Rakesh Jain MD and Sandy Naiman for inspiring this blog.

Wednesday, January 6, 2010

I don’t know how many psychiatrists paid much attention to the climate-change conference in Copenhagen last month, but I came away convinced they need our help. Here’s why. Given the scientific consensus that human behavior is the major cause of the planet’s undesired warming, who better to understand the roots, manifestations, repercussions, and treatment of that behavior?

But like much of our work, fostering change in this realm will not come easily. The industrialization that led to excessive use of damaging fossil fuels fulfilled some of our basic psychological needs, like safety, security, and survival. Once we are used to such lifestyles, they are hard to give up, and they also come to be desired by poorer people. Usually, one has to receive some replacement reward to be willing to try a change.

Moreover, the prediction that the major risks of climate-change are decades away naturally contributes to our passivity. We are equipped with a rapid fight-or-flight response to perceived immediate danger, which early humans faced day by day, but our brain has not evolved any automatic responses to future risks. Freud seemed to translate this biological process into psychodynamic understanding. The psychological defense mechanism of denial is helpful when we need to set priorities or to ignore something too painful at the moment. It is maladaptive when it leads to behavior that endangers us. Such maladaptive behavior as it relates to climate-change might be best illustrated by the legendary experiment of the frog and boiling water. If you throw a frog into boiling water, it will jump right out. However, if you put it in warm water and gradually increase the temperature to boiling, the frog will stay and cook to death. Who knows? If Freud were still alive, this experiment might be enough for him to revive his abandoned concept of a death instinct.

Maybe these natural tendencies contributed to some of the terminology that we use, which is not likely to evoke a sense of danger. The use of “warming,” as in the term global warming, seems like psychological milquetoast. In fact, for people living in colder climates, like my Wisconsin, where the wind-chill as I write is minus 10 degrees, that terminology can paradoxically sound psychologically welcoming. The same goes for the terminology of the conference: climate “change.” Such change can be perceived as potentially beneficial or not.

Unfortunately, the psychological repercussions of this challenge are slowly emerging. Among them are:

-Heat waves are associated with more alcohol and substance abuse.
-Just an average increase of 1 degree F seems to increase the risk of aggressive and violent behavior in warm climates, especially in the inner city and in places with poor resources.
-Loss of one’s comfortable environment often causes a dangerous variation of grief, called solastalgia.
-So-called “climate refugees” have high rates of posttraumatic stress disorder, often complicated by “cultural bereavement.”

How, then, can we help? As usual, we could treat the panic anxiety, PTSD, paralyzing guilt, and malignant narcissism as they emerge, but this might well be too little, too late. Up to now, most psychiatrists have not paid much attention to this problem, probably for the same sorts of reasons as the rest of the public. Even our organizational Disaster Response Teams are prepared to respond to acute disasters, but not to more chronic and slowly developing ones. For one thing, then, we can expand the charge of such teams accordingly.

For another thing, we can recommend better terminology to evoke an appropriate degree of anxiety and fear. Akin to the frog experiment, how about “global boiling”? Akin to mental instability, how about “climate instability”?

Even though psychiatrists are not yet known to be desired role models, we can model environmentally helpful behavior in our offices and personal lives. We can use subliminal messages, such as wearing more green, the symbolic color of environmentalism. Why, I’ve gone so far as to wear the costume of the Jolly Green Giant of vegetable-marketing fame to become “The Jolly Green Giant of Psychiatry,” which, believe it or not, actually won first place at a benefit costume contest.

Tuesday, December 15, 2009

“Hired guns.” “Whores.” “Greedy, insensitive bastards.” These are some of the more printable epithets used to describe psychiatric physicians who (allegedly) have “sold out to Big Pharma”—for example, by failing to disclose conflicts of interest, or to report large sums of money earned through their work with pharmaceutical companies. It may surprise some—but perhaps not many—that these terms of abuse were hurled not by members of some anti-psychiatry group, but by psychiatrists, writing on a well-known “watchdog” blog site. To be clear: I have no wish to excuse or rationalize the actions of some in our field who indeed have abused the public trust by engaging in any of the actions described. Anger—even outrage—is appropriate and healthy, with respect to their behavior. But must we also demonize these individuals, some of whom (notwithstanding their transgressions) have made important contributions to clinical care and scientific research?

The late Dr Albert Ellis—the psychologist who originated Rational Emotive Behavioral Therapy—always insisted that we distinguish between a person’s behavior and the individual’s value as a human being. Writing in their classic 1961 book, A Guide to Rational Living, Ellis and his colleague Robert Harper argued that “…A person’s (good or bad) acts are the results of his being, but they are never that being itself” (Ellis and Harper, 1961, p. 104, italics added). We often tell our patients they should not condemn the totality of their being on the basis of a selfish or hurtful act they have committed—yet some of us seem all too ready to condemn a colleague in the most sweeping and dehumanizing terms, because he or she is guilty (or is believed guilty) of one or more ethical lapses. By all means, let us condemn the transgressions! But let’s also retain a scintilla of human sympathy and understanding for the flawed human beings who committed them.

The problem of “demonizing rhetoric” is obviously not confined to the field of psychiatry, where it seems to be the effluvium of a few particularly bilious individuals. The language of demonization is all too prevalent in the narratives of many political and religious groups, who attack their opponents as infidels, heretics, traitors, or even worse. Carried to an extreme, we find terms like “vermin” applied to ethnic or religious groups who are the objects of hatred or persecution—the Nazis were infamous in their use of this term. Yes, I know—there is a difference between calling someone a “drug company whore” and reducing the person to the status of vermin. But the distance between the terms is not as wide as some would persuade themselves. And when one moves from the psychiatry blog sites to the rabid anti-psychiatry websites (eg, http://outlawpsychiatry.blogspot.com/), one sees in no uncertain terms how easily an unflattering epithet can morph into a dehumanizing slur.

The divisions within psychiatry extend far beyond concerns over “conflicts of interest” and “Big Pharma”—where there are, at least, valid ethical issues to be raised. Unfortunately, the profession continues to be mired in the same fruitless arguments over “biology” versus “psychology” that anthropologist Tanya Luhrmann documented in her classic investigation, Of Two Minds: The Growing Disorder in American Psychiatry (2000). And, all too often, one hears proponents of the “biomedical” model disparaging advocates of the “psychodynamic” model—or vice versa. This sterile debate persists, despite the heroic efforts of “pluralistic unifiers”, such as my colleagues Nassir Ghaemi, Godehard Oepen, Glen Gabbard, Michael A. Schwartz, and Cynthia Geppert. These clinician-scholars have refused to buy into the Manichean world of the “splitters”; rather, they embrace a scientific-humanistic perspective that comprehends both molecules and motives. This broad-based, pluralistic model was the one I imbibed during my residency at Upstate Medical University, where a fledgling resident could—as my teachers, Robert Daly and Eugene Kaplan, might put it—“do theology in the morning and biology in the afternoon.” Oh, yes—and engage in heated but respectful debate with gadfly Tom Szasz, during lunch!

Psychiatry as a profession faces many challenges, and no small number of genuine threats. From excessive involvement with “Big Pharma” to the diminished role of psychotherapy; from managed care’s fifteen-minute “med checks” to controversies over DSM-V, psychiatry understandably feels besieged these days. No doubt, we have contributed to many of these problems through our own missteps or inaction. Yet, for all its flaws, psychiatry remains the most comprehensive and humanistic of the medical specialties—and has a great deal to offer the suffering individuals who seek our help. In 1858, Abraham Lincoln—addressing the issue of slavery—warned the nation that “A house divided against itself cannot stand.” It would be a genuine tragedy if psychiatry were to become “a house divided” by the rancor of its own rhetoric.

Ronald Pies, MD
Editor-in-Chief

"Every kingdom divided against itself is brought to desolation; and every city or house divided against itself shall not stand."—Matthew 12:25

“Judge the whole of the person charitably.”—Talmud (Pirke Avot, 1:6)

Monday, November 30, 2009

Psychosurgery. For some, the word connotes all the promising therapeutic applications of modern neuroscience; for others, it connotes all the baneful excesses of unregulated pseudoscience. In his recent front page story for the New York Times (Nov. 27), Benedict Carey presented a balanced and nuanced view of what is sometimes termed “psychiatric neurosurgery,” focusing mainly on cingulotomy, capsulotomy, and other neurosurgical approaches to refractory obsessive-compulsive disorder (OCD).

For one of the patients interviewed by Carey, neurosurgery to treat her intractable OCD was clearly very helpful; yet for other patients, neurosurgery for their OCD had adverse behavioral effects despite some improvement in their obsessive-compulsive symptoms. Thus, in one Swedish study by Ruck and colleagues, only 3 of 25 patients with OCD undergoing capsulotomy were in remission without adverse effects at long-term follow-up. Ten patients “were considered to have significant problems with executive functioning, apathy, or disinhibition” after their procedures.

Psychiatric Times has discussed both the medical and ethical issues surrounding psychosurgery, pointing out both the need for better understanding of the underlying brain circuits targeted by psychosurgery and the imperative of safeguarding the rights of potential candidates for psychosurgery. One conundrum noted by Dr. Christian Ruck, and cited in the Times article, is that “innovation is driven by groups that believe in their method, thus introducing bias that is almost impossible to avoid.” This is not to impugn the humanitarian motives that underlie most instances of psychosurgery; it is merely to say that institutions that perform psychosurgery may have a vested interest in seeing and presenting their outcome data in the best possible light.

As the United Nations’ Principles for the Protection of Persons with Mental Illness (1991) state:

“Psychosurgery and other intrusive and irreversible treatments for mental illness shall never be carried out on a patient who is an involuntary patient in a mental health facility and, to the extent that domestic law permits them to be carried out, they may be carried out on any other patient only where the patient has given informed consent and an independent external body has satisfied itself that there is genuine informed consent and that the treatment best serves the health needs of the patient."

Given the rarity of psychiatric neurosurgery—the Times noted that about 500 such procedures have been carried out in the United States over the past decade—review of each case by an independent neuropsychiatric expert outside the “provider” institution seems feasible. There is no doubt that for some suffering patients who have not responded to standard treatment, psychosurgery may be the only viable option. But as Alan Stone, MD concluded in his article for Psychiatric Times, "Given the historical burden of the old psychosurgery, the new neurosurgeons have a special obligation to proceed with utmost scientific caution."

Thursday, October 29, 2009

The press reported it in various ways—either as a “brutal gang rape” or, more forensically, as a “2½-hour assault” on the Richmond High School campus. Any way you look at it, the horrendous attack on a 15-year-old girl raises troubling questions for theologians, criminologists, and, of course, psychiatrists. How do we understand an act as brutal as rape? What factors and forces in the rapist’s development can possibly account for such behavior? And how on earth do we explain the apparent indifference of the large crowd that watched the attack in Richmond, California, and allegedly did nothing to stop it—or even to report it?

In a thoughtful analysis on CNN, Stephanie Chen provides a range of “expert opinions” on this last question. Essentially, the various hypotheses asserted that:
o Bystanders in large groups are unlikely to take appropriate action in such cases, because they assume others have already done so; or because “doing nothing becomes the norm” (the so-called bystander effect).
o Witnesses who otherwise might have phoned 911 may have feared retaliation from the perpetrators.
o Bystanders do not feel a “bond” with the victim, and may actually identify with the perpetrator, who is perceived as “more important” than the victim.

The CNN report speculated at length on the so-called “Genovese Syndrome,” named for the woman stabbed to death in Queens, NY in 1964, supposedly after 38 witnesses to the attack did nothing to help her. (The facts, however, are almost certainly otherwise, as an article in American Psychologist argues.)

Most of the forensic experts quoted in the CNN piece took a predictably “objective” point of view. None ventured the opinion that the crowd at Richmond High School failed to aid the rape victim because many human beings often act in a selfish, callous, and cowardly manner. Nobody put forth the view of rabbinical Judaism; namely, that we are all born with 2 primal inclinations, constantly at war with one another. The “good inclination” (yetzer hatov) is usually held to be a kind of late “add-on” to the more powerful “evil inclination” (yetzer hara), which often gains the upper hand. The yetzer hara seems to have been alive and well at Richmond High—and nobody lifted a finger to stop it. Rabbi Bruce Kadden, however, points out that the yetzer hara is not some “devil” external to our own selves; rather,

“…the yetzer hara is very much a part of us. We therefore cannot deny personal responsibility for what the yetzer hara causes us to do. It may explain our behavior, but it does not excuse it.”

Many psychiatrists, it seems to me, have been reluctant to venture into the obscure headwaters of evil—the territory explored so vividly in Joseph Conrad’s 1902 novella, “Heart of Darkness.” Many in our profession have taken the “scientific” view that matters of good and evil are best left to theologians and clergy, and that clinicians should limit themselves to analyzing and correcting the developmental, biological, and psychological precursors of “anti-social behavior.”

I disagree. Psychiatrists and other mental health professionals should not avoid the issue of evil, if only for the reason that good and evil are very real, and matter very deeply, to most of our patients. A woman who presents in therapy with a rape-related traumatic syndrome may be said to embody the problem of human evil: even her physiological responses to trauma-related stimuli have been altered by her experience. But more than that, the patient (male or female) who has suffered a brutal assault may need to explore the moral dimensions of the act and its consequences: “How could another human being do such a horrible thing? And - - why me, Doctor? Was I being punished by God? Am I somehow responsible for what happened? What should I do with all the hatred and rage I feel toward this monster? Is it right that I want him to suffer as much as I have?”

These understandable questions do not arise for all victims of trauma; but when they do, psychiatrists must be prepared to engage the patient in a serious, “I-Thou” dialogue, to use Martin Buber’s term. Similarly, philosopher and ethicist Margaret U. Walker has written of the need for “moral repair” after an act of wrongdoing. As therapists, we help effect such repair by establishing trust—the first step in mending the torn fabric of the traumatized patient’s moral universe. To gain the patient’s trust, however, we must be ready to talk frankly about good and evil. Sometimes, this means confronting the enormity of acts such as those that occurred at Richmond High.

[UPDATE 11/06/09]

It seems there may be a bright spot to this horrendous story, after all. ABC News is reporting that, while nearly all the bystanders did nothing,

"...one woman called police as soon as she heard what was happening. The 18-year-old mother and former Richmond High School student was at home watching a movie when her brother-in-law came home and said he had seen a girl getting raped.

"He was like, 'I'm scared,' and I'm like, 'Well, we should call the cops because that's the thing to do,'" Margarita Vargas said. "I didn't think about it twice, I just, I'm like, I immediately grabbed the phone and said, 'I'm gonna call the cops,' because that's something I wouldn't want anybody to go through or if I was in that situation, I would want someone to do the same for me."

Vargas said after making the call to police, she walked over to the school to make sure the police had arrived."

It was not my intention to blame the internet for creating more narcissists or for causing irreparable harm to our children. In fact, nowhere in my article do I “demonize” the internet as this post asserts. It is my contention that the internet is not, in and of itself, inherently evil. I do not blame social networking sites for the rise of narcissism in our culture. A more careful reading of the piece would reveal that I consider social networking sites a symptom of a narcissistic society rather than the cause of it.

My argument was not that the internet is to blame for the sad state of affairs in which we find ourselves. Rather, the blame lies with the philosophy that influenced the rearing of an entire generation: the self-esteem movement. Because we shielded our youth from the dangers of criticism and disappointment, they have arrived at adulthood without having developed the coping skills they need to survive in the real world. No one succeeds at everything. This is a fact of life. But the millennial generation was not exposed to this reality. Not only do they shun criticism, they feel entitled to praise, even if undeserved.

The studies of Twenge and Campbell[1-3] have shown a steady rise in narcissism over the past several decades. While the author is quick to point to statements he believes are not backed by data, he fails even to take note of these studies. This rise in narcissism was evident before the advent of social networking sites. And it is my contention that these sites would not have risen to such prominence but for the fact that a generation of narcissists needed an outlet. The millennial generation needed a way to assert their uniqueness and specialness and to garner the attention and praise of the masses. Facebook, MySpace, YouTube and Twitter filled the bill.

Communication has certainly changed over the last century. And with each successive change, the degree of face-to-face contact has decreased. From in-person visits and handwritten notes, we have progressed to phone calls and emails. Each time we remove ourselves from face-to-face contact with each other, communication is further eroded. When we can see each other, we can appreciate important non-verbal cues that are absent if we just speak over the phone.

When we write or email, we lose the information that can be gleaned from the pauses, prosody, and intonation of speech that are still available over the phone. When we text or blog, we have none of those things. The words must stand alone, and they are condensed to their most basic form and, in some cases, completely replaced by shorthand such as “lol” and “omg.”

Call me old-fashioned, but having a close friend with whom I have shared real experiences and confided real feelings beats being anyone’s “bff.”

Thursday, October 22, 2009

When I thought of writing this letter, I was put in mind of Jean-Jacques Dessalines (1758 – 1806), the Emperor of the tiny nation of Haiti, writing to the most powerful man of that time, Napoleon Bonaparte. Like Dessalines, I am a grain of sand standing next to the huge mountain of the psychiatric establishment. Fortunately, there is Psychiatric Times (PT), which allows a "half-island nation" like me, and many others, to reach a wide audience. Some publications welcome opinions only from the “Napoleons” of our profession!

I am trying to say that PT is "a citizen above suspicion." Its fairness is beyond question and the integrity of its editorial board, rock-solid. Still, your editorial, "Conflicts of Interest: Policies of Psychiatric Times," is the necessary spark that should ignite a long-awaited discussion. Given that PT's opinions are free of unethical influences from commercial sponsors, I want to ask Dr Pies and Ms Kweskin to explore the other side of the "coin" that they have just tossed: does the psychiatric establishment show fairness in its review of dissident opinions regarding non-pharmacological issues, such as psychiatric diagnosis? My own experience says no. With the exception of PT and the Journal of Affective Disorders, few psychiatric journals will publish unorthodox opinions—for example, those questioning the validity of DSM diagnoses such as Borderline Personality Disorder, Oppositional-Defiant Disorder, or the widespread notion of “Treatment-resistant Depression.” By the same token, many in this specialty consider it heresy to suggest that ADHD is not as prevalent as the establishment wants us to believe. Similarly, the psychiatric establishment resists suggestions that many cases of "comorbid" anxiety and depression are neither, but are actually cases of sub-threshold bipolar spectrum disorder: the so-called "anxiety" is in fact agitation secondary to the bipolar mood disturbance.

I think there is real fear, within the psychiatric establishment, of opening a Pandora’s Box that could bring about a complete overhaul of the most revered diagnostic dogmas in this field. I very much appreciate the fact that PT allows dissident readers to raise their voices against such entrenched orthodoxy. Often, it is not a case of “crying wolf,” but of the wolf actually scratching at the door.

Manuel Mota-Castillo, MD

I want to thank Dr Mota for his kind and appreciative remarks concerning Psychiatric Times. We have a long tradition of allowing "dissident" voices and controversial opinions to be heard in our pages (paper and, now, electronic). Our founding Editor-in-Chief, John Schwartz, MD, never shied away from taking on "the powers that be," or from confronting the misbehavior of some groups opposed to the field of psychiatry.

I do suspect that there is resistance to change among some representatives of the psychiatric "establishment" (although, to be candid, some might place me in that camp). I think there are many reasons for this. One is that once a scientific (or not-so-scientific) "paradigm" has been established (to use historian Thomas Kuhn's term), it is hard to challenge it, even with persuasive data. The DSM framework is such a paradigm, and there is understandable reluctance to move away from it on the part of some who have labored mightily to create it. I suppose we should not completely discount the role of "Big Pharma" in promoting some diagnoses--perhaps including ADHD--for obvious reasons, though I do not take the view that all pharmaceutical companies are driven only by the profit motive. Still, the "direct to consumer" advertising so common these days may have the effect of reifying or expanding some diagnoses, even in the absence of convincing evidence. On the other hand, I do not agree with the camp that points to "disease mongering" as the source of, for example, the increased recognition of bipolar spectrum disorders.

I also think that some dubious diagnoses, such as "Conduct Disorder", simply reflect our over-reliance on a purely descriptive (symptom-based) diagnostic framework, rather than on one that seeks to establish common biogenetic and phenomenological (experiential) factors that may underlie several seemingly diverse conditions. Another good example, in my view, is the push to reify "Internet Addiction" as a full-fledged and discrete disorder, when it may represent merely one manifestation of an underlying aberration in the brain's reward system.

Friday, October 16, 2009

Congratulations to Dr Alan A. Stone for taking the time to nail the bias contained in the Supplement discussed in his piece, “Reality-Checking: Case In Point.” And congratulations to Psychiatric Times for printing Dr Stone’s article. Perhaps PT is ready to take further steps in separating itself from the "educational efforts" sponsored by pharmaceutical companies that have so deeply corrupted the practice of psychiatry, and of medicine.

Chuck Joy, MD, Erie, PA

Thursday, September 17, 2009

I read the article by Dr Frances and was impressed by its intelligence. I then read the response by the APA in the person of Dr Schatzberg et al. I was shocked by its sleazy attack on Dr Frances’ integrity: Dr Frances was accused of arguing out of anticipated personal financial gain. Accusing Dr Frances of an ad hominem argument was, in that light, the pot calling the kettle black. The whole business of never-ending updates and changes to our diagnoses—whether paradigm-shifting or minor—should remind those of us who need reminding how primitive our knowledge in this specialty remains. Dr Frances seems more aware of this than Dr Schatzberg et al.

Join the Fray!

We Welcome Your Feedback!

We encourage you to comment on any of our blog posts. Please include your name in the body of the comment.

Be a Part of Our Blog

What do you believe are the key issues currently facing psychiatry, and why do you choose to (or not to) practice psychiatry? Send us your views. We invite your comments, which can be short (a few paragraphs) and informal.