"...the subject which will be of most importance politically is Mass Psychology. ... The populace will not be allowed to know how its convictions were generated. ... As yet there is only one country which has succeeded in creating this politician's paradise." - Bertrand Russell, The Impact of Science on Society, 1960.

Wednesday, December 31, 2008

Border Fence To Bypass Property Of Wealthy Oilman Who Donated $35 Million To Bush Library
http://thinkprogress.org/2008/02/19/hunt-border-fence/

In October 2006, President Bush authorized the construction of a 700-mile border fence between the United States and Mexico. Now, however, the Department of Homeland Security’s construction plans are facing opposition from Texans who object to the fence cutting through their property. The Washington Post reports on the hard line the Bush administration is taking with these protesting landowners:

In December, officials sent warning letters to 135 private landowners, municipalities, universities, public utility companies and conservation societies along the border that had turned away surveyors. Landowners were given 30 days to change their minds or face legal action. More than 100 of them — 71 in Texas — let the deadline pass.

Over the past several weeks, U.S. attorneys acting on behalf of the Homeland Security Department have been filing lawsuits against the holdouts.

DHS has no problem pursuing elderly and struggling homeowners. In the small town of Granjeno (pop. 313), however, the border fence would, conveniently, “abruptly end” at the property owned by Dallas billionaire Ray L. Hunt.

It’s not surprising that the administration would be hesitant to upset Hunt, who was a Bush-Cheney campaign “Pioneer” in 2000. More recently, Hunt “donated $35 million to Southern Methodist University to help build Bush’s presidential library.” In 2001, Bush appointed Hunt to his Foreign Intelligence Advisory Board, granting him “a security clearance and access to classified intelligence.”

Hunt, one of the wealthiest oilmen in the world, previously served on the board of Halliburton and was National Petroleum Council chairman between 1991 and 1994.

Daniel Garza, a 76-year-old man who might lose his home to the border fence's intrusion, noted, "I don't see why they have to destroy my home, my land, and let the wall end there." Pointing across the street to Hunt's land, he added, "How will that stop illegal immigration?"

In the spring of 1971, a young Marine captain named James L. Jones stood guard as part of a phalanx surrounding the Capitol, with shoot-to-kill orders should antiwar protesters try to storm the building. According to Boys of '67, a recently published biography written by his cousin, Jones, a decorated Vietnam combat officer, brooded about "the Jane Fondas and Jerry Rubins of the world" as he scanned the marchers for any sign of a long-haired Navy lieutenant, John Kerry, whose condemnation of atrocities by US troops rankled him. In Vietnam, Jones had served as aide-de-camp to gung-ho Maj. Gen. Raymond Davis, whose plan for defeating North Vietnam included "invading Laos, Cambodia, and the DMZ," said Jones sympathetically.

Jones was troubled when he found out that his sister Diane, nine years his junior and a student at Tufts University, had been among the throng of protesters that day in 1971. When their father, also a retired marine, made light of the differences between Jim and Diane, the younger Jones erupted. "OK, Dad, how would you have liked it if Uncle Vernon had wrapped himself in a Japanese flag while the Marines were out in the Pacific?"

Today Jones--a retired general and former Marine commandant who headed the US European Command and was commander of the North Atlantic Treaty Organization--will be at Obama's elbow in the White House as national security adviser. It's hard to imagine a less likely choice to be Obama's go-to guy on foreign policy. Hillary Clinton, Obama's nominee for secretary of state, and Robert Gates, his nominee for defense secretary, are already widely considered to be tough-minded hawks. But Jones is probably the most hawkish of all, and he seems least compatible with Obama.

That Jones has reached the pinnacle of the Washington power elite is testimony to his long-proven ability to operate in the corridors of power. The Republican-leaning Jones was introduced to Washington's political world in 1979 by John McCain, who was serving as the Navy's liaison to the Senate.

McCain took Jones, who'd just been appointed as the Marines' liaison, under his wing, and the 36-year-old Jones, by then a major, took avidly to politicking with the senators, cultivating ties to a freshman Republican from Maine, William Cohen. Ten years later, during another stint on Capitol Hill, Jones bonded with Cohen.

After serving in the Gulf War and Bosnia in the 1990s, having risen to colonel and then three-star general, Jones saw his connection with Cohen pay off: when the latter became President Clinton's defense secretary, Cohen picked Jones for the plum post of his military assistant. Cohen lauds his loyal aide: "Jones knew where all the bodies were buried, and made sure mine wasn't one of them," he said recently. Jones, who retired from the Marines two years ago, has remained close friends with McCain. Jones made an appearance in Missouri with the Arizona senator in June, after McCain had clinched the GOP presidential nomination.

Jones is a fierce advocate of NATO expansion. As commander of the alliance from 2003 to 2006, he pushed for it to take greater responsibility for securing oil supplies in the Persian Gulf and the Middle East. "Our activities are definitely moving to the East and to the South," he declared, speaking to the National Press Club in 2006. He pushed NATO hard--encountering stiff resistance from European allies--to strengthen its commitment to Afghanistan, and he got NATO involved with training missions in Iraq too.

No longer, he says, can NATO confine itself to the defense of Europe; it must increasingly engage in out-of-area operations. "The term 'out of area' doesn't really apply anymore, because that geographical restriction has faded into history," he told the Council on Foreign Relations in 2006.

"NATO's also getting ready to certify a NATO response force, which is also a new operational concept that will give the alliance much more flexible capability to do things rapidly at very long distances."

In 2007 Jones became president of the US Chamber of Commerce's Institute for 21st Century Energy, meanwhile joining the boards of directors of Chevron and Boeing. Among the eighty-eight recommendations of the institute--including, naturally, Drill, baby, drill!--is this: "The U.S. government should engage the North Atlantic Treaty Organization (NATO) on energy security challenges and encourage member countries to support the expansion of its mandate to address energy security."

Jones pays lip service to Obama's oft-stated campaign pledge to pull US combat forces out of Iraq over sixteen months. Not long ago, however, Jones was of a different mind. "I think deadlines can work against us," he said in 2007. "And I think a deadline of this magnitude would be against our national interest." His views on Iraq during the run-up to the war aren't known, though it's reasonable to assume that, like Gen. Anthony Zinni, a former Centcom commander, Jones was skeptical of the neoconservative-promoted war. According to Bob Woodward's State of Denial, in 2005 Jones warned the man who was soon to be chairman of the Joint Chiefs, Gen. Peter Pace, that he faced a "debacle" in Iraq. But when many retired generals began to denounce the Bush administration's Iraq policy in 2006, Jones pointedly demurred. "I do not associate myself with the so-called 'revolt of the generals,'" he said.

Regarding Afghanistan, where Jones is a proponent of a troop surge, he's shown himself to be credulous at best. Repeatedly over the past three years he's touted the view that a newly arriving brigade would turn the tide, Vietnam-like, and repel the Taliban. And time and again he's cast doubt upon the plain-as-day fact that the Taliban are resurgent. It is worrying--again echoing hawkish arguments about the Vietnam War--that he links failure in Iraq and Afghanistan to loss of face: "I personally don't believe that the United States can afford to be perceived as having not been successful in either Iraq or Afghanistan, and I think the consequences for such a perception or such a reality will be with us for years to come in terms of our ability to be a nation of great influence in the twenty-first century."

As Supreme Allied Commander in Europe, Jones also sought increased engagement of US and NATO forces in Africa, and he strongly supported creation of the controversial new US Africa Command. "Africa is a continent of growing strategic importance," he said. Two years before the establishment of that command, Jones was enthusiastic. "My staff at EUCOM spends more than half of its time on African issues," he acknowledged.

Some of Jones's supporters point to his 2007-08 role as special envoy on Palestinian security issues as a hopeful sign that he will encourage Obama to confront the Israel lobby. Palestinian negotiators praise Jones's patience and willingness to listen to their complaints, and a report that he prepared after his mission, said to be critical of the Israeli army's role in the West Bank--it "makes Israel look very bad," said the Israeli daily Ha'aretz--was suppressed by the Bush administration.

But that's a slim reed for hope. While Jones is deeply familiar with the Middle East and South Asia, and is a fluent French speaker who lived in Paris for fifteen years during his youth, in the end he's the military's guy. He's the proverbial hammer in search of nails. "He's not a strategic thinker," says a prominent military analyst in Washington. But when Obama needs a hammer, he'll have one conveniently nearby.

" ... Joseph Schmitz--the former Pentagon Inspector General turned general counsel to Blackwater's parent, The Prince Group--lists on his résumé membership in the Sovereign Military Order of Malta ... ... "

Erik Prince, CEO of Blackwater, may find himself and his company in a lot more hot water as new details emerge in the aftermath of the allegedly unprovoked killing of at least 17 Iraqi civilians by Blackwater security guards that took place on September 16, 2007. According to Jeremy Scahill, author of Blackwater: The Rise of the World's Most Powerful Mercenary Army, the Iraqi men, women and children who were gunned down died as part of a tactic known as "spray and pray." According to Wikipedia, "spray and pray" is "a derisive term for firing an automatic firearm towards an enemy in long bursts, without aiming. This may be done especially when quick reaction is needed to achieve a form of suppressive fire, either when aiming proves too difficult (for example due to a moving shooting platform) or when the location of an opponent is not exactly known."

Iraqi anger over the September 16th incident has not died down and Prime Minister Nouri al-Maliki and the Iraqi Parliament have demanded that Blackwater leave Iraq. According to a story aired on yesterday's Special Report, Secretary of State Condoleezza Rice has approved a "transparent" FBI investigation into the incident. The British paper, The Independent, notes that Blackwater has a bad reputation in Iraq and that this incident is just the latest in a series of similar occurrences.

Shortly after the incident, Blackwater's notoriously camera-and-interview-shy CEO, Erik Prince, suddenly started making the rounds of the talk shows, giving essentially the same story, i.e., that the Blackwater guards were completely innocent of any overreaction, that they responded to gunfire. However, several investigations into the shooting have not corroborated the Blackwater version of events.

Yesterday, with the release of the news that the Blackwater guards had been given immunity by the State Department, the likelihood of prosecution faded. According to reports just released, the State Department granted the immunity under something called the "Garrity Clause," which was designed to protect federal employees.

State Department spokesman Sean McCormack dutifully denied that the State Department can give anyone immunity from prosecution, but others disagree, contending that the immunity offered to the guards by the State Department will hamper future investigations.

However, evidence obtained in a different Congressional investigation may throw a monkey-wrench into what looks suspiciously like an orchestrated attempt by the State Department to whitewash Blackwater and protect its guards from facing trial here in the United States.

On October 22nd, Rep. Henry Waxman, Chairman of the House Committee on Oversight and Government Reform, sent a thirteen-page letter to Erik Prince regarding Blackwater's disputed claim that its guards are all "independent contractors" and therefore responsible for their own tax payments. Both the IRS and Waxman believe that Blackwater has been playing fast and loose with the tax code, thus saving the company millions in Social Security, disability, liability and workers' compensation outlays.

By definition, an independent contractor is not an employee, but a sole proprietor of his/her own business, who contracts to do work. Unless the federal government signed a separate contract with each and every Blackwater security guard in Iraq, it seems to me that there is no way they can be designated as "federal employees." With their immunity gone, the mercenaries will be hung out to dry, both by the State Department and by Blackwater. At that juncture, the first defendant to cop a plea gets a reduced sentence while the rest of the men will get the book thrown at them. ...

Jeremy Scahill reported on Erik Prince's religious connections in the following excerpt from The Nation magazine:

Blackwater founder Erik Prince shares [President George] Bush's fundamentalist Christian views. He comes from a powerful Michigan Republican family and social circle, and his father, Edgar, helped Gary Bauer start the Family Research Council. According to a report prepared for The Nation by the Center for Responsive Politics, in all of Erik Prince's political funding generosity since 1989, he has never given a penny to a Democrat running for national office. Company president Jackson has also given money to Republican candidates. For his part, Joseph Schmitz--the former Pentagon Inspector General turned general counsel to Blackwater's parent, The Prince Group--lists on his résumé membership in the Sovereign Military Order of Malta, a Christian militia formed before the First Crusade. Like Prince, he comes from a right-wing family; his father, former Congressman John Schmitz, was an ultraconservative John Birch Society director who later ran for President. Joseph Schmitz was once in charge of investigating private contractors like Blackwater, but he resigned amid allegations of stonewalling investigations conducted by his department. He now represents one of the most successful of those contractors.

It is simply no longer possible to believe much of the clinical research that is published, or to rely on the judgment of trusted physicians or authoritative medical guidelines. I take no pleasure in this conclusion, which I reached slowly and reluctantly over my two decades as an editor of The New England Journal of Medicine.

Is the whole medical-industrial complex another version of the Madoff scandal?

The charges by Dr. Marcia Angell in The New York Review of Books (article appended below) are damning. What some have said of psychiatry may be true generally. Drug money has completely corrupted the process by which we treat disease.

The statement appeared where it did because Angell was reviewing three books on the corruption of medical research. Her conclusion — “a need for the medical profession to wean itself from industry money almost entirely.”

Doing that may cost as much to doctors and researchers as Madoff cost foundations and investors.

Both this and the Madoff scandal come at a propitious time, a point where policymakers are seeking major reforms and seeking to save the system billions of dollars needed to pay for those reforms.

Just as Madoff, by personalizing the excesses of unregulated finance, will lead to structural change in ways the “big shitpile” seemed unable to, so it’s possible that Angell’s call will personalize the excesses of the medical industrial complex.

What we spend on computing in 2009 we may save on drugs. This could be the biggest medical story of the next year. And to think it came out in 2008.

Dana Blankenhorn has been a business journalist since 1978, and has covered technology since 1982. He launched the Interactive Age Daily, the first daily coverage of the Internet to launch with a magazine, in September 1994.

Drug Companies & Doctors: A Story of Corruption
By Marcia Angell
http://www.nybooks.com/articles/22237
Volume 56, Number 1 · January 15, 2009

Recently Senator Charles Grassley, ranking Republican on the Senate Finance Committee, has been looking into financial ties between the pharmaceutical industry and the academic physicians who largely determine the market value of prescription drugs. He hasn't had to look very hard.

Take the case of Dr. Joseph L. Biederman, professor of psychiatry at Harvard Medical School and chief of pediatric psychopharmacology at Harvard's Massachusetts General Hospital. Thanks largely to him, children as young as two years old are now being diagnosed with bipolar disorder and treated with a cocktail of powerful drugs, many of which were not approved by the Food and Drug Administration (FDA) for that purpose and none of which were approved for children below ten years of age.

Legally, physicians may use drugs that have already been approved for a particular purpose for any other purpose they choose, but such use should be based on good published scientific evidence. That seems not to be the case here. Biederman's own studies of the drugs he advocates to treat childhood bipolar disorder were, as The New York Times summarized the opinions of its expert sources, "so small and loosely designed that they were largely inconclusive."[1]

In June, Senator Grassley revealed that drug companies, including those that make drugs he advocates for childhood bipolar disorder, had paid Biederman $1.6 million in consulting and speaking fees between 2000 and 2007. Two of his colleagues received similar amounts. After the revelation, the president of the Massachusetts General Hospital and the chairman of its physician organization sent a letter to the hospital's physicians expressing not shock over the enormity of the conflicts of interest, but sympathy for the beneficiaries: "We know this is an incredibly painful time for these doctors and their families, and our hearts go out to them."

Or consider Dr. Alan F. Schatzberg, chair of Stanford's psychiatry department and president-elect of the American Psychiatric Association. Senator Grassley found that Schatzberg controlled more than $6 million worth of stock in Corcept Therapeutics, a company he cofounded that is testing mifepristone—the abortion drug otherwise known as RU-486—as a treatment for psychotic depression. At the same time, Schatzberg was the principal investigator on a National Institute of Mental Health grant that included research on mifepristone for this use and he was coauthor of three papers on the subject. In a statement released in late June, Stanford professed to see nothing amiss in this arrangement, although a month later, the university's counsel announced that it was temporarily replacing Schatzberg as principal investigator "to eliminate any misunderstanding."

Perhaps the most egregious case exposed so far by Senator Grassley is that of Dr. Charles B. Nemeroff, chair of Emory University's department of psychiatry and, along with Schatzberg, coeditor of the influential Textbook of Psychopharmacology.[2] Nemeroff was the principal investigator on a five-year $3.95 million National Institute of Mental Health grant—of which $1.35 million went to Emory for overhead—to study several drugs made by GlaxoSmithKline. To comply with university and government regulations, he was required to disclose to Emory income from GlaxoSmithKline, and Emory was required to report amounts over $10,000 per year to the National Institutes of Health, along with assurances that the conflict of interest would be managed or eliminated.

But according to Senator Grassley, who compared Emory's records with those from the company, Nemeroff failed to disclose approximately $500,000 he received from GlaxoSmithKline for giving dozens of talks promoting the company's drugs. In June 2004, a year into the grant, Emory conducted its own investigation of Nemeroff's activities, and found multiple violations of its policies. Nemeroff responded by assuring Emory in a memorandum, "In view of the NIMH/Emory/GSK grant, I shall limit my consulting to GSK to under $10,000/year and I have informed GSK of this policy." Yet that same year, he received $171,031 from the company, while he reported to Emory just $9,999—a dollar shy of the $10,000 threshold for reporting to the National Institutes of Health.

Emory benefited from Nemeroff's grants and other activities, and that raises the question of whether its lax oversight was influenced by its own conflicts of interest. As reported by Gardiner Harris in The New York Times,[3] Nemeroff himself had pointed out his value to Emory in a 2000 letter to the dean of the medical school, in which he justified his membership on a dozen corporate advisory boards by saying:

Surely you remember that Smith-Kline Beecham Pharmaceuticals donated an endowed chair to the department and there is some reasonable likelihood that Janssen Pharmaceuticals will do so as well. In addition, Wyeth-Ayerst Pharmaceuticals has funded a Research Career Development Award program in the department, and I have asked both AstraZeneca Pharmaceuticals and Bristol-Meyers [sic] Squibb to do the same. Part of the rationale for their funding our faculty in such a manner would be my service on these boards.

Because these psychiatrists were singled out by Senator Grassley, they received a great deal of attention in the press, but similar conflicts of interest pervade medicine. (The senator is now turning his attention to cardiologists.) Indeed, most doctors take money or gifts from drug companies in one way or another. Many are paid consultants, speakers at company-sponsored meetings, ghost-authors of papers written by drug companies or their agents,[4] and ostensible "researchers" whose contribution often consists merely of putting their patients on a drug and transmitting some token information to the company. Still more doctors are recipients of free meals and other out-and-out gifts. In addition, drug companies subsidize most meetings of professional organizations and most of the continuing medical education needed by doctors to maintain their state licenses.

No one knows the total amount provided by drug companies to physicians, but I estimate from the annual reports of the top nine US drug companies that it comes to tens of billions of dollars a year. By such means, the pharmaceutical industry has gained enormous control over how doctors evaluate and use its own products. Its extensive ties to physicians, particularly senior faculty at prestigious medical schools, affect the results of research, the way medicine is practiced, and even the definition of what constitutes a disease.

Consider the clinical trials by which drugs are tested in human subjects.[5] Before a new drug can enter the market, its manufacturer must sponsor clinical trials to show the Food and Drug Administration that the drug is safe and effective, usually as compared with a placebo or dummy pill. The results of all the trials (there may be many) are submitted to the FDA, and if one or two trials are positive—that is, they show effectiveness without serious risk—the drug is usually approved, even if all the other trials are negative. Drugs are approved only for a specified use—for example, to treat lung cancer—and it is illegal for companies to promote them for any other use.

But physicians may prescribe approved drugs "off label"—i.e., without regard to the specified use—and perhaps as many as half of all prescriptions are written for off-label purposes. After drugs are on the market, companies continue to sponsor clinical trials, sometimes to get FDA approval for additional uses, sometimes to demonstrate an advantage over competitors, and often just as an excuse to get physicians to prescribe such drugs for patients. (Such trials are aptly called "seeding" studies.)

Since drug companies don't have direct access to human subjects, they need to outsource their clinical trials to medical schools, where researchers use patients from teaching hospitals and clinics, or to private contract research organizations (CROs), which organize office-based physicians to enroll their patients. Although CROs are usually faster, sponsors often prefer using medical schools, in part because the research is taken more seriously, but mainly because it gives them access to highly influential faculty physicians—referred to by the industry as "thought-leaders" or "key opinion leaders" (KOLs). These are the people who write textbooks and medical journal papers, issue practice guidelines (treatment recommendations), sit on FDA and other governmental advisory panels, head professional societies, and speak at the innumerable meetings and dinners that take place every year to teach clinicians about prescription drugs. Having KOLs like Dr. Biederman on the payroll is worth every penny spent.

A few decades ago, medical schools did not have extensive financial dealings with industry, and faculty investigators who carried out industry-sponsored research generally did not have other ties to their sponsors. But schools now have their own manifold deals with industry and are hardly in a moral position to object to their faculty behaving in the same way. A recent survey found that about two thirds of academic medical centers hold equity interest in companies that sponsor research within the same institution.[6] A study of medical school department chairs found that two thirds received departmental income from drug companies and three fifths received personal income.[7] In the 1980s medical schools began to issue guidelines governing faculty conflicts of interest but they are highly variable, generally quite permissive, and loosely enforced.

Because drug companies insist as a condition of providing funding that they be intimately involved in all aspects of the research they sponsor, they can easily introduce bias in order to make their drugs look better and safer than they are. Before the 1980s, they generally gave faculty investigators total responsibility for the conduct of the work, but now company employees or their agents often design the studies, perform the analysis, write the papers, and decide whether and in what form to publish the results. Sometimes the medical faculty who serve as investigators are little more than hired hands, supplying patients and collecting data according to instructions from the company.

In view of this control and the conflicts of interest that permeate the enterprise, it is not surprising that industry-sponsored trials published in medical journals consistently favor sponsors' drugs—largely because negative results are not published, positive results are repeatedly published in slightly different forms, and a positive spin is put on even negative results. A review of seventy-four clinical trials of antidepressants, for example, found that thirty-seven of thirty-eight positive studies were published.[8] But of the thirty-six negative studies, thirty-three were either not published or published in a form that conveyed a positive outcome. It is not unusual for a published paper to shift the focus from the drug's intended effect to a secondary effect that seems more favorable.

The suppression of unfavorable research is the subject of Alison Bass's engrossing book, Side Effects: A Prosecutor, a Whistleblower, and a Bestselling Antidepressant on Trial. This is the story of how the British drug giant GlaxoSmithKline buried evidence that its top-selling antidepressant, Paxil, was ineffective and possibly harmful to children and adolescents. Bass, formerly a reporter for the Boston Globe, describes the involvement of three people—a skeptical academic psychiatrist, a morally outraged assistant administrator in Brown University's department of psychiatry (whose chairman received in 1998 over $500,000 in consulting fees from drug companies, including GlaxoSmithKline), and an indefatigable New York assistant attorney general. They took on GlaxoSmithKline and part of the psychiatry establishment and eventually prevailed against the odds.

The book follows the individual struggles of these three people over many years, culminating with GlaxoSmithKline finally agreeing in 2004 to settle charges of consumer fraud for $2.5 million (a tiny fraction of the more than $2.7 billion in yearly Paxil sales about that time). It also promised to release summaries of all clinical trials completed after December 27, 2000. Of much greater significance was the attention called to the deliberate, systematic practice of suppressing unfavorable research results, which would never have been revealed without the legal discovery process. Previously undisclosed, one of GlaxoSmithKline's internal documents said, "It would be commercially unacceptable to include a statement that efficacy had not been demonstrated, as this would undermine the profile of paroxetine [Paxil]."[9]

Many drugs that are assumed to be effective are probably little better than placebos, but there is no way to know because negative results are hidden. One clue was provided six years ago by four researchers who, using the Freedom of Information Act, obtained FDA reviews of every placebo-controlled clinical trial submitted for initial approval of the six most widely used antidepressant drugs approved between 1987 and 1999—Prozac, Paxil, Zoloft, Celexa, Serzone, and Effexor.[10] They found that on average, placebos were 80 percent as effective as the drugs. The difference between drug and placebo was so small that it was unlikely to be of any clinical significance. The results were much the same for all six drugs: all were equally ineffective. But because favorable results were published and unfavorable results buried (in this case, within the FDA), the public and the medical profession believed these drugs were potent antidepressants.

Clinical trials are also biased through designs for research that are chosen to yield favorable results for sponsors. For example, the sponsor's drug may be compared with another drug administered at a dose so low that the sponsor's drug looks more powerful. Or a drug that is likely to be used by older people will be tested in young people, so that side effects are less likely to emerge. A common form of bias stems from the standard practice of comparing a new drug with a placebo, when the relevant question is how it compares with an existing drug. In short, it is often possible to make clinical trials come out pretty much any way you want, which is why it's so important that investigators be truly disinterested in the outcome of their work.

Conflicts of interest affect more than research. They also directly shape the way medicine is practiced, through their influence on practice guidelines issued by professional and governmental bodies, and through their effects on FDA decisions. A few examples: in a survey of two hundred expert panels that issued practice guidelines, one third of the panel members acknowledged that they had some financial interest in the drugs they considered.[11] In 2004, after the National Cholesterol Education Program called for sharply lowering the desired levels of "bad" cholesterol, it was revealed that eight of nine members of the panel writing the recommendations had financial ties to the makers of cholesterol-lowering drugs.[12] Of the 170 contributors to the most recent edition of the American Psychiatric Association's Diagnostic and Statistical Manual of Mental Disorders (DSM), ninety-five had financial ties to drug companies, including all of the contributors to the sections on mood disorders and schizophrenia.[13] Perhaps most important, many members of the standing committees of experts that advise the FDA on drug approvals also have financial ties to the pharmaceutical industry.[14]

In recent years, drug companies have perfected a new and highly effective method to expand their markets. Instead of promoting drugs to treat diseases, they have begun to promote diseases to fit their drugs. The strategy is to convince as many people as possible (along with their doctors, of course) that they have medical conditions that require long-term drug treatment. Sometimes called "disease-mongering," this is a focus of two new books: Melody Petersen's Our Daily Meds: How the Pharmaceutical Companies Transformed Themselves into Slick Marketing Machines and Hooked the Nation on Prescription Drugs and Christopher Lane's Shyness: How Normal Behavior Became a Sickness.

To promote new or exaggerated conditions, companies give them serious-sounding names along with abbreviations. Thus, heartburn is now "gastro-esophageal reflux disease" or GERD; impotence is "erectile dysfunction" or ED; premenstrual tension is "premenstrual dysphoric disorder" or PMDD; and shyness is "social anxiety disorder" (no abbreviation yet). Note that these are ill-defined chronic conditions that affect essentially normal people, so the market is huge and easily expanded. For example, a senior marketing executive advised sales representatives on how to expand the use of Neurontin: "Neurontin for pain, Neurontin for monotherapy, Neurontin for bipolar, Neurontin for everything."[15] It seems that the strategy of the drug marketers—and it has been remarkably successful—is to convince Americans that there are only two kinds of people: those with medical conditions that require drug treatment and those who don't know it yet. While the strategy originated in the industry, it could not be implemented without the complicity of the medical profession.

Melody Petersen, who was a reporter for The New York Times, has written a broad, convincing indictment of the pharmaceutical industry.[16] She lays out in detail the many ways, both legal and illegal, that drug companies can create "blockbusters" (drugs with yearly sales of over a billion dollars) and the essential role that KOLs play. Her main example is Neurontin, which was initially approved only for a very narrow use—to treat epilepsy when other drugs failed to control seizures. By paying academic experts to put their names on articles extolling Neurontin for other uses—bipolar disease, post-traumatic stress disorder, insomnia, restless legs syndrome, hot flashes, migraines, tension headaches, and more—and by funding conferences at which these uses were promoted, the manufacturer was able to parlay the drug into a blockbuster, with sales of $2.7 billion in 2003. The following year, in a case covered extensively by Petersen for the Times, Pfizer pleaded guilty to illegal marketing and agreed to pay $430 million to resolve the criminal and civil charges against it. A lot of money, but for Pfizer, it was just the cost of doing business, and well worth it because Neurontin continued to be used like an all-purpose tonic, generating billions of dollars in annual sales.

Christopher Lane's book has a narrower focus—the rapid increase in the number of psychiatric diagnoses in the American population and in the use of psychoactive drugs (drugs that affect mental states) to treat them. Since there are no objective tests for mental illness and the boundaries between normal and abnormal are often uncertain, psychiatry is a particularly fertile field for creating new diagnoses or broadening old ones.[17] Diagnostic criteria are pretty much the exclusive province of the current edition of the Diagnostic and Statistical Manual of Mental Disorders, which is the product of a panel of psychiatrists, most of whom, as I mentioned earlier, had financial ties to the pharmaceutical industry. Lane, a research professor of literature at Northwestern University, traces the evolution of the DSM from its modest beginnings in 1952 as a small, spiral-bound handbook (DSM-I) to its current 943-page incarnation (the revised version of DSM-IV) as the undisputed "bible" of psychiatry—the standard reference for courts, prisons, schools, insurance companies, emergency rooms, doctors' offices, and medical facilities of all kinds.

Given its importance, you might think that the DSM represents the authoritative distillation of a large body of scientific evidence. But Lane, using unpublished records from the archives of the American Psychiatric Association and interviews with the principals, shows that it is instead the product of a complex of academic politics, personal ambition, ideology, and, perhaps most important, the influence of the pharmaceutical industry. What the DSM lacks is evidence. Lane quotes one contributor to the DSM-III task force:

There was very little systematic research, and much of the research that existed was really a hodgepodge—scattered, inconsistent, and ambiguous. I think the majority of us recognized that the amount of good, solid science upon which we were making our decisions was pretty modest.

Lane uses shyness as his case study of disease-mongering in psychiatry. Shyness as a psychiatric illness made its debut as "social phobia" in DSM-III in 1980, but was said to be rare. By 1994, when DSM-IV was published, it had become "social anxiety disorder," now said to be extremely common. According to Lane, GlaxoSmithKline, hoping to boost sales for its antidepressant, Paxil, decided to promote social anxiety disorder as "a severe medical condition." In 1999, the company received FDA approval to market the drug for social anxiety disorder. It launched an extensive media campaign to do so, including posters in bus shelters across the country showing forlorn individuals and the words "Imagine being allergic to people...," and sales soared. Barry Brand, Paxil's product director, was quoted as saying, "Every marketer's dream is to find an unidentified or unknown market and develop it. That's what we were able to do with social anxiety disorder."

Some of the biggest blockbusters are psychoactive drugs. The theory that psychiatric conditions stem from a biochemical imbalance is used as a justification for their widespread use, even though the theory has yet to be proved. Children are particularly vulnerable targets. What parent dares say "No" when a physician says their difficult child is sick and recommends drug treatment? We are now in the midst of an apparent epidemic of bipolar disease in children (which seems to be replacing attention-deficit hyperactivity disorder as the most publicized condition in childhood), with a forty-fold increase in the diagnosis between 1994 and 2003.[18] These children are often treated with multiple drugs off-label, many of which, whatever their other properties, are sedating, and nearly all of which have potentially serious side effects.

The problems I've discussed are not limited to psychiatry, although they reach their most florid form there. Similar conflicts of interest and biases exist in virtually every field of medicine, particularly those that rely heavily on drugs or devices. It is simply no longer possible to believe much of the clinical research that is published, or to rely on the judgment of trusted physicians or authoritative medical guidelines. I take no pleasure in this conclusion, which I reached slowly and reluctantly over my two decades as an editor of The New England Journal of Medicine.

One result of the pervasive bias is that physicians learn to practice a very drug-intensive style of medicine. Even when changes in lifestyle would be more effective, doctors and their patients often believe that for every ailment and discontent there is a drug. Physicians are also led to believe that the newest, most expensive brand-name drugs are superior to older drugs or generics, even though there is seldom any evidence to that effect because sponsors do not usually compare their drugs with older drugs at equivalent doses. In addition, physicians, swayed by prestigious medical school faculty, learn to prescribe drugs for off-label uses without good evidence of effectiveness.

It is easy to fault drug companies for this situation, and they certainly deserve a great deal of blame. Most of the big drug companies have settled charges of fraud, off-label marketing, and other offenses. TAP Pharmaceuticals, for example, in 2001 pleaded guilty and agreed to pay $875 million to settle criminal and civil charges brought under the federal False Claims Act over its fraudulent marketing of Lupron, a drug used for treatment of prostate cancer. In addition to GlaxoSmithKline, Pfizer, and TAP, other companies that have settled charges of fraud include Merck, Eli Lilly, and Abbott. The costs, while enormous in some cases, are still dwarfed by the profits generated by these illegal activities, and are therefore not much of a deterrent. Still, apologists might argue that the pharmaceutical industry is merely trying to do its primary job—further the interests of its investors—and sometimes it goes a little too far.

Physicians, medical schools, and professional organizations have no such excuse, since their only fiduciary responsibility is to patients. The mission of medical schools and teaching hospitals—and what justifies their tax-exempt status—is to educate the next generation of physicians, carry out scientifically important research, and care for the sickest members of society. It is not to enter into lucrative commercial alliances with the pharmaceutical industry. As reprehensible as many industry practices are, I believe the behavior of much of the medical profession is even more culpable.[19] Drug companies are not charities; they expect something in return for the money they spend, and they evidently get it or they wouldn't keep paying.

So many reforms would be necessary to restore integrity to clinical research and medical practice that they cannot be summarized briefly. Many would involve congressional legislation and changes in the FDA, including its drug approval process. But there is clearly also a need for the medical profession to wean itself from industry money almost entirely. Although industry–academic collaboration can make important scientific contributions, it is usually in carrying out basic research, not clinical trials, and even here, it is arguable whether it necessitates the personal enrichment of investigators. Members of medical school faculties who conduct clinical trials should not accept any payments from drug companies except research support, and that support should have no strings attached, including control by drug companies over the design, interpretation, and publication of research results.

Medical schools and teaching hospitals should rigorously enforce that rule, and should not enter into deals with companies whose products members of their faculty are studying. Finally, there is seldom a legitimate reason for physicians to accept gifts from drug companies, even small ones, and they should pay for their own meetings and continuing education.

After much unfavorable publicity, medical schools and professional organizations are beginning to talk about controlling conflicts of interest, but so far the response has been tepid. They consistently refer to "potential" conflicts of interest, as though that were different from the real thing, and about disclosing and "managing" them, not about prohibiting them. In short, there seems to be a desire to eliminate the smell of corruption, while keeping the money. Breaking the dependence of the medical profession on the pharmaceutical industry will take more than appointing committees and other gestures. It will take a sharp break from an extremely lucrative pattern of behavior. But if the medical profession does not put an end to this corruption voluntarily, it will lose the confidence of the public, and the government (not just Senator Grassley) will step in and impose regulation. No one in medicine wants that.

Notes

[1] Gardiner Harris and Benedict Carey, "Researchers Fail to Reveal Full Drug Pay," The New York Times, June 8, 2008.

[2] Most of the information in these paragraphs, including Nemeroff's quote in the summer of 2004, is drawn from a long letter written by Senator Grassley to James W. Wagner, President of Emory University, on October 2, 2008.

[14] On August 4, 2008, the FDA announced that $50,000 is now the "maximum personal financial interest an advisor may have in all companies that may be affected by a particular meeting." Waivers may be granted for amounts less than that.

[15] See Petersen, Our Daily Meds, p. 224.

[16] Petersen's book is a part of a second wave of books exposing the deceptive practices of the pharmaceutical industry. The first included Katharine Greider's The Big Fix: How the Pharmaceutical Industry Rips Off American Consumers (PublicAffairs, 2003), Merrill Goozner's The $800 Million Pill: The Truth Behind the Cost of New Drugs (University of California Press, 2004), Jerome Avorn's Powerful Medicines: The Benefits, Risks, and Costs of Prescription Drugs (Knopf, 2004), John Abramson's Overdo$ed America: The Broken Promise of American Medicine (HarperCollins, 2004), and my own The Truth About the Drug Companies: How They Deceive Us and What to Do About It (Random House, 2004).

[17] See the review by Frederick Crews of Lane's book and two others, The New York Review, December 6, 2007.

[18] See Gardiner Harris and Benedict Carey, "Researchers Fail to Reveal Full Drug Pay," The New York Times, June 8, 2008.

[19] This point is made powerfully in Jerome P. Kassirer's disturbing book, On the Take: How Medicine's Complicity With Big Business Can Endanger Your Health (Oxford University Press, 2005).

Tuesday, December 30, 2008

Nagata Tetsuzan, in company with Obata Toshiro and Okamura Yasuji (all three trained in military intelligence) and Prince Hirohito, had met secretly at the German spa Baden-Baden on October 27, 1921 to plan a total war against the West.[1] With the exception of Hirohito, all were then serving at Japanese embassies in Europe. They developed strategies to purge the army of the Samurai (knightly) leadership of the Choshu clan, to reorganize and modernize the army, and a plan to dominate Manchuria. Another young officer-attendee was Tojo Hideki, the future prime minister who would launch the Pacific war, the imperial scheme.

Corporations, like Mitsui, first targeted Manchuria's natural resources during the First Sino-Japanese War (August 1, 1894 to April 17, 1895). Japan, ready to accommodate its corporations and backed by foreign banks such as J. P. Morgan, declared war on Tsarist Russia on February 8, 1904, specifically to gain control of the strategic Kwantung Peninsula, which China had leased to Russia for 25 years in March 1898. Japan was granted that area after the war and established the Kwantung Garrison there in 1906. The semi-autonomous Kwantung Army, part of Japan's Imperial Army, took its name from the Kwantung (meaning "east of Shanhaiguan") Peninsula.

General Araki Sadao was the Japanese military attaché in Russia during WWI.[2] He became the Commandant of the Army War College in 1928. He indoctrinated the younger pseudo-patriotic naïve officers with a radical-right-wing agenda. Araki embraced Bushido (“The Way of the Warrior”) and permitted officers to wear swords for the first time since the samurai rebellions of the 19th century.[3]

The ultranational (superior-nation complex) objective was to seize the government, divorce it from the people and transform it into a profitable military dictatorship by removing opposing imperial advisors, creating chaos among radical students and unifying the many secret societies. The military foundation of the conspiracy was set by 1929. The conspirators installed General Honjo Shigeru, a veteran of the Russo-Japanese War (1904-1905), as commander of the Kwantung Army, a post he held from August 1, 1931 to August 8, 1932. Honjo was an 1897 graduate of the Imperial Japanese Army Academy; his classmates included future Prime Minister Abe Nobuyuki and generals Araki and Matsui. He would prevent any potential interference. The Kwantung Army then had 12,000 willing men and would grow to 700,000 by 1941.[4]

Araki, with three other conspirators, had devised the Mukden Incident by May 31, 1931. On September 18, 1931, a non-strategic section of Japan's South Manchuria Railway was expertly dynamited by Japanese militarists. To be persuasive, the Japanese claimed that the railway section was vital. It wasn't! (Asbestos-laden buildings, unable to attract profitable renters and in need of being torn down and replaced, would also serve the same purpose.) Japan blamed the incident on the Chinese who, responding to the explosion, came running to investigate. The Japanese were prepared to invade; they already had major munitions hidden at the Officers' club in Mukden. This staged false-flag incident was the pretext for an attack on the morning of September 19, 1931 against the Chinese garrison at Beidaying, eight hundred meters from the blast site. Five hundred unprepared Chinese soldiers perished; two Japanese soldiers died. Some researchers, such as David Bergamini, claim that Hirohito knew and approved of the whole plot. (Bergamini himself, as a civilian, had been interned in a Japanese concentration camp in the Philippines.)

In February 1932, Japan renamed Manchuria. It would now be Manchukuo. It was literally the kingdom of the Kwantung Army whose intent was to seize all of Manchuria, Mongolia and Northern China.[5] By 1936, the Kwantung Army managed Manchuria. Young bureaucrat Kishi Nobusuke, recognizing a great business opportunity, convinced his uncle to move the headquarters of his newly-formed Nissan zaibatsu (conglomerate) to Manchukuo. Nissan developed its iron and coal mines, timber and opium production. Life was luxurious for the 900,000 Japanese citizens now living in Manchuria as compared to the faltering economy in Japan.[6]

In the 1920s and 1930s, there were two opposing factions (similar to America’s two-party charade) both of which endorsed totalitarian, fascist political philosophies: The ultranationalist Kodoha was led by General Araki who became Minister of War in 1933. According to a Time Magazine article dated January 23, 1933, Araki considered Hirohito as “utterly perfect,” the “Son of Heaven.” Hence, his soldiers rendered absolute obedience and abided by the strict government regulations which prohibited criticism of the emperor or his policies.

The other clique, called the Control Group (Toseiha) was led by General Ugaki Kazushige and was composed of power-seeking, aggressive opportunistic officers who were gradually moving Japan towards a military dictatorship with an increased military budget. Uncooperative politicians were assassinated (as Kennedy and others in our own country) in order to establish a military dictatorship and escalate warfare in order to impose a colonial Japanese empire on the Asian mainland. This group wanted to conquer Soviet Siberia and China, followed by total war with the West.[7] Total war required the cooperation of the bureaucracy and the zaibatsu to maximize Japan’s industrial and military capacity. In 1936, the two factions would merge to form the Imperial Way Faction nationalist party.

With the goal of overthrowing the political and economic elite, allegedly to save Japan from the evil influences controlling the emperor, the far-right terrorist organization Ketsumeidan (League of Blood), composed of student radicals and young Japanese military cadets led by Inoue Nissho, assassinated Finance Minister Inoue Junnosuke (no relation) on February 9, 1932 and Baron Dan Takuma, the Director General of Mitsui, Japan’s largest zaibatsu conglomerate on March 5, 1932. On May 15, 1932, Prime Minister Inukai Tsuyoshi, born to a former samurai family, was assassinated because he objected to the army’s expansion into Manchuria. He wanted to negotiate for peace with China. Attempts to bomb the Mitsubishi Bank and Tokyo police headquarters occurred on the same day.[8] Inoue Nissho, after minimal rehabilitation, would maintain a prominent right-wing position after WWII. ...

Even as the media continue to repeat the claim that credit has frozen up, evidence has emerged suggesting the entire story is wrong.

There is something approaching a consensus that the Paulson Plan -- also known as the Troubled Asset Relief Program, or TARP -- was a boondoggle of an intervention that's flailed from one approach to the next, with little oversight and less effect on the financial meltdown.

But perhaps even more troubling than the ad hoc nature of its implementation is the suspicion that has recently emerged that TARP -- hundreds of billions of dollars worth so far -- was sold to Congress and the public based on a Big Lie.

President George W. Bush, fabulist-in-chief, articulated the rationale for the program in that trademark way of his -- as if addressing a nation of slow-witted 12-year-olds -- on Sept. 24: "Major financial institutions have teetered on the edge of collapse ... [and] began holding onto their money, and lending dried up, and the gears of the American financial system began grinding to a halt." Bush said that if Congress didn't give Treasury Secretary Hank Paulson the trillion dollars (give or take) for which he was asking, the results would be disastrous: "Even if you have good credit history, it would be more difficult for you to get the loans you need to buy a car or send your children to college. And ultimately, our country could experience a long and painful recession."

For the most part, the press has continued to echo Bush's central assertion that there's a "credit crunch" preventing even qualified borrowers -- that's the key point -- from getting loans, and it's now part of the conventional wisdom.

But a number of economists are questioning the factual basis of the credit crunch narrative. Columnist David Sirota recently looked at those claims and concluded that Americans "had been punk'd" -- that "the major claims about a credit crisis that justified Congress cutting a trillion-dollar blank check to Wall Street were demonstrably false," and the threat of a systemic banking crash was used by the Bush administration to overcome popular resistance to the "bailout."

It's a reasonable conclusion; this is an administration that used the threat of thousands of al-Qaida sleeper cells in the United States to sell Congress on the Patriot Act, the specter of mushroom clouds rising over American cities to push through the Iraq war resolution and the supposedly imminent crash of the Social Security system to push for privatizing Americans' retirement savings.

But the question comes down to what they knew and when they knew it. The analyses that suggest the whole credit crunch narrative is false are based on data that lagged behind the numbers that policymakers had available, in real time, back in September. So the question -- probably unanswerable at this point -- comes down to whether or not they looked at the situation and in good faith believed that pumping hundreds of billions of dollars into the banking system would contain the damage and save an economy teetering on the brink of collapse.

What Else Could Be Happening?

Of course, no one disputes the fact that as the economy has tanked, the number of new loans being issued to American families and businesses has plummeted. But is that because credit has dried up for qualified borrowers?

Economist Dean Baker doesn't think so. He explains the situation in simple terms: The media, he argues, "are blaming the economic collapse on a 'credit crunch' instead of the more obvious problem that consumers just lost $6 trillion of housing wealth and another $8 trillion of stock wealth." It's a commonsense argument: much of the economic growth of the Bush era existed on paper only, built on the rise of a massive bubble in real estate values rather than growth in productive industries. When all that ephemeral wealth vaporized -- and with the economy shedding jobs like a dog with dermatitis -- consumers stopped buying, and businesses, anticipating a long slowdown, stopped seeking the loans that they might have otherwise tapped to expand their operations.

Whether good borrowers can't get credit from banks because the latter are hoarding cash or lending has stopped because of a drop-off in demand for new loans is not some wonky academic debate; it's of crucial significance. Because if lending to qualified parties has truly frozen, then even if the specific implementation of the Paulson Plan was deeply flawed, its broad approach -- "recapitalizing" banks in various ways, buying up some of their crappy paper and guaranteeing some of their transactions -- is fundamentally sound.

If, on the other hand, the primary problem is that people are broke and maxed out on debt, and firms aren't looking for money to expand, then the kind of massive stimulus package being considered by the Obama transition team and congressional Dems -- largely designed to stimulate demand from the bottom up, with public works projects, tax cuts for working families, aid to tapped-out state and municipal governments and new money for unemployment and food stamps -- is obviously the best approach to take.

Broadly speaking, these are the parameters of the debate in Washington, and that means that properly diagnosing the underlying problem is crucially important.

Is the Credit Crunch a Big Lie?

There's plenty of evidence that Baker's right. He points out that even though mortgage rates have plummeted, the number of applications for new loans has dropped to very low levels and argues it's "the most glaring refutation of the claim that people are unable to get credit." If creditworthy applicants were being denied loans by banks unable or unwilling to lend, Baker explains, "then the ratio of mortgage applications to home sales should be soaring" as qualified homebuyers apply to multiple banks for a loan. "Since there is no notable increase in this ratio, access to credit is obviously not an issue."

Again, this is common sense. Consumer spending drives about 70 percent of the U.S. economy, and in recent years, much of that spending was financed by people taking chunks of home equity out of their properties -- people might have been eating in fancy restaurants, but they were essentially eating their living rooms to do so.

That the American people don't have the appetite to go deeper into debt than they already are in order to make new purchases is hard to dispute. In November, consumer prices across the board fell at a record rate for the second month in a row. And even with mortgage rates plummeting, so many homeowners are "underwater" -- owing more on their homes than they're worth -- that they're unable to refinance because the equity isn't there. Paul Schuster, a vice president at Marketplace Home Mortgage, told the St. Paul Pioneer Press, "What I'm really concerned about is the job picture ... If (people) don't feel good about their jobs, rates aren't going to matter."

The National Federation of Independent Business' November survey of small-business owners found no evidence of a credit crunch to date, concluding that if "credit is going untapped, it's largely because company operators are not choosing to pursue the credit. It's not that companies can't get the extra money, it's that they don't want or need it because of the broader slowdown in economic activity."

The credit crunch narrative -- and the justification for creating Paulson's $700 billion TARP honeypot -- is built on three related assertions: 1) banks, fearing that they'll be unable to meet their own financial obligations, aren't lending money to one another; 2) they're also not lending to the public at large -- neither to firms nor individuals; and 3) businesses are further unable to raise money through ordinary channels because investors aren't eager to buy up corporate debt, including commercial paper issued by companies with decent balance sheets.

Economists at the Federal Reserve Bank of Minneapolis' research department -- V.V. Chari and Patrick Kehoe of the University of Minnesota, and Northwestern University's Lawrence Christiano -- crunched the Fed's numbers in an examination of these bits of conventional wisdom (PDF), and concluded that all three claims are myths.

The researchers found that "interbank lending is healthy" and "bank credit has not declined during the financial crisis"; that they've seen "no evidence that the financial crisis has affected lending to non-financial businesses" and that "while commercial paper issued by financial institutions has declined, commercial paper issued by non-financial institutions is essentially unchanged during the financial crisis." The researchers called on lawmakers to "articulate the precise nature of the market failure they see, [and] to present hard evidence that differentiates their view of the data from other views."

That finding was backed up by a study issued by Celent Financial Services, a consulting firm, again using the Treasury Department's own data. According to a story on the report by Reuters, Celent's researchers concluded that the "data actually suggest world credit markets are functioning remarkably well." Rather than a widespread banking problem, Celent found that the rot was limited to "a few big, vocal banks and industries such as car manufacturing, which would be in difficulty anyway."

There are also some important caveats. Economists at the Boston Federal Reserve responded to the Minnesota Fed's research (PDF), arguing that the use of aggregate data doesn't fully reflect the dysfunction in specific subsectors of the economy, nor does it adequately reflect the decline in new loans.

It's also the case that single-cause explanations for complex crises usually fail to hit the mark. Banks, having fueled the housing bubble (and similar bubbles before that) with the creation of ever-shadier "exotic" securities, are probably erring on the side of caution in writing new loans. They're looking at their balance sheets as quarterly reports approach, and the number of foreign investment dollars coming into the U.S. has declined, meaning that some qualified firms may, indeed, have trouble raising cash in the near future.

Dean Baker, while arguing that "the main story is that people don't have money and therefore want to spend," acknowledged that "some banks are undoubtedly anticipating more write-offs from other loans going bad, so they will hang on to their capital now rather than make new loans." And, as Sirota notes, some of the institutions that are relatively healthy are reportedly holding cash in anticipation of picking up weaker banks on the cheap.

But one thing is clear: the economic crisis may have woken up Washington's political class when it hit the banks, but it remains a product of long-term imbalances in the economy, and the idea that it's primarily a pathology of the banking system in isolation is a misdiagnosis that, if uncorrected, can only result in a longer, deeper and more painful recession than might otherwise be the case.

IRVINE, Calif. (Dec. 29) -- Late at night, the neighbors saw a little girl at the kitchen sink of the house next door. They watched through their window as the child rinsed plates under the open faucet. She wasn't much taller than the counter and the soapy water swallowed her slender arms.

To put the dishes away, she climbed on a chair.


But she was not the daughter of the couple next door doing chores. She was their maid.

Shyima was 10 when a wealthy Egyptian couple brought her from a poor village in northern Egypt to work in their California home. She awoke before dawn and often worked past midnight to iron their clothes, mop the marble floors and dust the family's crystal. She earned $45 a month working up to 20 hours a day. She had no breaks during the day and no days off.

The trafficking of children for domestic labor in the U.S. is an extension of an illegal but common practice in Africa. Families in remote villages send their daughters to work in cities for extra money and the opportunity to escape a dead-end life. Some girls work for free on the understanding that they will at least be better fed in the home of their employer.

The custom has led to the spread of trafficking, as well-to-do Africans accustomed to employing children immigrate to the U.S. Around one-third of the estimated 10,000 forced laborers in the United States are servants trapped behind the curtains of suburban homes, according to a study by the National Human Rights Center at the University of California at Berkeley and Free the Slaves, a nonprofit group. No one can say how many are children, especially since their work can so easily be masked as chores.

Once behind the walls of gated communities like this one, these children never go to school. Unbeknownst to their neighbors, they live as modern-day slaves, just like Shyima, whose story is pieced together through court records, police transcripts and interviews.

"I'd look down and see her at 10, 11 — even 12 — at night," said Shyima's neighbor at the time, Tina Font. "She'd be doing the dishes. We didn't put two and two together."—Shyima cried when she found out she was going to America in 2000. Her father, a bricklayer, had fallen ill a few years earlier, so her mother found a maid recruiter, signed a contract effectively leasing her daughter to the couple for 10 years and told Shyima to be strong.

For a year, Shyima, 9, worked in the Cairo apartment owned by Amal Motelib and Nasser Ibrahim. Every month, Shyima's mother came to pick up her salary.

Tens of thousands of children in Africa, some as young as 3, are recruited every year to work as domestic servants. They are on call 24 hours a day and are often beaten if they make a mistake. Children are in demand because they earn less than adults and are less likely to complain. In just one city — Casablanca — a 2001 survey by the Moroccan government found more than 15,000 girls under 15 working as maids.

The U.S. State Department found that over the past year, children have been trafficked to work as servants in at least 33 of Africa's 53 countries. Children from at least 10 African countries were sent as maids to the U.S. and Europe. But the problem is so well hidden that authorities — including the U.N., Interpol and the State Department — have no idea how many child maids now work in the West.

"In most homes, these girls are not allowed to use so much as the same spoon as the rest of the family," said Hany Helal, the Cairo-based director of the Egyptian Organization for Child Rights.

By the time the Ibrahims decided to leave, Shyima's family had taken several loans from them for medical bills. The Ibrahims said they could only be repaid by sending Shyima to work for them in the U.S. A friend posed as her father, and the U.S. embassy in Cairo issued her a six-month tourist visa.

She arrived at Los Angeles International Airport on Aug. 3, 2000, according to court documents. The family brought her back to their spacious five-bedroom, two-story home, decorated in the style of a Tuscan villa with a fountain of two angels spouting water through a conch. She was told to sleep in the garage.

It had no windows and was neither heated nor air-conditioned. Soon after she arrived, the garage's only light bulb went out. The Ibrahims didn't replace it. From then on, Shyima lived in the dark.

She was told to call them Madame Amal and Hajj Nasser, terms of respect. They called her "shaghala," or servant. Their five children called her "stupid."

While the family slept, she ironed the school outfits of the Ibrahims' 5-year-old twin sons. She woke them, combed their hair, dressed them and made them breakfast. Then she ironed clothes and fixed breakfast for the three girls, including Heba, who at 10 was the same age as the family's servant.

Neither Ibrahim nor his wife worked, and they slept late. When they awoke, they yelled for her to make tea.

While they ate breakfast watching TV, she cleaned the palatial house. She vacuumed each bedroom, made the beds, dusted the shelves, wiped the windows, washed the dishes and did the laundry.

Her employers were not satisfied, she said. "Nothing was ever clean enough for her. She would come in and say, 'This is dirty,' or 'You didn't do this right,' or 'You ruined the food,'" said Shyima.

She started wetting her bed. Her sheets stank. So did her oversized T-shirt and the other hand-me-downs she wore.

While doing the family's laundry, she slipped her own clothes into the load. Madame slapped her. "She told me my clothes were dirtier than theirs. That I wasn't allowed to clean mine there," she said.

She washed her clothes in a bucket in the garage. She hung them to dry outside, next to the trash cans.

When the couple went out, she waited until she heard the car pull away and then she sat down. She sat with her back straight because she was afraid her clothes would dirty the upholstery.

It never occurred to her to run away.

"I thought this was normal," she said.—If you could fly the garage where Shyima slept 7,000 miles to the sandy alleyway where her Egyptian family now lives, it would pass for the best home in the neighborhood.

The garage's walls are made of concrete instead of hand-patted bricks. Its roof doesn't leak. Its door shuts all the way. Shyima's mother and her 10 brothers and sisters live in a two-bedroom house with uneven walls and a flaking ceiling. None of them have ever had a bed to themselves, much less a whole room. At night, bodies cover the sagging couches.

Shown a snapshot of the windowless garage, Shyima's mother in the coastal town of Agami made a clucking sound of approval.

"It's much cleaner than where many people here sleep," said Helal, the child rights advocate. He explains that Shyima's treatment in the Ibrahim home is considered normal — even good — by Egyptian standards.

Even though many child maids are physically abused, child labor is rarely prosecuted because the work isn't considered strenuous. Many employers even see themselves as benefactors.

"There is a sense that children should work to help their family, but also that they are being given an opportunity," said Mark Lagon, the director of the U.S. State Department's Office to Monitor and Combat Trafficking in Persons.That's especially the case for well-off families who transport their child servants to Western countries.

In 2006, a U.S. district court in Michigan sentenced a Cameroonian man to 17 years in prison for bringing a 14-year-old girl from his country to work as his unpaid maid. That same year, a Moroccan couple was sentenced to home confinement for forcing their 12-year-old Moroccan niece to work grueling hours caring for their baby.

In Germantown, Md., a Nigerian couple used their daughter's passport to bring in a 14-year-old Nigerian girl as their maid. She worked for them for five years before escaping in 2001. In Germany, France, the Netherlands and England, African immigrants have been arrested for forcing children from their home countries to work as their servants.

In several of these cases, the employers argued that they took the children with the parents' permission. The Cameroonian girl's mother flew to Detroit to testify in court against her daughter, saying the girl was ungrateful for the good life her employers had provided her.

Shyima's mother, Salwa Mahmoud, said her father believed she would have better opportunities in America.

"I didn't want her to travel but our family's condition dictated that she had to go," explained Mahmoud, a squat, round-faced woman with calloused hands and feet. She is missing two front teeth because she couldn't afford a dentist.

"If she had stayed here in Egypt, she would have been ordinary," said Awatef, Shyima's older sister. "Just like us."

On April 3, 2002, an anonymous caller phoned the California Department of Social Services to report that a young girl was living inside the garage of 28 Pacific Grove.

A few days later, Nasser Ibrahim opened the door to a detective from the Irvine Police Department. Asked if any children lived there besides his own, he first said no, then yes — "a distant relative." He said he had "not yet" enrolled her in school. She did "chores — just like the other kids," according to the police transcript.

Shyima was upstairs cleaning when Ibrahim came to get her. "He told me that I was not allowed to say anything," said Shyima. "That if I said anything I would never see my parents again."

When police searched the house, they turned up several home videos showing Shyima at work. They seized the contract signed by Shyima's illiterate parents.

Asked by police if anyone other than his immediate family lived in the house, Eid, one of the twins, said: "Hummm ... Yeah ... Her name is Shyima," according to the transcript. "She uh ... She works — she works for us at the house, like, she cleans up the dishes and stuff like that."

Twelve-year-old Heba got flustered: "Yeah. She's uh — my — uh — How do I say this? Uh ... My dad's ... Oh, wait, like ... She's like my cousin, but — She's my dad's daughter's friend. Oops! The other way. Okay, I'm confused."

Heba eventually admitted that Shyima had lived with the family for three years in Egypt and in California.

The police put Shyima in a squad car. They noted her hands were red and caked with dead, hard-looking skin.

For months Shyima lied to investigators, saying what the Ibrahims had told her to say.

She went without sleep for days at a stretch. She was put on four different types of medication. She moved from foster home to foster home. Her mood swings alarmed her guardians. In school for the first time, she struggled to learn to read.

Investigators arranged for her to speak to her parents. She told them she felt like a "nobody" working for the Ibrahims and wanted to come home. Her father yelled at her.

"They kept telling me that they're good people," Shyima recounted in a recent interview. "That it's my fault. That because of what I did my mom was going to have a heart attack."

Three years ago, she broke off contact with her family. Since then she has refused to speak Arabic. She can no longer communicate in her mother tongue.

During the 2006 trial, the Ibrahims described Shyima as part of their family. They included proof of a trip she took with the family to Disneyland.

Shyima's lawyer pointed out that the 10-year-old wasn't allowed on the rides — she was there to carry the bags.

The couple's lawyers collected photographs of the home where Shyima grew up, including close-ups of the feces-stained squat toilet and of Shyima's sisters washing clothes in a bucket.

In her final plea, Madame Amal told the judge it would be unfair to separate her from her children. Enraged, Shyima, then 17, told the court she hadn't seen her family in years.

"Where was their loving when it came to me? Wasn't I a human being too? I felt like I was nothing when I was with them," she sobbed.

The couple pleaded guilty to all charges, including forced labor and slavery.

They were ordered to pay $76,000, the amount Shyima would have earned at the minimum wage. The sentence: Three years in federal prison for Ibrahim, 22 months for his wife, and then deportation for both. Their lawyers declined to comment for this story.

"I don't think that there is any other term you could use than modern-day slavery," said Bob Schoch, the special agent in charge for Immigration and Customs Enforcement in Los Angeles, in describing Shyima's situation.

Shyima was adopted last year by Chuck and Jenny Hall of Beaumont, Calif. The family lives near Disneyland, where they have taken her a half-dozen times. She graduated from high school this summer after retaking her exit exam and hopes to become a police officer.

Shyima, now 19, has a list of assigned chores. She wears purple eyeshadow, has a boyfriend and frequently updates her profile on MySpace. Her hands are neatly manicured.

But in her closet, she keeps a box of pictures of her parents and her brothers and sisters. "I don't look at them because it makes me cry," she said. "How could they? They're my parents."

When her father died last year, her family had no way of reaching her.

EPILOGUE: On a recent afternoon in Cairo, Madame Amal walked into the lobby of her apartment complex wearing designer sunglasses and a chic scarf.

After nearly two years in a U.S. prison cell, she's living once more in the spacious apartment where Shyima first worked as her maid. The apartment is adorned in the style of a Louis XIV palace, with ornately carved settees, gold-leaf vases and life-sized portraits of her and her husband.

She did not agree to be interviewed for this story.

Before the door closed behind her, a little girl slipped in carrying grocery bags. She wore a shabby T-shirt. Her small feet slapped the floor in loose flip-flops. Her eyes were trained on the ground.

She looked to be around 9 years old.

EDITOR'S NOTE: This story is based on interviews in Los Angeles, Irvine and Beaumont, Calif., and in Cairo and Agami, Egypt, in September and October. In addition to interviews with Shyima, her mother and nine of her brothers and sisters, the AP also interviewed her neighbors in Irvine, law enforcement officials and the lawyer who prosecuted her case. Quotes and scenes were observed by the reporter or described by Shyima and confirmed in police transcripts and court records.

" ... Woodward limited the investigation to the cover-up of a routine phone tap, slush funds, bribes and "what the president knew." Behind the whitewash you find CHAOS, the interagency assassination program, and Nixon connections to the Kennedy murders, CIA & drugs and other atrocities. Woodward's intelligence function was to publicly contain these horror stories and limit the damage to the same intelligence community that had no qualms about recruiting Nazis and provoking genocidal wars. What do you expect from these bottom feeders? Woodward is one of them - a Linda Lovelace with no gag reflex. ... "

By Alex Constantine

Woodward didn't need a secret informant: he held the second-highest known security clearance as a Naval intelligence officer assigned to the Pentagon, where, it has been reported, he briefed steely-jawed fascist Al Haig. That was a mere six months before he took a reporter's position (intelligence assignment?) at the Washington Post - a charge that he has publicly denied.

That's a serious lie - especially considering that he's entrusted with feeding the masses credible information on critical national issues - but he had to deny it, or the whole charade behind the books and movie falls apart and he's no longer top cat.

But he's been caught fabricating facts since then; he offers no real explanation, just moves on to the next stretcher.

Then we learned that Woodward had been leaked the CIA connection and had a secret, vested interest in lying about the case, conveniently not mentioning that he was an unindicted co-conspirator in the Plame investigation.

Academics challenged the claim that Mark Felt was Deep Throat, but the Mockingbird press ran with it anyway - with some cautions. Now they state it as a fact, disregarding those informed reservations.

"In his most recent book, Bush at War, Bob Woodward brags that he was given access to the deeply classified minutes of National Security Council meetings. ... "http://nightlight.typepad.com/nightlight/2005/10/either_woodward.html

What do you say about a reporter like this? How do you describe him? Red lights come to mind ...

Deep Throat was a composite: Woodward himself in collusion with officials of the CIA, FBI and military intelligence. Woodward limited the investigation to the cover-up of a routine phone tap, slush funds, bribes and "what the president knew." Behind the whitewash you find CHAOS, the interagency assassination program, and Nixon connections to the Kennedy murders, CIA & drugs and other atrocities. Woodward's intelligence function was to publicly contain these horror stories and limit the damage to the same intelligence community that had no qualms about recruiting Nazis and provoking genocidal wars. What do you expect from these bottom feeders? Woodward is one of them - a Linda Lovelace with no gag reflex.

Deep Throat's Operation COINTELPRO dwarfed Watergate in criminality

December 22, 2008, by Michael Richardson, Boston Progressive Examiner

William Mark Felt was 'Deep Throat' of Watergate infamy, but he was not the hero some have portrayed him as for his disclosures about Richard Nixon's 'Plumbers' squad. Felt was a counter-intelligence agent during World War II as a new recruit to the Federal Bureau of Investigation. For years leading up to the Watergate burglary, Felt was chief of the Inspection Division at the Bureau and oversaw a massive, clandestine operation against domestic political targets code-named COINTELPRO.

At the time of Watergate, J. Edgar Hoover was director of the FBI and was responsible for the creation of COINTELPRO. Felt's role was quality control over field operations. He sat on the COINTELPRO directorate and never blew the whistle on the illegal conduct by FBI agents nationwide that went on for years.

Hoover's secret war on domestic political activists fell with lethal ferocity on its main target, the Black Panther Party. The undercover operation ran at full steam from the mid-1960s until March 1971, when a break-in at the FBI's satellite office in Media, Pennsylvania, revealed COINTELPRO memos.

Noam Chomsky recently put Watergate and COINTELPRO in context during an interview with journalist Hans Bennett. Chomsky explained, "The information about COINTELPRO actually came out at the same time as the Watergate crisis (mid-70's)."

"Here's the national political police--for four administrations--carrying out a massive campaign of repression leading all the way to assassinations. It targeted everyone--including the women's-New Left-and Black movements--and the media simply didn't care."

Chomsky continued, "That shows how human and civil rights are valued by the political and intellectual elite."

Felt was hauled before a U.S. Senate committee investigating COINTELPRO five times but stonewalled the committee. In his autobiographical memoir, Felt was unrepentant, calling the investigation "an exercise in futility and frustration."

Felt advocated continuing COINTELPRO's illegal tactics: "I emphasized as strongly as I could that our country's complacency against domestic terrorism would eventually lead to disaster."

Felt's enthusiasm for dirty tricks led him to order illegal wiretaps after Hoover's death. He was prosecuted and convicted for his misdeeds, but President Ronald Reagan pardoned him, and he escaped punishment for his crimes.