Drugačna spletna televizija

Month: August 2012

In recent years, there’s been a dramatic increase in the number of children being diagnosed with serious psychiatric disorders and prescribed medications that are just beginning to be tested in children. The drugs can cause serious side effects, and virtually nothing is known about their long-term impact. “It’s really to some extent an experiment, trying medications in these children of this age,” child psychiatrist Dr. Patrick Bacon tells FRONTLINE. “It’s a gamble. And I tell parents there’s no way to know what’s going to work.”

In The Medicated Child, FRONTLINE producer Marcela Gaviria confronts psychiatrists, researchers and government regulators about the risks, benefits and many questions surrounding prescription drugs for troubled children. The biggest current controversy surrounds the diagnosis of bipolar disorder. Formerly called manic depression, bipolar disorder was long believed to exist only in adults. But in the mid-1990s, bipolar in children began to be diagnosed at much higher rates, sometimes in kids as young as 4 years old. “The rates of bipolar diagnoses in children have increased markedly in many communities over the last five to seven years,” says Dr. Steven Hyman, a former director of the National Institute of Mental Health. “I think the real question is, are those diagnoses right? And in truth, I don’t think we yet know the answer.”

Like many of the 1 million children now diagnosed with bipolar, 5-year-old Jacob Solomon was initially believed to suffer from an attention deficit disorder. His parents reluctantly started him on Ritalin, but over the next five years, Jacob would be put on one drug after another. “It all started to feel out of control,” Jacob’s father, Ron, told FRONTLINE. “Nobody ever said we can work with this through therapy and things like that. Everywhere we looked it was, ‘Take meds, take meds, take meds.'”

Over the years, Jacob’s multiple medications have helped improve his mood, but they’ve also left him with a severe tic in his neck that doctors have had trouble fully explaining. “We’re dealing with developing minds and brains, and medications have a whole different impact in the young developing child than they do in an adult,” says Dr. Marianne Wamboldt, the chief of psychiatry at Denver Children’s Hospital. “We don’t understand that impact very well. That’s where we’re still in the Dark Ages.”

DJ Koontz was diagnosed with bipolar at 4 years old, after his temper tantrums became more frequent and explosive. He was recently prescribed powerful antipsychotic drugs. “It is a little worrisome to me because he is so young,” says DJ’s mother, Christine. “If he didn’t take it, though, I don’t know if we could function as a family. It’s almost a do-or-die situation over here.” DJ’s medicines seem to be helping him in the short run, but the longer-term outlook is still uncertain. “What’s not really clear is whether many of the kids who are called bipolar have anything that’s related to this very well-studied disorder in adults,” says Dr. Thomas Insel, the director of the National Institute of Mental Health. “It’s not clear that people with that adult illness started with what we’re now calling bipolar in children. Nor is it clear that the kids who have this disorder are going to grow up to have what we used to call manic-depressive illness in adulthood.”

While some urge caution when it comes to bipolar in children, FRONTLINE talks with others who argue that we should intervene with drug treatments at even younger ages for children genetically predisposed to the disorder. “The theory is that if you get in early, before the first full mood episode, then perhaps we can delay the onset to full mania,” says Dr. Kiki Chang of Stanford University. “And if that’s the case, perhaps finding the right medication early on can protect a brain so that these children never do progress to full bipolar disorder.”

In Sick Around the World, FRONTLINE teams up with veteran Washington Post foreign correspondent T.R. Reid to find out how five other capitalist democracies — the United Kingdom, Japan, Germany, Taiwan and Switzerland — deliver health care, and what the United States might learn from their successes and their failures.

Reid’s first stop is the U.K., where the government-run National Health Service (NHS) is funded through taxes. “Every single person who’s born in the U.K. will use the NHS,” says Whittington Hospital CEO David Sloman, “and none of them will be presented a bill at any point during that time.” Often dismissed in America as “socialized medicine,” the NHS is now trying some free-market tactics like “pay-for-performance,” where doctors are paid more if they get good results controlling chronic diseases like diabetes. And now patients can choose where they go for medical procedures, forcing hospitals to compete head to head.

While such initiatives have helped reduce waiting times for elective surgeries, Times of London health editor Nigel Hawkes thinks the NHS hasn’t made enough progress. “We’re now in a world in which people are much more demanding, and I think that the NHS is not very effective at delivering in that modern, market-orientated world.”

Reid reports next from Japan, which boasts the second largest economy and the best health statistics in the world. The Japanese go to the doctor three times as often as Americans, have more than twice as many MRI scans, use more drugs, and spend more days in the hospital. Yet Japan spends about half as much on health care per capita as the United States.

One secret to Japan’s success? By law, everyone must buy health insurance — either through an employer or a community plan — and, unlike in the U.S., insurers cannot turn down a patient for a pre-existing illness, nor are they allowed to make a profit.

Reid’s journey then takes him to Germany, the country that invented the concept of a national health care system. For its 80 million people, Germany offers universal health care, including medical, dental, mental health, homeopathy and spa treatment. Professor Karl Lauterbach, a member of the German parliament, describes it as “a system where the rich pay for the poor and where the ill are covered by the healthy.” As they do in Japan, medical providers must charge standard prices. This keeps costs down, but it also means physicians in Germany earn between half and two-thirds as much as their U.S. counterparts.

In the 1990s, Taiwan researched many health care systems before settling on one where the government collects the money and pays providers. But the delivery of health care is left to the market. Every person in Taiwan has a “smart card” containing all of his or her relevant health information, and bills are paid automatically. But the Taiwanese are spending too little to sustain their health care system, according to Princeton’s Tsung-mei Cheng, who advised the Taiwanese government. “As we speak, the government is borrowing from banks to pay what there isn’t enough to pay the providers,” she told FRONTLINE.

Reid’s last stop is Switzerland, a country which, like Taiwan, set out to reform a system that did not cover all its citizens. In 1994, a national referendum approved a law called LAMal (“the sickness”), which set up a universal health care system that, among other things, restricted insurance companies from making a profit on basic medical care. The Swiss example shows health care reform is possible, even in a highly capitalist country with powerful insurance and pharmaceutical companies.

Today, Swiss politicians from the right and left enthusiastically support universal health care. “Everybody has a right to health care,” says Pascal Couchepin, the current president of Switzerland. “It is a profound need for people to be sure that if they are struck by destiny … they can have a good health system.”

As the worsening economy leads to massive job losses—potentially forcing millions more Americans to go without health insurance—FRONTLINE travels the country examining the nation’s broken health care system and explores the need for a fundamental overhaul. Veteran FRONTLINE producer Jon Palfreman dissects the private insurance system, a system that not only fails to cover 46 million Americans but also leaves millions more underinsured and at risk of bankruptcy.

At its best, American health care can be very good. For Microsoft employee Mark Murray and his wife, Melinda, their employee health plan paid for eight years of fertility treatments and covered all the costs of a very complicated pregnancy. “If it wasn’t for our health insurance,” Murray says, “we wouldn’t have a baby boy right now.” The Murrays’ medical bills totaled between $500,000 and $1 million, and their plan covered every penny.

But beyond large, high-wage employers like Microsoft, FRONTLINE learns that available, affordable, adequate insurance is becoming hard to find. Small businesses face a very bleak outlook for finding and keeping coverage. Coverage is becoming more expensive and less comprehensive, with high deductibles, co-pays and coverage limits. Georgetown University Research Professor Karen Pollitz explains that for many people, the current system is “like having an airbag in your car that’s made out of tissue paper: I’m so glad that it’s there, but if I ever get in a crash, it’s not going to protect me.”

Outside of employer-based health care plans, matters are even worse. Americans seeking insurance in the individual market must submit to “medical underwriting,” and if they have a pre-existing condition, they will likely be denied. Kaiser Permanente Chairman and CEO George Halverson says frankly: “I could not get insurance. I’ve had heart surgery, and so I am completely uninsurable in the private market. So it’s important that I keep my job.”

Across the U.S., FRONTLINE finds people making life decisions based on health insurance, stuck in jobs because of so-called job lock. One such person is 23-year-old Twin Cities, Minn., resident Matt Johnson, who put his career dreams on hold to get a job at Menards home improvement store because its benefits package covers his ulcerative colitis. Americans even stay in bad marriages, says Pollitz, “because they just can’t afford to divorce their health insurance.”

For those Americans who find health coverage in the private market, there’s no guarantee it will protect them. In 2007, Palm Desert, Calif., realtor Jennifer Thompson received a letter from Blue Cross accepting her for coverage that read: “Congratulations! You have been approved for coverage with Blue Cross of California. … The immediate value of your coverage is peace of mind.” But then Thompson discovered she had a cancer that required surgery, and three days after leaving the hospital, she received a letter from Blue Cross saying that her insurance was “rescinded,” leaving her uninsured and owing more than $160,000 in medical bills. Blue Cross cited Thompson’s previous history of cancer and results from a recent doctor’s visit as the reasons for the rescission. “Our system is not working,” says Professor Pollitz. “It’s designed to cut out on you right when you need it the most.” When questioned about Thompson’s case, Sam Nussbaum, chief medical officer of WellPoint, which owns Blue Cross of California, told FRONTLINE that because of legal considerations, “I can’t speak to that circumstance … but no one likes to see a situation like this. People are buying health security.”

In the past, some states required insurance companies to cover everyone but found that many people waited to buy insurance until they fell ill, causing “adverse selection,” or a higher ratio of unhealthy to healthy people in the insurance pool. As a result, insurance companies stopped doing business in those states. Today, only five states—New York, New Jersey, Massachusetts, Maine and Vermont—guarantee everyone insurance, a “privilege” reflected in premiums. “If we look at the average premium of those states,” says WellPoint’s Nussbaum, “that premium is three times higher on average—maybe $600 to $700 versus a [state] where the insurance market has allowed medical underwriting.”

For some Americans, life becomes a quest to find and keep health insurance. In 1994, Nikki White, a Bristol, Tenn., native with dreams of becoming a doctor, was diagnosed with lupus, a serious but treatable autoimmune disorder. Too ill to work, she lost her health insurance for several years, but then received coverage from the state’s Medicaid program. Soon, budget cuts made her ineligible for the state program. A few months later, White was rushed to the ER with severe lupus complications and racked up nearly $1 million in medical bills. She finally secured insurance under the federal HIPAA law, but her condition was too advanced, and in 2006, at the age of 32, she died. White’s primary care physician, Amylyn Crawford, tells FRONTLINE: “Nikki didn’t die from lupus. Nikki died secondary to the complications of a failing health care system.”

Around the world, other developed democracies offer universal health care, requiring insurance companies to cover everyone. People are mandated to buy it; insurance for the poor is subsidized; and governments control prices by setting the cost of everything from doctors’ salaries and hospital rooms to drugs and MRIs. But efforts to implement similar policies in the U.S. have proven unsuccessful. In 2006, Massachusetts implemented reforms mandating everyone be covered by health insurance, but there are still problems of affordability. FRONTLINE profiles the Abramses, a Massachusetts family of four earning $63,000 annually, who found that although they were too prosperous to receive a health care subsidy, they could not afford to buy a health care insurance policy at around $12,000 a year. “What we’re finding out in Massachusetts,” says veteran insurance industry executive and consultant Robert Laszewski, “you can mandate that people have health insurance, but if it costs more than they can afford, it doesn’t matter.”

As President Obama launches his plan for reforming health care, Kaiser Family Foundation President Drew Altman tells FRONTLINE: “This is the first big opportunity for health reform since … [the] early 1990s. And a question is again, pointedly, whether we will blow the opportunity again this time or [whether] we will actually get it all done or get something significant done.” But consultant Laszewski wonders if Americans have the will to make it happen. “Every doctor I meet says he’s underpaid. I’ve yet to meet a hospital executive who thinks he or she can operate on less. I have yet to meet a patient who is willing to sacrifice care. So we have this $2.2 trillion system, and I haven’t met anybody in any of the stakeholders that’s willing to take less. And until we’re willing to have that conversation, we’re just sort of nibbling around the edges.”

This documentary takes a critical look at the history and future of public space, from the Agora to shopping malls to NYC’s community gardens. The film unravels the complicated relationship between public and private control over space, featuring scholars, activists and community organizers, including Don Mitchell, Sharon Zukin, Galen Kranz, and Rob Robbins.

I Lost My Job is a short documentary film that explores technological unemployment, a phenomenon already affecting, and poised to affect, many people’s lives. The documentary also examines what we as a society can do about it by analyzing a possible transitional direction.

The subject raises pressing questions. What are the social consequences of ongoing technological unemployment within our current economic system? And how do we respond when the process seems inevitable, as machine automation and new technologies continue to take over repetitive jobs?

It’s the mystery of mysteries — especially to parents — the unpredictable and sometimes incomprehensible moods and behaviors of the American teenager. Generations of adults have pondered its cause. Hormones? Rock music? Boredom? Drugs?

In “Inside the Teenage Brain,” FRONTLINE chronicles how scientists are exploring the recesses of the brain and finding some new explanations for why adolescents behave the way they do. These discoveries could change the way we parent, teach, or perhaps even understand our teenagers.

New neuroscience research has shown that a crucial part of the brain undergoes extensive changes during puberty — precisely the time when the raging hormones often blamed for teen behavior begin to wreak havoc. It’s long been known that the architecture of the brain is largely set in place during the first few years of life. But with the aid of new technologies such as magnetic resonance imaging (MRI), scientists are mapping changes in pre-teen and teenage brains and finding evidence that remarkable growth and change continue for decades.

The vast majority of brain development occurs in two basic stages: growth spurts and pruning. In utero and throughout the first several months of life, the human brain grows at a rapid and dramatic pace, producing millions of brain cells.

“This is a process that we knew happened in the womb, maybe even in the first 18 months of life,” explains neuroscientist Dr. Jay Giedd at the National Institute of Mental Health. “But it was only when we started following the same children by scanning their brains at two-year intervals that we detected a second wave of overproduction.”

This second wave — occurring roughly between ages 10 and 13 — is quickly followed by a process in which the brain prunes and organizes its neural pathways. “In many ways, it’s the most tumultuous time of brain development since coming out of the womb,” says Giedd.

Confronted by these new discoveries, academics, counselors, and scientists are divided on just what all this means for children.

“Our leading hypothesis … is the ‘use it or lose it’ principle,” Jay Giedd tells FRONTLINE. “If a teen is doing music or sports or academics, those are the cells and connections that will be hardwired. If they’re lying on the couch or playing video games or [watching] MTV, those are the cells and connections that are going to survive.”

But others voice caution in leaping to conclusions about the implications of these findings.

“The relationship between desired behaviors and brain structure is totally unknown,” John Bruer tells FRONTLINE. He is president of the James S. McDonnell Foundation and author of The Myth of the First Three Years. “This simple, popular, newsweekly-magazine idea that adolescents are difficult because their frontal lobes aren’t mature is one we should be very cautious of.”

This FRONTLINE report also looks at research that is helping scientists understand another puzzling aspect of adolescent behavior — sleep.

Mary Carskadon, director of the E.P. Bradley Hospital Sleep Research Laboratory at Brown University, has spent years mapping the brains of sleepy teens. She has calculated that most teens get about seven and a half hours of sleep each night, while they need more than nine. Some say these sleep debts can have a powerful effect on a teen’s ability to learn and retain new material — especially abstract concepts like physics, math, and calculus.

Despite all the new scientific research, “Inside the Teenage Brain” suggests that there is a consensus among experts that the most beneficial thing for teenagers is good relationships with their parents. Even Dr. Giedd wonders about the kinds of lessons parents can draw from his science. “The more technical and more advanced the science becomes, often the more it leads us back to some very basic tenets. … With all the science and with all the advances, the best advice we can give is things that our grandmother could have told us generations ago: to spend loving, quality time with our children.”

Ellen Galinsky, a social scientist and the president of the Families and Work Institute, has seen scientific fads come and go. But she says her research for a book about children shows there are enduring lessons for parents. Drawing on her interviews with more than a thousand children, she found that, to her surprise, teens were yearning for more time and more communication with their parents, even when they seemed to be pushing them away. She told FRONTLINE, “Even though the public perception is about building bigger and better brains, what the research shows is that it’s the relationships, it’s the connections, it’s the people in children’s lives who make the biggest difference.”

On the day after Martin Luther King Jr. was murdered in April 1968, Jane Elliott’s third graders from the small, all-white town of Riceville, Iowa, came to class confused and upset. They recently had made King their “Hero of the Month,” and they couldn’t understand why someone would kill him. So Elliott decided to teach her class a daring lesson in the meaning of discrimination. She wanted to show her pupils what discrimination feels like, and what it can do to people.

Elliott divided her class by eye color — those with blue eyes and those with brown. On the first day, the blue-eyed children were told they were smarter, nicer, neater, and better than those with brown eyes. Throughout the day, Elliott praised them and allowed them privileges such as taking a longer recess and being first in the lunch line. In contrast, the brown-eyed children had to wear collars around their necks, and their behavior and performance were criticized and ridiculed by Elliott. On the second day, the roles were reversed, and the blue-eyed children were made to feel inferior while the brown eyes were designated the dominant group.

What happened over the course of the unique two-day exercise astonished both students and teacher. On both days, children who were designated as inferior took on the look and behavior of genuinely inferior students, performing poorly on tests and other work. In contrast, the “superior” students — students who had been sweet and tolerant before the exercise — became mean-spirited and seemed to like discriminating against the “inferior” group.

“I watched what had been marvelous, cooperative, wonderful, thoughtful children turn into nasty, vicious, discriminating little third-graders in a space of fifteen minutes,” says Elliott. She says she realized then that she had “created a microcosm of society in a third-grade classroom.”

Elliott repeated the exercise with her new classes in the following year. The third time, in 1970, cameras were present. Fourteen years later, FRONTLINE’s “A Class Divided” chronicled a mini-reunion of that 1970 third-grade class. As young adults, Elliott’s former students watch themselves on film and talk about the impact Elliott’s lesson in bigotry has had on their lives and attitudes. It is Jane Elliott’s first chance to find out how much of her lesson her students had retained.

“Nobody likes to be looked down upon. Nobody likes to be hated, teased or discriminated against,” says Verla, one of the former students.

Another, Sandra, tells Elliott: “You hear these people talking about different people and how they’d like to have them out of the country. And sometimes I just wish I had that collar in my pocket. I could whip it out and put it on and say ‘Wear this, and put yourself in their place.’ I wish they would go through what I went through, you know.”

In the last part of “A Class Divided,” FRONTLINE’s cameras follow Jane Elliott as she takes her exercise to employees of the Iowa prison system. During a daylong workshop in human relations she teaches the same lesson to the adults. Their reactions to the blue-eye, brown-eye exercise are similar to those of the children.

“After you do this exercise, when the debriefing starts, when the pain is over and they’re all back together, you find out how society could be if we really believed all this stuff that we preach, if we really acted that way, you could feel as good about one another as those kids feel about one another after this exercise is over. You create instant cousins,” says Elliott. “The kids said over and over, ‘We’re kind of like a family now.’ They found out how to hurt one another and they found out how it feels to be hurt in that way and they refuse to hurt one another in that way again.”