Saturday, May 27, 2006

Does anyone else out there find it strange that the media is treating Elizabeth Vargas' demotion as ABC's nightly news anchor as a complex triple tragedy of tanking ratings, job loss, and pregnancy?

Here is the Washington Post, describing the replacement of a "pregnant woman with an older, more experienced man." Here is the New York Times describing Vargas feeling " 'an enormous amount of sadness' that a job to which she had aspired for some time had slipped from her grasp." Everywhere, Vargas' pregnancy, her second, is linked to the end of her career.

But—and here's where the story gets sort of interesting—everyone is spectacularly and explicitly clear that Vargas was not forced from behind the anchor's desk; she asked for it: She explains that it's been a "difficult pregnancy" and the doctors have ordered her to ramp it down. She is quoted as saying that, "Every woman has the right to make that decision for herself and her family without anybody judging it. … It's just what's right for me now. … I would hesitate to draw any large conclusions about working women or working mothers."

Of course that is just what everyone is racing to do. We are desperate to draw large conclusions about working women and mothers—our bookshelves are groaning under the weight of those broad conclusions and sweeping theories. So why not use Vargas as a litmus test for it all?

Do you have to, Dahlia? If Vargas is a litmus test "for it all" then are we all supposed to be just like Elizabeth Vargas? For otherwise this makes no sense at all.

Yet this is what I constantly find: womanhood as some sort of a homogeneous substance, kneaded and rolled out into billions of identical gingerbread women. Whatever one woman does is somehow a sign of what all other women will do. Or rather, whatever one woman fails to do somehow proves the failings of all women. We don't treat men like this.

I don't actually think that Dahlia meant the Vargas case to be taken that way, but I wanted to put in all that gingerbread stuff. What she may have meant by the reference to a litmus test is exactly what I despaired over in the previous paragraph: the idea that all women are somehow part of the same homogeneous woman-substance and that one can stand for all in every way.

It's not too farfetched to suggest that Vargas may make different decisions about her career than other women would were they in the same place. On the other hand, Lithwick may have a point if the Vargas demotion is not really a voluntary one but a carefully staged move by the employer who doesn't care for the idea of any pregnant woman as the sole anchor of a serious news program:

Most of the news accounts go to great pains to explain that in fact all this is about bigger things than Vargas' belly: ABC recently slipped into third place in the evening news ratings; the experiment with Vargas and her co-host, Bob Woodruff, as the younger, hipper anchors failed when Woodruff was injured in Iraq and Vargas was left to carry the show all alone; the networks are all bracing for the September descent of Katie Couric at CBS. All plausible. And yet every account also mentions Vargas' looming pregnancy. If this was just a business decision, why cloud it with a great big dogfight about breeding women in the media?

"Why cloud it with a great big dogfight about breeding women in the media?" Because a business decision which consists of demoting a pregnant woman while keeping her injured co-host's seat open indefinitely might not go down well with some of the show's audience. It's much nicer if Vargas demotes herself for all the family values.

I called this post "the curious case" because only those negotiating over the deal know what really happened. Was Vargas demoted or did she ask to be demoted? We don't know. But what we do know is that she is going to be replaced by an older man who will not get pregnant any time soon.

Politics is a funny bidness sometimes. For example, when the popularity of a leader plummets he may suddenly have to resort to totally new tactics, such as apologizing for being a macho man. That's what happened to George Bush recently. He and his sidekick Tony Blair gave a press conference with the message "We Are Sorry! So Sorry! Now Like Us Again!", and George took back his famous cowboy statements:

But in an unusual admission of a personal mistake, Mr. Bush said he regretted challenging insurgents in Iraq to "bring it on" in 2003, and said the same about his statement that he wanted Osama bin Laden "dead or alive." Those two statements quickly came to reinforce his image around the world as a cowboy commander in chief. "Kind of tough talk, you know, that sent the wrong signal to people," Mr. Bush said. "I learned some lessons about expressing myself maybe in a little more sophisticated manner."

What would have been a more sophisticated manner? "Advance upon us, if you may" instead of "bring it on"? No, the problem was not just in what George said but that he also acted in the exact manner those early statements reflect. He saw the war as a cockfight or a computer game. Something without lots of dead civilians, in any case. Where was the necessary foresight or planning? Nowhere, it seems. Foresight and planning are not as macho as photo ops of a president clad in a flight suit.

Mr. Blair, whose approval levels have sunk even lower than Mr. Bush's, said he particularly regretted the broad decision to strip most members of Saddam Hussein's Baath Party of their positions in government and civic life in 2003, leaving most institutions in Iraq shorn of expertise and leadership.

You know what? I worried about this before the war had even started, and I'm not especially well informed about Iraq. But if Saddam Hussein's party was the Baath Party it seemed fairly obvious that you had to be a member of it to get any civil service jobs, and that getting rid of all the Baathists would leave the country in anarchy.

Why didn't these two brave leaders of the Free World worry about it any earlier? Did they listen to any experts in the area? Did they ever plan for an occupation longer than a few months?

It's not enough for these two men to come out now and say that they are sorry about the whole mess but that we should help them get out of it. It's not enough, because they are not some semilunatic goddess blogger who has no power over people's lives. They are the people responsible for running vast countries and they should have known better. The costs of their stupid mistakes are mouldering in graves right now and can't hear the belated apologies.

But the press conference wasn't just about apologies. It also contained a little bit more macho huffing and puffing:

Mr. Bush called the terrorists in Iraq "totalitarians" and "Islamic fascists," a phrase he has used periodically to give the current struggle a tinge of the last great American-British alliance, during World War II.

I'd call the terrorists demented madmen, myself, but it's interesting that Bush uses the qualifier "Islamic" when calling the enemy fascist, perhaps to distinguish them from the Christian variety of fascists. And the reference to World War II makes me frightened, because I sometimes think that Bush would like to be remembered as the man who started World War III.

Have you noticed how almost everything is now made in China? Check the labels on your clothing and electronics or on your shoes, and the chances are that you have a lot of Chinese stuff around.

I did some retail therapy during my vacation and at one point I started a game of trying to find something made in a country other than China. Points were awarded for anything made in, say, Pakistan or Indonesia or even the Northern Mariana Islands. The jackpot was to find something made in the United States. By the end of the day I had found exactly one item of clothing (a leather belt) which was born here in the U.S.

Now, this is interesting. The theory of international trade teaches us that countries should specialize according to their comparative advantage, but it's hard to believe that the Chinese have such an advantage in practically everything. And having most of our consumption goods produced by one foreign country is a tad dangerous. I have these visions of China suddenly refusing to sell us anything (and also refusing to finance our trade deficit). All these Americans running around wearing nothing but leather belts...
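Comparative advantage is easy to illustrate with a toy calculation (all the numbers below are invented for illustration, not taken from the post): even when one country is absolutely better at making everything, each country still has a lower opportunity cost in something, and trade theory says that's where it should specialize.

```python
# Toy illustration of comparative advantage (all numbers invented).
# Suppose country A can make 10 shirts or 5 belts per worker-day,
# and country B can make 6 shirts or 4 belts. A is better at both,
# yet each should specialize where its opportunity cost is lower.

def opportunity_cost(shirts_per_day, belts_per_day):
    """Belts given up for each shirt produced."""
    return belts_per_day / shirts_per_day

cost_a = opportunity_cost(10, 5)  # 0.5 belts per shirt
cost_b = opportunity_cost(6, 4)   # ~0.67 belts per shirt

# A gives up fewer belts per shirt, so A should make shirts and
# B should make belts, despite B being worse at both in absolute terms.
specialist_in_shirts = "A" if cost_a < cost_b else "B"
print(specialist_in_shirts)  # A
```

The puzzle the post points at is exactly this: the theory predicts specialization along these lines, not one country ending up with the lower opportunity cost in nearly every consumer good.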

Blogging topics are like that: iridescent soap bubbles leaving a pipe and floating uncertainly upwards until they !pop!. And then they are done. This is the sad aspect of blogging, its ephemerality, and also its good aspect, because the old mistakes are buried in no time at all. But right now I'm more annoyed by the fact that my vacation coincided with some events I wanted to write about, and now it's too late to do so.

For example, the fascinating question about the religious dress code in Iran and what it means to have this particular topic pop up in the Western media right around the time when George Bush wants more ammunition, and how the initial story appears to be false and how that means...what? That we can all sigh in relief because it's just the Iranian women who must follow a strict dress code? Because that specific dress code is an internal matter for the Iranian state, but one which would single out Jews or Christians would not be? Because women are "owned" in some sense? You get the point, and I wanted to make it when the topic was beautifully iridescent and still floating around. Now that it's just a wet patch on some journalist's face my point is lost.

Then there is the story about the Clintons and their marriage, so important that it had to be put on the front page of the New York Times, so important that it had to be commented on extensively by David Broder in the Washington Post who said this about Hillary Clinton:

The two sides of Hillary Rodham Clinton -- the opposites that make her potential presidential candidacy such a gamble -- came into sharp focus Tuesday morning at the National Press Club.

For the better part of an hour, the senator from New York held forth in a disquisition on energy policy that was as overwhelming in its detail as it was ambitious in its reach.

But the buzz in the room was not about her speech -- or her striking appearance in a lemon-yellow pantsuit -- but about the lengthy analysis of the state of her marriage to Bill Clinton that was on the front page of that morning's New York Times.

I'm confused. Are the two sides of Hillary Rodham Clinton her great knowledge base and her lemon-yellow pantsuit or are they her great knowledge base and the question how often she and Bill have sex? Or does she have three opposite sides: intelligence, pantsuits and Bill's penis needs? All of these seem to frighten Broder. It would probably be better to have a female candidate who is not smart or knowledgeable, who wears pinstripes and who has no husband at all. But then these journalists would write about her hidden lesbianism. Oh wait, they already do that with Hillary...

Broder is wading into some no-no areas here, unless he's willing to do a similar analysis of the marriages of male candidates for the job of the president of the United States. And yes, I know that the Clintons' marriage has been fair game for over ten years now, but it's still wrong to respect the privacy of other political marriages while attacking one of them.

Perhaps this isn't a soap bubble, after all. It smells a little different to me, like something from Karl Rove's little arsenal of smears.

Wednesday, May 24, 2006

With the inexplicably popular novel The Da Vinci Code back on top as the certainly-a-blockbuster movie version comes out, I thought it might be time to look back and wonder, once again, what bothered me so dang much the first time I came across this pretentious mess of second-rate historical revisionism.

At first, I thought it might be the utter lack of any female characters with more depth than the cardboard box I ate last night’s pizza out of – in a novel with almost sickly pretenses of feminist grandeur. But no, that wasn’t really it.

Then I thought it could be the tired (and very well-described) association of imperfect bodies with evil minds – because, as we all know, if you use a cane or can’t sunbathe in June, you must be the scion of satan’s minions, or some other such offensive inanity. But no – as bothersome as that was, that wasn’t really quite the gist of what rubbed me so much the wrong way.

And then I stumbled on it, in the most unlikely of places. Because I can’t seem to stop my Saturday-afternoon trash novel indulgence, I was recently reading John le Carré’s The Constant Gardener – a novel as far away geographically and philosophically as one can get from Dan Brown, and which yet possesses that same excuse of a plot device that pretends to answer all the questions of the universe: One White Dude. Yes, that’s it – One White Dude.

Without ruining too much of either plot (for those gullible fans among us who haven’t yet ponied up the cash to pay off your library fines and borrow the books – or, god forbid, actually buy one of them: you might want to stop reading now) both novels ultimately find the root of evil in the dullest of places. Both take on the great evils of their realm – the founding lies of the Christian church in the former, the fatal machinations of the industrial-pharmaceutical complex in the latter – and after a couple hundred pages of intrigue and the rising hooded face of evil darkening our doors, what we’re left with is, put simply, One White Dude. One Bad White Dude. As it turns out, it’s not the thousand-year history of the church that foments Dan Brown’s ultimate evil, or the legacy of imperial rule over the black body in Africa that flows from le Carré’s righteously-angered pen: it’s just One White Dude.

This is rich material. This is the territory of great philosophers and rabid socialists and radical reformers of the status quo hegemonic capitalist-patriarchal establishment. This is the heady stuff from which novelists can turn from mere commentators into shapers of a transformed reality. And yet, neither one managed to cough up more than one pale white man to play fall guy to the corruption of a humanity gone bad.

In other words, I wanted something juicy. I wanted world systems theory and a big hit of post-patriarchal punch wrapped into a novel I could digest between afternoon tea and midnight snack and still be satisfied at breakfast the next morning. But instead of digging into the trove of tarnished treasure and singing out the screaming indictment of human decay, Brown and le Carré backed down when it counted most. Instead of reaming through the lies and the corruption with the laser-clean cut of diamond through cheap glass, all I got was One White Dude. Who’s the villain? Not post-imperialist capitalism, not hetero-patriarchal cultural appropriation, not hegemonic neo-colonialism, none of that delicious stuff. Just One White Dude.

And therein lies my great disappointment. These are the sorts of authors that pick their topics with just enough acumen to lend them some street cred – but without ever having to do the dirty work of wondering at (never mind actually questioning) the way in which those One White Dudes and their ilk rise to the corrupting power, the way in which those One White Dudes represent something deeper, grander, more powerful, and far more sinister than what’s hidden only inside their own pale skins. Nope, this is what counts as cultural criticism for the masses these days: just One White Dude. And once he’s vanquished…well, there’s nothing left to see here, folks. Move along.

Just don’t trip over that pesky hetero-imperialist hegemon on your way out the theatre door.

Prevention is a good thing, right? The health care system should spend more on prevention. That way we'd save money, prolong lives and avoid pain and suffering. Yes, probably. But sometimes it pays to look at concepts from a different angle, to abstain from the instant emotional reaction and to ask some hard questions, and I feel like doing that with prevention.

Take flossing. Suppose that you floss five minutes a day. That means spending more than a month flossing over the next forty years. Is flossing the best way to spend that month? What if you took the same time and meditated or performed jumping jacks instead? How would the health benefits compare?
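The flossing arithmetic checks out, as a quick back-of-the-envelope calculation shows (a sketch, assuming exactly five minutes every day for forty years):

```python
# Back-of-the-envelope check of the flossing arithmetic in the text.
minutes_per_day = 5
years = 40

total_minutes = minutes_per_day * 365 * years   # 73,000 minutes
total_days = total_minutes / (60 * 24)          # convert minutes to days

print(round(total_days, 1))  # 50.7 -- indeed "more than a month"
```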

The point of this example is not that you should stop flossing and grow green moss over your teeth. The point is that prevention also has its costs, and sometimes these costs are considerable in time and perhaps also in loss of enjoyment. We are told that exercising is good for health, and it probably is. But is it still good if the person doing the exercising hates every single minute of it?

There is something puritanical in the American fascination with prevention. If it hurts it must be good for you, so you should eat lots of bran while running around the kitchen and flossing. Then you will live forever, and if you do not, well, it's your own sinful lifestyle that caused you to die. Sadly, nobody has yet managed to get out of this life alive, and in that sense all prevention is in vain. But we like to pretend that if we only cut out all the fat and the caffeine and the chocolates (!) it just might be possible to live forever. And those who fail to do so must have been bad. Perhaps they had too many hamburgers or eclairs. In any case, they deserved to die.

That puritanical whiff is something I intensely dislike, partly because I'm totally addicted to chocolate, but mostly because it's unbecoming. But prevention has other problems, and one of them is that its costs are rarely addressed. The assumption is that prevention saves money for the health care sector, and it may* do so, or at least some types of prevention do, but the people doing the preventing will incur costs in money (buy dental floss and a skipping rope), in time (exercise four hours a week), and in psychological adjustments (learn to love cabbage).

And then there is the much bigger problem of establishing when prevention actually works. It's such a nice idea, prevention, that we'd love to just assume that it will always work. But it may not, and only proper medical studies can find out whether certain preventive measures are efficient.

Doing such studies can be difficult. Think about trying to establish whether being physically active reduces depression. If you are depressed you won't feel like being physically active, so finding a correlation between the two doesn't necessarily mean that physical activity causes less depression. To establish that, one must study people who are not yet depressed and randomly divide them into two groups: a control group allowed to live as they usually do, and a group assigned a physical exercise program. Then one must follow the two groups for quite a long time to find out whether the depression rates differ. All this is also quite expensive.
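The randomized design described above can be sketched in a few lines (a hypothetical illustration; the group sizes and depression counts below are invented placeholders, not real study data):

```python
# Minimal sketch of a two-arm randomized trial: start with people who are
# not yet depressed, randomly assign them to a control group or an
# exercise group, then compare depression rates after the follow-up period.
import random

random.seed(0)
participants = list(range(1000))  # hypothetical study IDs
random.shuffle(participants)      # randomization breaks the self-selection

control = participants[:500]      # live as they usually do
exercise = participants[500:]     # assigned a physical exercise program

def depression_rate(depressed_count, group):
    """Share of a group diagnosed with depression at follow-up."""
    return depressed_count / len(group)

# The counts below stand in for what years of follow-up would measure.
rate_control = depression_rate(80, control)    # 16% (invented)
rate_exercise = depression_rate(60, exercise)  # 12% (invented)
print(rate_control > rate_exercise)  # True -- in this invented example
```

Because assignment is random rather than self-selected, a difference between the two rates can be read as an effect of the exercise program itself, which is exactly what a simple correlation cannot show.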

Or consider what is sometimes called secondary prevention: the use of screening tests such as mammography to detect illnesses early. The rationale of such screening is that early diagnosis improves the effectiveness of treatment. But does it? Note that it's not enough to find that people who have been diagnosed early appear to live longer with the disease, as this is one obvious consequence of finding out about the illness earlier. Now, mammography has been shown to improve treatment outcomes in breast cancer, but whenever a new screening tool becomes available we should not just assume that it's an improvement.

In any case, prevention is like playing a game of chance. What you try to do is raise the odds that you will live long and happy. But this may not work out. You might run a marathon every day, eat nothing but cabbage and such, and get hit by an SUV during the twentieth mile of your daily run.

Which means that perhaps we should take prevention with a pinch of salt or some laughter. Or some nice dark chocolate.

__________________________________________________________

*May, because people who live a very long time often end up in nursing homes, and these are expensive to run.

Today was election day in my small town. There were only three school bond proposals to decide and unfortunately they don't have a chance in hell of passing, but I went to vote just the same. Because I can. I turned down the absentee ballot option because I wanted to go vote at the poll and I was sure access here, at this time, wouldn't be a problem.

With September primaries quickly coming up, the fiasco of Florida's hanging chads still haunting election judges everywhere, and the requirements to provide fully accessible voting for all varieties of disabled people, there's a considerable amount of voting angst among public officials and private citizens who keep up on voting issues.

HAVA, the federal Help America Vote Act of 2002, requires that every polling place in the country provide a voting system that persons with disabilities can use independently and privately. Much voting for disabled people has been known to occur at a table in public, with one or two poll workers assisting with the voting procedure. This system lacks privacy and provides no way for blind citizens to know if the poll workers truly marked the ballot as instructed.

Enter the machines. Since HAVA means every voting district in the country needs some way to meet federal requirements, many business opportunities sprouted for manufacturers of electronic voting machines. But acquiring voting machines that satisfy requirements for disability access, voter trust, and accuracy has been a nightmare for voting officials around the country. Citizens are suing the states for better set-ups, states are suing the companies manufacturing the machines for failures of all kinds, and September looks closer than ever.

It seems certain that disabled voters will be the ones to bear the brunt of this problem. In New York City, there will be just five polling places where disabled people can hope to find total access this fall. That's one polling site in each borough for a population of people largely dependent on public transportation that doesn't do well accommodating them either.

One solution to this whole mess that seems to be gaining currency is voting by mail. Absentee voting is being expanded to "permanent" absentee voting and then to "no excuse" absentee balloting and voting by mail for all. Many claim it's a much better system and supposedly many disabled people would prefer to always vote by mail.

I think it's a bad idea. Oh, it might be smart in the short term while the numerous problems with voting machines get sorted out, but in the long term it's maybe bad for democracy and certainly bad for the disabled. If the solution to problems of accessibility is to not require anyone to show up, then all the churches and rec centers and other polling sites that are not currently accessible will have less pressure to become so. And all the poll workers who would have been trained on how to interact with disabled people to help them vote will never be trained. And all the disabled people who rarely get out of the house because of Medicare homebound laws* and lack of transportation will have one less reason to interact with the world. All this equals less accessibility and freedom for the disabled in the long run.

Additionally, I believe that maximal privacy, and the assurance that disabled people actually cast the votes they choose, can only be provided at polling sites. This may be true for many women as well, if they are in coercive relationships. A private vote taken at a public place ensures society's most vulnerable citizens the freedom to make their own political decisions. Should disabled persons require human assistance to vote after all, at least it is legally required that someone impartial -- or two people, one from each party -- assist. If privacy must be sacrificed in any way, as it most certainly will be for many severely disabled people if everyone votes by mail, there should be neutrality built into the assistance.

Of course, discrimination against disabled voters hasn't been resolved even though the ADA is now 16 years old. There's no reason to expect that any future public outcry about voting by mail -- if there is one -- will center on the rights of disabled persons. But there are other reasons it remains a bad idea.

__________________________________________________________

*From an article at New Mobility (italics mine): In 2002, at the 10-year anniversary of the ADA implementation... President Bush (announced), "Today Medicare recipients who are considered homebound may lose coverage if they go to a baseball game--which, of course, I encourage them to do--or meet with a friend or go to a family reunion. So today I announce we're clarifying Medicare policy. So people who are considered homebound can occasionally take part in their communities without fear of losing their benefits."

Sunday, May 21, 2006

The U.S. Supreme Court has just declined to hear arguments in a case that I think is worth noting, about the rights of gay parents who separate from their child's biological parent. The Boston Globe has a good story on the case here, which expresses the legal issues in the case more clearly than some of the other articles out there, and the Seattle Post-Intelligencer has a story with some additional details here. Essentially, while the two parties were a couple, Britain conceived a child through artificial insemination. Carvin stayed home to care for their daughter after she was born; the child called her Mama. When the couple split up in 2001, Britain refused to allow Carvin to see the child. (She also, in an unusual twist, married the sperm donor, referred to in the articles as a "gay friend" of the couple.) Carvin went to court; while the articles don't clarify exactly what rights she sought, the Washington Supreme Court held that despite the lack of a biological relationship she could try to prove that she was the child's de facto parent, her parent in everything but blood, a status that would entitle her to parental rights.

Britain argued in response that the Washington court's ruling interfered with her constitutional right to make decisions for her daughter; it's that argument that the Supreme Court has recently declined to hear, allowing the lower court's ruling to stand. Her attorney says the following (in the Seattle Post-Intelligencer):

"I think it's inevitable the Supreme Court is going to take one of these cases," said Jordan Lorence, who represented Britain for the Alliance Defense Fund. "There is nothing that says family law is exempt from federal constitutional scrutiny."

My initial reaction is to say that the constitutional argument here is relatively weak, but I imagine that it's true that the Supreme Court will eventually hear one of these cases. It is worth noting, though, that it can't rewrite Washington's family law unless it finds it to be unconstitutional, which in my opinion would be extreme under the circumstances.

I think the Washington court reached the right result. Britain's attorney engages in some mild scaremongering on the topic, as follows:

They argued that the Washington ruling opened the door for all kinds of people -- from live-in boyfriends to roommates -- to claim parental rights.

"Under the court's ruling, a child could have an unlimited number of parents," said Britain's Seattle attorney, Kristen Waggoner.

I have to admit, the possibilities here do not strike me as all that frightening. First, seeing a woman who was partnered with the child's mother and was the child's primary caretaker until she reached the age of 7 as a de facto parent doesn't really appear to lead to the conclusion that a parent's roommate--who didn't participate in the decision to conceive the child and hasn't acted as a parent or occupied the role of a parent in the child's life--would also be treated as a de facto parent. I'm also not especially fazed by the idea that a child could have (gasp!) more than two parents. In fact, I think that recognizing the possibility that for some children more than two adults do play parental roles might usefully clarify some aspects of family law.

But most importantly, I think that the core of the ruling is correct. I don't believe that blood should be our only touchstone for parenthood. Barbara Bennett Woodhouse has written, in a slightly different context, that “at a time when children are suffering no shortage of begetting but face a serious shortage of care, our laws on fathering consistently place small value on nurturant, interdependent conduct, and instead overvalue ownership through procreation.”* She uses the Dr. Seuss story of Horton hatching the egg to suggest that the act of fathering a child shouldn't grant absolute parental rights in the absence of care and effort--but care and effort should lead to some right to continued involvement in a child's life, genetics notwithstanding. My work with domestic violence victims--which has often taken the form of fighting against granting biological fathers visitation or custody--has convinced me that common genes aren't enough to make a person a parent. And my friendships with gay and lesbian men and women who are, or will be, parents to children to whom they aren't related--as well as my strong relationships with my stepparents on both sides of my family--show me that family isn't adequately defined by our genes.
