July 13, 2014

Between 1990 and 2002, the caesarean section rate among primiparous women living in cities in China rose from 18% to 39%, and close to two thirds of urban women now give birth by caesarean section. Although the rate in rural China has risen more slowly, it is now thought to be above 25%.

Data from WHO suggest that nearly half of all births in China were delivered by caesarean section in 2007-08, which is three times higher than WHO's recommended proportion of 15%. Although the rate of caesarean section in China had decreased to about 42% by 2010, it is still the highest worldwide. ...

There are five reasons other than medical necessity that might explain the high rate of caesarean section in China. First, some women's concerns about pain and vaginal tone after vaginal birth: in their opinion, caesarean section is safer, faster, and less painful, and is less likely to affect the quality of sexual life than vaginal birth. Second, some women wrongly believe that they are more likely to regain their prepregnancy shape after caesarean section than after vaginal birth. Third, Chinese mothers like to choose a delivery date on the basis of luck and belief, and it is easier to deliver on a scheduled day by caesarean section than by an unplanned vaginal birth. Fourth, some doctors do recommend caesarean section to women in view of the present uneasy doctor-patient relationship and possible lawsuits. Furthermore, caesarean section is financially profitable for the hospital. For example, in large Chinese cities such as Beijing, the price is about 6000 RMB (US$1000) for vaginal birth, whereas it is at least 12 000 RMB ($2000) for caesarean section in some top-level hospitals. Fifth, the increasing incidence of macrosomia in China, attributable to the increasing prevalence of diabetes and obesity in women, and the increasing number of pregnancies in older women will further increase the rate of caesarean section.

Slate writes:

One of the leading causes of cesareans in countries such as the United States—hospital policies discouraging subsequent vaginal births by women who have had C-sections—isn’t an issue in China, thanks to the one-child policy. ...

For modern expectant women, by contrast, the combination of the one-child policy and feverish economic development has yielded an environment in which they—and the in-laws and husbands who have so much riding on a single birth—fear any potential misstep. ... [Afnan, of the] Obstetrics and Gynecology Department at Beijing United Family Hospital, said that “with the one-child policy, people don’t want to take any risks.” And many in China mistakenly believe cesareans to be safer for both mother and child. “As much as I try to tell patients what the evidence shows,” Afnan continued, “it’s not really so easy to convince them.” ...

[T]he one-child policy’s most significant contribution to China’s high C-section rate may lie in having erased any fear of the complications cesareans create for later deliveries. After all, the cesarean’s most significant risks to the mother—like a ruptured uterus or hemorrhaging caused by abnormalities in the placenta—only arise with later births.

May 25, 2014

Just saw that General Wojciech Jaruzelski died at age 90. Completely unrelated, I noticed yesterday that General Paul von Lettow-Vorbeck died at age 93 in 1964. And I've been googling a good amount of 20th century military history, and lots of generals make it into their late 80s and 90s. And many of them did not have easy lives, including Jaruzelski and Lettow-Vorbeck.

There are studies of pre-20th century life indicating that being a minister was the occupation with the highest life expectancy, and that being in the military was not so good. But I'm wondering how high-ranking officers who make it out of the service do in retirement.

The story cites two recently published studies and one review paper. Only one of the studies looks at mortality:

Some of his recently published papers reviewing the latest research
suggest that regular marathon running increases the risks of an abnormal
heartbeat, damage to heart tissue, and hardening of the arteries. Other
research suggests that those who train hard every day don’t live as
long as those who run at a more moderate pace a few days a week.

In a February study,
Danish researchers followed nearly 1,900 runners for three decades and
found that those who jogged slowly for up to 2½ hours a week lived about
six years longer on average than those who ran longer and faster.
Swedish researchers reported in June
that elite cross-country skiers who had the fastest times in a 56-mile
ski marathon or those who competed in the greatest number of those
marathons were also 30 percent more likely than their fellow competitors
to be hospitalized for an irregular heartbeat.

“When there’s enough smoke, there’s usually some fire,” Thompson
said. “This may be a small fire, but I think most of us believe there’s
cause for some concern.”

I had a vague memory of having looked at the Danish study when it came out and not having put up a blog post. But I couldn't remember what I had wanted to write. And I do have my wishful thinking to contend with, having just signed up for a 50 mile race and still hoping to put in more fast running once my recent running injury is further in the past. So I pulled an ungated version of the Danish paper. And what do I find? First, back to the Boston Globe version above:

That doesn't look good for 'fast' or 'frequent' runners...but what is 'fast' or 'slow'? For bad runners a 'fast' pace may be what a good runner considers 'slow'. Hey, my standard for fast is faster than 7:10 min/mile, and I don't do most of my running at that pace, or even 10% or 5% of it (my injury may in part be due to trying to do one fast mile per day). So the vast majority of my running is slow, i.e. 8:30-10 min/mile. On the other hand, my 'slow' definitely overlaps with my wife's 'fast'.
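As a side note on units, those pace thresholds convert to other common measures with simple arithmetic. A minimal sketch; the cutoffs are the ones from the paragraph above, not anything from the paper:

```python
# Convert min:sec-per-mile paces into mph and min/km.
KM_PER_MILE = 1.609344

def pace_to_mph(minutes, seconds=0):
    """Speed in mph for a given per-mile pace."""
    per_mile_min = minutes + seconds / 60
    return 60 / per_mile_min

def pace_to_min_per_km(minutes, seconds=0):
    """Per-km pace in minutes for a given per-mile pace."""
    return (minutes + seconds / 60) / KM_PER_MILE

# The 'fast' threshold and the easy range mentioned above:
print(f"7:10/mile = {pace_to_mph(7, 10):.2f} mph = {pace_to_min_per_km(7, 10):.2f} min/km")
print(f"8:30/mile = {pace_to_mph(8, 30):.2f} mph")
print(f"10:00/mile = {pace_to_mph(10):.2f} mph")
```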

But what does the relevant part of the paper say? Here CI is 'confidence interval'.

Jogging pace and mortality

In a subanalysis of the fourth survey, the hazard ratios
adjusted for sex and the confounders in model 2 were 0.37
(95% CI: 0.12, 1.17) for slow pace (178 joggers, 3 deaths),
0.53 (95% CI: 0.29, 0.95) for average pace (704 joggers,
12 deaths), and 1.22 (95% CI: 0.49, 3.04) for fast pace
(201 joggers, 5 deaths), compared with those for nonjoggers.
This analysis comprised only a few deaths among the
joggers and should be met with caution, but the results
suggest that a slow or average pace could be related to the
lowest mortality.

Frequency of jogging and mortality

A subanalysis of the fourth survey yielded similar results
for frequency of jogging with hazard ratios of 0.40 (95%
CI: 0.15, 1.10) for ≤1 time per week (323 joggers, 4
deaths), 0.40 (95% CI: 0.16, 0.98) for 2–3 times per week
(474 joggers, 5 deaths), and 1.24 (95% CI: 0.51, 3.02) for
>3 times per week (84 joggers, 5 deaths), compared with
values for nonjoggers. Hence, these data should also be
interpreted with caution. A frequency of jogging of ≤3
times per week was associated with the lowest mortality,
and we could find no increase in survival with >3 jogging
sessions per week compared with nonjogging.

That's it. 178 slow joggers with 3 deaths, 201 fast joggers with 5 deaths. Reallocate one death and the difference between slow and fast jogging mortality hazards goes away...And they don't have enough 3+ times a week joggers to determine much of anything, even with 5 deaths. And I don't see any significance testing for trend effects in jogging speed or frequency, which is standard for this sort of paper --- probably because it is obvious that they don't have any trend effects at any conventional level of significance.
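The fragility of those subgroup counts is easy to see with crude per-jogger death fractions. This is a rough sketch using only the counts quoted above; the paper itself uses person-time and covariate-adjusted Cox hazards, which these fractions ignore:

```python
# Deaths and group sizes from the quoted pace subanalysis.
groups = {"slow": (178, 3), "average": (704, 12), "fast": (201, 5)}

def crude_fraction(n, deaths):
    """Crude deaths-per-jogger; ignores follow-up time and covariates."""
    return deaths / n

for name, (n, d) in groups.items():
    print(f"{name}: {d}/{n} = {crude_fraction(n, d):.4f}")

# Move a single death from 'fast' to 'slow' and the ordering flips:
print(crude_fraction(178, 4))  # about 0.022 -- 'slow' with one extra death
print(crude_fraction(201, 4))  # about 0.020 -- 'fast' with one fewer death
```

With the published counts, slow (3/178) looks better than fast (5/201); shift one death and slow looks worse. That is the whole point about these confidence intervals being driven by a handful of events.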

Come on. This sort of data shouldn't move your prior belief much.

The AF (atrial fibrillation) study also doesn't seem to show much...yes, endurance athletes get more AF and have heart remodelling; we already know this. The question is what effect that has. Some forms of AF are bad, but are these ones bad?

I do agree that jogging should be fun and not 'stress'. Beyond that, don't worry too much. If going 50 miles/week with some nice fast stretches is fun, so be it.

Update #1: also, anybody who thinks slow and occasional joggers have less than half the mortality of non-joggers, 90% of the population, including all other non-running leisure time athletes, is kidding themselves. Light jogging, and if I had to guess, heavier jogging, doesn't add 6.2 years to male life expectancy holding everything else constant. That's some combination of sampling error, unobserved confounders, reverse causation, and bad modeling.
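For scale, here is a back-of-envelope Gompertz calculation of what a hazard ratio of 0.4 would imply if it really held over a whole lifetime. The parameters are hypothetical, picked only to give a baseline life expectancy near 80; nothing here comes from the Danish paper:

```python
import math

# Assumed Gompertz baseline hazard h(x) = A * exp(B * x); hypothetical values.
A, B = 2e-5, 0.1

def life_expectancy(hazard_ratio, dx=0.01, max_age=120):
    """Life expectancy at birth under a constant hazard ratio, by
    numerically integrating the Gompertz survival function."""
    e0, x = 0.0, 0.0
    while x < max_age:
        # Gompertz survival: S(x) = exp(-HR * A/B * (exp(B*x) - 1))
        s = math.exp(-hazard_ratio * A / B * (math.exp(B * x) - 1))
        e0 += s * dx
        x += dx
    return e0

base = life_expectancy(1.0)
joggers = life_expectancy(0.4)
print(f"baseline e0 ~ {base:.1f}, HR 0.4 e0 ~ {joggers:.1f}, "
      f"gain ~ {joggers - base:.1f} years")
```

A lifelong hazard ratio of 0.4 mechanically buys something like nine years of life expectancy, so a 6-plus-year estimate is roughly what you get by taking the subgroup hazard ratios at face value -- which is exactly why they shouldn't be taken at face value.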

The study just doesn't have enough power here to say much. And I'm concerned that their parametric risk adjustment is doing too much work, given how few deaths they have, lots of population heterogeneity, and strong functional form assumptions that aren't tested.

Ah, now I remember my first reaction to the paper: is there an age-graded reporting bias on pace? Say old runners say they run 'fast' since they run faster than other old guys, but younger runners say they run 'slow' since they run as fast as other young runners. Reference groups change.

Also, can you take a flat slope for 'hours per week' vs mortality seriously, while at the same time having a steep slope for 'runs per week' vs mortality? One 4 hour run per week is safe, but four half hour runs are dangerous?

Update #2: on a more technical note, did they have one young, fast, high-mileage runner die on them? That would explain everything, including the high weight on that one occurrence (from the runner's low age) --- I'm not sure how small sample confidence intervals work in this sort of Cox model. Not what I do at work...though I do feel I should put my foot down more when I see these sorts of models go by...

May 8, 2013

I should dig out the associated mortality. From JAMA 2007. I'm trying to figure out what the risk factors for CKD are and how much of this prevalence they account for (i.e. how much is 'excess' prevalence).

September 29, 2012

More generally, I'm just confused about COPD. One issue is that it has sliding diagnostic boundaries and may not be a terribly unified disease.

More later...

Update #1: here is some relatively clean looking data (lots of time-series data has obvious breaks). Note that US men and women are now leading developed countries in COPD mortality, a new phenomenon. It's not like the French or Poles don't smoke.

May 20, 2012

1) Sleep deprived children don't do well. Kids were overnighted with (different) friends so we could go off to run our half-marathon. Kids #1 and #2 came back with a severe lack of sleep. Mayhem ensued, probably the worst joint kid-parent blow-out ever. Or at least close to it. Kid #1 threatened to call 911 to complain about child abuse, so I had to pull the landline to make sure she didn't. Kid #2 got in a fight with her mother over cleaning up, and from there had a tantrum and kicked kid #1, who then took it out on her mother and kid #3. And overreacted to that. Everybody is fine now, it seems. Early (i.e. regular) bedtime was enforced with no pushback (okay, minor pushback from kid #2).

2) There was a large bubonic plague pandemic in the late 19th century and early 20th century that killed several million people and had significant public policy/political impact, including providing a public health context for racial segregation worldwide (including in the US South's final big push for segregation in the 1890s). More on that later.

3) There are (were?) Muslim tribes in Yunnan. Still looking into this...related to the outbreak of the bubonic plague due to a mid-19th century Muslim rebellion and the ensuing refugee flows and public health breakdown.

Update #1: one problem with the weight/mortality studies is that almost all of them (not all, though) don't have a long history of weight for their subjects. That is, they only have the weight of the subject at middle age or late middle age or older, and then the mortality observations over a subsequent time period, usually something like 5 to 20 years. This makes it even harder to disentangle the health-weight correlation: lots of people are low weight (due to both unintentional and intentional weight loss) because of illnesses that go on to kill them. It isn't that being low weight causes mortality; the things that cause mortality also cause low weight, including by making people adopt healthy lifestyles. Having a longer history of subject weight might alleviate this problem: light 25 year olds are not, for the most part, light because they are unhealthy (but who knows?). Light 55 year olds are already, it seems, a much more mixed picture.

Also an issue: it does seem like higher lean body mass is associated with lower mortality, holding body fat (not body fat percentage) constant. But here it is mostly that the bottom quartile of lean body mass has problems; anybody above the cut point seems fine. But it is hard to be above the bottom quartile (or is it the bottom quintile?) of lean body BMI if you have a BMI of 20 or less.

Next we have Body Fat and Fat-Free Mass and All-Cause Mortality, a Danish study of 50+ year olds. Here the main figure is hard to read, but it also shows (dark lines for men, light lines for women) that low lean body BMI is bad, here up to around the median lean body BMI. Bottom line: being light because you have low muscle mass is bad. This does not necessarily reflect causation; again, health issues that kill you may also cause low muscle mass. Knowing more about causality would be nice.

On this scale for old people I'm doing fine: at my current (higher than intended post-Christmas) weight my lean BMI is around 18.7 and my fat BMI is 3.3. So maybe I should get my lean BMI up to 19? Should be doable, that's only 3 lbs more muscle...harder to do if you want to lose 10 lbs overall. These guys have more fat than me, but they are older too...
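The lean-BMI arithmetic here is easy to check. Since BMI is mass in kilograms over height in metres squared, a lean-BMI increment converts to mass once a height is fixed; the heights below are hypothetical stand-ins, since none is given in the post:

```python
# Convert a change in (lean) BMI into pounds of body mass at a given height.
KG_PER_LB = 0.45359237

def bmi_delta_to_lbs(delta_bmi, height_m):
    """Mass change (lbs) implied by a BMI change at a fixed height."""
    return delta_bmi * height_m ** 2 / KG_PER_LB

# Going from lean BMI 18.7 to 19.0 (+0.3) at a few assumed heights:
for h in (1.75, 1.83, 1.90):
    print(f"height {h} m: +0.3 lean BMI = {bmi_delta_to_lbs(0.3, h):.1f} lbs")
```

At plausible heights this comes out around 2 to 2.5 lbs, so '3 lbs more muscle' is in the right ballpark.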

January 21, 2012

From the CDC -- not sure what sort of smoother they are applying to the data here, I suspect they are potentially oversmoothing. Still, it looks like the fraction of Americans at a BMI of 20 has halved in just 25 years.

I think I'm around 13% right now, but it could be 15% or 16% as well (or is there lots of adipose tissue lurking internally?)...I'm not doing any Dexa scan. Still, even 16% puts you in the 5th percentile of 20-39 year old men, and 18% puts you in the 5th percentile of 20-39 year old non-hispanic white men.

June 20, 2011

On any given day, most U.S. adults report performing predominantly sedentary and light activities, according to a new study published in the October issue of the American Journal of Preventive Medicine. Overall only 5.07% report any vigorous intensity activity. ..... researchers from the Pennington Biomedical Research Center, Baton Rouge, Louisiana, used data collected between 2003-2008 from close to 80,000 respondents to the American Time Use Survey (ATUS). This nationally representative telephone-based survey captures activities that people recall doing during the preceding 24 hours. These data were coupled with published Metabolic Equivalent (MET) intensity values in order to group activities into sedentary, light, moderate, and vigorous categories. While most Americans engage in sedentary activities such as eating and drinking (95.6%), followed by watching television/movies (80.1%), and light activities such as washing, dressing, and grooming oneself (78.9%), and driving a car, truck, or motorcycle (71.4%), most did not engage in moderate or vigorous activities. The most frequently reported moderate activities were food and drink preparation (25.7%), followed by lawn, garden, and houseplant care (10.6%). The most frequently reported vigorous activities were using cardiovascular equipment (2.2%) and running (1.1%).

What I want to know is how much the exercise equipment I see in the houses I look at gets used.

The BLS also writes up some data from the ATUS. They find that on any given day only 16% of Americans do any exercise, including (voluntary) 'walking'. Surprising to me is how many people go to the gym to lift weights.

June 15, 2011

Fairfax County, Virginia, has the highest white male life expectancy in the US (Institute for Health Metrics and Evaluation via the WP). Not only that, Fairfax County has a higher white male life expectancy than any country has for male life expectancy. Fairfax County's white male life expectancy is 80.9 years, while Iceland, the world leader in male life expectancy, is at 80.2 years. BTW, Japan isn't in the very top group here (see below). Not clear to me how much measurement error there is in these local life expectancy estimates.

At least I'm living in the right US county, even by global standards. Here are the top US counties for white male and white female life expectancy. And do scroll down for more after this table...

The headline for this data is, via Kevin Drum, that life expectancy has fallen in the last decade in many parts of the US. And the declines are in 'bad areas'. The largest increases are in wealthy, highly educated metro areas like the NYC suburbs, the DC suburbs, or the Bay Area suburbs, which already have the highest life expectancy in the US. This fits with the observation that Increases in Adult Lifespan in the US are Very Unequally Distributed by Income from a few days ago.

In 737 U.S. counties out of more than 3,000, life expectancies for women declined between 1997 and 2007. For life expectancy to decline in a developed nation is rare. Setbacks on this scale have not been seen in the U.S. since the Spanish influenza epidemic of 1918, according to demographers.

"There are just lots of places where things are getting worse," said Dr. Christopher Murray, director of the Institute for Health Metrics and Evaluation at the University of Washington, which conducted the research. "We're not keeping up."

....A key finding of the data is that "inequality appears to be growing in the U.S.," said Eileen Crimmins, a gerontologist at USC who also co-chaired the 2011 National Academies panel on life expectancies. "We are different than other countries."

The map is below.

Why is this? Just from a variance decomposition into age-specific death rates: is this driven by mortality among the young or the old?

Also interesting: Japan's position as the country with the highest life expectancy in the world is due to the life expectancy of Japanese women -- Japanese male life expectancy is only #10 in the world, shorter than in countries like Switzerland, Canada, Sweden, or even Israel or Australia! Though the differences are small.

April 11, 2010

Street trees weren’t always as allergenic as they are today. Back in
the 1950s, the most popular species planted in the United States was
the native American elm, which sheds little pollen. Millions of these
tall, stately trees lined the streets of towns and cities from coast to
coast. Sadly, in the 1960s and ’70s, Dutch elm disease killed most of
the elms, and many of them were replaced with species that are highly
allergenic.

This has caused trouble for Americans with allergies
— as many as 30 percent of adults and 40 percent of children — most of
whom are sensitive to pollen, as well as for the many millions who have
allergy-induced asthma. Although some pollen can be carried great
distances by the wind, most atmospheric pollen comes from plants
growing nearby. In other words, the pollen that’s making you sneeze as
you walk down the street probably came from the tree you just passed. So
it makes sense for gardeners, especially public gardeners who plant
trees by the dozens, to pay attention to the pollen their trees produce.

Some trees shed huge amounts of highly allergenic pollen; others
produce very little, or their pollen is only moderately irritating.
Female plants produce no pollen at all. But arborists rarely take this
into account. In New York City, street trees are selected only for their
hardiness in winter; their resistance to disease, insects and drought;
their ability to withstand smog; and their size, shape and color.

The pollen that causes the most severe allergic reactions comes from a
few so-called monoecious species of trees, which have both male and
female flowers, and from the males of separate-sexed (dioecious)
species. Many arborists and landscapers like to plant male trees and
shrubs because they’re “litter-free” — that is, they produce no seeds or
seedpods. But male trees shed lots of pollen; that’s their job. And
once it’s released, it can be blown around for months.

In New
York City, about 30 percent of the
street trees are Norway maples and London planes, both monoecious
kinds that always produce allergenic pollen. And of the total 5.2
million trees growing on the city’s private and public lands, some
300,000 are male mulberry trees and almost 100,000 are box elders,
mostly also male — making it all the more important to reduce the number
of allergenic trees along the streets.

Another problem with New
York City street trees is that there are so few kinds of them. Only 10
species of trees account for nearly three-fourths of the total. That
means New Yorkers are repeatedly exposed to the same kinds of pollen,
which increases the likelihood that they will develop allergies. City
arborists could set a healthy example for property owners by increasing
the diversity of street tree species and choosing low-pollen kinds.

October 18, 2009

I finally found the footnote in McPherson's Battle Cry of Freedom that states 50,000 civilian deaths for the US Civil War. There is no documentation, except for a) laying out the claim that most 19th century wars have a much higher ratio of civilian to military deaths (civilian deaths one to two times the number of military deaths), and b) there don't seem to be that many civilian deaths in the documents McPherson knows, except for one typhoid outbreak in Wilmington. I also found a statement by Drew Gilpin Faust from 2001 saying the current literature seems to run the risk of understating civilian deaths. I should take a look at her current book, but I doubt she's going to do any demographic work; that's not the kind of work she does. Who does? This is not an obscure topic and it's not like there's no documentation -- we even have administrative records. This should be an answerable question, unlike, say, civilian deaths in Russia during the Napoleonic Wars.

If US Civil War civilian deaths are so low, why is this? It seems weird. Civil Wars aren't known for low civilian death rates and typhoid isn't unknown in the 19th century South.

BTW, according to wiki, citing older sources whose methods aren't clear (are people just picking numbers that 'sound reasonable'?), for the Napoleonic Wars "military deaths are invariably put at between 2.5 million and 3.5 million, [figures for] civilian death tolls vary from 750,000 to 3 million." I'd also not appreciated the size of the 1813 'Völkerschlacht bei Leipzig' -- four days, 520,000 men fielded, 80,000-110,000 casualties, not clear how many died, maybe around 50,000-60,000 or more; hard to tell, in part because, as usual, typhoid spread among military casualties and civilians. 'Bei' is a bit of a misnomer; this was more 'all around and in' Leipzig. Biggest land battle until WWI.

A 2004 article in the Würzburger medizinhistorische Mitteilungen claims a 1% civilian loss rate -- 250,000 civilian deaths in a population of 25 million -- from war-associated typhoid in the winter of 1813/14. Methods are unclear. Würzburg, Aschaffenburg, and Mainz were hit in particular. But why would you get war-associated typhoid in these cities? Taking care of the wounded? I can see lots of infected and parasite-ridden people moving about.

Although the United States was the first nation to introduce a regular census (taken decennially from 1790 onwards), vital registration was left to state and local governments. Consequently, it was instituted unevenly. A variety of churches kept parish records of baptisms, burials, and marriages, and these have been used to construct demographic estimates for the colonial period, especially for New England and the Middle Atlantic regions. Although some cities (e.g., New York, Boston, New Orleans, Baltimore, Philadelphia) began vital registration earlier in the 19th century, the first state to do so was Massachusetts in 1842. An official Death Registration Area (DRA) consisting of ten states and the District of Columbia was only successfully established in 1900, and data collection from all states was not completed until 1933. A parallel Birth Registration Area (BRA) was only instituted in 1915, and collection for all states was also achieved in 1933. A significant number of “Registration Cities” outside the DRA and BRA were also included in the data reporting until 1933. The federal census did collect mortality information with the censuses of 1850 to 1900, but there were significant problems with completeness. The data do improve over time, and, after 1880, census information was merged with state registration data [Condran and Crimmins, 1979]. Nothing similar, however, was undertaken for birth data.

October 13, 2009

I was looking around for casualty numbers for the American Civil War, and there is little out there on civilian casualties. What were the excess civilian deaths from this war? The ratio of military deaths from combat (including later effects of wounds) vs. non-combat related disease seems to be in the 1:2 area. It's my understanding that less than 5% of African American Union deaths resulted from combat related causes. How does this compare to European 19th century wars? What did, say, the Napoleonic Wars look like, or the US Revolutionary War? Is this in part because the South was tougher terrain to fight in back then? All of which suggests there is lots of scope for excess civilian deaths.

October 10, 2009

Those who lack health insurance now are far more likely to live in
states that usually vote Republican — the states whose senators and
representatives are least likely to support a law to extend coverage.

That
would seem to indicate that Republican constituents are the ones who
would most benefit from passage of universal health insurance coverage.
But an analysis of Congressional districts within those states
indicates that those without health insurance are much more likely to
live in strongly Democratic Congressional districts. Many of those
contain large minority populations with relatively low incomes.

Texas is the #1 example. It would be good to know more. These are stark differences, not marginal differences in insurance coverage and voting at the Congressional district level. (map)

Update #1: looking at the map (which is for the population with income below 200% of the poverty line), there is lots of detail that is non-obvious. First, there are good and bad states that aren't what you'd expect given their region. All of New England is good, but NY is bad, especially downstate, but also upstate (and compare it to the improvement you get when you cross the border into equally depressed upstate Pennsylvania). New Jersey is bad also. Alabama and Tennessee are good. Colorado is bad and Utah is good (well, maybe that's not surprising). Urban areas are generally worse than rural areas, see for instance Virginia. Some of this -- how much? -- is simply where recent immigration is, most of it hispanic; see the urban/rural differences and the states: Texas, Florida, New Mexico, California (including the distribution within these states). The midwest does well in part because recent hispanic immigration didn't go there (is this even true?). And again, what is the deal with Alabama?

Update #2: Jody comments:

There are a ton of Latino immigrants in the upper Midwest now. In a lot of those highly-insured Minnesota counties, the biggest employers are the schools and the county governments. That's a pretty reliable route into insurance. (Also Minnesota has had a robust insurance program for low-income children since the 1980s.)

Yes, there has been a large influx of hispanic immigrants into the Midwest by past standards for that region, but Minnesota isn't a state that has received a lot of this immigration. States with high hispanic populations in the Midwest are Kansas with 8.9% and Nebraska with 7.5%. Minnesota is 4.0%, compared to a national figure of 15%. Connecticut is 11.7%, NY 16.1%. And then there are places like North Dakota with 1.4% or Maine with 1%.

My point is that Minnesota still has a small hispanic population.

Going further, I think the high fraction of uninsured recent hispanic immigrants does shape the politics of this issue, see the starting quote of this post. In some places this is an 'us vs. them' issue, see the infamous 'you lie' outburst by Rep. Wilson.

A quick google got me these two figures. It would be interesting to know what these figures look like once one accounts for age, income, occupation, industry, job tenure etc. I'm surprised that hispanics do so much worse than blacks. Is it higher public employment among blacks, while hispanics are in traditionally non-public industries (agriculture, construction, service work)? And this incidence of uninsured status does, I think, help explain some of the politics here, at least the Republican opposition.

There's also a very steep gradient in insurance status by citizenship: 12.9% of native born citizens are not insured, 18% of naturalized citizens, and 44.7% of non-citizens.

October 7, 2009

Not that this is the last word on the topic, but I'm glad somebody noticed and wrote a paper, since I've been puzzling about this myself. (via Razib)

Part of the abstract reads:

Population health did not decline and indeed generally improved during
the 4 years of the Great Depression, 1930–1933, with mortality
decreasing for almost all ages, and life expectancy increasing by
several years in males, females, whites, and nonwhites. For most age
groups, mortality tended to peak during years of strong economic
expansion (such as 1923, 1926, 1929, and 1936–1937). In contrast, the
recessions of 1921, 1930–1933, and 1938 coincided with declines in
mortality and gains in life expectancy. The only exception was suicide
mortality which increased during the Great Depression, but accounted
for less than 2% of deaths.

Update #1: Jody comments:

Workplace mortality? Those were eras of industrial malfeasance.

6.2 years would be a lot of workplace mortality. My impression from the literature is that alcohol and tobacco consumption play a big role here. Still, the effect seems too large and too uniform across demographic groups.

September 20, 2009

Gregory Paul argues that high religiosity is not universal to human
populations, and it is actually inversely related to a wide range of
socio-economic indicators representing the health of modern
democracies. Paul holds that once a nation's population becomes
prosperous and secure, for example through economic security and
universal health care, much of the population loses interest in
seeking the aid and protection of supernatural entities. This effect
appears to be so consistent that it may prevent nations from being
highly religious while enjoying good internal socioeconomic conditions.

National
level statistics suggest that strong mass religiosity is invariably
associated with high levels of stress and anxiety, which are created by
impoverishment, inequality, or economic insecurity, related to high
levels of societal dysfunction. These relationships are largely
consistent when the United States, an outlier amongst advanced
democracies in the high level of both religious belief and social
decay, is removed from the comparison.

When the government killed all the pigs in Egypt in an attempt to
combat swine flu, it was warned the city would be overwhelmed with
trash. Now, it is. ....

People do not take their garbage out. They are accustomed to seeing someone collecting it from the door.

For
more than half a century, those collectors were the zabaleen, a
community of Egyptian Christians who live on the cliffs on the eastern
edge of the city. They collected the trash, sold the recyclables and
fed the organic waste to their pigs — which they then slaughtered and
ate.

Killing all the pigs, all at once, “was the stupidest thing
they ever did,” Ms. Kamel said, adding, “This is just one more example
of poorly informed decision makers.”

When the swine flu fear first emerged, long before even one case was reported in Egypt, President Hosni Mubarak ordered that all the pigs be killed in order to prevent the spread of the disease.

When health officials worldwide said that the virus was not being passed by pigs, the Egyptian government said that the cull was no longer about the flu, but about cleaning up the zabaleen’s crowded, filthy neighborhood.

That was in May.

Today the streets of the zabaleen community are as packed with stinking trash
and as clouded with flies as ever before. But the zabaleen have done
exactly what they said they would do: they stopped taking care of most
of the organic waste.

Instead they dump it wherever they can
or, at best, pile it beside trash bins scattered around the city by the
international companies that have struggled in vain to keep up with the
trash.

“They killed the pigs, let them clean the city,” said
Moussa Rateb, a former garbage collector and pig owner who lives in the
community of the zabaleen. “Everything used to go to the pigs, now
there are no pigs, so it goes to the administration.” ...

“The state is troubled; as a result the system of decision making is
disintegrating,” said Galal Amin, an economist, writer and social
critic. “They are ill-considered decisions taken in a bit of a hurry,
either because you’re trying to please the president or because you are
a weak government that is anxious to please somebody.”

Pigs were the champion garbage consumers in Cairo. Goats just don't seem up to the task.

August 3, 2009

Lots of press reports about how people are not getting enough sun exposure and the resulting health problems, in part mediated by vitamin D deficiency. Today in the WP. Not sure what to make of this or where we fall in this spectrum. I sure don't get much sun when I'm working.

The researchers and others blamed the low levels on a combination of
factors, including children spending more time watching television and
playing video games instead of going outside, covering up and using
sunscreen when they do go outdoors, and drinking more soda and other
beverages instead of consuming milk and other foods fortified with
Vitamin D.....

The analysis and an accompanying federal study also found an
association between low Vitamin D levels and increased risk for high
blood pressure, high blood sugar, and a condition that increases the
risk for heart disease and diabetes, known as the metabolic syndrome.

Taken together, the studies provide new evidence that low Vitamin D
levels may be putting a generation of children at increased risk for
heart disease and diabetes, two of the nation's biggest health problems
that are also increased by the childhood obesity epidemic.

After looking at mortality spikes in Sweden's history, I went looking for similar spikes in US history. Though I don't have a good link now (here are unadjusted crude death rates), it appears that the Great Depression did not, surprisingly, have much of an impact on US mortality, despite the great dislocations and material suffering it caused. Surprising in light of the absence of Social Security, Medicare and Medicaid and the presence of severe fiscal crises for whatever institutions provided assistance. It would be nice to know more. Indeed, we may already be in the modern regime where economic downturns reduce mortality (less drinking, less smoking, less driving).

We may be facing some offsetting effects here, with more deaths in some population groups and less in others.

Birth rates, on the other hand, did fall substantially, then more than recovered in the 1950s.

January 5, 2009

From the Human Mortality Database, survival by age for the US, Germany, Japan and Russia for 2005. Russians are as likely to live to 40 as Japanese to 68.

Let's do the death hazard rate as well (log scale for the annual death hazard). No wonder kids in their early teens think they are immortal, with a death hazard under 1/10,000 per year or, if this hazard remained stable, a life expectancy over 10,000 years. And this is the death hazard for all people -- not just well situated healthy kids. Also note that the US death hazard is much higher than the Japanese or German one for people in their twenties, around twice as high. That is not a small difference in the way the world works.
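A quick sanity check on that back-of-the-envelope number: under a constant annual hazard, lifetime is geometric and expected remaining life is the reciprocal of the hazard. A minimal sketch:

```python
# With a constant annual death hazard h, the year of death is geometric:
# P(die in year t) = h * (1 - h)**(t - 1), so expected remaining life is 1/h.

def life_expectancy_constant_hazard(h: float) -> float:
    """Expected remaining years if the annual death probability stays at h."""
    return 1.0 / h

# An early-teen hazard of 1/10,000 per year, frozen in place, would imply
# roughly 10,000 expected years:
print(life_expectancy_constant_hazard(1 / 10_000))
```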

Also, some time series data for Sweden, the country with the longest history on the Human Mortality Database. Note that annual variability has pretty much gone away after 1945, but is pretty large before then (though it is important to realize that the survival rate here is a cumulative figure given that year's death hazard).

Some of the downward spikes aren't necessarily what you think they are. The biggest downward spike is 1772-73, a large multiyear famine that killed over 100,000 Swedes, with an annual death rate above 5% in 1773, supposedly a renewed impetus for land reform. The last big downward spike is in 1918, the Spanish flu.

Is there any current lore about this Swedish famine? Did it result in an emigration spike from Sweden? Anything else? I would expect an experience like this to leave traces even today. There are also smaller, less deadly later famines in Sweden (1808-10?, 1869, 1899?!, 1903?), but I've not looked into these. 1903 doesn't show up in deaths, but seems to produce a larger humanitarian aid response.

Which is a reminder of how small an impact AIDS has had so far. And the Bird Flu or similar has yet to strike. Even a 1% death rate would be more than 3 million people in the US. I ought to think about how to invest in this sort of situation.

BTW, what is the last US famine with (large) excess mortality? If one has to go back further than in most European countries, why is this?

January 29, 2008

We estimate mortality rates by a measure of socio-economic status in a very large sample
of male German pensioners aged 65 or older. Our analysis is entirely nonparametric.
Furthermore, the data enable us to compare mortality experiences in eastern and western
Germany conditional on socio-economic status. As a simple summary measure, we compute
period life expectancies at age 65. Our findings show a lower bound of almost 50
percent (six years) on the difference in life expectancy between the lowest and the highest
socio-economic group considered. Within groups, we find similar values for the former
GDR and western Germany. Our analysis contributes to the literature in three aspects.
First, we provide the first population-based differential mortality study for Germany. Second,
we use a novel measure of lifetime earnings as a proxy for socio-economic status that
remains applicable to retired people. Third, the comparison between eastern and western
Germany may provide some interesting insights for transformation countries.

The magnitude of the effect seems large, with the top group -- about the top 5% of lifetime earners -- having a 90% chance of making it from age 65 to 75, while the bottom group -- about the bottom 5% of lifetime earners -- has only a 70% chance, making for a three times higher chance of dying. The impressive and noteworthy aspect of the results, to me, however, is that the low life expectancy isn't something about the bottom 5% or bottom 10%, which would make sense in terms of health problems causing both higher death rates and low earnings, but something about the bottom half of lifetime earners. What is the causal mechanism here that generates this pattern? Indeed, note that being in the bottom 10% versus the bottom 50% doesn't seem to make much difference, while moving up in the income distribution from there seems to be associated with monotone increases in life expectancy. Why is this?
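The "three times higher chance of dying" can be spelled out from the survival figures quoted:

```python
# Converting the quoted 65-to-75 survival probabilities into death risks.
p_top, p_bottom = 0.90, 0.70            # chance of surviving from 65 to 75
risk_top = 1 - p_top                    # 10% chance of dying, top earners
risk_bottom = 1 - p_bottom              # 30% chance of dying, bottom earners
print(round(risk_bottom / risk_top, 2)) # -> 3.0
```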

I'd need to read the paper to see how it deals with some obvious heterogeneity issues, like spousal income.

October 11, 2007

The health insurance coverage of the population changes sharply at age 65 as most people
become eligible for Medicare. But do these changes matter for health? We answer this question
using data on over 400,000 hospital admissions for people who are admitted through the
emergency room for “non-deferrable” conditions—diagnoses with the same daily admission
rates on weekends and weekdays. Among this subset of patients there is no discernable rise in
the number of admissions at age 65, suggesting that the severity of illness is similar for patients
on either side of the Medicare threshold. The insurance coverage of the two groups is much
different, however, with a large jump at 65 in the fraction who have Medicare as their primary
insurer, and a reduction in the fraction with no coverage. These changes are associated with
significant increases in the number of procedures performed in hospital, and in the rate that
patients are transferred to other care units in the hospital. We estimate a nearly 1 percentage
point drop in 7-day mortality for patients at age 65, implying that Medicare eligibility reduces
the death rate of this severely ill patient group by 20 percent. The mortality gap persists for at
least two years following the initial hospital admission.

The results sound impressive, but they look less impressive when graphed (I cannot see a break in mortality at 65 by eyeballing this data... okay, there may be a visible break for 7-day mortality, but it doesn't look too significant to me). It's not clear to me how robust these results actually are... time to read the paper? This could be big if it is a robust result.
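For intuition, the paper's comparison boils down to mortality means in narrow age windows on either side of the age-65 Medicare cutoff; a toy sketch with invented admissions (the real paper uses over 400,000 admissions and controls for age trends):

```python
# Toy discontinuity comparison at the age-65 threshold. All data invented.
admissions = [
    # (age at admission, died within 7 days: 1/0)
    (64.2, 1), (64.5, 0), (64.7, 1), (64.9, 0),
    (65.1, 0), (65.3, 0), (65.6, 1), (65.9, 0),
]

below = [died for age, died in admissions if 64.0 <= age < 65.0]
above = [died for age, died in admissions if 65.0 <= age < 66.0]

# A positive gap suggests lower 7-day mortality once Medicare-eligible.
drop = sum(below) / len(below) - sum(above) / len(above)
print(drop)  # -> 0.25 in this toy data
```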

March 21, 2007

The Massachusetts Department of Health put out its annual mortality report: Massachusetts Deaths 2005. One thing that stands out is the hispanic mortality gap, which is the opposite of what you'd expect given income and education levels of hispanics. The differences are huge, more than 10 years, about three times as big as the non-hispanic-white to black gap, which narrows with age, while the hispanic to non-hispanic gap grows with age. More later... and yes, I know some of this is just reverse migration. The issue is how much (and what mortality-related migration means for life expectancy across states in general). The current state of research suggests that reverse migration isn't close to accounting for all of this huge gap (for instance Turra and Goldman, also here, or, more ambiguously, Patel et al., McKinnon and Hummer, Zhang et al.).

Also, the sad flattening out of life expectancy, which is not true, I believe, for the US in general, ah, here is the data...not a puzzle, the pattern for the US looks similar to that in MA.

In 1971 President Nixon declared war on cancer and increased the federal funds allocated to
cancer research dramatically. Thirty years later, many have declared this war a failure. Overall
cancer statistics confirm this view: age-adjusted mortality in 2000 was essentially unchanged
from the early 1970s. At the same time, age-adjusted mortality rates from cardiovascular disease
[CVD] have fallen quite dramatically. Since the causes underlying cancer and cardiovascular disease
are likely to be correlated, the decline in mortality rates from cardiovascular disease may be
somewhat responsible for the rise in cancer mortality. It is natural to model mortality with more
than one cause of death as a competing risks model. Such models are fundamentally unidentified,
and it is therefore difficult to get a clear picture of the progress in cancer. This paper derives
bounds for aspects of the underlying distributions under a number of different assumptions.
Most importantly, we do not assume that the underlying risks are independent, and impose
weak parametric assumptions in order to obtain identification. The theoretical contribution of
the paper is to provide a framework to estimate competing risk models with interval data and
discrete explanatory variables, both of which are common in empirical applications. We use our
method to estimate changes in cancer and cardiovascular mortality since 1970. The estimated
bounds for the effect of time on the duration until death for either cause are fairly tight and
we find that trends in cancer show much larger improvements than previously estimated. For
example, we find that time until death from cancer increased by about 10% for white males and
20% for white women.

The basic idea is that cancer death rates are not going down because CVD death rates have come down so much: the people who are not dying of CVD now have a much higher risk of dying of cancer than the people who previously did not die of CVD, and that drives up cancer death rates by shifting the characteristics of the surviving population. Medicine saves people from CVD death not so that they can go on to die of cancer at the normal cancer hazard, but at a much higher cancer death hazard. I still need to figure out how the authors can figure this out from the data they have and the assumptions they make.
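The selection mechanism can be sketched with a toy dependent-competing-risks model (all hazards and frailty values invented, and this is not the authors' bounds-based estimation method, just the intuition): a shared frailty multiplies both hazards, so saving high-frailty people from CVD raises observed cancer mortality even though the underlying cancer hazard never changes.

```python
# Toy dependent competing risks: a shared frailty multiplier z raises both
# the CVD and the cancer hazard, making the two risks positively correlated.
frailty_groups = [(0.5, 0.5), (2.0, 0.5)]  # (frailty multiplier, population share)

def observed_cancer_rate(cvd_hazard: float, cancer_hazard: float) -> float:
    """One-period cancer death rate when CVD deaths are counted first."""
    rate = 0.0
    for z, share in frailty_groups:
        # You die of cancer this period only if CVD didn't get you first.
        rate += share * (1 - cvd_hazard * z) * (cancer_hazard * z)
    return rate

before = observed_cancer_rate(cvd_hazard=0.10, cancer_hazard=0.05)
after = observed_cancer_rate(cvd_hazard=0.05, cancer_hazard=0.05)  # CVD treatment halves the hazard
print(before < after)  # observed cancer mortality rises as CVD mortality falls
```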

Here are the age adjusted mortality rates by disease:

Or, probably more interesting, the age profiles of death hazards for CVD and cancer:

March 8, 2006

The Harvard Magazine covers Nancy and James Krieger's work on poverty and health in the most recent issue, The People's Epidemiologists. See the pdf version for the photos and graphs.

The intro paragraph and later parts of the article keep referring to poverty raising the premature mortality rate (for deaths under 65) by 30% (or, rather, making the death rate 30% lower if you are far from poverty):

In the city of Boston—and everywhere else—wealth
equals health. If you live in Beacon Hill’s Louisburg Square, which
sits in the federal census tract with the third highest median family
income in Suffolk County—$196,210—you’re sitting pretty. Your risk of
dying before the age of 65 is about 30 percent less than if you live on
Pleasanton Street in Roxbury, about four miles away, where the median
family income is $30,751, and where one-third of residents live (or
more accurately, survive) below the poverty line.

...

Another slide showed a census tract map of Boston, with rising
proportions of “excess” premature mortality—shown in deepening shades
of red—in lower-income districts, compared to the richest census
tracts. In essence, the map portrayed the economic geography of early
death. Krieger cited a figure known euphemistically as the “Population
Attributable Fraction,” or PAF, of premature mortality due to
census-tract poverty. Translated, it says that 30 percent of those who
died under age 65 in the city’s poorest areas would still be alive if
the premature death rates in their neighborhoods had been the same as
those in Louisburg Square.

But I've not read the paper, just glanced at the graphs for about 2 minutes....probably just a definitional issue related to weighting.

Update #1: yes, that's it: the PAF is the fraction of total deaths in the population group that would not have occurred if all people in that population group had the death rate of low-poverty census tracts. This implies a weighting by population in different census tracts. It is not the fraction of deaths that would not occur if the population in the highest-poverty census tracts had the death rate of the low-poverty census tracts. That fraction is much higher than 30% and much more frightful (more like 55-60%).

Update #2: In Painting a truer picture, Krieger et al. report both the “Population Attributable Fraction” emphasized in the Harvard Magazine article and what they call the "Rate Ratio (RR)", or the ratio of the death rate in the poorest and the wealthiest census tract categories (so the RR depends on how these categories are defined). For Massachusetts and Rhode Island (what is the coverage they have?) they find, for premature mortality of all persons, a PAF of 22.2% and a RR of 2.2. So alongside a lower PAF than the 30% cited above we have a RR of 2.2 (or about a mortality reduction of 55% from moving from the poorest to the wealthiest census tract category, about what I guessed above). From skimming, it's not clear at all to me what other conditioning variables Krieger et al. use -- I suspect none, or just age.

One observation here is that census tract in the US is often a much better indicator of permanent income (i.e. wealth, including future labor income) than current income, and may be a very good instrument for picking up individual (not social and economic environment) effects. So I don't think the interpretation of these results is totally clear. An exogenous movement of persons across census tract categories would be nice to look at, but the effects here are going to be very persistent and are not going to show up quickly. And it's hard enough to pick up these effects for phenomena like education outcomes that are much quicker to react to treatments.

Also interesting is that the PAF or RR is much higher for some morbidity other than death. For instance, the PAF for gonorrhea is 71% and the RR is 11.5, and for syphilis the PAF is 72.7% and the RR is 16.9. And this, I'd guess, understates the concentration of these diseases if you conditioned on other observables. A good amount of these differences are driven by racial and ethnic disparities not absorbed by census tract income (as can be seen from the considerably smaller PAFs and RRs within racial and ethnic groups), though this may mostly mean that census tract data understates actual racial and ethnic income disparities.

So why this emphasis on the PAF rather than the RR? Why look at how much overall death rates would fall if all census tracts had the lowest census tract death rate, rather than at how much death rates would fall in the poorest census tracts? Neither number summarizes the actual social choice problem very well, but there must be some implicit (or explicit) reason to prefer the PAF to the RR. I do suspect that the PAF is an attempt to appeal to a wider audience that may care more about everybody and see no particular reason to worry about poor people in particular (there are fewer poor people than lower middle class people), and putting a political coalition together is going to require focusing on a broader social support base than poor people and the altruistic better off. But there must be more going on here than just that. An outlook that emphasizes (correctly?) PAF over RR would seem to deserve some elaboration as well.

January 7, 2006

Set off by the issues raised by the post below on life expectancy by age, what are optimal mortality trade-offs by age? I know, we should not expect unanimity on this matter, but what sort of dominance relationships can we expect between different age profiles for mortality hazards?

For instance, it's not even clear to me that an increase in mortality at an earlier age offset by a decrease in mortality at an older age is always worse: for instance, wouldn't we prefer one extra death of a newborn to one extra death of a two year old, even if that means we are foregoing two years of life for one person? And if this is so, where does this effect stop? That is, how old does the older person have to be that having him die is preferable to having the newborn die? Or, more starkly, how do we feel about five year olds vs. twenty year olds v.s. forty year olds? What else do we need to know about them? What does this mean for health policy decisions? At what age are people at the peak value of their life (where the investments have been made and are now starting to pay off, plus some adjustments for discounting)?

I know, these trade-offs often don't crop up in a stark manner and are often (always? for good reasons?) avoided. Still, at many levels I think reducing the deaths of 80 year olds and having more deaths of people in their middle age doesn't sound like a good trade-off to me (see the US death hazards by age below)... though you need to be clear what the exact weighting of 'deaths' here is... life expectancy at birth does include the death rate at eighty, it just weights it by the (low) survival probability to age eighty.
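That weighting can be made concrete with a toy period life table (the hazard schedule is invented): the same absolute hazard increase costs life expectancy at birth roughly in proportion to the probability of surviving to the age where it hits.

```python
# Life expectancy at birth from a schedule of annual death hazards,
# using the discrete approximation e0 = sum of survival probabilities
# to each age. The hazard schedule is a toy, not real data.

def life_expectancy(hazards):
    e0, surviving = 0.0, 1.0
    for h in hazards:
        surviving *= (1.0 - h)  # chance of reaching the next birthday
        e0 += surviving         # expected person-years lived past each age
    return e0

base = [0.0005] * 60 + [0.02] * 40  # low hazard to 60, higher afterwards

bump40 = list(base); bump40[40] += 0.01  # same-size hazard bump at 40...
bump80 = list(base); bump80[80] += 0.01  # ...and at 80

loss40 = life_expectancy(base) - life_expectancy(bump40)
loss80 = life_expectancy(base) - life_expectancy(bump80)
# The bump at 40 costs more e0 than the bump at 80, because far fewer
# people survive to 80 to be exposed to it.
print(loss40 > loss80 > 0)
```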

January 5, 2006

I've repeatedly come across the claim that national health insurance for the elderly -- Medicare -- has made US life expectancy at age 65 the highest in the world, while US life expectancy at birth is quite low by international standards for developed countries. For instance, I just ran across James Morone making this claim in his 2005 Storybook Truths about America, discussing Louis Hartz's Liberal Tradition, while reading on the train on the way home from work:

Compared to other nations, Americans have terrible
health. Girls in the United States are born with a life
expectancy that ranks twenty-eighth in the world.
Male babies weigh in at thirty-first – in a dead tie with
Belize and Brunei. Among the thirteen wealthiest
countries, the United States ranks last (or nearly last)
in almost every way we measure health: infant mortality,
low birth weight, life expectancy, teenage death
rates, and on and on.

There are a lot of explanations for this dismal data
but most include the 44 million Americans with no
health insurance (another thirty million have insurance
that is flatly inadequate). For fifty years, the left
has proposed a solution – national health insurance.
After all, argue proponents, the United States implemented
universal insurance for people over 65
(through Medicare, introduced in 1966) and American
elders rapidly rose to the top of the cross-national
health comparisons. Every other industrial nation,
continues the argument, insures that everyone can
get basic health care.

American leaders who actually propose national
health insurance, however, quickly find themselves
embroiled in the fight of their lives. Conservative

Senator Dole said 30 years ago he was one of 12 people that voted against Medicare and he was proud of it. A year ago he said, ``I was right then; I knew it wouldn't work.'' American seniors have the highest life expectancy in the world. We need to reform it, not wreck it.

James Carville -- most likely Clinton's source -- also makes this claim repeatedly, but I cannot find a good link now, at least not for any remarks after Clinton's 1996 comments.

This claim always struck me as suspect, and, if true, as very interesting and remarkable, since it would be such a turnaround from the low US life expectancy at birth, even if it were caused by such perverse mechanisms as 'kill the weak and poor off young, leaving the rest to live long once old'.

But, alas, it's a fact that doesn't seem to be true, at least not currently. See for instance the most recent OECD data for Life expectancy at age 65 in 2001 (not my graph or collection of data):

The US isn't doing badly for life expectancy at 65, but it's clearly not the leading country either.

As recently as 1980, the United States led virtually all major
developed countries — with the exception of Canada
— in terms of life expectancy for women at age 65.
U.S. life expectancy for men in 1980 was toward the
middle of the pack. But since then, life expectancy at
age 65 has advanced far more rapidly in essentially all
other industrial nations.
In countries like Japan, France, and
Switzerland, both men and women age 65 now live
longer than 65-year-olds in the U.S. The divergence
is especially great for women, and in 1999 the
average 65-year-old American woman could have
expected to live 1.5 years less than her Swiss
counterpart, 1.8 years less than her French
counterpart, and 2.8 years less than her Japanese
counterpart. In fact, the U.S. ranked 18th in life
expectancy for women [at age 65] among the 30 OECD
countries, [but more than] slightly above Greece, Korea, Mexico, and
most of the former Warsaw Pact countries.

Despite a lower life expectancy at 65, the United States has a
much lower mortality rate at ages 80 and over. While this
mortality rate is still below that of other developed countries,
including Japan, the difference is marginal. Moreover, as late as
the early 1990s the United States was the world’s leader in life
expectancy at age 80. Since then, however, there has been little
improvement in mortality for those 80 years and older and the
United States’ advantage has been lost (Manton and Vaupel
1995 and Vaupel 2003).

The paper has some nice graphs, including graphs that show how the US was overtaken by other countries for life expectancy at 65 over the last 30 years. Since I cannot figure out how to post the graphs here, you'll need to click the paper.

What does it all mean? What struck me as most interesting is that few people, or at least few in the media, seem to have 'fact checked' this supposed fact back in 1996, when it already was apparently false and should have been known to be false given data available then from the mid-1980s and later. Though in part it may depend on what you mean by 'seniors' -- 65 year olds or 80 year olds?