For the past seven years, I’ve been in an “interpenile relationship”—I, the lesser of the two you might say, am circumcised; my partner is not. This contrast between our members is not exactly at the top of our list of concerns. But it is nonetheless interesting how my prepuce came to disappear into a medical waste bin in a bustling New Jersey hospital on some springtime day in 1975, whereas his, by contrast, has remained a fellow traveler all the long way from that tiny Mexican village where he slipped from his young mother’s womb on a chilly December morning in 1981. That womb, incidentally, belonged to a Roman Catholic. The one that I bathed in, the place in which I had my “bones and sinews knitted together,” in the words of Job, was the property of a Jew. So despite neither of us being particularly patriotic nor, certainly, religious today, the organs dangling so differently between us are nevertheless the very incarnations of our parents’ vast cultural differences.

Whatever the reasons that previous generations may have had for choosing to remove their infant sons’ foreskins, they were almost always unconvincing. All else being equal—and let me reiterate that caveat because it’s likely to go unnoticed, with some readers eagerly pointing out to me those rare cases of congenital defects in which circumcision can legitimately improve the quality of life for some males, which is of course true—all else being equal, any dubious benefits derived from religious, social, hygienic, or aesthetic reasons are clearly outweighed by the costs of male circumcision. Because of some rabbi in Hackensack shaking his head over my intact genitalia, my parents went unblinkingly along with the amputation of a fully operational, perfectly healthy, and probably adaptive body part, all to sacrifice an ounce of their son’s tender flesh to a god that he would never believe in anyway.

Today, however, all is no longer equal, and the balance between the relative risks and benefits of male circumcision has clearly shifted in the other direction. That is, it has according to the American Academy of Pediatrics, which just earlier this week put out its revised position statement on infant male circumcision. Here’s the money quote:

Systematic evaluation of English-language peer-reviewed literature from 1995 through 2010 indicates that preventive health benefits of elective circumcision of male newborns outweigh the risks of the procedure. Benefits include significant reductions in the risk of urinary tract infection in the first year of life and, subsequently, in the risk of heterosexual acquisition of HIV and the transmission of other sexually transmitted infections.

Many of our parents, it seems, may have actually made the right decision for the wrong reasons. Although the task force behind the Academy’s reassessment stopped short of advising “routine” and “universal” removal of the foreskin for all newborn males, and stressed that it remains a personal decision to be made by informed parents, its language represents an increasingly unambiguous endorsement of male circumcision among the world’s leading health organizations (including the World Health Organization and UNAIDS). By contrast, many of the world’s leading parents remain skeptical of the findings reviewed by the Academy, questioning both the methodologies and the generalizability of studies conducted overwhelmingly with African populations, in which rates of infection are dramatically higher than those in the US. (For more information on this research, as well as a description of the physical factors responsible for the reduction of HIV acquisition in circumcised males, see my earlier discussion at Scientific American.) The more vocal “intactivists,” who’ve long been protesting what they regard as an antiquated, cruel, and unnecessary ritual act against little boys that is just as abhorrent as female clitoridectomy, have also responded bitterly to this newest AAP development, seeing fresh strands in an ongoing web of conspiracy between the major health organizations, third-party insurance companies implementing the policy views of these organizations, and greedy practitioners who mislead parents about the benefits of circumcision only to reap insurance payouts for “mutilating” children’s genitals.

Keith Kloor is a freelance journalist whose stories have appeared in a range of publications, from Science to Smithsonian. Since 2004, he’s been an adjunct professor of journalism at New York University. You can find him on Twitter here.

When it comes to climate change, the bad news pummels you the way Mike Tyson, in his prime, pummeled opponents into submission. The onslaught is so relentless that sometimes I just want to crumple into a heap and yell: Make it stop! The latest beat-down, for example, is news of the record ice shrinkage in the Arctic. That seems to have shaken up a lot of people.

But before everyone sinks into catatonic despair, I want to return to a recent piece of stunningly good news on the climate front. Perhaps you saw the headline several weeks ago: “U.S. carbon emissions drop to 20-year low.”

Alas, there was a catch. The biggest reason for the decline, as the AP reported, “is that cheap and plentiful natural gas has led many power plant operators to switch from dirtier-burning coal.”

If this isn’t the definition of quandary, I don’t know what is. Gas emits much less carbon than coal (probably between 25% and 50% less), which is a net plus on the global warming ledger. And shale gas, in case you hadn’t heard, is entering a golden age; it is abundant and newly retrievable across the world, not just in the United States. It’s the bridge fuel to a clean energy future that liberal think tanks and university researchers were touting just a few years ago. Given the political stalemate on climate change, one energy expert gushed in a recent NYT op-ed: “Shale gas to the rescue.”

But a grassroots backlash to the relatively new technology (hydraulic fracturing) that unlocks shale gas has set in motion powerful forces opposed to this bridge getting built. Leading climate campaigners, citing concerns about industry practices and continued reliance on fossil fuels (even if less carbon intensive), are now a big part of the growing anti-fracking coalition. Mainstream environmentalists have also jumped on that bandwagon.

Thus the battle lines are drawn, with enviros and climate activists digging in their heels against a shale gas revolution that could pay big climate dividends. This is a story in and of itself. Now a new twist promises to make it even more interesting. Earlier this week, Michael Bloomberg, the billionaire philanthropist and New York City mayor, gave the Environmental Defense Fund (EDF) a $6 million grant for its work “to minimize the environmental impacts of natural gas operations through hydraulic fracturing.” The grant follows on the heels of a Washington Post op-ed that Bloomberg co-authored with a gas industry executive. In the piece, they champion the environmental and economic benefits of natural gas, while also calling for more stringent fracking rules and better industry practices.

Keith Kloor is a freelance journalist whose stories have appeared in a range of publications, from Science to Smithsonian. Since 2004, he’s been an adjunct professor of journalism at New York University. You can find him on Twitter here.

Myths about the Hero Twins, one of whom is shown holding a bow here, are an important part of Navajo identity.

In certain circles, there is a violent allergic reaction whenever someone suggests that religion and science are compatible. A particular type of atheist is especially vulnerable to this immune disorder. For example, P.Z. Myers, the evolutionary biologist and pugnacious blogger, became famously symptomatic at a 2010 gathering of atheists. After one participant suggested that non-religious people could still be spiritual, Myers nearly retched:

Whenever we start talking about spirituality, I just want to puke.

I hope Myers didn’t have too much to eat before reading the headline from this week’s commentary in Nature: “Sometimes Science Must Give Way to Religion.” The column, by Arizona State University’s Daniel Sarewitz, suggests that rational explanation of the universe’s existence, as advanced recently by the discovery of the Higgs boson, can’t match the feelings evoked by spectacular religious symbolism, such as that found in Cambodia’s ancient Hindu temples, which Sarewitz explored this summer. He writes:

The overwhelming scale of the temples, their architectural complexity, intricate and evocative ornamentation and natural setting combine to form a powerful sense of mystery and transcendence, of the fertility of the human imagination and ambition in a Universe whose enormity and logic evade comprehension.

Science is supposed to challenge this type of quasi-mystical subjective experience, to provide an antidote to it.

But in the words of Time magazine’s Jeffrey Kluger, “our brains and bodies contain an awful lot of spiritual wiring.” Religion is the antidote our evolutionary history created. And even if you don’t buy that particular theory, you can’t simply dismiss the psychological and cultural importance of religion. For much of our history, religion has deeply influenced all aspects of life, from how we cope with death and random disaster to what moral codes we abide by. That science should (or could) eliminate all this with a rationalist cleansing of civilization, as a vocal group of orthodox atheists has suggested, is highly improbable.

Amy Shira Teitel is a freelance space writer whose work appears regularly on Discovery News Space and Motherboard among many others. She blogs, mainly about the history of spaceflight, at Vintage Space, and tweets at @astVintageSpace.

The idea of a red sky at night used to evoke beautiful images of vibrant sunsets, the product of warm sunlight bathing the sky near the horizon. The adage of “red sky at night, sailor’s delight” refers to a calm night ahead; a red sunset suggests a high-pressure system in the west is bringing calm weather. But red skies at night have taken on a new meaning in recent decades. As outdoor lighting becomes increasingly prominent, our night skies are gradually turning from black to red.

This discovery came from a team of scientists led by Christopher Kyba from the Freie Universitaet and the Leibniz Institute of Freshwater Ecology and Inland Fisheries. The scientists were tracking the effects of cloud cover on light pollution when they realized the colour of the night is changing. Their report, entitled “Red is the New Black,” was just published in the journal Monthly Notices of the Royal Astronomical Society.

Until relatively recently, night skies were quite dark. The only major source of light was the Moon, allowing us to see thousands of individual stars and the wide, glowing swath of the Milky Way across the sky. Then people started illuminating the outdoors and nights became brighter. Benjamin Franklin helped promote street lamps in the U.S. and improved the designs of these early versions, which were made from candles in glass cases on top of high posts. These were replaced by gas lamps starting in Baltimore in 1816, which remained popular until Thomas Edison introduced the light bulb. Electric streetlights first appeared in Cleveland in 1879 and were the dominant form of street illumination by the turn of the century. As electricity became more affordable, the number of street lamps increased, turning dark city skies into a thing of the past.

This useful light doesn’t confine itself to the paths and streets we want to illuminate—much of it gets scattered by and into the atmosphere. This sky glow is a common phenomenon seen over busy urban areas. Some types of light fixtures produce more of a glow than others. Street lamps open on the top, unfocused lights, and upward-facing lights, like those placed under billboards, drastically increase the amount of sky glow. The more light sent upwards, the more light scattered back down by the atmosphere.

Derek Lowe is a medicinal chemist who has worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer’s, diabetes, osteoporosis, and other diseases. He has been writing about drug discovery at In the Pipeline, where this post originally appeared, for more than ten years.

The British Medical Journal says that the “widely touted innovation crisis in pharmaceuticals is a myth.” The British Medical Journal is wrong.

There, that’s about as direct as I can make it. But allow me to go into more detail, because that’s not the only thing they’re wrong about. This is a new article entitled “Pharmaceutical research and development: what do we get for all that money?”, and it’s by Joel Lexchin (York University) and Donald Light of UMDNJ. And that last name should be enough to tell you where this is all coming from, because Prof. Light is the man who’s publicly attached his name to an estimate that developing a new drug costs about $43 million.

I’m generally careful, when I bring up that figure around people who actually develop drugs, not to do so when they’re in the middle of drinking coffee or working with anything fragile, because it always provokes startled expressions and sudden laughter. These posts go into some detail about how ludicrous that number is, but for now, I’ll just note that it’s hard to see how anyone who advances that estimate in earnest can be taken seriously. But here we are again.

By Keith Kloor, a freelance journalist whose stories have appeared in a range of publications, from Science to Smithsonian. Since 2004, he’s been an adjunct professor of journalism at New York University. You can find him on Twitter here.

Last year, after Al Gore said in a speech that climate change was responsible for various extreme weather events around the globe, he got spanked by Oxford climate scientist Myles Allen, who wrote a column in the Guardian entitled, “Al Gore is doing a disservice by overplaying the link between climate change and the weather.”

I have to wonder if Allen is now thinking the same thing of NASA climatologist James Hansen. Because, as New York Times reporter John Broder puts it, Hansen, this week, has “roiled” the climate science community “with a new scientific paper explicitly linking high concentrations of carbon dioxide and other heat-trapping gases to recent severe heat waves and drought.”

The controversial paper was published on Monday in the Proceedings of the National Academy of Sciences (PNAS). A day earlier, Hansen previewed the study’s findings in a Washington Post op-ed, in which he made this jaw-dropping assertion:

It is no longer enough to say that global warming will increase the likelihood of extreme weather and to repeat the caveat that no individual weather event can be directly linked to climate change. To the contrary, our analysis shows that, for the extreme hot weather of the recent past, there is virtually no explanation other than climate change.

Around 12:15 on the afternoon of August 14, 2003, a software program that helps monitor how well the electric grid is working in the American Midwest shut itself down after it started getting incorrect input data. The problem was quickly fixed. But nobody turned the program back on again.

A little over an hour later, one of the six coal-fired generators at the Eastlake Power Plant in Ohio shut down. An hour after that, the alarm and monitoring system in the control room of one of the nation’s largest electric conglomerates failed. It, too, was left turned off.

Those three unrelated things—two faulty monitoring programs and one generator outage—weren’t catastrophic, in and of themselves. But they would eventually help create one of the most widespread blackouts in history. By 4:15 pm, 256 power plants were offline and 55 million people in eight states and Canada were in the dark. The Northeast Blackout of 2003 ended up costing us between $4 billion and $10 billion. That’s “billion”, with a “B”.

But this is about more than mere bad luck. The real causes of the 2003 blackout were fixable problems, and the good news is that, since then, we’ve made great strides in fixing them. The bad news, say some grid experts, is that we’re still not doing a great job of preparing our electric infrastructure for the future.

Andres Barkil-Oteo is an assistant professor of psychiatry at Yale University School of Medicine, with research interests in systems thinking, global mental health, and experiential learning in medical education. Find him on Google+ here.

Last spring, the American Psychiatric Association (APA) sent out a press release [pdf] noting that the number of U.S. medical students choosing to go into psychiatry has been declining for the past six years, even as the nation faces a notable dearth of psychiatrists. The Lancet, a leading medical journal, wrote that the field had an “identity crisis” related to the fact that it doesn’t seem “scientific enough” to physicians who deal with more tangible problems that afflict the rest of the body. Psychiatry has recently attempted to cope with its identity problem mainly by assuming an evidence-based approach favored throughout medicine. Evidence-based, however, became largely synonymous with medication, with relative disregard for other evidence-based treatments, like some forms of psychotherapy. In the push to become more medically respected, psychiatrists may be forsaking some of the important parts of their unique role in maintaining people’s health.

Over the last 15 years, use of psychotropic medication has increased in all kinds of ways, including off-label use and prescription of multiple drugs in combination. While overall rates of psychotherapy use remained constant during the 1990s, the proportion of the U.S. population using a psychotropic drug increased from 3.4 percent in 1987 to 8.1 percent by 2001. Antidepressants are now the second-most prescribed class of medication in the U.S., behind only lipid regulators, a class of heart drugs that includes statins like Lipitor. Several factors have contributed to this increase: direct-to-consumer advertising; development of effective drugs with fewer side effects (e.g., SSRIs); expansion in health coverage for mental illness made possible through the Mental Health Parity Act; and an increase in prescriptions from non-psychiatric physicians.

Unfortunately, not all of these psychiatric drugs are going to good use. Antidepressive drugs are widely used to treat people with mild or even sub-clinical depression, even though drugs tend to be less cost-effective for those people. It may sound paradoxical, but to get more benefit from antidepressants, we need to use them less, and only when needed, for patients with moderate to severe clinical depression. Patients with milder forms should be encouraged to try time-limited, evidence-based psychotherapies; several APA-endorsed clinical guidelines center on psychotherapies (e.g., cognitive behavioral therapy or behavior activation) as a first-line treatment for moderate depression, anxiety, and eating disorders, and as a secondary treatment to go with medication for schizophrenia and bipolar disorder.

Sophie Bushwick (Twitter, Tumblr) is a science journalist and podcaster, and is currently an intern at DISCOVERmagazine.com. She has written for Scientific American, io9, and DISCOVER, and has produced podcasts for 60-Second Science and Physics Central.

Renowned biologist Elizabeth Blackburn has said that when she was a young post-doc, “Telomeres just grabbed me and kept leading me on.” And lead her on they did—all the way to the Nobel Prize in Medicine in 2009. Telomeres are DNA sequences that continue to fascinate researchers and the public, partially because people with longer telomeres tend to live longer. So the recent finding that older men father offspring with unusually lengthy telomeres sounds like great news. Men of advanced age will give their children the gift of longer lives—right? But as is so often the case in biology, things aren’t that simple, and having an old father may not be an easy route to a long and healthy life.

Every time a piece of DNA gets copied, it can end up with errors in its sequence, or mutations. One of the most frequent changes is losing scraps of information from each end of the strand. Luckily, these strands are capped with telomeres, repeating sequences that do not code for any proteins and serve only to protect the rest of the DNA. Each time the DNA makes a copy, its telomeres get shorter, until these protective ends wear away to nothing. Without telomeres, the DNA cannot make any more copies, and the cell containing it will die.
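The countdown described above, where each round of copying trims the telomere until nothing protective is left and the cell can no longer divide, can be sketched in a few lines of Python. The function name and the numbers are my own illustrative choices, not figures from any particular study:

```python
def divisions_until_senescence(telomere_length, loss_per_division):
    """Count how many times a cell can copy its DNA before the
    telomere wears away entirely (illustrative model only).

    Real human telomeres span thousands of base pairs and lose
    on the order of tens to hundreds per division; the exact
    values here are hypothetical.
    """
    divisions = 0
    # Each copy shortens the telomere; once it can no longer
    # absorb the loss, the cell stops dividing and dies.
    while telomere_length >= loss_per_division:
        telomere_length -= loss_per_division
        divisions += 1
    return divisions

# A hypothetical 10,000-unit telomere losing 100 units per copy
# allows 100 divisions before senescence.
print(divisions_until_senescence(10_000, 100))  # 100
```

The sperm-producing stem cells described next are the exception to this model: with abundant telomerase, their telomere length does not simply tick down with each division.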

But sperm are not subject to this telomere-shortening effect. In fact, the telomeres in sperm-producing stem cells not only resist degrading, they actually grow. This may be thanks to a high concentration of the telomere-repairing enzyme telomerase in the testicles; researchers are still uncertain. All they know is that the older the man, the longer the telomeres in his sperm will be.