Harold Hamm, the Oklahoma-based founder and CEO of Continental Resources, the 14th-largest oil company in America, is a man who thinks big. He came to Washington last month to spread a needed message of economic optimism: With the right set of national energy policies, the United States could be "completely energy independent by the end of the decade. We can be the Saudi Arabia of oil and natural gas in the 21st century."

"President Obama is riding the wrong horse on energy," he adds. We can't come anywhere near the scale of energy production to achieve energy independence by pouring tax dollars into "green energy" sources like wind and solar, he argues. It has to come from oil and gas.

You'd expect an oilman to make the "drill, baby, drill" pitch. But since 2005 America truly has been in the midst of a revolution in oil and natural gas, which is the nation's fastest-growing manufacturing sector. No one is more responsible for that resurgence than Mr. Hamm. He was the original discoverer of the gigantic and prolific Bakken oil fields of Montana and North Dakota that have already helped move the U.S. into third place among world oil producers.

How much oil does Bakken have? The official estimate of the U.S. Geological Survey a few years ago was between four and five billion barrels. Mr. Hamm disagrees: "No way. We estimate that the entire field, fully developed, in Bakken is 24 billion barrels."
If he's right, that'll double America's proven oil reserves. "Bakken is almost twice as big as the oil reserve in Prudhoe Bay, Alaska," he continues. According to Department of Energy data, North Dakota is on pace to surpass California in oil production in the next few years. Mr. Hamm explains over lunch in Washington, D.C., that the more his company drills, the more oil it finds. Continental Resources has seen its "proved reserves" of oil and natural gas (mostly in North Dakota) skyrocket to 421 million barrels this summer from 118 million barrels in 2006.

"We expect our reserves and production to triple over the next five years." And for those who think this oil find is only making Mr. Hamm rich, he notes that today in America "there are 10 million royalty owners across the country" who receive payments for the oil drilled on their land. "The wealth is being widely shared."
One reason for the renaissance has been OPEC's erosion of market power. "For nearly 50 years in this country nobody looked for oil here and drilling was in steady decline. Every time the domestic industry picked itself up, the Saudis would open the taps and drown us with cheap oil," he recalls. "They had unlimited production capacity, and company after company would go bust."

Today OPEC's market share is falling, and the cartel no longer dictates the world price. This is huge, Mr. Hamm says. "Finally we have an opportunity to go out and explore for oil and drill without fear of price collapse." When OPEC was at its peak in the 1990s, the U.S. imported about two-thirds of its oil. Now we import less than half of it, and about 40% of what we do import comes from Mexico and Canada. That's why Mr. Hamm thinks North America can achieve oil independence.

While the most common device for converting light into electricity may be the photovoltaic (PV) solar cell, a variety of other devices can perform the same light-to-electricity conversion, such as solar-thermal collectors and rectennas. In a new study, engineers have designed a device that converts light of infrared (IR) and visible wavelengths into direct current by using surface plasmon excitations in a simple metal-insulator-metal (MIM) structure.

The researchers, Fuming Wang and Nicholas A. Melosh of Stanford University, have published their study on the new device in a recent issue of Nano Letters. “The greatest significance thus far is to show an alternative method to rectennas and PV devices for IR and visible light conversion,” Melosh told PhysOrg.com. “The conversion efficiencies aren't amazingly high compared to a PV in visible, so it’s not going to replace PVs, but it could be used for energy scavenging later on.”

The new device’s MIM architecture is similar to that of a rectenna. However, whereas rectennas operate with long-wavelength light such as microwaves and radio waves, the new device operates with a broad spectrum of infrared to visible wavelengths.

When the MIM device is illuminated, incoming photons are absorbed by the top and bottom metal electrodes. Upon absorption, each photon excites an electron in the metal into a higher energy state so that it becomes a “hot electron.” About half of the hot electrons travel toward the metal-insulator interface, where they may be collected by the other electrode. However, photon absorption in the upper and lower electrodes generates currents with opposite signs, so a net DC current is achieved only if the absorption is larger at one electrode than the other.

This ability to maximize current from one electrode while minimizing it from the other is one of the biggest challenges for MIM devices. To do this, researchers can change the thicknesses of the electrodes. However, there is a tradeoff, since in a thicker electrode, more photons are absorbed but fewer electrons reach the interface due to increased scattering. Wang and Melosh’s solution is to use a prism to excite surface plasmons (SPs) on the metal surface of the electrodes when under illumination. The SPs, which are small electron oscillations, can create a higher concentration of hot electrons in one electrode by efficiently coupling to light. The SP coupling efficiency depends on several factors, such as the thickness of the electrode, the type of metal used, and the wavelength of incoming light.
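The thickness tradeoff described here can be illustrated with a toy model — this is an assumption for illustration, not the authors' calculation: absorbed photons grow with electrode thickness, while the fraction of hot electrons that survive the trip to the metal-insulator interface shrinks with it. All parameter values below are invented for the sketch.

```python
import math

def collected_fraction(thickness_nm, alpha=0.05, mfp=20.0):
    """Toy model of hot-electron collection vs electrode thickness.

    alpha: assumed optical absorption coefficient (1/nm), illustrative only
    mfp:   assumed hot-electron mean free path (nm), illustrative only
    """
    absorbed = 1 - math.exp(-alpha * thickness_nm)   # thicker -> more photons absorbed
    transported = math.exp(-thickness_nm / mfp)      # thicker -> fewer electrons survive transit
    return absorbed * transported

# Sweep thickness to find the peak the tradeoff implies:
best = max(range(1, 101), key=collected_fraction)
print(best, round(collected_fraction(best), 3))
```

With these made-up parameters the product peaks at an intermediate thickness (14 nm here), which is the qualitative point: neither a very thin nor a very thick electrode maximizes collected current.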

The impulse to invest significance in the bodies of the dead has usually been a religious one. Yet even my atheist father cared about the treatment of his remains, says Sarah Murray

Clay figures mark the Day of the Dead in Mexico: during the annual holiday, people pray for and honour their deceased relatives and friends
“Listen, darling,” my mother told me in her no-nonsense voice. “I know how to do this—I’ve done it with soap powder.” Using a pair of kitchen scissors, she was snipping away at a thick plastic bag to allow its contents, tightly packed inside a steel pot, to flow out more freely. I’d suggested that if we made too large an opening in the bag, its contents would spill out in a rush. My mother, however, was in household-chore mode and was convinced her method would work. She was right, and suddenly our task became much easier. But before we could continue, we found ourselves on the floor, weeping with laughter at the absurdity of what she had just said. For the dust we were decanting wasn’t soap powder. It was once a living, breathing human being: my father.
It was the day my mother and I had picked to scatter his ashes. We’d embarked on the rather curious task of getting them into a bag as soon as we had taken off the lid of the urn and realised they would be impossible to release from within the plastic container the funeral home had stuffed inside it. This procedure was probably not something my father had anticipated. But for everything else, he’d left instructions.

In a letter, he had asked that we scatter his remains around a church in a beautiful Dorset village near our family home and near where two of his closest friends were buried—not an unusual request, you might think. Except that he was a lifelong atheist and had always insisted that the “organic matter” left after a person takes their last breath (his included) had no significance whatsoever. That person had gone.

Looking down at his ashes, I wondered if he was right. How much of him was in that bag? What exactly are we, after our heart stops beating—person, or object?

As visitors lined up this summer to see the medieval casketed relics of saints at the British Museum’s “Treasures of Heaven” exhibition, I was reminded that our intense relationship with human remains is a longstanding one. For the faithful, the “organic matter” of saints is potent stuff. Wander through Europe’s ancient churches and cathedrals and, on an altar or in a side chapel, you find gilded caskets, elaborately decorated and with glass sides permitting a glimpse of what lies within. That’s often something decidedly grim: an elbow splinter, a blackened fingerbone or, if you’re lucky, a whole skull. They aren’t much to look at. But since the dawn of Christendom, fragments of dead martyrs and saints have been venerated as holy objects.
Europeans aren’t alone in their fondness for sacred remains. A Muslim pilgrimage shrine in Srinagar, India, houses a whisker—displayed on important religious days—believed to have come from the beard of the Prophet. In Sri Lanka, the town of Kandy is home to the Buddha’s tooth. In an inner chamber of the Dalada Maligawa (temple of the tooth), it sits in a glass case surrounded by gold and silk decorations and is put on display every five years (when I saw it some time ago, I thought it looked surprisingly large and rather yellow).

The notion that computers can think, or that one day they will do so, is rooted in one of two complementary misunderstandings. The first relates to the nature of computers and the second to the nature of thought. That these misunderstandings have had such a powerful hold on the minds of many otherwise intelligent people is due to a tendency to take useful metaphors – describing what computers do and how they do it – as literal truth.

Consider, first, the misunderstanding about the nature of computers. Most people would agree that the computers we have at present are not conscious: the latest Super-Cray with gigabytes of memory is no less zomboid than a pocket calculator. But there is the feeling that at some stage, as a result of increasing computational power and an increase in something called "complexity", the artefact that possesses this power and this complexity will wake up to its own existence or, at the very least, experience the transactions which take place in, through, and around it.

We should treat this claim with extreme scepticism because those who say that conscious computers are around the corner are not able to specify what features conscious computers will have in addition to those possessed by our current unconscious ones. There was a fashion in the 1980s and 1990s for invoking alternative architectures – in particular parallel rather than serial processing – as the basis for computers that would be aware of themselves. This fashion has now passed and the conceptual cupboard of the conscious-computers-round-the-corner brigade is now empty. We therefore have no reason for expecting that computers will be anything other than extremely complex devices in which unconscious electrical impulses pass into and out of unconscious electrical circuits and interact with any number of devices connected directly or indirectly to them.

As for thought, this has been even more profoundly misunderstood. Some have argued that thought does not require consciousness, so that computers can think, or will one day think, even though they will never be conscious. Thoughts, like other so-called conscious activities, are merely causal way-stations between inputs such as sense experience and outputs such as behaviour. They do not have to be conscious; indeed, consciousness contributes nothing to their causal efficacy. It requires no equipment or subtle argument to demonstrate that this is nonsense. All you need is to focus on the thoughts you are having now. To deny that thought is conscious is self-refuting: you cannot deny the consciousness of your thoughts without being conscious of doing so. And to deny that conscious thought, or indeed consciousness, has a central role in our lives is to embrace an extreme behaviourism that cannot explain even ordinary human behaviour.

In 1853 the Reverend Frederick Morris, an opponent of Charles Darwin’s and a man with a Victorian sense of propriety, urged his parishioners to emulate the fidelity of a small bird called the dunnock. Be thou like the dunnock, he told them – the female and the male impeccably faithful to each other.

[Image: portrait of Charles Darwin, a copy made in 1883 by John Collier of his 1881 portrait; National Portrait Gallery, London]

What would the Rev. Morris have made of the scandalous truth? Far from being monogamous, the dunnocks, from a Victorian point of view, have shockingly lax morals. The female dunnock often takes not one but two males as partners. The best a stern man of religion could say about dunnocks is that there’s no superfluous bump and grind when they mate – it’s strictly fertilisation business, over in 0.1 seconds. Fast enough to do it while your mother’s back is turned.

Tim Birkhead and his fellow evolutionary biologists, exploring the nature of sexuality across species from single-celled organisms to humankind, are the paparazzi of the science world. They travel to remote islands and put up with extreme discomfort in the hope of catching animals having sex with each other, and when they do, splash their names and their pictures over the pages of the science journals. It doesn’t always work out. Fiona Hunter and a colleague, later to expose what the mainstream media dubbed ‘penguin prostitution’ in the Antarctic, once watched a colony of fulmars on Fair Isle for 56 days on the trot, 18 hours a day, only to find the species relatively faithful: a mere 16 per cent of females had sex with a bird who wasn’t their partner, and there were no ‘illegitimate’ chicks. This isn’t a glamorous pursuit. Geoff Parker, one of the human heroes of Birkhead’s story, spent months with his face a few centimetres away from fresh cowpats, watching female dungflies being aggressively mounted by two males in turn. Sometimes the biologists witness scenes more disturbing than they had anticipated: Mats Olsson, observing the rape-like mating of the Lake Eyre dragon in Australia, saw a male lizard bite his female victim so hard while impregnating her that she died.

Often it is not enough to be a mere voyeur with a long lens. Like a manipulative aristocrat in a Jacobean drama, the intrepid investigator arranges things: Birkhead gets live zebra finches to mate with dead ones, Nicholas Davies and Ian Hartley make it possible for female dunnocks to take a third husband.

Decades of accumulated work of this kind have changed our understanding of the nature of sex, reproduction and the different roles of male and female. From Darwin’s time up to the late 1960s – not coincidentally, the time when the intellectual assault on male-centred academic thinking got under way in earnest – it was thought that male animals competed for female partners, with the strongest and most attractive impregnating the most females; that females sought only monogamy, and if they did have sex with multiple partners (and biologists couldn’t help noticing that they did) it was against their will, always a form of submission to rape.
In the past thirty years, the conventional wisdom has been destroyed. The truth is that females of most species actively seek multiple partners to have sex with. If the aim of males is to put their sperm into as many females as possible, females are trying, with equal determination, to get the very best sperm to fertilise their eggs – even if that means having sex with many males in turn.

Rivalry between males and discrimination by females extends beyond the sexual act itself. Inside the female, the sperm of different males fight for supremacy – this is sperm competition. At the same time, the female may be able to select the sperm that are best for her – this is sperm choice. This is the true battle of the sexes. The males and females of each species are permanently locked in a struggle to out-evolve each other as their reproductive equipment and behaviour change to achieve their conflicting aims – i.e. maximum fertilisation v. best fertilisation.

Thursday, October 25, 2012

Americans have tremendous fear of aging — and a great deal of prejudice against the elderly. But, as the joke has it, being old is better than the alternative. And, despite our fears, new research suggests that being old is also a lot better than it looks.

I spoke with Dr. Marc Agronin, whose new book, How We Age: A Doctor’s Journey Into the Heart of Growing Old, explores these issues through the rich stories of his patients’ lives. Agronin is the psychiatrist for the Miami Jewish Health Systems, a nonprofit that is the largest provider of health care for seniors in the Southeast.

What led you to want to work with older people?
When I went to medical school, I knew I wanted to go into psychiatry; that was a given. In my second year, by a chance encounter, I began working with a geriatric psychiatrist and I loved the work. I took an immediate interest in working with older patients. They reminded me not only of my grandparents but also of my wonderful aunts and uncles. I felt that there was so much I could learn from these elders that I was just drawn to it and never looked back.

Many people would think that would be a very depressing field of medicine.
I see that all the time and experienced it even when I was doing my training. The older patients were often the ones that the students didn’t want to be around.
[But] the difference wasn’t just in attitude for me: whenever I had experience with older individuals I quickly had deep appreciation for not only their life experience, but also the gratitude they had when someone younger would spend time with them. It was always a positive experience. I never regarded it as something frightening or unpleasant.
I love the stories they tell and hearing about history. So, for me, writing this book was natural — it would be full of stories.

Research now suggests that as we age, our moods improve and we actually grow happier.
There’s great potential. It didn’t occur to me right away how much people learn and grow as they age. That message really transformed my view of aging. I wasn’t trained to look for strengths, but found over time that those strengths are the things that get people through difficult times, whether psychological or spiritual or emotional. They also allow people not only to overcome challenges but even to thrive. The more I saw, the more impressed I was.
Aging, in spite of the inevitable [challenges], is a long process and has its rewards. Amid all the challenges, often the rewards not only balance things out, but most individuals experience a greater degree of well-being and a deeper sense of meaning than they do when they are younger. I often find I get a little taste of that. I feel when I’m with older individuals, I’m as close to the fountain of knowledge as one can get.

Odds are you sometimes think about calories. They are among the most often counted things in the universe. When the calorie was originally conceived it was in the context of human work. More calories meant more capacity for work, more chemical fire with which to get the job done, coal in the human stove. Fat, it has been estimated, has nine calories per gram, whereas carbohydrates and proteins have just four; fiber is sometimes counted separately and gets awarded a piddling two. Every box of every food you have ever bought is labeled based on these estimates; too bad then that they are so often wrong.

A Food is Not a Food

Estimates of the number of calories in different kinds of foods measure the average number of calories we could get from those foods based only on the proportions of fat, carbohydrates, protein and sometimes fiber they contain (in essence, calories ingested minus calories egested). A variety of standard systems exist, all of which derive from the original developed by Wilbur Atwater more than a hundred years ago. They are all systems of averages. No food is average.
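The averaging at the heart of these systems is simple arithmetic: a label calorie count is just a weighted sum of macronutrient grams, using the general factors cited above (9, 4, 4, and 2 kcal per gram). A minimal sketch, with illustrative nutrient values that are assumptions rather than measured data:

```python
# General Atwater-style factors from the text: kcal per gram.
ATWATER_KCAL_PER_GRAM = {"fat": 9, "carbohydrate": 4, "protein": 4, "fiber": 2}

def label_calories(grams: dict) -> float:
    """Estimate label calories purely from macronutrient grams."""
    return sum(ATWATER_KCAL_PER_GRAM[k] * g for k, g in grams.items())

# A hypothetical 28 g serving of nuts (nutrient grams are illustrative only):
serving = {"fat": 14.0, "carbohydrate": 3.5, "protein": 6.0, "fiber": 3.5}
print(label_calories(serving))  # fat 126 + carb 14 + protein 24 + fiber 7 = 171
```

This is the whole system: no term for tough cell walls, incomplete digestion, or anything else specific to the food, which is exactly why the label figure can overshoot what a body actually extracts.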

Differences exist even within a given kind of food. Take, for example, cooked vegetables. Cell walls in some plants are tougher to break down than those in others; nature, of course, varies in everything. If the plant material we eat has more of its cell walls broken down, we can get more of the calories from the goodies inside. In some plants, cooking ruptures most cell walls; in others, such as cassava, cell walls hold strong and hoard their precious calories in such a way that many of them pass through our bodies intact.
It is not just cooked vegetables though. Nuts flagrantly do their own thing, which might be expected given that nuts are really seeds whose mothers are invested in having them escape digestion. Peanuts, pistachios and almonds all seem to be less completely digested than their levels of protein, fat, carbohydrates and fiber would suggest. How much? Just this month, a new study by Janet Novotny and colleagues at the USDA found that when the “average” person eats almonds she receives just 128 calories per serving rather than the 170 calories “on the label.”

[Image 1. Some of the calories our bodies do not digest go to the dung beetles
and flies whose empire rises around our inefficiencies.
Photo of the species Garreta nitens by Piotr Naskrecki]

It is not totally clear why nuts such as almonds or pistachios yield fewer calories than they “should.” Tough cell walls? Maybe. But there are other options too, if not for the nuts themselves then for other foods.

For one, our bodies seem to expend different quantities of energy to deal with different kinds of food (the energy expended produces heat and so is referred to by scientists as “diet-induced thermogenesis”); some foods require us to do more work than others. Proteins can require ten to twenty times as much heat-energy to digest as fats, but the loss of calories as heat energy is not accounted for at all on packaging.
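A minimal sketch of what such a correction might look like, subtracting an assumed digestion cost from each macronutrient's label calories. The fractions below are assumptions for illustration; the text gives only the rough ratio that protein costs ten to twenty times as much to digest as fat:

```python
# Assumed fraction of each macronutrient's calories lost as digestion heat.
# Values are illustrative; note protein/fat = 0.25/0.02 = 12.5x, within the
# ten-to-twenty-fold range mentioned in the text.
THERMIC_FRACTION = {"fat": 0.02, "carbohydrate": 0.08, "protein": 0.25}

def net_calories(label_kcal: dict) -> float:
    """Label calories per macronutrient, minus the heat lost digesting them."""
    return sum(kcal * (1 - THERMIC_FRACTION[k]) for k, kcal in label_kcal.items())

# Hypothetical label breakdown: 126 kcal fat, 14 kcal carbohydrate, 24 kcal protein.
print(round(net_calories({"fat": 126, "carbohydrate": 14, "protein": 24}), 1))
```

Even with these modest made-up fractions, a protein-heavy food loses noticeably more of its label calories to heat than a fatty one, which is the point the paragraph makes about packaging.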

So you've labored with sweat and tears writing your résumé, and now you're all set to turn it into a magnificently designed creation. Unfortunately, with the freedom of modern computers and fancy software come huge opportunities for abuse. When it comes to résumés, both non-designers and professional designers commit some almost unforgivable sins. Here are the 7 deadly sins of résumé design and how to repent:

David Stove once ran a contest to find the Worst Argument In The World, but he awarded the prize to his own entry, and one that shored up his politics to boot. It hardly seems like an objective process.

If he can unilaterally declare a Worst Argument, then so can I. I declare the Worst Argument In The World to be this: "X is in a category whose archetypal member gives us a certain emotional reaction. Therefore, we should apply that emotional reaction to X, even though it is not a central category member."

Call it the Noncentral Fallacy. It sounds dumb when you put it like that. Who even does that, anyway?

It sounds dumb only because we are talking soberly of categories and features. As soon as the argument gets framed in terms of words, it becomes so powerful that somewhere between many and most of the bad arguments in politics, philosophy and culture take some form of the noncentral fallacy. Before we get to those, let's look at a simpler example.

Suppose someone wants to build a statue honoring Martin Luther King Jr. for his nonviolent resistance to racism. An opponent of the statue objects: "But Martin Luther King was a criminal!"

Any historian can confirm this is correct. A criminal is technically someone who breaks the law, and King knowingly broke a law against peaceful anti-segregation protest - hence his famous Letter from Birmingham Jail.

But in this case calling Martin Luther King a criminal is the noncentral fallacy. The archetypal criminal is a mugger or bank robber. He is driven only by greed, preys on the innocent, and weakens the fabric of society. Since we don't like these things, calling someone a "criminal" naturally lowers our opinion of them.

The opponent is saying "Because you don't like criminals, and Martin Luther King is a criminal, you should stop liking Martin Luther King." But King doesn't share the important criminal features of being driven by greed, preying on the innocent, or weakening the fabric of society that made us dislike criminals in the first place. Therefore, even though he is a criminal, there is no reason to dislike King.

This all seems so nice and logical when it's presented in this format. Unfortunately, it's also one hundred percent contrary to instinct: the urge is to respond "Martin Luther King? A criminal? No he wasn't! You take that back!" This is why the noncentral fallacy is so successful. As soon as you do that you've fallen into their trap. Your argument is no longer about whether you should build a statue, it's about whether King was a criminal. Since he was, you have now lost the argument.

Ideally, you should just be able to say "Well, King was the good kind of criminal." But that seems pretty tough as a debating maneuver, and it may be even harder in some of the cases where the noncentral fallacy is commonly used.

Now I want to list some of these cases. Many will be political,[1] for which I apologize, but it's hard to separate out a bad argument from its specific instantiations. None of these examples are meant to imply that the position they support is wrong (and in fact I myself hold some of them). They only show that certain particular arguments for the position are flawed, such as:

"Abortion is murder!" The archetypal murder is Charles Manson breaking into your house and shooting you. This sort of murder is bad for a number of reasons: you prefer not to die, you have various thoughts and hopes and dreams that would be snuffed out, your family and friends would be heartbroken, and the rest of society has to live in fear until Manson gets caught. If you define murder as "killing another human being", then abortion is technically murder. But it has none of the downsides of murder Charles Manson style. Although you can criticize abortion for many reasons, insofar as "abortion is murder" is an invitation to apply one's feelings in the Manson case directly to the abortion case, it ignores the latter's lack of the features that generated those intuitions in the first place.[2]

"Genetic engineering to cure diseases is eugenics!" Okay, you've got me there: since eugenics means "trying to improve the gene pool" that's clearly right. But what's wrong with eugenics? "What's wrong with eugenics? Hitler did eugenics! Those unethical scientists in the 1950s who sterilized black women without their consent did eugenics!" "And what was wrong with what Hitler and those unethical scientists did?" "What do you mean, what was wrong with them? Hitler killed millions of people! Those unethical scientists ruined people's lives." "And does using genetic engineering to cure diseases kill millions of people, or ruin anyone's life?" "Well...not really." "Then what's wrong with it?" "It's eugenics!"